Science.gov

Sample records for boson mapping approach

  1. Boson mapping treatment of atom-photon systems

    SciTech Connect

    Reboiro, M.

    2007-02-12

    A system consisting of A identical atoms, each having three atomic levels, and one optical photon is analyzed in terms of an algebraic approach. The symmetries exhibited by the model are used to construct an exact boson mapping of the Hamiltonian. The image Hamiltonian reduces to a form which contains one boson and two atomic levels.

  2. Composite fermion-boson mapping for fermionic lattice models.

    PubMed

    Zhao, J; Jiménez-Hoyos, C A; Scuseria, G E; Huerga, D; Dukelsky, J; Rombouts, S M A; Ortiz, G

    2014-11-12

    We present a mapping of elementary fermion operators onto a quadratic form of composite fermionic and bosonic cluster operators. The mapping is an exact isomorphism as long as the physical constraint of one composite particle per cluster is satisfied. This condition is treated on average in a composite particle mean-field approach, which consists of an ansatz that decouples the composite fermionic and bosonic sectors. The theory is tested on the 1D and 2D Hubbard models. Using a Bogoliubov determinant for the composite fermions and either a coherent or Bogoliubov state for the bosons, we obtain a simple and accurate procedure for treating the Mott insulating phase of the Hubbard model with mean-field computational cost.

  3. Kinetic Monte Carlo approach to nonequilibrium bosonic systems

    NASA Astrophysics Data System (ADS)

    Liew, T. C. H.; Flayac, H.; Poletti, D.; Savenko, I. G.; Laussy, F. P.

    2017-09-01

    We consider the use of a kinetic Monte Carlo approach for the description of nonequilibrium bosonic systems, taking nonresonantly excited exciton-polariton condensates and bosonic cascade lasers as examples. In the former case, the considered approach allows the study of the cross-over between incoherent and coherent regimes, which represents the formation of a quasicondensate that forms purely from the action of energy relaxation processes rather than interactions between the condensing particles themselves. In the latter case, we show theoretically that a bosonic cascade can develop an output coherent state.
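
    The bosonic stimulation at the heart of such kinetic Monte Carlo treatments can be sketched in a few lines. The toy model below is only illustrative (a two-mode system with made-up rates, not the polariton or cascade model of the paper): particles relax from an excited mode into a ground mode with Gillespie-type stochastic waiting times and a rate carrying the bosonic final-state factor (N + 1).

```python
import math
import random

def kmc_relaxation(n_excited=100, w0=1.0, seed=1):
    """Gillespie-type kMC: stochastic bosonic relaxation into one ground mode."""
    rng = random.Random(seed)
    n_ground, t = 0, 0.0
    trajectory = [(t, n_ground)]
    while n_excited > 0:
        # Transition rate enhanced by the bosonic final-state factor (N_ground + 1).
        rate = w0 * n_excited * (n_ground + 1)
        # Exponentially distributed waiting time until the next relaxation event.
        t += -math.log(1.0 - rng.random()) / rate
        n_excited -= 1
        n_ground += 1
        trajectory.append((t, n_ground))
    return t, trajectory

total_time, trajectory = kmc_relaxation()
print(trajectory[-1][1])  # 100: every particle has relaxed into the ground mode
```

The (N_ground + 1) factor is what accelerates the late events and mimics the stimulated build-up of a condensed mode out of pure energy relaxation.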

  4. A Spin-Boson Screening approach for unraveling dominant vibrational energy transfer pathways in molecular materials

    NASA Astrophysics Data System (ADS)

    Chuntonov, Lev; Peskin, Uri

    2017-01-01

    Vibrational energy transfer driven by anharmonicity is the major mechanism of energy dissipation in polyatomic molecules and in molecules embedded in a condensed-phase environment. Energy transfer pathways are sensitive to the particular intramolecular structure as well as to specific interactions between the molecule and its environment, and their identification is a challenging many-body problem. This work introduces a theoretical approach that identifies the dominant pathways for specified initial excitations by screening the different possible relaxation channels. For each channel, the many-body Hamiltonian is mapped onto a respective all-vibrational Spin-Boson Hamiltonian, expressed in terms of the harmonic frequencies and the anharmonic coupling parameters obtained from the electronic structure of the molecule in its environment. Focus is given to the formulation of the relaxation rates when different limits of perturbation theory apply. In these cases the proposed Spin-Boson Screening approach becomes especially powerful.

  5. X-boson cumulant approach to the topological Kondo insulators

    NASA Astrophysics Data System (ADS)

    Ramos, E.; Franco, R.; Silva-Valencia, J.; Foglio, M. E.; Figueira, M. S.

    2014-12-01

    In this work we present a generalization of our previous X-boson approach to the periodic Anderson model (PAM), adequate for studying a novel class of intermetallic materials with 4f and 5f orbitals: the topological Kondo insulators, whose paradigmatic material is the compound SmB6. For simplicity, we consider a version of the PAM on a 2D square lattice, adequate for describing Ce-based compounds in two dimensions. The starting point of the model is the 4f orbitals of the Ce ions, with a J = 5/2 multiplet, in the presence of spin-orbit coupling. Our technique works well for all parameters of the model and avoids the unwanted phase transitions of the slave-boson mean-field theory. We present a critical comparison of our results with those of the usual slave-boson method, which has been used intensively to describe this class of materials. We also obtain a new first-order valence transition, which we attribute to the k-dependence of the hybridization.

  6. Self-consistent Hartree-Fock approach for interacting bosons in optical lattices

    NASA Astrophysics Data System (ADS)

    Lü, Qin-Qin; Patton, Kelly R.; Sheehy, Daniel E.

    2014-12-01

    A theoretical study of interacting bosons in a periodic optical lattice is presented. Instead of the commonly used tight-binding approach (applicable near the Mott-insulating regime of the phase diagram), the present work starts from the exact single-particle states of bosons in a cubic optical lattice, satisfying the Mathieu equation, an approach that can be particularly useful at large boson fillings. The effects of short-range interactions are incorporated using a self-consistent Hartree-Fock approximation, and predictions for experimental observables such as the superfluid transition temperature, condensate fraction, and boson momentum distribution are presented.

  7. Boson mapping techniques applied to constant gauge fields in QCD

    NASA Technical Reports Server (NTRS)

    Hess, Peter Otto; Lopez, J. C.

    1995-01-01

    Pairs of coordinates and derivatives of the constant gluon modes are mapped to new gluon-pair fields and their derivatives. Applying this mapping to the Hamiltonian of constant gluon fields results, for large coupling constants, in an effective Hamiltonian which separates into one part describing a scalar field and another describing a field with spin two. The ground state is dominated by pairs of gluons coupled to color and spin zero, with slight admixtures of color-zero and spin-two pairs. SU(2) was used as the color group.

  8. The Fermion-Boson Mapping Applied to Lagrangian Models for Charge-Density-Waves in One-Dimensional Systems

    NASA Astrophysics Data System (ADS)

    Belvedere, L. V.; Amaral, R. L. P. G.; de Queiroz, A. F.

    We use the two-dimensional Fermion-Boson mapping to perform a field theory analysis of the effective Lagrangian model for incommensurate charge-density waves (ICDW) in one-dimensional systems. We consider an approach in which both the phase of the complex phonon field and the electron field are dynamical degrees of freedom contributing to the quantum dynamics and symmetry-related features of the ICDW phenomenon. We obtain the bosonized and fermionized versions of the effective electron-phonon Lagrangian. The phase of the phonon field and the phase of the bosonized chiral density of the electron field condense as a soliton order parameter, carrying neither the charge nor the chirality of the electron-phonon system, leading to a periodic sine-Gordon potential. The phonon field is fermionized in terms of a chiral fermionic condensate and the effective model is mapped into the chiral Gross-Neveu (GN) model with two Fermi field species. The linked electron-phonon symmetry of the ICDW system is mapped into the chiral symmetry of the GN model. Within the functional integral formulation, we obtain for the vacuum expectation values of the phonon field <φ> = 0 and <φφ*> ≠ 0, due to the charge selection rule associated with the chiral electron-phonon symmetry. We show that the two-point correlation function of the phonon field satisfies the cluster decomposition property, as required by the chiral symmetry of the underlying GN model. The quantum description of the ICDW corresponds to charge transport through the lattice, due to the propagation of a ``Goldstone mode'' carrying the effective charge of the electron-phonon system, and is accomplished by an electron-lattice energy redistribution. This accounts for a dynamical generation of the Peierls energy gap.

  9. Map Projections: Approaches and Themes

    ERIC Educational Resources Information Center

    Steward, H. J.

    1970-01-01

    Map projections take on new meaning with location systems needed for satellites, other planets and space. A classroom approach deals first with the relationship between the earth and the globe, then with transformations to flat maps. Problems of preserving geometric qualities: distance, angles, directions are dealt with in some detail as are…

  11. X-slave boson approach to the periodic Anderson model

    NASA Astrophysics Data System (ADS)

    Franco, R.; Figueira, M. S.; Foglio, M. E.

    2001-05-01

    The periodic Anderson model (PAM) in the limit U=∞ can be studied by employing the Hubbard X operators to project out the unwanted states. In a previous work, we studied the cumulant expansion of this Hamiltonian employing the hybridization as a perturbation, but probability conservation of the local states (completeness) is not usually satisfied when partial expansions like the "chain approximation" (CHA) are employed. To address this problem, we use a technique similar to the one employed by Coleman to treat the same problem with slave bosons in the mean-field approximation. Assuming a particular renormalization for the hybridization, we obtain a description that avoids an unwanted phase transition that appears in the mean-field slave-boson method at intermediate temperatures.

  12. X-boson cumulant approach to the periodic Anderson model

    NASA Astrophysics Data System (ADS)

    Franco, R.; Figueira, M. S.; Foglio, M. E.

    2002-07-01

    The periodic Anderson model can be studied in the limit U=∞ by employing the Hubbard X operators to project out the unwanted states. We had already studied this problem by employing the cumulant expansion with the hybridization as perturbation, but the probability conservation of the local states (completeness) is not usually satisfied when partial expansions like the ``chain approximation'' (CHA) are employed. To rectify this situation, we modify the CHA by employing a procedure that was used in the mean-field approximation of Coleman's slave-boson method. Our technique reproduces the features of that method in its region of validity, but avoids the unwanted phase transition that appears in the same method both when μ>>Ef at low T and for all values of the parameters at intermediate temperatures. Our method also has a dynamic character that is absent from the mean-field slave-boson method.

  13. Nonunitary and unitary approach to Eigenvalue problem of Boson operators and squeezed coherent states

    NASA Technical Reports Server (NTRS)

    Wunsche, A.

    1993-01-01

    The eigenvalue problem of the operator a + ζa† (with a† the boson creation operator) is solved for arbitrary complex ζ by applying a nonunitary operator to the vacuum state. This nonunitary approach is compared with the unitary approach, which leads, for |ζ| < 1, to squeezed coherent states.

  14. Coherent state approach to the interacting boson model: Test of its validity in the transitional region

    SciTech Connect

    Inci, I.; Alonso, C. E.; Arias, J. M.; Fortunato, L.; Vitturi, A.

    2009-09-15

    The predictive power of the coherent state (CS) approach to the interacting boson model (IBM) is tested far from the IBM dynamical symmetry limits. The transitional region along the γ-unstable path from U(5) to O(6) is considered. Excitation energy of the excited β band and intraband and interband transitions obtained within the CS approach are compared with the exact results as a function of the boson number N. We find that the CS formalism provides approximations to the exact results that are correct up to the order 1/N in the transitional region, except in a narrow region close to the critical point.

  15. Study of molecular vibration by coupled cluster method: Bosonic approach

    NASA Astrophysics Data System (ADS)

    Banik, Subrata; Pal, Sourav; Prasad, M. Durga

    2015-01-01

    The vibrational coupled cluster method in bosonic representation is formulated to describe the molecular anharmonic vibrational spectra. The vibrational coupled cluster formalism is based on Watson Hamiltonian in normal coordinates. The vibrational excited states are described using coupled cluster linear response theory (CCLRT). The quality of the coupled cluster wave function is analyzed. Specifically, the mean displacement values of the normal coordinates and expectation values of the square of the normal coordinates of different vibrational states are calculated. A good agreement between the converged full CI results and coupled cluster results is found for the lower lying vibrational states.

  16. Correlated-pair approach to composite-boson scattering lengths

    NASA Astrophysics Data System (ADS)

    Shiau, Shiue-Yuan; Combescot, Monique; Chang, Yia-Chung

    2016-11-01

    We derive the scattering length of composite bosons (cobosons) within the framework of the composite-boson many-body formalism that uses correlated-pair states as a basis instead of free-fermion states. The integral equation constructed from this physically relevant basis makes transparent the role of fermion exchange in the coboson-coboson effective scattering. Three potentials used for Cooper pairs, fermionic-atom dimers, and semiconductor excitons are considered. While the s -wave scattering length for the BCS-like potential is just equal to its Born value, the other two are substantially smaller. For fermionic-atom dimers and semiconductor excitons, our results, calculated within a restricted correlated-pair basis, are in good agreement with those obtained from procedures numerically more demanding. We also propose model coboson-coboson scatterings that are separable and thus easily workable and that produce scattering lengths which match quantitatively well with the numerically obtained values for all fermion mass ratios. These separable model scatterings can facilitate future works on many-body effects in coboson gases.

  17. Perturbative Approaching for Boson Fields' System in a Lewis-Papapetrou Space-Time

    SciTech Connect

    Murariu, G.; Dariescu, M. A.; Dariescu, C.

    2010-08-04

    In this paper, first-order solutions of the coupled Klein-Gordon-Maxwell-Einstein system of equations are derived for boson fields in a Lewis-Papapetrou space-time. The results extend previous static solutions obtained in the literature. A main goal is the symbolic script built for this approach.

  18. Mott transition in the dynamic Hubbard model within slave boson mean-field approach

    NASA Astrophysics Data System (ADS)

    Le, Duc-Anh

    2014-04-01

    At zero temperature, the Kotliar-Ruckenstein slave-boson mean-field approach is applied to the dynamic Hubbard model. In this paper, the influence of the dynamics of the auxiliary boson field on the Mott transition is investigated. At finite boson frequency, the Mott-type features of the Hubbard model are found to be enhanced by increasing the pseudospin coupling parameter g. For sufficiently large pseudospin coupling g, the Mott transition occurs even for modest values of the bare Hubbard interaction U. The lack of electron-hole symmetry is highlighted through the quasiparticle weight. Our results are in good agreement with those obtained by two-site dynamical mean-field theory and determinant quantum Monte Carlo simulation.

  19. Quantum phase transitions in the sub-Ohmic spin-boson model: failure of the quantum-classical mapping.

    PubMed

    Vojta, Matthias; Tong, Ning-Hua; Bulla, Ralf

    2005-02-25

    The effective theories for many quantum phase transitions can be mapped onto those of classical transitions. Here we show that the naive mapping fails for the sub-Ohmic spin-boson model which describes a two-level system coupled to a bosonic bath with power-law spectral density, J(ω) ∝ ω^s. Using an ε expansion we prove that this model has a quantum transition controlled by an interacting fixed point at small s, and support this by numerical calculations. In contrast, the corresponding classical long-range Ising model is known to display mean-field transition behavior for 0 < s < 1/2, controlled by a noninteracting fixed point. The failure of the quantum-classical mapping is argued to arise from the long-ranged interaction in imaginary time in the quantum model.
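
    The power-law bath that defines the sub-Ohmic regime is easy to write down explicitly. The sketch below uses the common parametrization J(ω) = 2α ω_c^{1-s} ω^s; the α and ω_c values are illustrative, not taken from the paper.

```python
def spectral_density(omega, s, alpha=0.1, omega_c=1.0):
    """Bath spectral density J(ω) = 2α ω_c^{1-s} ω^s.

    s < 1: sub-Ohmic, s = 1: Ohmic, s > 1: super-Ohmic.
    """
    return 2.0 * alpha * omega_c ** (1.0 - s) * omega ** s

# At low frequency, a sub-Ohmic bath (s = 0.5) carries more spectral weight
# than an Ohmic one (s = 1), which is what makes its effect long-ranged in
# imaginary time:
print(spectral_density(0.01, 0.5) > spectral_density(0.01, 1.0))  # True
```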

  20. Quantum Phase Transitions in the Sub-Ohmic Spin-Boson Model: Failure of the Quantum-Classical Mapping

    NASA Astrophysics Data System (ADS)

    Vojta, Matthias; Tong, Ning-Hua; Bulla, Ralf

    2005-02-01

    The effective theories for many quantum phase transitions can be mapped onto those of classical transitions. Here we show that the naive mapping fails for the sub-Ohmic spin-boson model which describes a two-level system coupled to a bosonic bath with power-law spectral density, J(ω) ∝ ω^s. Using an ε expansion we prove that this model has a quantum transition controlled by an interacting fixed point at small s, and support this by numerical calculations. In contrast, the corresponding classical long-range Ising model is known to display mean-field transition behavior for 0 < s < 1/2, controlled by a noninteracting fixed point. The failure of the quantum-classical mapping is argued to arise from the long-ranged interaction in imaginary time in the quantum model.

  1. Quantum Langevin approach for non-Markovian quantum dynamics of the spin-boson model

    NASA Astrophysics Data System (ADS)

    Zhou, Zheng-Yang; Chen, Mi; Yu, Ting; You, J. Q.

    2016-02-01

    One longstanding difficult problem in quantum dissipative dynamics is to solve the spin-boson model in a non-Markovian regime where a tractable systematic master equation does not exist. The spin-boson model is particularly important due to its crucial applications in quantum noise control and manipulation as well as its central role in developing quantum theories of open systems. Here we solve this important model by developing a non-Markovian quantum Langevin approach. By projecting the quantum Langevin equation onto the coherent states of the bath, we can derive a set of non-Markovian quantum Bloch equations containing no explicit noise variables. This special feature offers a tremendous advantage over the existing stochastic Schrödinger equations in numerical simulations. The physical significance and generality of our approach are briefly discussed.

  2. Verwey Metal-Insulator Transition in Magnetite from the Slave-Boson Approach

    NASA Astrophysics Data System (ADS)

    Sherafati, Mohammad; Satpathy, Sashi; Pettey, Dix

    2013-03-01

    We study the Verwey metal-insulator transition in magnetite (Ref. 1) by solving a three-band extended Hubbard Hamiltonian for spinless fermions using the slave-boson approach, which also includes coupling to the local phonon modes. This model is suggested by earlier density-functional studies of magnetite (Ref. 2). We first solve the 1D Hubbard model for spinless fermions with nearest-neighbor interaction by both the Gutzwiller variational and slave-boson methods and show that these two approaches yield different results, unlike in the case of the standard Hubbard model, thereby clarifying some of the discrepancies in the literature (Ref. 3); we then extend the formalism to the three-band Hamiltonian for magnetite. The results suggest a metal-insulator transition at a critical value of the intersite interaction.

  3. The exhaustion problem in the periodic Anderson model: An X-boson approach

    NASA Astrophysics Data System (ADS)

    Franco, R.; Silva-Valencia, J.; Figueira, M. S.

    2006-10-01

    We study the thermodynamical properties of the periodic Anderson model (PAM) within the X-boson approach. The exhaustion problem is studied and we calculate the entropy and the specific heat for the heavy-fermion Kondo regime (HF-K) of the PAM. We compute numerically the evolution of the Kondo lattice temperature T_KL and the Fermi liquid temperature T* as a function of the conduction electron occupation number n_c. The results obtained are consistent with others reported in the literature for the Kondo lattice.

  4. Supersymmetric Ito equation: Bosonization and exact solutions

    SciTech Connect

    Ren Bo; Yu Jun; Lin Ji

    2013-04-15

    Based on the bosonization approach, the N=1 supersymmetric Ito (sIto) system is converted to a system of coupled bosonic equations. The approach effectively avoids difficulties caused by intractable anticommuting fermionic fields. By solving the coupled bosonic equations, the traveling wave solutions of the sIto system are obtained with the mapping and deformation method. Some novel types of exact solutions for the supersymmetric system are constructed from the solutions and symmetries of the usual Ito equation. Meanwhile, the similarity reduction solutions of the model are also studied with Lie point symmetry theory.

  5. A new approach to shortest paths on networks based on the quantum bosonic mechanism

    NASA Astrophysics Data System (ADS)

    Jiang, Xin; Wang, Hailong; Tang, Shaoting; Ma, Lili; Zhang, Zhanli; Zheng, Zhiming

    2011-01-01

    This paper presents quantum bosonic shortest path searching (QBSPS), a natural, practical and highly heuristic physical algorithm for reasoning about the recognition of network structure via quantum dynamics. QBSPS is based on an Anderson-like itinerant bosonic system in which a boson's Green function is used as a navigation pointer to accurately approach the terminals. QBSPS is demonstrated by rigorous mathematical and physical proofs and extensive simulations, showing how it can be used as a greedy routing scheme to seek the shortest path between different locations. In methodology, it is an interesting new algorithm rooted in the quantum mechanism rather than combinatorics. In practice, for the all-pairs shortest-path problem in a random scale-free network with N vertices, QBSPS runs in O(μ(N) ln ln N) time. In application, we suggest that the corresponding experimental realizations are feasible by considering path searching in quantum optical communication networks; in this situation, the method performs a pure local search on networks without requiring the global structure that is necessary for current graph algorithms.
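
    A loose classical analogue of the Green-function navigation idea can be sketched directly (a toy illustration, not the QBSPS algorithm itself): in the hopping expansion G ~ Σ_k (tA)^k of a lattice Green function, the lowest power of the adjacency matrix A with a nonzero (i, j) entry equals the graph distance between i and j.

```python
def matmul(A, B):
    """Plain matrix product for small integer matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def shortest_path_length(A, i, j):
    """Smallest k with (A^k)[i][j] > 0, i.e. the graph distance from i to j."""
    n = len(A)
    P = A  # current power A^k, starting at k = 1
    for k in range(1, n):
        if P[i][j] > 0:
            return k
        P = matmul(P, A)
    return None  # no path: i and j are disconnected

# Path graph 0-1-2-3: the distance from vertex 0 to vertex 3 is 3 hops.
A = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
print(shortest_path_length(A, 0, 3))  # 3
```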

  6. Supersymmetric transformation approach to pseudopotentials in condensed matter physics and bosonic superconductivity in two dimensions

    NASA Astrophysics Data System (ADS)

    Zhu, Wei

    This thesis is divided into two parts. The first part, "Supersymmetric Transformation Approach to Pseudopotentials in Condensed Matter Physics", provides a new method of obtaining pseudopotentials. The conventional methods of constructing pseudopotentials, based on the spirit of the Orthogonalized Plane Wave and Augmented Plane Wave methods, as well as the modern version of norm-conserving pseudopotentials through density functional theory, are first reviewed. Our new supersymmetric approach aims to eliminate some of the disadvantages while retaining in full the advantages, such as the phase-equivalence and norm-conserving properties of the pseudopotentials. Vast amounts of numerical computation can be eliminated compared to the old methods. Details and examples are given. Part two, "Bosonic Superconductivity in Two Dimensions", describes a theory for high-Tc superconductivity aimed at the current cuprate superconductors. The current status of the cuprates is first reviewed. A one-band Hubbard model is used to formulate the interaction among the holes doped into the layered compounds. Tightly bound pairs of size ~ a few lattice spacings are obtained based on the Antiferromagnetic Background Approximation. They are shown to have d_{x^2-y^2} symmetry. Such boson-like pairs form the basis of charged boson models. After reviewing the properties of an ideal charged Bose gas, including a perfect Meissner effect in 3D and a nearly perfect Meissner effect in 2D, we develop a theory for high-Tc superconductivity without interlayer coupling, adapted, on the one hand, from Friedberg-Lee's mixed Boson-Fermion model to 2D and, on the other hand, from May's work on two-dimensional ideal charged bosons. In addition to the critical temperature T_May for transition to a phase exhibiting a near-perfect Meissner effect, a new transition temperature T_c depending on the finite area of the system and the temperature-dependent coherence length is introduced. The appearance…

  7. Exciton-exciton scattering: Composite boson versus elementary boson

    NASA Astrophysics Data System (ADS)

    Combescot, M.; Betbeder-Matibet, O.; Combescot, R.

    2007-05-01

    This paper shows the necessity of introducing a quantum object, the “coboson,” to properly describe, through a fermion scheme, any composite particle, such as the exciton, which is made of two fermions. Although commonly dealt with as elementary bosons, these composite bosons (cobosons in short) differ from them due to their composite nature, which makes the handling of their many-body effects quite different from the existing treatments valid for elementary bosons. As a direct consequence of this composite nature, there is no correct way to describe the interaction between cobosons as a potential V. This is rather dramatic because, with the Hamiltonian not written as H = H0 + V, all the usual approaches to many-body effects fail. In particular, the standard form of the Fermi golden rule, written in terms of V, cannot be used to obtain the transition rates of two cobosons. To get them, we have had to construct an unconventional expression for this Fermi golden rule in which only H appears. Making use of this expression, we give here a detailed calculation of the time evolution of two excitons. We compare the results of this exact approach with the ones obtained by using an effective bosonic Hamiltonian in which the excitons are considered as elementary bosons with effective scatterings between them, these scatterings resulting from an elaborate mapping between the two-fermion space and the ideal boson space. We show that the relation between the inverse lifetime and the sum of the transition rates for elementary bosons differs from the one for composite bosons by a factor of 1/2, so that it is impossible to find effective scatterings between bosonic excitons giving these two physical quantities correctly, whatever the mapping from composite bosons to elementary bosons is. The present paper thus constitutes a strong mathematical proof that, in spite of a widely spread belief, we cannot forget the composite nature of these cobosons, even in the extremely low…

  8. Double occupancy in dynamical mean-field theory and the dual boson approach

    NASA Astrophysics Data System (ADS)

    van Loon, Erik G. C. P.; Krien, Friedrich; Hafermann, Hartmut; Stepanov, Evgeny A.; Lichtenstein, Alexander I.; Katsnelson, Mikhail I.

    2016-04-01

    We discuss the calculation of the double occupancy using dynamical mean-field theory in finite dimensions. The double occupancy can be determined from the susceptibility of the auxiliary impurity model or from the lattice susceptibility. The former method typically overestimates, whereas the latter underestimates the double occupancy. We illustrate this for the square-lattice Hubbard model. We propose an approach for which both methods lead to identical results by construction and which resolves this ambiguity. This self-consistent dual boson scheme results in a double occupancy that is numerically close to benchmarks available in the literature.
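
    The quantity being benchmarked can be checked exactly in the atomic (single-site) limit, where the double occupancy <n_up n_down> follows from the four local states alone. A minimal sketch, with arbitrary illustrative parameters (β = 1/T, half filling μ = U/2):

```python
import math

def double_occupancy_atomic(U, beta):
    """<n_up n_down> for a single Hubbard site at half filling (mu = U/2)."""
    mu = U / 2.0
    w_empty = 1.0                                # |0>
    w_single = 2.0 * math.exp(beta * mu)         # |up>, |down>
    w_double = math.exp(beta * (2.0 * mu - U))   # |up down>
    return w_double / (w_empty + w_single + w_double)

# Double occupancy is suppressed as the interaction U grows:
print(double_occupancy_atomic(8.0, 2.0) < double_occupancy_atomic(1.0, 2.0))  # True
```

Any consistent lattice scheme should reduce to this value when the hopping is switched off, which makes the atomic limit a convenient sanity check for the impurity- and lattice-susceptibility routes discussed in the abstract.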

  9. Non-equilibrium slave bosons approach to quantum pumping in interacting quantum dots

    NASA Astrophysics Data System (ADS)

    Citro, Roberta; Romeo, Francesco

    2016-03-01

    We review a time-dependent slave-boson approach within the non-equilibrium Green's function technique to analyze charge and spin pumping in a strongly interacting quantum dot. We study the pumped current as a function of the pumping phase and of the dot energy level and show that a parasitic current arises, beyond the pure pumping one, as an effect of the dynamical constraints. We finally illustrate an all-electrical means for spin pumping and discuss its relevance for spintronics applications.

  10. Usage-Oriented Topic Maps Building Approach

    NASA Astrophysics Data System (ADS)

    Ellouze, Nebrasse; Lammari, Nadira; Métais, Elisabeth; Ben Ahmed, Mohamed

    In this paper, we present a collaborative and incremental construction approach of multilingual Topic Maps based on enrichment and merging techniques. In recent years, several Topic Map building approaches have been proposed endowed with different characteristics. Generally, they are dedicated to particular data types like text, semi-structured data, relational data, etc. We note also that most of these approaches take as input monolingual documents to build the Topic Map. The problem is that the large majority of resources available today are written in various languages, and these resources could be relevant even to non-native speakers. Thus, our work is driven towards a collaborative and incremental method for Topic Map construction from textual documents available in different languages. To enrich the Topic Map, we take as input a domain thesaurus and we propose also to explore the Topic Map usage which means available potential questions related to the source documents.

  11. The New Approach for Earthquake Hazard Mapping

    NASA Astrophysics Data System (ADS)

    Handayani, B.; Karnawati, D.; Anderson, R.

    2008-05-01

    It is a fact that a hazard map, such as an Earthquake Hazard Map, may not always be effectively implemented in mitigation efforts. Hazard maps are technical maps that are not always easy for the community living in vulnerable areas to understand and follow. Therefore, some effort must be made to guarantee the effectiveness of a hazard map. This paper discusses the approach and method for developing a more appropriate earthquake hazard map in Bantul Regency, Yogyakarta, Indonesia. Psychological mapping to identify levels and distributions of community trauma is proposed as the early reference for earthquake hazard mapping. By referring to this trauma zonation and combining it with seismicity and geological mapping, the earthquake hazard map can be established. It is also interesting that this approach not only provides a more appropriate hazard map, but also stimulates community empowerment in the earthquake-vulnerable areas. Several training sessions for improving community awareness were also conducted as part of the mapping process.

  12. Structural evolution in A ≈100 nuclei within the mapped interacting boson model based on the Gogny energy density functional

    NASA Astrophysics Data System (ADS)

    Nomura, K.; Rodríguez-Guzmán, R.; Robledo, L. M.

    2016-10-01

    The structure of even-even neutron-rich Ru, Mo, Zr, and Sr nuclei in the A ≈ 100 mass region is studied within the interacting boson model (IBM) with microscopic input from the self-consistent mean-field approximation based on the Gogny-D1M energy density functional. The deformation-energy surface in the quadrupole deformation space (β, γ), computed within the constrained Hartree-Fock-Bogoliubov framework, is mapped onto the expectation value of the appropriately chosen IBM Hamiltonian with configuration mixing in the boson condensate state. The mapped IBM Hamiltonian is used to study the spectroscopic properties of 98-114Ru, 96-112Mo, 94-110Zr, and 92-108Sr. Several cases of γ-soft behavior are predicted in Ru and Mo nuclei while a pronounced coexistence between strongly prolate and weakly oblate deformed shapes is found for Zr and Sr nuclei. The method describes well the evolution of experimental yrast and nonyrast states as well as selected B(E2) transition probabilities.

  13. A flexible approach to genome map assembly

    SciTech Connect

    Harley, E.; Bonner, A.J.

    1994-12-31

    A major goal of the Human Genome Project is to construct detailed physical maps of the human genome. A physical map is an assignment of DNA fragments to their locations on the genome. Complete maps of large genomes require the integration of many kinds of experimental data, each with its own forms of noise and experimental error. To facilitate this integration, we are developing a flexible approach to map assembly based on logic programming and data visualization. Logic programming provides a convenient and mathematically rigorous way of reasoning about data, while data visualization provides layout algorithms for assembling and displaying genome maps. To demonstrate the approach, this paper describes numerous rules for map assembly implemented in a data-visualization system called Hy+. Using these rules, we have successfully assembled contigs (partial maps) from real and simulated mapping data - data that are noisy, imprecise, and contradictory. The main advantage of the approach is that it allows a user to rapidly develop, implement, and test new rules for genome map assembly with a minimum of programming effort.
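
    One elementary assembly rule, grouping clones into contigs as connected components of an overlap graph, can be sketched as follows. This is a toy version only; the rules in the paper also arbitrate noisy and contradictory overlap evidence, and the function and clone names here are hypothetical.

```python
from collections import defaultdict

def assemble_contigs(clones, overlaps):
    """Group clones into contigs = connected components of the overlap graph."""
    adj = defaultdict(set)
    for a, b in overlaps:
        adj[a].add(b)
        adj[b].add(a)
    seen, contigs = set(), []
    for clone in clones:
        if clone in seen:
            continue
        stack, comp = [clone], []
        while stack:  # depth-first traversal of one component
            c = stack.pop()
            if c in seen:
                continue
            seen.add(c)
            comp.append(c)
            stack.extend(adj[c] - seen)
        contigs.append(sorted(comp))
    return sorted(contigs)

print(assemble_contigs(["c1", "c2", "c3", "c4", "c5"],
                       [("c1", "c2"), ("c2", "c3"), ("c4", "c5")]))
# [['c1', 'c2', 'c3'], ['c4', 'c5']]
```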

  14. A Tangible Approach to Concept Mapping

    NASA Astrophysics Data System (ADS)

    Tanenbaum, Karen; Antle, Alissa N.

    2009-05-01

    The Tangible Concept Mapping project investigates using a tangible user interface to engage learners in concept map creation. This paper describes a prototype implementation of the system, presents some preliminary analysis of its ease of use and effectiveness, and discusses how elements of tangible interaction support concept mapping by helping users organize and structure their knowledge about a domain. The role of physical engagement and embodiment in supporting the mental activity of creating the concept map is explored as one of the benefits of a tangible approach to learning.

  15. Structural evolution in germanium and selenium nuclei within the mapped interacting boson model based on the Gogny energy density functional

    NASA Astrophysics Data System (ADS)

    Nomura, K.; Rodríguez-Guzmán, R.; Robledo, L. M.

    2017-06-01

    The shape transitions and shape coexistence in the Ge and Se isotopes are studied within the interacting boson model (IBM) with the microscopic input from a self-consistent mean-field calculation based on the Gogny-D1M energy density functional. The mean-field energy surface as a function of the quadrupole shape variables β and γ , obtained from the constrained Hartree-Fock-Bogoliubov method, is mapped onto the expectation value of the IBM Hamiltonian with configuration mixing in the boson condensate state. The resultant Hamiltonian is used to compute excitation energies and electromagnetic properties of the selected nuclei Ge-9466 and Se-9668. Our calculation suggests that many nuclei exhibit γ softness. Coexistence between prolate and oblate shapes, as well as between spherical and γ -soft shapes, is also observed. The method provides a reasonable description of the observed systematics of the excitation energy of the low-lying energy levels and transition strengths for nuclei below the neutron shell closure N =50 , and provides predictions on the spectroscopy of neutron-rich Ge and Se isotopes with 52 ≤N ≤62 , where data are scarce or not available.

  16. Microscopic calculation of interacting boson model parameters by potential-energy surface mapping

    SciTech Connect

    Bentley, I.; Frauendorf, S.

    2011-06-15

    A coherent state technique is used to generate an interacting boson model (IBM) Hamiltonian energy surface which is adjusted to match a mean-field energy surface. This technique allows the calculation of IBM Hamiltonian parameters, prediction of properties of low-lying collective states, as well as the generation of probability distributions of various shapes in the ground state of transitional nuclei, the last two of which are of astrophysical interest. The results for krypton, molybdenum, palladium, cadmium, gadolinium, dysprosium, and erbium nuclei are compared with experiment.

  17. Dynamical approach to conformal gravity and the bosonic string effective action

    NASA Astrophysics Data System (ADS)

    Barbero, F.; Julve, J.; Tiemblo, A.; Tresguerres, R.

    1988-12-01

    We show that a theory invariant under the full local conformal group in a coset realization naturally contains the massless degrees of freedom of the closed bosonic string. This effective theory is an alternative to the bosonic part of N=1 supergravity as given by Manton and Chapline [4].

  18. Slave-boson mean-field theory versus variational-wave-function approach for the periodic Anderson model

    NASA Astrophysics Data System (ADS)

    Yang, Min-Fong; Sun, Shih-Jye; Hong, Tzay-Ming

    1993-12-01

    We show that a special kind of slave-boson mean-field approximation, which allows for the symmetry-broken states appropriate for a bipartite lattice, can give essentially the same results as the variational-wave-function approach proposed by Gulácsi, Strack, and Vollhardt [Phys. Rev. B 47, 8594 (1993)]. The advantages of our approach are briefly discussed.

  19. A Statistical Approach for Ambiguous Sequence Mappings

    USDA-ARS?s Scientific Manuscript database

    When attempting to map RNA sequences to a reference genome, high percentages of short sequence reads are often assigned to multiple genomic locations. One approach to handling these “ambiguous mappings” has been to discard them. This results in a loss of data, which can sometimes be as much as 45% o...

  20. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

    Recent proposals of boson sampling and the corresponding experiments point to a possible disproof of the extended Church-Turing thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. To date, however, only small-scale experiments with a few photons have been successfully performed. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A novel theoretical proposal for large-scale boson sampling using microwave photons is highly promising due to its deterministic photon sources and scalability. A certification protocol for large-scale boson sampling experiments is therefore needed. We propose, in this presentation, a computational protocol towards the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional with its Fourier components can show the fingerprint of large-scale boson sampling. This work was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and a Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.
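
    The pair-mode correlators mentioned above can be estimated directly from count samples; a minimal sketch, assuming samples arranged as shots x modes (the certification statistics built on top of these correlators are not reproduced here).

```python
import numpy as np

def mode_correlators(samples):
    """Two-mode correlators C_ij = <n_i n_j> - <n_i><n_j> estimated from
    photon-count samples; the distribution of the C_ij over mode pairs is
    the raw material for correlator-based certification."""
    n = np.asarray(samples, dtype=float)
    mean = n.mean(axis=0)
    cov = n.T @ n / n.shape[0] - np.outer(mean, mean)
    i, j = np.triu_indices(cov.shape[0], k=1)  # distinct mode pairs
    return cov[i, j]

# Sanity check: independent (uncorrelated) counts give correlators near zero
rng = np.random.default_rng(1)
independent = rng.poisson(0.5, size=(20000, 6))
c = mode_correlators(independent)
print(c.shape, abs(c).max() < 0.05)
```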

  1. Are Bosonic Replicas Faulty?

    NASA Astrophysics Data System (ADS)

    Osipov, Vladimir Al.; Kanzieper, Eugene

    2007-08-01

    Motivated by the ongoing discussion about a seeming asymmetry in the performance of fermionic and bosonic replicas, we present an exact, nonperturbative approach to both fermionic and bosonic zero-dimensional replica field theories belonging to the broadly interpreted β=2 Dyson symmetry class. We then utilize the formalism developed to demonstrate that the bosonic replicas do correctly reproduce the microscopic spectral density in the QCD-inspired chiral Gaussian unitary ensemble. This disproves the myth that the bosonic replica field theories are intrinsically faulty.

  2. An automated approach to flood mapping

    NASA Astrophysics Data System (ADS)

    Sun, Weihua; Mckeown, Donald M.; Messinger, David W.

    2012-10-01

    Heavy rain from Tropical Storm Lee resulted in a major flood event for the southern tier of New York State in early September 2011, causing the evacuation of approximately 20,000 people in and around the city of Binghamton. In support of the New York State Office of Emergency Management, a high-resolution multispectral airborne sensor (WASP) developed by RIT was deployed over the flooded area to collect aerial images. One key benefit of these images is that they enable flood inundation area mapping. However, these images require a significant amount of storage space, and the inundation mapping process is conventionally carried out by manual digitization. In this paper, we design an automated approach to flood inundation mapping from the WASP airborne images. This method employs the Spectral Angle Mapper (SAM) on color RGB or multispectral aerial images to extract a binary flood map; it then uses a set of morphological processing steps and a boundary vectorization technique to convert the binary map into a shapefile. The technique is relatively fast and only requires the operator to select one pixel on the image. The generated shapefile is much smaller than the original image and can be imported into most GIS software packages. This enables critical flood information to be shared with and by disaster response managers very rapidly, even over cellular phone networks.
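
    A minimal sketch of the SAM step, assuming a hypothetical angle threshold of 0.1 radians and a single operator-selected water pixel; the morphological cleanup and boundary vectorization described in the paper are omitted.

```python
import numpy as np

def spectral_angle_map(image, reference):
    """Per-pixel spectral angle (radians) between each pixel vector and a
    reference spectrum; small angles mean spectrally similar material."""
    img = np.asarray(image, dtype=float)      # (rows, cols, bands)
    ref = np.asarray(reference, dtype=float)  # (bands,)
    dot = img @ ref
    norms = np.linalg.norm(img, axis=-1) * np.linalg.norm(ref)
    cos = np.clip(dot / np.maximum(norms, 1e-12), -1.0, 1.0)
    return np.arccos(cos)

def flood_mask(image, water_spectrum, threshold=0.1):
    """Binary flood map: pixels within `threshold` radians of the spectrum
    of the one water pixel the operator selects."""
    return spectral_angle_map(image, water_spectrum) < threshold

# Toy 2x2 RGB scene: top row is water-like (proportional spectra)
scene = np.array([[[0.1, 0.2, 0.4], [0.2, 0.4, 0.8]],
                  [[0.8, 0.4, 0.1], [0.5, 0.5, 0.5]]])
mask = flood_mask(scene, scene[0, 0])
print(mask)
```

    Note that SAM compares spectral direction only, so it is insensitive to brightness differences between shaded and sunlit water, which is one reason it suits this task.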

  3. Thermodynamics of a One-Dimensional System of Point Bosons: Comparison of the Traditional Approach with a New One

    NASA Astrophysics Data System (ADS)

    Tomchenko, Maksim

    2017-01-01

    We compare two approaches to the construction of the thermodynamics of a one-dimensional periodic system of spinless point bosons: the Yang-Yang approach and a new approach proposed by the author. In the latter, the elementary excitations are introduced so that there is only one type of excitation (as opposed to Lieb's approach with two types of excitations: particle-like and hole-like). At weak coupling, these are excitations of the Bogolyubov type. The equations for the thermodynamic quantities in these approaches are different, but their solutions coincide (this is shown below and is the main result). Moreover, the new approach is simpler. An important point is that the thermodynamic formulae in the new approach are, for any values of the parameters, formulae for an ensemble of quasiparticles with Bose statistics, whereas the formulae in the traditional Yang-Yang approach have a Fermi-like one-particle form.

  4. Hydrochromic Approaches to Mapping Human Sweat Pores.

    PubMed

    Park, Dong-Hoon; Park, Bum Jun; Kim, Jong-Man

    2016-06-21

    colorimetric change near body temperature. This feature enables the use of this technique to generate high-quality images of sweat pores. This Account also focuses on the results of the most recent phase of this investigation, which led to the development of a simple yet efficient and reliable technique for sweat pore mapping. The method utilizes a hydrophilic polymer composite film containing fluorescein, a commercially available dye that undergoes a fluorometric response as a result of water-dependent interconversion between its ring-closed spirolactone (nonfluorescent) and ring-opened fluorone (fluorescent) forms. Surface-modified carbon nanodots (CDs) have also been found to be efficient for hydrochromic mapping of human sweat pores. The results discovered by Lou et al. [Adv. Mater. 2015, 27, 1389] are also included in this Account. Sweat pore maps obtained from fingertips using these materials were found to be useful for fingerprint analysis. In addition, this hydrochromism-based approach is sufficiently sensitive to enable differentiation between sweat-secreting active pores and inactive pores. As a result, the techniques can be applied to clinical diagnosis of malfunctioning sweat pores. The directions that future research in this area will follow are also discussed.

  5. Composite-boson approach to molecular Bose-Einstein condensates in mixtures of ultracold Fermi gases

    NASA Astrophysics Data System (ADS)

    Bouvrie, P. Alexander; Tichy, Malte C.; Roditi, Itzhak

    2017-02-01

    We show that an ansatz based on independent composite bosons [Phys. Rep. 463, 215 (2008), 10.1016/j.physrep.2007.11.003] accurately describes the condensate fraction of molecular Bose-Einstein condensates in ultracold Fermi gases. The entanglement between the fermionic constituents of a single Feshbach molecule then governs the many-particle statistics of the condensate, from the limit of strong interaction to close to unitarity. This result strengthens the role of entanglement as the indispensable driver of composite-boson behavior. The zero-temperature condensate fraction of fermion pairs that we compute agrees excellently with previous results obtained by means of fixed-node diffusion Monte Carlo methods and the Bogoliubov depletion approximation. This paves the way towards the exploration of BEC-BCS crossover physics in mixtures of cold Fermi gases with an arbitrary number of fermion pairs, as well as the implementation of Hong-Ou-Mandel-like interference experiments proposed within coboson theory.

  6. New approach for anti-normally and normally ordering bosonic-operator functions in quantum optics

    NASA Astrophysics Data System (ADS)

    Xu, Shi-Min; Zhang, Yun-Hai; Xu, Xing-Lei; Li, Hong-Qi; Wang, Ji-Suo

    2016-12-01

    In this paper, we provide a new kind of operator formula for anti-normally and normally ordering bosonic-operator functions in quantum optics, which can help us arrange a bosonic-operator function f(λQ̂ + νP̂) in its anti-normal and normal ordering conveniently. Furthermore, mutual transformation formulas between anti-normal ordering and normal ordering, which have good universality, are derived too. Based on these operator formulas, some new differential relations and some useful mathematical integral formulas are easily derived without really performing these integrations. Project supported by the Natural Science Foundation of Shandong Province, China (Grant No. ZR2015AM025) and the Natural Science Foundation of Heze University, China (Grant No. XY14PY02).

  7. New approach to the resummation of logarithms in Higgs-boson decays to a vector quarkonium plus a photon

    NASA Astrophysics Data System (ADS)

    Bodwin, Geoffrey T.; Chung, Hee Sok; Ee, June-Haak; Lee, Jungil

    2017-03-01

    We present a calculation of the rates for Higgs-boson decays to a vector heavy-quarkonium state plus a photon, where the heavy-quarkonium states are the J/ψ and the ϒ(nS) states, with n = 1, 2, or 3. The calculation is carried out in the light-cone formalism, combined with nonrelativistic QCD factorization, and is accurate at leading order in m_Q^2/m_H^2, where m_Q is the heavy-quark mass and m_H is the Higgs-boson mass. The calculation contains corrections through next-to-leading order in the strong coupling constant α_s and the square of the heavy-quark velocity v, and includes a resummation of logarithms of m_H^2/m_Q^2 at next-to-leading logarithmic accuracy. We have developed a new method, which makes use of Abel summation, accelerated through the use of Padé approximants, to deal with divergences in the resummed expressions for the quarkonium light-cone distribution amplitudes. This approach allows us to make definitive calculations of the resummation effects. Contributions from the order-α_s and order-v^2 corrections to the light-cone distribution amplitudes that we obtain with this new method differ substantially from the corresponding contributions that one obtains from a model light-cone distribution amplitude [M. König and M. Neubert, J. High Energy Phys. 08 (2015) 012, 10.1007/JHEP08(2015)012]. Our results for the real parts of the direct-process amplitudes are considerably smaller than those from one earlier calculation [G. T. Bodwin, H. S. Chung, J.-H. Ee, J. Lee, and F. Petriello, Phys. Rev. D 90, 113010 (2014), 10.1103/PhysRevD.90.113010], reducing the sensitivity to the Higgs-boson-heavy-quark couplings, and are somewhat smaller than those from another earlier calculation [M. König and M. Neubert, J. High Energy Phys. 08 (2015) 012, 10.1007/JHEP08(2015)012]. However, our results for the standard-model Higgs-boson branching fractions are in good agreement with those in M. König and M. Neubert, J. High Energy Phys. 08 (2015) 012, 10.1007/JHEP08(2015)012.

  8. Friedel sum rules for one- and two-channel Kondo models and unitarity paradox via bosonization-refermionization approach

    NASA Astrophysics Data System (ADS)

    Kharitonov, Maxim; Andrei, Natan; Coleman, Piers

    2013-03-01

    We calculate the single-particle Green's functions and scattering amplitudes of the one-channel and channel-anisotropic two-channel Kondo models at the Toulouse and Emery-Kivelson lines, respectively, where exact solutions via the bosonization-refermionization approach are admitted. We demonstrate that in this approach the Friedel sum rules - the relations between the trapped spin and "flavor" moments and the scattering phase shifts in the Fermi-liquid regime - arise naturally, and we elucidate their subtleties. We also recover the "unitarity paradox" - the vanishing of the single-particle scattering amplitude at the channel-symmetric point of the two-channel Kondo model - stemming from non-Fermi-liquid behavior. We discuss the implications of these results for the development of composite pairing in heavy-fermion systems. This work was supported by National Science Foundation grants DMR 0907179 (MK, PC) and DMR 1006684 (NA).

  9. Combinatorial approach to generalized Bell and Stirling numbers and boson normal ordering problem

    SciTech Connect

    Mendez, M.A.; Blasiak, P.; Penson, K.A.

    2005-08-01

    We consider the numbers arising in the problem of normal ordering of expressions in the boson creation a† and annihilation a operators ([a, a†] = 1). We treat a general form of a boson string (a†)^{r_n} a^{s_n} ... (a†)^{r_2} a^{s_2} (a†)^{r_1} a^{s_1}, which is shown to be associated with generalizations of Stirling and Bell numbers. The recurrence relations and closed-form expressions (Dobinski-type formulas) are obtained for these quantities by both algebraic and combinatorial methods. By extensive use of methods of combinatorial analysis we prove the equivalence of the aforementioned problem to the enumeration of special families of graphs. This link provides a combinatorial interpretation of the numbers arising in this normal ordering problem.
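
    For the simplest string, (a†a)^n, the normal-ordering coefficients are the ordinary Stirling numbers of the second kind, (a†a)^n = Σ_k S(n,k) (a†)^k a^k, with Bell numbers as their row sums. Their recurrence is easy to sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling numbers of the second kind S(n, k): coefficients in the
    normal ordering (a†a)^n = sum_k S(n, k) (a†)^k a^k."""
    if n == k == 0:
        return 1
    if n == 0 or k == 0:
        return 0
    # standard recurrence: place element n in an existing or new block
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

def bell(n):
    """Bell number B(n) = sum_k S(n, k); counts all set partitions of n."""
    return sum(stirling2(n, k) for k in range(n + 1))

print([bell(n) for n in range(6)])  # [1, 1, 2, 5, 15, 52]
print(stirling2(4, 2))              # 7
```

    The generalized numbers treated in the paper obey analogous, if more elaborate, recurrences indexed by the exponent sequences (r_i, s_i).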

  10. Experimental scattershot boson sampling

    PubMed Central

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J.; Galvão, Ernesto F.; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-01-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of the quantum computational supremacy. PMID:26601164

  11. Learning topological maps: An alternative approach

    SciTech Connect

    Buecken, A.; Thrun, S.

    1996-12-31

    Our goal is autonomous real-time control of a mobile robot. In this paper we show a way to learn topological maps of a large-scale indoor environment autonomously. In the literature there are two paradigms for storing information about a robot's environment: grid-based (geometric) maps and topological maps. While grid-based maps are comparatively easy to learn and maintain, topological maps are quite compact and facilitate fast motion planning.

  12. On the multi-layer multi-configurational time-dependent Hartree approach for bosons and fermions

    NASA Astrophysics Data System (ADS)

    Manthe, Uwe; Weike, Thomas

    2017-02-01

    A multi-layer multi-configurational time-dependent Hartree (MCTDH) approach using a second quantization representation (SQR) based on optimized time-dependent orbitals is introduced. The approach combines elements of the multi-layer MCTDH-SQR approach of Wang and Thoss, which employs a preselected time-independent orbital basis, and the MCTDH for bosons and multi-configuration time-dependent Hartree-Fock approaches, which do not use multi-layering but employ time-dependent orbital bases. In contrast to existing MCTDH-type approaches, the results of the present approach for a given number of configurations are not invariant with respect to unitary transformations of the time-dependent orbital basis. Thus a natural orbital representation is chosen to achieve fast convergence with respect to the number of configurations employed. Equations of motion for the present ansatz, called (multi-layer) MCTDH in optimized second quantization representation, are derived. Furthermore, a scheme for the calculation of optimized unoccupied single-particle functions is given which can be used to avoid singularities in the equations of motion.

  13. Exact results in a slave boson saddle point approach for a strongly correlated electron model

    SciTech Connect

    Fresard, Raymond; Kopp, Thilo

    2008-08-15

    We revisit the Kotliar-Ruckenstein (KR) slave boson saddle point evaluation for a two-site correlated electron model. As the model can be solved analytically, it is possible to compare the KR saddle point results with the exact many-particle levels. The considered two-site cluster mimics an infinite-U single-impurity Anderson model with a nearest-neighbor Coulomb interaction: one site is strongly correlated with an infinite local Coulomb repulsion, which hybridizes with the second site, on which the local Coulomb repulsion vanishes. Making use of the flexibility of the representation, we introduce appropriate weight factors in the KR saddle point scheme. Ground-state and all excitation levels agree with the exact diagonalization results. Thermodynamics and correlation functions may be recovered in a suitably renormalized saddle point evaluation.

  14. Low quasiparticle coherence temperature in the one-band Hubbard model: A slave-boson approach

    NASA Astrophysics Data System (ADS)

    Mezio, Alejandro; McKenzie, Ross H.

    2017-07-01

    We use the Kotliar-Ruckenstein slave-boson formalism to study the temperature dependence of paramagnetic phases of the one-band Hubbard model for a variety of band structures. We calculate the Fermi liquid quasiparticle spectral weight Z and identify the temperature at which it decreases significantly, marking a crossover to a bad-metal region. Near the Mott metal-insulator transition, this coherence temperature Tcoh is much lower than the Fermi temperature of the uncorrelated Fermi gas, as is observed in a broad range of strongly correlated electron materials. After a proper rescaling of temperature and interaction, we find a universal behavior that is independent of the band structure of the system. We obtain the temperature-interaction phase diagram as a function of doping, and we compare the temperature dependence of the double occupancy, entropy, and charge compressibility with previous results obtained with dynamical mean-field theory. We analyze the stability of the method by calculating the charge compressibility.
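
    For orientation, at half filling and zero temperature the Kotliar-Ruckenstein saddle point reproduces the Gutzwiller (Brinkman-Rice) result for the quasiparticle weight, Z = 1 - (U/Uc)^2, vanishing at the Mott transition U = Uc. A minimal sketch (the value Uc = 8 below is purely illustrative):

```python
import numpy as np

def quasiparticle_weight(u, u_c):
    """Brinkman-Rice quasiparticle weight Z = 1 - (U/Uc)^2: the half-filled,
    zero-temperature paramagnetic saddle-point result; Z -> 0 signals the
    Mott transition at U = Uc (Z is clipped to 0 in the insulator)."""
    z = 1.0 - (np.asarray(u, dtype=float) / u_c) ** 2
    return np.clip(z, 0.0, 1.0)

u_c = 8.0  # illustrative critical interaction
u = np.array([0.0, 4.0, 8.0, 10.0])
print(quasiparticle_weight(u, u_c))  # Z = 1.0, 0.75, 0.0, 0.0
```

    The paper's finite-temperature calculation tracks how this weight degrades with T, defining the coherence temperature Tcoh.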

  15. Quantitative Genetic Interaction Mapping Using the E-MAP Approach

    PubMed Central

    Collins, Sean R.; Roguev, Assen; Krogan, Nevan J.

    2010-01-01

    Genetic interactions represent the degree to which the presence of one mutation modulates the phenotype of a second mutation. In recent years, approaches for measuring genetic interactions systematically and quantitatively have proven to be effective tools for unbiased characterization of gene function and have provided valuable data for analyses of evolution. Here, we present protocols for systematic measurement of genetic interactions with respect to organismal growth rate for two yeast species. PMID:20946812

  16. Improving long time behavior of Poisson bracket mapping equation: A non-Hamiltonian approach

    SciTech Connect

    Kim, Hyun Woo; Rhee, Young Min

    2014-05-14

    Understanding nonadiabatic dynamics in complex systems is a challenging subject. A series of semiclassical approaches have been proposed to tackle the problem in various settings. The Poisson bracket mapping equation (PBME) utilizes a partial Wigner transform and a mapping representation for its formulation, and has been developed to describe nonadiabatic processes in an efficient manner. Operationally, it is expressed as a set of Hamilton's equations of motion, similar to more conventional classical molecular dynamics. However, this original Hamiltonian PBME sometimes suffers from a large deviation in accuracy, especially in the long-time limit. Here, we propose a non-Hamiltonian variant of PBME to improve its behavior in that limit. As a benchmark, we simulate spin-boson and photosynthetic model systems and find that it consistently outperforms the original PBME and its Ehrenfest-style variant. We explain the source of this improvement by decomposing the components of the mapping Hamiltonian and by assessing the energy flow between the system and the bath. We discuss the strengths and weaknesses of our scheme with a view toward future prospects.

  17. Evolution of biomedical ontologies and mappings: Overview of recent approaches.

    PubMed

    Groß, Anika; Pruski, Cédric; Rahm, Erhard

    2016-01-01

    Biomedical ontologies are heavily used to annotate data, and different ontologies are often interlinked by ontology mappings. These ontology-based mappings and annotations are used in many applications and analysis tasks. Since biomedical ontologies are continuously updated, dependent artifacts can become outdated and need to undergo evolution as well. Hence there is a need for largely automated approaches to keep ontology-based mappings up to date in the presence of evolving ontologies. In this article, we survey current approaches and novel directions in the context of ontology and mapping evolution. We discuss requirements for mapping adaptation and provide a comprehensive overview of existing approaches. We further identify open challenges and outline ideas for future developments.

  18. Kondo length in bosonic lattices

    NASA Astrophysics Data System (ADS)

    Giuliano, Domenico; Sodano, Pasquale; Trombettoni, Andrea

    2017-09-01

    Motivated by the fact that the low-energy properties of the Kondo model can be effectively simulated in spin chains, we study the realization of the Kondo effect with bond impurities in ultracold bosonic lattices at half filling. After presenting a discussion of the effective theory and of the mapping of the bosonic chain onto a lattice spin Hamiltonian, we provide estimates for the Kondo length as a function of the parameters of the bosonic model. We point out that the Kondo length can be extracted from the integrated real-space correlation functions, which are experimentally accessible in cold-atom experiments.

  19. New GIS approaches to wild land mapping in Europe

    Treesearch

    Steffen Fritz; Steve Carver; Linda See

    2000-01-01

    This paper outlines modifications and new approaches to wild land mapping developed specifically for the United Kingdom and European areas. In particular, national level reconnaissance and local level mapping of wild land in the UK and Scotland are presented. A national level study for the UK is undertaken, and a local study focuses on the Cairngorm Mountains in...

  20. FEASIBILITY AND APPROACH FOR MAPPING RADON POTENTIALS IN FLORIDA

    EPA Science Inventory

    The report gives results of an analysis of the feasibility and approach for developing statewide maps of radon potentials in Florida. The maps would provide a geographic basis for implementing new radon-protective building construction standards to reduce public health risks from ...

  1. Comparison of Mixed-Model Approaches for Association Mapping

    PubMed Central

    Stich, Benjamin; Möhring, Jens; Piepho, Hans-Peter; Heckenberger, Martin; Buckler, Edward S.; Melchinger, Albrecht E.

    2008-01-01

    Association-mapping methods promise to overcome the limitations of linkage-mapping methods. The main objectives of this study were to (i) evaluate various methods for association mapping in the autogamous species wheat using an empirical data set, (ii) determine a marker-based kinship matrix using a restricted maximum-likelihood (REML) estimate of the probability of two alleles at the same locus being identical in state but not identical by descent, and (iii) compare the results of association-mapping approaches based on adjusted entry means (two-step approaches) with the results of approaches in which the phenotypic data analysis and the association analysis were performed in one step (one-step approaches). On the basis of the phenotypic and genotypic data of 303 soft winter wheat (Triticum aestivum L.) inbreds, various association-mapping methods were evaluated. Spearman's rank correlation between P-values calculated on the basis of one- and two-step association-mapping methods ranged from 0.63 to 0.93. The mixed-model association-mapping approaches using a kinship matrix estimated by REML are more appropriate for association mapping than the recently proposed QK method with respect to (i) the adherence to the nominal α-level and (ii) the adjusted power for detection of quantitative trait loci. Furthermore, we showed that our data set could be analyzed using two-step approaches of the proposed association-mapping method without substantially increasing the empirical type I error rate in comparison to the corresponding one-step approaches. PMID:18245847
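
    The marker-based kinship matrix at the heart of these mixed models can be illustrated with a simple VanRaden-type estimate; the paper's REML-based estimator of identical-in-state versus identical-by-descent probabilities is more involved. Genotypes coded 0/1/2 and the toy data below are assumptions of this sketch.

```python
import numpy as np

def kinship_matrix(genotypes):
    """Simple marker-based kinship: center the 0/1/2 genotype matrix by
    allele frequencies, then normalize (VanRaden-type estimate)."""
    G = np.asarray(genotypes, dtype=float)  # (lines, markers)
    p = G.mean(axis=0) / 2.0                # allele frequencies per marker
    W = G - 2.0 * p                         # centered genotypes
    denom = 2.0 * np.sum(p * (1.0 - p))
    return W @ W.T / denom

# Toy data: lines 0 and 1 share most alleles; line 2 is dissimilar
G = np.array([[0, 1, 2, 0],
              [0, 1, 2, 1],
              [2, 1, 0, 2]])
K = kinship_matrix(G)
print(K.shape, K[0, 1] > K[0, 2])
```

    In the mixed model, K then enters as the covariance structure of the random polygenic effect, correcting association tests for relatedness.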

  2. Approaches to mapping genetically correlated complex traits

    PubMed Central

    George, Andrew W; Basu, Saonli; Li, Na; Rothstein, Joseph H; Sieberts, Solveig K; Stewart, William; Wijsman, Ellen M; Thompson, Elizabeth A

    2003-01-01

    Our Markov chain Monte Carlo (MCMC) methods were used in linkage analyses of the Framingham Heart Study data using all available pedigrees. Our goal was to detect and map loci associated with covariate-adjusted traits log triglyceride (lnTG) and high-density lipoprotein cholesterol (HDL) using multipoint LOD score analysis, Bayesian oligogenic linkage analysis and identity-by-descent (IBD) scoring methods. Each method used all marker data for all markers on a chromosome. Bayesian linkage analysis detected a linkage signal on chromosome 7 for lnTG and HDL, corroborating previously published results. However, these results were not replicated in a classical linkage analysis of the data or by using IBD scoring methods. We conclude that Bayesian linkage analysis provides a powerful paradigm for mapping trait loci but interpretation of the Bayesian linkage signals is subjective. In the absence of a LOD score method accommodating genetically complex traits and linkage heterogeneity, validation of these signals remains elusive. PMID:14975139

  3. ModMAP: A Systematic Approach to Individualized Teacher Education

    ERIC Educational Resources Information Center

    Kranyik, Robert D.; Kielty, Joseph W.

    1974-01-01

A description of a competency-based, individualized graduate degree program, the Modular Multiple Alternative Program (ModMAP). The program focuses on the training of elementary teachers and offers an alternative approach to graduate studies. (Author)

  4. Recent developments in MAP - MODULAR APPROACH to PHYSICS

    NASA Astrophysics Data System (ADS)

    Rae, Jennifer; Austen, Dave; Brouwer, Wytze

    2002-05-01

We present recent developments in MAP - MODULAR APPROACH to PHYSICS - JAVA enhanced modules to be used as aids in teaching the first 3 terms of university physics. The MAP project is very comprehensive and consists of a modular approach to physics that utilizes JAVA applets, FLASH animations and HTML based tutorials. The overall instructional philosophy of MAP is constructivist and the project emphasizes active learner participation. In this talk we will provide a quick overview of the project and the results of recent pilot testing at several Canadian universities. We will also discuss the VIDEO LAB aspect of MAP, a component integrated into MAP that permits students to capture and evaluate on video phenomena that are otherwise difficult to study.

  5. A contact map matching approach to protein structure similarity analysis.

    PubMed

    de Melo, Raquel C; Lopes, Carlos Eduardo R; Fernandes, Fernando A; da Silveira, Carlos Henrique; Santoro, Marcelo M; Carceroni, Rodrigo L; Meira, Wagner; Araújo, Arnaldo de A

    2006-06-30

    We modeled the problem of identifying how close two proteins are structurally by measuring the dissimilarity of their contact maps. These contact maps are colored images, in which the chromatic information encodes the chemical nature of the contacts. We studied two conceptually distinct image-processing algorithms to measure the dissimilarity between these contact maps; one was a content-based image retrieval method, and the other was based on image registration. In experiments with contact maps constructed from the protein data bank, our approach was able to identify, with greater than 80% precision, instances of monomers of apolipoproteins, globins, plastocyanins, retinol binding proteins and thioredoxins, among the monomers of Protein Data Bank Select. The image registration approach was only slightly more accurate than the content-based image retrieval approach.
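Contact maps of the kind compared above are conventionally derived by thresholding pairwise residue distances. A minimal binary sketch (the Cα representation and the 8 Å cutoff are common conventions assumed here, not the authors' exact chromatic encoding):

```python
import numpy as np

def contact_map(coords, cutoff=8.0):
    """Binary contact map: entry (i, j) is 1 when residues i and j lie
    within `cutoff` angstroms of each other."""
    diff = coords[:, None, :] - coords[None, :, :]   # pairwise displacement vectors
    dist = np.sqrt((diff ** 2).sum(axis=-1))         # N x N distance matrix
    return (dist < cutoff).astype(np.uint8)

# Toy "backbone": four residues along a line, 4 angstroms apart
coords = np.array([[0.0, 0, 0], [4.0, 0, 0], [8.0, 0, 0], [12.0, 0, 0]])
cm = contact_map(coords)
print(cm)
```

The paper's colored maps additionally encode the chemical nature of each contact in place of the 0/1 entries; the thresholding step, however, is the same.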

  6. Optogenetic Approaches for Mesoscopic Brain Mapping.

    PubMed

    Kyweriga, Michael; Mohajerani, Majid H

    2016-01-01

Recent advances in identifying genetically unique neuronal proteins have revolutionized the study of brain circuitry. Researchers are now able to insert specific light-sensitive proteins (opsins) into a wide range of specific cell types via viral injections or by breeding transgenic mice. These opsins enable the activation, inhibition, or modulation of neuronal activity with millisecond control within distinct brain regions defined by genetic markers. Here we present a useful guide to implementing this technique in any lab. We first review the materials needed and practical considerations and provide in-depth instructions for acute surgeries in mice. We conclude with all-optical mapping techniques for simultaneous recording and manipulation of population activity of many neurons in vivo by combining arbitrary point optogenetic stimulation and regional voltage-sensitive dye imaging. It is our intent to make these methods available to anyone wishing to use them.

  7. Driven Boson Sampling

    NASA Astrophysics Data System (ADS)

    Barkhofen, Sonja; Bartley, Tim J.; Sansoni, Linda; Kruse, Regina; Hamilton, Craig S.; Jex, Igor; Silberhorn, Christine

    2017-01-01

Sampling the distribution of bosons that have undergone a random unitary evolution is strongly believed to be a computationally hard problem. Key to outperforming classical simulations of this task is to increase both the number of input photons and the size of the network. We propose driven boson sampling, in which photons are input within the network itself, as a means to approach this goal. We show that the mean number of photons entering a boson sampling experiment can exceed one photon per input mode, while maintaining the required complexity, potentially leading to less stringent requirements on the input states for such experiments. When using heralded single-photon sources based on parametric down-conversion, this approach offers an ~e-fold enhancement in the input state generation rate over scattershot boson sampling, reaching the scaling limit for such sources. This approach also offers a dramatic increase in the signal-to-noise ratio with respect to higher-order photon generation from such probabilistic sources, which removes the need for photon number resolution during the heralding process as the size of the system increases.

  8. Spiral magnetism in the single-band Hubbard model: the Hartree-Fock and slave-boson approaches

    NASA Astrophysics Data System (ADS)

    Igoshev, P. A.; Timirgazin, M. A.; Gilmutdinov, V. F.; Arzhnikov, A. K.; Irkhin, V. Yu

    2015-11-01

The ground-state magnetic phase diagram is investigated within the single-band Hubbard model for square and different cubic lattices. The results of employing the generalized non-correlated mean-field (Hartree-Fock) approximation and generalized slave-boson approach by Kotliar and Ruckenstein with correlation effects included are compared. We take into account commensurate ferromagnetic, antiferromagnetic, and incommensurate (spiral) magnetic phases, as well as phase separation into magnetic phases of different types, which was often lacking in previous investigations. It is found that the spiral states and especially ferromagnetism are generally strongly suppressed up to unrealistically large Hubbard U by the correlation effects if nesting is absent and van Hove singularities are well away from the paramagnetic phase Fermi level. The magnetic phase separation plays an important role in the formation of magnetic states, the corresponding phase regions being especially wide in the vicinity of half-filling. The details of non-collinear and collinear magnetic ordering for different cubic lattices are discussed.

  9. A capacity mapping approach to public health training resources.

    PubMed

    Dato, Virginia M; Potter, Margaret A; Fertman, Carl I; Pistella, Christine L

    2002-01-01

    The capacity mapping approach can be used to identify existing community resources. As part of this approach, inventories are used to provide information for a capacity map. The authors describe the development of two inventories and a capacity map for public health workforce development. For the first inventory, the authors contacted 754 institutions to determine available public health training resources; 191 institutions reported resources, including 126 directly providing distance learning technologies and courses or modules addressing important competency domains. Distance learning technologies included video conferencing facilities (61%) and satellite download facilities (50%). For the second inventory, the authors obtained information on 129 distance-accessible public health training modules. The workforce development capacity map produced from these two inventories revealed substantial resources available for use by individuals or agencies wishing to improve training in public health competencies.

  10. A Nonparametric Approach for Mapping Quantitative Trait Loci

    PubMed Central

    Kruglyak, L.; Lander, E. S.

    1995-01-01

    Genetic mapping of quantitative trait loci (QTLs) is performed typically by using a parametric approach, based on the assumption that the phenotype follows a normal distribution. Many traits of interest, however, are not normally distributed. In this paper, we present a nonparametric approach to QTL mapping applicable to any phenotypic distribution. The method is based on a statistic Z(w), which generalizes the nonparametric Wilcoxon rank-sum test to the situation of whole-genome search by interval mapping. We determine the appropriate significance level for the statistic Z(w), by showing that its asymptotic null distribution follows an Ornstein-Uhlenbeck process. These results provide a robust, distribution-free method for mapping QTLs. PMID:7768449
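The statistic Z(w) generalizes the classical Wilcoxon rank-sum comparison of phenotype ranks between the two marker-genotype groups. A minimal sketch of that underlying rank-sum (synthetic data; this shows only the classical statistic, not the paper's interval-mapping extension or its Ornstein-Uhlenbeck significance thresholds):

```python
import numpy as np

def rank_sum(pheno, geno):
    """Sum of phenotype ranks over individuals carrying allele 1 at the marker.
    Ties are assumed absent for simplicity."""
    ranks = np.argsort(np.argsort(pheno)) + 1  # ranks 1..n
    return int(ranks[geno == 1].sum())

# Toy cross: individuals carrying allele 1 tend to have higher phenotypes
pheno = np.array([2.1, 5.3, 3.7, 8.2, 1.0, 6.6])
geno  = np.array([0,   1,   0,   1,   0,   1  ])
print(rank_sum(pheno, geno))  # → 15 (ranks 4 + 6 + 5)
```

A rank sum far from its null expectation n1(n+1)/2 signals a marker-phenotype association regardless of the phenotype's distribution, which is the distribution-free property the paper exploits.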

  11. Projective mapping based on choice or preference: An affective approach to projective mapping.

    PubMed

    Varela, Paula; Berget, Ingunn; Hersleth, Margrethe; Carlehög, Mats; Asioli, Daniele; Næs, Tormod

    2017-10-01

    This work explores a new affective approach to projective mapping, based on consumers' choices or preferences. Two sessions, one week apart, were performed with the same consumers, using whole bread as a case study. Overall liking ratings (OL) were gathered in blind conditions and samples were also profiled by a trained panel using generic descriptive analysis. Three projective mapping tests were performed in different scenarios. Consumers' categorization and product descriptions were explored when consumers based their positioning on the products' similarities and differences (analytical approach, "classic napping") both in blind and informed conditions, and when consumers were focusing on their preference or choice (affective approach). The affective approach to projective mapping successfully revealed consumers' drivers of liking and choice from a holistic perspective, where consumers summarized their main drivers for categorizing products as they would do when choosing in real life situations, based on their preferences. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Tank Update System: A novel asset mapping approach for verifying and updating lakes using Google Maps

    NASA Astrophysics Data System (ADS)

    Reddy Pulsani, Bhaskar

    2016-06-01

Mission Kakatiya is one of the prestigious programs of the Telangana state government, under which the restoration of tanks across ten districts is being implemented. As part of the program, the government plans to restore about 9,000 lakes. Therefore, to compile a comprehensive list of the lakes existing in Telangana state, the Samagra Tank Survey was carried out. The data collected in this survey covered about 45,000 tanks. Since the data were not collected in a standard format and were maintained in Excel, a web interface was created to fill the gaps and to standardise the data. A new approach to spatially identifying the lakes through Google Maps was successfully implemented by developing a web interface. This approach is novel in that it applies asset mapping to the lakes of Telangana state, and it demonstrates the advantages of using online mapping applications such as Google Maps for identifying and cross-checking already existing lakes.

  13. Beyond mean-field dynamics of ultra-cold bosonic atoms in higher dimensions: facing the challenges with a multi-configurational approach

    NASA Astrophysics Data System (ADS)

    Bolsinger, V. J.; Krönke, S.; Schmelcher, P.

    2017-02-01

Exploring the impact of dimensionality on the quantum dynamics of interacting bosons in traps including particle correlations is an interesting but challenging task. Due to the different participating length scales, the modelling of the short-range interactions in three dimensions plays a special role. We review different approaches for the latter and elaborate that for multi-configurational computational strategies, finite-range potentials are adequate, resulting in the need for large grids to resolve the relevant length scales. This results in computational challenges, which include the exponential scaling of complexity with the number of atoms. We show that the recently developed ab initio multi-layer multi-configurational time-dependent Hartree method for bosons (ML-MCTDHB) (2013 J. Chem. Phys. 139 134103) can meet both numerical challenges, and we present an efficient numerical implementation of ML-MCTDHB in three spatial dimensions, particularly suited to describe the quantum dynamics for elongated traps. The beneficial scaling of our approach is demonstrated by studying the tunnelling dynamics of bosonic ensembles in a double well. Comparing three-dimensional with quasi-one-dimensional simulations, we find dimensionality-induced effects in the density. Furthermore, we study the crossover from weak transversal confinement, where a mean-field description of the system is sufficient, towards tight transversal confinement, where particle correlations and beyond mean-field effects are pronounced.

  14. Spin-boson dynamics: A unified approach from weak to strong coupling

    NASA Astrophysics Data System (ADS)

    Nesi, F.; Paladino, E.; Thorwart, M.; Grifoni, M.

    2007-11-01

    We present a novel approximation scheme to describe the influence of a harmonic bath on the dynamics of a two-level particle over almost the whole regime of temperatures and coupling to the environment, for a wide class of bath spectral densities. Starting from the exact path integral solution for the two-level system density matrix, effective intra-blip correlations are fully included, while inter-blip and blip-sojourn interactions are considered up to first order. In the proper regimes, an excellent agreement with conventional perturbative approaches and ab initio path-integral results is found.

  15. Phase diagram of ultracold atoms in optical lattices: Comparative study of slave fermion and slave boson approaches to Bose-Hubbard model

    SciTech Connect

    Yu Yue; Chui, S. T.

    2005-03-01

We perform a comparative study of the finite temperature behavior of ultracold Bose atoms in optical lattices by the slave fermion and the slave boson approaches to the Bose-Hubbard model. The phase diagram of the system is presented. Although both approaches are equivalent without approximations, the mean field theory based on the slave fermion technique is quantitatively more appropriate. Conceptually, the slave fermion approach automatically excludes the double occupancy of two identical fermions on the same lattice site. By comparing to known results in limiting cases, we find the slave fermion approach better than the slave boson approach. For example, in the non-interacting limit, the critical temperature of the superfluid-normal liquid transition calculated by the slave fermion approach is closer to the well-known ideal Bose gas result. In the zero-temperature limit, the critical interaction strength from the slave fermion approach is also closer to that from the direct calculation using a zero-temperature mean field theory.

  16. Associated production of a quarkonium and a Z boson at one loop in a quark-hadron-duality approach

    NASA Astrophysics Data System (ADS)

    Lansberg, Jean-Philippe; Shao, Hua-Sheng

    2016-10-01

In view of the large discrepancy between the ATLAS data at √s = 8 TeV and theoretical predictions for Single Parton Scattering (SPS) contributions to the associated production of a prompt J/ψ and a Z boson, we perform an evaluation of the corresponding cross section at one-loop accuracy (Next-to-Leading Order, NLO) in a quark-hadron-duality approach, also known as the Colour-Evaporation Model (CEM). This work is motivated by (i) the extremely disparate predictions based on the existing NRQCD fits combined with the absence of a full NLO NRQCD computation and (ii) the fact that we believe that such an evaluation provides a likely upper limit of the SPS cross section. In addition to these theory improvements, we argue that the ATLAS estimation of the Double Parton Scattering (DPS) yield may be underestimated by a factor as large as 3, which then reduces the size of the SPS yield extracted from the ATLAS data. Our NLO SPS evaluation also allows us to set an upper limit on σ_eff driving the size of the DPS yield. Overall, the discrepancy between theory and experiment may be smaller than expected, which calls for further analyses by ATLAS and CMS, for which we provide predictions, and for full NLO computations in other models. As an interesting side product of our analysis, we have performed the first NLO computation of dσ/dP_T for prompt single-J/ψ production in the CEM, from which we have fit the CEM non-perturbative parameter at NLO using the most recent ATLAS data.

  17. Interacting Boson Model and nucleons

    NASA Astrophysics Data System (ADS)

    Otsuka, Takaharu

    2012-10-01

An overview of the recent development of the microscopic derivation of the Interacting Boson Model is presented with some remarks not found elsewhere. The OAI mapping is reviewed very briefly, including the basic correspondence from nucleon-pair to boson. The new fermion-boson mapping method is introduced, where intrinsic states of nucleons and bosons for a wide variation of shapes play an important role. Nucleon intrinsic states are obtained from mean field models, which is the Skyrme model in the examples to be shown. This method generates an IBM-2 Hamiltonian which can describe and predict various situations of quadrupole collective states, including U(5), SU(3), O(6) and E(5) limits. The method is extended so that rotational response (cranking) can be handled, which enables us to describe rotational bands of strongly deformed nuclei. Thus, we have obtained a unified framework for the microscopic derivation of the IBM covering all known situations of quadrupole collectivity at low energy.

  18. A linear programming approach for optimal contrast-tone mapping.

    PubMed

    Wu, Xiaolin

    2011-05-01

    This paper proposes a novel algorithmic approach of image enhancement via optimal contrast-tone mapping. In a fundamental departure from the current practice of histogram equalization for contrast enhancement, the proposed approach maximizes expected contrast gain subject to an upper limit on tone distortion and optionally to other constraints that suppress artifacts. The underlying contrast-tone optimization problem can be solved efficiently by linear programming. This new constrained optimization approach for image enhancement is general, and the user can add and fine tune the constraints to achieve desired visual effects. Experimental results demonstrate clearly superior performance of the new approach over histogram equalization and its variants.
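A drastically simplified version of such a contrast-tone LP can be set up with `scipy.optimize.linprog`: maximize the expected contrast gain Σ p_j·s_j over the step sizes s_j of the tone-transfer function, subject to the steps filling the output range, with a per-step cap standing in for the paper's tone-distortion constraint. The toy histogram, the cap, and this reduced formulation are illustrative assumptions, not the paper's exact program:

```python
import numpy as np
from scipy.optimize import linprog

L = 8                                                   # number of gray levels (toy)
p = np.array([.05, .05, .30, .30, .20, .05, .03, .02])  # histogram probabilities
cap = 2.0                                               # max step: crude distortion limit

# linprog minimizes, so negate the contrast-gain objective sum(p_j * s_j)
res = linprog(c=-p,
              A_ub=np.ones((1, L)), b_ub=[L - 1.0],     # steps fill the output range
              bounds=[(0.0, cap)] * L, method="highs")
steps = res.x
tone_curve = np.concatenate([[0.0], np.cumsum(steps)])  # transfer function T(k)
print(np.round(steps, 2))
```

The solution allocates the largest allowed steps to the most populated gray levels, which is exactly the "contrast where the pixels are" behavior that distinguishes this approach from plain histogram equalization.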

  19. Technology Mapping: An Approach for Developing Technological Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    Technology mapping[TM] is proposed as an approach for developing technological pedagogical content knowledge (TPCK). The study discusses in detail instructional design guidelines in relation to the enactment of TM, and reports on empirical findings from a study with 72 pre-service primary teachers within the context of teaching them how to teach…

  1. An improved probability mapping approach to assess genome mosaicism

    PubMed Central

    Zhaxybayeva, Olga; Gogarten, J Peter

    2003-01-01

Background Maximum likelihood and posterior probability mapping are useful visualization techniques that are used to ascertain the mosaic nature of prokaryotic genomes. However, posterior probabilities, especially when calculated for four-taxon cases, tend to overestimate the support for tree topologies. Furthermore, because of poor taxon sampling four-taxon analyses suffer from sensitivity to the long branch attraction artifact. Here we extend the probability mapping approach by improving taxon sampling of the analyzed datasets, and by using bootstrap support values, a more conservative tool to assess reliability. Results Quartets of orthologous proteins were complemented with homologs from selected reference genomes. The mapping of bootstrap support values from these extended datasets gives results similar to the original maximum likelihood and posterior probability mapping. The more conservative nature of the plotted support values allows us to focus further analyses on those protein families that strongly disagree with the majority or plurality of genes present in the analyzed genomes. Conclusion Posterior probability is a non-conservative measure for support, and posterior probability mapping only provides a quick estimation of phylogenetic information content of four genomes. This approach can be utilized as a pre-screen to select genes that might have been horizontally transferred. Better taxon sampling combined with subtree analyses prevents the inconsistencies associated with four-taxon analyses, but retains the power of visual representation. Nevertheless, a case-by-case inspection of individual multi-taxon phylogenies remains necessary to differentiate unrecognized paralogy and shared phylogenetic reconstruction artifacts from horizontal gene transfer events. PMID:12974984

  2. An automated approach to mapping external terminologies to the UMLS.

    PubMed

    Taboada, María; Lalín, Rosario; Martínez, Diego

    2009-06-01

    Nowadays, providing interoperability between different biomedical terminologies is a critical issue for efficient information sharing. One problem making interoperability difficult is the lack of automated methods simplifying the mapping process. In this study, we propose an automated approach to mapping external terminologies to the Unified Medical Language System (UMLS). Our approach applies a sequential combination of two basic matching methods classically used in ontology matching. First, a lexical technique identifies similar strings between the external terminology and the UMLS. Second, a structure-based technique validates, in part, the lexical alignment by computing paths to top-level concepts and checking the compatibility of these top-level concepts across the external terminology and the UMLS. The method was applied to the mapping of the large-scale biomedical thesaurus EMTREE to the complete UMLS Metathesaurus. In total, 47.9% coverage of EMTREE terms was reached, leading to 80% coverage of EMTREE concepts. Our method has revealed a high compatibility in 6 out of 15 top-level categories across terminologies. The validation of lexical mappings ranges over 75.8% of the total lexical alignment. Overall, the method rules out a total of 6927 (7.9%) lexical mappings, with a global precision of 78%.
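The first, lexical stage of such a terminology-mapping pipeline can be approximated by normalized string comparison. A minimal sketch (the normalization rules and the toy term lists are assumptions for illustration; the actual study adds a structural validation stage over top-level concepts):

```python
def normalize(term):
    """Lowercase, strip punctuation, and sort tokens so word order is ignored."""
    tokens = "".join(ch if ch.isalnum() else " " for ch in term.lower()).split()
    return " ".join(sorted(tokens))

def lexical_matches(source_terms, target_terms):
    """Map each source term to a target term with the same normalized form."""
    index = {normalize(t): t for t in target_terms}
    return {s: index[normalize(s)] for s in source_terms if normalize(s) in index}

# Hypothetical EMTREE and UMLS term samples
emtree = ["Heart Failure", "myocardial infarction", "liver cirrhosis"]
umls   = ["Failure, Heart", "Myocardial Infarction"]
print(lexical_matches(emtree, umls))
```

Token sorting catches inverted UMLS forms such as "Failure, Heart"; the unmatched term would fall through to manual review, mirroring the partial coverage figures reported in the abstract.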

  3. An automated approach to mapping corn from Landsat imagery

    USGS Publications Warehouse

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.
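The areal-estimate idea above can be sketched as: rank pixels by a corn-likeness score, then choose class thresholds so the mapped area brackets the reported agricultural acreage. The scores, the pixel-count budget, and the ±20% band are illustrative assumptions, not the authors' calibration:

```python
import numpy as np

def corn_likelihood(scores, areal_estimate_px, band=0.2):
    """Label pixels 2 = 'highly likely corn', 1 = 'likely corn', 0 = 'unlikely corn'.
    The 'highly likely' class fills (1 - band) of the areal estimate; 'likely'
    extends it to (1 + band), bracketing the reported acreage."""
    order = np.argsort(scores)[::-1]               # best corn candidates first
    labels = np.zeros(scores.size, dtype=np.uint8)
    n_high = int((1 - band) * areal_estimate_px)
    n_like = int((1 + band) * areal_estimate_px)
    labels[order[:n_high]] = 2
    labels[order[n_high:n_like]] = 1
    return labels

# Toy scene of six pixels with a reported corn area of three pixels
scores = np.array([0.9, 0.1, 0.8, 0.4, 0.7, 0.2])
labels = corn_likelihood(scores, areal_estimate_px=3)
print(labels)  # → [2 0 2 0 1 0]
```

Because the thresholds come from published areal statistics rather than field visits, the procedure needs no ground reference data, which is what makes it automatable over many Landsat scenes.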

  4. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    NASA Astrophysics Data System (ADS)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

    Spatial information on physical soil properties is intensely expected, in order to support environmental related and land use management decisions. One of the most widely used properties to characterize soils physically is particle size distribution (PSD), which determines soil water management and cultivability. According to their size, different particles can be categorized as clay, silt, or sand. The size intervals are defined by national or international textural classification systems. The relative percentage of sand, silt, and clay in the soil constitutes textural classes, which are also specified miscellaneously in various national and/or specialty systems. The most commonly used is the classification system of the United States Department of Agriculture (USDA). Soil texture information is essential input data in meteorological, hydrological and agricultural prediction modelling. Although Hungary has a great deal of legacy soil maps and other relevant soil information, it often occurs, that maps do not exist on a certain characteristic with the required thematic and/or spatial representation. The recent developments in digital soil mapping (DSM), however, provide wide opportunities for the elaboration of object specific soil maps (OSSM) with predefined parameters (resolution, accuracy, reliability etc.). Due to the simultaneous richness of available Hungarian legacy soil data, spatial inference methods and auxiliary environmental information, there is a high versatility of possible approaches for the compilation of a given soil map. This suggests the opportunity of optimization. For the creation of an OSSM one might intend to identify the optimum set of soil data, method and auxiliary co-variables optimized for the resources (data costs, computation requirements etc.). We started comprehensive analysis of the effects of the various DSM components on the accuracy of the output maps on pilot areas. The aim of this study is to compare and evaluate different

  5. Noise pollution mapping approach and accuracy on landscape scales.

    PubMed

    Iglesias Merchan, Carlos; Diaz-Balteiro, Luis

    2013-04-01

Noise mapping allows the characterization of environmental variables, such as noise pollution or soundscape, depending on the task. Strategic noise mapping (as per Directive 2002/49/EC, 2002) is a tool intended for the assessment of noise pollution at the European level every five years. These maps are based on common methods and procedures intended for human exposure assessment in the European Union that could also be adapted for assessing environmental noise pollution in natural parks. However, given the size of such areas, there could be an alternative approach to soundscape characterization rather than using human noise exposure procedures. It is possible to optimize the size of the mapping grid used for such work by taking into account the attributes of the area to be studied and the desired outcome. This would then optimize the mapping time and the cost. This type of optimization is important in noise assessment as well as in the study of other environmental variables. This study compares 15 models, using different grid sizes, to assess the accuracy of the noise mapping of the road traffic noise at a landscape scale, with respect to noise and landscape indicators. In a study area located in the Manzanares High River Basin Regional Park in Spain, different accuracy levels (Kappa index values from 0.725 to 0.987) were obtained depending on the terrain and noise source properties. The time taken for the calculations and the noise mapping accuracy results reveal the potential for setting the map resolution in line with decision-makers' criteria and budget considerations.
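The Kappa index used above to score map agreement corrects raw pixel-wise accuracy for the agreement expected by chance. A minimal sketch of Cohen's kappa for two classified rasters (the toy label arrays are illustrative):

```python
import numpy as np

def cohens_kappa(map_a, map_b):
    """Cohen's kappa between two equally sized classification maps."""
    a, b = np.ravel(map_a), np.ravel(map_b)
    classes = np.union1d(a, b)
    p_o = np.mean(a == b)                          # observed agreement
    p_e = sum(np.mean(a == c) * np.mean(b == c)    # chance agreement
              for c in classes)
    return (p_o - p_e) / (1 - p_e)

# Toy reference map vs. a coarser-grid model's output
ref  = np.array([0, 0, 1, 1, 2, 2, 2, 0])
pred = np.array([0, 0, 1, 2, 2, 2, 2, 0])
print(round(cohens_kappa(ref, pred), 3))  # → 0.805
```

Kappa values near 1 (such as the 0.987 reported for some grids) mean near-perfect agreement beyond chance, while lower values (0.725) indicate that the coarser grid is losing spatially structured detail.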

  6. Einstein's Gravitational Field Approach to Dark Matter and Dark Energy-Geometric Particle Decay into the Vacuum Energy Generating Higgs Boson and Heavy Quark Mass

    NASA Astrophysics Data System (ADS)

    Christensen, Walter James

    2015-08-01

During an interview at the Niels Bohr Institute David Bohm stated, "according to Einstein, particles should eventually emerge as singularities, or very strong regions of stable pulses of (the gravitational) field" [1]. Starting from this premise, we show spacetime, indeed, manifests stable pulses (n-valued gravitons) that decay into the vacuum energy to generate all three boson masses (including Higgs), as well as heavy-quark mass; and all in precise agreement with the 2010 CODATA report on fundamental constants. Furthermore, our relativized quantum physics approach (RQP) addresses the mysteries surrounding dark energy, dark matter, accelerated spacetime, and why ordinary matter dominates over antimatter.

  7. A Bayesian approach to traffic light detection and mapping

    NASA Astrophysics Data System (ADS)

    Hosseinyalamdary, Siavash; Yilmaz, Alper

    2017-03-01

Automatic traffic light detection and mapping is an open research problem. The traffic lights vary in color, shape, geolocation, activation pattern, and installation, which complicate their automated detection. In addition, the image of the traffic lights may be noisy, overexposed, underexposed, or occluded. In order to address this problem, we propose a Bayesian inference framework to detect and map traffic lights. In addition to the spatio-temporal consistency constraint, traffic light characteristics such as color, shape and height are shown to further improve the accuracy of the proposed approach. The proposed approach has been evaluated on two benchmark datasets and has been shown to outperform earlier studies. The results show that the precision and recall rates for the KITTI benchmark are 95.78% and 92.95%, respectively, and the precision and recall rates for the LARA benchmark are 98.66% and 94.65%, respectively.
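Precision and recall figures like those quoted above follow directly from the detection counts. A minimal sketch (the counts are toy values, not the KITTI or LARA benchmark data):

```python
def precision_recall(tp, fp, fn):
    """Precision = TP / (TP + FP); recall = TP / (TP + FN)."""
    return tp / (tp + fp), tp / (tp + fn)

# Toy example: 45 correct detections, 5 false alarms, 10 missed lights
precision, recall = precision_recall(tp=45, fp=5, fn=10)
print(f"precision={precision:.2%}  recall={recall:.2%}")
```

Precision penalizes false alarms (spurious detections), recall penalizes missed lights; reporting both, as the paper does per benchmark, exposes the trade-off a single accuracy number would hide.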

  8. Symmetry-improved 2PI approach to the Goldstone-boson IR problem of the SM effective potential

    NASA Astrophysics Data System (ADS)

    Pilaftsis, Apostolos; Teresi, Daniele

    2016-05-01

    The effective potential of the Standard Model (SM), from three loop order and higher, suffers from infrared (IR) divergences arising from quantum effects due to massless would-be Goldstone bosons associated with the longitudinal polarizations of the W± and Z bosons. Such IR pathologies also hinder accurate evaluation of the two-loop threshold corrections to electroweak quantities, such as the vacuum expectation value of the Higgs field. However, these divergences are an artifact of perturbation theory, and therefore need to be consistently resummed in order to obtain an IR-safe effective potential. The so-called Two-Particle-Irreducible (2PI) effective action provides a rigorous framework to consistently perform such resummations, without the need to resort to ad hoc subtractions or running into the risk of over-counting contributions. By considering the recently proposed symmetry-improved 2PI formalism, we address the problem of the Goldstone-boson IR divergences of the SM effective potential in the gaugeless limit of the theory. In the same limit, we evaluate the IR-safe symmetry-improved 2PI effective potential, after taking into account quantum loops of chiral fermions, as well as the renormalization of spurious custodially breaking effects triggered by fermionic Yukawa interactions. Finally, we compare our results with those obtained with other methods presented in the literature.

  9. Wildfire susceptibility mapping: comparing deterministic and stochastic approaches

    NASA Astrophysics Data System (ADS)

    Pereira, Mário; Leuenberger, Michael; Parente, Joana; Tonini, Marj

    2016-04-01

    Estimating the probability of wildfire occurrence in a certain area under particular environmental conditions is a modern tool to support forest protection plans and to reduce the consequences of fires. This can be performed through wildfire susceptibility mapping, normally achieved by employing more or less sophisticated models which combine the predisposing variables (as raster datasets) within a geographic information system (GIS). The selection of the appropriate variables includes the evaluation of success and the implementation of prediction curves, as well as independent probabilistic validations for different scenarios. These methods allow one to define the spatial pattern of wildfire occurrences, characterize the susceptibility of the territory, namely for specific fire causes/types, and can also account for other factors such as human behavior and social aspects. We selected Portugal as the study region which, due to its favorable climatic, topographic and vegetation conditions, is by far the European country most affected by wildfires. In addition, Verde and Zêzere (2010) performed a first assessment and validation of wildfire susceptibility and hazard in Portugal which can be used as a benchmark. The objectives of the present study comprise: (1) assessing the structural forest fire risk in Portugal using updated datasets, namely with higher spatial resolution (from 80 m to 25 m), the most recent vegetation cover (Corine Land Cover), and a longer fire history (1975-2013); and (2) comparing linear vs non-linear approaches for wildfire susceptibility mapping. The data we used include: (i) a DEM derived from the Shuttle Radar Topographic Mission at a resolution of 1 arc-second (DEM-SRTM 25 m) to assess elevation and slope; (ii) the Corine Land Cover inventory provided by the European Environment Agency (http://www.eea.europa.eu/pt) to produce the land use land cover map; (iii) the National Mapping Burnt Areas (NMBA) provided by the Institute for the

  10. A new approach to reduce the mapping error of landslide inventory maps

    NASA Astrophysics Data System (ADS)

    Santangelo, Michele; Marchesini, Ivan; Bucci, Francesco; Cardinali, Mauro; Rossi, Mauro; Taylor, Faith; Malamud, Bruce; Guzzetti, Fausto

    2013-04-01

    Landslide inventory maps are key in documenting the type and extent of mass movements in local to regional areas, for both geomorphological studies and landslide hazard assessment. Geomorphologists usually prepare landslide inventories by aerial photo interpretation (API) of stereoscopic images aided by field surveys. Criteria adopted for visual image analyses are derived from the heuristic interpretation of photographic and morphological features of the image, such as shape, size, color tone, texture and pattern. The established (traditional) procedure for transferring photo-interpreted information to a GIS environment involves the manual drawing of information from the aerial photograph to the topographic base map. In this stage, mapping (i.e., positioning, shape, size) errors can occur due to (i) the change in scale, from the aerial photographs to the topographic map, (ii) object deformation in the stereoscopic model, due to the vertical exaggeration and the conical projection of the aerial photographs, (iii) differences in topography in the different cartographic media (aerial photographs and base maps). We recently developed a method to reduce mapping errors which exploits the ortho-rectification of the aerial photograph and the photo-interpreted thematic layers, thus avoiding manual transfer of information to the topographic map. The technique was evaluated in a test area of about 50 km² near Taormina (Sicily, Southern Italy), where the information concerning mass movement was transferred to two inventory maps using the traditional and the ortho-rectification technique. More than 500 landslide pairs have been compared in this test region, ranging in landslide area between 10² and 10⁷ m². The mapping error associated with the mapped features has been evaluated by calculating the mismatch index for each landslide pair as E = [(A ∪ B) − (A ∩ B)]/(A ∪ B), where A is a landslide of the inventory obtained using the manual drawing approach and B is a
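
    The mismatch index E = (|A ∪ B| − |A ∩ B|)/|A ∪ B| can be computed directly on rasterized landslide footprints; a minimal sketch treating each mapped landslide as a set of grid cells:

```python
def mismatch_index(a, b):
    """E = (|A ∪ B| - |A ∩ B|) / |A ∪ B|: 0 for identical footprints, 1 for disjoint ones."""
    union = a | b
    inter = a & b
    return (len(union) - len(inter)) / len(union)

# Two overlapping footprints as sets of (row, col) grid cells.
a = {(0, 0), (0, 1), (1, 0), (1, 1)}
b = {(0, 1), (1, 1), (0, 2), (1, 2)}
e = mismatch_index(a, b)   # union has 6 cells, intersection 2, so E = 4/6
```

    In practice A and B would be the rasterized polygons of the same landslide in the two inventories.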

  11. A corpus-based approach for automated LOINC mapping.

    PubMed

    Fidahussein, Mustafa; Vreeman, Daniel J

    2014-01-01

    To determine whether the knowledge contained in a rich corpus of local terms mapped to LOINC (Logical Observation Identifiers Names and Codes) could be leveraged to help map local terms from other institutions. We developed two models to test our hypothesis. The first, based on supervised machine learning, was created using Apache's OpenNLP Maxent, and the second, based on information retrieval, was created using Apache's Lucene. The models were validated by a random subsampling method that was repeated 20 times and that used 80/20 splits for training and testing, respectively. We also evaluated the performance of these models on all laboratory terms from three test institutions. For the 20 iterations used for validation of our 80/20 splits, Maxent and Lucene ranked the correct LOINC code first for between 70.5% and 71.4% and between 63.7% and 65.0% of local terms, respectively. For all laboratory terms from the three test institutions, Maxent ranked the correct LOINC code first for between 73.5% and 84.6% (mean 78.9%) of local terms, whereas Lucene's performance was between 66.5% and 76.6% (mean 71.9%). Using a cut-off score of 0.46, Maxent always ranked the correct LOINC code first for over 57% of local terms. This study showed that a rich corpus of local terms mapped to LOINC contains collective knowledge that can help map terms from other institutions. Using freely available software tools, we developed a data-driven automated approach that operates on term descriptions from existing mappings in the corpus. Accurate and efficient automated mapping methods can help to accelerate adoption of vocabulary standards and promote widespread health information exchange.
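
    The information-retrieval side of this approach can be sketched as a bag-of-words cosine ranking of corpus term descriptions against a new local term (a toy stand-in for Lucene's scoring; the two-entry corpus and its descriptions below are illustrative):

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two token-count vectors (Counters)."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_candidates(local_term, corpus):
    """corpus: {loinc_code: description}; returns candidate codes best-first."""
    q = Counter(local_term.lower().split())
    scored = [(cosine(q, Counter(desc.lower().split())), code)
              for code, desc in corpus.items()]
    return [code for _, code in sorted(scored, reverse=True)]

corpus = {
    "2345-7": "glucose mass volume in serum or plasma",
    "718-7":  "hemoglobin mass volume in blood",
}
best = rank_candidates("serum glucose", corpus)[0]
```

    The study's actual models add supervised learning (Maxent) and Lucene's more sophisticated scoring, but the ranking-by-textual-similarity idea is the same.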

  12. A corpus-based approach for automated LOINC mapping

    PubMed Central

    Fidahussein, Mustafa; Vreeman, Daniel J

    2014-01-01

    Objective To determine whether the knowledge contained in a rich corpus of local terms mapped to LOINC (Logical Observation Identifiers Names and Codes) could be leveraged to help map local terms from other institutions. Methods We developed two models to test our hypothesis. The first based on supervised machine learning was created using Apache's OpenNLP Maxent and the second based on information retrieval was created using Apache's Lucene. The models were validated by a random subsampling method that was repeated 20 times and that used 80/20 splits for training and testing, respectively. We also evaluated the performance of these models on all laboratory terms from three test institutions. Results For the 20 iterations used for validation of our 80/20 splits Maxent and Lucene ranked the correct LOINC code first for between 70.5% and 71.4% and between 63.7% and 65.0% of local terms, respectively. For all laboratory terms from the three test institutions Maxent ranked the correct LOINC code first for between 73.5% and 84.6% (mean 78.9%) of local terms, whereas Lucene's performance was between 66.5% and 76.6% (mean 71.9%). Using a cut-off score of 0.46 Maxent always ranked the correct LOINC code first for over 57% of local terms. Conclusions This study showed that a rich corpus of local terms mapped to LOINC contains collective knowledge that can help map terms from other institutions. Using freely available software tools, we developed a data-driven automated approach that operates on term descriptions from existing mappings in the corpus. Accurate and efficient automated mapping methods can help to accelerate adoption of vocabulary standards and promote widespread health information exchange. PMID:23676247

  13. Connectome-based lesion-symptom mapping (CLSM): A novel approach to map neurological function.

    PubMed

    Gleichgerrcht, Ezequiel; Fridriksson, Julius; Rorden, Chris; Bonilha, Leonardo

    2017-01-01

    Lesion-symptom mapping is a key tool in understanding the relationship between structure and function in neuroscience as it can provide objective evidence about which regions are crucial for a given process. Initial limitations with this approach were largely overcome by voxel-based lesion-symptom mapping (VLSM), a method introduced in the early 2000s, which allows for a whole-brain approach to study the association between damaged areas and behavioral impairment by applying an independent statistical test at every voxel. By doing so, this technique eliminated the need to predefine regions of interest or classify patients into groups based on arbitrary cutoff scores. VLSM nonetheless has its own limitations; chiefly, a bias towards recognizing cortical necrosis/gliosis but with poor sensitivity for detecting injury along long white matter tracts, thus ignoring cortical disconnection, which can per se lead to behavioral impairment. Here, we propose a complementary method that, instead, establishes a statistical relationship between the strength of connections between all regions of the brain (as defined by a standard brain atlas) and the array of behavioral performance seen in patients with brain injury: connectome-based lesion-symptom mapping (CLSM). Whole-brain CLSM therefore has the potential to identify key connections for behavior independently of a priori assumptions, with applicability across a broad spectrum of neurological and psychiatric diseases. We propose that this approach can further our understanding of brain structure-function relationships and is worth exploring in clinical and theoretical contexts.

  14. Interacting boson model from energy density functionals: γ-softness and related topics

    SciTech Connect

    Nomura, K.

    2012-10-20

    A comprehensive way of deriving the Hamiltonian of the interacting boson model (IBM) is described. Based on the fact that the multi-nucleon induced surface deformation in a finite nucleus can be simulated by effective boson degrees of freedom, the potential energy surface calculated with a self-consistent mean-field method employing a given energy density functional (EDF) is mapped onto its IBM analog, and thereby the excitation spectra and transition rates with good symmetry quantum numbers are calculated. Recent applications of the proposed approach are reported: (i) an alternative robust interpretation of γ-soft nuclei and (ii) shape coexistence in lead isotopes.

  15. The Effects of Reciprocal Teaching and Direct Instruction Approaches on Knowledge Map (K-Map) Generation Skill

    ERIC Educational Resources Information Center

    Görgen, Izzet

    2014-01-01

    The primary purpose of the present study is to investigate whether reciprocal teaching approach or direct instruction approach is more effective in the teaching of k-map generation skill. Secondary purpose of the study is to determine which of the k-map generation principles are more challenging for students to apply. The results of the study…

  19. Map of isotachs - statistical approach and meteorological information transfer

    SciTech Connect

    Menezes, A.A.; da Silva, J.I.; Coutinho, C.E.O.

    1985-09-01

    This report gives a statistical treatment of available wind data from airports in Brazil and provides a map of isotachs for extreme yearly wind velocities. A comparison between the statistical models of Frechet and Gumbel is carried out, leading to the adoption of the latter. The low density of meteorological stations used in this approach restricts the knowledge of wind activity. This fact was accounted for in the analytical method for spatial transfer of climatic data. Recommendations are given on how to enlarge the amount of available data.
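
    The Gumbel treatment adopted in the report can be illustrated with a method-of-moments fit to annual maximum wind speeds and extrapolation to a return level (the data below are illustrative, not the Brazilian airport records):

```python
import math

EULER_GAMMA = 0.5772156649

def fit_gumbel(annual_maxima):
    """Method-of-moments fit of a Gumbel distribution to annual maxima."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi          # scale parameter
    mu = mean - EULER_GAMMA * beta                 # location parameter
    return mu, beta

def return_level(mu, beta, t_years):
    """Wind speed exceeded on average once every t_years."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / t_years))

winds = [28.1, 31.4, 25.9, 29.7, 33.2, 27.5, 30.8, 26.4]  # m/s, illustrative
mu, beta = fit_gumbel(winds)
v50 = return_level(mu, beta, 50.0)   # 50-year design wind speed
```

    Isotach maps are then produced by interpolating such station-wise return levels in space.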

  20. Decoherence of spin-deformed bosonic model

    SciTech Connect

    Dehdashti, Sh.; Mahdifar, A.; Bagheri Harouni, M.; Roknizadeh, R.

    2013-07-15

    The decoherence rate and some parameters affecting it are investigated for the generalized spin-boson model. We consider the spin-boson model when the bosonic environment is modeled by deformed harmonic oscillators. We show that the state of the environment approaches a non-linear coherent state. Then, we obtain the decoherence rate of a two-level system which is in contact with a deformed bosonic environment which is either in thermal equilibrium or in the ground state. By using some recent realizations of f-deformed oscillators, we show that some physical parameters strongly affect the decoherence rate of a two-level system. Highlights: • Decoherence of the generalized spin-boson model is considered. • In this model the environment consists of f-oscillators. • Via the interaction, the state of the environment approaches non-linear coherent states. • Effective parameters on decoherence are considered.

  1. Hyperspectral image super-resolution: a hybrid color mapping approach

    NASA Astrophysics Data System (ADS)

    Zhou, Jin; Kwan, Chiman; Budavari, Bence

    2016-07-01

    NASA has been planning a hyperspectral infrared imager mission which will provide global coverage using a hyperspectral imager with 60-m resolution. In some practical applications, such as special crop monitoring or mineral mapping, 60-m resolution may still be too coarse. There have been many pansharpening algorithms for hyperspectral images that fuse high-resolution (HR) panchromatic or multispectral images with low-resolution (LR) hyperspectral images. We propose an approach to generating HR hyperspectral images by fusing high spatial resolution color images with low spatial resolution hyperspectral images. The idea, called hybrid color mapping (HCM), involves a mapping between a high spatial resolution color image and a low spatial resolution hyperspectral image. Several variants of the color mapping idea, including global, local, and hybrid, are proposed and investigated. It was found that the local HCM yielded the best performance. Comparison of the local HCM with more than 10 state-of-the-art algorithms using five performance metrics has been carried out using actual images from the Air Force and NASA. Although our HCM method does not require a point spread function (PSF), our results are comparable to or better than those of methods that do require a PSF. More importantly, our performance is better than most, if not all, methods that do not require a PSF. After applying our HCM algorithm, not only is the visual quality of the hyperspectral image significantly improved, but the target classification performance is also improved. Another advantage of our technique is that it is very efficient and can be easily parallelized. Hence, our algorithm is very suitable for real-time applications.
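
    The global variant of the color-mapping idea can be sketched as an ordinary least-squares fit of a band-transformation matrix (a toy setup with assumed band counts and synthetic data; the paper's local and hybrid variants additionally restrict the fit to spatial neighborhoods):

```python
import numpy as np

rng = np.random.default_rng(0)
n_hs, n_rgb = 31, 3                      # hypothetical band counts

# Synthetic co-registered training pair: each hyperspectral pixel is a linear
# function of the corresponding color pixel (rows = pixels).
T_true = rng.random((n_hs, n_rgb))
color_lr = rng.random((500, n_rgb))      # low-resolution color pixels
hyper_lr = color_lr @ T_true.T           # low-resolution hyperspectral pixels

# Global color mapping: least-squares fit of T so that hyper ≈ color @ T^T.
X, *_ = np.linalg.lstsq(color_lr, hyper_lr, rcond=None)
T = X.T                                  # (n_hs, n_rgb) mapping matrix

# Apply the learned map to high-resolution color pixels.
color_hr = rng.random((2000, n_rgb))
hyper_hr = color_hr @ T.T                # predicted high-res hyperspectral pixels
```

    Because the mapping is a small matrix applied independently per pixel, it parallelizes trivially, consistent with the real-time claim in the abstract.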

  2. Mapping topographic plant location properties using a dense matching approach

    NASA Astrophysics Data System (ADS)

    Niederheiser, Robert; Rutzinger, Martin; Lamprecht, Andrea; Bardy-Durchhalter, Manfred; Pauli, Harald; Winkler, Manuela

    2017-04-01

    Within the project MEDIALPS (Disentangling anthropogenic drivers of climate change impacts on alpine plant species: Alps vs. Mediterranean mountains), six regions in Alpine and Mediterranean mountain areas are investigated to assess how plant species respond to climate change. The project is embedded in the Global Observation Research Initiative in Alpine Environments (GLORIA), a well-established global monitoring initiative for systematic observation of changes in plant species composition and soil temperature on mountain summits worldwide, to discern accelerating climate change pressures on these fragile alpine ecosystems. Close-range sensing techniques such as terrestrial photogrammetry are well suited for mapping the terrain topography of small areas with high resolution. Lightweight equipment, flexible positioning for image acquisition in the field, and independence from weather conditions (i.e. wind) make this a feasible method for in-situ data collection. New developments in dense matching approaches allow high-quality 3D terrain mapping with fewer requirements for field set-up. However, challenges occur in post-processing and required data storage if many sites have to be mapped. Within MEDIALPS, dense matching is used for mapping high-resolution topography for 284 3x3 m plots, deriving information on vegetation coverage, roughness, slope, aspect and modelled solar radiation. This information helps identify types of topography-dependent ecological growing conditions and evaluate the potential of existing refugial locations for specific plant species under climate change. This research is conducted within the project MEDIALPS - Disentangling anthropogenic drivers of climate change impacts on alpine plant species: Alps vs. Mediterranean mountains, funded by the Earth System Sciences Programme of the Austrian Academy of Sciences.

  3. Canonical map approach to channeling stability in crystals. II

    NASA Astrophysics Data System (ADS)

    Sáenz, A. W.

    1987-11-01

    A nonrelativistic and a relativistic classical Hamiltonian model of two degrees of freedom are considered, describing the plane motion of a particle in a potential V(x1,x2), where (x1,x2) are Cartesian coordinates. Suppose V(x1,x2) is real analytic in its arguments in a neighborhood of the line x2=0, one-periodic in x1 there, and such that the average value of ∂V(x1,0)/∂x2 vanishes. It is proved that, under these conditions and provided that the particle energy E is sufficiently large, there exist for all time two distinguished solutions, one satisfying the equations of motion of the nonrelativistic model and the other those of the relativistic model, whose corresponding configuration-space orbits are one-periodic in x1 and approach the line x2=0 as E→∞. The main theorem is that these solutions are (future) orbitally stable at large enough E if V satisfies the above conditions, as well as natural requirements of linear and nonlinear stability. To prove their existence, one uses a well-known theorem, for which a new and simpler proof is provided, and properties of certain natural canonical maps appropriate to these respective models. It is shown that such solutions are orbitally stable by reducing the maps in question to Birkhoff canonical form and then applying a version of the Moser twist theorem. The approach used here greatly lightens the labor of deriving key estimates for the above maps, these estimates being needed to effect this reduction. The present stability theorem is physically interesting because it is the first rigorous statement on the orbital stability of certain channeling motions of fast charged particles in rigid two-dimensional lattices, within the context of models of the stated degree of generality.

  4. Dynamics of quantum dissipation systems interacting with fermion and boson grand canonical bath ensembles: hierarchical equations of motion approach.

    PubMed

    Jin, Jinshuang; Welack, Sven; Luo, JunYan; Li, Xin-Qi; Cui, Ping; Xu, Rui-Xue; Yan, YiJing

    2007-04-07

    A hierarchical equations of motion formalism for a quantum dissipation system in a grand canonical bath ensemble surrounding is constructed on the basis of the calculus-on-path-integral algorithm, together with the parametrization of an arbitrary non-Markovian bath that satisfies the fluctuation-dissipation theorem. The influence functionals for both fermion and boson bath interactions are found to have the same path-integral expression as for the canonical bath, assuming they all satisfy Gaussian statistics. However, the equation of motion formalism differs, due to the distinct fluctuation-dissipation theorems that are used explicitly. The implications of the present work for quantum transport through molecular wires and electron transfer in complex molecular systems are discussed.

  5. Study of hole pair condensation based on the SU(2) Slave-Boson approach to the t-J Hamiltonian: Temperature, momentum and doping dependences of spectral functions

    SciTech Connect

    Salk, S.H.S.; Lee, S.S.

    1999-11-01

    Based on the U(1) and SU(2) slave-boson approaches to the t-J Hamiltonian, the authors evaluate the one-electron spectral functions for the hole-doped high-Tc cuprates for comparison with angle-resolved photoemission spectroscopy (ARPES) data. They find that the observed quasiparticle peak in the superconducting state is correlated with the hump which exists in the normal state. They find that the spectral weight of the quasiparticle peak increases as the doping rate increases, which is consistent with observation. As a consequence of the phase fluctuation effects of the spinon and holon pairing order parameters, the spectral weight of the peak predicted by the SU(2) theory is found to be smaller than the one predicted by the U(1) mean-field theory.

  6. The Higgs Boson.

    ERIC Educational Resources Information Center

    Veltman, Martinus J. G.

    1986-01-01

    Reports recent findings related to the Higgs boson particle and examines its possible contribution to the standard model of elementary processes. Critically explores the strengths and uncertainties of the Higgs boson and the proposed Higgs field. (ML)

  8. Tidal deformability of boson stars and dark matter clumps

    NASA Astrophysics Data System (ADS)

    Mendes, Raissa F. P.; Yang, Huan

    2017-09-01

    In this work we consider minimally-coupled boson stars immersed in a tidal environment and compute their tidal deformability to leading order. We also describe an approximate correspondence between Newtonian boson star configurations (described by the Schrödinger–Poisson equations) and dynamical dark matter clumps (described by the collisionless Boltzmann equation). This allows us to map our results for the tidal deformability of boson stars to approximate statements for dark matter clumps.

  9. Higgs boson at LHC: a diffractive opportunity

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2009-03-23

    An alternative process is presented for diffractive Higgs boson production in peripheral pp collisions, where the particles interact through the Double Pomeron Exchange. The event rate is computed as a central-rapidity distribution for Tevatron and LHC energies leading to a result around 0.6 pb, higher than the predictions from previous approaches. Therefore, this result arises as an enhanced signal for the detection of the Higgs boson in hadron colliders. The predictions for the Higgs boson photoproduction are compared to the ones obtained from a similar approach proposed by the Durham group, enabling an analysis of the future developments of its application to pp and AA collisions.

  10. A statistical approach for validating eSOTER and digital soil maps against traditional soil maps

    NASA Astrophysics Data System (ADS)

    Bock, Michael; Baritz, Rainer; Köthe, Rüdiger; Melms, Stephan; Günther, Susann

    2015-04-01

    During the European research project eSOTER, three different Digital Soil Maps (DSM) were developed for the pilot area Chemnitz 1:250,000 (FP7 eSOTER project, grant agreement nr. 211578). The core task of the project was to revise the SOTER method for the interpretation of soil and terrain data. It was one of the working hypotheses that eSOTER not only provides terrain data with typical soil profiles, but that the new products actually perform like a conceptual soil map. The three eSOTER maps for the pilot area differed considerably in spatial representation and content of soil classes. In this study we compare the three eSOTER maps against existing reconnaissance soil maps, keeping in mind that traditional soil maps have many subjective issues and intended bias, such as the overestimation and emphasis of certain features. Hence, a true validation of the proper representation of modeled soil maps is hardly possible; rather, a statistical comparison between modeled and empirical approaches is possible. If eSOTER data represent conceptual soil maps, then different eSOTER, DSM and conventional maps from various sources and different regions could be harmonized towards consistent new data sets for large areas, including the whole European continent. One of the eSOTER maps has been developed closely following the traditional SOTER method: terrain classification data (derived from the SRTM DEM) were combined with lithology data (a re-interpreted geological map); the corresponding terrain units were then extended with soil information: a very dense regional soil profile data set was used to define soil mapping units based on a statistical grouping of terrain units. The second map is a pure DSM map using continuous terrain parameters instead of terrain classification; radiospectrometric data were used to supplement parent material information from geology maps. The classification method Random Forest was used. The third approach predicts soil diagnostic properties based on

  11. Structure Prior Effects in Bayesian Approaches of Quantitative Susceptibility Mapping

    PubMed Central

    Chen, Weiwei; Wang, Chunmei; Liu, Tian; Wang, Yi; Pan, Chu; Mu, Ketao; Zhu, Ce; Zhang, Xiang; Cheng, Jian

    2016-01-01

    Quantitative susceptibility mapping (QSM) has shown its potential for anatomical and functional MRI, as it can quantify, for in vivo tissues, magnetic biomarkers and contrast agents which have differential susceptibilities to the surrounding substances. For reconstructing the QSM with a single orientation, various methods have been proposed to identify a unique solution for the susceptibility map. The Bayesian QSM approach is the major type, which uses various regularization terms, such as a piece-wise constant, a smooth, a sparse, or a morphological prior. Six QSM algorithms, with or without a structure prior, are systematically discussed to address the effects of the structure prior. The methods are evaluated using simulations, phantom experiments with known susceptibility, and human brain data. In the simulation and phantom experiments, the accuracy and image quality of QSM increased when a structure prior was used, compared with the same regularization term without it. In vivo, the image quality of QSM methods using the structure prior was better than that of the corresponding methods without it, either by sharpening the image or by reducing streaking artifacts. The structure priors improve the performance of the various QSM methods using regularized minimization, including the L1, L2, and TV norms. PMID:28097129
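
    The general idea of a structure prior can be illustrated with a 1-D toy inversion (an assumed toy setup, not a QSM reconstruction): minimize ||Ax − b||² + λ||WDx||², where D is a finite-difference operator and the weights W relax the smoothness penalty at known edge locations (in QSM, typically edges taken from the magnitude image).

```python
import numpy as np

n = 50
x_true = np.zeros(n)
x_true[20:35] = 1.0                          # a piecewise-constant "susceptibility"

A = np.eye(n)                                # trivial forward model, illustration only
rng = np.random.default_rng(1)
b = A @ x_true + 0.05 * rng.standard_normal(n)

D = np.diff(np.eye(n), axis=0)               # first-difference operator, shape (n-1, n)
w = np.ones(n - 1)
w[[19, 34]] = 0.0                            # structure prior: do not smooth across known edges
WD = np.diag(w) @ D

lam = 5.0
# Closed-form solution of the regularized least-squares problem.
x = np.linalg.solve(A.T @ A + lam * WD.T @ WD, A.T @ b)
```

    With the two edge weights zeroed, smoothing suppresses noise within each segment while leaving the edges sharp; setting all weights to one would blur them.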

  12. An autoregressive approach to spatio-temporal disease mapping.

    PubMed

    Martínez-Beneito, M A; López-Quilez, A; Botella-Rocamora, P

    2008-07-10

    Disease mapping has been a very active research field during recent years. Nevertheless, time trends in risks have been ignored in most of these studies, yet they can provide information with a very high epidemiological value. Lately, several spatio-temporal models have been proposed, either based on a parametric description of time trends, on independent risk estimates for every period, or on the definition of the joint covariance matrix for all the periods as a Kronecker product of matrices. The following paper offers an autoregressive approach to spatio-temporal disease mapping by fusing ideas from autoregressive time series in order to link information in time and by spatial modelling to link information in space. Our proposal can be easily implemented in Bayesian simulation software packages, for example WinBUGS. As a result, risk estimates are obtained for every region related to those in their neighbours and to those in the same region in adjacent periods. (c) 2007 John Wiley & Sons, Ltd.
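
    The autoregressive linking of risks in time, combined with pooling toward spatial neighbours, can be sketched in a toy forward simulation (an assumed illustrative form, not the authors' full Bayesian model, which would be fitted with MCMC in, for example, WinBUGS):

```python
import random

# Toy spatio-temporal risk process: each region's log-risk follows an AR(1)
# process in time and is partially pooled toward the mean of its neighbours.

random.seed(0)
regions = ["A", "B", "C"]
neighbors = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
rho = 0.8          # temporal autocorrelation (assumed)
tau = 0.1          # innovation scale (assumed)
n_periods = 5

log_risk = {r: 0.0 for r in regions}
history = []
for t in range(n_periods):
    new = {}
    for r in regions:
        nbr_mean = sum(log_risk[s] for s in neighbors[r]) / len(neighbors[r])
        # AR(1) in time plus spatial pooling plus Gaussian innovation
        new[r] = rho * log_risk[r] + 0.5 * (1 - rho) * nbr_mean + random.gauss(0.0, tau)
    log_risk = new
    history.append(dict(log_risk))
```

    Inference then reverses this generative story: observed counts inform the posterior of each region-period risk, borrowing strength from adjacent periods and neighbouring regions.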

  13. Substrate mapping and ablation for ventricular tachycardia: the LAVA approach.

    PubMed

    Sacher, Frederic; Lim, Han S; Derval, Nicolas; Denis, Arnaud; Berte, Benjamin; Yamashita, Seigo; Hocini, Mélèze; Haissaguerre, Michel; Jaïs, Pierre

    2015-04-01

    Catheter ablation of ventricular tachycardia (VT) is a proven effective therapy, particularly in patients with frequent defibrillator shocks. However, the optimal endpoint for VT ablation has been debated and additional endpoints have been proposed. At the same time, ablation strategies aiming at homogenizing the substrate of scar-related VT have been reported. Our method of homogenizing the substrate consists of local abnormal ventricular activity (LAVA) elimination. LAVA are high-frequency sharp signals that represent near-field signals of slowly conducting tissue and hence potential VT isthmuses. Pacing maneuvers are sometimes required to differentiate them from far-field signals. Delayed enhancement on cardiac MRI and/or wall thinning on multidetector computed tomography are also extremely helpful for identifying the areas of interest during ablation. A strategy aiming at careful LAVA mapping, ablation, and elimination is feasible and can be achieved in about 70% of patients with scar-related VT. Complete LAVA elimination is associated with a better outcome when compared to LAVA persistence, even when VT is rendered noninducible. This is a simple approach, with a clear endpoint and the ability to ablate in sinus rhythm. This strategy benefits significantly from high-definition imaging, mapping, and epicardial access. © 2014 Wiley Periodicals, Inc.

  14. Teaching Population Health: A Competency Map Approach to Education

    PubMed Central

    Kaprielian, Victoria S.; Silberberg, Mina; McDonald, Mary Anne; Koo, Denise; Hull, Sharon K.; Murphy, Gwen; Tran, Anh N.; Sheline, Barbara L.; Halstater, Brian; Martinez-Bianchi, Viviana; Weigle, Nancy J.; de Oliveira, Justine Strand; Sangvai, Devdutta; Copeland, Joyce; Tilson, Hugh H.; Scutchfield, F. Douglas; Michener, J. Lloyd

    2013-01-01

    A 2012 Institute of Medicine report is the latest in the growing number of calls to incorporate a population health approach in health professionals’ training. Over the last decade, Duke University, particularly its Department of Community and Family Medicine, has been heavily involved with community partners in Durham, North Carolina to improve the local community’s health. Based on these initiatives, a group of interprofessional faculty began tackling the need to fill the curriculum gap to train future health professionals in public health practice, community engagement, critical thinking, and team skills to improve population health effectively in Durham and elsewhere. The Department of Community and Family Medicine has spent years in care delivery redesign and curriculum experimentation, design, and evaluation to distinguish the skills trainees and faculty need for population health improvement and to integrate them into educational programs. These clinical and educational experiences have led to a set of competencies that form an organizational framework for curricular planning and training. This framework delineates which learning objectives are appropriate and necessary for each learning level, from novice through expert, across multiple disciplines and domains. The resulting competency map has guided Duke’s efforts to develop, implement, and assess training in population health for learners and faculty. In this article, the authors describe the competency map development process as well as examples of its application and evaluation at Duke and limitations to its use with the hope that other institutions will apply it in different settings. PMID:23524919

  15. A POD Mapping Approach to Emulate Land Surface Models

    NASA Astrophysics Data System (ADS)

    Pau, G. S. H.; Bisht, G.; Liu, Y.; Riley, W. J.; Shen, C.

    2014-12-01

Existing land surface models (LSMs) describe physical and biological processes that occur over a wide range of spatial and temporal scales. Since simulating LSMs at a spatial scale fine enough to explicitly resolve the finest-resolution processes is computationally expensive, upscaling techniques are used in LSMs to capture the effect of subgrid heterogeneity. However, the routinely employed linear upscaling techniques that allow LSMs to be simulated at coarse spatial resolution can result in large prediction error. To efficiently predict fine-resolution solutions of LSMs, we studied the application of a reduced-order model (ROM) technique known as the "Proper Orthogonal Decomposition (POD) mapping method," which reconstructs temporally resolved fine-resolution solutions from coarse-resolution solutions, for two case studies. In the first case study, we applied the POD approach to surface-subsurface isothermal simulations for four study sites (10^4 m2) in a polygonal tundra landscape near Barrow, Alaska. The results indicate that the ROM produced a significant computational speedup (>10^3) with very small relative approximation error (<0.1%) for two validation years not used in training the ROM. In the second case study, we illustrate the applicability of our ROM approach to a substantially more heterogeneous watershed-scale (1837 km2) model and demonstrate a hierarchical approach to emulating models at spatial scales consistent with mechanistic physical process representation.
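The core of a POD mapping emulator can be sketched in a few linear-algebra steps: build a POD basis of fine-resolution training snapshots via the SVD, then learn a map from coarse solutions to POD coefficients. The sketch below uses random stand-in data and made-up dimensions purely to show the mechanics; it is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: paired coarse- and fine-resolution snapshots
# (columns are time snapshots). Shapes and values are illustrative only.
n_fine, n_coarse, n_snap = 400, 25, 60
X_fine = rng.standard_normal((n_fine, n_snap))
X_coarse = rng.standard_normal((n_coarse, n_snap))

# 1) POD basis of the fine-resolution snapshots via a thin SVD
U, svals, _ = np.linalg.svd(X_fine, full_matrices=False)
k = 10                      # number of retained POD modes
Phi = U[:, :k]              # fine-resolution POD basis

# 2) Fit a linear map from coarse solutions to POD coefficients
A = Phi.T @ X_fine          # true coefficients of the training snapshots
# least-squares fit: coefficients ≈ M.T @ coarse_solution
M, *_ = np.linalg.lstsq(X_coarse.T, A.T, rcond=None)

# 3) Emulate: reconstruct a fine-resolution field from a new coarse solution
x_coarse_new = X_coarse[:, 0]
x_fine_pred = Phi @ (M.T @ x_coarse_new)
print(x_fine_pred.shape)    # (400,)
```

Once trained, step 3 costs only two small matrix-vector products per snapshot, which is where the reported speedup over re-running the fine-resolution model comes from.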

  16. Cortical sulcal atlas construction using a diffeomorphic mapping approach.

    PubMed

    Joshi, Shantanu H; Cabeen, Ryan P; Sun, Bo; Joshi, Anand A; Gutman, Boris; Zamanyan, Alen; Chakrapani, Shruthi; Dinov, Ivo; Woods, Roger P; Toga, Arthur W

    2010-01-01

We present a geometric approach for constructing shape atlases of sulcal curves on the human cortex. Sulci and gyri are represented as continuous open curves in R3, and their shapes are studied as elements of an infinite-dimensional sphere. This shape manifold has some nice properties: it is equipped with a Riemannian L2 metric on the tangent space and facilitates computational analyses and correspondences between sulcal shapes. Sulcal mapping is achieved by computing geodesics in the quotient space of shapes modulo rigid rotations and reparameterizations. The resulting sulcal shape atlas is shown to preserve important local geometry inherently present in the sample population. This is demonstrated in our experimental results for deep brain sulci, where we integrate the elastic shape model into a surface registration framework for a population of 69 healthy young adult subjects.
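The "infinite-dimensional sphere" picture has a simple discrete analogue: after removing translation and scale, each curve becomes a unit-norm array, and geodesics between two such curves are great-circle arcs. The sketch below uses two synthetic helix-like curves (not real sulcal traces) and omits the rotation/reparameterization quotient for brevity.

```python
import numpy as np

def normalize(curve):
    """Map a sampled curve onto the unit sphere in L2: remove translation, fix scale."""
    c = curve - curve.mean(axis=0)
    return c / np.linalg.norm(c)

def sphere_geodesic(q1, q2, t):
    """Point at fraction t along the great-circle path from q1 to q2."""
    theta = np.arccos(np.clip(np.sum(q1 * q2), -1.0, 1.0))
    return (np.sin((1 - t) * theta) * q1 + np.sin(t * theta) * q2) / np.sin(theta)

# Two synthetic open curves in R3, sampled at 100 points (illustrative only)
s = np.linspace(0, 1, 100)[:, None]
c1 = normalize(np.hstack([np.cos(3 * s), np.sin(3 * s), s]))
c2 = normalize(np.hstack([np.cos(3 * s + 0.5), np.sin(3 * s + 0.5), s ** 2]))

theta = np.arccos(np.clip(np.sum(c1 * c2), -1.0, 1.0))  # geodesic (arc) distance
mid = sphere_geodesic(c1, c2, 0.5)                      # midpoint shape
print(round(float(theta), 3), round(float(np.linalg.norm(mid)), 3))
```

The midpoint's norm comes out as 1.0, confirming the interpolated shape stays on the sphere; averaging shapes this way (rather than averaging coordinates) is what lets an atlas preserve curve geometry.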

  17. Omics and Exercise: Global Approaches for Mapping Exercise Biological Networks.

    PubMed

    Hoffman, Nolan J

    2017-03-27

    The application of global "-omics" technologies to exercise has introduced new opportunities to map the complexity and interconnectedness of biological networks underlying the tissue-specific responses and systemic health benefits of exercise. This review will introduce major research tracks and recent advancements in this emerging field, as well as critical gaps in understanding the orchestration of molecular exercise dynamics that will benefit from unbiased omics investigations. Furthermore, significant research hurdles that need to be overcome to effectively fill these gaps related to data collection, computation, interpretation, and integration across omics applications will be discussed. Collectively, a cross-disciplinary physiological and omics-based systems approach will lead to discovery of a wealth of novel exercise-regulated targets for future mechanistic validation. This frontier in exercise biology will aid the development of personalized therapeutic strategies to improve athletic performance and human health through precision exercise medicine.

  18. Mapping the distribution of malaria: current approaches and future directions

    USGS Publications Warehouse

    Johnson, Leah R.; Lafferty, Kevin D.; McNally, Amy; Mordecai, Erin A.; Paaijmans, Krijn P.; Pawar, Samraat; Ryan, Sadie J.; Chen, Dongmei; Moulin, Bernard; Wu, Jianhong

    2015-01-01

    Mapping the distribution of malaria has received substantial attention because the disease is a major source of illness and mortality in humans, especially in developing countries. It also has a defined temporal and spatial distribution. The distribution of malaria is most influenced by its mosquito vector, which is sensitive to extrinsic environmental factors such as rainfall and temperature. Temperature also affects the development rate of the malaria parasite in the mosquito. Here, we review the range of approaches used to model the distribution of malaria, from spatially explicit to implicit, mechanistic to correlative. Although current methods have significantly improved our understanding of the factors influencing malaria transmission, significant gaps remain, particularly in incorporating nonlinear responses to temperature and temperature variability. We highlight new methods to tackle these gaps and to integrate new data with models.
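The "nonlinear responses to temperature" the review highlights are typically unimodal: a trait rises with temperature, peaks, and collapses near an upper limit. A common functional form for this is the Briere curve; the constants below are illustrative placeholders, not fitted values from the malaria literature.

```python
import numpy as np

def briere(T, c=2.5e-4, T0=10.0, Tm=40.0):
    """Unimodal thermal response: zero outside (T0, Tm), peaked in between.

    c, T0 (lower limit) and Tm (upper limit) are hypothetical constants.
    """
    T = np.asarray(T, float)
    # clip keeps sqrt defined above Tm; np.where zeroes the response there
    out = c * T * (T - T0) * np.sqrt(np.clip(Tm - T, 0.0, None))
    return np.where((T > T0) & (T < Tm), out, 0.0)

temps = np.array([5.0, 20.0, 30.0, 34.0, 45.0])
print(np.round(briere(temps), 3))   # zero at the extremes, peaked near 34 C
```

Replacing a linear temperature term with a response like this is one way mechanistic transmission models capture the observation that both cold and very hot conditions suppress transmission.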

  19. A multi-model ensemble approach to seabed mapping

    NASA Astrophysics Data System (ADS)

    Diesing, Markus; Stephens, David

    2015-06-01

Seabed habitat mapping based on swath acoustic data and ground-truth samples is an emergent and active marine science discipline. Significant progress could be achieved by transferring techniques and approaches that have been successfully developed and employed in such fields as terrestrial land cover mapping. One such promising approach is the multiple classifier system, which aims at improving classification performance by combining the outputs of several classifiers. Here we present results of a multi-model ensemble applied to multibeam acoustic data covering more than 5000 km2 of seabed in the North Sea, with the aim of deriving accurate spatial predictions of seabed substrate. A suite of six machine learning classifiers (k-Nearest Neighbour, Support Vector Machine, Classification Tree, Random Forest, Neural Network and Naïve Bayes) was trained with ground-truth sample data classified into seabed substrate classes and their prediction accuracy was assessed with an independent set of samples. The three and five best performing models were combined to classifier ensembles. Both ensembles led to increased prediction accuracy as compared to the best performing single classifier. The improvements were, however, not statistically significant at the 5% level. Although the three-model ensemble did not perform significantly better than its individual component models, we noticed that the five-model ensemble did perform significantly better than three of the five component models. A classifier ensemble might therefore be an effective strategy to improve classification performance. Another advantage is the fact that the agreement in predicted substrate class between the individual models of the ensemble could be used as a measure of confidence. We propose a simple and spatially explicit measure of confidence that is based on model agreement and prediction accuracy.
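The two ingredients described above — a majority vote over per-pixel class predictions and model agreement as a confidence measure — can be sketched in a few lines. The class labels and the five prediction vectors below are invented stand-ins for the trained classifiers' outputs.

```python
import numpy as np

# Hypothetical per-pixel substrate predictions from five trained classifiers
# (labels 0=mud, 1=sand, 2=gravel; one row per model, one column per pixel).
preds = np.array([
    [0, 1, 2, 1],   # k-Nearest Neighbour
    [0, 1, 1, 1],   # Support Vector Machine
    [0, 2, 2, 1],   # Classification Tree
    [0, 1, 2, 2],   # Random Forest
    [0, 1, 2, 1],   # Neural Network
])

n_models, n_pixels = preds.shape
ensemble, confidence = [], []
for j in range(n_pixels):
    votes = np.bincount(preds[:, j], minlength=3)
    ensemble.append(int(votes.argmax()))          # majority-vote class
    confidence.append(votes.max() / n_models)     # model agreement per pixel

print(ensemble)     # [0, 1, 2, 1]
print(confidence)   # [1.0, 0.8, 0.8, 0.8]
```

Mapping the agreement values alongside the predicted classes yields exactly the kind of spatially explicit confidence layer the abstract proposes.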

  20. Composite Fermion Theory for Bosonic Quantum Hall States on Lattices

    NASA Astrophysics Data System (ADS)

    Möller, G.; Cooper, N. R.

    2009-09-01

    We study the ground states of the Bose-Hubbard model in a uniform magnetic field, motivated by the physics of cold atomic gases on lattices at high vortex density. Mapping the bosons to composite fermions (CF) leads to the prediction of quantum Hall fluids that have no counterpart in the continuum. We construct trial states for these phases and test numerically the predictions of the CF model. We establish the existence of strongly correlated phases beyond those in the continuum limit and provide evidence for a wider scope of the composite fermion approach beyond its application to the lowest Landau level.

  1. Unconventional quantum critical points in systems of strongly interacting bosons

    NASA Astrophysics Data System (ADS)

    Zaleski, T. A.; Kopeć, T. K.

    2014-09-01

Using the combined Bogoliubov method and the quantum rotor approach, we map the Bose-Hubbard Hamiltonian of strongly interacting bosons onto a U(1) phase action. By unraveling the consequences of the nontrivial topology of the U(1) gauge group and the associated ground state degeneracy, we find a close kinship between the zero-temperature divergence of the compressibility and the topological susceptibility at degeneracy points, which marks a novel quantum criticality governed by topological features rather than the Landau principle of symmetry breaking. We argue that the existence of this new type of criticality may be instrumental in explaining unconventional quantum critical points observed in superconducting cuprates.

  2. Complementarity between nonstandard Higgs boson searches and precision Higgs boson measurements in the MSSM

    DOE PAGES

    Carena, Marcela; Haber, Howard E.; Low, Ian; ...

    2015-02-03

Precision measurements of the Higgs boson properties at the LHC provide relevant constraints on possible weak-scale extensions of the Standard Model (SM). In the context of the minimal supersymmetric Standard Model (MSSM) these constraints seem to suggest that all the additional, non-SM-like Higgs bosons should be heavy, with masses larger than about 400 GeV. This article shows that such results do not hold when the theory approaches the conditions for “alignment independent of decoupling,” where the lightest CP-even Higgs boson has SM-like tree-level couplings to fermions and gauge bosons, independently of the nonstandard Higgs boson masses. In addition, the combination of current bounds from direct Higgs boson searches at the LHC, along with the alignment conditions, have a significant impact on the allowed MSSM parameter space yielding light additional Higgs bosons. In particular, after ensuring the correct mass for the lightest CP-even Higgs boson, we find that precision measurements and direct searches are complementary and may soon be able to probe the region of non-SM-like Higgs boson with masses below the top quark pair mass threshold of 350 GeV and low to moderate values of tanβ.

  3. Complementarity between nonstandard Higgs boson searches and precision Higgs boson measurements in the MSSM

    SciTech Connect

    Carena, Marcela; Haber, Howard E.; Low, Ian; Shah, Nausheen R.; Wagner, Carlos E. M.

    2015-02-03

    Precision measurements of the Higgs boson properties at the LHC provide relevant constraints on possible weak-scale extensions of the Standard Model (SM). In the context of the minimal supersymmetric Standard Model (MSSM) these constraints seem to suggest that all the additional, non-SM-like Higgs bosons should be heavy, with masses larger than about 400 GeV. This article shows that such results do not hold when the theory approaches the conditions for “alignment independent of decoupling,” where the lightest CP-even Higgs boson has SM-like tree-level couplings to fermions and gauge bosons, independently of the nonstandard Higgs boson masses. In addition, the combination of current bounds from direct Higgs boson searches at the LHC, along with the alignment conditions, have a significant impact on the allowed MSSM parameter space yielding light additional Higgs bosons. In particular, after ensuring the correct mass for the lightest CP-even Higgs boson, we find that precision measurements and direct searches are complementary and may soon be able to probe the region of non-SM-like Higgs boson with masses below the top quark pair mass threshold of 350 GeV and low to moderate values of tanβ.

  5. Connectomics: comprehensive approaches for whole-brain mapping.

    PubMed

    Shibata, Shinsuke; Komaki, Yuji; Seki, Fumiko; Inouye, Michiko O; Nagai, Toshihiro; Okano, Hideyuki

    2015-02-01

The aim of connectomics analysis is to understand whole-brain neural connections. This is accomplished using new biotechnologies. Here, we provide an overview of the recent progress in connectomics analysis. The entire neural network of an organism was revealed for the first time in the nematode Caenorhabditis elegans (C. elegans), which has the advantage of a limited number of neurons and a transparent body, allowing the neural network to be visualized using light and electron microscopes (EMs). It is practically impossible to adopt the same approach for mammals because of the large number of neural cells and the opacity of the central nervous system. A variety of new technologies are being developed to perform computer-assisted high-throughput image acquisition and analysis to obtain whole-brain maps for higher species, including mammals. Diffusion tensor magnetic resonance imaging with tractography and three-dimensional imaging with the EM are examples of novel approaches to connectomics. These new technologies will soon be applied not only to Drosophila, C. elegans and rodent research, but also to comprehensive connectomics analysis in a wide range of species, including humans and primates. In the near future, results from connectomics analysis will reveal the neural circuitry of the whole brain and enhance our understanding of the human mind and neuropsychiatric diseases.

  6. Pure P2P mediation system: A mappings discovery approach

    NASA Astrophysics Data System (ADS)

    selma, El yahyaoui El idrissi; Zellou, Ahmed; Idri, Ali

    2015-02-01

Information integration systems offer a uniform interface that provides access to a set of autonomous and distributed information sources. The most important advantage of such a system is that it allows users to specify what they want, rather than thinking about how to get the responses. Work in this area has led to two major classes of integration systems: mediation systems based on the mediator/adapter paradigm, and peer-to-peer (P2P) systems. The combination of both has led to a third type: P2P mediation systems. P2P systems are large-scale, self-organized and distributed, and allow resource management in a completely decentralized way. However, the integration of structured, heterogeneous and distributed information sources proves to be a complex problem. The objective of this work is to propose an approach to resolve conflicts and establish mappings between heterogeneous elements. This approach is based on clustering, which groups similar peers that share common information into the same subnet. To cope with heterogeneity, we introduce three additional layers into our hierarchy of peers: internal schema, external schema and schema directory peer. We use linguistic techniques, specifically the name correspondence technique, which relies on the similarity of names to propose a correspondence.
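A name-correspondence step like the one described can be sketched with a standard string-similarity measure: for each schema element of one peer, propose the most similar element of the other peer, subject to an acceptance threshold. The element names and the threshold below are hypothetical, and difflib's ratio stands in for whatever linguistic similarity the authors actually use.

```python
from difflib import SequenceMatcher

# Hypothetical schema element names exposed by two heterogeneous peers.
peer_a = ["author_name", "book_title", "pub_year"]
peer_b = ["AuthorName", "Title", "YearPublished"]

def name_similarity(a: str, b: str) -> float:
    """Similarity of two element names, case- and underscore-folded to [0, 1]."""
    norm = lambda s: s.replace("_", "").lower()
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

# Propose, for each element of peer A, the most similar element of peer B,
# keeping only pairs above an (assumed) acceptance threshold.
THRESHOLD = 0.4
mapping = {}
for a in peer_a:
    best = max(peer_b, key=lambda b: name_similarity(a, b))
    if name_similarity(a, best) >= THRESHOLD:
        mapping[a] = best

print(mapping)
```

In a clustered P2P setting, a schema directory peer could run exactly this comparison between the internal schemas of peers joining the same subnet.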

  7. High School Biology: A Group Approach to Concept Mapping.

    ERIC Educational Resources Information Center

    Brown, David S.

    2003-01-01

    Explains concept mapping as an instructional method in cooperative learning environments, and describes a study investigating the effectiveness of concept mapping on student learning during a photosynthesis and cellular respiration unit. Reports on the positive effects of concept mapping in the experimental group. (Contains 16 references.) (YDS)

  9. Comparison of Mixed-Model Approaches for Association Mapping

    USDA-ARS?s Scientific Manuscript database

    Association-mapping methods promise to overcome the limitations of linkage-mapping methods. The main objectives of this study were to (i) evaluate various methods for association mapping in the autogamous species wheat using an empirical data set, (ii) determine a marker-based kinship matrix using a...

  10. Teaching Map Skills: An Inductive Approach. Part Four.

    ERIC Educational Resources Information Center

    Anderson, Jeremy

    1985-01-01

    Satisfactory completion of this self-contained map exercise will demonstrate student ability to use symbols, legends, scale, orientation, index, and grid in map reading and map use to give directions for way-finding. The exercise should take one class period to complete. (RM)

  11. A Probabilistic Approach for Improved Sequence Mapping in Metatranscriptomic Studies

    USDA-ARS?s Scientific Manuscript database

Mapping millions of short DNA sequences to a reference genome is a necessary step in many experiments designed to investigate the expression of genes involved in disease resistance. This is a difficult task in which several challenges often arise, resulting in a suboptimal mapping. This mapping process ...

  12. A geostatistical approach to mapping site response spectral amplifications

    USGS Publications Warehouse

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Tanaka, Y.; Tanaka, H.

    2010-01-01

If quantitative estimates of the seismic properties do not exist at a location of interest then the site response spectral amplifications must be estimated from data collected at other locations. Currently, the most common approach employs correlations of site class with maps of surficial geology. Analogously, correlations of site class with topographic slope can be employed where the surficial geology is unknown. Our goal is to identify and validate a method to estimate site response with greater spatial resolution and accuracy for regions where additional effort is warranted. This method consists of three components: region-specific data collection, a spatial model for interpolating seismic properties, and a theoretical method for computing spectral amplifications from the interpolated seismic properties. We consider three spatial interpolation schemes: correlations with surficial geology, termed the geologic trend (GT), ordinary kriging (OK), and kriging with a trend (KT). We estimate the spectral amplifications from seismic properties using the square root of impedance method, thereby linking the frequency-dependent spectral amplifications to the depth-dependent seismic properties. Thus, the range of periods for which this method is applicable is limited by the depth of exploration. A dense survey of near-surface S-wave slowness (Ss) throughout Kobe, Japan shows that the geostatistical methods give more accurate estimates of Ss than the topographic slope and GT methods, and the OK and KT methods perform equally well. We prefer the KT model because it can be seamlessly integrated with geologic maps that cover larger regions. Empirical spectral amplifications show that the region-specific data achieve more accurate estimates of observed median short-period amplifications than the topographic slope method. © 2010 Elsevier B.V.
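The square-root-of-impedance step maps a depth-dependent profile to frequency-dependent amplification: for each frequency, average the profile down to the quarter-wavelength depth and take the square root of the impedance contrast with the reference. The layered profile and reference impedance below are invented numbers for illustration, not Kobe data.

```python
import numpy as np

# Hypothetical near-surface profile: layer thicknesses (m), S-wave
# velocities (m/s) and densities (kg/m^3), plus an assumed reference
# (bedrock) impedance rho * Vs. All values are illustrative.
thick = np.array([5.0, 10.0, 20.0])
vs    = np.array([150.0, 300.0, 600.0])
rho   = np.array([1700.0, 1900.0, 2100.0])
Z_REF = 2500.0 * 3000.0

def sqrt_impedance_amp(freq: float) -> float:
    """Square-root-of-impedance (quarter-wavelength) amplification at freq."""
    t_target = 1.0 / (4.0 * freq)            # travel time of a quarter period
    if t_target > float(np.sum(thick / vs)):
        # the profile depth limits the longest usable period (see abstract)
        raise ValueError("profile too shallow for this frequency")
    t = z = mass = 0.0
    for h, v, r in zip(thick, vs, rho):
        dt = h / v
        if t + dt >= t_target:               # quarter wavelength ends here
            h = (t_target - t) * v
            z += h; mass += r * h
            break
        t += dt; z += h; mass += r * h
    vs_avg = z / t_target                    # time-averaged S-wave velocity
    rho_avg = mass / z                       # depth-averaged density
    return float(np.sqrt(Z_REF / (rho_avg * vs_avg)))

print(round(sqrt_impedance_amp(10.0), 2))   # 5.42 (shallow, slow average)
print(round(sqrt_impedance_amp(3.0), 2))    # 3.59 (deeper, faster average)
```

High frequencies only sample the slow shallow layers, so their impedance contrast (and amplification) is larger; in the mapping workflow this function is evaluated at each interpolated Ss profile.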

  13. Slave boson theories of correlated electron systems

    SciTech Connect

    Woelfle, P.

    1995-05-01

    Slave boson theories of various models of correlated fermions are critically reviewed and several new results are presented. In the example of the Anderson impurity model the limitations of slave boson mean field theory are discussed. Self-consistent conserving approximations are compared with results obtained from the numerical renormalization group. The gauge field theory of the t-J-model is considered in the quasistatic approximation. It is shown that weak localization effects can give valuable information on the existence of gauge fields. Applications of the slave-boson approach due to Kotliar and Ruckenstein to the Hubbard model are also discussed.

  14. Mapping diffusion in a living cell via the phasor approach.

    PubMed

    Ranjit, Suman; Lanzano, Luca; Gratton, Enrico

    2014-12-16

Diffusion of a fluorescent protein within a cell has been measured using either fluctuation-based techniques (fluorescence correlation spectroscopy (FCS) or raster-scan image correlation spectroscopy) or particle tracking. However, none of these methods enables us to measure the diffusion of the fluorescent particle at each pixel of the image. Measurement using conventional single-point FCS at every individual pixel results in continuous long exposure of the cell to the laser and eventual bleaching of the sample. To overcome this limitation, we have developed what we believe to be a new method of scanning with simultaneous construction of a fluorescent image of the cell. In this method of modified raster scanning, as the image is acquired, the laser scans each individual line multiple times before moving to the next line. This continues until the entire area is scanned. This is different from the original raster-scan image correlation spectroscopy approach, where data are acquired by scanning each frame once and then scanning the image multiple times. The total time of data acquisition needed for this method is much shorter than the time required for traditional FCS analysis at each pixel. However, at a single pixel, the acquired intensity time sequence is short, requiring nonconventional analysis of the correlation function to extract information about the diffusion. These correlation data have been analyzed using the phasor approach, a fit-free method that was originally developed for analysis of FLIM images. Analysis using this method results in an estimation of the average diffusion coefficient of the fluorescent species at each pixel of an image, and thus, a detailed diffusion map of the cell can be created.
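The fit-free phasor idea can be illustrated on a synthetic correlation curve: the phasor is the normalized first Fourier harmonic of the curve, and for a single-exponential decay its coordinates sit on the universal semicircle, so the correlation time (and hence a diffusion coefficient) can be read back without curve fitting. The decay time and window below are illustrative values, not measured data.

```python
import numpy as np

# Synthetic single-exponential autocorrelation curve (illustrative tau)
tau = 0.02                                   # correlation time (s)
t = np.linspace(0.0, 1.0, 20001)
corr = np.exp(-t / tau)

# Phasor coordinates: first Fourier harmonic of the 1 s analysis window,
# normalized by the total area under the curve
omega = 2 * np.pi * 1.0
g = np.sum(corr * np.cos(omega * t)) / np.sum(corr)
s = np.sum(corr * np.sin(omega * t)) / np.sum(corr)

# On the universal semicircle s/g = omega * tau, so tau inverts directly
tau_est = s / (g * omega)
print(round(tau_est, 4))                     # ≈ 0.02
```

Doing this per pixel turns each short correlation record into a point in phasor space, which is what makes a pixel-by-pixel diffusion map tractable without per-pixel fits.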

  16. Spectral similarity approach for mapping turbidity of an inland waterbody

    NASA Astrophysics Data System (ADS)

    Garg, Vaibhav; Senthil Kumar, A.; Aggarwal, S. P.; Kumar, Vinay; Dhote, Pankaj R.; Thakur, Praveen K.; Nikam, Bhaskar R.; Sambare, Rohit S.; Siddiqui, Asfa; Muduli, Pradipta R.; Rastogi, Gurdeep

    2017-07-01

Turbidity is an important quality parameter of water from its optical property point of view. It varies spatio-temporally over large waterbodies, and its well-distributed measurement in the field is tedious and time consuming. Generally, normalized difference turbidity index (NDTI), band ratio, or regression analysis between turbidity concentration and band reflectance approaches have been adopted to retrieve turbidity using multispectral remote sensing data. These techniques usually provide qualitative rather than quantitative estimates of turbidity. In the present study, however, spectral similarity analysis between spaceborne hyperspectral remote sensing data and a spectral library generated in the field was carried out to quantify turbidity in part of Chilika Lake, Odisha, India. The spatial spectral contextual image analysis technique of the spectral angle mapper (SAM) was evaluated for this purpose. The SAM spectral matching technique has been widely used in geological applications (mineral mapping); however, the application of such techniques in water quality studies has been limited by the non-availability of reference spectral libraries. A spectral library was generated in the field for different concentrations of turbidity using well-calibrated instruments such as a field spectro-radiometer, a turbidity meter and a hand-held global positioning system. The field spectra were classified into 7 classes of turbidity concentration (<5, 5-10, 10-15, 15-25, 25-45, 45-100 and >100 NTU) for analysis. The analysis reveals that at each location in the lake under consideration, the field spectra matched the image spectra with a SAM score of 0.8 or more. The observed turbidity at each location also fell within the estimated turbidity class range. It was observed that the spectral similarity approach provides a more quantitative estimate of turbidity than NDTI.
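The spectral angle mapper itself is compact: the angle between a pixel spectrum and each library spectrum, with the smallest angle deciding the class. The four-band spectra and class labels below are made-up stand-ins for a field spectral library, kept tiny for illustration.

```python
import numpy as np

def spectral_angle(a, b) -> float:
    """Angle (radians) between two spectra; smaller means more similar."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    cosang = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Hypothetical reference library: turbidity class -> field reflectance spectrum
library = {
    "<5 NTU":    [0.02, 0.05, 0.04, 0.02],
    "5-10 NTU":  [0.04, 0.08, 0.07, 0.04],
    "25-45 NTU": [0.10, 0.16, 0.18, 0.12],
}

pixel = [0.05, 0.09, 0.08, 0.05]            # observed image spectrum (invented)
best = min(library, key=lambda k: spectral_angle(pixel, library[k]))
print(best)                                  # 5-10 NTU
```

Because the angle ignores overall brightness, SAM matches spectral shape rather than magnitude, which is why a field library collected under different illumination can still classify image pixels.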

  17. A Hands-On Approach to Understanding Topographic Maps and Their Construction.

    ERIC Educational Resources Information Center

    Bart, Henry Anthony

    1991-01-01

    Describes a topographic map exercise designed for lab session of two to three hours in an introductory geology course. Students are taught the basic principles of topographic map construction and are then required to make a map of a section of campus. Author claims the approach has improved student test performance and resulted in a deeper…

  19. Improved Omnidirectional Odometry for a View-Based Mapping Approach

    PubMed Central

    Valiente, David; Gil, Arturo; Reinoso, Óscar; Juliá, Miguel; Holloway, Mathew

    2017-01-01

This work presents an improved visual odometry using omnidirectional images. The main purpose is to generate a reliable prior input which enhances the SLAM (Simultaneous Localization and Mapping) estimation tasks within the framework of navigation in mobile robotics, in place of the internal odometry data. Standard SLAM approaches generally use such data as the main prior input to localize the robot, and tend to consider sensory data acquired with GPSs, lasers or digital cameras as the most commonly acknowledged inputs for re-estimating the solution. Nonetheless, the modeling of the main prior is crucial, and sometimes especially challenging when it comes to non-systematic error terms, such as those associated with the internal odometer, which can ultimately prove considerably injurious and compromise the convergence of the system. This omnidirectional odometry relies on an adaptive feature point matching through the propagation of the current uncertainty of the system. Ultimately, it is fused as the main prior input in an EKF (Extended Kalman Filter) view-based SLAM system, together with the adaptation of the epipolar constraint to the omnidirectional geometry. Several improvements have been added to the initial visual odometry proposal so as to produce better performance. We present real data experiments to test the validity of the proposal and to demonstrate its benefits, in contrast to the internal odometry. Furthermore, SLAM results are included to assess its robustness and accuracy when using the proposed prior omnidirectional odometry. PMID:28208766

  20. Current Approaches Toward Quantitative Mapping of the Interactome

    PubMed Central

    Buntru, Alexander; Trepte, Philipp; Klockmeier, Konrad; Schnoegl, Sigrid; Wanker, Erich E.

    2016-01-01

    Protein–protein interactions (PPIs) play a key role in many, if not all, cellular processes. Disease is often caused by perturbation of PPIs, as recently indicated by studies of missense mutations. To understand the associations of proteins and to unravel the global picture of PPIs in the cell, different experimental detection techniques for PPIs have been established. Genetic and biochemical methods such as the yeast two-hybrid system or affinity purification-based approaches are well suited to high-throughput, proteome-wide screening and are mainly used to obtain qualitative results. However, they have been criticized for not reflecting the cellular situation or the dynamic nature of PPIs. In this review, we provide an overview of various genetic methods that go beyond qualitative detection and allow quantitative measuring of PPIs in mammalian cells, such as dual luminescence-based co-immunoprecipitation, Förster resonance energy transfer or luminescence-based mammalian interactome mapping with bait control. We discuss the strengths and weaknesses of different techniques and their potential applications in biomedical research. PMID:27200083

  1. A Hierarchical and Distributed Approach for Mapping Large Applications to Heterogeneous Grids using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Sanyal, Soumya; Jain, Amit; Das, Sajal K.; Biswas, Rupak

    2003-01-01

    In this paper, we propose a distributed approach for mapping a single large application to a heterogeneous grid environment. To minimize the execution time of the parallel application, we distribute the mapping overhead to the available nodes of the grid. This approach not only provides a fast mapping of tasks to resources but is also scalable. We adopt a hierarchical grid model and accomplish the job of mapping tasks to this topology using a scheduler tree. Results show that our three-phase algorithm provides high quality mappings, and is fast and scalable.
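
A genetic-algorithm mapping of tasks to heterogeneous resources can be sketched as below (toy task costs, node speeds, and GA parameters are invented for illustration; the paper's hierarchical scheduler tree and distributed execution are not modeled):

```python
import random

# Toy GA for mapping tasks to heterogeneous nodes, minimizing makespan.
random.seed(0)
TASKS = [4, 2, 7, 1, 3, 6, 5, 2]      # task costs (arbitrary units)
NODES = [1.0, 1.5, 2.0]               # relative node speeds

def makespan(mapping):
    load = [0.0] * len(NODES)
    for task, node in zip(TASKS, mapping):
        load[node] += task / NODES[node]
    return max(load)

def evolve(pop_size=30, gens=50):
    pop = [[random.randrange(len(NODES)) for _ in TASKS] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=makespan)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(TASKS))
            child = a[:cut] + b[cut:]             # one-point crossover
            if random.random() < 0.2:             # mutation
                child[random.randrange(len(TASKS))] = random.randrange(len(NODES))
            children.append(child)
        pop = survivors + children
    return min(pop, key=makespan)

best = evolve()
```

Distributing this search, as the paper proposes, amounts to running such populations on many nodes and exchanging the fittest mappings through the scheduler hierarchy.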

  2. Higgs Boson 2016

    SciTech Connect

    Lincoln, Don

    2016-11-16

    The Higgs boson burst into the public arena on July 4, 2012, when scientists working at the CERN laboratory announced the particle’s discovery. However the initial discovery was a bit tentative, with the need to verify that the discovered particle was, indeed, the Higgs boson. In this video, Fermilab’s Dr. Don Lincoln looks at the data from the perspective of 2016 and shows that more recent analyses further supports the idea that the Higgs boson is what was discovered.

  3. Mapping dominant runoff processes: an evaluation of different approaches using similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, Manuel; Buss, Rahel; Scherrer, Simon; Margreth, Michael; Zappa, Massimiliano

    2016-07-01

    The identification of landscapes with similar hydrological behaviour is useful for runoff and flood predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP-mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexities of automatic DRP-mapping approaches affect hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison, and a deviation map were derived. The automatically derived DRP maps were used in synthetic runoff simulations with an adapted version of the PREVAH hydrological model, and simulation results compared with those from simulations using the reference maps. The DRP maps derived with the automatic approach with highest complexity and data requirement were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking. Furthermore, we argue not to use…
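
Agreement between an automatically derived categorical map and a reference map can be quantified with measures such as Cohen's kappa over per-cell class labels, as sketched here (a toy 5×5 example with three classes; the actual DRP classes and agreement measures used in the paper may differ):

```python
import numpy as np

# Cohen's kappa between two categorical raster maps (toy example).
def kappa(map_a, map_b, n_classes):
    a, b = map_a.ravel(), map_b.ravel()
    conf = np.zeros((n_classes, n_classes))
    for i, j in zip(a, b):
        conf[i, j] += 1
    conf /= conf.sum()
    po = np.trace(conf)                  # observed agreement
    pe = conf.sum(0) @ conf.sum(1)       # agreement expected by chance
    return (po - pe) / (1 - pe)

rng = np.random.default_rng(1)
ref = rng.integers(0, 3, (5, 5))         # "manual reference" map
auto = ref.copy()
auto[0, :2] = (auto[0, :2] + 1) % 3      # automatic map disagrees in two cells
score = kappa(ref, auto, 3)
```

A deviation map, as used in the study, is then simply the Boolean raster `ref != auto` showing where the two classifications differ.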

  4. Mapping dominant runoff processes: an evaluation of different approaches using similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, M.; Buss, R.; Scherrer, S.; Margreth, M.; Zappa, M.

    2015-12-01

    The identification of landscapes with similar hydrological behaviour is useful for runoff predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexity of automatic DRP mapping approaches affects hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison and a deviation map were derived. The automatically derived DRP-maps were used in synthetic runoff simulations with an adapted version of the hydrological model PREVAH, and simulation results compared with those from simulations using the reference maps. The DRP-maps derived with the automatic approach with highest complexity and data requirement were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP-maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking. We therefore recommend not only using expert…

  5. Quantitative Architectural Analysis: A New Approach to Cortical Mapping

    ERIC Educational Resources Information Center

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-01-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological…

  7. Exploiting Surroundedness for Saliency Detection: A Boolean Map Approach.

    PubMed

    Zhang, Jianming; Sclaroff, Stan

    2016-05-01

    We demonstrate the usefulness of surroundedness for eye fixation prediction by proposing a Boolean Map based Saliency model (BMS). In our formulation, an image is characterized by a set of binary images, which are generated by randomly thresholding the image's feature maps in a whitened feature space. Based on a Gestalt principle of figure-ground segregation, BMS computes a saliency map by discovering surrounded regions via topological analysis of Boolean maps. Furthermore, we draw a connection between BMS and the Minimum Barrier Distance to provide insight into why and how BMS can properly capture the surroundedness cue via Boolean maps. The strength of BMS is verified by its simplicity, efficiency, and superior performance compared with 10 state-of-the-art methods on seven eye-tracking benchmark datasets.
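
The surroundedness computation at the heart of BMS can be sketched for a single Boolean map: threshold a feature channel, then mark the regions that cannot be reached from the image border (a simplified single-channel sketch; the full model whitens multi-channel features and averages over many randomly thresholded Boolean maps):

```python
import numpy as np
from collections import deque

# Mark cells whose connected region does not touch the image border.
def surrounded(boolean_map):
    h, w = boolean_map.shape
    reach = np.zeros((h, w), bool)       # cells whose region touches the border
    q = deque()
    for y in range(h):
        for x in range(w):
            if y in (0, h - 1) or x in (0, w - 1):
                reach[y, x] = True
                q.append((y, x))
    while q:                              # flood fill within same-value regions
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not reach[ny, nx] \
                    and boolean_map[ny, nx] == boolean_map[y, x]:
                reach[ny, nx] = True
                q.append((ny, nx))
    return ~reach                         # surrounded = unreachable from border

feat = np.zeros((7, 7))
feat[2:5, 2:5] = 1.0                      # bright blob enclosed by background
bmap = feat > 0.5                         # one Boolean map at threshold 0.5
sal = surrounded(bmap).astype(float)      # blob is surrounded, hence salient
```

In the full model, such maps are computed for many thresholds and channels and averaged into the final saliency map.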

  8. Degradability of Bosonic Gaussian channels

    SciTech Connect

    Caruso, Filippo; Giovannetti, Vittorio

    2006-12-15

    The notion of weak degradability of quantum channels is introduced by generalizing the degradability definition given by Devetak and Shor. Exploiting the unitary equivalence with beam-splitter/amplifier channels, we then prove that a large class of one-mode bosonic Gaussian channels is either weakly degradable or anti-degradable. In the latter case, this implies that their quantum capacity Q is null. In the former case, this allows us to establish the additivity of the coherent information for those maps which admit a unitary representation with a single-mode pure environment.

  9. A high-density, multi-parental SNP genetic map on apple validates a new mapping approach for outcrossing species

    PubMed Central

    Di Pierro, Erica A; Gianfranceschi, Luca; Di Guardo, Mario; Koehorst-van Putten, Herma JJ; Kruisselbrink, Johannes W; Longhi, Sara; Troggio, Michela; Bianco, Luca; Muranty, Hélène; Pagliarani, Giulia; Tartarini, Stefano; Letschka, Thomas; Lozano Luis, Lidia; Garkava-Gustavsson, Larisa; Micheletti, Diego; Bink, Marco CAM; Voorrips, Roeland E; Aziz, Ebrahimi; Velasco, Riccardo; Laurens, François; van de Weg, W Eric

    2016-01-01

    Quantitative trait loci (QTL) mapping approaches rely on the correct ordering of molecular markers along the chromosomes, which can be obtained from genetic linkage maps or a reference genome sequence. For apple (Malus domestica Borkh), the genome sequence v1 and v2 could not meet this need; therefore, a novel approach was devised to develop a dense genetic linkage map, providing the most reliable marker-loci order for the highest possible number of markers. The approach was based on four strategies: (i) the use of multiple full-sib families, (ii) the reduction of missing information through the use of HaploBlocks and alternative calling procedures for single-nucleotide polymorphism (SNP) markers, (iii) the construction of a single backcross-type data set including all families, and (iv) a two-step map generation procedure based on the sequential inclusion of markers. The map comprises 15 417 SNP markers, clustered in 3 K HaploBlock markers spanning 1 267 cM, with an average distance between adjacent markers of 0.37 cM and a maximum distance of 3.29 cM. Moreover, chromosome 5 was oriented according to its homoeologous chromosome 10. This map was useful to improve the apple genome sequence, design the Axiom Apple 480 K SNP array and perform multifamily-based QTL studies. Its collinearity with the genome sequences v1 and v3 are reported. To our knowledge, this is the shortest published SNP map in apple, while including the largest number of markers, families and individuals. This result validates our methodology, proving its value for the construction of integrated linkage maps for any outbreeding species. PMID:27917289

  10. Single point vs. mapping approach for spectral cytopathology (SCP).

    PubMed

    Schubert, Jennifer M; Mazur, Antonella I; Bird, Benjamin; Miljković, Milos; Diem, Max

    2010-08-01

    In this paper we describe the advantages of collecting infrared microspectral data in imaging mode as opposed to point mode. Imaging data are processed using the PapMap algorithm, which co-adds pixel spectra that have been scrutinized for R-Mie scattering effects as well as other constraints. The signal-to-noise quality of PapMap spectra will be compared to that of point spectra for oral mucosa cells deposited onto low-e slides. The effects of software atmospheric correction will also be discussed. Combined with the PapMap algorithm, data collection in imaging mode proves to be a superior method for spectral cytopathology.
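
The signal-to-noise benefit of co-adding pixel spectra, as the PapMap algorithm does, follows from averaging statistics: the noise of N independent spectra falls roughly as sqrt(N). A small numerical illustration (the synthetic "spectrum" and noise level are invented for this sketch):

```python
import numpy as np

# Co-adding N noisy copies of a spectrum improves SNR by ~sqrt(N).
rng = np.random.default_rng(0)
true = np.sin(np.linspace(0, 6, 500))          # stand-in "spectrum"
noisy = true + rng.normal(0, 0.5, (64, 500))   # 64 pixel spectra of one cell

def snr(spec):
    return true.std() / (spec - true).std()

single = snr(noisy[0])            # SNR of one pixel spectrum
coadded = snr(noisy.mean(axis=0)) # SNR after co-adding 64 spectra, ~8x higher
```

This is why imaging-mode acquisition, which yields many pixel spectra per cell for co-addition, outperforms single-point collection.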

  11. NEW APPROACHES: Using concept maps with trainee physics teachers

    NASA Astrophysics Data System (ADS)

    Adamczyk, Peter; Willson, Mike

    1996-11-01

    The technique of Concept Mapping described here is useful for identifying gaps in trainee teachers' knowledge, which may then be addressed to help those who must nowadays teach Science outside their own specialism.

  12. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    PubMed

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

    Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized; one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and understanding complex relationships.
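
The dynamics underlying a fuzzy cognitive map can be sketched as repeated application of a signed concept-to-concept weight matrix followed by a sigmoid squashing function (the concepts and weights below are toy values for illustration, not the stakeholder maps elicited in the study):

```python
import numpy as np

# Toy fuzzy cognitive map: W[i, j] is the signed influence of concept i on j.
W = np.array([
    [0.0,  0.6,  0.4],    # "governance" boosts infrastructure and environment
    [0.0,  0.0,  0.5],    # "infrastructure" boosts environment
    [0.0, -0.3,  0.0],    # "environment" pressure dampens infrastructure
])

def iterate(state, steps=50):
    """Run the FCM update until (approximate) convergence."""
    for _ in range(steps):
        state = 1 / (1 + np.exp(-(state + state @ W)))   # sigmoid squashing
    return state

final = iterate(np.array([1.0, 0.5, 0.5]))   # steady-state activation levels
```

Comparing steady states under different initial activations is how such maps are used to explore the effect of prioritising one management issue over another.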

  13. Transboundary aquifer mapping and management in Africa: a harmonised approach

    NASA Astrophysics Data System (ADS)

    Altchenko, Yvan; Villholth, Karen G.

    2013-11-01

    Recent attention to transboundary aquifers (TBAs) in Africa reflects the growing importance of these resources for development in the continent. However, relatively little research on these aquifers and their best management strategies has been published. This report recapitulates progress on mapping and management frameworks for TBAs in Africa. The world map on transboundary aquifers presented at the 6th World Water Forum in 2012 identified 71 TBA systems in Africa. This report presents an updated African TBA map including 80 shared aquifers and aquifer systems superimposed on 63 international river basins. Furthermore, it proposes a new nomenclature for the mapping based on three sub-regions, reflecting the leading regional development communities. The map shows that TBAs underlie approximately 42% of the continental area and 30% of the population. Finally, a brief review of current international law, specific bi- or multilateral treaties, and TBA management practice in Africa reveals few documented international conflicts over TBAs. The existing or upcoming international river and lake basin organisations offer a harmonised institutional base for TBA management, while alternative or supportive models involving the regional development communities are also required. The proposed map and geographical classification scheme for TBAs facilitates identification of options for joint institutional setups.

  14. Four-lepton LHC events from MSSM Higgs boson decays into neutralino and chargino pairs

    NASA Astrophysics Data System (ADS)

    Bisset, Mike; Li, Jun; Kersting, Nick; Lu, Ran; Moortgat, Filip; Moretti, Stefano

    2009-08-01

    Heavy neutral Higgs boson production and decay into neutralino and chargino pairs is studied at the Large Hadron Collider in the context of the minimal supersymmetric standard model (MSSM). Higgs boson decays into the heavier neutralino and chargino states, i.e. H⁰, A⁰ → χ̃₂⁰χ̃₃⁰, χ̃₂⁰χ̃₄⁰, χ̃₃⁰χ̃₃⁰, χ̃₃⁰χ̃₄⁰, χ̃₄⁰χ̃₄⁰, as well as H⁰, A⁰ → χ̃₁±χ̃₂∓ and χ̃₂⁺χ̃₂⁻ (all leading to four-lepton plus missing transverse energy final states), are found to improve the prospects of discovering such Higgs states beyond those previously identified by considering H⁰, A⁰ → χ̃₂⁰χ̃₂⁰ decays only. In particular, H⁰, A⁰ bosons with quite heavy masses, approaching ~800 GeV in the so-called 'decoupling region' where no clear SM signatures for the heavier MSSM Higgs bosons are known to exist, can now be discerned, for suitable but not particularly restrictive configurations of the low-energy supersymmetric parameters. The high-M_A discovery reach for the H⁰ and A⁰ may thus be greatly extended. Full event-generator-level simulations, including realistic detector effects and analyses of all significant backgrounds, are performed to delineate the potential H⁰, A⁰ discovery regions. The wedgebox-plot technique is also utilized to further analyze the four-lepton plus missing transverse energy signal and background events. This study marks the first thorough and reasonably complete analysis of this important class of MSSM Higgs boson signature modes; indeed, it is the first time that discovery regions including all possible neutralino and chargino decay modes of the Higgs bosons have been mapped out.

  15. Minimally symmetric Higgs boson

    SciTech Connect

    Low, Ian

    2015-06-17

    Models addressing the naturalness of a light Higgs boson typically employ symmetries, either bosonic or fermionic, to stabilize the Higgs mass. We consider a setup with the minimal amount of symmetries: four shift symmetries acting on the four components of the Higgs doublet, subject to the constraints of linearly realized SU(2)_L × U(1)_Y electroweak symmetry. Up to terms that explicitly violate the shift symmetries, the effective Lagrangian can be derived, irrespective of the spontaneously broken group G in the ultraviolet, and is universal among all models where the Higgs arises as a pseudo-Nambu-Goldstone boson. Very high energy scatterings of vector bosons could provide smoking gun signals of a minimally symmetric Higgs boson.

  16. Two-dimensional thermofield bosonization

    SciTech Connect

    Amaral, R.L.P.G.

    2005-12-15

    The main objective of this paper was to obtain an operator realization for the bosonization of fermions in 1 + 1 dimensions at finite, non-zero temperature T. This is achieved in the framework of the real-time formalism of Thermofield Dynamics. Formally, the results parallel those of the T = 0 case. The well-known two-dimensional fermion-boson correspondences at zero temperature are shown to hold also at finite temperature. To emphasize the usefulness of the operator realization for handling a large class of two-dimensional quantum field-theoretic problems, we contrast this global approach with the cumbersome calculation of the fermion-current two-point function in the imaginary-time and real-time formalisms. The calculations also illustrate the very different ways in which the transmutation from Fermi-Dirac to Bose-Einstein statistics is realized.

  17. Supersymmetric Higgs Bosons in Weak Boson Fusion

    SciTech Connect

    Hollik, Wolfgang; Plehn, Tilman; Rauch, Michael; Rzehak, Heidi

    2009-03-06

    We compute the complete supersymmetric next-to-leading-order corrections to the production of a light Higgs boson in weak-boson fusion. The size of the electroweak corrections is of similar order as the next-to-leading-order corrections in the standard model. The supersymmetric QCD corrections turn out to be significantly smaller than expected and than their electroweak counterparts. These corrections are an important ingredient to a precision analysis of the (supersymmetric) Higgs sector at the LHC, either as a known correction factor or as a contribution to the theory error.

  18. Atom-atom correlations in time-of-flight imaging of ultracold bosons in optical lattices

    SciTech Connect

    Zaleski, T. A.; Kopec, T. K.

    2011-11-15

    We study the spatial correlations of strongly interacting bosons in a ground state, confined in a two-dimensional square and a three-dimensional cubic lattice. Using the combined Bogoliubov method and the quantum rotor approach, we map the Hamiltonian of strongly interacting bosons onto U(1) phase action in order to calculate the atom-atom correlations' decay along the principal axis and a diagonal of the lattice-plane direction as a function of distance. Lower tunneling rates lead to quicker decays of the correlations, whose character becomes exponential. Finally, correlation functions allow us to calculate quantities that are directly bound to experimental outcomes, namely time-of-flight absorption images and resulting visibility. Our results contain all the characteristic features present in experimental data (transition from Mott insulating blob to superfluid peaks, etc.), emphasizing the usability of the proposed approach.

  19. Equivalence between spin Hamiltonians and boson sampling

    NASA Astrophysics Data System (ADS)

    Peropadre, Borja; Aspuru-Guzik, Alán; García-Ripoll, Juan José

    2017-03-01

    Aaronson and Arkhipov showed that predicting or reproducing the measurement statistics of a general linear optics circuit with a single Fock-state input is a classically hard problem. Here we show that this problem, known as boson sampling, is as hard as simulating the short-time evolution of a large but simple spin model with long-range XY interactions. The conditions for this equivalence are the same for efficient boson sampling, namely, having a small number of photons (excitations) as compared to the number of modes (spins). This mapping allows efficient implementations of boson sampling in small quantum computers and simulators and sheds light on the complexity of time evolution with critical spin models.
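
The classical hardness referenced here comes from the fact that boson-sampling amplitudes are permanents of submatrices of the interferometer unitary, and the permanent is #P-hard in general. For small matrices it can still be computed exactly, e.g. with Ryser's formula, as sketched below:

```python
from itertools import combinations

# Ryser's formula: perm(M) = (-1)^n * sum over non-empty column subsets S of
# (-1)^|S| * prod_i (sum_{j in S} M[i][j]).  Cost O(2^n * n^2).
def permanent(M):
    n = len(M)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in M:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
p = permanent(I3)    # permanent of the identity is 1
```

The exponential cost of this exact computation, versus the short time needed to evolve the equivalent XY spin model on a quantum simulator, is precisely the gap the paper exploits.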

  20. Bosonic self-energy functional theory

    NASA Astrophysics Data System (ADS)

    Hügel, Dario; Werner, Philipp; Pollet, Lode; Strand, Hugo U. R.

    2016-11-01

    We derive the self-energy functional theory for bosonic lattice systems with broken U(1) symmetry by parametrizing the bosonic Baym-Kadanoff effective action in terms of one- and two-point self-energies. The formalism goes beyond other approximate methods such as the pseudoparticle variational cluster approximation, the cluster composite boson mapping, and the Bogoliubov+U theory. It simplifies to bosonic dynamical-mean-field theory when constraining to local fields, whereas when neglecting kinetic contributions of noncondensed bosons, it reduces to the static mean-field approximation. To benchmark the theory, we study the Bose-Hubbard model on the two- and three-dimensional cubic lattice, comparing with exact results from path integral quantum Monte Carlo. We also study the frustrated square lattice with next-nearest-neighbor hopping, which is beyond the reach of Monte Carlo simulations. A reference system comprising a single bosonic state, corresponding to three variational parameters, is sufficient to quantitatively describe phase boundaries and thermodynamical observables, while qualitatively capturing the spectral functions, as well as the enhancement of kinetic fluctuations in the frustrated case. On the basis of these findings, we propose self-energy functional theory as the omnibus framework for treating bosonic lattice models, in particular, in cases where path integral quantum Monte Carlo methods suffer from severe sign problems (e.g., in the presence of nontrivial gauge fields or frustration). Self-energy functional theory enables the construction of diagrammatically sound approximations that are quantitatively precise and controlled in the number of optimization parameters but nevertheless remain computable by modest means.

  1. Partnering with Youth to Map Their Neighborhood Environments: A Multi-Layered GIS Approach

    PubMed Central

    Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T.; Raleigh, Kevin; Miller-Francis, Jenni

    2014-01-01

    Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multi-layered approach for gaining local knowledge of neighborhood environments that engages youth as co-researchers and active knowledge producers. By integrating geographic information systems (GIS) with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youth. Youth report safety and a lack of recreational resources as inhibiting physical activity. Maps reflecting youth perceptions aid policy-makers in making place-based improvements for youth neighborhood environments. PMID:25423245

  2. Eulerian Mapping Closure Approach for Probability Density Function of Concentration in Shear Flows

    NASA Technical Reports Server (NTRS)

    He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Eulerian mapping closure approach is developed for uncertainty propagation in computational fluid mechanics. The approach is used to study the Probability Density Function (PDF) for the concentration of species advected by a random shear flow. An analytical argument shows that fluctuation of the concentration field at one point in space is non-Gaussian and exhibits stretched exponential form. An Eulerian mapping approach provides an appropriate approximation to both convection and diffusion terms and leads to a closed mapping equation. The results obtained describe the evolution of the initial Gaussian field, which is in agreement with direct numerical simulations.
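
The core idea of generating a non-Gaussian scalar PDF by mapping a Gaussian reference field can be illustrated numerically (a hypothetical exponential map is used here purely for illustration; the paper's closure determines the map from the dynamics and yields stretched-exponential behavior):

```python
import numpy as np

# A Gaussian reference field pushed through a nonlinear map X(g)
# acquires a strongly non-Gaussian one-point PDF.
rng = np.random.default_rng(0)
g = rng.normal(size=200_000)      # samples of the Gaussian reference field
X = np.exp(g)                     # mapped "concentration" field (toy map)

def skewness(x):
    x = x - x.mean()
    return (x**3).mean() / (x**2).mean() ** 1.5

gauss_skew = skewness(g)          # near zero: the input is symmetric
mapped_skew = skewness(X)         # strongly positive: the map breaks symmetry
```

In the mapping-closure approach the evolving map plays this role analytically, carrying the initially Gaussian field into the non-Gaussian PDF observed in the simulations.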

  3. Mapping Sustainability Initiatives across a Region: An Innovative Survey Approach

    ERIC Educational Resources Information Center

    Somerville, Margaret; Green, Monica

    2012-01-01

    The project of mapping sustainability initiatives across a region is part of a larger program of research about place and sustainability education for the Anthropocene, the new geological age of human-induced planetary changes (Zalasiewicz, Williams, Steffen, & Crutzen, 2010). The study investigated the location, nature and type of…

  4. A New Approach for Constructing the Concept Map

    ERIC Educational Resources Information Center

    Tseng, Shian-Shyong; Sue, Pei-Chi; Su, Jun-Ming; Weng, Jui-Feng; Tsai, Wen-Nung

    2007-01-01

    In recent years, e-learning system has become more and more popular and many adaptive learning environments have been proposed to offer learners customized courses in accordance with their aptitudes and learning results. For achieving the adaptive learning, a predefined concept map of a course is often used to provide adaptive learning guidance…

  5. The Facebook Influence Model: A Concept Mapping Approach

    PubMed Central

    Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M.

    2013-01-01

    Abstract Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts. PMID:23621717

  6. Cognitions of Expert Supervisors in Academe: A Concept Mapping Approach

    ERIC Educational Resources Information Center

    Kemer, Gülsah; Borders, L. DiAnne; Willse, John

    2014-01-01

    Eighteen expert supervisors reported their thoughts while preparing for, conducting, and evaluating their supervision sessions. Concept mapping (Kane & Trochim, [Kane, M., 2007]) yielded 195 cognitions classified into 25 cognitive categories organized into 5 supervision areas: conceptualization of supervision, supervisee assessment,…

  10. Computer-Assisted Argument Mapping: A "Rationale" Approach

    ERIC Educational Resources Information Center

    Davies, W. Martin

    2009-01-01

    Computer-Assisted Argument Mapping (CAAM) is a new way of understanding arguments. While still embryonic in its development and application, CAAM is being used increasingly as a training and development tool in the professions and government. Inroads are also being made in its application within education. CAAM claims to be helpful in an…

  11. Approaches to Mapping Nitrogen Removal: Examples at a Landscape Scale

    EPA Science Inventory

    Wetlands can provide the ecosystem service of improved water quality via nitrogen removal, providing clean drinking water and reducing the eutrophication of aquatic resources. Within the ESRP, mapping nitrogen removal by wetlands is a service that incorporates the goals of the ni...

  12. A National Approach to Map and Quantify Terrestrial Vertebrate Biodiversity

    EPA Science Inventory

    Biodiversity is crucial for the functioning of ecosystems and the products and services from which we transform natural assets of the Earth for human survival, security, and well-being. The ability to assess, report, map, and forecast the life support functions of ecosystems is a...

  13. Raman mapping of oral buccal mucosa: a spectral histopathology approach

    NASA Astrophysics Data System (ADS)

    Behl, Isha; Kukreja, Lekha; Deshmukh, Atul; Singh, S. P.; Mamgain, Hitesh; Hole, Arti R.; Krishna, C. Murali

    2014-12-01

    Oral cancer is one of the most common cancers worldwide. One-fifth of the world's oral cancer subjects are from India and other South Asian countries. The present Raman mapping study was carried out to understand biochemical variations in normal and malignant oral buccal mucosa. Data were acquired using a WITec alpha 300R instrument from 10 normal and 10 tumor unstained tissue sections. Raman maps of normal sections could resolve the layers of epithelium, i.e. basal, intermediate, and superficial. Inflammatory, tumor, and stromal regions are distinctly depicted on Raman maps of tumor sections. Mean and difference spectra of basal and inflammatory cells suggest abundance of DNA and carotenoids features. Strong cytochrome bands are observed in intermediate layers of normal and stromal regions of tumor. Epithelium and stromal regions of normal cells are classified by principal component analysis. Classification among cellular components of normal and tumor sections is also observed. Thus, the findings of the study further support the applicability of Raman mapping for providing molecular level insights in normal and malignant conditions.

  14. The Facebook influence model: a concept mapping approach.

    PubMed

    Moreno, Megan A; Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M

    2013-07-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts.

  16. Computational approaches and software tools for genetic linkage map estimation in plants.

    PubMed

    Cheema, Jitender; Dicks, Jo

    2009-11-01

    Genetic maps are an important component within the plant biologist's toolkit, underpinning crop plant improvement programs. The estimation of plant genetic maps is a conceptually simple yet computationally complex problem, growing ever more so with the development of inexpensive, high-throughput DNA markers. The challenge for bioinformaticians is to develop analytical methods and accompanying software tools that can cope with datasets of differing sizes, from tens to thousands of markers, that can incorporate the expert knowledge that plant biologists typically use when developing their maps, and that facilitate user-friendly approaches to achieving these goals. Here, we aim to give a flavour of computational approaches for genetic map estimation, discussing briefly many of the key concepts involved, and describing a selection of software tools that employ them. This review is intended both for plant geneticists as an introduction to software tools with which to estimate genetic maps, and for bioinformaticians as an introduction to the underlying computational approaches.

  17. Higgs Boson 2016

    ScienceCinema

    Lincoln, Don

    2016-12-14

    The Higgs boson burst into the public arena on July 4, 2012, when scientists working at the CERN laboratory announced the particle’s discovery. However the initial discovery was a bit tentative, with the need to verify that the discovered particle was, indeed, the Higgs boson. In this video, Fermilab’s Dr. Don Lincoln looks at the data from the perspective of 2016 and shows that more recent analyses further support the idea that the Higgs boson is what was discovered.

  18. A Time Sequence-Oriented Concept Map Approach to Developing Educational Computer Games for History Courses

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Yang, Kai-Hsiang; Chen, Jing-Hong

    2015-01-01

    Concept maps have been recognized as an effective tool for students to organize their knowledge; however, in history courses, it is important for students to learn and organize historical events according to the time of their occurrence. Therefore, in this study, a time sequence-oriented concept map approach is proposed for developing a game-based…

  19. Concept Map Engineering: Methods and Tools Based on the Semantic Relation Approach

    ERIC Educational Resources Information Center

    Kim, Minkyu

    2013-01-01

    The purpose of this study is to develop a better understanding of technologies that use natural language as the basis for concept map construction. In particular, this study focuses on the semantic relation (SR) approach to drawing rich and authentic concept maps that reflect students' internal representations of a problem situation. The…

  2. New approach to strip-map SAR autofocus

    SciTech Connect

    Wahl, D.; Jakowatz, C.; Thompson, P.; Ghiglia, D.

    1994-05-01

    Means for removing phase errors induced in spotlight mode synthetic aperture radar (SAR) imagery are now well-established. The Phase Gradient Autofocus (PGA) algorithm has been shown to be robust over a wide range of spotlight mode imagery and phase error functions. These phase errors could have their origin either in uncompensated platform motion or random propagation delays incurred, for example, from tropospheric turbulence. The PGA technique, however, cannot be directly applied to imagery formed in the conventional strip-mapping mode. In this paper we show that if the fundamental ideas of PGA are modified in an appropriate way, the phase errors in strip-map imagery can be effectively estimated and compensated.
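
    The core PGA steps named above — center-shifting dominant scatterers, returning to the phase-history domain, estimating the common phase gradient, and integrating it — can be sketched on synthetic 1D data. This is a minimal toy (single iteration, no windowing, hypothetical scene parameters), not the authors' strip-map modification.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pulses, n_range = 128, 64
k = np.arange(n_pulses)

# Common azimuth phase error (random walk) shared by every range line.
phase_err = np.cumsum(0.2 * rng.standard_normal(n_pulses))

# Range-compressed image: one strong scatterer per range line plus weak clutter,
# defocused by applying the phase error in the phase-history (azimuth-FFT) domain.
img = np.zeros((n_range, n_pulses), complex)
for r in range(n_range):
    sig = np.zeros(n_pulses, complex)
    sig[rng.integers(n_pulses)] = 10.0
    sig += 0.1 * (rng.standard_normal(n_pulses) + 1j * rng.standard_normal(n_pulses))
    img[r] = np.fft.ifft(np.fft.fft(sig) * np.exp(1j * phase_err))

def detrend(p):
    a, b = np.polyfit(k, p, 1)
    return p - (a * k + b)

# One PGA iteration: shift the brightest sample of each line to a common
# position (removing each scatterer's linear phase), go back to the
# phase-history domain, estimate the shared phase gradient, and integrate.
centered = np.array([np.roll(row, -np.argmax(np.abs(row))) for row in img])
G = np.fft.fft(centered, axis=1)
num = np.sum(np.imag(np.conj(G[:, :-1]) * np.diff(G, axis=1)), axis=0)
den = np.sum(np.abs(G[:, :-1]) ** 2, axis=0)
est = np.concatenate([[0.0], np.cumsum(num / den)])

# The estimate matches the truth up to a linear ramp (an image shift).
corr = np.corrcoef(detrend(est), detrend(phase_err))[0, 1]
```

    Correcting the data with `exp(-1j * est)` and iterating sharpens the image; the strip-map case of the paper requires modifying how these steps are applied, since no single phase error is shared by the whole scene.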

  3. Comparison of Sub-pixel Classification Approaches for Crop-specific Mapping

    EPA Science Inventory

    The Moderate Resolution Imaging Spectroradiometer (MODIS) data has been increasingly used for crop mapping and other agricultural applications. Phenology-based classification approaches using the NDVI (Normalized Difference Vegetation Index) 16-day composite (250 m) data product...

  5. Comparative Performance Analysis of a Hyper-Temporal Ndvi Analysis Approach and a Landscape-Ecological Mapping Approach

    NASA Astrophysics Data System (ADS)

    Ali, A.; de Bie, C. A. J. M.; Scarrott, R. G.; Ha, N. T. T.; Skidmore, A. K.

    2012-07-01

    Both agricultural area expansion and intensification are necessary to cope with the growing demand for food, and the growing threat of food insecurity which is rapidly engulfing poor and under-privileged sections of the global population. Therefore, it is of paramount importance to have the ability to accurately estimate crop area and spatial distribution. Remote sensing has become a valuable tool for estimating and mapping cropland areas, useful in food security monitoring. This work contributes to addressing this broad issue, focusing on the comparative performance analysis of two mapping approaches (i) a hyper-temporal Normalized Difference Vegetation Index (NDVI) analysis approach and (ii) a Landscape-ecological approach. The hyper-temporal NDVI analysis approach utilized SPOT 10-day NDVI imagery from April 1998-December 2008, whilst the Landscape-ecological approach used multitemporal Landsat-7 ETM+ imagery acquired intermittently between 1992 and 2002. Pixels in the time-series NDVI dataset were clustered using an ISODATA clustering algorithm adapted to determine the optimal number of pixel clusters to successfully generalize hyper-temporal datasets. Clusters were then characterized with crop cycle information, and flooding information to produce an NDVI unit map of rice classes with flood regime and NDVI profile information. A Landscape-ecological map was generated using a combination of digitized homogenous map units in the Landsat-7 ETM+ imagery, a Land use map 2005 of the Mekong delta, and supplementary datasets on the region's terrain, geomorphology and flooding depths. The output maps were validated using reported crop statistics, and regression analyses were used to ascertain the relationship between land use area estimated from maps, and those reported in district crop statistics. The regression analysis showed that the hyper-temporal NDVI analysis approach explained 74% and 76% of the variability in reported crop statistics in two rice crop and three
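
    The clustering step above can be illustrated with a toy stand-in: plain k-means (in place of the adapted ISODATA algorithm the authors used) applied to synthetic 10-day NDVI profiles for single-crop, double-crop, and water/fallow pixels. All profile shapes and parameters here are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(36)  # 36 dekads, roughly one year of 10-day composites

# Synthetic NDVI seasonal profile: a Gaussian greening peak on a 0.2 baseline.
season = lambda peak, width: 0.2 + 0.6 * np.exp(-((t - peak) / width) ** 2)

# 40 pixels each of three hypothetical land-cover classes.
profiles = np.vstack([
    season(10, 4) + 0.03 * rng.standard_normal((40, 36)),                       # single crop
    season(8, 3) + season(24, 3) - 0.2 + 0.03 * rng.standard_normal((40, 36)),  # double crop
    0.05 + 0.03 * rng.standard_normal((40, 36)),                                # water/fallow
])

def kmeans(X, k, iters=20):
    """Plain k-means: a simple stand-in for the adapted ISODATA clustering."""
    centers = X[:: len(X) // k][:k].copy()  # deterministic, spread-out init
    for _ in range(iters):
        # assign each profile to its nearest center (squared Euclidean distance)
        d2 = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d2.argmin(1)
        # recompute centers; keep the old center if a cluster empties
        centers = np.vstack([X[labels == j].mean(0) if (labels == j).any()
                             else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(profiles, 3)
```

    In the real workflow each cluster would then be characterized with crop-cycle and flooding information before being mapped; ISODATA additionally splits and merges clusters to choose the number of classes, which plain k-means does not.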

  6. Effective Boson Number- A New Approach for Predicting Separation Energies with the IBM1, Applied to Zr, Kr, Sr isotopes near A = 100

    NASA Astrophysics Data System (ADS)

    Paul, Nancy; van Isacker, Pieter; García Ramos, José Enrique; Aprahamian, Ani

    2011-10-01

    This work uses effective boson numbers in the Interacting Boson Model (IBM1) to predict two-neutron separation energies for neutron-rich zirconium, strontium, and krypton isotopes. We determine the functional forms of binding energy and excitation energies as a function of boson number for a given choice of IBM parameters that give a good overall description of the experimental spectra of the isotopic chain. The energy of the first excited 2+ level is then used to extract an effective boson number for a given nucleus, which is in turn used to calculate the separation energies. This method accounts for complex interactions among valence nucleons around magic and semi-magic nuclei and successfully predicts the phase transitional signature in separation energies around A=100 for 92-108Zr, 90-104Sr, and 86-96Kr. Supported by the NSF under contract PHY0758100, the Joint Institute for Nuclear Astrophysics grant PHY0822648, University of Notre Dame Nanovic Institute, Glynn Family Honors Program, Center for Undergraduate Scholarly Engagement.
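
    The extraction procedure described above — invert a calculated E(2+_1)-versus-boson-number curve at the measured excitation energy, then evaluate separation energies at the resulting effective boson number — can be sketched numerically. The curves and the measured value below are synthetic placeholders, NOT fitted IBM1 output; only the inversion logic is illustrated.

```python
import numpy as np

# Illustrative only: synthetic smooth curves standing in for IBM1 results.
N = np.arange(2, 13, dtype=float)           # boson numbers
E2_of_N = 1.2 * np.exp(-0.25 * N)           # E(2+_1) vs N (MeV), monotonically falling
be = lambda n: 8.0 * n - 0.15 * n ** 2      # binding-energy trend vs N (MeV)

def effective_boson_number(e2_measured):
    """Invert the monotonic E(2+_1)(N) curve for a (non-integer) effective N."""
    # np.interp requires ascending x, so reverse the falling curve
    return np.interp(e2_measured, E2_of_N[::-1], N[::-1])

def s2n(e2_measured):
    """Two-neutron separation energy at the effective boson number
    (removing a neutron pair removes one boson)."""
    n_eff = effective_boson_number(e2_measured)
    return be(n_eff) - be(n_eff - 1.0)

n_eff = effective_boson_number(0.35)   # hypothetical measured E(2+_1), MeV
sep = s2n(0.35)
```

    With real IBM1 fits, `be` and `E2_of_N` would come from the model calculation for the isotopic chain, and discontinuities in `n_eff` across the chain would carry the phase-transition signature in S2n.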

  7. Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.

    PubMed

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A

    2012-08-01

    To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way for conducting dynamic mapping. Copyright © 2011 Elsevier Inc. All rights reserved.
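
    The term-level exact-match stage and the precision/recall evaluation against an expert gold standard can be sketched as follows. This is a crude stand-in for the MTERMS pipeline, not its implementation; all drug terms and codes below are hypothetical.

```python
import re

def normalize(term):
    """Crude normalization: lowercase, strip punctuation, collapse whitespace.
    A real NLP mapper would also expand abbreviations, reorder tokens, etc."""
    t = re.sub(r"[^\w\s]", " ", term.lower())
    return " ".join(t.split())

# Hypothetical local dictionary terms and RxNorm-like target concepts.
local_terms = ["Aspirin 81 MG Oral Tablet", "acetaminophen 325mg tab",
               "Warfarin 5 MG Tablet"]
target = {"aspirin 81 mg oral tablet": "RX1191",
          "warfarin 5 mg tablet": "RX1135"}

# Term-level exact match after normalization; unmatched terms go to human review.
auto_map = {t: target[normalize(t)] for t in local_terms if normalize(t) in target}

# Evaluate against a (hypothetical) expert-created gold standard mapping.
gold = {"Aspirin 81 MG Oral Tablet": "RX1191",
        "acetaminophen 325mg tab": "RX0198",
        "Warfarin 5 MG Tablet": "RX1135"}
tp = sum(1 for t, c in auto_map.items() if gold.get(t) == c)
precision = tp / len(auto_map)   # of the mappings proposed, how many are right
recall = tp / len(gold)          # of all gold mappings, how many were found
```

    The pattern matches the paper's finding: exact matching after normalization tends toward high precision (few wrong proposals) but lower recall, since variant spellings like "325mg tab" fall through to manual review.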

  8. High-resolution habitat mapping on mud fields: new approach to quantitative mapping of Ocean quahog.

    PubMed

    Isachenko, Artem; Gubanova, Yana; Tzetlin, Alexander; Mokievsky, Vadim

    2014-12-01

    During 2009-2012, stocks of the bivalve Arctica islandica (Linnaeus, 1767) (Ocean quahog) in Kandalaksha Bay (the White Sea) were assessed using side-scan sonar, grab sampling and underwater photo imaging. Structurally uniform localities were highlighted on the basis of the side-scan signal. Each type of signal reflects a combination of sediment type, microtopography and structural characteristics of the benthic community. The distribution of A. islandica was the predominant factor in determining community structure. Seabed attributes considered most significant were defined for each substrate type. Relations between sonar signal and sediment type were used for landscape mapping based on sonar data. Community characteristics at known localities were reliably interpolated to the area of survey using statistical processing of geophysical data. A method of integrated sonar and sampling data interpretation for high-resolution mapping of A. islandica by biomass groups, benthic faunal groups and associated habitats was developed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. A genetic mosaic approach for neural circuit mapping in Drosophila

    PubMed Central

    Bohm, Rudolf A.; Welch, William P.; Goodnight, Lindsey K.; Cox, Logan W.; Henry, Leah G.; Gunter, Tyler C.; Bao, Hong; Zhang, Bing

    2010-01-01

    Transgenic manipulation of subsets of brain cells is increasingly used for studying behaviors and their underlying neural circuits. In Drosophila, the GAL4–upstream activating sequence (UAS) binary system is powerful for gene manipulation, but GAL4 expression is often too broad for fine mapping of neural circuits. Here, we describe the development of unique molecular genetic tools to restrict GAL4 expression patterns. Building on the GAL4-UAS system, our method adds two components: a collection of enhancer-trap recombinase, Flippase (ET-FLP), transgenic lines that provide inheritable, reproducible, and tissue-specific FLP and an FRT-dependent GAL80 “flip-in” construct that converts FLP expression into tissue-specific repression of GAL4 by GAL80. By including a UAS-encoded fluorescent protein, circuit morphology can be simultaneously marked while the circuit function is assessed using another UAS transgene. In a proof-of-principle analysis, we applied this ET-FLP-induced intersectional GAL80/GAL4 repression (FINGR) method to map the neural circuitry underlying fly wing inflation. The FINGR system is versatile and powerful in combination with the vast collection of GAL4 lines for neural circuit mapping as well as for clonal analysis based on the infusion of the yeast-derived FRT/FLP system of mitotic recombination into Drosophila. The strategies and tactics underlying our FINGR system are also applicable to other genetically amenable organisms in which transgenes including the GAL4, UAS, GAL80, and FLP factors can be applied. PMID:20810922

  10. Enhanced vulnerability assessment in karst areas by combining mapping with modeling approaches.

    PubMed

    Butscher, Christoph; Huggenberger, Peter

    2009-01-15

    The objective of this work is to facilitate a sustainable regional planning of water resources in karst areas by providing a conceptual framework for an integrative vulnerability assessment. A combined mapping and modeling approach is proposed, taking into account both spatial and temporal aspects of karst groundwater vulnerability. The conceptual framework comprises the delineation of recharge areas, vulnerability mapping, numerical flow and transport modeling and the integration of information into a combined vulnerability map and time series. The approach is illustrated at a field site in northwest Switzerland (Gempen plateau). The results show that the combination of vulnerability mapping and numerical modeling allows the vulnerability distribution, both in the recharge and discharge areas, to be identified, and at the same time, the time dependence of karst groundwater vulnerability to be assessed. The combined vulnerability map and time series provide a quantitative basis for drinking water management and for regional planning.

  11. An integrated approach for automated cover-type mapping of large inaccessible areas in Alaska

    USGS Publications Warehouse

    Fleming, Michael D.

    1988-01-01

    The lack of any detailed cover type maps in the state necessitated that a rapid and accurate approach be employed to develop maps for 329 million acres of Alaska within a seven-year period. This goal has been addressed by using an integrated approach to computer-aided analysis which combines efficient use of field data with the only consistent statewide spatial data sets available: Landsat multispectral scanner data, digital elevation data derived from 1:250 000-scale maps, and 1:60 000-scale color-infrared aerial photographs.

  12. Concept mapping: a distinctive educational approach to foster critical thinking.

    PubMed

    Taylor, Laura A; Littleton-Kearney, Marguerite

    2011-01-01

    Advanced practice nurses must be able to link interventions to address pathophysiological processes with underlying alterations in normal physiological function to promote safe, effective patient care. Development of creative methods to assist students to make their own connections among healthcare concepts is imperative to create a positive learning environment. The authors discuss the use of concept mapping in conjunction with case-study clinical rounds to maximize critical thinking and greater learning retention among advanced practice nurses in a graduate physiology/pathophysiology course.

  13. Stationkeeping Approach for the Microwave Anisotropy Probe (MAP)

    NASA Technical Reports Server (NTRS)

    Rohrbaugh, Dave; Schiff, Conrad

    2002-01-01

    The Microwave Anisotropy Probe was successfully launched on June 30, 2001 and placed into a Lissajous orbit about the L2 Sun-Earth-Moon libration point. However, the L2 libration point is unstable, which necessitates occasional stationkeeping maneuvers in order to maintain the spacecraft's Lissajous orbit. Analyses were performed in order to develop a feasible L2 stationkeeping strategy for the MAP mission. The resulting strategy meets the allotted fuel budget, allowing for enough fuel to handle additional fuel taxes, while meeting the attitude requirements for the maneuvers. Results from the first two stationkeeping maneuvers are included.

  14. Concept mapping and network analysis: an analytic approach to measure ties among constructs.

    PubMed

    Goldman, Alyssa W; Kane, Mary

    2014-12-01

    Group concept mapping is a mixed-methods approach that helps a group visually represent its ideas on a topic of interest through a series of related maps. The maps and additional graphics are useful for planning, evaluation and theory development. Group concept maps are typically described, interpreted and utilized through points, clusters and distances, and the implications of these features in understanding how constructs relate to one another. This paper focuses on the application of network analysis to group concept mapping to quantify the strength and directionality of relationships among clusters. The authors outline the steps of this analysis, and illustrate its practical use through an organizational strategic planning example. Additional benefits of this analysis to evaluation projects are also discussed, supporting the overall utility of this supplemental technique to the standard concept mapping methodology.

  15. A FISH approach for mapping the human genome using Bacterial Artificial Chromosomes (BACs)

    SciTech Connect

    Hubert, R.S.; Chen, X.N.; Mitchell, S.

    1994-09-01

    As the Human Genome Project progresses, large insert cloning vectors such as BACs, P1, and P1 Artificial Chromosomes (PACs) will be required to complement the YAC mapping efforts. The value of the BAC vector for physical mapping lies in the stability of the inserts, the lack of chimerism, the length of inserts (up to 300 kb), the ability to obtain large amounts of pure clone DNA and the ease of BAC manipulation. These features helped us design two approaches for generating physical mapping reagents for human genetic studies. The first approach is a whole genome strategy in which randomly selected BACs are mapped, using FISH, to specific chromosomal bands. To date, 700 BACs have been mapped to single chromosome bands at a resolution of 2-5 Mb in addition to BACs mapped to 14 different centromeres. These BACs represent more than 90 Mb of the genome and include >70% of all human chromosome bands at the 350-band level. These data revealed that >97% of the BACs were non-chimeric and have a genomic distribution covering most gaps in the existing YAC map with excellent coverage of gene-rich regions. In the second approach, we used YACs to identify BACs on chromosome 21. A 1.5 Mb contig between D21S339 and D21S220 nears completion within the Down syndrome congenital heart disease (DS-CHD) region. Seventeen BACs ranging in size from 80 kb to 240 kb were ordered using 14 STSs with FISH confirmation. We have also used 40 YACs spanning 21q to identify, on average, >1 BAC/Mb to provide molecular cytogenetic reagents and anchor points for further mapping. The contig generated on chromosome 21 will be helpful in isolating the genes for DS-CHD. The physical mapping reagents generated using the whole genome approach will provide cytogenetic markers and mapped genomic fragments that will facilitate positional cloning efforts and the identification of genes within most chromosomal bands.

  16. Fractional-filling loophole insulator domains for ultracold bosons in optical superlattices

    SciTech Connect

    Buonsante, P.; Penna, V.; Vezzani, A.

    2004-12-01

    The zero-temperature phase diagram of a Bose-Einstein condensate confined in realistic one-dimensional l-periodic optical superlattices is investigated. The system of interacting bosons is modeled in terms of a Bose-Hubbard Hamiltonian whose site-dependent local potentials and hopping amplitudes reflect the periodicity of the lattice partition in l-site cells. Relying on the exact mapping between the hardcore limit of the boson Hamiltonian and the model of spinless noninteracting fermions, incompressible insulator domains are shown to exist for rational fillings that are predicted to be compressible in the atomic limit. The corresponding boundaries, qualitatively described in a multiple-site mean-field approach, are shown to exhibit an unusual loophole shape. A more quantitative description of the loophole domain boundaries at half filling for the special case l = 2 is supplied in terms of analytic strong-coupling expansions and quantum Monte Carlo simulations.
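
    The exact hardcore-boson-to-free-fermion mapping invoked above can be checked numerically on a small uniform open chain (a simplification of the l-periodic superlattice, chosen so the Jordan-Wigner string drops out): the full many-body spectrum of the hardcore-boson hopping Hamiltonian coincides with the subset sums of the single-particle fermion energies.

```python
import numpy as np
from itertools import combinations

L, t = 6, 1.0  # small open chain; hardcore constraint = 0/1 occupancy basis

# Hardcore-boson hopping Hamiltonian -t * sum_i (b+_i b_{i+1} + h.c.),
# built in the 2^L occupation-number basis (bit i = occupancy of site i).
dim = 2 ** L
H = np.zeros((dim, dim))
for s in range(dim):
    for i in range(L - 1):
        bi, bj = (s >> i) & 1, (s >> (i + 1)) & 1
        if bi != bj:  # exactly one particle on the bond: it can hop
            H[s ^ (1 << i) ^ (1 << (i + 1)), s] = -t
boson_spectrum = np.sort(np.linalg.eigvalsh(H))

# Free-fermion image: single-particle energies of the same open chain,
# eps_m = -2t cos(m*pi/(L+1)); the many-body spectrum is all subset sums.
eps = -2 * t * np.cos(np.pi * np.arange(1, L + 1) / (L + 1))
fermion_spectrum = np.sort([sum(c) for n in range(L + 1)
                            for c in combinations(eps, n)])

match = np.allclose(boson_spectrum, fermion_spectrum)
```

    The superlattice case of the paper adds site-dependent potentials and hoppings, but the same isomorphism holds, which is what makes the fractional-filling insulator domains tractable.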

  17. Mapping.

    ERIC Educational Resources Information Center

    Kinney, Douglas M.; McIntosh, Willard L.

    1979-01-01

    The area of geological mapping in the United States in 1978 increased greatly over that reported in 1977; state geological maps were added for California, Idaho, Nevada, and Alaska last year. (Author/BB)

  18. Bicomponent Trend Maps: A Multivariate Approach to Visualizing Geographic Time Series

    PubMed Central

    Schroeder, Jonathan P.

    2012-01-01

    The most straightforward approaches to temporal mapping cannot effectively illustrate all potentially significant aspects of spatio-temporal patterns across many regions and times. This paper introduces an alternative approach, bicomponent trend mapping, which employs a combination of principal component analysis and bivariate choropleth mapping to illustrate two distinct dimensions of long-term trend variations. The approach also employs a bicomponent trend matrix, a graphic that illustrates an array of typical trend types corresponding to different combinations of scores on two principal components. This matrix is useful not only as a legend for bicomponent trend maps but also as a general means of visualizing principal components. To demonstrate and assess the new approach, the paper focuses on the task of illustrating population trends from 1950 to 2000 in census tracts throughout major U.S. urban cores. In a single static display, bicomponent trend mapping is not able to depict as wide a variety of trend properties as some other multivariate mapping approaches, but it can make relationships among trend classes easier to interpret, and it offers some unique flexibility in classification that could be particularly useful in an interactive data exploration environment. PMID:23504193
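
    The core computation — principal component analysis of per-region time series, followed by a two-way classification of the first two component scores for bivariate choropleth mapping — can be sketched on synthetic data. The trajectory shapes and the 3x3 tercile classing below are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
n_tracts, n_times = 300, 6   # e.g. six decennial observations, 1950-2000

# Synthetic standardized trajectories: a linear trend component plus a
# curvature component, with noise (rows = tracts, columns = time points).
T = np.linspace(0, 1, n_times)
X = (rng.standard_normal((n_tracts, 1)) * (T - 0.5)                    # growth/decline
     + rng.standard_normal((n_tracts, 1)) * ((T - 0.5) ** 2 - 1 / 12)  # U / inverted-U
     + 0.05 * rng.standard_normal((n_tracts, n_times)))

# PCA via SVD of the column-centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T        # first two principal-component scores per tract

# Bivariate choropleth classes: tercile of PC1 crossed with tercile of PC2,
# giving the 3x3 grid of a bicomponent trend matrix.
def terciles(v):
    q1, q2 = np.quantile(v, [1 / 3, 2 / 3])
    return np.digitize(v, [q1, q2])   # 0, 1, or 2

classes = 3 * terciles(scores[:, 0]) + terciles(scores[:, 1])
```

    Each of the nine classes corresponds to one cell of the bicomponent trend matrix; mapping `classes` with a bivariate color scheme yields the bicomponent trend map.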

  19. Engineering a robotic approach to mapping exposed volcanic fissures

    NASA Astrophysics Data System (ADS)

    Parcheta, C. E.; Parness, A.; Mitchell, K. L.

    2014-12-01

    Field geology provides a framework for advanced computer models and theoretical calculations of volcanic systems. Some field terrains, though, are poorly preserved or accessible, making documentation, quantification, and investigation impossible. Over 200 volcanologists at the 2012 Kona Chapman Conference on volcanology agreed that an important step forward in the field over the next 100 years should address the realistic size and shape of volcanic conduits. The 1969 Mauna Ulu eruption of Kīlauea provides a unique opportunity to document volcanic fissure conduits; thus, we have an ideal location to begin addressing this topic and provide data on these geometries. Exposed fissures can be mapped with robotics using machine vision. In order to test the hypothesis that fissures have irregularities with depth that will influence their fluid dynamical behavior, we must first map the fissure vents and shallow conduit to deci- or centimeter scale. We have designed, constructed, and field-tested the first version of a robotic device that will image an exposed volcanic fissure in three dimensions. The design phase included three steps: 1) create the payload harness and protective shell to prevent damage to the electronics and robot, 2) construct a circuit board to have the electronics communicate with a surface-based computer, and 3) prototype wheel shapes that can handle a variety of volcanic rock textures. The robot's mechanical parts were built using 3d printing, milling, casting and laser cutting techniques, and the electronics were assembled from off the shelf components. The testing phase took place at Mauna Ulu, Kīlauea, Hawai'i, from May 5 - 9, 2014. Many valuable design lessons were learned during the week, and the first ever 3D map from inside a volcanic fissure was successfully collected. Three vents had between 25% and 95% of their internal surfaces imaged. A fourth location, a non-eruptive crack (possibly a fault line) had two transects imaging the textures

  20. A New Approach to Mapping, Visualization, and Morphological Classification of Small Bodies

    NASA Astrophysics Data System (ADS)

    Clark, Pamela E.; Clark, C. S.; Stooke, P. J.

    2008-09-01

    We present a systematic approach to interpreting asteroid shape and surface morphology using the Constant Scale Natural Boundary (CSNB) map projection applied to Deimos, Phobos, Eros, and Ida. With the CSNB projection, the ridges and troughs, 'event horizons' acting as encoders of asteroid history, can be prominently featured as map edges at constant scale. By contrast, simple cylindrical and Mercator maps, although familiar and instantly orientating, produce great distortions, particularly for irregular objects. CSNB projection combines the best features of 3D mosaics and conformal maps, emphasizing highly irregular faceted shape in one view, without distortion, on a flat map. CSNB maps are designed to be conformal for antipodal areas and to preserve proportions in map interiors. For consistency and orientation, we locate the blunt 'nose' in the center of all maps in the equatorial plane, because most asteroids are elongated along the equatorial axis, and the blunt nose is a recognizable feature, but less morphologically complex than the 'sharp' end. The external boundaries then become the ridges connecting 'peaks', which typically run parallel to the equator, and troughs connecting 'basins', which typically separate the promontories. Three maps, two ridge-bound and one trough-bound, exist for each object. Segmented maps show separation of the surface into geodesic 'facets', preserve resolution, and fold to a 3D facsimile of the asteroid. Connected maps are compact and preserve orientation. Morphological parameters manifested in CSNB map shape include E/W and N/S distribution of segments, roughness of boundaries associated with each segment, and aspect ratio for the segmented map. Based on comparison of these parameters, Phobos has considerably greater asymmetry in the E/W and N/S directions, has a higher aspect ratio, and is considerably rougher than Deimos.

  1. Simulating spin-boson models with matrix product states

    NASA Astrophysics Data System (ADS)

    Wall, Michael; Safavi-Naini, Arghavan; Rey, Ana Maria

    2016-05-01

    The global coupling of few-level quantum systems (``spins'') to a discrete set of bosonic modes is a key ingredient for many applications in quantum science, including large-scale entanglement generation, quantum simulation of the dynamics of long-range interacting spin models, and hybrid platforms for force and spin sensing. In many situations, the bosons are integrated out, leading to effective long-range interactions between the spins; however, strong spin-boson coupling invalidates this approach, and spin-boson entanglement degrades the fidelity of quantum simulation of spin models. We present a general numerical method for treating the out-of-equilibrium dynamics of spin-boson systems based on matrix product states. While most efficient for weak coupling or small numbers of boson modes, our method applies for any spatial and operator dependence of the spin-boson coupling. In addition, our approach allows straightforward computation of many quantities of interest, such as the full counting statistics of collective spin measurements and quantum simulation infidelity due to spin-boson entanglement. We apply our method to ongoing trapped ion quantum simulator experiments in analytically intractable regimes. This work is supported by JILA-NSF-PFC-1125844, NSF-PIF- 1211914, ARO, AFOSR, AFOSR-MURI, and the NRC.

  2. Mapping national capacity to engage in health promotion: overview of issues and approaches.

    PubMed

    Mittelmark, Maurice B; Wise, Marilyn; Nam, Eun Woo; Santos-Burgoa, Carlos; Fosse, Elisabeth; Saan, Hans; Hagard, Spencer; Tang, Kwok Cho

    2006-12-01

This paper reviews approaches to the mapping of resources needed to engage in health promotion at the country level. There is no single way, or best way, to make a capacity map, since it should speak to the needs of its users as they define their needs. Health promotion capacity mapping is therefore approached in various ways. At the national level, the objective is usually to learn the extent to which essential policies, institutions, programmes and practices are in place to guide recommendations about what remedial measures are desirable. In Europe, capacity mapping has been undertaken at the national level by the WHO for a decade. A complementary capacity mapping approach, HP-Source.net, has been undertaken since 2000 by a consortium of European organizations including the EC, WHO, International Union for Health Promotion and Education, Health Development Agency (of England) and various European university research centres. The European approach emphasizes the need for multi-methods and the principle of triangulation. In North America, Canadian approaches have included large- and small-scale international collaborations to map capacity for sustainable development. US efforts include state-level mapping of capacity to prevent chronic diseases and reduce risk factor levels. In Australia, two decades of mapping national health promotion capacity began with systems needed by the health sector to design and deliver effective, efficient health promotion, and has now expanded to include community-level capacity and policy review. In Korea and Japan, capacity mapping is newly developing in collaboration with European efforts, illustrating the usefulness of international health promotion networks. Mapping capacity for health promotion is a practical and vital aspect of developing capacity for health promotion. The new context for health promotion contains both old and new challenges, but also new opportunities. 
A large scale, highly collaborative approach to capacity

  3. Conjecture Mapping: An Approach to Systematic Educational Design Research

    ERIC Educational Resources Information Center

    Sandoval, William

    2014-01-01

    Design research is strongly associated with the learning sciences community, and in the 2 decades since its conception it has become broadly accepted. Yet within and without the learning sciences there remains confusion about how to do design research, with most scholarship on the approach describing what it is rather than how to do it. This…

  4. Mind Map Marketing: A Creative Approach in Developing Marketing Skills

    ERIC Educational Resources Information Center

    Eriksson, Lars Torsten; Hauer, Amie M.

    2004-01-01

    In this conceptual article, the authors describe an alternative course structure that joins learning key marketing concepts to creative problem solving. The authors describe an approach using a convergent-divergent-convergent (CDC) process: key concepts are first derived from case material to be organized in a marketing matrix, which is then used…

  7. Cosmic expansion from boson and fermion fields

    NASA Astrophysics Data System (ADS)

    de Souza, Rudinei C.; Kremer, Gilberto M.

    2011-06-01

This paper analyzes an action that describes boson and fermion fields minimally coupled to gravity and a common matter field. The self-interaction potentials of the fields are not chosen a priori but follow from the Noether symmetry approach. The Noether forms of the potentials allow the boson field to play the role of dark energy and matter and the fermion field to behave as standard matter. The constant of motion and the cyclic variable associated with the Noether symmetry allow the complete integration of the field equations, whose solution produces a universe with alternating periods of accelerated and decelerated expansion.

  8. Higgs boson photoproduction at the LHC

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2011-07-15

We present the current development of the photoproduction approach for the Higgs boson with its application to pp and pA collisions at the LHC. We perform a different analysis for the Gap Survival Probability, where we consider a probability of 3% and also a more optimistic value of 10% based on the HERA data for dijet production. As a result, the cross section for the exclusive Higgs boson production is about 2 fb and 6 fb in pp collisions and 617 fb and 2056 fb for pPb collisions, considering the gap survival factor of 3% and 10%, respectively.

  9. Mapping New Approaches in Program Evaluation: A Cross-Cultural Perspective.

    ERIC Educational Resources Information Center

    Gorostiaga, Jorge M.; Paulston, Rolland G.

    This paper examines new approaches to program evaluation and explores their possible utility in Latin American educational settings. Part 1 briefly discusses why new ideas for evaluating educational studies are needed. Part 2 examines seven new evaluative approaches as follows: (1) "Concept Mapping," a type of structural…

  10. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    ERIC Educational Resources Information Center

    Dhindsa, Harkirat S.; Makarimi-Kasim; Anderson, O. Roger

    2011-01-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample…

  11. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    ERIC Educational Resources Information Center

    Dhindsa, Harkirat S.; Makarimi-Kasim; Anderson, O. Roger

    2011-01-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample…

  12. Comparison of four Vulnerability Approaches to Mapping of Shallow Aquifers of Eastern Dahomey Basin of Nigeria

    NASA Astrophysics Data System (ADS)

    Oke, Saheed; Vermeulen, Danie

    2016-04-01

This study presents the outcome of vulnerability mapping of the shallow aquifers of the eastern Dahomey Basin of southwestern Nigeria. The basin is a coastal transboundary aquifer extending from eastern Ghana to southwestern Nigeria. The study aimed to identify the most suitable method for mapping the basin's shallow aquifers by comparing the results of four different vulnerability approaches. This is important because of differences in vulnerability assessment parameters, approaches and results among vulnerability methods applied to a particular aquifer. The methodology involves using vulnerability techniques that assess the intrinsic properties of the aquifer. Two methods from the travel-time approach (AVI and RTt) and two from the index approach (DRASTIC and PI) were employed in the mapping of the basin. The results show that AVI, which has the fewest mapping parameters, classifies 75% of the basin as very high vulnerability and 25% as high vulnerability. The DRASTIC mapping shows 18% as low vulnerability, 61% as moderate vulnerability and 21% as high vulnerability. Mapping with the PI method, which has the most parameters, shows 66% of the aquifer as low vulnerability and 34% as moderate vulnerability. The RTt method shows 18% as very high vulnerability, 8% as high vulnerability, 64% as moderate vulnerability and 10% as very low vulnerability. Further analysis involving correlation plots shows the highest correlation, 62%, between the RTt and DRASTIC methods. The analysis shows that the PI method is the mildest of all the vulnerability methods while the AVI method is the strictest of the methods considered in this vulnerability mapping. Using four different approaches to map the shallow aquifers of the eastern Dahomey Basin will guide the recommendation of the best vulnerability method for future assessments of this and other shallow aquifers. 
Keywords: Aquifer vulnerability, Dahomey Basin

  13. Approximate gauge symmetry of composite vector bosons

    NASA Astrophysics Data System (ADS)

    Suzuki, Mahiko

    2010-08-01

    It can be shown in a solvable field theory model that the couplings of the composite vector bosons made of a fermion pair approach the gauge couplings in the limit of strong binding. Although this phenomenon may appear accidental and special to the vector bosons made of a fermion pair, we extend it to the case of bosons being constituents and find that the same phenomenon occurs in a more intriguing way. The functional formalism not only facilitates computation but also provides us with a better insight into the generating mechanism of approximate gauge symmetry, in particular, how the strong binding and global current conservation conspire to generate such an approximate symmetry. Remarks are made on its possible relevance or irrelevance to electroweak and higher symmetries.

  14. Exploring teacher's perceptions of concept mapping as a teaching strategy in science: An action research approach

    NASA Astrophysics Data System (ADS)

    Marks Krpan, Catherine Anne

    In order to promote science literacy in the classroom, students need opportunities in which they can personalize their understanding of the concepts they are learning. Current literature supports the use of concept maps in enabling students to make personal connections in their learning of science. Because they involve creating explicit connections between concepts, concept maps can assist students in developing metacognitive strategies and assist educators in identifying misconceptions in students' thinking. The literature also notes that concept maps can improve student achievement and recall. Much of the current literature focuses primarily on concept mapping at the secondary and university levels, with limited focus on the elementary panel. The research rarely considers teachers' thoughts and ideas about the concept mapping process. In order to effectively explore concept mapping from the perspective of elementary teachers, I felt that an action research approach would be appropriate. Action research enabled educators to debate issues about concept mapping and test out ideas in their classrooms. It also afforded the participants opportunities to explore their own thinking, reflect on their personal journeys as educators and play an active role in their professional development. In an effort to explore concept mapping from the perspective of elementary educators, an action research group of 5 educators and myself was established and met regularly from September 1999 until June 2000. All of the educators taught in the Toronto area. These teachers were interested in exploring how concept mapping could be used as a learning tool in their science classrooms. In summary, this study explores the journey of five educators and myself as we engaged in collaborative action research. This study sets out to: (1) Explore how educators believe concept mapping can facilitate teaching and student learning in the science classroom. (2) Explore how educators implement concept

  15. Experimental mapping of soluble protein domains using a hierarchical approach.

    PubMed

    Pedelacq, Jean-Denis; Nguyen, Hau B; Cabantous, Stephanie; Mark, Brian L; Listwan, Pawel; Bell, Carolyn; Friedland, Natasha; Lockard, Meghan; Faille, Alexandre; Mourey, Lionel; Terwilliger, Thomas C; Waldo, Geoffrey S

    2011-10-01

Exploring the function and 3D space of large multidomain protein targets often requires sophisticated experimentation to obtain the targets in a form suitable for structure determination. Screening methods capable of selecting well-expressed, soluble fragments from DNA libraries exist, but require the use of automation to maximize chances of picking a few good candidates. Here, we describe the use of an insertion dihydrofolate reductase (DHFR) vector to select in-frame fragments and a split-GFP assay technology to filter out constructs that express insoluble protein fragments. With the incorporation of an IPCR step to create high-density, focused sublibraries of fragments, this cost-effective method can be performed manually with no a priori knowledge of domain boundaries while permitting single amino acid resolution boundary mapping. We used it on the well-characterized p85α subunit of the phosphoinositide-3-kinase to demonstrate the robustness and efficiency of our methodology. We then successfully tested it on the polyketide synthase PpsC from Mycobacterium tuberculosis, a potential drug target involved in the biosynthesis of complex lipids in the cell envelope. X-ray quality crystals from the acyl-transferase (AT), dehydratase (DH) and enoyl-reductase (ER) domains have been obtained.

  16. Experimental mapping of soluble protein domains using a hierarchical approach

    PubMed Central

    Pedelacq, Jean-Denis; Nguyen, Hau B.; Cabantous, Stephanie; Mark, Brian L.; Listwan, Pawel; Bell, Carolyn; Friedland, Natasha; Lockard, Meghan; Faille, Alexandre; Mourey, Lionel; Terwilliger, Thomas C.; Waldo, Geoffrey S.

    2011-01-01

Exploring the function and 3D space of large multidomain protein targets often requires sophisticated experimentation to obtain the targets in a form suitable for structure determination. Screening methods capable of selecting well-expressed, soluble fragments from DNA libraries exist, but require the use of automation to maximize chances of picking a few good candidates. Here, we describe the use of an insertion dihydrofolate reductase (DHFR) vector to select in-frame fragments and a split-GFP assay technology to filter out constructs that express insoluble protein fragments. With the incorporation of an IPCR step to create high-density, focused sublibraries of fragments, this cost-effective method can be performed manually with no a priori knowledge of domain boundaries while permitting single amino acid resolution boundary mapping. We used it on the well-characterized p85α subunit of the phosphoinositide-3-kinase to demonstrate the robustness and efficiency of our methodology. We then successfully tested it on the polyketide synthase PpsC from Mycobacterium tuberculosis, a potential drug target involved in the biosynthesis of complex lipids in the cell envelope. X-ray quality crystals from the acyl-transferase (AT), dehydratase (DH) and enoyl-reductase (ER) domains have been obtained. PMID:21771856

  17. Geomatics Approach for Assessment of respiratory disease Mapping

    NASA Astrophysics Data System (ADS)

    Pandey, M.; Singh, V.; Vaishya, R. C.

    2014-11-01

Air quality is an important subject of relevance in the present context because air is the prime resource for the sustenance of life, especially for human health. Vast amounts of data about ambient air quality are now generated, and technological advancements make it possible to characterize how good or bad the air is. This report supplies a reliable method for assessing the Air Quality Index (AQI) using fuzzy logic. The fuzzy logic model is designed to predict an AQI that reports monthly air quality. With the aid of the air quality index we can evaluate the suitability of an area's environment with regard to human health. For the appraisal of human health status in an industrial area, information from a health survey questionnaire is used to obtain a respiratory risk map by applying IDW interpolation and Getis statistical techniques. The Getis statistic identifies different spatial clustering patterns, such as hot spots, high-risk areas and cold spots, over the entire study area with statistical significance.
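
    The fuzzy AQI computation described above can be sketched as follows. This is a minimal illustration, not the report's actual model: the single pollutant (PM10), the triangular membership breakpoints and the representative index values are all hypothetical.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical membership sets for a single pollutant (PM10, ug/m3).
def fuzzify(pm10):
    return {
        "good":     tri(pm10, -1, 0, 60),
        "moderate": tri(pm10, 40, 100, 160),
        "poor":     tri(pm10, 120, 250, 400),
    }

# Defuzzify as a membership-weighted average of representative AQI values.
REP = {"good": 50, "moderate": 150, "poor": 300}

def aqi(pm10):
    mu = fuzzify(pm10)
    return sum(REP[k] * m for k, m in mu.items()) / sum(mu.values())

print(aqi(80))  # 150.0 (fully "moderate")
```

    A real model would combine several pollutants through fuzzy rules before defuzzifying; the weighted average here stands in for that rule base.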

  18. MAPS: A Quantitative Radiomics Approach for Prostate Cancer Detection.

    PubMed

    Cameron, Andrew; Khalvati, Farzad; Haider, Masoom A; Wong, Alexander

    2016-06-01

    This paper presents a quantitative radiomics feature model for performing prostate cancer detection using multiparametric MRI (mpMRI). It incorporates a novel tumor candidate identification algorithm to efficiently and thoroughly identify the regions of concern and constructs a comprehensive radiomics feature model to detect tumorous regions. In contrast to conventional automated classification schemes, this radiomics-based feature model aims to ground its decisions in a way that can be interpreted and understood by the diagnostician. This is done by grouping features into high-level feature categories which are already used by radiologists to diagnose prostate cancer: Morphology, Asymmetry, Physiology, and Size (MAPS), using biomarkers inspired by the PI-RADS guidelines for performing structured reporting on prostate MRI. Clinical mpMRI data were collected from 13 men with histology-confirmed prostate cancer and labeled by an experienced radiologist. These annotated data were used to train classifiers using the proposed radiomics-driven feature model in order to evaluate the classification performance. The preliminary experimental results indicated that the proposed model outperformed each of its constituent feature groups as well as a comparable conventional mpMRI feature model. A further validation of the proposed algorithm will be conducted using a larger dataset as future work.

  19. A National Approach to Quantify and Map Biodiversity ...

    EPA Pesticide Factsheets

Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human welfare. The degradation of natural ecosystems and climate variation impact the environment and society by affecting ecological integrity and ecosystems’ capacity to provide critical services (i.e., the contributions of ecosystems to human well-being). These challenges will require complex management decisions that can often involve significant trade-offs between societal desires and environmental needs. Evaluating trade-offs in terms of ecosystem services and human well-being provides an intuitive and comprehensive way to assess the broad implications of our decisions and to help shape policies that enhance environmental and social sustainability. In answer to this challenge, the U.S. government has created a partnership among the U.S. Environmental Protection Agency, other Federal agencies, academic institutions, and Non-Governmental Organizations to develop the EnviroAtlas, an online Decision Support Tool that allows users (e.g., planners, policy-makers, resource managers, NGOs, private indu

  1. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    NASA Astrophysics Data System (ADS)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

For many years the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data generated every day by remote sensors raises more challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
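
    The Hadoop MapReduce pattern the package relies on can be sketched in plain Python: mappers classify their data partitions independently and emit key-value pairs, and a reducer aggregates them. The threshold "classifier" and the NDVI feature below are hypothetical stand-ins, not the WEKA algorithms the tool actually wraps.

```python
# Hypothetical stand-in for a trained classifier; a real deployment
# would ship a serialized WEKA/SVM model to each mapper instead.
def classify(sample):
    return "urban" if sample["ndvi"] < 0.3 else "vegetation"

def map_phase(partition):
    # Each mapper emits (class_label, 1) pairs for its own partition.
    return [(classify(s), 1) for s in partition]

def reduce_phase(pairs):
    # The reducer sums the counts per class label.
    counts = {}
    for label, n in pairs:
        counts[label] = counts.get(label, 0) + n
    return counts

partitions = [
    [{"ndvi": 0.1}, {"ndvi": 0.5}],   # partition held by node 1
    [{"ndvi": 0.7}, {"ndvi": 0.2}],   # partition held by node 2
]
all_pairs = [p for part in partitions for p in map_phase(part)]
result = reduce_phase(all_pairs)
print(result)  # {'urban': 2, 'vegetation': 2}
```

    Because classification is independent per sample, the map phase scales with the number of nodes; only the small (label, count) pairs cross the network to the reducer.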

  2. Computer-based Approaches for Training Interactive Digital Map Displays

    DTIC Science & Technology

    2005-09-01

Subject matter POC: Jean L. Dyer. Five computer-based training approaches for learning digital skills (training assessment, exploratory learning, guided exploratory training, and guided discovery) … the other extreme of letting Soldiers learn a digital interface on their own. The research reported here examined these two conditions and three other

  3. The W Boson Mass Measurement

    NASA Astrophysics Data System (ADS)

    Kotwal, Ashutosh V.

    2016-10-01

    The measurement of the W boson mass has been growing in importance as its precision has improved, along with the precision of other electroweak observables and the top quark mass. Over the last decade, the measurement of the W boson mass has been led at hadron colliders. Combined with the precise measurement of the top quark mass at hadron colliders, the W boson mass helped to pin down the mass of the Standard Model Higgs boson through its induced radiative correction on the W boson mass. With the discovery of the Higgs boson and the measurement of its mass, the electroweak sector of the Standard Model is over-constrained. Increasing the precision of the W boson mass probes new physics at the TeV-scale. We summarize an extensive Tevatron (1984-2011) program to measure the W boson mass at the CDF and Dø experiments. We highlight the recent Tevatron measurements and prospects for the final Tevatron measurements.

  4. A heuristic multi-criteria classification approach incorporating data quality information for choropleth mapping

    PubMed Central

    Sun, Min; Wong, David; Kronenfeld, Barry

    2016-01-01

Despite conceptual and technological advancements in cartography over the decades, choropleth map design and classification fail to address a fundamental issue: estimates that are statistically indifferent may be assigned to different classes on maps or vice versa. Recently, the class separability concept was introduced as a map classification criterion to evaluate the likelihood that estimates in two classes are statistically different. Unfortunately, choropleth maps created according to the separability criterion usually have highly unbalanced classes. To produce reasonably separable but more balanced classes, we propose a heuristic classification approach to consider not just the class separability criterion but also other classification criteria such as evenness and intra-class variability. A geovisual-analytic package was developed to support the heuristic mapping process to evaluate the trade-off between relevant criteria and to select the most preferable classification. Class break values can be adjusted to improve the performance of a classification. PMID:28286426
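
    The trade-off between class separability and evenness can be sketched as a small search over candidate break sets. The estimates, standard errors, z-score separability measure and equal weighting below are hypothetical illustrations, not the authors' exact criteria:

```python
import itertools

# Hypothetical small-area estimates with their standard errors, sorted.
data = sorted([(10.2, 1.1), (11.0, 1.2), (15.5, 1.0),
               (16.1, 0.9), (24.3, 1.5), (25.0, 1.4)])

def separability(lo, hi):
    # z-score between the largest estimate below a break and the
    # smallest above it: a crude measure of that break's separability.
    (e1, s1), (e2, s2) = lo[-1], hi[0]
    return abs(e2 - e1) / (s1 ** 2 + s2 ** 2) ** 0.5

def evenness(classes):
    sizes = [len(c) for c in classes]
    return min(sizes) / max(sizes)

def score(classes, w=0.5):
    # Trade off the weakest break against class balance.
    seps = [separability(a, b) for a, b in zip(classes, classes[1:])]
    return w * min(seps) + (1 - w) * evenness(classes)

# Enumerate all 3-class partitions (two interior breaks), keep the best.
n = len(data)
best = max(([data[:i], data[i:j], data[j:]]
            for i, j in itertools.combinations(range(1, n), 2)),
           key=score)
print([len(c) for c in best])  # [2, 2, 2]: breaks fall at the two natural gaps
```

    With a handful of areas the partitions can be enumerated exhaustively; for real maps a heuristic search over break values plays the same role.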

  5. A novel subfractionation approach for mitochondrial proteins: a three-dimensional mitochondrial proteome map.

    PubMed

    Hanson, B J; Schulenberg, B; Patton, W F; Capaldi, R A

    2001-03-01

    As mitochondria play critical roles in both cell life and cell death, there is great interest in obtaining a human mitochondrial proteome map. Such a map could potentially be useful in diagnosing diseases, identifying targets for drug therapy, and in screening for unwanted drug side effects. In this paper, we present a novel approach to obtaining a human mitochondrial proteome map that combines sucrose gradient centrifugation with standard two-dimensional gel electrophoresis. The resulting three-dimensional separation of proteins allows us to address some of the problems encountered during previous attempts to obtain mitochondrial proteome maps such as resolution of proteins and solubility of hydrophobic proteins during isoelectric focusing. In addition, we show that this new approach provides functional information about protein complexes within the organelle that is not obtained with two-dimensional gel electrophoresis of whole mitochondria.

  6. Inverse field-based approach for simultaneous B₁ mapping at high fields - a phantom based study.

    PubMed

    Jin, Jin; Liu, Feng; Zuo, Zhentao; Xue, Rong; Li, Mingyan; Li, Yu; Weber, Ewald; Crozier, Stuart

    2012-04-01

Based on computational electromagnetics and multi-level optimization, an inverse approach of attaining accurate mapping of both transmit and receive sensitivity of radiofrequency coils is presented. This paper extends our previous study of inverse methods of receptivity mapping at low fields, to allow accurate mapping of RF magnetic fields (B(1)) for high-field applications. Accurate receive sensitivity mapping is essential to image domain parallel imaging methods, such as sensitivity encoding (SENSE), to reconstruct high quality images. Accurate transmit sensitivity mapping will facilitate RF-shimming and parallel transmission techniques that directly address the RF inhomogeneity issue, arguably the most challenging issue of high-field magnetic resonance imaging (MRI). The inverse field-based approach proposed herein is based on computational electromagnetics and iterative optimization. It fits an experimental image to the numerically calculated signal intensity by iteratively optimizing the coil-subject geometry to better resemble the experiments. Accurate transmit and receive sensitivities are derived as intermediate results of the optimization process. The method is validated by imaging studies using a homogeneous saline phantom at 7T. A simulation study at 300 MHz demonstrates that the proposed method is able to obtain receptivity mapping with errors an order of magnitude less than that of the conventional method. The more accurate receptivity mapping and simultaneously obtained transmit sensitivity mapping could enable artefact-reduced and intensity-corrected image reconstructions. It is hoped that by providing an approach to the accurate mapping of both transmit and receive sensitivity, the proposed method will facilitate a range of applications in high-field MRI and parallel imaging.

  7. Mapping Biological Transmission: An Empirical, Dynamical, and Evolutionary Approach.

    PubMed

    Merlin, Francesca; Riboli-Sasco, Livio

    2017-06-01

    The current debate over extending inheritance and its evolutionary impact has focused on adding new categories of non-genetic factors to the classical transmission of DNA, and on trying to redefine inheritance. Transmitted factors have been mainly characterized by their directions of transmission (vertical, horizontal, or both) and the way they store variations. In this paper, we leave aside the issue of defining inheritance. We rather try to build an evolutionary conceptual framework that allows for tracing most, if not all forms of transmission and makes sense of their different tempos and modes. We discuss three key distinctions that should in particular be the targets of theoretical and empirical investigation, and try to assess the interplay among them and evolutionary dynamics. We distinguish two channels of transmission (channel 1 and channel 2), two measurements of the temporal dynamics of transmission, respectively across and within generations (durability and residency), and two types of transmitted factors according to their evolutionary relevance (selectively relevant and neutral stable factors). By implementing these three distinctions we can then map different forms of transmission over a continuous space describing the combination of their varying dynamical features. While our aim is not to provide yet another model of inheritance, putting together these distinctions and crossing them, we manage to offer an inclusive conceptual framework of transmission, grounded in empirical observation, and coherent with evolutionary theory. This interestingly opens possibilities for qualitative and quantitative analyses, and is a necessary step, we argue, in order to question the interplay between the dynamics of evolution and the dynamics of multiple forms of transmission.

  8. Flood Hazard Mapping over Large Regions using Geomorphic Approaches

    NASA Astrophysics Data System (ADS)

    Samela, Caterina; Troy, Tara J.; Manfreda, Salvatore

    2016-04-01

Historically, man has always preferred to settle and live near water. This tendency has not changed through time, and today nineteen of the twenty most populated agglomerations of the world (Demographia World Urban Areas, 2015) are located along watercourses or at the mouth of a river. On one hand, these locations are advantageous from many points of view. On the other hand, they expose significant populations and economic assets to a certain degree of flood hazard. Knowing the location and the extent of the areas exposed to flood hazards is essential to any strategy for minimizing the risk. Unfortunately, in data-scarce regions the use of traditional floodplain mapping techniques is prevented by the lack of the extensive data required, and this scarcity is generally most pronounced in developing countries. The present work aims to overcome this limitation by defining an alternative simplified procedure for a preliminary, but efficient, floodplain delineation. To validate the method in a data-rich environment, eleven flood-related morphological descriptors derived from DEMs have been used as linear binary classifiers over the Ohio River basin and its sub-catchments, measuring their performance in identifying the floodplains as the topography and the size of the calibration area change. The best performing classifiers among those analysed have been applied and validated across the continental U.S. The results suggest that the classifier based on the index ln(hr/H), named the Geomorphic Flood Index (GFI), is the most suitable for detecting flood-prone areas in data-scarce environments and for large-scale applications, providing good accuracy with low requirements in terms of data and computational costs. Keywords: flood hazard, data-scarce regions, large-scale studies, binary classifiers, DEM, USA.
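A minimal sketch of how a DEM-derived descriptor such as ln(hr/H) can act as a linear binary classifier: cells whose index exceeds a calibrated threshold are flagged as flood-prone. The arrays `hr` and `H` and the "observed floodplain" used for calibration are synthetic; a real application would derive them from a DEM and reference flood maps.

```python
import numpy as np

# Geomorphic Flood Index per cell: GFI = ln(hr / H), where hr is a water
# depth proxy and H the elevation difference to the nearest stream.
def gfi(hr, H):
    return np.log(hr / H)

def classify(index, tau):
    return index >= tau                  # True = flood-prone

rng = np.random.default_rng(0)
hr = rng.uniform(0.5, 5.0, 1000)         # synthetic descriptor inputs
H = rng.uniform(0.1, 50.0, 1000)
observed = H < 2.0                       # toy reference floodplain

index = gfi(hr, H)

# Calibrate tau by maximizing the true skill statistic on the reference map.
def true_skill(tau):
    pred = classify(index, tau)
    tp = np.sum(pred & observed); fn = np.sum(~pred & observed)
    tn = np.sum(~pred & ~observed); fp = np.sum(pred & ~observed)
    return tp / (tp + fn) - fp / (fp + tn)

taus = np.linspace(index.min(), index.max(), 200)
tau_best = max(taus, key=true_skill)
flood_prone = classify(index, tau_best)
```

The single calibrated threshold is what makes this a *linear* binary classifier in the descriptor, which is why the method transfers cheaply to data-scarce regions.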

  9. Boson/Fermion Janus Particles

    NASA Astrophysics Data System (ADS)

    Tsekov, R.

    2017-04-01

    Thermodynamically, bosons and fermions differ by their statistics only. A general entropy functional is proposed by superposition of entropic terms, typical for different quantum gases. The statistical properties of the corresponding Janus particles are derived by variation of the weight of the boson/fermion fraction. It is shown that di-bosons and anti-fermions separate in gas and liquid phases, while three-phase equilibrium appears for poly-boson/fermion Janus particles.

  10. A chemical approach to mapping nucleosomes at base pair resolution in yeast.

    PubMed

    Brogaard, Kristin R; Xi, Liqun; Wang, Ji-Ping; Widom, Jonathan

    2012-01-01

    Most eukaryotic DNA exists in DNA-protein complexes known as nucleosomes. The exact locations of nucleosomes along the genome play a critical role in chromosome functions and gene regulation. However, the current methods for nucleosome mapping do not provide the necessary accuracy to identify the precise nucleosome locations. Here we describe a new experimental approach that directly maps nucleosome center locations in vivo genome-wide at single base pair resolution.

  11. Bosonization of Weyl Fermions

    NASA Astrophysics Data System (ADS)

    Marino, Eduardo

The electron, discovered by Thomson at the end of the nineteenth century, was the first experimentally observed particle. The Weyl fermion, though theoretically predicted long ago, was observed in a condensed matter environment in an experiment reported only a few weeks ago. Is there any linking thread connecting the first and the last observed fermion (quasi)particles? The answer is positive. By generalizing the method known as bosonization, for the first time in its full form, to a spacetime with 3+1 dimensions, we are able to show that both electrons and Weyl fermions can be expressed in terms of the same boson field, namely the Kalb-Ramond anti-symmetric tensor gauge field. The bosonized form of the Weyl chiral currents leads to the angle-dependent magneto-conductance behavior observed in these systems.

  12. Multilayer apparent magnetization mapping approach and its application in mineral exploration

    NASA Astrophysics Data System (ADS)

    Guo, L.; Meng, X.; Chen, Z.

    2016-12-01

Apparent magnetization mapping is a technique to estimate the magnetization distribution in the subsurface from observed magnetic data. It has been applied to geologic mapping and mineral exploration for decades. Apparent magnetization mapping usually models the magnetic layer as a collection of vertical, juxtaposed prisms in both horizontal directions, whose top and bottom surfaces are assumed to be horizontal or of variable depth, and then inverts or deconvolves the magnetic anomalies in the space or frequency domain to determine the magnetization of each prism. The conventional mapping approaches usually assume that magnetic sources contain no remanent magnetization. However, such assumptions are not always valid in mineral exploration of metallic ores. In this case, neglecting the remanence will result in large geologic deviations or the occurrence of negative magnetization. One alternative strategy is to transform the observed magnetic anomalies into quantities that are insensitive or only weakly sensitive to the remanence and then to perform inversion on these quantities, without needing any a priori information about remanent magnetization. Such quantities include the amplitude of the magnetic total field anomaly (AMA) and the normalized magnetic source strength (NSS). Here, we present a space-domain inversion approach for multilayer magnetization mapping based on the AMA for reducing the effects of remanence. In the real world, magnetization usually varies vertically in the subsurface. If we use only a one-layer model for mapping, the result is simply a vertical superposition of different magnetization distributions. Hence, a multi-layer model would be a more realistic approach. We test the approach on real data from a metallic deposit area in North China. The results demonstrate that our approach is feasible and produces consistent magnetization distributions from the top layer to the bottom layer of the subsurface.

  13. A taxonomy of behaviour change methods: an Intervention Mapping approach.

    PubMed

    Kok, Gerjo; Gottlieb, Nell H; Peters, Gjalt-Jorn Y; Mullen, Patricia Dolan; Parcel, Guy S; Ruiter, Robert A C; Fernández, María E; Markham, Christine; Bartholomew, L Kay

    2016-09-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. 
Ideally, the taxonomy should be readily usable for this goal; but alternatively, it should be

  15. [Recent progress in gene mapping through high-throughput sequencing technology and forward genetic approaches].

    PubMed

    Lu, Cairui; Zou, Changsong; Song, Guoli

    2015-08-01

Traditional gene mapping using forward genetic approaches is conducted primarily through construction of a genetic linkage map, a process that is tedious and time-consuming and often results in low mapping accuracy and large mapping intervals. With the rapid development of high-throughput sequencing technology and the decreasing cost of sequencing, a variety of simple and quick methods of gene mapping through sequencing have been developed, including direct sequencing of the mutant genome, sequencing of selective mutant DNA pools, genetic map construction through sequencing of individuals in a population, and sequencing of the transcriptome and partial genome. These methods can identify mutations at the nucleotide level and have been applied in complex genetic backgrounds. Recent reports have shown that mapping by sequencing can even be done without a reference genome sequence, hybridization, or genetic linkage information, which makes it possible to perform forward genetic studies in many non-model species. In this review, we summarize these new technologies and their application in gene mapping.

  16. Higgs boson hunting

    SciTech Connect

    Dawson, S.; Haber, H.E.; Rindani, S.D.

    1989-05-01

This is the summary report of the Higgs Boson Working Group. We discuss a variety of search techniques for a Higgs boson which is lighter than the Z. The processes K → πH, η′ → ηH, Υ → Hγ and e⁺e⁻ → ZH are examined with particular attention paid to theoretical uncertainties in the calculations. We also briefly examine new features of Higgs phenomenology in a model which contains Higgs triplets as well as the usual doublet of scalar fields. 33 refs., 6 figs., 1 tab.

  17. Accidental Higgs boson

    NASA Astrophysics Data System (ADS)

    Holdom, B.

    2014-07-01

We suggest that the Higgs boson is a light composite state that does not emerge from TeV scale strong dynamics for any generic reason, such as being a pseudo-Goldstone boson. Instead, a state that is Higgs-like and fairly decoupled from heavier states may simply be a reflection of very particular strong dynamics, with properties quite distinct from more familiar large-Nc type gauge dynamics. We elaborate on this picture in the context of a strongly interacting fourth family and an effective 4-Higgs-doublet model. The origin of a decoupling limit and the corrections to it are discussed.

  18. Effects of boson dispersion in fermion-boson coupled systems

    NASA Astrophysics Data System (ADS)

    Motome, Yukitoshi; Kotliar, Gabriel

    2000-11-01

    We study the nonlinear feedback in a fermion-boson system using an extension of dynamical mean-field theory and the quantum Monte Carlo method. In the perturbative regimes (weak-coupling and atomic limits) the effective interaction among fermions increases as the width of the boson dispersion increases. In the strong-coupling regime away from the antiadiabatic limit, the effective interaction decreases as we increase the width of the boson dispersion. This behavior is closely related to complete softening of the boson field. We elucidate the parameters that control this nonperturbative region where fluctuations of the dispersive bosons enhance the delocalization of fermions.

  19. Streamlined approach to mapping the magnetic induction of skyrmionic materials.

    PubMed

    Chess, Jordan J; Montoya, Sergio A; Harvey, Tyler R; Ophus, Colin; Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E; McMorran, Benjamin J

    2017-02-28

    Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.
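For a uniform in-focus intensity I0, the transport of intensity equation reduces to a Poisson equation, lap(phi) = -(2*pi/(lambda*I0)) * dI/dz, which can be solved with FFTs from a single defocused image. The sketch below is a generic round-trip illustration of that reduction on synthetic data, not the authors' code; units are arbitrary (the wavelength cancels in the round trip).

```python
import numpy as np

# Simplified single-defocus TIE for a uniform thin film: solve the Poisson
# equation lap(phi) = -(2*pi/(lambda*I0)) * dI/dz spectrally, where dI/dz is
# approximated from one defocused image and the known in-focus intensity I0.
def tie_phase(I_defocus, I0, dz, wavelength):
    ny, nx = I_defocus.shape
    dIdz = (I_defocus - I0) / dz
    kx = 2 * np.pi * np.fft.fftfreq(nx)
    ky = 2 * np.pi * np.fft.fftfreq(ny)
    KX, KY = np.meshgrid(kx, ky)
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                        # avoid divide-by-zero; DC set below
    rhs_hat = np.fft.fft2(-(2 * np.pi / (wavelength * I0)) * dIdz)
    phi_hat = -rhs_hat / k2               # lap -> -k^2 in Fourier space
    phi_hat[0, 0] = 0.0                   # phase is defined up to a constant
    return np.real(np.fft.ifft2(phi_hat))

# Round-trip check: make a phase, form the matching dI/dz, recover the phase.
ny = nx = 64
y, x = np.mgrid[0:ny, 0:nx]
phi_true = np.cos(2 * np.pi * x / nx) * np.cos(2 * np.pi * y / ny)
lam, I0, dz = 0.5, 1.0, 0.1               # arbitrary units, for illustration
lap = (np.roll(phi_true, 1, 0) + np.roll(phi_true, -1, 0)
       + np.roll(phi_true, 1, 1) + np.roll(phi_true, -1, 1) - 4 * phi_true)
I_def = I0 + dz * (-(lam * I0 / (2 * np.pi)) * lap)
phi_rec = tie_phase(I_def, I0, dz, lam)
```

The phase gradient recovered this way is proportional to the in-plane magnetic induction integrated through the sample thickness, which is why a single defocused image suffices for uniform films.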

  20. Semi-automatic classification of glaciovolcanic landforms: An object-based mapping approach based on geomorphometry

    NASA Astrophysics Data System (ADS)

    Pedersen, G. B. M.

    2016-02-01

    A new object-oriented approach is developed to classify glaciovolcanic landforms (Procedure A) and their landform elements boundaries (Procedure B). It utilizes the principle that glaciovolcanic edifices are geomorphometrically distinct from lava shields and plains (Pedersen and Grosse, 2014), and the approach is tested on data from Reykjanes Peninsula, Iceland. The outlined procedures utilize slope and profile curvature attribute maps (20 m/pixel) and the classified results are evaluated quantitatively through error matrix maps (Procedure A) and visual inspection (Procedure B). In procedure A, the highest obtained accuracy is 94.1%, but even simple mapping procedures provide good results (> 90% accuracy). Successful classification of glaciovolcanic landform element boundaries (Procedure B) is also achieved and this technique has the potential to delineate the transition from intraglacial to subaerial volcanic activity in orthographic view. This object-oriented approach based on geomorphometry overcomes issues with vegetation cover, which has been typically problematic for classification schemes utilizing spectral data. Furthermore, it handles complex edifice outlines well and is easily incorporated into a GIS environment, where results can be edited or fused with other mapping results. The approach outlined here is designed to map glaciovolcanic edifices within the Icelandic neovolcanic zone but may also be applied to similar subaerial or submarine volcanic settings, where steep volcanic edifices are surrounded by flat plains.

  1. Approaches to interval mapping of QTL in a multigeneration pedigree: the example of porcine chromosome 4.

    PubMed

    Knott, S A; Nyström, P E; Andersson-Eklund, L; Stern, S; Marklund, L; Andersson, L; Haley, C S

    2002-02-01

Quantitative trait loci (QTLs) have been mapped in many studies of F2 populations derived from crosses between diverse lines. One approach to confirming these effects and improving the mapping resolution is genetic chromosome dissection through a backcrossing programme. Analysis of the data generated by interval mapping is likely to provide additional power and resolution compared with treating the data marker by marker. However, interval mapping approaches for such a programme are not well developed, especially where the founder lines were outbred. We explore alternative approaches to analysis using, as an example, data from chromosome 4 in an intercross between wild boar and Large White pigs where QTLs have been previously identified. A least squares interval mapping procedure was used to study growth rate and carcass traits in a subsequent second backcross generation (BC2). This procedure requires the probability of inheriting a wild boar allele for each BC2 animal at locations throughout the chromosome. Two methods for obtaining these probabilities were compared: stochastic and deterministic. The two methods gave similar probabilities for inheriting wild boar alleles and, hence, very similar results from the QTL analysis. The deterministic approach has the advantage of being much faster to run but requires specialized software. A QTL for fatness and one for growth were confirmed and, in addition, a QTL for piglet growth from weaning at 5 weeks up to 7 weeks of age and another for carcass length were detected.
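The deterministic step amounts to computing, for each animal, the probability of carrying a wild boar allele at a test position given the alleles at the two flanking markers. A minimal sketch for a backcross, assuming Haldane's map function (no interference) and fully informative markers; the distances are illustrative, not from the study:

```python
import numpy as np

# Haldane's map function: recombination fraction from distance in Morgans.
def haldane(d_morgans):
    return 0.5 * (1.0 - np.exp(-2.0 * d_morgans))

def p_wild_boar(left, right, d_left, d_right):
    """P(wild boar allele at the test position | flanking marker alleles).

    Marker alleles are coded 1 = wild boar, 0 = Large White; d_left and
    d_right are map distances (Morgans) to the left and right markers.
    """
    r1, r2 = haldane(d_left), haldane(d_right)
    r12 = haldane(d_left + d_right)
    pL = (1 - r1) if left == 1 else r1      # transmission: left marker -> QTL
    pR = (1 - r2) if right == 1 else r2     # transmission: QTL -> right marker
    pLR = (1 - r12) if left == right else r12
    return pL * pR / pLR

# Midway between markers 20 cM apart, with both markers carrying the
# wild boar allele, the probability is close to 1:
p = p_wild_boar(1, 1, 0.10, 0.10)
```

Evaluating this probability on a grid of positions along the chromosome gives the regressor needed at each step of the least squares interval mapping scan.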

  2. Correlation Between Local Structure and Boson Peak in Metallic Glasses

    NASA Astrophysics Data System (ADS)

    Ahmad, Azkar Saeed; Zhao, Xiangnan; Xu, Mingxiang; Zhang, Dongxian; Hu, Junwen; Fecht, Hans J.; Wang, Xiaodong; Cao, Qingping; Jiang, J. Z.

    2017-01-01

We made a systematic study of the boson peak for six different Zr-based metallic glasses and found a universal correlation between the average local atomic structure and the boson peak. It is found that the boson peak can be decomposed into six characteristic vibratory modes, i.e., a Debye vibratory mode and five Einstein vibratory modes. By applying the Ioffe-Regel condition over all studied Zr-based metallic glasses, we reveal that the atomic pair correlation function maps exactly onto the low-temperature dynamics and the origin of the boson peak, which is the sum of vibrations of local density fluctuation domains in the glasses. In addition, it is found that the Debye-type oscillators are the major contributors to the low-temperature specific heat capacities. This study opens a new way of understanding the relationship of the physical properties with the atomic arrangements in glasses.
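The decomposition described above implies a standard specific-heat model: one Debye term plus several Einstein terms. The sketch below evaluates such a sum with illustrative characteristic temperatures and weights (three Einstein terms for brevity rather than five), not the fitted values for any of the studied glasses.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def c_einstein(T, theta):
    """Einstein heat capacity for one mode with characteristic temperature theta."""
    x = theta / T
    return 3 * R * x**2 * np.exp(x) / (np.exp(x) - 1.0) ** 2

def c_debye(T, theta, n=2000):
    """Debye heat capacity via a trapezoid rule on the standard integral."""
    x = np.linspace(1e-6, theta / T, n)
    f = x**4 * np.exp(x) / (np.exp(x) - 1.0) ** 2
    integral = np.sum(0.5 * (f[:-1] + f[1:]) * np.diff(x))
    return 9 * R * (T / theta) ** 3 * integral

def c_total(T, theta_D, thetas_E, weights):
    """Weighted Debye + Einstein decomposition of the specific heat."""
    return (weights[0] * c_debye(T, theta_D)
            + sum(w * c_einstein(T, th) for w, th in zip(weights[1:], thetas_E)))

# Illustrative parameters (weights sum to 1); not fitted values.
c_room = c_total(300.0, 250.0, [60.0, 120.0, 200.0], [0.4, 0.2, 0.2, 0.2])
c_cold = c_total(10.0, 250.0, [60.0, 120.0, 200.0], [0.4, 0.2, 0.2, 0.2])
```

At low temperature the Debye term's T³ contribution dominates over the exponentially suppressed Einstein terms, consistent with the observation that Debye-type oscillators carry most of the low-temperature specific heat.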

  3. Development and Comparison of Techniques for Generating Permeability Maps using Independent Experimental Approaches

    NASA Astrophysics Data System (ADS)

    Hingerl, Ferdinand; Romanenko, Konstantin; Pini, Ronny; Balcom, Bruce; Benson, Sally

    2014-05-01

We have developed and evaluated methods for creating voxel-based 3D permeability maps of a heterogeneous sandstone sample using independent experimental data from single-phase flow (Magnetic Resonance Imaging, MRI) and two-phase flow (X-ray Computed Tomography, CT) measurements. Fluid velocities computed from the generated permeability maps using computational fluid dynamics simulations fit measured velocities very well and significantly outperform empirical porosity-permeability relations, such as the Kozeny-Carman equation. Acquiring images of porous rocks on the meso-scale using MRI has until recently been a great challenge, due to short spin relaxation times and large field gradients within the sample. The combination of the 13-interval Alternating-Pulsed-Gradient Stimulated-Echo (APGSTE) scheme with three-dimensional Single Point Ramped Imaging with T1 Enhancement (SPRITE) - a technique recently developed at the UNB MRI Center - can overcome these challenges and enables quantitative three-dimensional maps of porosities and fluid velocities. Using porosity and (single-phase) velocity maps from MRI and (multi-phase) saturation maps from CT measurements, we employed three different techniques to obtain permeability maps. In the first approach, we applied the Kozeny-Carman relationship to porosities measured using MRI. In the second approach, we computed permeabilities using a J-Leverett scaling method, which is based on saturation maps obtained from N2-H2O multi-phase experiments. The third set of permeabilities was generated using a new inverse iterative-updating technique, which is based on porosities and measured velocities obtained in single-phase flow experiments. The resulting three permeability maps then provided input for computational fluid dynamics simulations - employing the Stanford CFD code AD-GPRS - to generate velocity maps, which were compared to velocity maps measured by MRI. 
The J-Leverett scaling method and the iterative-updating method
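The first of the three techniques, applying a Kozeny-Carman relationship voxel-by-voxel to the MRI porosity map, can be sketched as follows. The spherical-grain form with the constant 180 and the grain diameter are generic assumptions, not values from the study.

```python
import numpy as np

def kozeny_carman(phi, d=2.0e-4):
    """Permeability (m^2) from porosity: k = phi^3 d^2 / (180 (1-phi)^2).

    d is an assumed effective grain diameter in metres; 180 is the usual
    constant for packed spherical grains.
    """
    return phi**3 * d**2 / (180.0 * (1.0 - phi) ** 2)

# Toy 2x2 "voxel map" of porosities standing in for an MRI porosity volume.
porosity_map = np.array([[0.15, 0.20],
                         [0.25, 0.18]])
k_map = kozeny_carman(porosity_map)
```

Because the relation is a fixed function of porosity alone, it cannot capture heterogeneity that is uncorrelated with porosity, which is why the velocity-constrained iterative-updating approach outperforms it in this study.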

  4. Mapping paths: new approaches to dissect eukaryotic signaling circuitry

    PubMed Central

    Mutlu, Nebibe; Kumar, Anuj

    2016-01-01

    Eukaryotic cells are precisely “wired” to coordinate changes in external and intracellular signals with corresponding adjustments in the output of complex and often interconnected signaling pathways. These pathways are critical in understanding cellular growth and function, and several experimental trends are emerging with applicability toward more fully describing the composition and topology of eukaryotic signaling networks. In particular, recent studies have implemented CRISPR/Cas-based screens in mouse and human cell lines for genes involved in various cell growth and disease phenotypes. Proteomic methods using mass spectrometry have enabled quantitative and dynamic profiling of protein interactions, revealing previously undiscovered complexes and allele-specific protein interactions. Methods for the single-cell study of protein localization and gene expression have been integrated with computational analyses to provide insight into cell signaling in yeast and metazoans. In this review, we present an overview of exemplary studies using the above approaches, relevant for the analysis of cell signaling and indeed, more broadly, for many modern biological applications. PMID:27540473

  5. A Highly Efficient Approach to Protein Interactome Mapping Based on Collaborative Filtering Framework

    PubMed Central

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-01

The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Carefully designed small-scale experiments, despite their high accuracy, are not only very expensive but also inefficient for identifying numerous interactomes. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins, on which the mapping process is based. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly. PMID:25572661
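The neighborhood-based CF idea can be sketched as follows: unknown entries of the interactome weight matrix are scored from the interaction profiles of the most similar proteins. A plain cosine similarity is used here as a simplified stand-in for the paper's rescaled cosine coefficient, and the 4-protein matrix is a toy example.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two interaction profiles."""
    nu, nv = np.linalg.norm(u), np.linalg.norm(v)
    return 0.0 if nu == 0 or nv == 0 else float(u @ v) / (nu * nv)

def predict(W, i, j, k=2):
    """Score the unobserved pair (i, j) from the k proteins most similar to i."""
    sims = np.array([cosine(W[i], W[m]) if m != i else -1.0
                     for m in range(W.shape[0])])
    nbrs = np.argsort(sims)[::-1][:k]           # top-k neighborhood of i
    num = sum(sims[m] * W[m, j] for m in nbrs)  # similarity-weighted evidence
    den = sum(abs(sims[m]) for m in nbrs)
    return num / den if den > 0 else 0.0

# Toy 4-protein interactome weight matrix (symmetric, zero diagonal).
W = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])
score = predict(W, 3, 1)   # plausibility that proteins 3 and 1 interact
```

Proteins 0 and 1 are protein 3's nearest neighbors, and one of them interacts with protein 1, so the pair (3, 1) receives an intermediate score; this neighborhood averaging is what keeps the approach usable on sparse matrices.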

  7. An integrated two-stage support vector machine approach to forecast inundation maps during typhoons

    NASA Astrophysics Data System (ADS)

    Jhong, Bing-Chen; Wang, Jhih-Huang; Lin, Gwo-Fong

    2017-04-01

During typhoons, accurate forecasts of hourly inundation depths are essential for inundation warning and mitigation. Because observed inundation maps are scarce, sufficient data are not available for developing inundation forecasting models. In this paper, the inundation depths, which are simulated and validated by a physically based two-dimensional model (FLO-2D), are used as a database for inundation forecasting. A two-stage inundation forecasting approach based on Support Vector Machine (SVM) is proposed to yield 1- to 6-h lead-time inundation maps during typhoons. In the first stage (point forecasting), the proposed approach considers not only the rainfall intensity and inundation depth as model input but also, simultaneously, cumulative rainfall and forecasted inundation depths. In the second stage (spatial expansion), the geographic information of inundation grids and the inundation forecasts of reference points are used to yield inundation maps. The results clearly indicate that the proposed approach effectively improves the forecasting performance and decreases the negative impact of increasing forecast lead time. Moreover, the proposed approach is capable of providing accurate inundation maps for 1- to 6-h lead times. In conclusion, the proposed two-stage forecasting approach is suitable and useful for improving inundation forecasting during typhoons, especially for long lead times.
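The two-stage structure can be sketched as below: stage 1 makes a point forecast of inundation depth from rainfall features, and stage 2 expands point forecasts spatially using the grid geometry. Kernel ridge regression stands in for the paper's SVM, inverse-distance weighting stands in for its spatial-expansion model, and all data are synthetic.

```python
import numpy as np

# Stage-1 regressor: kernel ridge regression (RBF kernel), a stand-in for SVM.
def fit_krr(X, y, gamma=1.0, lam=1e-3):
    K = np.exp(-gamma * np.sum((X[:, None] - X[None, :]) ** 2, axis=2))
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return lambda Xq: np.exp(-gamma * np.sum((Xq[:, None] - X[None, :]) ** 2,
                                             axis=2)) @ alpha

rng = np.random.default_rng(1)
# Features per sample: [rain intensity, cumulative rain, current depth].
X = rng.uniform(0, 1, (60, 3))
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.8 * X[:, 2]   # synthetic t+1 depth
model = fit_krr(X, y)

ref_xy = rng.uniform(0, 10, (4, 2))                 # reference point locations
ref_depth = model(rng.uniform(0, 1, (4, 3)))        # stage-1 point forecasts

# Stage 2: inverse-distance weighting of reference forecasts over the grid.
def expand(grid_xy, ref_xy, ref_depth, p=2.0):
    d = np.linalg.norm(grid_xy[:, None] - ref_xy[None, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** p
    return (w @ ref_depth) / w.sum(axis=1)

gx, gy = np.meshgrid(np.linspace(0, 10, 20), np.linspace(0, 10, 20))
grid_xy = np.column_stack([gx.ravel(), gy.ravel()])
depth_map = expand(grid_xy, ref_xy, ref_depth)      # forecast inundation map
```

Splitting the problem this way keeps the learned regressor small (one model per reference point and lead time) while still producing a full map at each forecast step.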

  8. Mapping water quality and substrate cover in optically complex coastal and reef waters: an integrated approach.

    PubMed

    Phinn, S R; Dekker, A G; Brando, V E; Roelfsema, C M

    2005-01-01

    Sustainable management of coastal and coral reef environments requires regular collection of accurate information on recognized ecosystem health indicators. Satellite image data and derived maps of water column and substrate biophysical properties provide an opportunity to develop baseline mapping and monitoring programs for coastal and coral reef ecosystem health indicators. A significant challenge for satellite image data in coastal and coral reef water bodies is the mixture of both clear and turbid waters. A new approach is presented in this paper to enable production of water quality and substrate cover type maps, linked to a field based coastal ecosystem health indicator monitoring program, for use in turbid to clear coastal and coral reef waters. An optimized optical domain method was applied to map selected water quality (Secchi depth, Kd PAR, tripton, CDOM) and substrate cover type (seagrass, algae, sand) parameters. The approach is demonstrated using commercially available Landsat 7 Enhanced Thematic Mapper image data over a coastal embayment exhibiting the range of substrate cover types and water quality conditions commonly found in sub-tropical and tropical coastal environments. Spatially extensive and quantitative maps of selected water quality and substrate cover parameters were produced for the study site. These map products were refined by interactions with management agencies to suit the information requirements of their monitoring and management programs.

  9. Benthic habitat mapping in a Portuguese Marine Protected Area using EUNIS: An integrated approach

    NASA Astrophysics Data System (ADS)

    Henriques, Victor; Guerra, Miriam Tuaty; Mendes, Beatriz; Gaudêncio, Maria José; Fonseca, Paulo

    2015-06-01

A growing demand for seabed and habitat mapping has emerged over the past years to support maritime integrated policies at the EU and national levels aiming at the sustainable use of sea resources. This study presents the results of applying the hierarchical European Nature Information System (EUNIS) to classify and map the benthic habitats of the Luiz Saldanha Marine Park, a marine protected area (MPA) located on the mainland Portuguese southwest coast, in the Iberian Peninsula. The habitat map was modelled by applying a EUNIS-based methodology to merge biotic and abiotic key habitat drivers. The modelling in this approach focused on predicting the association of different data types: substrate, bathymetry, light intensity, wave and current energy, sediment grain size and benthic macrofauna into a common framework. The resulting seamless medium-scale habitat map discriminates twenty-six distinct sublittoral habitats, including eight with no match in the current classification, which may be regarded as new potential habitat classes and therefore will be submitted to EUNIS. A discussion is provided examining the suitability of the current EUNIS scheme as a standardized approach to classify marine benthic habitats and map their spatial distribution at medium scales on the Portuguese coast. In addition, the factors that most affected the results presented in the predictive habitat map and the role of the environmental factors in macrofaunal assemblage composition and distribution are outlined.

  10. A multi-temporal analysis approach for land cover mapping in support of nuclear incident response

    NASA Astrophysics Data System (ADS)

    Sah, Shagan; van Aardt, Jan A. N.; McKeown, Donald M.; Messinger, David W.

    2012-06-01

    Remote sensing can be used to rapidly generate land use maps for assisting emergency response personnel with resource deployment decisions and impact assessments. In this study we focus on constructing accurate land cover maps to map the impacted area in the case of a nuclear material release. The proposed methodology involves integration of results from two different approaches to increase classification accuracy. The data used included RapidEye scenes over the Nine Mile Point Nuclear Power Station (Oswego, NY). The first step was building a coarse-scale land cover map from freely available, high temporal resolution, MODIS data using a time-series approach. In the case of a nuclear accident, high spatial resolution commercial satellites such as RapidEye or IKONOS can acquire images of the affected area. Land use maps from the two image sources were integrated using a probability-based approach. Classification results were obtained for four land classes - forest, urban, water and vegetation - using Euclidean and Mahalanobis distances as metrics. Despite the coarse resolution of MODIS pixels, acceptable accuracies were obtained using time series features. The overall accuracies using the fusion-based approach were in the neighborhood of 80% when compared with GIS data sets from New York State. The classifications were improved by this fused approach, which offers a few supplementary advantages such as correction for cloud cover and independence from time of year. We concluded that this method can generate highly accurate land maps using coarse spatial resolution time series satellite imagery and a single-date, high spatial resolution, multi-spectral image.
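A minimal sketch of the two ingredients named above, minimum-distance classification and probability-based fusion of two classifiers; the class means, pixel values and probabilities are hypothetical, not the study's actual statistics:

```python
import math

# Toy per-class mean band values for the four land classes (hypothetical).
CLASS_MEANS = {
    "forest":     (30.0, 80.0),
    "urban":      (90.0, 60.0),
    "water":      (10.0, 15.0),
    "vegetation": (45.0, 95.0),
}

def euclidean_classify(pixel):
    """Assign the class whose mean is nearest in Euclidean distance."""
    return min(CLASS_MEANS, key=lambda c: math.dist(pixel, CLASS_MEANS[c]))

def fuse(probs_a, probs_b):
    """Probability-based fusion: multiply the per-class probabilities
    from the two sources and renormalize."""
    joint = {c: probs_a[c] * probs_b[c] for c in probs_a}
    total = sum(joint.values())
    return {c: p / total for c, p in joint.items()}

label = euclidean_classify((12.0, 18.0))          # nearest mean: water
fused = fuse({"forest": 0.6, "water": 0.4},       # e.g. coarse MODIS result
             {"forest": 0.3, "water": 0.7})       # e.g. fine RapidEye result
```

A Mahalanobis variant would simply replace `math.dist` with a covariance-weighted distance per class.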

  11. A Visual-Based Approach for Indoor Radio Map Construction Using Smartphones.

    PubMed

    Liu, Tao; Zhang, Xing; Li, Qingquan; Fang, Zhixiang

    2017-08-04

    Localization of users in indoor spaces is a common issue in many applications. Among various technologies, a Wi-Fi fingerprinting based localization solution has attracted much attention, since it can be easily deployed using the existing off-the-shelf mobile devices and wireless networks. However, the collection of the Wi-Fi radio map is quite labor-intensive, which limits its potential for large-scale application. In this paper, a visual-based approach is proposed for the construction of a radio map in anonymous indoor environments. This approach collects multi-sensor data, e.g., Wi-Fi signals, video frames, inertial readings, when people are walking in indoor environments with smartphones in their hands. Then, it spatially recovers the trajectories of people by using both visual and inertial information. Finally, it estimates the location of fingerprints from the trajectories and constructs a Wi-Fi radio map. Experimental results show that the average location error of the fingerprints is about 0.53 m. A weighted k-nearest neighbor method is also used to evaluate the constructed radio map. The average localization error is about 3.2 m, indicating that the quality of the constructed radio map is at the same level as those constructed by site surveying. However, this approach can greatly reduce the human labor cost, which increases the potential for applying it to large indoor environments.
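The weighted k-nearest-neighbor evaluation step can be sketched as below; the radio map, RSSI values and inverse-distance weighting scheme are hypothetical illustrations, not the paper's data:

```python
import math

# Hypothetical radio map: fingerprint location (m) -> RSSI vector (dBm), one
# entry per access point.
RADIO_MAP = {
    (0.0, 0.0): [-40, -70, -80],
    (5.0, 0.0): [-70, -40, -75],
    (0.0, 5.0): [-75, -72, -42],
    (5.0, 5.0): [-65, -60, -55],
}

def wknn_locate(rssi, k=3):
    """Weighted k-nearest-neighbor positioning: average the k fingerprints
    closest in signal space, each weighted by inverse signal distance."""
    ranked = sorted(RADIO_MAP, key=lambda p: math.dist(rssi, RADIO_MAP[p]))[:k]
    weights = [1.0 / (math.dist(rssi, RADIO_MAP[p]) + 1e-6) for p in ranked]
    wsum = sum(weights)
    x = sum(w * p[0] for w, p in zip(weights, ranked)) / wsum
    y = sum(w * p[1] for w, p in zip(weights, ranked)) / wsum
    return (x, y)

est = wknn_locate([-42, -68, -78])   # signal is close to the (0, 0) fingerprint
```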

  12. A Visual-Based Approach for Indoor Radio Map Construction Using Smartphones

    PubMed Central

    Zhang, Xing; Li, Qingquan; Fang, Zhixiang

    2017-01-01

    Localization of users in indoor spaces is a common issue in many applications. Among various technologies, a Wi-Fi fingerprinting based localization solution has attracted much attention, since it can be easily deployed using the existing off-the-shelf mobile devices and wireless networks. However, the collection of the Wi-Fi radio map is quite labor-intensive, which limits its potential for large-scale application. In this paper, a visual-based approach is proposed for the construction of a radio map in anonymous indoor environments. This approach collects multi-sensor data, e.g., Wi-Fi signals, video frames, inertial readings, when people are walking in indoor environments with smartphones in their hands. Then, it spatially recovers the trajectories of people by using both visual and inertial information. Finally, it estimates the location of fingerprints from the trajectories and constructs a Wi-Fi radio map. Experimental results show that the average location error of the fingerprints is about 0.53 m. A weighted k-nearest neighbor method is also used to evaluate the constructed radio map. The average localization error is about 3.2 m, indicating that the quality of the constructed radio map is at the same level as those constructed by site surveying. However, this approach can greatly reduce the human labor cost, which increases the potential for applying it to large indoor environments. PMID:28777300

  13. A whole spectroscopic mapping approach for studying the spatial distribution of pigments in paintings

    NASA Astrophysics Data System (ADS)

    Mosca, S.; Alberti, R.; Frizzi, T.; Nevin, A.; Valentini, G.; Comelli, D.

    2016-09-01

    We propose a non-invasive approach for the identification and mapping of pigments in paintings. The method is based on three highly complementary imaging spectroscopy techniques, visible multispectral imaging, X-Ray fluorescence mapping and Raman mapping, combined with multivariate data analysis of multidimensional spectroscopic datasets for the extraction of key distribution information in a semi-automatic way. The proposed approach exploits a macro-Raman mapping device, capable of detecting Raman signals from non-perfectly planar surfaces without the need of refocusing. Here, we show that the presence of spatially correlated Raman signals, detected in adjacent points of a painted surface, reinforces the level of confidence for material identification with respect to single-point analysis, even in the presence of very weak and complex Raman signals. The new whole-mapping approach not only provides the identification of inorganic and organic pigments but also gives striking information on the spatial distribution of pigments employed in complex mixtures for achieving different hues. Moreover, we demonstrate how the synergic combination on three spectroscopic methods, characterized by highly different time consumption, yields maximum information.

  14. A new computer approach to map mixed forest features and postprocess multispectral data

    NASA Technical Reports Server (NTRS)

    Kan, E. P.

    1976-01-01

    A computer technique for mapping mixed softwood and hardwood stands in multispectral satellite imagery of forest regions is described. The purpose of the technique is to obtain smoother resource maps useful in timber harvesting operations. The computer program relies on an algorithm which assesses the size and similarity of adjacent sections on satellite imagery (Landsat-1 data is used) and constructs, through an iteration of the basic algorithm, a more general map of timber mixtures, eliminating the mottled appearance of the raw imagery. Despite difficulties in the experimental analysis of a Texas forest, apparently due to relatively low resolution of the Landsat data, the computer classification approach outlined is suggested as a generally applicable method of creating serviceable maps from multispectral imagery.
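The iteration described above, repeatedly merging adjacent similar sections until the map stabilizes, can be sketched on a one-dimensional strip of segments; the sizes, mean values and similarity tolerance are all hypothetical:

```python
# Toy 1-D strip of segments, each (size, mean value). Adjacent segments whose
# means differ by less than TOL are merged, and the pass repeats until no
# merge occurs -- a simplified, hypothetical version of the iteration.
TOL = 5.0

def merge_pass(segments):
    """One left-to-right pass merging adjacent similar segments."""
    out = [segments[0]]
    for size, mean in segments[1:]:
        psize, pmean = out[-1]
        if abs(mean - pmean) < TOL:           # similar enough: merge
            total = psize + size
            out[-1] = (total, (psize * pmean + size * mean) / total)
        else:
            out.append((size, mean))
    return out

def iterate_merge(segments):
    """Repeat merge passes until the segmentation stops changing."""
    while True:
        merged = merge_pass(segments)
        if len(merged) == len(segments):
            return merged
        segments = merged

smoothed = iterate_merge([(4, 10.0), (2, 12.0), (3, 30.0), (1, 33.0)])
```

The fixed point removes the "mottled" fine structure while preserving the boundary between genuinely different timber mixtures.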

  15. Mapping raised bogs with an iterative one-class classification approach

    NASA Astrophysics Data System (ADS)

    Mack, Benjamin; Roscher, Ribana; Stenzel, Stefanie; Feilhauer, Hannes; Schmidtlein, Sebastian; Waske, Björn

    2016-10-01

    Land use and land cover maps are one of the most commonly used remote sensing products. In many applications the user only requires a map of one particular class of interest, e.g. a specific vegetation type or an invasive species. One-class classifiers are appealing alternatives to common supervised classifiers because they can be trained with labeled training data of the class of interest only. However, training an accurate one-class classification (OCC) model is challenging, particularly when facing a large image, a small class and few training samples. To tackle these problems we propose an iterative OCC approach. The presented approach uses a biased Support Vector Machine as core classifier. In an iterative pre-classification step a large part of the pixels not belonging to the class of interest is classified. The remaining data is classified by a final classifier with a novel model and threshold selection approach. The specific objective of our study is the classification of raised bogs in a study site in southeast Germany, using multi-seasonal RapidEye data and a small number of training samples. Results demonstrate that the iterative OCC outperforms other state-of-the-art one-class classifiers and approaches for model selection. The study highlights the potential of the proposed approach for an efficient and improved mapping of small classes such as raised bogs. Overall the proposed method constitutes a feasible and useful modification of a regular one-class classifier.
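The two-step idea, a loose pre-classification that discards most non-target pixels followed by a tighter threshold chosen from the training-distance distribution, can be sketched with a simple centroid-distance scorer standing in for the biased SVM; all samples and thresholds below are hypothetical:

```python
import math
import statistics

# Labeled samples of the class of interest only (toy 2-band reflectances).
TRAIN = [(0.2, 0.8), (0.25, 0.85), (0.3, 0.75), (0.22, 0.9)]

def centroid(points):
    return tuple(statistics.fmean(c) for c in zip(*points))

def one_class_map(pixels, percentile=0.95, margin=3.0):
    """Simplified stand-in for the iterative OCC pipeline:
    (1) a coarse pre-classification discards pixels beyond a loose threshold;
    (2) the survivors are scored against a tight threshold taken from the
    distribution of training distances."""
    c = centroid(TRAIN)
    train_d = sorted(math.dist(p, c) for p in TRAIN)
    tight = train_d[int(percentile * (len(train_d) - 1))]
    loose = margin * tight
    kept = [p for p in pixels if math.dist(p, c) <= loose]   # step 1
    return [p for p in kept if math.dist(p, c) <= tight]     # step 2

hits = one_class_map([(0.24, 0.82), (0.9, 0.1), (0.28, 0.8)])
```

In the actual method the scorer is a biased SVM and the threshold selection is model-driven; the two-stage filtering structure is the point illustrated here.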

  16. MAPS

    Atmospheric Science Data Center

    2014-07-03

    Measurement of Air Pollution from Satellites (MAPS) data were collected during Space Shuttle flights in 1981, … Correlative data: CDIAC; Spring & Fall 1994 Field and Aircraft Campaigns; SCAR-B.

  17. A Bayesian approach for mapping event landslides using optical remote sensing imagery and digital terrain data

    NASA Astrophysics Data System (ADS)

    Guzzetti, F.; Mondini, A. C.; Marchesini, I.; Rossi, M.; Chang, K.; Pasquariello, G.

    2012-12-01

    Event landslide inventory maps can be prepared using conventional or new mapping methods. Conventional methods, including field mapping and the visual interpretation of stereoscopic aerial photographs, are time consuming and resource intensive, restricting the ability to prepare event inventory maps rapidly, repeatedly, and for large and very large areas. This is a significant drawback for regional landslide studies and post-event remedial efforts. Investigators are currently experimenting with new methods for preparing landslide event inventories that exploit remotely sensed data, including qualitative (visual) and quantitative (numerical) analysis of very-high-resolution (VHR) digital elevation models obtained chiefly through LiDAR surveys, and the interpretation and analysis of satellite images, including panchromatic, multispectral, and synthetic aperture radar images. We devised a stepwise, semi-automatic approach to detect, map, and internally classify rainfall-induced shallow landslides, exploiting multispectral satellite images taken shortly after a landslide-triggering event and information on the topographic signature of landslides obtained from a pre-event digital elevation model. In a Bayesian framework, the approach combines a standard image classification obtained by a supervised classifier (e.g., the Mahalanobis distance classifier) applied to a post-event image with information on the morphometric landslide signature measured by statistics of terrain slope and cross-section convexity in landslide and stable areas. The semi-automatic approach is applied in two steps. First, the rainfall-induced landslides are detected and mapped, separating them from the stable areas. Next, the mapped landslides are classified internally, separating the source from the run-out areas. We have applied the approach in a 117 km2 study area in Taiwan, where shallow landslides triggered by high-intensity rainfall brought by Typhoon Morakot in August 2009 were abundant.
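The Bayesian combination step can be sketched as a naive-Bayes product of a spectral term and a morphometric (slope) term; all numbers below, including the per-class slope statistics and likelihoods, are hypothetical:

```python
import math

def gauss(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical per-class terrain-slope statistics (mean, std dev, degrees).
SLOPE_STATS = {"landslide": (28.0, 6.0), "stable": (12.0, 8.0)}

def posterior(spectral_like, slope, prior=None):
    """Combine spectral likelihoods (from the supervised image classifier)
    with slope likelihoods in a naive-Bayes fashion, then normalize."""
    prior = prior or {c: 0.5 for c in SLOPE_STATS}
    score = {c: prior[c] * spectral_like[c] * gauss(slope, *SLOPE_STATS[c])
             for c in SLOPE_STATS}
    z = sum(score.values())
    return {c: s / z for c, s in score.items()}

post = posterior({"landslide": 0.6, "stable": 0.4}, slope=30.0)
```

A steep pixel with moderate spectral evidence ends up strongly classified as landslide, which is exactly the reinforcement the topographic signature provides.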

  18. Simulating generic spin-boson models with matrix product states

    NASA Astrophysics Data System (ADS)

    Wall, Michael L.; Safavi-Naini, Arghavan; Rey, Ana Maria

    2016-11-01

    The global coupling of few-level quantum systems ("spins") to a discrete set of bosonic modes is a key ingredient for many applications in quantum science, including large-scale entanglement generation, quantum simulation of the dynamics of long-range interacting spin models, and hybrid platforms for force and spin sensing. We present a general numerical framework for treating the out-of-equilibrium dynamics of such models based on matrix product states. Our approach applies for generic spin-boson systems: it treats any spatial and operator dependence of the two-body spin-boson coupling and places no restrictions on relative energy scales. We show that the full counting statistics of collective spin measurements and infidelity of quantum simulation due to spin-boson entanglement, both of which are difficult to obtain by other techniques, are readily calculable in our approach. We benchmark our method using a recently developed exact solution for a particular spin-boson coupling relevant to trapped ion quantum simulators. Finally, we show how decoherence can be incorporated within our framework using the method of quantum trajectories, and study the dynamics of an open-system spin-boson model with spatially nonuniform spin-boson coupling relevant for trapped atomic ion crystals in the presence of molecular ion impurities.
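The class of models the framework addresses can be summarized by a generic spin-boson Hamiltonian; the notation below (mode frequencies ω_k, transverse fields B_i, site- and mode-dependent couplings g_{i,k}) is a standard generic form consistent with, but not quoted from, the paper:

```latex
H = \sum_k \omega_k\, a_k^\dagger a_k
  + \sum_i \frac{B_i}{2}\, \sigma_i^z
  + \sum_{i,k} g_{i,k}\, \sigma_i^{\alpha} \left( a_k + a_k^\dagger \right)
```

The method's generality lies in allowing arbitrary spatial and operator dependence of g_{i,k} (any Pauli component σ^α) and arbitrary ratios among ω_k, B_i, and g_{i,k}.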

  19. Effect of topographic data, geometric configuration and modeling approach on flood inundation mapping

    NASA Astrophysics Data System (ADS)

    Cook, Aaron; Merwade, Venkatesh

    2009-10-01

    Technological aspects of producing, delivering and updating flood hazard maps in the US have gone through a revolutionary change through the Federal Emergency Management Agency's Map Modernization program. In addition, the use of topographic information derived from Light Detection and Ranging (LIDAR) is enabling creation of relatively more accurate flood inundation maps. However, LIDAR is not available for the entire United States. Even for areas where LIDAR data are available, the effect of other factors such as cross-section configuration in one-dimensional (1D) models, mesh resolution in two-dimensional (2D) models, representation of river bathymetry, and modeling approach is not well studied or documented. The objective of this paper is to address some of these issues by comparing newly developed flood inundation maps from LIDAR data to maps that are developed using different topography, geometric descriptions and modeling approaches. The methodology involves use of six topographic datasets with different horizontal resolutions, vertical accuracies and bathymetry details. Each topographic dataset is used to create a flood inundation map for twelve different cross-section configurations using the 1D HEC-RAS model, and two mesh resolutions using the 2D FESWMS model. Comparison of resulting maps for two study areas (Strouds Creek in North Carolina and the Brazos River in Texas) shows that the flood inundation area reduces with improved horizontal resolution and vertical accuracy in the topographic data. This reduction is further enhanced by incorporating river bathymetry in the topography data. Overall, the inundation extent predicted by FESWMS is smaller than that predicted by HEC-RAS for the study areas, and the variations in the flood inundation maps arising from different factors are smaller in FESWMS than in HEC-RAS.

  20. Deciphering the genomic architecture of the stickleback brain with a novel multilocus gene-mapping approach.

    PubMed

    Li, Zitong; Guo, Baocheng; Yang, Jing; Herczeg, Gábor; Gonda, Abigél; Balázs, Gergely; Shikano, Takahito; Calboli, Federico C F; Merilä, Juha

    2017-03-01

    Quantitative traits important to organismal function and fitness, such as brain size, are presumably controlled by many small-effect loci. Deciphering the genetic architecture of such traits with traditional quantitative trait locus (QTL) mapping methods is challenging. Here, we investigated the genetic architecture of brain size (and the size of five different brain parts) in nine-spined sticklebacks (Pungitius pungitius) with the aid of novel multilocus QTL-mapping approaches based on a de-biased LASSO method. Apart from having more statistical power to detect QTL and a reduced rate of false positives than conventional QTL-mapping approaches, the developed methods can handle large marker panels and provide estimates of genomic heritability. Single-locus analyses of an F2 interpopulation cross with 239 individuals and 15,198 fully informative single nucleotide polymorphisms (SNPs) uncovered 79 QTL associated with variation in stickleback brain size traits. Many of these loci were in strong linkage disequilibrium (LD) with each other, and consequently, a multilocus mapping of individual SNPs, accounting for LD structure in the data, recovered only four significant QTL. However, a multilocus mapping of SNPs grouped by linkage group (LG) identified 14 LGs (1-6 depending on the trait) that influence variation in brain traits. For instance, 17.6% of the variation in relative brain size was explainable by cumulative effects of SNPs distributed over six LGs, whereas 42% of the variation was accounted for by all 21 LGs. Hence, the results suggest that variation in stickleback brain traits is influenced by many small-effect loci. Apart from suggesting a moderately heritable (h² ≈ 0.15-0.42) multifactorial genetic architecture of brain traits, the results highlight the challenges in identifying the loci contributing to variation in quantitative traits. Nevertheless, the results demonstrate that the novel QTL-mapping approach developed here has distinctive advantages
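At the core of any LASSO-based multilocus mapping is the soft-thresholding operator, which shrinks small effect estimates to exactly zero during coordinate descent; a minimal sketch (the de-biased LASSO adds a bias-correction step not shown here):

```python
def soft_threshold(z, t):
    """Soft-thresholding: shrink z toward zero by t, clipping to zero.
    This is the elementary update inside LASSO coordinate descent."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

# Shrinking three raw per-SNP effect estimates with penalty t = 0.5:
# large effects survive (shrunken), small ones are zeroed out.
shrunk = [soft_threshold(b, 0.5) for b in [2.0, 0.3, -1.2]]
```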

  1. Integrated environmental mapping and monitoring, a methodological approach to optimise knowledge gathering and sampling strategy.

    PubMed

    Nilssen, Ingunn; Ødegård, Øyvind; Sørensen, Asgeir J; Johnsen, Geir; Moline, Mark A; Berge, Jørgen

    2015-07-15

    New technology has led to new opportunities for a holistic environmental monitoring approach adjusted to the purpose and object of interest. The proposed integrated environmental mapping and monitoring (IEMM) concept, presented in this paper, describes the different steps in such a system, from mission of survey to selection of parameters, sensors, sensor platforms, data collection, data storage, analysis, and data interpretation for reliable decision making. The system is generic; it can be used by authorities, industry and academia, and is useful for both planning and operational phases. In the planning process, the systematic approach is also ideal for identifying areas with knowledge gaps. The critical stages of the concept are discussed and exemplified by two case studies: one environmental mapping case and one monitoring case. As an operational system, the IEMM concept can contribute to optimised integrated environmental mapping and monitoring for knowledge generation as a basis for decision making.

  2. Determination of contact maps in proteins: A combination of structural and chemical approaches

    SciTech Connect

    Wołek, Karol; Cieplak, Marek

    2015-12-28

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.
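The rCSU validity rule described above, that attractive atomic contacts must outnumber repulsive ones, can be sketched as follows; the coordinates, cutoff and the binary attractive/repulsive labelling are hypothetical simplifications of the actual chemical classification:

```python
import math

CUTOFF = 4.5  # hypothetical atomic contact distance (Å)

def residue_contact(atoms_i, atoms_j):
    """Simplified rCSU-style validity check: an inter-residue contact holds
    only if attractive atom pairs within the cutoff outnumber repulsive
    ones. Each atom is (x, y, z, kind) with kind in {'attr', 'rep'}
    standing in for the chemical contact classification."""
    attractive = repulsive = 0
    for xi, yi, zi, ki in atoms_i:
        for xj, yj, zj, kj in atoms_j:
            if math.dist((xi, yi, zi), (xj, yj, zj)) <= CUTOFF:
                if ki == 'attr' and kj == 'attr':
                    attractive += 1
                else:
                    repulsive += 1
    return attractive > repulsive

res_a = [(0.0, 0.0, 0.0, 'attr'), (1.0, 0.0, 0.0, 'attr')]
res_b = [(3.0, 0.0, 0.0, 'attr'), (6.0, 0.0, 0.0, 'rep')]
in_contact = residue_contact(res_a, res_b)
```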

  3. Doubly Dressed Bosons: Exciton Polaritons in a Strong Terahertz Field

    NASA Astrophysics Data System (ADS)

    Pietka, B.; Bobrovska, N.; Stephan, D.; Teich, M.; Król, M.; Winnerl, S.; Pashkin, A.; Mirek, R.; Lekenta, K.; Morier-Genoud, F.; Schneider, H.; Deveaud, B.; Helm, M.; Matuszewski, M.; Szczytko, J.

    2017-08-01

    We demonstrate the existence of a novel quasiparticle: an exciton in a semiconductor doubly dressed with two photons of different wavelengths, a near-infrared cavity photon and a terahertz (THz) photon, with the THz coupling strength approaching the ultrastrong-coupling regime. This quasiparticle is composed of three different bosonic fields, making it a mixed matter-light quasiparticle. Our observations are confirmed by a detailed theoretical analysis treating all three bosonic fields quantum mechanically. The doubly dressed quasiparticles retain the bosonic nature of their constituents, but their internal quantum structure depends strongly on the intensity of the applied terahertz field.

  4. Concept Maps in the Classroom: A New Approach to Reveal Students' Conceptual Change

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Liefländer, Anne K.; Bogner, Franz X.

    2015-01-01

    When entering the classroom, adolescents already hold various conceptions on science topics. Concept maps may function as useful tools to reveal such conceptions although labor-intensive analysis often prevents application in typical classroom situations. The authors aimed to provide teachers with an appropriate approach to analyze students'…

  5. Improving Students' Creative Thinking and Achievement through the Implementation of Multiple Intelligence Approach with Mind Mapping

    ERIC Educational Resources Information Center

    Widiana, I. Wayan; Jampel, I. Nyoman

    2016-01-01

    This classroom action research aimed to improve the students' creative thinking and achievement in learning science. It was conducted through the implementation of a multiple intelligences with mind mapping approach and by describing the students' responses. The subjects of this research were the fifth grade students of SD 8 Tianyar Barat, Kubu, and…

  6. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    ERIC Educational Resources Information Center

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  7. Does Constructivist Approach Applicable through Concept Maps to Achieve Meaningful Learning in Science?

    ERIC Educational Resources Information Center

    Jena, Ananta Kumar

    2012-01-01

    This study deals with the application of constructivist approach through individual and cooperative modes of spider and hierarchical concept maps to achieve meaningful learning on science concepts (e.g. acids, bases & salts, physical and chemical changes). The main research questions were: Q (1): is there any difference in individual and…

  9. The Criterion-Related Validity of a Computer-Based Approach for Scoring Concept Maps

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Koul, Ravinder; Salehi, Roya

    2006-01-01

    This investigation seeks to confirm a computer-based approach that can be used to score concept maps (Poindexter & Clariana, 2004) and then describes the concurrent criterion-related validity of these scores. Participants enrolled in two graduate courses (n=24) were asked to read about and research online the structure and function of the heart…

  10. Orbital stability during the mapping and approach phases of the MarcoPolo-R spacecraft

    NASA Astrophysics Data System (ADS)

    Wickhusen, K.; Hussmann, H.; Oberst, J.; Luedicke, F.

    2012-09-01

    In support of the MarcoPolo-R mission we are analyzing the motion of the spacecraft in the vicinity of its primary target, the binary asteroid system 175706 (1996 FG3). We ran simulations to support the general mapping, approach, and sampling phases.

  11. Testing a Landsat-based approach for mapping disturbance causality in U.S. forests

    Treesearch

    Todd A. Schroeder; Karen G. Schleeweis; Gretchen G. Moisen; Chris Toney; Warren B. Cohen; Elizabeth A. Freeman; Zhiqiang Yang; Chengquan Huang

    2017-01-01

    In light of Earth's changing climate and growing human population, there is an urgent need to improve monitoring of natural and anthropogenic disturbances, which affect forests' ability to sequester carbon and provide other ecosystem services. In this study, a two-step modeling approach was used to map the type and timing of forest disturbances occurring...

  13. Shape-to-String Mapping: A Novel Approach to Clustering Time-Index Biomics Data

    USDA-ARS?s Scientific Manuscript database

    Herein we describe a qualitative approach for clustering time-index biomics data. The data are transformed into angles from the intensity-ratios between adjacent time-points. A code is used to map a qualitative representation of the numerical time-index data which captures the features in the data ...

  16. Higgs boson production via vector-boson fusion at next-to-next-to-leading order in QCD.

    PubMed

    Bolzoni, Paolo; Maltoni, Fabio; Moch, Sven-Olaf; Zaro, Marco

    2010-07-02

    We present the total cross sections at next-to-next-to-leading order in the strong coupling for Higgs boson production via weak-boson fusion. Our results are obtained via the structure function approach, which builds upon the approximate, though very accurate, factorization of the QCD corrections between the two quark lines. The theoretical uncertainty on the total cross sections at the LHC from higher order corrections and the parton distribution uncertainties are estimated at the 2% level each for a wide range of Higgs boson masses.

  17. Correlation energy for elementary bosons: Physics of the singularity

    SciTech Connect

    Shiau, Shiue-Yuan; Combescot, Monique; Chang, Yia-Chung

    2016-04-15

    We propose a compact perturbative approach that reveals the physical origin of the singularity occurring in the density dependence of the correlation energy: like fermions, elementary bosons have a singular correlation energy which comes from the accumulation, through Feynman “bubble” diagrams, of the same non-zero momentum transfer excitations from the free-particle ground state, that is, the Fermi sea for fermions and the Bose–Einstein condensate for bosons. This understanding paves the way toward deriving the correlation energy of composite bosons like atomic dimers and semiconductor excitons by suggesting Shiva diagrams that are similar to Feynman “bubble” diagrams; the previous elementary-boson approaches, which hide this physics, are inappropriate for this task.

  18. A hierarchical Bayesian-MAP approach to inverse problems in imaging

    NASA Astrophysics Data System (ADS)

    Raj, Raghu G.

    2016-07-01

    We present a novel approach to inverse problems in imaging based on a hierarchical Bayesian-MAP (HB-MAP) formulation. In this paper we specifically focus on the difficult and basic inverse problem of multi-sensor (tomographic) imaging, wherein the source object of interest is viewed from multiple directions by independent sensors. Given the measurements recorded by these sensors, the problem is to reconstruct the image (of the object) with a high degree of fidelity. We employ a probabilistic graphical modeling extension of the compound Gaussian distribution as a global image prior in a hierarchical Bayesian inference procedure. Since the prior employed by our HB-MAP algorithm is general enough to subsume a wide class of priors, including those typically employed in compressive sensing (CS) algorithms, the HB-MAP algorithm offers a vehicle to extend the capabilities of current CS algorithms to include truly global priors. After rigorously deriving the regression algorithm for solving our inverse problem from first principles, we demonstrate the performance of the HB-MAP algorithm on Monte Carlo trials and on real empirical data (natural scenes). In all cases we find that our algorithm outperforms previous approaches in the literature, including filtered back-projection and a variety of state-of-the-art CS algorithms. We conclude with directions of future research emanating from this work.
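Under a plain Gaussian prior, a far simpler special case than the compound-Gaussian model above, the MAP estimate reduces to Tikhonov-regularized least squares; a toy two-unknown tomographic sketch with hypothetical projection data:

```python
# MAP estimation with a Gaussian prior reduces to Tikhonov-regularized
# least squares: x* = argmin ||Ax - y||^2 + lam ||x||^2.

def map_estimate(A, y, lam):
    """Solve the normal equations (A^T A + lam I) x = A^T y for 2 unknowns
    via an explicit 2x2 inverse."""
    ata = [[sum(A[k][i] * A[k][j] for k in range(len(A)))
            + (lam if i == j else 0.0) for j in range(2)] for i in range(2)]
    aty = [sum(A[k][i] * y[k] for k in range(len(A))) for i in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    x0 = (ata[1][1] * aty[0] - ata[0][1] * aty[1]) / det
    x1 = (ata[0][0] * aty[1] - ata[1][0] * aty[0]) / det
    return [x0, x1]

# Three noiseless "projections" of a 2-pixel object [2, 3]: each pixel
# viewed alone, plus their sum along a third direction.
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = [2.0, 3.0, 5.0]
x_map = map_estimate(A, y, lam=1e-6)
```

The hierarchical compound-Gaussian prior replaces the fixed `lam` with latent scale variables inferred jointly with the image, which is what admits the much richer class of priors.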

  19. Toward real-time three-dimensional mapping of surficial aquifers using a hybrid modeling approach

    NASA Astrophysics Data System (ADS)

    Friedel, Michael J.; Esfahani, Akbar; Iwashita, Fabio

    2016-02-01

    A hybrid modeling approach is proposed for near real-time three-dimensional (3D) mapping of surficial aquifers. First, airborne frequency-domain electromagnetic (FDEM) measurements are numerically inverted to obtain subsurface resistivities. Second, a machine-learning (ML) algorithm is trained using the FDEM measurements, the inverted resistivity profiles, and borehole geophysical and hydrogeologic data. Third, the trained ML algorithm is used together with independent FDEM measurements to map the spatial distribution of the aquifer system. Efficacy of the hybrid approach is demonstrated for mapping a heterogeneous surficial aquifer and confining unit in northwestern Nebraska, USA. For this case, independent performance testing reveals that aquifer mapping is unbiased, with a strong correlation (0.94) between numerically inverted and ML-estimated binary (clay-silt or sand-gravel) layer resistivities (5-20 ohm-m or 21-5,000 ohm-m), and an intermediate correlation (0.74) for heterogeneous (clay, silt, sand, gravel) layer resistivities (5-5,000 ohm-m). The reduced correlation for the heterogeneous model is attributed to over-estimation of the under-sampled high-resistivity gravels (about 0.5% of the training data); when these are removed, the correlation increases (0.87). Independent analysis of the numerically inverted and ML-estimated resistivities finds that the hybrid procedure preserves both univariate and spatial statistics for each layer. Following training, the algorithms can map 3D surficial aquifers as fast as leveled FDEM measurements are presented to the ML network.
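A minimal sketch of the three-step hybrid workflow, with least-squares regression standing in for the ML algorithm and fabricated data standing in for the FDEM and borehole measurements:

```python
import numpy as np

# Fabricated stand-in data for the three-step workflow; least-squares
# regression stands in for the ML algorithm trained in step two.
rng = np.random.default_rng(1)
n_train, n_feat = 200, 6
X = rng.normal(size=(n_train, n_feat))            # FDEM responses at training sites
w_true = rng.normal(size=n_feat)                  # hidden relation (for the demo)
y = X @ w_true + 0.05 * rng.normal(size=n_train)  # inverted log10(resistivity)

# Step 2: "train" the mapping from FDEM response to layer resistivity
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Step 3: apply the trained mapping to independent FDEM measurements
X_new = rng.normal(size=(10, n_feat))
log_rho_pred = X_new @ w
```

Once trained, prediction is a single matrix product per site, which is what makes near real-time mapping feasible as new leveled FDEM measurements arrive.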

  20. Single-molecule approach to bacterial genomic comparisons via optical mapping.

    SciTech Connect

    Zhou, Shiguo; Kile, A.; Bechner, M.; Kvikstad, E.; Deng, W.; Wei, J.; Severin, J.; Runnheim, R.; Churas, C.; Forrest, D.; Dimalanta, E.; Lamers, C.; Burland, V.; Blattner, F. R.; Schwartz, David C.

    2004-01-01

    Modern comparative genomics has been established, in part, by the sequencing and annotation of a broad range of microbial species. To gain further insights, new sequencing efforts are now dealing with the variety of strains or isolates that give a species its definition and range; however, their number vastly outstrips our ability to sequence them. Given the availability of a large number of microbial species, new whole-genome approaches that maximize discovery must be developed to fully leverage this information at the level of strain diversity. Here, we describe how optical mapping, a single-molecule system, was used to identify and annotate chromosomal alterations between bacterial strains represented by several species. Since whole-genome optical maps are ordered restriction maps, sequenced strains of Shigella flexneri serotype 2a (2457T and 301), Yersinia pestis (CO 92 and KIM), and Escherichia coli were aligned as maps to identify regions of homology and to further characterize them as possible insertions, deletions, inversions, or translocations. Importantly, an unsequenced Shigella flexneri strain (serotype Y strain AMC[328Y]) was optically mapped and aligned with two sequenced ones to reveal one novel locus implicated in serotype conversion and several other loci containing insertion sequence elements or phage-related gene insertions. Our results suggest that genomic rearrangements and chromosomal breakpoints are readily identified and annotated against a prototypic sequenced strain by using the tools of optical mapping.

  1. A unified approach for debugging is-a structure and mappings in networked taxonomies

    PubMed Central

    2013-01-01

    Background: With the increased use of ontologies and ontology mappings in semantically-enabled applications such as ontology-based search and data integration, the issue of detecting and repairing defects in ontologies and ontology mappings has become increasingly important. These defects can lead to wrong or incomplete results for the applications. Results: We propose a unified framework for debugging the is-a structure of, and mappings between, taxonomies, the most widely used kind of ontologies. We present theory and algorithms, as well as an implemented system, RepOSE, that supports a domain expert in detecting and repairing missing and wrong is-a relations and mappings. We also discuss two experiments performed by domain experts: an experiment on the Anatomy ontologies from the Ontology Alignment Evaluation Initiative, and a debugging session for the Swedish National Food Agency. Conclusions: Semantically-enabled applications need high-quality ontologies and ontology mappings. One key aspect is the detection and removal of defects in the ontologies and ontology mappings. Our system RepOSE provides an environment that supports domain experts in dealing with this issue. We have shown the usefulness of the approach in two experiments by detecting and repairing circa 200 and 30 defects, respectively. PMID:23548155
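The kind of defect detection such a framework supports can be illustrated with a toy example: equivalence mappings between two taxonomies let is-a relations in one suggest candidate missing is-a relations in the other, which a domain expert then validates. The taxonomies, mappings, and detection rule below are simplified assumptions, not the RepOSE algorithm itself.

```python
# Toy candidate-detection rule: if a1, a2 in taxonomy A map to b1, b2 in
# taxonomy B, and b1 is-a b2 is derivable in B but a1 is-a a2 is not
# derivable in A, then a1 is-a a2 is a candidate missing relation.
is_a_A = {("poodle", "animal")}
is_a_B = {("poodle", "dog"), ("dog", "animal")}
mapping = {"poodle": "poodle", "dog": "dog", "animal": "animal"}

def closure(rels):
    """Transitive closure of a set of is-a pairs."""
    rels = set(rels)
    changed = True
    while changed:
        changed = False
        for (x, y) in list(rels):
            for (y2, z) in list(rels):
                if y == y2 and (x, z) not in rels:
                    rels.add((x, z))
                    changed = True
    return rels

cA, cB = closure(is_a_A), closure(is_a_B)
candidates = {(a1, a2) for a1 in mapping for a2 in mapping
              if (mapping[a1], mapping[a2]) in cB and (a1, a2) not in cA}
```

Here the candidates are ("poodle", "dog") and ("dog", "animal"), while ("poodle", "animal") is already derivable in A and is not flagged.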

  2. Engineering geological mapping in Wallonia (Belgium) : present state and recent computerized approach

    NASA Astrophysics Data System (ADS)

    Delvoie, S.; Radu, J.-P.; Ruthy, I.; Charlier, R.

    2012-04-01

    An engineering geological map can be defined as a geological map with a generalized representation of all the components of a geological environment that are strongly required for spatial planning, design, construction and maintenance of civil engineering works. In Wallonia (Belgium), 24 engineering geological maps were developed between the 1970s and the 1990s at 1/5,000 or 1/10,000 scale, covering some areas of the most industrialized and urbanized cities (Liège, Charleroi and Mons). They were based on soil and subsoil data points (borings, drillings, penetration tests, geophysical tests, outcrops…). Some maps display the depth (with isoheights) or the thickness (with isopachs) of the different subsoil layers up to about 50 m depth. Information about the geomechanical properties of each subsoil layer, useful for engineers and urban planners, is also synthesized. However, these maps were produced only on paper and progressively needed to be updated with new soil and subsoil data. The Public Service of Wallonia and the University of Liège have recently initiated a study to evaluate the feasibility of developing engineering geological mapping with a computerized approach. Numerous and various data (about soil and subsoil) are stored in a georelational database (the geotechnical database - using Access, Microsoft®). All the data are geographically referenced. The database is linked to a GIS project (using ArcGIS, ESRI®). Together, the database and the GIS project constitute a powerful tool for spatial data management and analysis. This approach involves a methodology using interpolation methods to update the previous maps and to extend the coverage to new areas. The location (x, y, z) of each subsoil layer is then computed from data points. The geomechanical data of these layers are synthesized in an explanatory booklet accompanying the maps.

  3. Using Concept Mapping in Community-Based Participatory Research: A Mixed Methods Approach

    PubMed Central

    Windsor, Liliane Cambraia

    2015-01-01

    Community-based participatory research (CBPR) has been identified as a useful approach to increasing community involvement in research. Developing rigorous methods in conducting CBPR is an important step in gaining more support for this approach. The current article argues that concept mapping, a structured mixed methods approach, is useful in the initial development of a rigorous CBPR program of research aiming to develop culturally tailored and community-based health interventions for vulnerable populations. A research project examining social dynamics and consequences of alcohol and substance use in Newark, New Jersey, is described to illustrate the use of concept mapping methodology in CBPR. A total of 75 individuals participated in the study. PMID:26561484

  4. A new GIS approach for reconstructing and mapping dynamic late Holocene coastal plain palaeogeography

    NASA Astrophysics Data System (ADS)

    Pierik, H. J.; Cohen, K. M.; Stouthamer, E.

    2016-10-01

    The geomorphological development of Holocene coastal plains around the world has been studied since the beginning of the twentieth century from various disciplines, resulting in large amounts of data. However, the overwhelming quantities and heterogeneous nature of these data have caused the divided knowledge to remain inconsistent and fragmented. To keep improving the understanding of coastal plain geomorphology and geology, cataloguing of data and integration of knowledge are essential. In this paper we present a GIS that incorporates the accumulated data of the Netherlands' coastal plain and functions as a storage and integration tool for mapped coastal plain data. The GIS stores redigitised architectural elements (beach barriers, tidal channels, intertidal flats, supratidal flats, and coastal fresh water peat) from earlier mappings in separate map layers. A coupled catalogue-style database stores the dating information of these elements, along with references to source studies and annotations regarding changed insights. Using scripts, the system automatically generates palaeogeographical maps for any chosen moment, combining the above mapping and dating information. In our approach, we strip the information to architectural element level, and we separate mapping from dating information, serving the automatic generation of time-slice maps. It enables a workflow in which the maker can iteratively regenerate maps, which speeds up fine-tuning and thus improves the quality of the palaeogeographical reconstruction. The GIS currently covers the late Holocene coastal plain development of the Netherlands. This period witnessed widespread renewed flooding along the southern North Sea coast, coinciding with large-scale reclamation and human occupation. Our GIS method is generic and can be expanded and adapted to allow faster integrated processing of growing amounts of data for many coastal areas and other large urbanising lowlands around the world. It allows maintaining actual data
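The time-slice generation step can be sketched as a simple query over dated elements; the element records and field names below are hypothetical stand-ins for the catalogue database.

```python
# Minimal sketch of time-slice selection: each architectural element carries
# a dating interval (years BP, hypothetical fields); a palaeogeographical
# map for moment t keeps the elements active at t.
elements = [
    {"type": "beach barrier", "start_bp": 5000, "end_bp": 2500},
    {"type": "tidal channel", "start_bp": 3000, "end_bp": 800},
    {"type": "coastal peat",  "start_bp": 4500, "end_bp": 1000},
]

def time_slice(elements, t_bp):
    """Return the elements active at moment t_bp (start >= t >= end)."""
    return [e for e in elements if e["start_bp"] >= t_bp >= e["end_bp"]]

active = time_slice(elements, 2000)   # map layer content for 2000 years BP
```

In the real system the selected elements are map geometries rather than dictionaries, but the filter-by-dating-interval logic is the same, which is what makes regenerating a map for any chosen moment cheap.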

  5. DREAM--a novel approach for robust, ultrafast, multislice B₁ mapping.

    PubMed

    Nehrke, Kay; Börnert, Peter

    2012-11-01

    A novel multislice B₁-mapping method, dubbed dual refocusing echo acquisition mode (DREAM), is proposed, able to cover the whole transmit-coil volume in only one second, which is more than an order of magnitude faster than established approaches. The DREAM technique employs a stimulated echo acquisition mode (STEAM) preparation sequence followed by a tailored single-shot gradient echo sequence, measuring the stimulated echo and the free induction decay simultaneously as gradient-recalled echoes, and determining the actual flip angle of the STEAM preparation radiofrequency pulses from the ratio of the two measured signals. Owing to an elaborate timing scheme, the method is insensitive to susceptibility/chemical-shift effects and can deliver a B₀ phase map and a transceive phase map for free. The approach has only a weak T₁ and T₂ dependence and, moreover, causes only a low specific absorption rate (SAR) burden. The accuracy of the method with respect to systematic and statistical errors is investigated both theoretically and in phantom experiments. In addition, the performance of the approach is demonstrated in vivo in B₁-mapping and radiofrequency-shimming experiments on the abdomen, the legs, and the head on an eight-channel parallel-transmit 3 T MRI system. Copyright © 2012 Wiley Periodicals, Inc.
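The flip-angle computation at the heart of DREAM uses only the ratio of the two signals; a minimal sketch, assuming ideal magnitude signals and the relation reported for DREAM, α = arctan(√(2·S_STE/S_FID)):

```python
import math

def dream_flip_angle(ste, fid):
    """Flip angle (degrees) of the STEAM preparation pulses from the
    magnitudes of the stimulated echo (ste) and FID (fid) signals,
    via alpha = arctan(sqrt(2 * ste / fid)). Ideal signals assumed."""
    return math.degrees(math.atan(math.sqrt(2.0 * ste / fid)))

# For a 60-degree preparation pulse, ste/fid = tan(60 deg)^2 / 2 = 1.5
alpha = dream_flip_angle(1.5, 1.0)
```

Because the two echoes are acquired in the same shot, effects such as T₁ relaxation and coil sensitivity largely cancel in the ratio, which is what makes the map robust.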

  6. Approaches in Characterizing Genetic Structure and Mapping in a Rice Multiparental Population

    PubMed Central

    Raghavan, Chitra; Mauleon, Ramil; Lacorte, Vanica; Jubay, Monalisa; Zaw, Hein; Bonifacio, Justine; Singh, Rakesh Kumar; Huang, B. Emma; Leung, Hei

    2017-01-01

    Multi-parent Advanced Generation Intercross (MAGIC) populations are fast becoming mainstream tools for research and breeding, along with the technology and tools for analysis. This paper demonstrates the analysis of a rice MAGIC population from data filtering to imputation and processing of genetic data to characterizing genomic structure, and finally quantitative trait loci (QTL) mapping. In this study, 1316 S6:8 indica MAGIC (MI) lines and the eight founders were sequenced using Genotyping by Sequencing (GBS). As the GBS approach often includes missing data, the first step was to impute the missing SNPs. The observable number of recombinations in the population was then explored. Based on this case study, a general outline of procedures for a MAGIC analysis workflow is provided, as well as for QTL mapping of agronomic traits and biotic and abiotic stress, using the results from both association and interval mapping approaches. QTL for agronomic traits (yield, flowering time, and plant height), physical (grain length and grain width) and cooking properties (amylose content) of the rice grain, abiotic stress (submergence tolerance), and biotic stress (brown spot disease) were mapped. Through presenting this extensive analysis in the MI population in rice, we highlight important considerations when choosing analytical approaches. The methods and results reported in this paper will provide a guide to future genetic analysis methods applied to multi-parent populations. PMID:28592653
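One early step of such a workflow, imputing missing GBS calls, can be sketched with a naive per-marker majority rule; real MAGIC imputation exploits founder haplotypes, so the rule and the toy genotype matrix below are placeholders only.

```python
import numpy as np

# Toy SNP imputation: fill missing GBS calls (coded -1) with the most
# frequent observed genotype at that marker. Founder-haplotype-based
# imputation, as used for MAGIC populations, is far more informative.
geno = np.array([[0, 1, -1],
                 [0, -1, 2],
                 [0, 1, 2],
                 [-1, 1, 2]])          # rows: lines, columns: markers

imputed = geno.copy()
for j in range(geno.shape[1]):
    observed = geno[geno[:, j] >= 0, j]
    vals, counts = np.unique(observed, return_counts=True)
    imputed[imputed[:, j] < 0, j] = vals[np.argmax(counts)]
```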

  7. Repelling Point Bosons

    NASA Astrophysics Data System (ADS)

    McGuire, J. B.

    2011-12-01

    There is a body of conventional wisdom that holds that a solvable quantum problem, by virtue of its solvability, is pathological and thus irrelevant. It has been difficult to refute this view owing to the paucity of theoretical constructs and experimental results. Recent experiments involving equivalent ions trapped in a spatial conformation of extreme anisotropic confinement (longitudinal extension tens, hundreds, or even thousands of times the transverse extension) have modified the view of relevancy, and it is now possible to consider systems previously thought pathological, in particular point bosons that repel in one dimension. It has been difficult for experimentalists to utilize existing theory, mainly owing to a long-standing theoretical misunderstanding of the relevance of the permutation group, in particular the non-commutativity of translations (periodicity) and transpositions (permutation). This misunderstanding is most easily rectified in the case of repelling bosons.

  8. An automated approach for mapping persistent ice and snow cover over high latitude regions

    USGS Publications Warehouse

    Selkowitz, David J.; Forster, Richard R.

    2016-01-01

    We developed an automated approach for mapping persistent ice and snow cover (glaciers and perennial snowfields) from Landsat TM and ETM+ data across a variety of topography, glacier types, and climatic conditions at high latitudes (above ~65°N). Our approach exploits all available Landsat scenes acquired during the late summer (1 August–15 September) over a multi-year period and employs an automated cloud masking algorithm optimized for snow and ice covered mountainous environments. Pixels from individual Landsat scenes were classified as snow/ice covered or snow/ice free based on the Normalized Difference Snow Index (NDSI), and pixels consistently identified as snow/ice covered over a five-year period were classified as persistent ice and snow cover. The same NDSI and ratio of snow/ice-covered days to total days thresholds applied consistently across eight study regions resulted in persistent ice and snow cover maps that agreed closely in most areas with glacier area mapped for the Randolph Glacier Inventory (RGI), with a mean accuracy (agreement with the RGI) of 0.96, a mean precision (user’s accuracy of the snow/ice cover class) of 0.92, a mean recall (producer’s accuracy of the snow/ice cover class) of 0.86, and a mean F-score (a measure that considers both precision and recall) of 0.88. We also compared results from our approach to glacier area mapped from high spatial resolution imagery at four study regions and found similar results. Accuracy was lowest in regions with substantial areas of debris-covered glacier ice, suggesting that manual editing would still be required in these regions to achieve reasonable results. The similarity of our results to those from the RGI as well as glacier area mapped from high spatial resolution imagery suggests it should be possible to apply this approach across large regions to produce updated 30-m resolution maps of persistent ice and snow cover. In the short term, automated PISC maps can be used to rapidly
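The per-pixel decision rule can be sketched as follows, with assumed threshold values (the study's exact NDSI and snow-day-ratio thresholds are not restated here) and fabricated reflectances:

```python
import numpy as np

# Sketch of the per-pixel rule: a pixel in one scene is snow/ice if
# NDSI = (green - swir) / (green + swir) exceeds ndsi_thresh; a pixel is
# persistent ice/snow if it is snow-covered in a sufficient fraction of
# all late-summer observations. Both thresholds are assumed values.
ndsi_thresh, ratio_thresh = 0.4, 0.8

def ndsi(green, swir):
    return (green - swir) / (green + swir)

# Fabricated reflectances: rows are scenes, columns are pixels
green = np.array([[0.8, 0.3], [0.7, 0.2], [0.9, 0.6]])
swir  = np.array([[0.1, 0.2], [0.1, 0.4], [0.2, 0.1]])

snow = ndsi(green, swir) > ndsi_thresh          # per-scene snow flags
persistent = snow.mean(axis=0) >= ratio_thresh  # multi-year persistence
```

Pooling the decision over all late-summer scenes in the multi-year stack is what separates perennial snow and ice from transient seasonal cover.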

  9. Higgs Boson Properties

    NASA Astrophysics Data System (ADS)

    David, André; Dührssen, Michael

    2016-10-01

    This chapter presents an overview of the measured properties of the Higgs boson discovered in 2012 by the ATLAS and CMS collaborations at the CERN LHC. Searches for deviations from the properties predicted by the standard theory are also summarised. The present status corresponds to the combined analysis of the full Run 1 data sets of collisions collected at centre-of-mass energies of 7 and 8 TeV.

  10. Effects of concept map teaching on students' critical thinking and approach to learning and studying.

    PubMed

    Chen, Shiah-Lian; Liang, Tienli; Lee, Mei-Li; Liao, I-Chen

    2011-08-01

    The purpose of this study was to explore the effects of concept mapping in developing critical thinking ability and approach to learning and studying. A quasi-experimental study design with a purposive sample was drawn from a group of nursing students enrolled in a medical-surgical nursing course in central Taiwan. Students in the experimental group were taught to use concept mapping in their learning. Students in the control group were taught by means of traditional lectures. After the intervention, the experimental group had better overall critical thinking scores than did the control group, although the difference was not statistically significant. After controlling for the effects of age and the pretest score on critical thinking using analysis of covariance, the experimental group had significantly higher adjusted mean scores on inference and overall critical thinking compared with the control group. Concept mapping is an effective tool for improving students' ability to think critically.

  11. Phase map retrieval in digital holography: avoiding the undersampling effect by a lateral shear approach.

    PubMed

    Ferraro, P; Del Core, C; Miccio, L; Grilli, S; De Nicola, S; Finizio, A; Coppola, G

    2007-08-01

    In digital holography (DH) the numerical reconstruction of the whole wavefront allows one to extract the wrapped phase map (mod 2π). It can occur that the reconstructed wrapped phase map in the image plane is undersampled because of the limited pixel size in that plane. In such a case the phase distribution cannot be retrieved correctly by the usual unwrapping procedures. We show that the use of the digital lateral-shearing interferometry approach in DH provides the correct reconstruction of the phase map in the image plane, even in extreme cases where the phase profile changes very rapidly. We demonstrate the effectiveness of the method in a particular case where the profile of a highly curved silicon microelectromechanical system membrane has to be reconstructed.
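A simplified 1D illustration of the lateral-shear idea, outside the holographic reconstruction itself: the raw phase map wraps many times, but the phase difference between the map and a copy sheared by one sample varies slowly and needs no unwrapping. The synthetic profile below is an assumption of the sketch, not the paper's membrane data.

```python
import numpy as np

# Synthetic steep phase profile (e.g. a highly curved membrane), 1D.
x = np.linspace(0.0, 1.0, 400)
phase = 60.0 * x**2                     # true phase in radians (many wraps)
wrapped = np.angle(np.exp(1j * phase))  # what the reconstruction delivers

# Lateral shear by s samples: the difference map approximates s * dphi/dx
# and here stays within (-pi, pi], so no unwrapping is needed.
s = 1
shear_diff = np.angle(np.exp(1j * (wrapped[s:] - wrapped[:-s])))
```

The sheared difference recovers the local phase slope directly, from which the profile can be integrated; the real method forms this difference between laterally sheared reconstructions of the hologram.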

  12. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster world-wide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic changes from human influences such as land-management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually, and each, as well as the joint estimate of flood risk, is affected by multiple uncertainties. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure for estimating the flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the flood the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and the corresponding flood volumes are variables of the same phenomenon, they should be directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach for obtaining flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from: a) a classical univariate approach, and b) a bivariate statistical analysis through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
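The bivariate generation step can be sketched with a Gaussian copula and Gumbel marginals; the copula family, dependence parameter, and marginal parameters below are assumptions for illustration, not the study's fitted values.

```python
import numpy as np

# Gaussian copula with Gumbel marginals (family and parameters assumed):
# generate dependent flood peaks and volumes for synthetic design events.
rng = np.random.default_rng(42)
rho, n = 0.7, 5000                      # assumed peak-volume dependence
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)

# Probability-integral transform via ranks (empirical copula of z)
u = (z.argsort(axis=0).argsort(axis=0) + 1) / (n + 1)

# Inverse Gumbel CDF: x = mu - beta * ln(-ln u); parameters are assumed
peaks = 100.0 - 40.0 * np.log(-np.log(u[:, 0]))    # peak discharge (m3/s)
volumes = 10.0 - 4.0 * np.log(-np.log(u[:, 1]))    # flood volume (hm3)
```

Each (peak, volume) pair then defines one synthetic design hydrograph for the 2D hydraulic model, so the joint behavior of the two variables propagates into the hazard maps.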

  13. High-resolution geologic mapping of the inner continental shelf: Boston Harbor and approaches, Massachusetts

    USGS Publications Warehouse

    Ackerman, Seth D.; Butman, Bradford; Barnhardt, Walter A.; Danforth, William W.; Crocker, James M.

    2006-01-01

    This report presents the surficial geologic framework data and information for the sea floor of Boston Harbor and Approaches, Massachusetts (fig. 1.1). This mapping was conducted as part of a cooperative program between the U.S. Geological Survey (USGS), the Massachusetts Office of Coastal Zone Management (CZM), and the National Oceanic and Atmospheric Administration (NOAA). The primary objective of this project was to provide sea-floor geologic information and maps of Boston Harbor to aid resource management, scientific research, industry, and the public. A secondary objective was to test the feasibility of using NOAA hydrographic survey data, normally collected to update navigation charts, to create maps of the sea floor suitable for geologic and habitat interpretations. Defining sea-floor geology is the first step toward managing ocean resources and assessing environmental changes due to natural or human activity. The geophysical data for these maps were collected as part of hydrographic surveys carried out by NOAA in 2000 and 2001 (fig. 1.2). Bottom photographs, video, and samples of the sediments were collected in September 2004 to help in the interpretation of the geophysical data. Included in this report are high-resolution maps of the sea floor, at a scale of 1:25,000; the data used to create these maps in Geographic Information Systems (GIS) format; a GIS project; and a gallery of photographs of the sea floor. Companion maps of the sea floor to the north of Boston Harbor and Approaches are presented by Barnhardt and others (2006) and to the east by Butman and others (2003a,b,c). See Butman and others (2004) for a map of Massachusetts Bay at a scale of 1:125,000. The sections of this report are listed in the navigation bar along the left-hand margin of this page. Section 1 (this section) introduces the report. Section 2 presents the large-format map sheets. Section 3 describes data collection, processing, and analysis. 
Section 4 summarizes the geologic history of

  14. Combining participatory and socioeconomic approaches to map fishing effort in small-scale fisheries.

    PubMed

    Thiault, Lauric; Collin, Antoine; Chlous, Frédérique; Gelcich, Stefan; Claudet, Joachim

    2017-01-01

    Mapping the spatial allocation of fishing effort while including key stakeholders in the decision making process is essential for effective fisheries management but is difficult to implement in complex small-scale fisheries that are diffuse, informal and multifaceted. Here we present a standardized but flexible approach that combines participatory mapping approaches (fishers' spatial preference for fishing grounds, or fishing suitability) with socioeconomic approaches (spatial extrapolation of social surrogates, or fishing capacity) to generate a comprehensive map of predicted fishing effort. Using a real world case study, in Moorea, French Polynesia, we showed that high predicted fishing effort is not simply located in front of, or close to, main fishing villages with high dependence on marine resources; it also occurs where resource dependency is moderate and generally in near-shore areas and reef passages. The integrated approach we developed can contribute to addressing the recurrent lack of fishing effort spatial data through key stakeholders' (i.e., resource users) participation. It can be tailored to a wide range of social, ecological and data availability contexts, and should help improve place-based management of natural resources.
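One plausible way to combine the two surfaces is shown here as a simple product of suitability and capacity rasters; the paper's actual combination rule may differ, and all cell values are hypothetical.

```python
import numpy as np

# Hypothetical combination on a 2x2 grid of cells: participatory fishing
# suitability (0-1) times extrapolated fishing capacity, renormalized to a
# fraction of total effort per cell.
suitability = np.array([[0.9, 0.2],
                        [0.5, 0.8]])    # fishers' spatial preference
capacity = np.array([[10.0, 10.0],
                     [4.0, 4.0]])       # e.g. boats able to reach each cell
effort = suitability * capacity
effort /= effort.sum()                  # predicted share of total effort
```

Note how the largest predicted effort need not coincide with the highest capacity alone: preference reweights where that capacity is actually deployed.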

  15. Combining participatory and socioeconomic approaches to map fishing effort in small-scale fisheries

    PubMed Central

    Collin, Antoine; Chlous, Frédérique; Gelcich, Stefan; Claudet, Joachim

    2017-01-01

    Mapping the spatial allocation of fishing effort while including key stakeholders in the decision making process is essential for effective fisheries management but is difficult to implement in complex small-scale fisheries that are diffuse, informal and multifaceted. Here we present a standardized but flexible approach that combines participatory mapping approaches (fishers’ spatial preference for fishing grounds, or fishing suitability) with socioeconomic approaches (spatial extrapolation of social surrogates, or fishing capacity) to generate a comprehensive map of predicted fishing effort. Using a real world case study, in Moorea, French Polynesia, we showed that high predicted fishing effort is not simply located in front of, or close to, main fishing villages with high dependence on marine resources; it also occurs where resource dependency is moderate and generally in near-shore areas and reef passages. The integrated approach we developed can contribute to addressing the recurrent lack of fishing effort spatial data through key stakeholders' (i.e., resource users) participation. It can be tailored to a wide range of social, ecological and data availability contexts, and should help improve place-based management of natural resources. PMID:28486509

  16. Automated mapping of glacial overdeepenings beneath contemporary ice sheets: Approaches and potential applications

    NASA Astrophysics Data System (ADS)

    Patton, Henry; Swift, Darrel A.; Clark, Chris D.; Livingstone, Stephen J.; Cook, Simon J.; Hubbard, Alun

    2015-03-01

    Awareness is growing of the significance of overdeepenings in ice sheet systems. However, a complete understanding of overdeepening formation is lacking, meaning observations of overdeepening location and morphometry are urgently required to motivate process understanding. Subject to the development of appropriate mapping approaches, high-resolution subglacial topography data sets covering the whole of Antarctica and Greenland offer significant potential to acquire such observations and to relate overdeepening characteristics to ice sheet parameters. We explore a possible method for mapping overdeepenings beneath the Antarctic and Greenland ice sheets and illustrate a potential application of this approach by testing a possible relationship between overdeepening elongation ratio and ice sheet flow velocity. We find that hydrological and terrain filtering approaches are unsuited to mapping overdeepenings and develop a novel rule-based GIS methodology that delineates overdeepening perimeters by analysis of closed-contour properties. We then develop GIS procedures that provide information on overdeepening morphology and topographic context. Limitations in the accuracy and resolution of bed-topography data sets mean that application to glaciological problems requires consideration of quality-control criteria to (a) remove potentially spurious depressions and (b) reduce uncertainties that arise from the inclusion of depressions of nonglacial origin, or those in regions where empirical data are sparse. To address the problem of overdeepening elongation, potential quality-control criteria are introduced, and discussion of this example serves to highlight the limitations that mapping approaches - and applications of such approaches - must confront. We predict that improvements in bed-data quality will reduce the need for quality-control procedures and facilitate increasingly robust insights from empirical data.
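A 1D analogue of depression delineation conveys the idea behind closed-contour analysis: fill each sink to its lowest spill point and flag cells where the fill surface lies above the bed. The 2D rule-based GIS method is more involved; this is only a conceptual sketch with a made-up bed profile.

```python
import numpy as np

# 1D depression (overdeepening) delineation on a bed-elevation profile:
# the fill surface at each cell is the lower of the highest barriers to the
# left and right; cells where fill exceeds bed lie inside a closed depression.
bed = np.array([5.0, 3.0, 1.0, 2.0, 4.0, 6.0])

def fill_1d(z):
    left = np.maximum.accumulate(z)             # highest barrier to the left
    right = np.maximum.accumulate(z[::-1])[::-1]  # highest barrier to the right
    return np.minimum(left, right)

fill = fill_1d(bed)
depression = fill > bed            # cells inside an overdeepening
depth = fill - bed                 # overdeepening depth per cell
```

In 2D the spill point is found over all paths to the domain edge, which is why the paper's method works with the properties of closed contours rather than simple left/right barriers.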

  17. A practical and automated approach to large area forest disturbance mapping with remote sensing.

    PubMed

    Ozdogan, Mutlu

    2014-01-01

    In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: i) creating masks for water, non-forested areas, clouds, and cloud shadows; ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of mis-classification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. 
Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for forest cover

  18. A Practical and Automated Approach to Large Area Forest Disturbance Mapping with Remote Sensing

    PubMed Central

    Ozdogan, Mutlu

    2014-01-01

    In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: i) creating masks for water, non-forested areas, clouds, and cloud shadows; ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of mis-classification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. 
Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for forest cover
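The core of step (ii) above, selecting training pixels whose SWIR-difference values lie beyond a standard-deviation band around a local-window mean, can be sketched as follows (a minimal illustration; the value of k, the labeling convention, and the function name are assumptions, not the author's code):

```python
import statistics

def select_training_pixels(diff_window, k=2.0):
    """Pick candidate training pixels from one local window of the
    SWIR difference image: values more than k standard deviations
    below the window mean are treated as 'disturbed', and values
    inside the band as 'stable'. (k and the labeling convention
    are illustrative assumptions.)"""
    mu = statistics.fmean(diff_window)
    sigma = statistics.pstdev(diff_window)
    disturbed = [v for v in diff_window if v < mu - k * sigma]
    stable = [v for v in diff_window if abs(v - mu) <= k * sigma]
    return disturbed, stable
```

In the described procedure, pixels selected this way would then be filtered through n-fold cross-validation before training the final supervised classifier.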

  19. A DEM-based approach for large-scale floodplain mapping in ungauged watersheds

    NASA Astrophysics Data System (ADS)

    Jafarzadegan, Keighobad; Merwade, Venkatesh

    2017-07-01

    Binary threshold classifiers are a simple form of supervised classification methods that can be used in floodplain mapping. In these methods, a given watershed is examined as a grid of cells with a particular morphologic value. A reference map is a grid of cells labeled as flood and non-flood from hydraulic modeling or remote sensing observations. By using the reference map, a threshold on the morphologic feature is determined to label the unknown cells as flood and non-flood (binary classification). The main limitation of these methods is the threshold transferability assumption, in which a homogeneous geomorphological and hydrological behavior is assumed for the entire region and the same threshold derived from the reference map (training area) is used for other locations (ungauged watersheds) inside the study area. In order to overcome this limitation and consider the threshold variability inside a large region, regression modeling is used in this paper to predict the threshold by relating it to the watershed characteristics. Application of this approach for North Carolina shows that the threshold is related to main stream slope, average watershed elevation, and average watershed slope. By using the Fitness (F) and Correct (C) criteria of C > 0.9 and F > 0.6, results show the threshold prediction and the corresponding floodplain for the 100-year design flow are comparable to those from the Federal Emergency Management Agency's (FEMA) Flood Insurance Rate Maps (FIRMs) in the region. However, the proposed model underpredicts floodplains in flat regions (average watershed slope <1%) and overpredicts them in mountainous regions (average watershed slope >20%). Overall, the proposed approach provides an alternative way of mapping floodplains in data-scarce regions.
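The two-stage idea, first regressing the classification threshold on watershed characteristics and then applying it as a binary classifier, can be sketched like this (a single-predictor stand-in for the paper's multi-variable regression; names and values are illustrative):

```python
def fit_threshold_model(attrs, thresholds):
    """Least-squares fit of threshold ~ a + b * watershed attribute
    from gauged training watersheds (a single-predictor stand-in
    for the paper's multi-variable regression)."""
    n = len(attrs)
    mx, my = sum(attrs) / n, sum(thresholds) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(attrs, thresholds))
         / sum((x - mx) ** 2 for x in attrs))
    return my - b * mx, b  # intercept, slope

def map_floodplain(cell_values, threshold):
    # Binary threshold classifier: a cell is labeled 'flood' when its
    # morphologic feature value falls at or below the threshold.
    return [v <= threshold for v in cell_values]
```

For an ungauged watershed, the fitted intercept and slope predict a local threshold from its attributes, and `map_floodplain` labels its cells.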

  20. Continuous intensity map optimization (CIMO): a novel approach to leaf sequencing in step and shoot IMRT.

    PubMed

    Cao, Daliang; Earl, Matthew A; Luan, Shuang; Shepard, David M

    2006-04-01

    A new leaf-sequencing approach has been developed that is designed to reduce the number of required beam segments for step-and-shoot intensity modulated radiation therapy (IMRT). This approach to leaf sequencing is called continuous-intensity-map-optimization (CIMO). Using a simulated annealing algorithm, CIMO seeks to minimize differences between the optimized and sequenced intensity maps. Two distinguishing features of the CIMO algorithm are (1) CIMO does not require that each optimized intensity map be clustered into discrete levels and (2) CIMO is not rule-based but rather simultaneously optimizes both the aperture shapes and weights. To test the CIMO algorithm, ten IMRT patient cases were selected (four head-and-neck, two pancreas, two prostate, one brain, and one pelvis). For each case, the optimized intensity maps were extracted from the Pinnacle3 treatment planning system. The CIMO algorithm was applied, and the optimized aperture shapes and weights were loaded back into Pinnacle. A final dose calculation was performed using Pinnacle's convolution/superposition based dose calculation. On average, the CIMO algorithm provided a 54% reduction in the number of beam segments as compared with Pinnacle's leaf sequencer. The plans sequenced using the CIMO algorithm also provided improved target dose uniformity and a reduced discrepancy between the optimized and sequenced intensity maps. For ten clinical intensity maps, comparisons were performed between the CIMO algorithm and the power-of-two reduction algorithm of Xia and Verhey [Med. Phys. 25(8), 1424-1434 (1998)]. When the constraints of a Varian Millennium multileaf collimator were applied, the CIMO algorithm resulted in a 26% reduction in the number of segments. For an Elekta multileaf collimator, the CIMO algorithm resulted in a 67% reduction in the number of segments. An average leaf sequencing time of less than one minute per beam was observed.
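The simulated-annealing core of CIMO, minimizing the difference between the optimized and sequenced intensity maps, can be illustrated with a toy version that varies only segment weights for fixed aperture shapes (the actual algorithm also optimizes the shapes; all parameters here are assumptions):

```python
import math
import random

def anneal_weights(target, apertures, steps=2000, t0=1.0, seed=0):
    """Toy simulated annealing over segment weights for fixed
    aperture shapes: minimize the squared difference between the
    target intensity map and the weighted sum of apertures.
    (CIMO also varies the shapes; this sketch varies weights only.)"""
    rng = random.Random(seed)
    n = len(apertures)
    weights = [0.0] * n

    def cost(w):
        return sum(
            (t - sum(w[a] * apertures[a][i] for a in range(n))) ** 2
            for i, t in enumerate(target)
        )

    current = best = cost(weights)
    best_w = weights[:]
    for step in range(steps):
        temp = t0 * (1.0 - step / steps) + 1e-9   # linear cooling schedule
        cand = weights[:]
        j = rng.randrange(n)
        cand[j] = max(0.0, cand[j] + rng.gauss(0.0, 0.1))  # perturb one weight
        c = cost(cand)
        # accept improvements always, worse moves with Boltzmann probability
        if c < current or rng.random() < math.exp((current - c) / temp):
            weights, current = cand, c
            if current < best:
                best, best_w = current, weights[:]
    return best_w, best
```

A full leaf sequencer would additionally propose leaf-position moves subject to collimator constraints, which is where the hardware-dependent segment counts reported in the abstract come from.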

  1. Spatial correlation between the mineralogic and geologic maps of Vesta: a GIS based approach

    NASA Astrophysics Data System (ADS)

    Frigeri, Alessandro; De Sanctis, Maria Cristina; Ammannito, Eleonora; Yingst, Aileen; Williams, David; Capaccioni, Fabrizio; Tosi, Federico; Palomba, Ernesto; Zambon, Francesca; Jaumann, Ralf; Pieters, Carle; Raymond, Carol; Russell, Christopher

    2015-04-01

    Between July 2011 and September 2012, the NASA/Dawn mission mapped the surface of Vesta with images from the Framing Camera (FC), spectral data from the Visible and Infrared Mapping Spectrometer (VIR), and elemental data from the Gamma Ray and Neutron Detector (GRaND). The successful acquisition of imagery from FC and VIR allowed us to produce image mosaics reaching 20 meters per pixel and global mineralogic maps at 100 meters per pixel. A global geologic map of Vesta has been recently published. Geologic units and structures have been identified and put into their stratigraphic context using the FC image mosaic and the digital terrain model derived from stereo image processing. The VIR spectra have been synthesized into spectral parameters or indicators that have been used to produce quadrangle and global maps showing the mineralogic diversity across Vesta, through the variation of the compositional and physical state of the pyroxene-rich lithologies that are typical of Vesta. We have designed a Geographic Information System (GIS) approach to correlate quantitatively the geologic map and the spectral parameter maps of Vesta, applying statistical analysis and informational techniques to the geospatial aspect of the geologic and mineralogic data. The Geographic Resources Analysis Support System (GRASS) software version 7 has been used for the analysis, while data have been stored in an Open GIS Consortium (OGC) compatible digital format, which guarantees interoperability with other GIS and computational software packages. Here we present the work done so far on the most up-to-date global geologic and mineralogic dataset available for Vesta.

  2. Quantum Kinematics of Bosonic Vortex Loops

    SciTech Connect

    Goldin, G.A.; Owczarek, R.; Sharp, D.H.

    1999-05-06

    Poisson structure for vortex filaments (loops and arcs) in a 2D ideal incompressible fluid is analyzed in detail. Canonical coordinates and momenta on coadjoint orbits of the area-preserving diffeomorphism group, associated with such vortices, are found. The quantum space of states in the simplest case of "bosonic" vortex loops is built within a geometric quantization approach to the description of a quantum fluid. Fock-like structure and non-local creation and annihilation operators of quantum vortex filaments are introduced.

  3. Chiral anomaly, bosonization, and fractional charge

    SciTech Connect

    Mignaco, J.A.; Monteiro, M.A.R.

    1985-06-15

    We present a method to evaluate the Jacobian of chiral rotations, regulating determinants through the proper-time method and using Seeley's asymptotic expansion. With this method we compute easily the chiral anomaly for ν = 4, 6 dimensions, discuss bosonization of some massless two-dimensional models, and handle the problem of charge fractionization. In addition, we comment on the general validity of Fujikawa's approach to regulate the Jacobian of chiral rotations with non-Hermitian operators.

  4. Global land cover mapping at 30 m resolution: A POK-based operational approach

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Chen, Jin; Liao, Anping; Cao, Xin; Chen, Lijun; Chen, Xuehong; He, Chaoying; Han, Gang; Peng, Shu; Lu, Miao; Zhang, Weiwei; Tong, Xiaohua; Mills, Jon

    2015-05-01

    Global Land Cover (GLC) information is fundamental for environmental change studies, land resource management, sustainable development, and many other societal benefits. Although GLC data exists at spatial resolutions of 300 m and 1000 m, a 30 m resolution mapping approach is now a feasible option for the next generation of GLC products. Since most significant human impacts on the land system can be captured at this scale, a number of researchers are focusing on such products. This paper reports the operational approach used in such a project, which aims to deliver reliable data products. Over 10,000 Landsat-like satellite images are required to cover the entire Earth at 30 m resolution. To derive a GLC map from such a large volume of data necessitates the development of effective, efficient, economic and operational approaches. Automated approaches usually provide higher efficiency and thus more economic solutions, yet existing automated classification has been deemed ineffective because of the low classification accuracy achievable (typically below 65%) at global scale at 30 m resolution. As a result, an approach based on the integration of pixel- and object-based methods with knowledge (POK-based) has been developed. To handle the classification process of 10 land cover types, a split-and-merge strategy was employed, i.e., first each class is identified in a prioritized sequence and then the results are merged together. For the identification of each class, a robust integration of pixel- and object-based classification was developed. To improve the quality of the classification results, a knowledge-based interactive verification procedure was developed with the support of web service technology. The performance of the POK-based approach was tested using eight selected areas with differing landscapes from five different continents. An overall classification accuracy of over 80% was achieved. 
This indicates that the developed POK-based approach is effective and feasible.
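The split-and-merge strategy, classifying each land cover type in a prioritized sequence and then merging, can be sketched as follows (the classifier functions and class names are illustrative, not the project's actual rules):

```python
def split_and_merge(pixels, classifiers, priority):
    """Prioritized split-and-merge labeling: each per-class
    classifier claims pixels it recognizes; classes earlier in the
    priority list win, and later classes fill only pixels that are
    still unlabeled. (Classifier functions are illustrative.)"""
    labels = [None] * len(pixels)
    for cls in priority:
        claims = classifiers[cls]
        for i, px in enumerate(pixels):
            if labels[i] is None and claims(px):
                labels[i] = cls
    return labels
```

The priority ordering is what lets an easily separable class (e.g. water) be settled first so that harder classes are only decided over the remaining pixels.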

  5. A Random-Model Approach to QTL Mapping in Multiparent Advanced Generation Intercross (MAGIC) Populations.

    PubMed

    Wei, Julong; Xu, Shizhong

    2016-02-01

    Most standard QTL mapping procedures apply to populations derived from the cross of two parents. QTL detected from such biparental populations are rarely relevant to breeding programs because of the narrow genetic basis: only two alleles are involved per locus. To improve the generality and applicability of mapping results, QTL should be detected using populations initiated from multiple parents, such as the multiparent advanced generation intercross (MAGIC) populations. The greatest challenges of QTL mapping in MAGIC populations come from multiple founder alleles and control of the genetic background information. We developed a random-model methodology by treating the founder effects of each locus as random effects following a normal distribution with a locus-specific variance. We also fit a polygenic effect to the model to control the genetic background. To improve the statistical power for a scanned marker, we release the marker effect absorbed by the polygene back to the model. In contrast to the fixed-model approach, we estimate and test the variance of each locus and scan the entire genome one locus at a time using likelihood-ratio test statistics. Simulation studies showed that this method can increase statistical power and reduce type I error compared with composite interval mapping (CIM) and multiparent whole-genome average interval mapping (MPWGAIM). We demonstrated the method using a public Arabidopsis thaliana MAGIC population and a mouse MAGIC population.
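The genome-scan-by-likelihood-ratio idea can be illustrated with a deliberately simplified statistic that compares founder-specific phenotype means against a single-mean null (a fixed-effect stand-in; the paper's method instead estimates and tests a locus-specific variance of random founder effects under a polygenic background):

```python
import math
import statistics

def loglik_normal(values, mu, var):
    # Gaussian log-likelihood with mean mu and variance var
    return sum(-0.5 * (math.log(2 * math.pi * var) + (v - mu) ** 2 / var)
               for v in values)

def lrt_statistic(groups):
    """Toy likelihood-ratio statistic at one scanned locus:
    phenotypes grouped by founder allele, founder-specific means
    versus a single-mean null, with MLE variances."""
    all_y = [v for g in groups for v in g]
    mu0 = statistics.fmean(all_y)
    var0 = statistics.pvariance(all_y)              # MLE variance under the null
    ll0 = loglik_normal(all_y, mu0, var0)
    resid = [v - statistics.fmean(g) for g in groups for v in g]
    var1 = sum(r * r for r in resid) / len(all_y)   # shared MLE variance
    ll1 = sum(loglik_normal(g, statistics.fmean(g), var1) for g in groups)
    return 2.0 * (ll1 - ll0)
```

Scanning the genome then amounts to evaluating such a statistic one locus at a time and flagging loci where it exceeds a significance threshold.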

  6. A novel approach to locate Phytophthora infestans resistance genes on the potato genetic map.

    PubMed

    Jacobs, Mirjam M J; Vosman, Ben; Vleeshouwers, Vivianne G A A; Visser, Richard G F; Henken, Betty; van den Berg, Ronald G

    2010-02-01

    Mapping resistance genes is usually accomplished by phenotyping a segregating population for the resistance trait and genotyping it using a large number of markers. Most resistance genes are of the NBS-LRR type, of which an increasing number is sequenced. These genes and their analogs (RGAs) are often organized in clusters. Clusters tend to be rather homogenous, viz. containing genes that show high sequence similarity with each other. From many of these clusters the map position is known. In this study we present and test a novel method to quickly identify to which cluster a new resistance gene belongs and to produce markers that can be used for introgression breeding. We used NBS profiling to identify markers in bulked DNA samples prepared from resistant and susceptible genotypes of small segregating populations. Markers co-segregating with resistance can be tested on individual plants and directly used for breeding. To identify the resistance gene cluster a gene belongs to, the fragments were sequenced and the sequences analyzed using bioinformatics tools. Putative map positions arising from this analysis were validated using markers mapped in the segregating population. The versatility of the approach is demonstrated with a number of populations derived from wild Solanum species segregating for P. infestans resistance. Newly identified P. infestans resistance genes originating from S. verrucosum, S. schenckii, and S. capsicibaccatum could be mapped to potato chromosomes 6, 4, and 11, respectively.

  7. A Random-Model Approach to QTL Mapping in Multiparent Advanced Generation Intercross (MAGIC) Populations

    PubMed Central

    Wei, Julong; Xu, Shizhong

    2016-01-01

    Most standard QTL mapping procedures apply to populations derived from the cross of two parents. QTL detected from such biparental populations are rarely relevant to breeding programs because of the narrow genetic basis: only two alleles are involved per locus. To improve the generality and applicability of mapping results, QTL should be detected using populations initiated from multiple parents, such as the multiparent advanced generation intercross (MAGIC) populations. The greatest challenges of QTL mapping in MAGIC populations come from multiple founder alleles and control of the genetic background information. We developed a random-model methodology by treating the founder effects of each locus as random effects following a normal distribution with a locus-specific variance. We also fit a polygenic effect to the model to control the genetic background. To improve the statistical power for a scanned marker, we release the marker effect absorbed by the polygene back to the model. In contrast to the fixed-model approach, we estimate and test the variance of each locus and scan the entire genome one locus at a time using likelihood-ratio test statistics. Simulation studies showed that this method can increase statistical power and reduce type I error compared with composite interval mapping (CIM) and multiparent whole-genome average interval mapping (MPWGAIM). We demonstrated the method using a public Arabidopsis thaliana MAGIC population and a mouse MAGIC population. PMID:26715662

  8. Visualization: a really generic approach or the art of mapping data to graphical objects

    NASA Astrophysics Data System (ADS)

    Trilk, Joern; Schuetz, Frank

    1998-05-01

    Visualization is an important technology for analyzing large amounts of data. However, the process of creating meaningful visualizations is quite difficult. The success of this process depends heavily on a good mapping of objects present in the application domain to objects used in the graphical representation. Both kinds of objects possess several attributes. Whereas data objects have attributes of certain types (e.g. integers, strings), graphical objects are characterized by their appearance (shape, color, size, etc.). In our approach, the user may arbitrarily map data attributes to graphical attributes, leading to great flexibility. In our opinion, this is the only way to achieve a really generic approach. To evaluate our ideas, we developed a tool called ProViS. This tool indicates the possible attributes of data objects as well as graphical objects. Depending on their goals, the user can then freely 'connect' attributes of data objects to attributes of their graphical counterparts. The structure behind the application objects can be worked out very easily with the help of various layout algorithms. In addition, we integrated several mechanisms (e.g. ghosting, hiding, grouping, fisheye views) to reduce complexity and to further enhance the three-dimensional visualization. In this paper, first of all we take a look at the basic principle of visualization: mapping data. Then we present ProViS, a visualization tool implementing our idea of mapping.

  9. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    PubMed Central

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real-time and streaming data in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223
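The multi-key intermediate-data idea, tagging each map output with its algorithm so several related algorithms share one pass over the input, can be sketched in a single-process toy (illustrative only; the real MRPack runs inside a Hadoop cluster):

```python
from collections import defaultdict

def multi_algorithm_mapreduce(records, algorithms):
    """Toy single-process illustration of MRPack-style multi-key
    grouping: one pass over the input runs every algorithm's map
    function, intermediate values are grouped by (algorithm, key),
    and the matching reduce function is dispatched per algorithm.
    (Function names are illustrative assumptions.)"""
    intermediate = defaultdict(list)
    for rec in records:
        for name, (map_fn, _) in algorithms.items():
            for k, v in map_fn(rec):
                intermediate[(name, k)].append(v)
    results = {}
    for (name, k), vals in intermediate.items():
        _, reduce_fn = algorithms[name]
        results[(name, k)] = reduce_fn(k, vals)
    return results
```

Because the algorithm name is part of every intermediate key, the shuffle phase keeps each algorithm's data separate without requiring a second job.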

  10. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce.

    PubMed

    Idris, Muhammad; Hussain, Shujaat; Siddiqi, Muhammad Hameed; Hassan, Waseem; Syed Muhammad Bilal, Hafiz; Lee, Sungyoung

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real-time and streaming data in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement.

  11. Turkers in Africa: A Crowdsourcing Approach to Improving Agricultural Landcover Maps

    NASA Astrophysics Data System (ADS)

    Estes, L. D.; Caylor, K. K.; Choi, J.

    2012-12-01

    In the coming decades a substantial portion of Africa is expected to be transformed to agriculture. The scale of this conversion may match or exceed that which occurred in the Brazilian Cerrado and Argentinian Pampa in recent years. Tracking the rate and extent of this conversion will depend on having an accurate baseline of the current extent of croplands. Continent-wide baseline data do exist, but the accuracy of these relatively coarse resolution, remotely sensed assessments is suspect in many regions. To develop more accurate maps of the distribution and nature of African croplands, we develop a distributed "crowdsourcing" approach that harnesses human eyeballs and image interpretation capabilities. Our initial goal is to assess the accuracy of existing agricultural land cover maps, but ultimately we aim to generate "wall-to-wall" cropland maps that can be revisited and updated to track agricultural transformation. Our approach utilizes the freely available, high-resolution satellite imagery provided by Google Earth, combined with Amazon.com's Mechanical Turk platform, an online service that provides a large, global pool of workers (known as "Turkers") who perform "Human Intelligence Tasks" (HITs) for a fee. Using open-source R and python software, we select a random sample of 1 km2 cells from a grid placed over our study area, stratified by field density classes drawn from one of the coarse-scale land cover maps, and send these in batches to Mechanical Turk for processing. Each Turker is required to conduct an initial training session, on the basis of which they are assigned an accuracy score that determines whether the Turker is allowed to proceed with mapping tasks. Completed mapping tasks are automatically retrieved and processed on our server, and subject to two further quality control measures. The first of these is a measure of the spatial accuracy of Turker mapped areas compared to "gold standard" maps from selected locations that are randomly
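The sampling step, a stratified random draw of grid cells by field-density class, can be sketched as follows (the stratum function and per-stratum sample size are illustrative assumptions):

```python
import random

def stratified_sample(cells, stratum_of, n_per_stratum, seed=0):
    """Draw a fixed-size random sample of grid cells from each
    field-density stratum. `stratum_of` maps a cell to its stratum
    label (here an illustrative stand-in for classes drawn from a
    coarse-scale land cover map)."""
    rng = random.Random(seed)
    by_stratum = {}
    for cell in cells:
        by_stratum.setdefault(stratum_of(cell), []).append(cell)
    sample = []
    for stratum, members in sorted(by_stratum.items()):
        k = min(n_per_stratum, len(members))
        sample.extend(rng.sample(members, k))
    return sample
```

The sampled cells would then be batched into Mechanical Turk HITs for digitization.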

  12. Interacting boson models for N≈Z nuclei

    SciTech Connect

    Van Isacker, P.

    2011-05-06

    This contribution discusses the use of boson models in the description of N≈Z nuclei. A brief review is given of earlier attempts, initiated by Elliott and co-workers, to extend the interacting boson model of Arima and Iachello by the inclusion of neutron-proton s and d bosons with T = 1 (IBM-3) as well as T = 0 (IBM-4). It is argued that for the N≈Z nuclei that are currently studied experimentally, a different approach is needed which invokes aligned neutron-proton pairs with angular momentum J = 2j and isospin T = 0. This claim is supported by an analysis of shell-model wave functions in terms of pair states. Results of this alternative version of the interacting boson model are compared with shell-model calculations in the 1g9/2 shell.

  13. Interacting boson models for N≈Z nuclei

    NASA Astrophysics Data System (ADS)

    Van Isacker, P.

    2011-05-01

    This contribution discusses the use of boson models in the description of N≈Z nuclei. A brief review is given of earlier attempts, initiated by Elliott and co-workers, to extend the interacting boson model of Arima and Iachello by the inclusion of neutron-proton s and d bosons with T = 1 (IBM-3) as well as T = 0 (IBM-4). It is argued that for the N≈Z nuclei that are currently studied experimentally, a different approach is needed which invokes aligned neutron-proton pairs with angular momentum J = 2j and isospin T = 0. This claim is supported by an analysis of shell-model wave functions in terms of pair states. Results of this alternative version of the interacting boson model are compared with shell-model calculations in the 1g9/2 shell.

  14. Bose-Einstein condensates of bosonic Thomson atoms

    NASA Astrophysics Data System (ADS)

    Schneider, Tobias; Blümel, Reinhold

    1999-10-01

    A system of charged particles in a harmonic trap is a realization of Thomson's raisin cake model. Therefore, we call it a Thomson atom. Bosonic, fermionic and mixed Thomson atoms exist. In this paper we focus on bosonic Thomson atoms in isotropic traps. Approximating the exact ground state by a condensate, we investigate ground-state properties at temperature T = 0 using the Hartree-Fock theory for bosons. In order to assess the quality of our mean-field approach we compare the Hartree-Fock results for bosonic Thomson helium with an exact diagonalization. In contrast to the weakly interacting Bose gas (alkali vapours), mean-field calculations are reliable in the limit of large particle density. The Wigner regime (low particle density) is discussed.

  15. Conceptualizing Stakeholders' Perceptions of Ecosystem Services: A Participatory Systems Mapping Approach

    NASA Astrophysics Data System (ADS)

    Lopes, Rita; Videira, Nuno

    2015-12-01

    A participatory system dynamics modelling approach is advanced to support conceptualization of feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes a systems mapping workshop and follow-up tasks aimed at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

  16. A pooling-based approach to mapping genetic variants associated with DNA methylation

    SciTech Connect

    Kaplow, Irene M.; MacIsaac, Julia L.; Mah, Sarah M.; McEwen, Lisa M.; Kobor, Michael S.; Fraser, Hunter B.

    2015-04-24

    DNA methylation is an epigenetic modification that plays a key role in gene regulation. Previous studies have investigated its genetic basis by mapping genetic variants that are associated with DNA methylation at specific sites, but these have been limited to microarrays that cover <2% of the genome and cannot account for allele-specific methylation (ASM). Other studies have performed whole-genome bisulfite sequencing on a few individuals, but these lack statistical power to identify variants associated with DNA methylation. We present a novel approach in which bisulfite-treated DNA from many individuals is sequenced together in a single pool, resulting in a truly genome-wide map of DNA methylation. Compared to methods that do not account for ASM, our approach increases statistical power to detect associations while sharply reducing cost, effort, and experimental variability. As a proof of concept, we generated deep sequencing data from a pool of 60 human cell lines; we evaluated almost twice as many CpGs as the largest microarray studies and identified more than 2000 genetic variants associated with DNA methylation. Here we found that these variants are highly enriched for associations with chromatin accessibility and CTCF binding but are less likely to be associated with traits indirectly linked to DNA, such as gene expression and disease phenotypes. In summary, our approach allows genome-wide mapping of genetic variants associated with DNA methylation in any tissue of any species, without the need for individual-level genotype or methylation data.

  17. A pooling-based approach to mapping genetic variants associated with DNA methylation.

    PubMed

    Kaplow, Irene M; MacIsaac, Julia L; Mah, Sarah M; McEwen, Lisa M; Kobor, Michael S; Fraser, Hunter B

    2015-06-01

    DNA methylation is an epigenetic modification that plays a key role in gene regulation. Previous studies have investigated its genetic basis by mapping genetic variants that are associated with DNA methylation at specific sites, but these have been limited to microarrays that cover <2% of the genome and cannot account for allele-specific methylation (ASM). Other studies have performed whole-genome bisulfite sequencing on a few individuals, but these lack statistical power to identify variants associated with DNA methylation. We present a novel approach in which bisulfite-treated DNA from many individuals is sequenced together in a single pool, resulting in a truly genome-wide map of DNA methylation. Compared to methods that do not account for ASM, our approach increases statistical power to detect associations while sharply reducing cost, effort, and experimental variability. As a proof of concept, we generated deep sequencing data from a pool of 60 human cell lines; we evaluated almost twice as many CpGs as the largest microarray studies and identified more than 2000 genetic variants associated with DNA methylation. We found that these variants are highly enriched for associations with chromatin accessibility and CTCF binding but are less likely to be associated with traits indirectly linked to DNA, such as gene expression and disease phenotypes. In summary, our approach allows genome-wide mapping of genetic variants associated with DNA methylation in any tissue of any species, without the need for individual-level genotype or methylation data.
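The basic allele-specific association idea, comparing methylation rates between pooled reads carrying different alleles at a nearby variant, can be illustrated with a two-proportion z statistic (a simplified stand-in for the study's full association model; counts and the function name are illustrative):

```python
import math

def asm_association(meth_by_allele):
    """Toy allele-specific methylation test from pooled bisulfite
    reads: given (methylated, unmethylated) read counts for each of
    two alleles at a nearby SNP, return a two-proportion z statistic
    for the difference in methylation rate between alleles."""
    (m1, u1), (m2, u2) = meth_by_allele
    n1, n2 = m1 + u1, m2 + u2
    p1, p2 = m1 / n1, m2 / n2
    p = (m1 + m2) / (n1 + n2)                       # pooled methylation rate
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se
```

A large |z| suggests the variant is associated with methylation at that site, which is the signal the pooled design detects without individual-level genotypes.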

  18. Putting people on the map through an approach that integrates social data in conservation planning.

    PubMed

    Stephanson, Sheri L; Mascia, Michael B

    2014-10-01

    Conservation planning is integral to strategic and effective operations of conservation organizations. Drawing upon biological sciences, conservation planning has historically made limited use of social data. We offer an approach for integrating data on social well-being into conservation planning that captures and places into context the spatial patterns and trends in human needs and capacities. This hierarchical approach provides a nested framework for characterizing and mapping data on social well-being in 5 domains: economic well-being, health, political empowerment, education, and culture. These 5 domains each have multiple attributes; each attribute may be characterized by one or more indicators. Through existing or novel data that display spatial and temporal heterogeneity in social well-being, conservation scientists, planners, and decision makers may measure, benchmark, map, and integrate these data within conservation planning processes. Selecting indicators and integrating these data into conservation planning is an iterative, participatory process tailored to the local context and planning goals. Social well-being data complement biophysical and threat-oriented social data within conservation planning processes to inform decisions regarding where and how to conserve biodiversity, provide a structure for exploring socioecological relationships, and to foster adaptive management. Building upon existing conservation planning methods and insights from multiple disciplines, this approach to putting people on the map can readily merge with current planning practices to facilitate more rigorous decision making. © 2014 Society for Conservation Biology.

  19. Inhomogeneous hard-core bosonic mixture with checkerboard supersolid phase: Quantum and thermal phase diagram

    NASA Astrophysics Data System (ADS)

    Heydarinasab, F.; Abouie, J.

    2017-09-01

    We introduce an inhomogeneous bosonic mixture composed of two kinds of hard-core and semi-hard-core bosons with different nilpotency conditions and demonstrate that, in contrast with the standard hard-core Bose-Hubbard model, our bosonic mixture with nearest- and next-nearest-neighbor interactions on a square lattice develops the checkerboard supersolid phase characterized by the simultaneous superfluid and checkerboard solid orders. Our bosonic mixture is created from a two-orbital Bose-Hubbard model including two kinds of bosons: a single-orbital boson and a two-orbital boson. By mapping the bosonic mixture to an anisotropic inhomogeneous spin model in the presence of a magnetic field, we study the ground-state phase diagram of the model by means of cluster mean field theory and linear spin-wave theory and show that various phases such as solid, superfluid, supersolid, and Mott insulator appear in the phase diagram of the mixture. Competition between the interactions and the magnetic field causes the mixture to undergo different kinds of first- and second-order phase transitions. By studying the behavior of the spin-wave excitations, we identify the origins of all first- and second-order phase transitions. We also obtain the temperature phase diagram of the system using cluster mean field theory. We show that the checkerboard supersolid phase persists at finite temperatures comparable to the interaction energies of the bosons.
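The boson-to-spin mapping invoked here is, for the hard-core species, conventionally the Matsubara-Matsuda correspondence between hard-core boson operators and spin-1/2 operators (shown below in its textbook form; the semi-hard-core species maps analogously to a higher spin, and the paper's exact conventions may differ):

```latex
S_i^{+} = b_i^{\dagger}, \qquad
S_i^{-} = b_i, \qquad
S_i^{z} = n_i - \tfrac{1}{2}
```

Under this dictionary, superfluid order corresponds to transverse magnetization, the checkerboard solid to staggered $S^z$ order, and the chemical potential to a longitudinal magnetic field.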

  20. An assessment of a collaborative mapping approach for exploring land use patterns for several European metropolises

    NASA Astrophysics Data System (ADS)

    Jokar Arsanjani, Jamal; Vaz, Eric

    2015-03-01

    Until recently, land surveys and digital interpretation of remotely sensed imagery have been used to generate land use inventories. These techniques, however, are often cumbersome and costly, demanding large technical and temporal investments. The technological advances of Web 2.0 have brought a wide array of achievements that stimulate the participatory role in collaborative and crowdsourced mapping products. This has been fostered by GPS-enabled devices and by accessible tools that enable visual interpretation of the high-resolution satellite images and air photos provided in collaborative mapping projects. Such technologies offer an integrative approach to geography by promoting public participation and allowing accurate assessment and classification of land use as well as geographical features. OpenStreetMap (OSM) has supported the evolution of such techniques, contributing a large inventory of spatial land use information. This paper explores this novel participatory phenomenon for land use classification in Europe's metropolitan regions. We adopt a positivistic approach to comparatively assess the accuracy of OSM contributions to land use classification in seven large European metropolitan regions. Thematic accuracy and degree of completeness of OSM data were compared to the available Global Monitoring for Environment and Security Urban Atlas (GMESUA) datasets for the chosen metropolises. We further place our findings on land use within a novel framework for geography, arguing that volunteered geographic information (VGI) sources are of great benefit for land use mapping, depending on location and degree of VGI dynamism, and offer a strong alternative to traditional mapping techniques for metropolitan regions throughout Europe. Evaluation of several land use types at the local level suggests that a number of OSM classes (such as anthropogenic land use, agricultural and some natural environment

  1. Mapping genetic determinants of viral traits with FST and quantitative trait locus (QTL) approaches.

    PubMed

    Doumayrou, Juliette; Thébaud, Gaël; Vuillaume, Florence; Peterschmitt, Michel; Urbino, Cica

    2015-10-01

    The genetic determinism of viral traits can generally be dissected using either forward or reverse genetics, because the clonal reproduction of viruses does not require approaches based on laboratory crosses. Nevertheless, we hypothesized that recombinant viruses could be analyzed as sexually reproducing organisms, using either a quantitative trait loci (QTL) approach or a locus-by-locus fixation index (FST). Locus-by-locus FST analysis and four different regression and interval mapping algorithms for QTL analysis were applied to a phenotypic and genotypic dataset previously obtained from 47 artificial recombinant genomes generated between two begomovirus species. Both approaches assigned the determinant of within-host accumulation, previously identified using standard virology approaches, to a region including the 5′ end of the replication-associated protein (Rep) gene and the upstream intergenic region. This study provides a proof of principle that QTL and population genetics tools can be extended to characterize the genetic determinants of viral traits.
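    The locus-by-locus FST idea can be sketched in a few lines. The version below is a hypothetical minimal implementation that computes Wright's FST per locus from allele frequencies in two populations; the function name and the example frequencies are illustrative, not the paper's data or software.

```python
def fst_per_locus(p1, p2):
    """Locus-by-locus Wright's FST from allele frequencies in two populations.

    p1, p2: frequency of the reference allele at each locus in populations 1 and 2.
    Returns one FST value per locus (0 where the pooled locus is monomorphic).
    """
    fsts = []
    for a, b in zip(p1, p2):
        p_bar = (a + b) / 2.0                    # mean allele frequency
        h_t = 2.0 * p_bar * (1.0 - p_bar)        # expected total heterozygosity
        h_s = (2*a*(1-a) + 2*b*(1-b)) / 2.0      # mean subpopulation heterozygosity
        fsts.append(0.0 if h_t == 0 else (h_t - h_s) / h_t)
    return fsts

# Hypothetical allele frequencies at four loci in two recombinant pools:
# a fixed difference (locus 3) gives FST = 1, identical frequencies give FST = 0.
print(fst_per_locus([0.5, 0.1, 1.0, 0.8], [0.5, 0.9, 0.0, 0.8]))
```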

  2. MAP3D: a media processor approach for high-end 3D graphics

    NASA Astrophysics Data System (ADS)

    Darsa, Lucia; Stadnicki, Steven; Basoglu, Chris

    1999-12-01

    Equator Technologies, Inc. has used a software-first approach to produce several programmable and advanced VLIW processor architectures that have the flexibility to run both traditional systems tasks and an array of media-rich applications. For example, Equator's MAP1000A is the world's fastest single-chip programmable signal and image processor targeted for digital consumer and office automation markets. The Equator MAP3D is a proposal for the architecture of the next generation of the Equator MAP family. The MAP3D is designed to achieve high-end 3D performance and a variety of customizable special effects by combining special graphics features with a high-performance floating-point and media processor architecture. As a programmable media processor, it offers the advantages of a completely configurable 3D pipeline--allowing developers to experiment with different algorithms and to tailor their pipeline to achieve the highest performance for a particular application. With the support of Equator's advanced C compiler and toolkit, MAP3D programs can be written in a high-level language. This allows the compiler to find and exploit any parallelism in a programmer's code, thus decreasing the time to market of a given application. The ability to run an operating system makes it possible to run concurrent applications on the MAP3D chip, such as video decoding while executing the 3D pipelines, so that integration of applications is easily achieved--using real-time decoded imagery for texturing 3D objects, for instance. This novel architecture enables an affordable, integrated solution for high-performance 3D graphics.

  3. Asymptotic Evaluation of Bosonic Probability Amplitudes in Linear Unitary Networks in the Case of Large Number of Bosons

    NASA Astrophysics Data System (ADS)

    Shchesnovich, V. S.

    2013-09-01

    An asymptotic analytical approach is proposed for bosonic probability amplitudes in unitary linear networks, such as optical multiport devices for photons. The asymptotic approach applies for a large number of bosons, N ≫ M, in an M-mode network, where M is finite. The probability amplitudes of N bosons unitarily transformed from the input modes to the output modes of a network are approximated by a multidimensional integral whose integrand contains the large parameter N in the exponent. The integral representation allows an asymptotic estimate of the bosonic probability amplitudes, up to a multiplicative error of order 1/N, by the saddle point method. The estimate depends on the solution of a scaling problem for the M × M unitary network matrix: find the left and right diagonal matrices that scale the unitary matrix to a matrix with specified row and column sums (equal, respectively, to the distributions of bosons in the input and output modes). The scaled matrices give the saddle points of the integral. For simple saddle points, an explicit formula giving the asymptotic estimate of the bosonic probability amplitudes is derived. Performance of the approximation and the scaling of the relative error with N are studied for the two-mode network (the beam splitter), where the saddle points are roots of a quadratic and an exact analytical formula for the probability amplitudes is available, and for the three-mode network (the tritter).
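    The scaling problem described above is closely related to the classical Sinkhorn-Knopp iteration, which alternately rescales rows and columns to prescribed sums. The sketch below applies it to a positive real matrix; treating the complex unitary matrix itself requires the generalization developed in the paper, so this is only an analogy, and the matrix and marginals are hypothetical.

```python
def sinkhorn_scale(A, row_sums, col_sums, iters=500):
    """Find diagonal scalings l, r so that diag(l) @ A @ diag(r) has the
    prescribed row and column sums (Sinkhorn-Knopp iteration).

    A: positive matrix as a list of lists; row_sums/col_sums: target marginals
    (their totals must agree for a solution to exist)."""
    n, m = len(A), len(A[0])
    l = [1.0] * n
    r = [1.0] * m
    for _ in range(iters):
        for i in range(n):  # rescale rows to match row_sums
            s = sum(A[i][j] * r[j] for j in range(m))
            l[i] = row_sums[i] / s
        for j in range(m):  # rescale columns to match col_sums
            s = sum(l[i] * A[i][j] for i in range(n))
            r[j] = col_sums[j] / s
    return l, r

# Hypothetical 2x2 positive matrix scaled so that both the row and column
# sums equal the "occupation distribution" (2, 1).
A = [[0.5, 0.5], [0.5, 0.5]]
l, r = sinkhorn_scale(A, [2.0, 1.0], [2.0, 1.0])
```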

  4. Whole-Genome Restriction Mapping by "Subhaploid"-Based RAD Sequencing: An Efficient and Flexible Approach for Physical Mapping and Genome Scaffolding.

    PubMed

    Dou, Jinzhuang; Dou, Huaiqian; Mu, Chuang; Zhang, Lingling; Li, Yangping; Wang, Jia; Li, Tianqi; Li, Yuli; Hu, Xiaoli; Wang, Shi; Bao, Zhenmin

    2017-07-01

    Assembly of complex genomes using short reads remains a major challenge, which usually yields highly fragmented assemblies. Generation of ultradense linkage maps is promising for anchoring such assemblies, but traditional linkage mapping methods are hindered by the infrequency and unevenness of meiotic recombination that limit attainable map resolution. Here we develop a sequencing-based "in vitro" linkage mapping approach (called RadMap), where chromosome breakage and segregation are realized by generating hundreds of "subhaploid" fosmid/bacterial-artificial-chromosome clone pools, and by restriction site-associated DNA sequencing of these clone pools to produce an ultradense whole-genome restriction map to facilitate genome scaffolding. A bootstrap-based minimum spanning tree algorithm is developed for grouping and ordering of genome-wide markers and is implemented in a user-friendly, integrated software package (AMMO). We perform extensive analyses to validate the power and accuracy of our approach in the model plant Arabidopsis thaliana and human. We also demonstrate the utility of RadMap for enhancing the contiguity of a variety of whole-genome shotgun assemblies generated using either short Illumina reads (300 bp) or long PacBio reads (6-14 kb), with up to 15-fold improvement of N50 (∼816 kb-3.7 Mb) and high scaffolding accuracy (98.1-98.5%). RadMap outperforms BioNano and Hi-C when input assembly is highly fragmented (contig N50 = 54 kb). RadMap can capture wide-range contiguity information and provide an efficient and flexible tool for high-resolution physical mapping and scaffolding of highly fragmented assemblies. Copyright © 2017 Dou et al.
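    The grouping-and-ordering step can be illustrated with a plain minimum spanning tree over pairwise marker distances. This is a simplified stand-in for the bootstrap-based MST algorithm implemented in AMMO; the Kruskal implementation and the toy distance matrix below are illustrative assumptions, not the package's code.

```python
def minimum_spanning_tree(dist):
    """Kruskal's algorithm on a symmetric pairwise-distance matrix.

    Returns MST edges as (i, j, d) tuples. Markers in the same linkage group
    end up connected through short edges, so cutting long edges yields groups,
    and walking the tree suggests a marker order."""
    n = len(dist)
    parent = list(range(n))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    edges = sorted((dist[i][j], i, j) for i in range(n) for j in range(i + 1, n))
    mst = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:          # keep the edge only if it joins two components
            parent[ri] = rj
            mst.append((i, j, d))
    return mst

# Hypothetical distances between four markers: {0,1} and {2,3} are tightly linked.
dist = [[0, 1, 9, 9],
        [1, 0, 9, 9],
        [9, 9, 0, 2],
        [9, 9, 2, 0]]
print(minimum_spanning_tree(dist))
```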

  6. Progress in landslide susceptibility mapping over Europe using Tier-based approaches

    NASA Astrophysics Data System (ADS)

    Günther, Andreas; Hervás, Javier; Reichenbach, Paola; Malet, Jean-Philippe

    2010-05-01

    The European Thematic Strategy for Soil Protection aims, among other objectives, to ensure a sustainable use of soil. The legal instrument of the strategy, the proposed Framework Directive, suggests identifying priority areas for several soil threats, including landslides, using a coherent and compatible approach based on common thematic data. In a first stage, this can be achieved through landslide susceptibility mapping using geographically nested, multi-step tiered approaches, where areas identified as highly susceptible by a first, synoptic-scale Tier ("Tier 1") can then be further assessed and mapped at larger scale by successive Tiers. In order to identify areas prone to landslides at the European scale ("Tier 1"), a number of thematic terrain and environmental datasets already available for the whole of Europe can be used as input for a continental-scale susceptibility model. However, since no coherent landslide inventory data are available at the moment for the whole continent, qualitative heuristic zonation approaches are proposed. For "Tier 1" a preliminary, simplified model has been developed. It consists of an equally weighted combination of a reduced, continent-wide common dataset of landslide conditioning factors, including soil parent material, slope angle and land cover, to derive a landslide susceptibility index over raster mapping units of 1 x 1 km pixels. A preliminary European-wide susceptibility map has thus been produced at 1:1 Million scale, since this is compatible with that of the datasets used. The map has been validated by means of a ratio of effectiveness using samples from landslide inventories in Italy, Austria, Hungary and the United Kingdom. Although not differentiated for specific geomorphological environments or specific landslide types, the experimental model shows relatively good performance in many European regions at the 1:1 Million scale. An additional "Tier 1" susceptibility map at the same scale and using

  7. Mapping mountain torrent hazards in the Hexi Corridor using an evidential reasoning approach

    NASA Astrophysics Data System (ADS)

    Ran, Youhua; Liu, Jinpeng; Tian, Feng; Wang, Dekai

    2017-02-01

    The Hexi Corridor is an important part of the Silk Road Economic Belt and a crucial channel for westward development in China. Many important national engineering projects pass through the corridor, such as highways, railways, and the West-to-East Gas Pipeline. The frequent torrent disasters greatly impact the security of infrastructure and human safety. In this study, an evidential reasoning approach based on Dempster-Shafer theory is proposed for mapping mountain torrent hazards in the Hexi Corridor. A torrent hazard map for the Hexi Corridor was generated by integrating the driving factors of mountain torrent disasters including precipitation, terrain, flow concentration processes, and the vegetation fraction. The results show that the capability of the proposed method is satisfactory. The torrent hazard map shows that there is high potential torrent hazard in the central and southeastern Hexi Corridor. The results are useful for engineering planning support and resource protection in the Hexi Corridor. Further efforts are discussed for improving torrent hazard mapping and prediction.
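    The core of an evidential reasoning approach of this kind is Dempster's rule of combination, which fuses independent mass functions and renormalizes away the conflicting mass. A minimal sketch follows, with hypothetical hazard hypotheses and masses that are illustrative rather than the study's calibrated values.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    m1, m2: dicts mapping frozenset hypotheses to masses that each sum to 1.
    Mass assigned to empty intersections is treated as conflict and
    renormalized away."""
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    k = 1.0 - conflict  # normalization constant (must be > 0)
    return {h: w / k for h, w in combined.items()}

# Hypothetical evidence from two driving factors about torrent hazard level.
H, L = frozenset({'high'}), frozenset({'low'})
both = H | L                               # ignorance: either level possible
m_rain = {H: 0.6, both: 0.4}               # precipitation evidence
m_slope = {H: 0.5, L: 0.3, both: 0.2}      # terrain evidence
print(dempster_combine(m_rain, m_slope))
```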

  8. Assessment of landslide distribution map reliability in Niigata prefecture - Japan using frequency ratio approach

    NASA Astrophysics Data System (ADS)

    Rahardianto, Trias; Saputra, Aditya; Gomez, Christopher

    2017-07-01

    Research on landslide susceptibility has evolved rapidly over the last few decades thanks to the availability of large databases. Landslide research used to focus on discrete events, but the use of large inventory datasets has become a central pillar of landslide susceptibility, hazard, and risk assessment. Extracting meaningful information from these large databases is now at the forefront of geoscientific research, in line with the big-data trend: the more comprehensive the information on past landslides available for a particular area, the better the resulting map will support effective decision making, planning, and engineering practice. Landslide inventory data that are freely accessible online give many researchers and decision makers an opportunity to prevent casualties and economic losses caused by future landslides. These data are especially advantageous for areas with poor landslide historical records. Since the construction criteria for landslide inventory maps and their quality evaluation remain poorly defined, an assessment of the reliability of open-source landslide inventory maps is required. The present contribution aims to assess the reliability of open-source landslide inventory data based on the particular topographical setting of the observed area in Niigata prefecture, Japan. A Geographic Information System (GIS) platform and a statistical approach are applied to analyze the data, with the frequency ratio method used to model and assess the landslide map. The generated model showed unsatisfactory results, with an AUC value of 0.603 indicating low prediction accuracy and an unreliable model.
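    The frequency ratio statistic used here is straightforward: for each class of a conditioning factor, it is the proportion of landslide cells falling in the class divided by the proportion of all cells in the class, with values above 1 indicating that the class is over-represented among landslides. A minimal sketch with a hypothetical slope-class raster (the class names and values are illustrative):

```python
def frequency_ratio(class_ids, landslide):
    """Frequency ratio per factor class.

    class_ids: factor class of each raster cell (flattened).
    landslide: 1 if the cell is in the landslide inventory, else 0.
    FR > 1 means the class is over-represented among landslide cells."""
    total = len(class_ids)
    slides = sum(landslide)
    fr = {}
    for c in set(class_ids):
        n_class = sum(1 for k in class_ids if k == c)
        n_slide = sum(1 for k, s in zip(class_ids, landslide) if k == c and s)
        fr[c] = (n_slide / slides) / (n_class / total)
    return fr

# Hypothetical slope-class raster (flattened) and landslide inventory mask.
classes = ['steep', 'steep', 'steep', 'flat', 'flat', 'flat', 'flat', 'flat']
slid    = [1, 1, 0, 0, 0, 0, 1, 0]
print(frequency_ratio(classes, slid))
```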

  9. A reciprocal space approach for locating symmetry elements in Patterson superposition maps

    SciTech Connect

    Hendrixson, T.

    1990-09-21

    A method for determining the location and possible existence of symmetry elements in Patterson superposition maps has been developed. A comparison of the original superposition map and a superposition map operated on by the symmetry element gives possible translations to the location of the symmetry element. A reciprocal space approach using structure factor-like quantities obtained from the Fourier transform of the superposition function is then used to determine the "best" location of the symmetry element. Constraints based upon the space group requirements are also used as a check on the locations. The locations of the symmetry elements are used to modify the Fourier transform coefficients of the superposition function to give an approximation of the structure factors, which are then refined using the EG relation. The analysis of several compounds using this method is presented. Reciprocal space techniques for locating multiple images in the superposition function are also presented, along with methods to remove the effect of multiple images in the Fourier transform coefficients of the superposition map. In addition, crystallographic studies of the extended chain structure of (NHC{sub 5}H{sub 5})SbI{sub 4} and of the twinning method of the orthorhombic form of the high-T{sub c} superconductor YBa{sub 2}Cu{sub 3}O{sub 7-x} are presented. 54 refs.

  10. Geologic Map of the Olympia Cavi Region of Mars (MTM 85200): A Summary of Tactical Approaches

    NASA Technical Reports Server (NTRS)

    Skinner, J. A., Jr.; Herkenhoff, K.

    2010-01-01

    The 1:500K-scale geologic map of MTM 85200 - the Olympia Cavi region of Mars - has been submitted for peer review [1]. Physiographically, the quadrangle includes portions of Olympia Rupes, a set of sinuous scarps which elevate Planum Boreum 800 meters above Olympia Planum. The region includes the high-standing, spiral troughs of Boreales Scopuli, the rugged and deep depressions of Olympia Cavi, and the vast dune fields of Olympia Undae. Geologically, the mapped units and landforms reflect the recent history of repeated accumulation and degradation. The widespread occurrence of both weakly and strongly stratified units implicates the drape-like accumulation of ice, dust, and sand through climatic variations. Similarly, the occurrence of layer truncations, particularly at unit boundaries, implicates punctuated periods of both localized and regional erosion and surface deflation whereby underlying units were exhumed and their material transported and re-deposited. Herein, we focus on the iterative mapping approaches that allowed not only the accommodation of the burgeoning variety and volume of data sets, but also facilitated the efficient presentation of map information. Unit characteristics and their geologic history are detailed in past abstracts [2-3].

  11. Concept mapping as an approach for expert-guided model building: The example of health literacy.

    PubMed

    Soellner, Renate; Lenartz, Norbert; Rudinger, Georg

    2017-02-01

    Concept mapping served as the starting point for capturing the comprehensive structure of the construct of 'health literacy.' Ideas about health literacy were generated by 99 experts and resulted in 105 statements that were subsequently organized by 27 experts in an unstructured card sorting. Multidimensional scaling was applied to the sorting data, and two- and three-dimensional solutions were computed. The three-dimensional solution was used in a subsequent cluster analysis and resulted in a concept map of nine "clusters": (1) self-regulation, (2) self-perception, (3) proactive approach to health, (4) basic literacy and numeracy skills, (5) information appraisal, (6) information search, (7) health care system knowledge and acting, (8) communication and cooperation, and (9) beneficial personality traits. This concept map then served as a starting point for developing a "qualitative" structural model of health literacy and a questionnaire for the measurement of health literacy. On the basis of the questionnaire data, a "quantitative" structural model was created by first applying exploratory factor analysis (EFA) and then cross-validating the model with confirmatory factor analysis (CFA). Concept mapping proved to be a highly valuable tool for the process of model building up to translational research in the "real world".

  12. Hyperspectral classification approaches for intertidal macroalgae habitat mapping: a case study in Heligoland

    NASA Astrophysics Data System (ADS)

    Oppelt, Natascha; Schulze, Florian; Bartsch, Inka; Doernhoefer, Katja; Eisenhardt, Inga

    2012-11-01

    Analysis of coastal marine algae communities enables us to adequately estimate the state of coastal marine environments and provides evidence of environmental change. Hyperspectral remote sensing provides a tool for mapping macroalgal habitats if the algal communities are spectrally resolvable. We compared the performance of three classification approaches for determining the distribution of macroalgae communities in the rocky intertidal zone of Heligoland, Germany, using airborne hyperspectral (AISA) data. The classification results of two supervised approaches (maximum likelihood classification and spectral angle mapping) are compared with an approach combining k-means classification with derivative measures: we identified regions of different slopes between the main pigment absorption features of macroalgae and classified the resulting slope bands. The maximum likelihood classifier achieved the best results (Cohen's kappa = 0.81), but the new approach turned out to be a time-effective way to identify the dominant macroalgae species with sufficient accuracy (Cohen's kappa = 0.77), even given the heterogeneous and patchy coverage of the study area.
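    Cohen's kappa, the agreement measure used above, corrects observed accuracy for the agreement expected by chance from the marginal label frequencies. A minimal sketch with hypothetical per-pixel labels (the class names and values are illustrative):

```python
def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: agreement between two labelings corrected for chance.

    kappa = (p_obs - p_exp) / (1 - p_exp), where p_obs is the observed
    agreement and p_exp the chance agreement from the marginal frequencies."""
    n = len(y_true)
    labels = set(y_true) | set(y_pred)
    p_obs = sum(t == p for t, p in zip(y_true, y_pred)) / n
    p_exp = sum((y_true.count(c) / n) * (y_pred.count(c) / n) for c in labels)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical per-pixel reference vs. classified macroalgae labels.
ref = ['fucus', 'fucus', 'ulva', 'ulva', 'ulva', 'fucus']
cls = ['fucus', 'ulva',  'ulva', 'ulva', 'fucus', 'fucus']
print(round(cohens_kappa(ref, cls), 3))
```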

  13. A Novel Fusion-Based Unsupervised Approach for Multispectral Image Change Detection with Saliency Maps

    NASA Astrophysics Data System (ADS)

    Zhang, A.; Jiang, G.; Shao, L.; Zhang, Y.; Fang, J.

    2017-09-01

    To fully utilize the spectral information and remove noise in multispectral image change detection, a fusion-based unsupervised approach that exploits the nonsubsampled contourlet transform (NSCT) and multi-scale saliency maps to detect changed areas in multispectral images is presented in this paper. First, to make full use of the multispectral information, each band of the multitemporal images is used to build an initial difference image set (IDIS), which is then decomposed into several low-pass approximation and high-pass directional subbands by the NSCT. Next, to remove most of the noise, saliency maps for each subband and each scale are obtained by processing only the low-frequency subband coefficients of the decomposed image. Finally, the binary change map is extracted using a novel inter-scale and inter-band fusion method. Experimental results validate the superior performance of the proposed approach with respect to several state-of-the-art change detection techniques.

  14. A random model approach to mapping quantitative trait loci for complex binary traits in outbred populations.

    PubMed Central

    Yi, N; Xu, S

    1999-01-01

    Mapping quantitative trait loci (QTL) for complex binary traits is more challenging than for normally distributed traits due to the nonlinear relationship between the observed phenotype and unobservable genetic effects, especially when the mapping population contains multiple outbred families. Because the number of alleles of a QTL depends on the number of founders in an outbred population, it is more appropriate to treat the effect of each allele as a random variable, so that a single variance, rather than individual allelic effects, is estimated and tested. Such a method is called the random model approach. In this study, we develop the random model approach of QTL mapping for binary traits in outbred populations. An EM algorithm, with a Fisher-scoring algorithm embedded in each E-step, is adopted to estimate the genetic variances, and a simple Monte Carlo integration technique is used to calculate the likelihood-ratio test statistic. For the first time we show that QTL of complex binary traits in an outbred population can be scanned along a chromosome for their positions, estimated for their explained variances, and tested for their statistical significance. Application of the method is illustrated using a set of simulated data. PMID:10511576
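    The Monte Carlo integration step can be illustrated on a single binary observation: the marginal likelihood averages the penetrance over the unobserved normal random effect. The logistic penetrance and all parameter names below are hypothetical simplifications, not the paper's model.

```python
import math
import random

def mc_binary_likelihood(y, beta, sigma, n=200_000, seed=7):
    """Monte Carlo approximation of the marginal likelihood of one binary
    observation y under a logistic model with a normal random effect:

        L = E_u[ p(u)^y * (1 - p(u))^(1 - y) ],
        p(u) = 1 / (1 + exp(-(beta + sigma * u))),  u ~ N(0, 1).

    This replaces the intractable integral over the random effect with a
    sample average over draws of u."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = rng.gauss(0.0, 1.0)
        p = 1.0 / (1.0 + math.exp(-(beta + sigma * u)))
        total += p if y == 1 else 1.0 - p
    return total / n

# With beta = 0 the penetrance is symmetric around 1/2, so the marginal
# likelihood of y = 1 should be close to 0.5.
print(mc_binary_likelihood(1, beta=0.0, sigma=1.0))
```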

  15. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard posed by the Maximum Credible Earthquake (MCE) magnitude on each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively, using late Quaternary and younger faults that are presumed to be seismogenic, without specifying when or within what time interval the MCE may occur. The MCE is the largest, or upper-bound, potential earthquake in moment magnitude; it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large, important projects, for example dams and nuclear power plants, continued to challenge the map(s). The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published

  16. Discovering Higgs Bosons of the MSSM using Jet Substructure

    SciTech Connect

    Kribs, Graham D.; Martin, Adam; Roy, Tuhin S.; Spannowsky, Michael

    2010-06-01

    We present a qualitatively new approach to discover Higgs bosons of the MSSM at the LHC using jet substructure techniques applied to boosted Higgs decays. These techniques are ideally suited to the MSSM, since the lightest Higgs boson overwhelmingly decays to b{bar b} throughout the entire parameter space, while the heavier neutral Higgs bosons, if light enough to be produced in a cascade, also predominantly decay to b{bar b}. The Higgs production we consider arises from superpartner production, where superpartners cascade decay into Higgs bosons. We study this mode of Higgs production for several superpartner hierarchies: m{sub {tilde q}}, m{sub {tilde g}} > m{sub {tilde W}}, m{sub {tilde B}} > m{sub h} + {mu}; m{sub {tilde q}}, m{sub {tilde g}} > m{sub {tilde W}}, m{sub {tilde B}} > m{sub h,H,A} + {mu}; and m{sub {tilde q}}, m{sub {tilde g}} > m{sub {tilde W}} > m{sub h} + {mu} with m{sub {tilde B}} {approx} {mu}. In these cascades the Higgs bosons are boosted, with p{sub T} > 200 GeV a large fraction of the time. Since the Higgs bosons appear in cascades originating from squarks and/or gluinos, the cross section for events with at least one Higgs boson can be of the same order as squark/gluino production. Given 10 fb{sup -1} of 14 TeV LHC data, with m{sub {tilde q}} {approx}< 1 TeV and one of the above superpartner mass hierarchies, our estimate of S/{radical}B for the Higgs signal is sufficiently high that the b{bar b} mode can become the discovery mode of the lightest Higgs boson of the MSSM.

  17. A sib-pair approach to interval mapping of quantitative trait loci.

    PubMed Central

    Fulker, D. W.; Cardon, L. R.

    1994-01-01

    An interval mapping procedure based on the sib-pair method of Haseman and Elston is developed, and simulation studies are carried out to explore its properties. The procedure is analogous to other interval mapping procedures used with experimental material, such as plants and animals, and yields very similar results in terms of the location and effect size of a quantitative trait locus (QTL). The procedure offers an advantage over the conventional Haseman and Elston approach, in terms of power, and provides useful information concerning the location of a QTL. Because of its simplicity, the method readily lends itself to the analysis of selected samples for increased power and the evaluation of multilocus models of complex phenotypes. PMID:8198132
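    The underlying Haseman-Elston idea is a simple regression of the squared sib-pair trait difference on the proportion of alleles shared identical by descent (IBD) at a marker, with a significantly negative slope suggesting linkage to a QTL. A minimal sketch with hypothetical sib-pair data (the values are illustrative):

```python
def haseman_elston(ibd, sq_diff):
    """Haseman-Elston regression by ordinary least squares.

    ibd:     estimated proportion of alleles shared IBD for each sib pair.
    sq_diff: squared trait difference of each sib pair.
    Returns (intercept, slope); a negative slope suggests a linked QTL."""
    n = len(ibd)
    mx = sum(ibd) / n
    my = sum(sq_diff) / n
    sxx = sum((x - mx) ** 2 for x in ibd)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ibd, sq_diff))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope

# Hypothetical sib pairs: higher IBD sharing, smaller squared trait difference.
pi_hat = [0.0, 0.5, 0.5, 1.0, 1.0, 0.0]
ysq    = [4.0, 2.5, 2.0, 1.0, 0.5, 3.5]
print(haseman_elston(pi_hat, ysq))
```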

  18. A Voxel-Map Quantitative Analysis Approach for Atherosclerotic Noncalcified Plaques of the Coronary Artery Tree

    PubMed Central

    Li, Ying; Chen, Wei; Chen, Yonglin; Chu, Chun; Fang, Bingji; Tan, Liwen

    2013-01-01

    Noncalcified plaques (NCPs) are associated with the presence of lipid-core plaques that are prone to rupture; it is therefore important to detect and monitor the development of NCPs. Contrast-enhanced coronary Computed Tomography Angiography (CTA) is a potential imaging technique for identifying atherosclerotic plaques in the whole coronary tree, but it fails to provide information about vessel walls. In order to overcome the limitations of coronary CTA and provide more meaningful quantitative information for percutaneous coronary intervention (PCI), we propose a Voxel-Map based on mathematical morphology to quantitatively analyze noncalcified plaques on a three-dimensional coronary artery wall model (3D-CAWM). This approach combines Voxel-Map analysis techniques, plaque locating, and anatomical-location-related labeling, yielding a more detailed and comprehensive visualization of the coronary tree wall. PMID:24348749

  19. A GIS based method for soil mapping in Sardinia, Italy: a geomatic approach.

    PubMed

    Vacca, A; Loddo, S; Melis, M T; Funedda, A; Puddu, R; Verona, M; Fanni, S; Fantola, F; Madrau, S; Marrone, V A; Serra, G; Tore, C; Manca, D; Pasci, S; Puddu, M R; Schirru, P

    2014-06-01

    A new project was recently initiated for the realization of the "Land Unit and Soil Capability Map of Sardinia" at a scale of 1:50,000 to support land use planning. In this study, we outline the general structure of the project and the methods used in the activities conducted thus far. A GIS approach was used. We used the soil-landscape paradigm for the prediction of soil classes and their spatial distribution, or the prediction of soil properties, based on landscape features. The work is divided into two main phases. In the first phase, the available digital data on land cover, geology and topography were processed and classified according to their influence on weathering processes and soil properties. The methods used in the interpretation are based on consolidated and generalized knowledge about the influence of geology, topography and land cover on soil properties. The existing soil data (areal and point data) were collected, reviewed, validated and standardized according to international and national guidelines. Point data considered usable were entered into a database created specifically for the project. Using expert interpretation, all digital data were merged to produce a first draft of the Land Unit Map. During the second phase, this map will be supplemented with the existing soil data and verified in the field, collecting new soil data where needed, and the final Land Unit Map will be produced. The Land Unit and Soil Capability Map will then be produced by classifying the land units using a reference matching table of land capability classes created for this project. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Evaluation of various modelling approaches in flood routing simulation and flood area mapping

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Loukas, Athanasios; Vasiliades, Lampros; Aronica, Giuseppe

    2016-04-01

    An essential process of flood hazard analysis and mapping is floodplain modelling. The selection of the modelling approach, especially in complex riverine topographies such as urban and suburban areas and in ungauged watersheds, may affect the accuracy of the outcomes in terms of flood depths and flood inundation area. In this study, a sensitivity analysis was implemented using several hydraulic-hydrodynamic modelling approaches (1D, 2D, 1D/2D), and the effect of the modelling approach on flood modelling and flood mapping was investigated. The digital terrain model (DTM) used in this study was generated from Terrestrial Laser Scanning (TLS) point cloud data. The modelling approaches included 1-dimensional hydraulic-hydrodynamic models (1D), 2-dimensional hydraulic-hydrodynamic models (2D) and coupled 1D/2D models. The 1D hydraulic-hydrodynamic models used were: HECRAS, MIKE11, LISFLOOD, XPSTORM. The 2D hydraulic-hydrodynamic models used were: MIKE21, MIKE21FM, HECRAS (2D), XPSTORM, LISFLOOD and FLO2d. The coupled 1D/2D models employed were: HECRAS(1D/2D), MIKE11/MIKE21 (MIKE FLOOD platform), MIKE11/MIKE21 FM (MIKE FLOOD platform), XPSTORM(1D/2D). The validation of the simulated flood extent was achieved with the use of 2x2 contingency tables comparing simulated and observed flooded areas for an extreme historical flash flood event; the Critical Success Index skill score was used in the validation process. The modelling approaches were also evaluated for simulation time and required computing power. The methodology was implemented in a suburban ungauged watershed of the Xerias river at Volos, Greece. The results of the analysis indicate the necessity of sensitivity analysis with different hydraulic-hydrodynamic modelling approaches, especially for areas with complex terrain.
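
    The flood-extent validation described above reduces to a simple skill score on binary rasters. A minimal sketch (function name ours) of the Critical Success Index computed from the 2x2 contingency table of simulated and observed flood masks:

```python
import numpy as np

def critical_success_index(sim, obs):
    """CSI = hits / (hits + misses + false alarms), from the 2x2
    contingency table of simulated vs. observed flooded cells.
    Correct negatives (dry in both) do not enter the score."""
    sim = np.asarray(sim, dtype=bool)
    obs = np.asarray(obs, dtype=bool)
    hits = np.sum(sim & obs)           # flooded in both
    misses = np.sum(~sim & obs)        # observed but not simulated
    false_alarms = np.sum(sim & ~obs)  # simulated but not observed
    return hits / (hits + misses + false_alarms)
```

    A CSI of 1 means perfect overlap; the score is undefined (division by zero) when neither mask contains any flooded cell.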

  1. A New Approach to Liquefaction Potential Mapping Using Remote Sensing and Machine Learning

    NASA Astrophysics Data System (ADS)

    Oommen, T.; Baise, L. G.

    2007-12-01

    learning capabilities of a human brain and make appropriate predictions that involve intuitive judgments and a high degree of nonlinearity. The accuracy of the developed liquefaction potential map was tested using independent testing data that was not used for the model development. The results show that the developed liquefaction potential map has an overall classification accuracy of 84%, indicating that the combination of remote sensing data and other relevant spatial data together with machine learning can be a promising approach for liquefaction potential mapping.

  2. Alternative SERRS probes for the immunochemical localization of ovalbumin in paintings: an advanced mapping detection approach.

    PubMed

    Sciutto, Giorgia; Litti, Lucio; Lofrumento, Cristiana; Prati, Silvia; Ricci, Marilena; Gobbo, Marina; Roda, Aldo; Castellucci, Emilio; Meneghetti, Moreno; Mazzeo, Rocco

    2013-08-21

    In the field of analytical chemistry, many scientific efforts have been devoted to developing experimental procedures for the characterization of organic substances in heterogeneous artwork samples, owing to their challenging identification. In particular, the performance of immunochemical techniques has recently been investigated, optimizing ad hoc systems for the identification of proteins. Among the different immunochemical approaches, the use of metal nanoparticles - for surface enhanced Raman scattering (SERS) detection - remains one of the most powerful methods, yet one still insufficiently explored for the analysis of artistic artefacts. For this reason, the present research work aimed to propose a new optimized and highly efficient indirect immunoassay for the detection of ovalbumin. In particular, the study proposed a new SERRS probe composed of gold nanoparticles (AuNPs) functionalized with Nile Blue A and produced with an excellent green and cheap alternative to traditional chemical nanoparticle synthesis: laser ablation synthesis in solution (LASiS). This procedure allows stable nanoparticles to be obtained which can be easily functionalized without any ligand exchange reaction or extensive purification procedures. Moreover, the present research work also focused on the development of a comprehensive analytical approach, based on the combination of the potentialities of immunochemical methods and Raman analysis, for the simultaneous identification of the target protein and the different organic and inorganic substances present in the paint matrix. An advanced mapping detection system was proposed to achieve the exact spatial location of all the components through the creation of false colour chemical maps.

  3. A Non-parametric Approach to Constrain the Transfer Function in Reverberation Mapping

    NASA Astrophysics Data System (ADS)

    Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming

    2016-11-01

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.
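
    As a rough illustration of the representation described above (discretization, function names, and parameters are ours; the full method also models the continuum as a damped random walk, which is omitted here), the transfer function can be built as a normalized sum of relatively displaced Gaussians, and the line light curve as its convolution with the continuum:

```python
import numpy as np

def transfer_function(tau, weights, centers, width):
    """Transfer function as a sum of relatively displaced Gaussian
    response functions, normalized to unit area."""
    psi = np.zeros_like(tau, dtype=float)
    for w, c in zip(weights, centers):
        psi += w * np.exp(-0.5 * ((tau - c) / width) ** 2)
    return psi / np.trapz(psi, tau)

def echo_light_curve(t, continuum_t, continuum_f, tau, psi):
    """Emission-line light curve as the blurred echo of the continuum:
    F_line(t) = integral of psi(tau) * F_cont(t - tau) d tau."""
    dt = tau[1] - tau[0]
    return np.array([np.sum(psi * np.interp(ti - tau, continuum_t, continuum_f)) * dt
                     for ti in t])
```

    With a single narrow Gaussian the line simply echoes the continuum at the lag of the Gaussian centre; broader or multi-component transfer functions blur and mix lags, which is what the non-parametric fit recovers from data.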

  4. An internal state variable mapping approach for Li-Plating diagnosis

    NASA Astrophysics Data System (ADS)

    Bai, Guangxing; Wang, Pingfeng

    2016-08-01

    Li-ion battery failure has become one of the major challenges for reliable battery applications, as it can have catastrophic consequences. Compared with capacity fading resulting from calendar effects, Li-plating induced battery failures are more difficult to identify, as they cause sudden capacity loss, leaving limited time for failure diagnosis. This paper presents a new internal state variable (ISV) mapping approach to identify values of immeasurable battery ISVs, considering changes of inherent parameters of battery system dynamics, for Li-plating diagnosis. Employing the developed ISV mapping approach, an explicit functional relationship model between measurable battery signals and immeasurable battery ISVs can be developed. The developed model can then be used to identify ISVs from an online battery system to detect the occurrence of Li-plating. Employing multiphysics-based simulation of Li-plating using COMSOL, the proposed Li-plating diagnosis approach is implemented under different conditions in the case studies to demonstrate its efficacy in diagnosing Li-plating onset timing.

  5. Functional connectivity-based parcellation of amygdala using self-organized mapping: a data driven approach.

    PubMed

    Mishra, Arabinda; Rogers, Baxter P; Chen, Li Min; Gore, John C

    2014-04-01

    The overall goal of this work is to demonstrate how resting state functional magnetic resonance imaging (fMRI) signals may be used to objectively parcellate functionally heterogeneous subregions of the human amygdala into structures characterized by similar patterns of functional connectivity. We hypothesize that similarity of functional connectivity of subregions with other parts of the brain can be a potential basis to segment and cluster voxels using data driven approaches. In this work, a self-organizing map (SOM) was implemented to cluster the connectivity maps associated with each voxel of the human amygdala, thereby defining distinct subregions. The functional separation was optimized by evaluating the overall differences in functional connectivity between the subregions at the group level. Analysis of 25 resting state fMRI data sets suggests that SOM can successfully identify functionally independent nuclei based on differences in their inter-subregional functional connectivity, evaluated statistically at various confidence levels. Although the amygdala contains several nuclei whose distinct roles are implicated in various functions, our objective approach discerns at least two functionally distinct volumes comparable to previous parcellation results obtained using probabilistic tractography and cytoarchitectonic analysis. Association of these nuclei with various known functions and a quantitative evaluation of their differences in overall functional connectivity with the lateral orbital frontal cortex and temporal pole confirm the functional diversity of the amygdala. The data driven approach adopted here may be used as a powerful indicator of structure-function relationships in the amygdala and other functionally heterogeneous structures as well.
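
    A toy illustration of the clustering step (plain NumPy, with our own initialization and parameter choices; in the study the feature vector of each voxel is its whole-brain connectivity map): a 1-D SOM whose units converge to connectivity prototypes, after which each voxel is labeled by its best-matching unit:

```python
import numpy as np

def train_som(data, n_units=2, n_iter=500, lr0=0.5, sigma0=0.5, seed=0):
    """Toy 1-D self-organizing map: unit weight vectors converge to
    prototypes of the input (e.g. voxel connectivity) patterns."""
    rng = np.random.default_rng(seed)
    # initialize prototypes from evenly spaced samples (our assumption)
    weights = data[np.linspace(0, len(data) - 1, n_units).astype(int)].astype(float).copy()
    grid = np.arange(n_units)
    for it in range(n_iter):
        lr = lr0 * np.exp(-it / n_iter)          # decaying learning rate
        sigma = sigma0 * np.exp(-it / n_iter)    # shrinking neighborhood
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))    # best-matching unit
        h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))  # neighborhood on the map
        weights += lr * h[:, None] * (x - weights)
    return weights

def parcellate(data, weights):
    """Label each voxel with its best-matching unit (candidate subregion)."""
    d = ((data[:, None, :] - weights[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)
```

    On well-separated synthetic "connectivity" clusters the two units settle into one cluster each, so the labels recover the underlying partition.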

  6. A pooling-based approach to mapping genetic variants associated with DNA methylation

    DOE PAGES

    Kaplow, Irene M.; MacIsaac, Julia L.; Mah, Sarah M.; ...

    2015-04-24

    DNA methylation is an epigenetic modification that plays a key role in gene regulation. Previous studies have investigated its genetic basis by mapping genetic variants that are associated with DNA methylation at specific sites, but these have been limited to microarrays that cover <2% of the genome and cannot account for allele-specific methylation (ASM). Other studies have performed whole-genome bisulfite sequencing on a few individuals, but these lack statistical power to identify variants associated with DNA methylation. We present a novel approach in which bisulfite-treated DNA from many individuals is sequenced together in a single pool, resulting in a truly genome-wide map of DNA methylation. Compared to methods that do not account for ASM, our approach increases statistical power to detect associations while sharply reducing cost, effort, and experimental variability. As a proof of concept, we generated deep sequencing data from a pool of 60 human cell lines; we evaluated almost twice as many CpGs as the largest microarray studies and identified more than 2000 genetic variants associated with DNA methylation. Here we found that these variants are highly enriched for associations with chromatin accessibility and CTCF binding but are less likely to be associated with traits indirectly linked to DNA, such as gene expression and disease phenotypes. In summary, our approach allows genome-wide mapping of genetic variants associated with DNA methylation in any tissue of any species, without the need for individual-level genotype or methylation data.

  7. Permanents, bosons and linear optics

    NASA Astrophysics Data System (ADS)

    Vlasov, Alexander Yu

    2017-10-01

    The particular complexity of linear quantum optical networks has recently received deserved attention due to the possible implications for the theory of quantum computation. Two relevant boson models are discussed in the present work. The symmetric product of Hilbert spaces produces a rather abstract model; the second model is obtained by quantization of the harmonic oscillator. In contrast to the bosonic processes considered, the so-called ‘fermionic linear optics’ can be simulated efficiently on a classical computer. The comparison of the bosonic and fermionic cases clarifies the controversy, and the more elaborate oscillator model provides a deeper analogy.

  8. A self organizing map approach to physiological data analysis for enhanced group performance.

    SciTech Connect

    Doser, Adele Beatrice; Merkle, Peter Benedict

    2004-10-01

    A Self Organizing Map (SOM) approach was used to analyze physiological data taken from a group of subjects participating in a cooperative video shooting game. The ultimate aim was to discover signatures of group cooperation, conflict, leadership, and performance. Such information could be fed back to participants in a meaningful way, and ultimately increase group performance in national security applications, where the consequences of a poor group decision can be devastating. Results demonstrated that a SOM can be a useful tool in revealing individual and group signatures from physiological data, and could ultimately be used to heighten group performance.

  9. Chameleon vector bosons

    SciTech Connect

    Nelson, Ann E.

    2008-05-01

    We show that for a force mediated by a vector particle coupled to a conserved U(1) charge, the apparent range and strength can depend on the size and density of the source, and the proximity to other sources. This chameleon effect is due to screening from a light charged scalar. Such screening can weaken astrophysical constraints on new gauge bosons. As an example we consider the constraints on chameleonic gauged B-L. We show that although Casimir measurements greatly constrain any B-L force much stronger than gravity with range longer than 0.1 μm, there remains an experimental window for a long-range chameleonic B-L force. Such a force could be much stronger than gravity, and long or infinite range in vacuum, but have an effective range near the surface of the earth which is less than a micron.

  10. Dark light Higgs bosons.

    SciTech Connect

    Draper, P.; Liu, T.; Wagner, C. E. M.; Wang, L.-T.; Zhang, H.

    2011-03-24

    We study a limit of the nearly Peccei-Quinn-symmetric next-to-minimal supersymmetric standard model possessing novel Higgs and dark matter (DM) properties. In this scenario, there naturally coexist three light singletlike particles: a scalar, a pseudoscalar, and a singlinolike DM candidate, all with masses of order 0.1-10 GeV. The decay of a standard model-like Higgs boson to pairs of the light scalars or pseudoscalars is generically suppressed, avoiding constraints from collider searches for these channels. For a certain parameter window, annihilation into the light pseudoscalar and exchange of the light scalar with nucleons allow the singlino to achieve the correct relic density and a large direct-detection cross section, simultaneously consistent with the regions preferred by the DM direct-detection experiments CoGeNT and DAMA/LIBRA. This parameter space is consistent with experimental constraints from LEP, the Tevatron, ?, and flavor physics.

  11. Experimental Boson Sampling

    NASA Astrophysics Data System (ADS)

    White, Andrew; Broome, Matthew; Fedrizzi, Alessandro; Rahimi-Keshari, Saleh; Ralph, Timothy; Dove, Justin; Aaronson, Scott

    2013-03-01

    Quantum computers are unnecessary for exponentially-efficient computation or simulation if the Extended Church-Turing thesis--a foundational tenet of computer science--is correct. The thesis would be directly contradicted by a physical device that efficiently performs a task believed to be intractable for classical computers. Such a task is BOSONSAMPLING: obtaining a distribution of n bosons scattered by some linear-optical unitary process. Here we test the central premise of BOSONSAMPLING, experimentally verifying that the amplitudes of 3-photon scattering processes are given by the permanents of submatrices generated from a unitary describing a 6-mode integrated optical circuit. We find the protocol to be robust, working even with the unavoidable effects of photon loss, non-ideal sources, and imperfect detection. Strong evidence against the Extended-Church-Turing thesis will come from scaling to large numbers of photons, which is a much simpler task than building a universal quantum computer.
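
    The permanents at the heart of BosonSampling can be computed classically only at exponential cost, which is the point of the protocol. A sketch (ours, purely illustrative) using Ryser's inclusion-exclusion formula, O(2^n n^2); for a collision-free output configuration, the detection probability is proportional to |perm|^2 of the corresponding submatrix of the circuit unitary:

```python
from itertools import combinations
import numpy as np

def permanent(a):
    """Matrix permanent via Ryser's inclusion-exclusion formula:
    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * prod_i sum_{j in S} a[i, j]."""
    a = np.asarray(a)
    n = a.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            total += (-1) ** r * np.prod(a[:, list(cols)].sum(axis=1))
    return (-1) ** n * total
```

    The 2^n scaling of even this best-known-style exact algorithm is what makes scaling the experiment to many photons evidence against the Extended Church-Turing thesis.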

  12. Advanced GPS-based field mapping for collecting training data within a remote sensing classification approach

    NASA Astrophysics Data System (ADS)

    Michel, Ulrich

    2004-10-01

    Automation of image classification is a challenge to the image interpretation community. One of the most time consuming tasks is certainly the collection of training data. The introduction of low cost Global Positioning System (GPS) receivers and the higher accuracy of the GPS signal after turning off Selective Availability have enhanced the ease and versatility of spatial data acquisition. They have also made the approaches by which GPS is integrated with GIS and remote sensing data more flexible. The emphasis of this paper is to present a method for improving training data collection for classification purposes using remotely sensed imagery as well as various data sources combined in a GIS. Firstly, a methodology for defining and describing training areas is demonstrated. The training data were stored in a vector database (shape files) using the geometry of the land parcels of the test site. Secondly, in addition to a conventional field mapping approach, an advanced GPS-based field mapping methodology was used to collect new training data. Within this new approach, single point information on the target ground truth class was collected along the roads in the test area. In this step, the following attributes were recorded: ID, left or right of the street, and biotope class. The goal of this approach is that a single person should be able to handle the field mapping while driving a car. The implementation of this approach is performed in ArcPad 6.03 and Application Builder from ESRI. The standard version of ArcPad was modified so that one-handed collection of training data is possible. After the field survey, the results were used within Erdas Imagine (version 8.7). In our approach, all "left" points were moved to the adjacent left field, "orthogonal" to the street. All "right" points were shifted to the adjacent right field. These moved points were then used as seed pixels in a Euclidian distance algorithm to automatically derive new training sets. In
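
    The point-shifting step described above amounts to a perpendicular offset along the road normal. A minimal geometric sketch (function name and planar projected coordinates are our assumptions; raw lat/lon would first need projecting):

```python
import numpy as np

def offset_point(p, direction, side, dist):
    """Shift a surveyed point perpendicular to the direction of travel:
    'left' uses the left-hand normal (-dy, dx), 'right' the opposite.
    Coordinates are assumed planar (projected), not lat/lon."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    normal = np.array([-d[1], d[0]])   # 90 degrees counter-clockwise
    sign = 1.0 if side == 'left' else -1.0
    return np.asarray(p, dtype=float) + sign * dist * normal
```

    The shifted point then serves as a seed for region growing (here, the Euclidean-distance step) inside the adjacent field parcel.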

  13. An entropy-driven matrix completion (E-MC) approach to complex network mapping

    NASA Astrophysics Data System (ADS)

    Koochakzadeh, Ali; Pal, Piya

    2016-05-01

    Mapping the topology of a complex network in a resource-efficient manner is a challenging problem with applications in internet mapping, social network inference, and so forth. We propose a new entropy driven algorithm leveraging ideas from matrix completion, to map the network using monitors (or sensors) which, when placed on judiciously selected nodes, are capable of discovering their immediate neighbors. The main challenge is to maximize the portion of discovered network using only a limited number of available monitors. To this end, (i) a new measure of entropy or uncertainty is associated with each node, in terms of the currently discovered edges incident on that node, and (ii) a greedy algorithm is developed to select a candidate node for monitor placement based on its entropy. Utilizing the fact that many complex networks of interest (such as social networks), have a low-rank adjacency matrix, a matrix completion algorithm, namely 1-bit matrix completion, is combined with the greedy algorithm to further boost its performance. The low rank property of the network adjacency matrix can be used to extrapolate a portion of missing edges, and consequently update the node entropies, so as to efficiently guide the network discovery algorithm towards placing monitors on the nodes that can turn out to be more informative. Simulations performed on a variety of real world networks such as social networks and peer networks demonstrate the superior performance of the matrix-completion guided approach in discovering the network topology.
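
    A stripped-down sketch of the greedy placement loop (the score below is a crude stand-in for the paper's entropy measure, and the 1-bit matrix completion step that extrapolates missing edges is omitted):

```python
import numpy as np

def greedy_discovery(adj, budget, seed=0):
    """Place monitors greedily: a monitor on node v reveals all edges
    incident to v. Each step picks the unmonitored node with the most
    already-discovered incident edges -- a crude proxy for the node
    'uncertainty' score; the first pick (no information yet) is random."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    discovered = np.zeros_like(adj)
    monitored = []
    for _ in range(budget):
        candidates = [v for v in range(n) if v not in monitored]
        scores = discovered[candidates].sum(axis=1)
        if scores.max() == 0:
            v = candidates[rng.integers(len(candidates))]   # uninformed pick
        else:
            v = candidates[int(np.argmax(scores))]          # most "active" node
        monitored.append(v)
        discovered[v, :] = adj[v, :]   # monitor reveals v's neighborhood
        discovered[:, v] = adj[:, v]
    return discovered, monitored
```

    In the full method the low-rank structure of the adjacency matrix lets matrix completion update these scores with extrapolated (not yet observed) edges, steering monitors toward more informative nodes.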

  14. Image Mining in Remote Sensing for Coastal Wetlands Mapping: from Pixel Based to Object Based Approach

    NASA Astrophysics Data System (ADS)

    Farda, N. M.; Danoedoro, P.; Hartono; Harjoko, A.

    2016-11-01

    Remote sensing image data are now abundant, and this large amount of data creates a “knowledge gap” in the extraction of selected information, especially on coastal wetlands. Coastal wetlands provide ecosystem services essential to people and the environment. The aim of this research is to extract coastal wetlands information from satellite data using pixel based and object based image mining approaches. Landsat MSS, Landsat 5 TM, Landsat 7 ETM+, and Landsat 8 OLI images located in the Segara Anakan lagoon were selected to represent data at various multi-temporal images. The inputs for image mining are visible and near infrared bands, PCA bands, inverse PCA bands, mean shift segmentation bands, bare soil index, vegetation index, wetness index, elevation from SRTM and ASTER GDEM, and GLCM (Haralick) or variability texture. Three image mining methods were applied to extract coastal wetlands: pixel based Decision Tree C4.5, pixel based Back Propagation Neural Network, and object based Mean Shift segmentation with Decision Tree C4.5. The results show that remote sensing image mining can be used to map coastal wetland ecosystems; Decision Tree C4.5 produced the highest accuracy (0.75 overall kappa). The availability of remote sensing image mining for mapping coastal wetlands is very important for providing a better understanding of their spatiotemporal dynamics and distribution.
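
    The spectral-index inputs listed above follow standard band-ratio formulas; for example, the vegetation index NDVI (a minimal sketch; reflectance-scaled bands are our assumption):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and
    red reflectance: (NIR - Red) / (NIR + Red), bounded in [-1, 1].
    Dense vegetation gives high positive values; open water is
    typically near zero or negative."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)
```

    Such per-pixel index layers are stacked with the raw bands, texture, and elevation to form the feature vectors fed to the C4.5 and neural-network classifiers.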

  15. Large Deformation Multiresolution Diffeomorphic Metric Mapping for Multiresolution Cortical Surfaces: A Coarse-to-Fine Approach.

    PubMed

    Tan, Mingzhen; Qiu, Anqi

    2016-09-01

    Brain surface registration is an important tool for characterizing cortical anatomical variations and understanding their roles in normal cortical development and psychiatric diseases. However, surface registration remains challenging due to complicated cortical anatomy and its large differences across individuals. In this paper, we propose a fast coarse-to-fine algorithm for surface registration by adapting the large diffeomorphic deformation metric mapping (LDDMM) framework for surface mapping and show improvements in speed and accuracy via a multiresolution analysis of surface meshes and the construction of multiresolution diffeomorphic transformations. The proposed method constructs a family of multiresolution meshes that are used as natural sparse priors of the cortical morphology. At varying resolutions, these meshes act as anchor points where the parameterization of multiresolution deformation vector fields can be supported, allowing the construction of a bundle of multiresolution deformation fields, each originating from a different resolution. Using a coarse-to-fine approach, we show a potential reduction in computation cost along with improvements in sulcal alignment when compared with LDDMM surface mapping.

  16. A Random Model Approach to Interval Mapping of Quantitative Trait Loci

    PubMed Central

    Xu, S.; Atchley, W. R.

    1995-01-01

    Mapping quantitative trait loci in outbred populations is important because many populations of organisms are noninbred. Unfortunately, information about the genetic architecture of the trait may not be available in outbred populations. Thus, the allelic effects of genes cannot be estimated with ease. In addition, under linkage equilibrium, marker genotypes provide no information about the genotype of a QTL (our terminology for a single quantitative trait locus is QTL, while multiple loci are referred to as QTLs). To circumvent this problem, an interval mapping procedure based on a random model approach is described. Under a random model, instead of estimating the effects, segregating variances of QTLs are estimated by a maximum likelihood method. Estimation of the variance component of a QTL depends on the proportion of genes identical-by-descent (IBD) shared by relatives at the locus, which is predicted by the IBD of two markers flanking the QTL. The marker IBD shared by two relatives is inferred from the observed marker genotypes. The procedure offers an advantage over regression interval mapping in terms of high power and small estimation errors, and provides flexibility for large sibships, irregular pedigree relationships, and incorporation of common environmental and fixed effects. PMID:8582623
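
    The random-model idea can be sketched for the simplest case of sib pairs (our minimal illustration; the paper's full ML machinery, the prediction of QTL IBD from flanking markers, and general pedigrees are all omitted): the trait covariance between sibs is proportional to their IBD sharing at the QTL, and the variance components are found by maximizing the likelihood, here by a coarse grid search:

```python
import numpy as np

def pair_loglik(y, pi, s2q, s2e):
    """Log-likelihood of sib-pair trait values under a random-model QTL:
    Var(y_i) = s2q + s2e and Cov(y_1, y_2) = pi * s2q, where pi is the
    proportion of alleles shared IBD at the putative QTL."""
    y = np.asarray(y, dtype=float)
    p = np.asarray(pi, dtype=float)
    v = s2q + s2e                 # common variance
    c = p * s2q                   # IBD-weighted covariance
    det = v ** 2 - c ** 2
    # quadratic form y' Sigma^{-1} y for the 2x2 covariance [[v, c], [c, v]]
    q = (v * (y[:, 0] ** 2 + y[:, 1] ** 2) - 2 * c * y[:, 0] * y[:, 1]) / det
    return float(np.sum(-np.log(2 * np.pi) - 0.5 * np.log(det) - 0.5 * q))

def ml_variances(y, pi, grid):
    """Grid-search maximum likelihood over (s2q, s2e)."""
    return max(((pair_loglik(y, pi, a, b), a, b) for a in grid for b in grid))[1:]
```

    The key feature of the random model is visible here: only the variance s2q is estimated, never an allelic effect, so no assumption about the trait's genetic architecture is needed.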

  17. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach

    PubMed Central

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems. PMID:26167533

  18. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach.

    PubMed

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Duguleana, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems.
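
    A propagation profile of the kind described above can be approximated with the standard log-distance path-loss model (parameters illustrative; the actual system derives its maps from sensed spectrum data rather than a closed-form model):

```python
import numpy as np

def path_loss_map(tx, grid_x, grid_y, pl0=40.0, exponent=3.0, d0=1.0):
    """Log-distance path-loss map over a planar grid, in dB:
    PL(d) = PL0 + 10 * n * log10(d / d0), with distances clipped at the
    reference distance d0. tx is the transmitter (x, y) position; the
    exponent n > 2 reflects cluttered industrial environments."""
    xx, yy = np.meshgrid(grid_x, grid_y)
    d = np.maximum(np.hypot(xx - tx[0], yy - tx[1]), d0)
    return pl0 + 10.0 * exponent * np.log10(d / d0)
```

    Overlaying such a map (or its measurement-driven counterpart) on a 3D model of the plant is what the proposed virtual-reality interface renders for the radio planner.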

  19. An object-oriented approach to automated landform mapping: A case study of drumlins

    NASA Astrophysics Data System (ADS)

    Saha, Kakoli; Wells, Neil A.; Munro-Stasiuk, Mandy

    2011-09-01

    This paper details an automated object-oriented approach to mapping landforms from digital elevation models (DEMs), using the example of drumlins in the Chautauqua drumlin field in NW Pennsylvania and upstate New York. Object-oriented classification is highly desirable as it can identify specific shapes in datasets based on both the pixel values in a raster dataset and the contextual information between pixels and extracted objects. The methodology is built specifically for application to the USGS 30 m resolution DEM data, which are freely available to the public and of sufficient resolution to map medium scale landforms. Using the raw DEM data, as well as derived aspect and slope, Definiens Developer (v.7) was used to perform multiresolution segmentation, followed by rule-based classification, in order to extract individual polygons that represent drumlins. Drumlins obtained by automated extraction were visually and statistically compared to those identified via manual digitization. Detailed morphometric descriptive statistics such as means, ranges, and standard deviations were inspected and compared for length, width, elongation ratio, area, and perimeter. Although the manual and automated results were not always statistically identical, a more detailed comparison of just the drumlins identified by both procedures showed that the automated method easily matched the manual digitization. Differences between the two methods related to the mapping of compound drumlins and of smaller and larger drumlins. The automated method generally identified more features in these categories and thus outperformed the manual method.

  20. The constellation of dietary factors in adolescent acne: a semantic connectivity map approach.

    PubMed

    Grossi, E; Cazzaniga, S; Crotti, S; Naldi, L; Di Landro, A; Ingordo, V; Cusano, F; Atzori, L; Tripodi Cutrì, F; Musumeci, M L; Pezzarossa, E; Bettoli, V; Caproni, M; Bonci, A

    2016-01-01

    Different lifestyle and dietetic factors have been linked with the onset and severity of acne. To assess the complex interconnection between dietetic variables and acne, we reanalysed data from a case-control study using a semantic connectivity map approach. 563 subjects, aged 10-24 years, involved in a case-control study of acne between March 2009 and February 2010, were considered in this study. The analysis evaluated the link between moderate to severe acne and anthropometric variables, family history, and dietetic factors. Analyses were conducted by relying on an artificial adaptive system, the Auto Semantic Connectivity Map (AutoCM). The AutoCM map showed that moderate-severe acne was closely associated with a family history of acne in first degree relatives, obesity (BMI ≥ 30), and high consumption of milk, in particular skim milk, cheese/yogurt, sweets/cakes, and chocolate, together with low consumption of fish and limited intake of fruits/vegetables. Our analyses confirm the link between several dietetic items and acne. When providing care, dermatologists should be aware of the complex interconnection between dietetic factors and acne. © 2014 European Academy of Dermatology and Venereology.

  1. A topographical map approach to representing treatment efficacy: a focus on positive psychology interventions.

    PubMed

    Gorlin, Eugenia I; Lee, Josephine; Otto, Michael W

    2017-07-31

    A recent meta-analysis by Bolier et al. indicated that positive psychology interventions have overall small to moderate effects on well-being, but results were quite heterogeneous across intervention trials. Such meta-analytic research helps condense information on the efficacy of a broad psychosocial intervention by averaging across many effects; however, such global averages may provide limited navigational guidance for selecting among specific interventions. Here, we introduce a novel method for displaying qualitative and quantitative information on the efficacy of interventions using a topographical map approach. As an initial prototype for demonstrating this method, we mapped 50 positive psychology interventions targeting well-being (as captured in the Bolier et al. [2013] meta-analysis [Bolier, L., Haverman, M., Westerhof, G. J., Riper, H., Smit, F., & Bohlmeijer, E. (2013). Positive psychology interventions: A meta-analysis of randomized controlled studies. BMC Public Health, 13, 83]). Each intervention domain/subdomain was mapped according to its average effect size (indexed by vertical elevation), number of studies providing effect sizes (indexed by horizontal area), and therapist/client burden (indexed by shading). The geographical placement of intervention domains/subdomains was determined by their conceptual proximity, allowing viewers to gauge the general conceptual "direction" in which promising intervention effects can be found. The resulting graphical displays revealed several prominent features of the well-being intervention "landscape," such as more strongly and uniformly positive effects of future-focused interventions (including goal-pursuit and optimism training) compared to past/present-focused ones.

  2. Flood mapping using VHR satellite imagery: a comparison between different classification approaches

    NASA Astrophysics Data System (ADS)

    Franci, Francesca; Boccardo, Piero; Mandanici, Emanuele; Roveri, Elena; Bitelli, Gabriele

    2016-10-01

    Various regions in Europe have suffered from severe flooding over the last decades. Flood disasters often have a broad extent and a high frequency. They are considered the most devastating natural hazards because of the tremendous fatalities, injuries, property damage, and economic and social disruption that they cause. In this context, Earth Observation techniques have become a key tool for flood risk and damage assessment. In particular, remote sensing facilitates flood surveying, providing valuable information, e.g. flood occurrence, intensity and progress of flood inundation, and spurs and embankments affected/threatened. The present work aims to investigate the use of Very High Resolution satellite imagery for mapping flood-affected areas. The case study is the November 2013 flood event which occurred in the Sardinia region (Italy), affecting a total of 2,700 people and killing 18 people. The investigated zone extends for 28 km2 along the Posada river, from the Maccheronis dam to the mouth in the Tyrrhenian Sea. A post-event SPOT 6 image was processed by means of different classification methods, in order to produce the flood map of the analysed area. The unsupervised classification algorithm ISODATA was tested. A pixel-based supervised technique was applied using the Maximum Likelihood algorithm; moreover, the SPOT 6 image was processed by means of object-oriented approaches. The produced flood maps were compared among each other and with an independent data source, in order to evaluate the performance of each method, also in terms of time demand.
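
    The two classification families compared above can be illustrated with a small sketch, assuming scikit-learn as a stand-in: KMeans plays the role of ISODATA-style unsupervised clustering, and Gaussian naive Bayes that of a maximum-likelihood-style supervised classifier. The band values and labels below are synthetic, not SPOT 6 data.

    ```python
    # Sketch: pixel-based flood classification on a toy image (synthetic data).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(0)
    # Toy 3-band image: "water" pixels spectrally darker than "land" pixels.
    water = rng.normal(0.2, 0.05, size=(500, 3))
    land = rng.normal(0.6, 0.05, size=(500, 3))
    pixels = np.vstack([water, land])
    labels = np.array([1] * 500 + [0] * 500)  # 1 = flooded

    # Unsupervised: cluster pixels into two spectral classes (no labels used).
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)

    # Supervised: maximum-likelihood-style classifier trained on labelled samples.
    ml = GaussianNB().fit(pixels, labels)
    accuracy = (ml.predict(pixels) == labels).mean()
    print(f"supervised accuracy on toy data: {accuracy:.2f}")
    ```

    On real imagery the supervised route needs training polygons, while the clustering route needs a post-hoc assignment of clusters to "flooded"/"dry".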

  3. A Novel Approach on Designing Augmented Fuzzy Cognitive Maps Using Fuzzified Decision Trees

    NASA Astrophysics Data System (ADS)

    Papageorgiou, Elpiniki I.

    This paper proposes a new methodology for designing Fuzzy Cognitive Maps using crisp decision trees that have been fuzzified. A fuzzy cognitive map is a knowledge-based technique that works as an artificial cognitive network inheriting the main aspects of cognitive maps and artificial neural networks. Decision trees, on the other hand, are well-known intelligent techniques that extract rules from both symbolic and numeric data. Fuzzy theoretical techniques are used to fuzzify crisp decision trees in order to soften decision boundaries at the decision nodes inherent in this type of tree. Comparisons between crisp decision trees and the fuzzified decision trees suggest that the latter are significantly more robust and produce more balanced decision making. The approach proposed in this paper can incorporate any type of fuzzy decision tree. Through this methodology, new linguistic weights are determined in the FCM model, thus producing an augmented FCM tool. The framework consists of a new fuzzy algorithm to generate, from induced fuzzy decision trees, linguistic weights that describe the cause-effect relationships among the concepts of the FCM model.

  4. Searching for new heavy neutral gauge bosons using vector boson fusion processes at the LHC

    NASA Astrophysics Data System (ADS)

    Flórez, Andrés; Gurrola, Alfredo; Johns, Will; Oh, Young Do; Sheldon, Paul; Teague, Dylan; Weiler, Thomas

    2017-04-01

    New massive resonances are predicted in many extensions of the Standard Model (SM) of particle physics, and their discovery constitutes one of the most promising goals of searches for new physics at the LHC. We present a feasibility study to search for new heavy neutral gauge bosons using vector boson fusion (VBF) processes, which become especially important as the LHC probes higher collision energies. In particular, we consider the possibility that a Z′ boson may have eluded searches at the LHC: the coupling of the Z′ boson to the SM quarks can be small, and thus the Z′ would not be discoverable by the searches conducted thus far. In the context of a simplified phenomenological approach, we consider the Z′ → ττ and Z′ → μμ decay modes to show that the requirement of a dilepton pair combined with two high-pT forward jets with large separation in pseudorapidity and large dijet mass is effective in reducing SM backgrounds. The expected exclusion bounds (at 95% confidence level) are m(Z′) < 1.8 TeV and m(Z′) < 2.5 TeV in the ττjfjf and μμjfjf channels, respectively, assuming 1000 fb-1 of 13 TeV data from the LHC. The use of the VBF topology to search for massive neutral gauge bosons provides a discovery reach with expected significances greater than 5σ (3σ) for Z′ masses up to 1.4 (1.6) TeV and 2.0 (2.2) TeV in the ττjfjf and μμjfjf channels.

  5. Searching for new heavy neutral gauge bosons using vector boson fusion processes at the LHC

    DOE PAGES

    Flórez, Andrés; Gurrola, Alfredo; Johns, Will; ...

    2017-02-01

    Here, new massive resonances are predicted in many extensions of the Standard Model (SM) of particle physics, and their discovery constitutes one of the most promising goals of searches for new physics at the LHC. We present a feasibility study to search for new heavy neutral gauge bosons using vector boson fusion (VBF) processes, which become especially important as the LHC probes higher collision energies. In particular, we consider the possibility that a Z' boson may have eluded searches at the LHC: the coupling of the Z' boson to the SM quarks can be small, and thus the Z' would not be discoverable by the searches conducted thus far. In the context of a simplified phenomenological approach, we consider the Z'→ττ and Z'→μμ decay modes to show that the requirement of a dilepton pair combined with two high-pT forward jets with large separation in pseudorapidity and large dijet mass is effective in reducing SM backgrounds. The expected exclusion bounds (at 95% confidence level) are m(Z') < 1.8 TeV and m(Z') < 2.5 TeV in the ττjfjf and μμjfjf channels, respectively, assuming 1000 fb-1 of 13 TeV data from the LHC. The use of the VBF topology to search for massive neutral gauge bosons provides a discovery reach with expected significances greater than 5σ (3σ) for Z' masses up to 1.4 (1.6) TeV and 2.0 (2.2) TeV in the ττjfjf and μμjfjf channels.

  6. A national scale flood hazard mapping methodology: The case of Greece - Protection and adaptation policy approaches.

    PubMed

    Kourgialas, Nektarios N; Karatzas, George P

    2017-12-01

    The present work introduces a national-scale flood hazard assessment methodology, using multi-criteria analysis and artificial neural network (ANN) techniques in a GIS environment. The proposed methodology was applied in Greece, where flash floods are a relatively frequent phenomenon that has become more intense over the last decades, causing significant damage in rural and urban sectors. In order to identify the areas most prone to flooding, seven factor-maps (directly related to flood generation) were combined in a GIS environment. These factor-maps are: a) the Flow accumulation (F), b) the Land use (L), c) the Altitude (A), d) the Slope (S), e) the soil Erodibility (E), f) the Rainfall intensity (R), and g) the available water Capacity (C), giving the proposed method its name, "FLASERC". The flood hazard for each of these factors is classified into five categories: very low, low, moderate, high, and very high. The above factors are combined and processed using the appropriate ANN algorithm tool. For the ANN training process, the spatial distribution of historical flooded points in Greece was combined with the five flood hazard categories of the aforementioned seven factor-maps. In this way, the overall flood hazard map for Greece was determined. The final results were verified using additional historical flood events that have occurred in Greece over the last 100 years. In addition, an overview of flood protection measures and adaptation policy approaches was proposed for agricultural and urban areas located in very high flood hazard zones. Copyright © 2017 Elsevier B.V. All rights reserved.
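
    The overlay idea behind combining classified factor-maps can be sketched as follows; a simple weighted sum stands in for the paper's trained ANN, and the weights and factor scores below are illustrative assumptions, not the FLASERC calibration.

    ```python
    # Sketch of a factor-map overlay: seven hazard factors (F, L, A, S, E, R, C),
    # each scored 1 (very low) .. 5 (very high), combined per grid cell.
    import numpy as np

    rng = np.random.default_rng(1)
    n_cells = 1000
    factors = rng.integers(1, 6, size=(n_cells, 7))       # synthetic factor scores
    weights = np.array([0.25, 0.15, 0.10, 0.15, 0.10, 0.15, 0.10])  # assumed, sum to 1

    score = factors @ weights                              # continuous hazard score
    hazard_class = np.clip(np.round(score), 1, 5)          # back to five classes
    print("cells per hazard class 1..5:", np.bincount(hazard_class.astype(int), minlength=6)[1:])
    ```

    A trained ANN replaces the fixed weights with a learned, generally nonlinear, mapping from the seven scores to the hazard class.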

  7. Flight investigation of helicopter IFR approaches to oil rigs using airborne weather and mapping radar

    NASA Technical Reports Server (NTRS)

    Bull, J. S.; Hegarty, D. M.; Phillips, J. D.; Sturgeon, W. R.; Hunting, A. W.; Pate, D. P.

    1979-01-01

    Airborne weather and mapping radar is a near-term, economical method of providing 'self-contained' navigation information for approaches to offshore oil rigs and its use has been rapidly expanding in recent years. A joint NASA/FAA flight test investigation of helicopter IFR approaches to offshore oil rigs in the Gulf of Mexico was initiated in June 1978 and conducted under contract to Air Logistics. Approximately 120 approaches were flown in a Bell 212 helicopter by 15 operational pilots during the months of August and September 1978. The purpose of the tests was to collect data to (1) support development of advanced radar flight director concepts by NASA and (2) aid the establishment of Terminal Instrument Procedures (TERPS) criteria by the FAA. The flight test objectives were to develop airborne radar approach procedures, measure tracking errors, determine acceptable weather minimums, and determine pilot acceptability. Data obtained will contribute significantly to improved helicopter airborne radar approach capability and to the support of exploration, development, and utilization of the Nation's offshore oil supplies.

  8. An Integrated Spin-Labeling/Computational-Modeling Approach for Mapping Global Structures of Nucleic Acids

    PubMed Central

    Tangprasertchai, Narin S.; Zhang, Xiaojun; Ding, Yuan; Tham, Kenneth; Rohs, Remo; Haworth, Ian S.; Qin, Peter Z.

    2015-01-01

    The technique of site-directed spin labeling (SDSL) provides unique information on biomolecules by monitoring the behavior of a stable radical tag (i.e., spin label) using electron paramagnetic resonance (EPR) spectroscopy. In this chapter, we describe an approach in which SDSL is integrated with computational modeling to map conformations of nucleic acids. This approach builds upon a SDSL tool kit previously developed and validated, which includes three components: (i) a nucleotide-independent nitroxide probe, designated as R5, which can be efficiently attached at defined sites within arbitrary nucleic acid sequences; (ii) inter-R5 distances in the nanometer range, measured via pulsed EPR; and (iii) an efficient program, called NASNOX, that computes inter-R5 distances on given nucleic acid structures. Following a general framework of data mining, our approach uses multiple sets of measured inter-R5 distances to retrieve “correct” all-atom models from a large ensemble of models. The pool of models can be generated independently without relying on the inter-R5 distances, thus allowing a large degree of flexibility in integrating the SDSL-measured distances with a modeling approach best suited for the specific system under investigation. As such, the integrative experimental/computational approach described here represents a hybrid method for determining all-atom models based on experimentally-derived distance measurements. PMID:26477260

  9. An Integrated Spin-Labeling/Computational-Modeling Approach for Mapping Global Structures of Nucleic Acids.

    PubMed

    Tangprasertchai, Narin S; Zhang, Xiaojun; Ding, Yuan; Tham, Kenneth; Rohs, Remo; Haworth, Ian S; Qin, Peter Z

    2015-01-01

    The technique of site-directed spin labeling (SDSL) provides unique information on biomolecules by monitoring the behavior of a stable radical tag (i.e., spin label) using electron paramagnetic resonance (EPR) spectroscopy. In this chapter, we describe an approach in which SDSL is integrated with computational modeling to map conformations of nucleic acids. This approach builds upon a SDSL tool kit previously developed and validated, which includes three components: (i) a nucleotide-independent nitroxide probe, designated as R5, which can be efficiently attached at defined sites within arbitrary nucleic acid sequences; (ii) inter-R5 distances in the nanometer range, measured via pulsed EPR; and (iii) an efficient program, called NASNOX, that computes inter-R5 distances on given nucleic acid structures. Following a general framework of data mining, our approach uses multiple sets of measured inter-R5 distances to retrieve "correct" all-atom models from a large ensemble of models. The pool of models can be generated independently without relying on the inter-R5 distances, thus allowing a large degree of flexibility in integrating the SDSL-measured distances with a modeling approach best suited for the specific system under investigation. As such, the integrative experimental/computational approach described here represents a hybrid method for determining all-atom models based on experimentally-derived distance measurements.
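
    The ensemble-filtering step described above, keeping only candidate structures whose computed inter-label distances agree with the measured ones, can be sketched as follows. The distances here are synthetic; the real pipeline computes inter-R5 distances with NASNOX on all-atom models.

    ```python
    # Sketch of distance-based model filtering over a synthetic model ensemble.
    import numpy as np

    rng = np.random.default_rng(2)
    n_models, n_pairs = 200, 6
    # Computed pairwise label distances (nm) for each candidate model.
    computed = rng.uniform(2.0, 6.0, size=(n_models, n_pairs))
    # Pretend model 17 is the true structure; "measure" its distances with noise.
    measured = computed[17] + rng.normal(0, 0.05, size=n_pairs)

    tolerance = 0.3  # nm, assumed experimental uncertainty
    consistent = np.all(np.abs(computed - measured) <= tolerance, axis=1)
    selected = np.flatnonzero(consistent)
    print("models consistent with all measurements:", selected)
    ```

    With several independent distance constraints, the fraction of random models that survive drops rapidly, which is what makes the retrieval selective.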

  10. A novel linear programming approach to fluence map optimization for intensity modulated radiation therapy treatment planning.

    PubMed

    Romeijn, H Edwin; Ahuja, Ravindra K; Dempsey, James F; Kumar, Arvind; Li, Jonathan G

    2003-11-07

    We present a novel linear programming (LP) based approach for efficiently solving the intensity modulated radiation therapy (IMRT) fluence-map optimization (FMO) problem to global optimality. Our model overcomes the apparent limitations of a linear-programming approach by approximating any convex objective function by a piecewise linear convex function. This approach allows us to retain the flexibility offered by general convex objective functions, while allowing us to formulate the FMO problem as a LP problem. In addition, a novel type of partial-volume constraint that bounds the tail averages of the differential dose-volume histograms of structures is imposed while retaining linearity as an alternative approach to improve dose homogeneity in the target volumes, and to attempt to spare as many critical structures as possible. The goal of this work is to develop a very rapid global optimization approach that finds high quality dose distributions. Implementation of this model has demonstrated excellent results. We found globally optimal solutions for eight 7-beam head-and-neck cases in less than 3 min of computational time on a single processor personal computer without the use of partial-volume constraints. Adding such constraints increased the running times by a factor of 2-3, but improved the sparing of critical structures. All cases demonstrated excellent target coverage (> 95%), target homogeneity (< 10% overdosing and < 7% underdosing) and organ sparing using at least one of the two models.
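
    The central trick, replacing a convex objective by the maximum of linear pieces so the problem becomes an LP, can be shown on a one-dimensional toy: f(t) = (t - 1)^2 approximated by secant segments in epigraph form. scipy.optimize.linprog is an assumed stand-in for the authors' solver.

    ```python
    # Sketch: piecewise-linear (epigraph) LP approximation of a convex objective.
    import numpy as np
    from scipy.optimize import linprog

    # Secant model of f(t) = (t - 1)^2 on knots t = 0, 0.5, 1, 1.5, 2.
    knots = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    fvals = (knots - 1.0) ** 2
    a = np.diff(fvals) / np.diff(knots)       # segment slopes
    b = fvals[:-1] - a * knots[:-1]           # segment intercepts

    # Variables (t, z): minimize z subject to z >= a_i t + b_i, 0 <= t <= 2.
    c = np.array([0.0, 1.0])
    A_ub = np.column_stack([a, -np.ones_like(a)])   # a_i t - z <= -b_i
    b_ub = -b
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 2), (None, None)])
    print("minimizer t ~", res.x[0], " objective ~", res.x[1])
    ```

    In the real FMO problem t becomes the beamlet-fluence vector and each dose-based penalty contributes its own family of linear cuts, but the epigraph structure is the same.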

  12. What is a Higgs Boson?

    ScienceCinema

    Lincoln, Don

    2016-07-12

    Fermilab scientist Don Lincoln describes the nature of the Higgs boson. Several large experimental groups are hot on the trail of this elusive subatomic particle which is thought to explain the origins of particle mass.

  13. Chiral Bosonization of Superconformal Ghosts

    NASA Technical Reports Server (NTRS)

    Shi, Deheng; Shen, Yang; Liu, Jinling; Xiong, Yongjian

    1996-01-01

    We explain how the Hilbert space of the superconformal ghost (beta, gamma) system differs from that of its bosonized fields phi and chi. We calculate the chiral correlation functions of the phi and chi fields by inserting appropriate projectors.

  14. What is a Higgs Boson?

    SciTech Connect

    Lincoln, Don

    2011-07-07

    Fermilab scientist Don Lincoln describes the nature of the Higgs boson. Several large experimental groups are hot on the trail of this elusive subatomic particle which is thought to explain the origins of particle mass.

  15. Towards quantum supremacy with lossy scattershot boson sampling

    NASA Astrophysics Data System (ADS)

    Latmiral, Ludovico; Spagnolo, Nicolò; Sciarrino, Fabio

    2016-11-01

    Boson sampling represents a promising approach to obtain evidence of the supremacy of quantum systems as a resource for the solution of computational problems. The classical hardness of boson sampling has been related to the so-called Permanent-of-Gaussians Conjecture and has been extended to generalizations such as Scattershot Boson Sampling and approximate and lossy sampling under some reasonable constraints. However, it is still unclear how demanding these techniques are for a quantum experimental sampler. Starting from a state-of-the-art analysis and taking into account the foreseeable practical limitations, we evaluate and discuss the bound for quantum supremacy for different recently proposed approaches, according to today's best known classical simulators.
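
    The classical-hardness argument above rests on the matrix permanent, whose best-known exact algorithms (e.g. Ryser's formula) scale as O(2^n · n), which is why even modest photon numbers strain classical simulators. A minimal sketch for small matrices:

    ```python
    # Ryser's formula for the matrix permanent (exponential-time, exact).
    from itertools import combinations

    def permanent(M):
        """perm(M) = (-1)^n * sum over nonempty column subsets S of
        (-1)^|S| * prod_i (sum_{j in S} M[i][j])  (Ryser's formula)."""
        n = len(M)
        total = 0.0
        for r in range(1, n + 1):
            for cols in combinations(range(n), r):
                prod = 1.0
                for row in M:
                    prod *= sum(row[j] for j in cols)
                total += (-1) ** r * prod
        return (-1) ** n * total

    print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
    ```

    In (scattershot) boson sampling the output probabilities are proportional to |perm(A)|^2 of submatrices of the interferometer's unitary, so this cost is what a classical simulator must pay per amplitude.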

  16. The field line map approach for simulations of magnetically confined plasmas

    NASA Astrophysics Data System (ADS)

    Stegmeir, Andreas; Coster, David; Maj, Omar; Hallatschek, Klaus; Lackner, Karl

    2016-01-01

    Predicting plasma parameters in the edge and scrape-off layer of tokamaks is difficult since most modern tokamaks have a divertor, and the associated separatrix causes the usually employed field/flux-aligned coordinates to become singular at the separatrix/X-point. The presented field line map approach avoids such problems as it is based on a cylindrical grid: standard finite-difference methods can be used for the discretisation of perpendicular (w.r.t. magnetic field) operators, and the characteristic flute mode property (k∥ ≪ k⊥) of structures is exploited computationally via a field-line-following discretisation of parallel operators, which leads to grid sparsification in the toroidal direction. This paper is devoted to the discretisation of the parallel diffusion operator (the approach taken is very similar to the flux-coordinate independent (FCI) approach, which has already been applied to a hyperbolic problem (Ottaviani, 2011; Hariri, 2013)). Based on the support operator method, schemes are derived which maintain the self-adjointness property of the parallel diffusion operator on the discrete level. These methods have very low numerical perpendicular diffusion compared to a naive discretisation, which is a critical issue since magnetically confined plasmas exhibit a very strong anisotropy. Two different versions of the discrete parallel diffusion operator are derived: the first is based on interpolation, where the order of interpolation and therefore the numerical diffusion is adjustable; the second is based on integration and is advantageous in cases where the field line map is strongly distorted. The schemes are implemented in the new code GRILLIX, and extensive benchmarks and numerous examples are presented which show the validity of the approach in general and GRILLIX in particular.

  17. Non-invasive computation of aortic pressure maps: a phantom-based study of two approaches

    NASA Astrophysics Data System (ADS)

    Delles, Michael; Schalck, Sebastian; Chassein, Yves; Müller, Tobias; Rengier, Fabian; Speidel, Stefanie; von Tengg-Kobligk, Hendrik; Kauczor, Hans-Ulrich; Dillmann, Rüdiger; Unterhinninghofen, Roland

    2014-03-01

    Patient-specific blood pressure values in the human aorta are an important parameter in the management of cardiovascular diseases. A direct measurement of these values is only possible by invasive catheterization at a limited number of measurement sites. To overcome these drawbacks, two non-invasive approaches of computing patient-specific relative aortic blood pressure maps throughout the entire aortic vessel volume are investigated by our group. The first approach uses computations from complete time-resolved, three-dimensional flow velocity fields acquired by phase-contrast magnetic resonance imaging (PC-MRI), whereas the second approach relies on computational fluid dynamics (CFD) simulations with ultrasound-based boundary conditions. A detailed evaluation of these computational methods under realistic conditions is necessary in order to investigate their overall robustness and accuracy as well as their sensitivity to certain algorithmic parameters. We present a comparative study of the two blood pressure computation methods in an experimental phantom setup, which mimics a simplified thoracic aorta. The comparative analysis includes the investigation of the impact of algorithmic parameters on the MRI-based blood pressure computation and the impact of extracting pressure maps in a voxel grid from the CFD simulations. Overall, a very good agreement between the results of the two computational approaches can be observed despite the fact that both methods used completely separate measurements as input data. Therefore, the comparative study of the presented work indicates that both non-invasive pressure computation methods show an excellent robustness and accuracy and can therefore be used for research purposes in the management of cardiovascular diseases.

  18. A new multicriteria risk mapping approach based on a multiattribute frontier concept.

    PubMed

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Downing, Marla; Sapio, Frank; Siltanen, Marty

    2013-09-01

    Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that typically require prior knowledge of the risk components' importance. Such information is often nonexistent for many invasive pests. This study proposes a new approach for building integrated risk maps using the principle of a multiattribute efficient frontier and analyzing the partial order of elements of a risk map as distributed in multidimensional criteria space. The integrated risks are estimated as subsequent multiattribute frontiers in dimensions of individual risk criteria. We demonstrate the approach with the example of Agrilus biguttatus Fabricius, a high-risk pest that may threaten North American oak forests in the near future. Drawing on U.S. and Canadian data, we compare the performance of the multiattribute ranking against a multicriteria linear weighted averaging technique in the presence of uncertainties, using the concept of robustness from info-gap decision theory. The results show major geographic hotspots where the consideration of tradeoffs between multiple risk components changes integrated risk rankings. Both methods delineate similar geographical regions of high and low risks. Overall, aggregation based on a delineation of multiattribute efficient frontiers can be a useful tool to prioritize risks for anticipated invasive pests, which usually have an extremely poor prior knowledge base. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
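
    The successive-frontier idea can be sketched as peeling off non-dominated points layer by layer in the criteria space. The risk values below are synthetic, and "higher = riskier" in every criterion is assumed.

    ```python
    # Sketch: rank map cells by successive multiattribute (Pareto) frontiers.
    import numpy as np

    def pareto_rank(points):
        """Assign each row a frontier rank (1 = non-dominated, i.e. highest risk)."""
        pts = np.asarray(points, dtype=float)
        rank = np.zeros(len(pts), dtype=int)
        remaining = np.arange(len(pts))
        level = 0
        while remaining.size:
            level += 1
            sub = pts[remaining]
            # A point is dominated if some other point is >= in every criterion
            # and strictly > in at least one.
            dominated = np.array([
                any(np.all(q >= p) and np.any(q > p) for q in sub)
                for p in sub
            ])
            rank[remaining[~dominated]] = level
            remaining = remaining[dominated]
        return rank

    risk = [[0.9, 0.8], [0.2, 0.1], [0.8, 0.9], [0.1, 0.3]]
    print(pareto_rank(risk))  # -> [1 2 1 2]
    ```

    Unlike weighted averaging, this ranking needs no prior weights on the criteria, which is the point the abstract makes for poorly characterized pests.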

  19. HAGR-D: A Novel Approach for Gesture Recognition with Depth Maps

    PubMed Central

    Santos, Diego G.; Fernandes, Bruno J. T.; Bezerra, Byron L. D.

    2015-01-01

    The hand is an important part of the body used to express information through gestures, and its movements can be used in dynamic gesture recognition systems based on computer vision with practical applications, such as medical, games and sign language. Although depth sensors have led to great progress in gesture recognition, hand gesture recognition still is an open problem because of its complexity, which is due to the large number of small articulations in a hand. This paper proposes a novel approach for hand gesture recognition with depth maps generated by the Microsoft Kinect Sensor (Microsoft, Redmond, WA, USA) using a variation of the CIPBR (convex invariant position based on RANSAC) algorithm and a hybrid classifier composed of dynamic time warping (DTW) and hidden Markov models (HMM), called the hybrid approach for gesture recognition with depth maps (HAGR-D). The experiments show that the proposed model outperforms other algorithms presented in the literature on hand gesture recognition tasks, achieving a classification rate of 97.49% on the MSRGesture3D dataset and 98.43% on the RPPDI dynamic gesture dataset. PMID:26569262

  20. In Silico Design of Human IMPDH Inhibitors Using Pharmacophore Mapping and Molecular Docking Approaches

    PubMed Central

    Li, Rui-Juan; Wang, Ya-Li; Wang, Qing-He; Wang, Jian; Cheng, Mao-Sheng

    2015-01-01

    Inosine 5′-monophosphate dehydrogenase (IMPDH) is one of the crucial enzymes in the de novo biosynthesis of guanosine nucleotides. It has served as an attractive target in immunosuppressive, anticancer, antiviral, and antiparasitic therapeutic strategies. In this study, pharmacophore mapping and molecular docking approaches were employed to discover novel Homo sapiens IMPDH (hIMPDH) inhibitors. The Güner-Henry (GH) scoring method was used to evaluate the quality of generated pharmacophore hypotheses. One of the generated pharmacophore hypotheses was found to possess a GH score of 0.67. Ten potential compounds were selected from the ZINC database using the pharmacophore mapping approach and docked into the IMPDH active site. We find two hits (i.e., ZINC02090792 and ZINC00048033) that match the optimal pharmacophore features used in this investigation well and form interactions with key residues of IMPDH. We propose these two hits as lead compounds for the development of novel hIMPDH inhibitors. PMID:25784957

  1. HAGR-D: A Novel Approach for Gesture Recognition with Depth Maps.

    PubMed

    Santos, Diego G; Fernandes, Bruno J T; Bezerra, Byron L D

    2015-11-12

    The hand is an important part of the body used to express information through gestures, and its movements can be used in dynamic gesture recognition systems based on computer vision with practical applications, such as medical, games and sign language. Although depth sensors have led to great progress in gesture recognition, hand gesture recognition still is an open problem because of its complexity, which is due to the large number of small articulations in a hand. This paper proposes a novel approach for hand gesture recognition with depth maps generated by the Microsoft Kinect Sensor (Microsoft, Redmond, WA, USA) using a variation of the CIPBR (convex invariant position based on RANSAC) algorithm and a hybrid classifier composed of dynamic time warping (DTW) and hidden Markov models (HMM), called the hybrid approach for gesture recognition with depth maps (HAGR-D). The experiments show that the proposed model outperforms other algorithms presented in the literature on hand gesture recognition tasks, achieving a classification rate of 97.49% on the MSRGesture3D dataset and 98.43% on the RPPDI dynamic gesture dataset.
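
    One half of the hybrid classifier, dynamic time warping, is compact enough to sketch in full. A pure-Python version for 1-D sequences follows (real gesture descriptors are higher-dimensional, but the recurrence is the same):

    ```python
    # Minimal dynamic time warping (DTW) with absolute-difference cost.
    def dtw(a, b):
        """Classic O(len(a)*len(b)) DTW distance between two sequences."""
        INF = float("inf")
        n, m = len(a), len(b)
        D = [[INF] * (m + 1) for _ in range(n + 1)]
        D[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                # Best of insertion, deletion, or match step.
                D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
        return D[n][m]

    print(dtw([0, 1, 2, 3], [0, 1, 1, 2, 3]))  # -> 0.0 (same shape, time-stretched)
    print(dtw([0, 1, 2, 3], [3, 2, 1, 0]))     # larger for a reversed sequence
    ```

    The invariance to time-stretching is what makes DTW suited to gestures performed at varying speeds; the HMM side of the hybrid then models the gesture's state dynamics.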

  2. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments, needed to correctly understand the strength and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainty, to better understand how best to use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties.
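
    The tiered idea, a conservative deterministic screen followed by a probabilistic refinement, can be sketched with synthetic intake data. All distributions, values and units below are illustrative assumptions, not from any guidance document.

    ```python
    # Sketch: tier-1 deterministic screening vs tier-2 probabilistic refinement.
    import random

    random.seed(0)
    body_weight_kg = 60.0

    # Tier 1: point estimate with deliberately conservative inputs.
    p95_consumption_g = 150.0   # assumed high-end daily consumption
    max_concentration = 0.8     # assumed maximum residue, mg/g
    screening = p95_consumption_g * max_concentration / body_weight_kg

    # Tier 2: propagate variability in consumption and concentration.
    samples = [
        random.gauss(80.0, 25.0) * random.uniform(0.1, 0.8) / body_weight_kg
        for _ in range(10_000)
    ]
    refined_p95 = sorted(samples)[int(0.95 * len(samples))]
    print(f"screening: {screening:.2f}  probabilistic P95: {refined_p95:.2f} mg/kg bw/day")
    ```

    The screening estimate stacks worst-case inputs and so sits above the probabilistic P95; the mapping exercise in the paper is about which uncertainties each tier does and does not capture.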

  3. Coriolis effects on rotating Hele-Shaw flows: a conformal-mapping approach.

    PubMed

    Miranda, José A; Gadêlha, Hermes; Dorsey, Alan T

    2010-12-01

    The zero surface tension fluid-fluid interface dynamics in a radial Hele-Shaw cell driven by both injection and rotation is studied by a conformal-mapping approach. The situation in which one of the fluids is inviscid and has negligible density is analyzed. When Coriolis force effects are ignored, exact solutions of the zero surface tension rotating Hele-Shaw problem with injection reveal suppression of cusp singularities for sufficiently high rotation rates. We study how the Coriolis force affects the time-dependent solutions of the problem, and the development of finite time singularities. By employing Richardson's harmonic moments approach we obtain conformal maps which describe the time evolution of the fluid boundary. Our results demonstrate that the inertial Coriolis contribution plays an important role in determining the time for cusp formation. Moreover, it introduces a phase drift that makes the evolving patterns rotate. The Coriolis force acts against centrifugal effects, promoting (inhibiting) cusp breakdown if the more viscous and dense fluid lies outside (inside) the interface. Despite the presence of Coriolis effects, the occurrence of finger bending events has not been detected in the exact solutions.

  4. Multi-scale hierarchical approach for parametric mapping: assessment on multi-compartmental models.

    PubMed

    Rizzo, G; Turkheimer, F E; Bertoldo, A

    2013-02-15

    This paper investigates a new hierarchical method to apply basis functions to mono- and multi-compartmental models (Hierarchical Basis Function Method, H-BFM) at the voxel level. This method identifies the parameters of the compartmental model in its nonlinearized version, integrating information derived at the region of interest (ROI) level by segmenting the cerebral volume based on anatomical definition or functional clustering. We present the results obtained by using a two-tissue, four-rate-constant model with two different tracers ([(11)C]FLB457 and [carbonyl-(11)C]WAY100635), one of the most complex models used in receptor studies, especially at the voxel level. H-BFM is robust, and its application to both [(11)C]FLB457 and [carbonyl-(11)C]WAY100635 allows accurate and precise parameter estimates, good-quality parametric maps and a low percentage of voxels out of physiological bounds (<8%). The computational time depends on the number of basis functions selected and can be compatible with clinical use (~6 h for a single-subject analysis). The novel method is a robust approach for PET quantification using compartmental modeling at the voxel level. In particular, unlike other proposed approaches, this method can also be used when the linearization of the model is not appropriate. We expect that applying it to clinical data will generate reliable parametric maps. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Pooling sexes when assessing ground reaction forces during walking: Statistical Parametric Mapping versus traditional approach.

    PubMed

    Castro, Marcelo P; Pataky, Todd C; Sole, Gisela; Vilas-Boas, Joao Paulo

    2015-07-16

    Ground reaction force (GRF) data from men and women are commonly pooled for analyses. However, it may not be justifiable to pool sexes on the basis of discrete parameters extracted from continuous GRF gait waveforms, because this can miss continuous effects. Forty healthy participants (20 men and 20 women) walked at a cadence of 100 steps per minute across two force plates that recorded GRFs. Two statistical methods were used to test the null hypothesis of no mean GRF differences between sexes: (i) Statistical Parametric Mapping, using the entire three-component GRF waveform; and (ii) the traditional approach, using the first and second vertical GRF peaks. Statistical Parametric Mapping results suggested large sex differences, which post-hoc analyses attributed predominantly to higher anterior-posterior and vertical GRFs in early stance in women compared to men. The traditional approach showed statistically significant differences for the first GRF peak and similar values for the second. These contrasting results emphasise that different parts of the waveform have different signal strengths, and that the traditional approach, by relying on arbitrarily chosen metrics, can lead to arbitrary conclusions. We suggest that researchers and clinicians consider both the entire gait waveforms and sex-specificity when analysing GRF data. Copyright © 2015 Elsevier Ltd. All rights reserved.
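
    The two discrete metrics used by the traditional approach above can be illustrated with a short sketch. The function and the M-shaped stance curve below are hypothetical, for illustration only; real GRF processing works on force-plate time series sampled at hundreds of Hz.

```python
def vertical_grf_peaks(waveform):
    """Return the first and second local maxima of a vertical GRF
    waveform (the two discrete peaks used by the traditional approach)."""
    peaks = [waveform[i] for i in range(1, len(waveform) - 1)
             if waveform[i - 1] < waveform[i] >= waveform[i + 1]]
    if len(peaks) < 2:
        raise ValueError("expected an M-shaped waveform with two peaks")
    return peaks[0], peaks[1]

# Synthetic M-shaped stance-phase curve (units of body weight), toy data.
grf = [0.0, 0.5, 1.1, 1.2, 1.0, 0.8, 0.75, 0.8, 1.0, 1.1, 0.6, 0.0]
p1, p2 = vertical_grf_peaks(grf)
```

    Reducing the waveform to `p1` and `p2` discards everything between the peaks, which is exactly the information Statistical Parametric Mapping retains.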

  6. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    NASA Astrophysics Data System (ADS)

    Dhindsa, Harkirat S.; Makarimi-Kasim; Roger Anderson, O.

    2011-04-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent to which a constructivist learning environment (CLE) was created in their classes. The sample of the study consisted of six classes (140 Form 3 students aged 13-15 years) selected from a typical coeducational school in Brunei. Three classes (40 boys and 30 girls) were taught using the TTA, while three other classes (41 boys and 29 girls) used the CMA, enriched with PowerPoint presentations. After the interventions (lessons on magnetism), the students in both groups were asked to describe in writing their understanding of magnetism accrued from the lessons. Their written descriptions were analyzed using flow map analyses to assess their content knowledge and its organisation in memory as evidence of cognitive structure. The extent of the CLE was measured using a published CLE survey. The results showed that the cognitive structures of the CMA students were more extensive, more thematically organised and richer in interconnectedness of thoughts than those of the TTA students. Moreover, CMA students also perceived their classroom learning environment to be more constructivist than did their TTA counterparts. It is, therefore, recommended that teachers consider using the CMA teaching technique to help students enrich their understanding, especially for more complex or abstract scientific content.

  7. Mapping CO2 emission in highly urbanized region using standardized microbial respiration approach

    NASA Astrophysics Data System (ADS)

    Vasenev, V. I.; Stoorvogel, J. J.; Ananyeva, N. D.

    2012-12-01

    Urbanization is a major recent land-use change pathway. Land conversion to urban use has a tremendous and still unclear effect on soil cover and functions. Urban soil can act as a carbon source, although its potential for CO2 emission is also very high. The main challenge in analysing and mapping soil organic carbon (SOC) in the urban environment is its high spatial heterogeneity and temporal dynamics. The urban environment provides a number of specific features and processes that influence soil formation and functioning and result in a unique spatial variability of carbon stocks and fluxes over short distances. Soil sealing, functional zoning, settlement age and size are the predominant factors distinguishing the heterogeneity of urban soil carbon. The combination of these factors creates a great number of contrasting clusters with abrupt borders, which is very difficult to account for in regional assessment and mapping of SOC stocks and soil CO2 emission. Most of the existing approaches to measure CO2 emission in field conditions (eddy covariance, soil chambers) are very sensitive to soil moisture and temperature conditions. They require a long-term sampling set during the season in order to obtain relevant results. This makes them inapplicable for the analysis of CO2 emission spatial variability at the regional scale. Soil respiration (SR) measurement under standardized lab conditions makes it possible to overcome this difficulty. SR is the predominant outgoing carbon flux, including autotrophic respiration of plant roots and heterotrophic respiration of soil microorganisms. Microbiota are responsible for 50-80% of total soil carbon outflow. The microbial respiration (MR) approach provides an integral measure of CO2 emission, characterizing microbial CO2 production under optimal conditions and thus independent of initial differences in soil temperature and moisture. The current study aimed to combine digital soil mapping (DSM) techniques with the standardized microbial respiration approach in order to analyse and

  8. Pixel-based flood mapping from SAR imagery: a comparison of approaches

    NASA Astrophysics Data System (ADS)

    Landuyt, Lisa; Van Wesemael, Alexandra; Van Coillie, Frieke M. B.; Verhoest, Niko E. C.

    2017-04-01

    Due to their all-weather, day and night capabilities, SAR sensors have been shown to be particularly suitable for flood mapping applications. Thus, they can provide spatially distributed flood extent data which are valuable for calibrating, validating and updating flood inundation models. These models are an invaluable tool for water managers to take appropriate measures in times of high water levels. Image analysis approaches to delineate flood extent on SAR imagery are numerous. They can be classified into two categories, i.e. pixel-based and object-based approaches. Pixel-based approaches, e.g. thresholding, are abundant and in general computationally inexpensive. However, large discrepancies between these techniques exist, and subjective user intervention is often needed. Object-based approaches require more processing but allow for the integration of additional object characteristics, like contextual information and object geometry, and thus have significant potential to provide an improved classification result. As a benchmark, a selection of pixel-based techniques is applied to an ERS-2 SAR image of the 2006 flood event of the River Dee, United Kingdom. This selection comprises Otsu thresholding, Kittler & Illingworth thresholding, the Fine To Coarse segmentation algorithm and active contour modelling. The different classification results are evaluated and compared by means of several accuracy measures, including binary performance measures.
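
    Of the pixel-based techniques listed above, Otsu thresholding is the simplest to sketch: it picks the grey level that maximizes the between-class variance of a bimodal histogram, open water appearing dark in SAR backscatter. The toy pixel values below are hypothetical and stand in for an actual calibrated SAR image.

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: choose the grey level that maximizes the
    between-class variance of the intensity histogram."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b = 0.0               # cumulative intensity of the background class
    w_b = 0                   # background pixel count
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b     # foreground pixel count
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy "backscatter" values: dark open water vs brighter land.
pixels = [10, 12, 11, 13, 12, 200, 205, 210, 198, 202]
t = otsu_threshold(pixels)
flood_mask = [p <= t for p in pixels]   # dark pixels flagged as water
```

    The histogram pass makes the method computationally cheap, which is why thresholding dominates pixel-based flood mapping despite its sensitivity to non-bimodal scenes.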

  9. A stochastic approach to estimate the uncertainty of dose mapping caused by uncertainties in b-spline registration

    SciTech Connect

    Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P.

    2012-04-15

    Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging is becoming clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the total dose actually delivered to the patient. However, uncertainties in the mapping of the images can translate into errors in the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution to be mapped. Its performance is demonstrated in the context of dose mapping based on b-spline registration. It is based on evaluating the sensitivity of the dose mapping to variations of the b-spline coefficients, combined with evaluating the sensitivity of the registration metric to the same variations. It was evaluated on patient data deformed according to a breathing model, for which the ground truth of the deformation, and hence the actual dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in the context of other mapping tasks as well.

  10. A machine learning approach for the identification of protein secondary structure elements from electron cryo-microscopy density maps.

    PubMed

    Si, Dong; Ji, Shuiwang; Nasr, Kamal Al; He, Jing

    2012-09-01

    The accuracy of secondary structure element (SSE) identification from volumetric protein density maps is critical for de novo backbone structure derivation in electron cryo-microscopy (cryoEM). It is still challenging to detect SSEs automatically and accurately from density maps at medium resolutions (∼5-10 Å). We present a machine learning approach, SSELearner, to automatically identify helices and β-sheets by using the knowledge from existing volumetric maps in the Electron Microscopy Data Bank. We tested our approach using 10 simulated density maps. The averaged specificity and sensitivity for helix detection are 94.9% and 95.8%, respectively, and those for β-sheet detection are 86.7% and 96.4%, respectively. We have developed a secondary structure annotator, SSID, to predict the helices and β-strands from the backbone Cα trace. With the help of SSID, we tested our SSELearner using 13 experimentally derived cryoEM density maps. The machine learning approach shows specificity and sensitivity of 91.8% and 74.5%, respectively, for helix detection and 85.2% and 86.5%, respectively, for β-sheet detection in cryoEM maps from the Electron Microscopy Data Bank. The reduced detection accuracy reveals the challenges of SSE detection when cryoEM maps are used instead of simulated maps. Our results suggest that it is effective to use one cryoEM map for learning to detect the SSE in another cryoEM map of similar quality. Copyright © 2012 Wiley Periodicals, Inc.
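
    The specificity and sensitivity figures quoted above follow the usual per-voxel definitions; a minimal sketch with hypothetical voxel labels (not the SSELearner code):

```python
def sensitivity_specificity(predicted, actual):
    """Per-voxel detection metrics: sensitivity = TP/(TP+FN),
    specificity = TN/(TN+FP)."""
    tp = sum(p and a for p, a in zip(predicted, actual))
    tn = sum(not p and not a for p, a in zip(predicted, actual))
    fp = sum(p and not a for p, a in zip(predicted, actual))
    fn = sum(not p and a for p, a in zip(predicted, actual))
    return tp / (tp + fn), tn / (tn + fp)

# Toy per-voxel helix labels (True = helix), for illustration only.
actual    = [True, True, True, False, False, False, False, True]
predicted = [True, True, False, False, False, True, False, True]
sens, spec = sensitivity_specificity(predicted, actual)
```

    High sensitivity with lower specificity (or vice versa) is exactly the trade-off reported when moving from simulated to experimental maps.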

  11. A MapReduce approach to diminish imbalance parameters for big deoxyribonucleic acid dataset.

    PubMed

    Kamal, Sarwar; Ripon, Shamim Hasnat; Dey, Nilanjan; Ashour, Amira S; Santhi, V

    2016-07-01

    In the age of the information superhighway, big data play a significant role in information processing, extraction, retrieval and management. In computational biology, the continuous challenge is to manage the biological data. Conventional data mining techniques are sometimes inadequate for the new space and time requirements. Thus, it is critical to process massive amounts of data to retrieve knowledge. The existing software and automated tools to handle big data sets are not sufficient. As a result, an expandable mining technique that exploits the large storage and processing capability of distributed or parallel processing platforms is essential. In this analysis, a contemporary distributed clustering methodology for imbalance data reduction using a k-nearest neighbor (K-NN) classification approach is introduced. The pivotal objective of this work is to represent real training data sets with a reduced number of elements or instances. These reduced data sets will ensure faster data classification and standard storage management with less sensitivity. However, general data reduction methods cannot manage very big data sets. To minimize these difficulties, a MapReduce-oriented framework is designed using various clusters of automated contents, comprising multiple algorithmic approaches. To test the proposed approach, a real DNA (deoxyribonucleic acid) dataset that consists of 90 million pairs has been used. The proposed model reduces imbalanced data sets derived from large-scale data sets without loss of accuracy. The obtained results show that the MapReduce-based K-NN classifier provided accurate results for big DNA data. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
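
    The map/reduce split for K-NN classification described above can be sketched as follows: each mapper emits the k nearest candidates from its own data partition, and the reducer merges them and takes a majority vote. The 1-D features, class labels and partitioning below are hypothetical stand-ins for the DNA dataset, not the authors' framework.

```python
import heapq

def knn_map(partition, query, k):
    """Map step: each partition emits its k closest labelled points."""
    return heapq.nsmallest(k, partition,
                           key=lambda item: abs(item[0] - query))

def knn_reduce(partials, query, k):
    """Reduce step: merge partial candidates, keep the global k nearest,
    then majority-vote on their labels."""
    merged = heapq.nsmallest(k, (c for part in partials for c in part),
                             key=lambda item: abs(item[0] - query))
    labels = [label for _, label in merged]
    return max(set(labels), key=labels.count)

# Toy 1-D "feature" per record with a class label, split over 2 partitions.
part_a = [(0.1, "A"), (0.4, "B"), (0.35, "A")]
part_b = [(0.9, "B"), (0.32, "A"), (0.5, "B")]
query = 0.3
partials = [knn_map(p, query, k=3) for p in (part_a, part_b)]
label = knn_reduce(partials, query, k=3)
```

    Because each mapper returns at most k candidates, the reducer's input stays small regardless of partition size, which is what makes the scheme scale to very large data sets.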

  12. Mapping of Protein–Protein Interaction Sites by the ‘Absence of Interference’ Approach

    PubMed Central

    Dhayalan, Arunkumar; Jurkowski, Tomasz P.; Laser, Heike; Reinhardt, Richard; Jia, Da; Cheng, Xiaodong; Jeltsch, Albert

    2008-01-01

    Protein–protein interactions are critical to most biological processes, and locating protein–protein interfaces on protein structures is an important task in molecular biology. We developed a new experimental strategy called the ‘absence of interference’ approach to determine surface residues involved in protein–protein interaction of established yeast two-hybrid pairs of interacting proteins. One of the proteins is subjected to high-level randomization by error-prone PCR. The resulting library is selected by the yeast two-hybrid system for interacting clones, which are isolated and sequenced. The interaction region can be identified by an absence or depletion of mutations. For data analysis and presentation, we developed a Web interface that analyzes the mutational spectrum and displays the mutational frequency on the surface of the structure (or a structural model) of the randomized protein. Additionally, this interface might be of use for the display of mutational distributions determined by other types of random mutagenesis experiments. We applied the approach to map the interface of the catalytic domain of the DNA methyltransferase Dnmt3a with its regulatory factor Dnmt3L. Dnmt3a was randomized with a high mutational load. A total of 76 interacting clones were isolated and sequenced, and 648 mutations were identified. The mutational pattern allowed us to identify a unique interaction region on the surface of Dnmt3a, which comprises about 500-600 Å2. The results were confirmed by site-directed mutagenesis and structural analysis. The absence-of-interference approach will allow high-throughput mapping of protein interaction sites suitable for functional studies and protein docking. PMID:18191145

  13. Divide and Conquer Approach to Contact Map Overlap Problem Using 2D-Pattern Mining of Protein Contact Networks.

    PubMed

    Koneru, Suvarna Vani; Bhavani, Durga S

    2015-01-01

    A novel approach to the Contact Map Overlap (CMO) problem is proposed using the two-dimensional clusters present in the contact maps. Each protein is represented as a set of the non-trivial clusters of contacts extracted from its contact map. The approach involves finding matching regions between the two contact maps using an approximate 2D-pattern matching algorithm and a dynamic programming technique. These matched pairs of small contact maps are submitted in parallel to a fast heuristic CMO algorithm. The approach facilitates parallelization at this level, since all the pairs of contact maps can be submitted to the algorithm in parallel. Then, a merge algorithm is used to obtain the overall alignment. As a proof of concept, MSVNS, a heuristic CMO algorithm, is used for global as well as local alignment. The divide and conquer approach is evaluated on two benchmark data sets, those of Skolnick and Ding et al. It is interesting to note that along with saving time, better overlap is also obtained for certain protein folds.
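
    The quantity that CMO heuristics such as MSVNS maximize is the number of contacts preserved under a residue alignment. A minimal sketch with hypothetical contact sets and a hypothetical alignment (not the authors' implementation):

```python
def contact_overlap(contacts_a, contacts_b, alignment):
    """Count contacts of protein A whose aligned residue pair is also
    a contact in protein B (the overlap that CMO maximizes)."""
    mapped = {(alignment[i], alignment[j])
              for i, j in contacts_a
              if i in alignment and j in alignment}
    return len(mapped & set(contacts_b))

# Hypothetical contact sets (residue-index pairs) and residue alignment.
a = {(1, 5), (2, 6), (3, 7)}
b = {(11, 15), (12, 16), (13, 18)}
align = {1: 11, 2: 12, 3: 13, 5: 15, 6: 16, 7: 17}
score = contact_overlap(a, b, align)
```

    The divide-and-conquer scheme above evaluates this score on matched pairs of small sub-maps in parallel before merging the partial alignments.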

  14. Rotating boson stars in five dimensions

    SciTech Connect

    Hartmann, Betti; Kleihaus, Burkhard; Kunz, Jutta; List, Meike

    2010-10-15

    We study rotating boson stars in five spacetime dimensions. The boson fields consist of a complex doublet scalar field. Considering boson stars rotating in two orthogonal planes with both angular momenta of equal magnitude, a special ansatz for the boson field and the metric allows for solutions with nontrivial dependence on the radial coordinate only. The charge of the scalar field equals the sum of the angular momenta. The rotating boson stars are globally regular and asymptotically flat. For our choice of a sextic potential, the rotating boson star solutions possess a flat spacetime limit. We study the solutions in flat and curved spacetime.

  15. Analytic boosted boson discrimination

    SciTech Connect

    Larkoski, Andrew J.; Moult, Ian; Neill, Duff

    2016-05-20

    Observables which discriminate boosted topologies from massive QCD jets are of great importance for the success of the jet substructure program at the Large Hadron Collider. Such observables, while both widely and successfully used, have been studied almost exclusively with Monte Carlo simulations. In this paper we present the first all-orders factorization theorem for a two-prong discriminant based on a jet shape variable, D2, valid for both signal and background jets. Our factorization theorem simultaneously describes the production of both collinear and soft subjets, and we introduce a novel zero-bin procedure to correctly describe the transition region between these limits. By proving an all orders factorization theorem, we enable a systematically improvable description, and allow for precision comparisons between data, Monte Carlo, and first principles QCD calculations for jet substructure observables. Using our factorization theorem, we present numerical results for the discrimination of a boosted Z boson from massive QCD background jets. We compare our results with Monte Carlo predictions which allows for a detailed understanding of the extent to which these generators accurately describe the formation of two-prong QCD jets, and informs their usage in substructure analyses. In conclusion, our calculation also provides considerable insight into the discrimination power and calculability of jet substructure observables in general.

  17. The land morphology approach to flood risk mapping: An application to Portugal.

    PubMed

    Cunha, N S; Magalhães, M R; Domingos, T; Abreu, M M; Küpfer, C

    2017-05-15

    In the last decades, the increasing vulnerability of floodplains has been linked to societal changes such as population density growth, land use changes and water use patterns, among other factors. Land morphology directly influences surface water flow, sediment transport, soil genesis, local climate and vegetation distribution. Therefore, land morphology, land use and management directly influence the genesis of flood risk. However, attention is not always given to the underlying geomorphological and ecological processes that influence the dynamics of rivers and their floodplains. Floodplains are considered part of a larger system called the Wet System (WS). The WS includes permanent and temporary streams, water bodies, wetlands and valley bottoms. Valley bottom is a broad concept which encompasses not only floodplains but also flat and concave areas, contiguous to streams, in which the slope is less than 5%. This is addressed through a consistent method based on a land morphology approach that classifies landforms according to their hydrological position in the watershed. The method is based on flat areas (slopes less than 5%), surface curvature and hydrological features. The comparison between the WS and flood risk data from the Portuguese Environmental Agency for the main rivers of mainland Portugal showed that in downstream areas of watersheds, valley bottoms coincide with floodplains modelled by hydrological methods. Mapping the WS is of particular interest for analysing the position and function of river ecosystems in the landscape, from upstream to downstream areas of the watershed. This morphological approach is less data-demanding and time-consuming than hydrological methods and can be used for the preliminary delimitation of floodplains and potential flood risk areas where no hydrological data are available. The results were also compared with the land use/cover map at a national level and detailed in Trancão river basin, located in Lisbon
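
    The valley-bottom definition above (flat cells with slope below 5%, contiguous to streams) can be sketched as a flood fill on a slope grid. The grid, stream mask and 4-neighbour contiguity rule below are illustrative assumptions, not the authors' GIS workflow.

```python
def valley_bottom_mask(slope_pct, stream_mask):
    """Flag flat cells (slope < 5%) connected to a stream cell,
    a simplified reading of the valley-bottom definition using
    4-neighbour contiguity on a raster grid."""
    rows, cols = len(slope_pct), len(slope_pct[0])
    flat = [[slope_pct[r][c] < 5.0 for c in range(cols)] for r in range(rows)]
    # Seed with flat stream cells, then flood-fill through flat neighbours.
    stack = [(r, c) for r in range(rows) for c in range(cols)
             if stream_mask[r][c] and flat[r][c]]
    bottom = [[False] * cols for _ in range(rows)]
    while stack:
        r, c = stack.pop()
        if bottom[r][c]:
            continue
        bottom[r][c] = True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and flat[nr][nc]:
                stack.append((nr, nc))
    return bottom

# Toy 3x4 slope grid (%) with a stream along the middle row.
slope = [[12.0, 8.0, 6.0, 9.0],
         [2.0, 1.0, 3.0, 4.0],
         [4.5, 7.0, 11.0, 2.0]]
stream = [[False] * 4, [True] * 4, [False] * 4]
mask = valley_bottom_mask(slope, stream)
```

    Only terrain inputs (slope, stream network) are needed, which is why the approach can run where no hydrological data are available.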

  18. Mapping tissue inhomogeneity in acute myocarditis: a novel analytical approach to quantitative myocardial edema imaging by T2-mapping.

    PubMed

    Baeßler, Bettina; Schaarschmidt, Frank; Dick, Anastasia; Stehning, Christian; Schnackenburg, Bernhard; Michels, Guido; Maintz, David; Bunck, Alexander C

    2015-12-23

    The purpose of the present study was to investigate the diagnostic value of T2-mapping in acute myocarditis (ACM) and to define cut-off values for edema detection. Cardiovascular magnetic resonance (CMR) data of 31 patients with ACM were retrospectively analyzed; 30 healthy volunteers (HV) served as controls. In addition to the routine CMR protocol, T2-mapping data were acquired at 1.5 T using a breathhold Gradient-Spin-Echo T2-mapping sequence in six short-axis slices. T2-maps were segmented according to the 16-segment AHA model, and segmental T2 values as well as the segmental pixel standard deviation (SD) were analyzed. Mean differences in global myocardial T2 or pixel-SD between HV and ACM patients were small, lying within the normal range of HV. In contrast, the variation of segmental T2 values and pixel-SD was much larger in ACM patients than in HV. In random forests and multiple logistic regression analyses, the combination of the highest segmental T2 value within each patient (maxT2) and the mean absolute deviation (MAD) of the log-transformed pixel-SD (madSD) over all 16 segments within each patient proved to be the best discriminator between HV and ACM patients, with an AUC of 0.85 in ROC-analysis. In classification trees, a combined cut-off of 0.22 for madSD and of 68 ms for maxT2 resulted in 83% specificity and 81% sensitivity for detection of ACM. The proposed cut-off values for maxT2 and madSD in the setting of ACM allow edema detection with high sensitivity and specificity and therefore have the potential to overcome the hurdles of T2-mapping for its integration into clinical routine.
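
    The two discriminators and their cut-offs can be sketched as follows. The abstract does not spell out the exact topology of the classification tree, so the combined rule below (flag if either cut-off is exceeded) and the segmental pixel-SD values are assumptions for illustration only.

```python
import math

def madsd(segment_pixel_sd):
    """Mean absolute deviation of the log-transformed segmental pixel-SD
    over the 16 AHA segments (the heterogeneity index madSD)."""
    logs = [math.log(s) for s in segment_pixel_sd]
    mean = sum(logs) / len(logs)
    return sum(abs(x - mean) for x in logs) / len(logs)

def flag_acm(max_t2_ms, madsd_value, t2_cut=68.0, madsd_cut=0.22):
    """Assumed reading of the combined cut-off: flag acute myocarditis
    if either discriminator exceeds its threshold."""
    return max_t2_ms > t2_cut or madsd_value > madsd_cut

# Hypothetical segmental pixel-SDs for one patient: two "hot" segments.
pixel_sd = [3.0, 3.2, 2.9, 3.1, 3.0, 8.5, 3.1, 2.8,
            3.0, 3.3, 2.9, 9.0, 3.1, 3.0, 2.8, 3.2]
m = madsd(pixel_sd)
suspicious = flag_acm(max_t2_ms=71.0, madsd_value=m)
```

    Because madSD measures spread rather than level, it stays sensitive to patchy edema even when the global mean T2 lies within the normal range, which is the point made in the abstract.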

  19. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    PubMed Central

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC
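
    Of the three MCDA methods compared above, the weighted linear combination (WLC) is the simplest to sketch: each standardized criterion layer is multiplied by its weight and the products are summed per cell. The criterion values and weights below are hypothetical.

```python
def wlc_score(criteria, weights):
    """Weighted linear combination: susceptibility = sum(w_i * x_i),
    with criteria standardized to [0, 1] and weights summing to 1."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(w * x for w, x in zip(weights, criteria))

# Hypothetical standardized criteria for one raster cell:
# slope, lithology, land cover (weights e.g. AHP-derived, toy values).
cell = [0.8, 0.4, 0.6]
weights = [0.5, 0.3, 0.2]
score = wlc_score(cell, weights)
```

    Propagating weight uncertainty, as the article does with Monte Carlo simulation, amounts to re-evaluating this sum with weights drawn from the estimated probability density functions.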

  20. Mapping a near surface variable geologic regime using an integrated geophysical approach

    SciTech Connect

    Rogers, N.T.; Sandberg, S.K.; Miller, P.; Powell, G.

    1997-10-01

    An integrated geophysical approach involving seismic, electromagnetic, and electrical methods was employed to map fluvial, colluvial and bedrock geology, to delineate bedrock channels, and to determine fracture and joint orientations that may influence migration of petroleum hydrocarbons at the Glenrock Oil Seep. Both P (primary)-wave and S (shear)-wave seismic refraction techniques were used to map the bedrock surface topography, bedrock minima, stratigraphic boundaries, and possible structure. S-wave data were preferred because of better vertical resolution due to the combination of slower velocities and a lower-frequency wave train. Azimuthal resistivity/IP (induced polarization) and azimuthal electromagnetics were used to determine fracture orientations and groundwater flow directions. Terrain conductivity was used to map the fluvial sedimentary sequences (e.g., paleochannel and overbank deposits) in the North Platte River floodplain. Conductivity measurements were also used to estimate bedrock depth and to assist in the placement and recording parameters of the azimuthal soundings. The geophysical investigation indicated that groundwater flow pathways were controlled by the fluvial paleochannels and bedrock erosional features. The primary groundwater flow direction in the bedrock and colluvial sediments was determined from the azimuthal measurements and confirmed by drilling to be N20-40W, along the measured strike of the bedrock and joint orientations. Joint/fracture orientations were measured at N20-40W and N10-30E from the azimuthal data and confirmed from measurements at a bedrock outcrop south of the site. The bedrock has an apparent N10E anisotropy in the seismic velocity profiles on the old refinery property that closely matches the measured joint/fracture orientations and may have a minor effect on groundwater flow.

  2. Policy, Research and Residents’ Perspectives on Built Environments Implicated in Heart Disease: A Concept Mapping Approach

    PubMed Central

    Stankov, Ivana; Howard, Natasha J.; Daniel, Mark; Cargo, Margaret

    2017-01-01

    An underrepresentation of stakeholder perspectives within urban health research arguably limits our understanding of what is a multi-dimensional and complex relationship between the built environment and health. By engaging a wide range of stakeholders using a participatory concept mapping approach, this study aimed to achieve a more holistic and nuanced understanding of the built environments shaping disease risk, specifically cardiometabolic risk (CMR). Moreover, this study aimed to ascertain the importance and changeability of identified environments through government action. Through the concept mapping process, community members, researchers, government and non-government stakeholders collectively identified eleven clusters encompassing 102 built environmental domains related to CMR, a number of which are underrepresented within the literature. Among the identified built environments, open space, public transportation and pedestrian environments were highlighted as key targets for policy intervention. Whilst there was substantive convergence in stakeholder groups’ perspectives concerning the built environment and CMR, there were disparities in the level of importance government stakeholders and community members respectively assigned to pedestrian environments and street connectivity. These findings support the role of participatory methods in strengthening how urban health issues are understood and in affording novel insights into points of action for public health and policy intervention. PMID:28208786

  3. New approaches to high-resolution mapping of marine vertical structures.

    PubMed

    Robert, Katleen; Huvenne, Veerle A I; Georgiopoulou, Aggeliki; Jones, Daniel O B; Marsh, Leigh; Carter, Gareth D O; Chaumillon, Leo

    2017-08-21

    Vertical walls in marine environments can harbour high biodiversity and provide natural protection from bottom-trawling activities. However, traditional mapping techniques are usually restricted to down-looking approaches which cannot adequately replicate their 3D structure. We combined sideways-looking multibeam echosounder (MBES) data from an AUV, forward-looking MBES data from ROVs and ROV-acquired videos to examine walls from Rockall Bank and Whittard Canyon, Northeast Atlantic. High-resolution 3D point clouds were extracted from each sonar dataset and structure from motion photogrammetry (SfM) was applied to recreate 3D representations of video transects along the walls. With these reconstructions, it was possible to interact with extensive sections of video footage and precisely position individuals. Terrain variables were derived on scales comparable to those experienced by megabenthic individuals. These were used to show differences in environmental conditions between observed and background locations as well as explain spatial patterns in ecological characteristics. In addition, since the SfM 3D reconstructions retained colours, they were employed to separate and quantify live coral colonies versus dead framework. The combination of these new technologies allows us, for the first time, to map the physical 3D structure of previously inaccessible habitats and demonstrates the complexity and importance of vertical structures.

  4. REASONS FOR ELECTRONIC CIGARETTE USE BEYOND CIGARETTE SMOKING CESSATION: A CONCEPT MAPPING APPROACH

    PubMed Central

    Soule, Eric K.; Rosas, Scott R.; Nasim, Aashir

    2016-01-01

    Introduction Electronic cigarettes (ECIGs) continue to grow in popularity; however, limited research has examined reasons for ECIG use. Methods This study used an integrated, mixed-method participatory research approach called concept mapping (CM) to characterize and describe adults’ reasons for using ECIGs. A total of 108 adults completed a multi-module online CM study that consisted of brainstorming statements about their reasons for ECIG use, sorting each statement into conceptually similar categories, and then rating each statement based on whether it represented a reason why they had used an ECIG in the past month. Results Participants brainstormed a total of 125 unique statements related to their reasons for ECIG use. Multivariate analyses generated a map revealing 11 interrelated components or domains that characterized their reasons for use. Importantly, reasons related to Cessation Methods, Perceived Health Benefits, Private Regard, Convenience and Conscientiousness were rated significantly higher than other categories/types of reasons related to ECIG use (p<.05). There were also significant model differences in participants’ endorsement of reasons based on their demography and ECIG behaviors. Conclusions This study shows that ECIG users are motivated to use ECIGs for many reasons. ECIG regulations should address these reasons for ECIG use in addition to smoking cessation. PMID:26803400

  5. A computational approach to map nucleosome positions and alternative chromatin states with base pair resolution

    PubMed Central

    Zhou, Xu; Blocker, Alexander W; Airoldi, Edoardo M; O'Shea, Erin K

    2016-01-01

    Understanding chromatin function requires knowing the precise location of nucleosomes. MNase-seq methods have been widely applied to characterize nucleosome organization in vivo, but generally lack the accuracy to determine the precise nucleosome positions. Here we develop a computational approach leveraging digestion variability to determine nucleosome positions at a base-pair resolution from MNase-seq data. We generate a variability template as a simple error model for how MNase digestion affects the mapping of individual nucleosomes. Applied to both yeast and human cells, this analysis reveals that alternatively positioned nucleosomes are prevalent and create significant heterogeneity in a cell population. We show that the periodic occurrences of dinucleotide sequences relative to nucleosome dyads can be directly determined from genome-wide nucleosome positions from MNase-seq. Alternatively positioned nucleosomes near transcription start sites likely represent different states of promoter nucleosomes during transcription initiation. Our method can be applied to map nucleosome positions in diverse organisms at base-pair resolution. DOI: http://dx.doi.org/10.7554/eLife.16970.001 PMID:27623011

  6. Mapping the distribution of neuroepithelial bodies of the rat lung. A whole-mount immunohistochemical approach.

    PubMed Central

    Avadhanam, K. P.; Plopper, C. G.; Pinkerton, K. E.

    1997-01-01

    We report an immunohistochemical method for mapping the distribution of neuroepithelial bodies (NEBs) in whole-mount preparations of the intrapulmonary airways. The lungs of 8- and 50-day-old male Sprague-Dawley rats were fixed with ethanol-acetic acid by intratracheal instillation. The major axial airway path of the infracardiac lobe was exposed and isolated by microdissection. NEBs were identified by calcitonin gene-related peptide immunoreactivity and their distribution mapped by generation and branch-point number. A distinct pattern was noted with greater prevalence of NEBs in proximal airway generations compared with more distal airways. No significant difference was noted in the distribution pattern or absolute number of NEBs between neonates and adults when compared by airway generation. NEBs were found more frequently on the ridges of the bifurcation than in other regions of the bifurcating airway wall. The ease of identification of total numbers of NEBs and their specific location by airway generation in whole-mount preparations of the bronchial tree completely removes the necessity of examining multiple sections and performing extensive morphometric procedures. Whole-mount airway preparations allow for the analysis and comparison of larger sample sizes per experimental group without labor-intensive approaches. The application of this method should enhance our knowledge of the role of NEBs in lung development and in response to disease. PMID:9060823

  7. Object-based approach to national land cover mapping using HJ satellite imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Li, Xiaosong; Yuan, Quanzhi; Liu, Yu

    2014-01-01

    To meet the needs of carbon storage estimation in ecosystems for a national carbon strategy, we introduce a consistent database of China land cover. The Chinese Huan Jing (HJ) satellite has proven efficient in the cloud-free acquisition of seasonal image series in a monsoon region and in vegetation identification for mesoscale land cover mapping. Thirty-eight classes of level II land cover are generated based on the Land Cover Classification System of the United Nations Food and Agriculture Organization, which follows a standard and quantitative definition. Twenty-four layers of derivative spectral, environmental, and spatial features compose the classification database. An object-based approach characterizing additional nonspectral features is applied throughout the mapping, and multiscale segmentation is used to match object boundaries to real-world conditions. This method fully exploits spatial information, in addition to spectral characteristics, to improve classification accuracy. A hierarchical classification algorithm is employed, following step-by-step procedures that effectively control classification quality. The algorithm is divided into dual structures of universal and local trees: consistent universal trees suitable for most regions are applied first, followed by local trees that depend on the specific features of nine climate stratifications. The independent validation indicates the overall accuracy reaches 86%.

  8. A new LPV modeling approach using PCA-based parameter set mapping to design a PSS.

    PubMed

    Jabali, Mohammad B Abolhasani; Kazemi, Mohammad H

    2017-01-01

    This paper presents a new methodology for the modeling and control of power systems based on an uncertain polytopic linear parameter-varying (LPV) approach using parameter set mapping with principal component analysis (PCA). An LPV representation of the power system dynamics is generated by linearization of its differential-algebraic equations about the transient operating points for some given specific faults containing the system nonlinear properties. The time response of the output signal in the transient state plays the role of the scheduling signal that is used to construct the LPV model. A set of sample points of the dynamic response is formed to generate an initial LPV model. PCA-based parameter set mapping is used to reduce the number of models and generate a reduced LPV model. This model is used to design a robust pole placement controller to assign the poles of the power system in a linear matrix inequality (LMI) region, such that the response of the power system has a proper damping ratio for all of the different oscillation modes. The proposed scheme is applied to controller synthesis of a power system stabilizer, and its performance is compared with a tuned standard conventional PSS using nonlinear simulation of a multi-machine power network. The results under various conditions show the robust performance of the proposed controller.
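
    The core of PCA-based parameter set mapping is projecting many sampled operating points onto their dominant principal direction(s). A minimal sketch, using power iteration for the top component on synthetic two-dimensional samples (the data and dimensions are illustrative, not the power-system scheduling signals of the paper):

```python
import random

def principal_component(samples, iters=200):
    """Top principal direction of a set of parameter samples via power
    iteration on the (implicit) covariance matrix. Samples lying near a
    line collapse onto one reduced coordinate along that line."""
    dim = len(samples[0])
    mean = [sum(s[i] for s in samples) / len(samples) for i in range(dim)]
    centred = [[s[i] - mean[i] for i in range(dim)] for s in samples]
    v = [1.0] * dim
    for _ in range(iters):
        # covariance-times-vector product: C v = sum_c c (c . v)
        cv = [sum(c[i] * sum(c[j] * v[j] for j in range(dim)) for c in centred)
              for i in range(dim)]
        norm = sum(x * x for x in cv) ** 0.5
        v = [x / norm for x in cv]
    return mean, v

def project(sample, mean, v):
    """Map a full parameter sample to its 1-D reduced coordinate."""
    return sum((sample[i] - mean[i]) * v[i] for i in range(len(v)))

# Synthetic scheduling samples lying near the line y = 2x, plus noise.
rng = random.Random(0)
pts = [(t + rng.gauss(0, 0.01), 2 * t + rng.gauss(0, 0.01))
       for t in [i / 20 for i in range(21)]]
mean, v = principal_component(pts)
```

In the paper's setting, the reduced coordinates (rather than the full sample set) parameterize the polytopic LPV model, shrinking the number of vertex models the controller must cover.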

  9. Pattern selection in extended periodically forced systems: a continuum coupled map approach.

    PubMed

    Venkataramani, S C; Ott, E

    2001-04-01

    We propose that a useful approach to the modeling of periodically forced extended systems is through continuum coupled map (CCM) models. CCM models are discrete time, continuous space models, mapping a continuous spatially varying field xi(n)(x) from time n to time n+1. The efficacy of CCM models is illustrated by an application to experiments of Umbanhowar, Melo, and Swinney [Nature 382, 793 (1996)] on vertically vibrated granular layers. Using a simple CCM model incorporating temporal period doubling and spatial patterning at a preferred length scale, we obtain results that bear remarkable similarities to the experimental observations. The fact that the model does not make use of physics specific to granular layers suggests that similar phenomena may be observed in other (nongranular) periodically forced, strongly dissipative systems. We also present a framework for the analysis of pattern selection in CCM models using a truncated modal expansion. Through the analysis, we predict scaling laws of various quantities, and these laws may be verifiable experimentally.
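
    A continuum coupled map alternates a local temporal map with a spatial coupling operator each discrete time step. The sketch below uses a logistic map for the period-doubling local dynamics and diffusive nearest-neighbour coupling for the spatial operator; the paper's model instead couples at a preferred length scale, so both choices here are illustrative stand-ins:

```python
def ccm_step(field, r=3.5, eps=0.3):
    """One discrete-time step of a 1-D, periodic continuum coupled map:
    apply the local map pointwise, then mix each site with its
    neighbours. For r > 3 the logistic map exhibits period doubling."""
    n = len(field)
    mapped = [r * x * (1.0 - x) for x in field]
    return [(1.0 - eps) * mapped[i]
            + eps * 0.5 * (mapped[(i - 1) % n] + mapped[(i + 1) % n])
            for i in range(n)]

# Iterate a weakly perturbed uniform field for 200 steps.
field = [0.3 + 0.01 * ((-1) ** i) for i in range(64)]
for _ in range(200):
    field = ccm_step(field)
```

Replacing the diffusive kernel with a band-pass filter around a preferred wavenumber is what lets CCM models select patterns at a specific length scale, as in the granular-layer application.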

  10. An Approach for Predicting Essential Genes Using Multiple Homology Mapping and Machine Learning Algorithms.

    PubMed

    Hua, Hong-Li; Zhang, Fa-Zhan; Labena, Abraham Alemayehu; Dong, Chuan; Jin, Yan-Ting; Guo, Feng-Biao

    Investigation of essential genes is important for understanding the minimal gene set of a cell and for discovering potential drug targets. In this study, a novel approach based on multiple homology mapping and machine learning was introduced to predict essential genes. We focused on 25 bacteria with characterized essential genes. The predictions yielded the highest area under the receiver operating characteristic (ROC) curve (AUC) of 0.9716 in a tenfold cross-validation test. Proper features were utilized to construct models for making predictions in distantly related bacteria. The accuracy of the predictions was evaluated via their consistency with the known essential genes of the target species. The highest AUC of 0.9552 and an average AUC of 0.8314 were achieved when making predictions across organisms. A recently released independent dataset from Synechococcus elongatus was obtained for further assessment of the performance of our model; the AUC of these predictions is 0.7855, which is higher than that of other methods. This research shows that features obtained by homology mapping alone can achieve results as good as, or better than, integrated features. The work also indicates that machine learning-based methods can assign more effective weight coefficients than an empirical formula based on biological knowledge.
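
    The AUC figures quoted above follow from the standard rank-based definition, which is easy to compute directly. A minimal sketch (the scores and labels are invented, not the study's gene predictions):

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    AUC = P(score of a random positive > score of a random negative),
    with ties counted as 1/2. Labels are 1 (essential) or 0 (not)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Five hypothetical genes scored by an essentiality predictor.
auc = roc_auc([0.9, 0.8, 0.4, 0.2, 0.1], [1, 1, 0, 1, 0])
```

An AUC of 0.97 therefore means a randomly chosen essential gene outranks a randomly chosen non-essential gene 97% of the time.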

  11. Electrofacies classification using the self-organizing map approach with an example from the Algerian Sahara

    NASA Astrophysics Data System (ADS)

    Sokhal, Abdallah; Benaissa, Zahia; Ouadfeul, Sid Ali; Boudella, Amar

    2017-04-01

    The characterization of electrofacies is essential for reservoir modeling. However, this is a process that depends on many variables, with errors and associated noise that interfere with visual interpretation. In this paper, we propose an approach to characterize the reservoir properties of the Quartzite of Hamra formation. This method integrates geological and petrophysical data and compares them with the field performance analysis to achieve a practical electrofacies clustering. The petrophysical data used are from Hassi Guettar field in Hassi Messaoud basin. The neutron porosity, gamma ray and density profiles were studied to classify the field's reservoir lithology. An unsupervised neural network was employed based on the self-organizing map (SOM) technique to identify and extract electrofacies groups. Based on the results of the SOM method, the target reservoir is classified into five electrofacies clusters (EF1-EF5), among which EF1, EF2 and EF3 show the best reservoir quality. EF4 indicates moderate reservoir quality, while EF5 shows no reservoir quality. Key words: Neural networks - Self-organizing map - Electrofacies - Porosity - Gamma ray - Density.
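
    A self-organizing map for this kind of clustering can be sketched in a few lines. Each node holds a prototype log response; every sample pulls its best-matching unit and, more weakly, that unit's map neighbours. The synthetic three-feature data below stand in for normalized gamma ray, neutron porosity, and density logs; they are not the Hassi Guettar measurements:

```python
import math
import random

def train_som(data, n_nodes=4, epochs=100, lr0=0.5, seed=0):
    """Tiny 1-D self-organizing map. Learning rate and neighbourhood
    radius both decay over epochs so nodes first order themselves along
    the data, then specialize into clusters (electrofacies)."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.random() for _ in range(dim)] for _ in range(n_nodes)]
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)
        radius = max(0.5, (n_nodes / 2.0) * (1.0 - epoch / epochs))
        for x in data:
            bmu = min(range(n_nodes),
                      key=lambda i: sum((nodes[i][d] - x[d]) ** 2
                                        for d in range(dim)))
            for i in range(n_nodes):
                h = math.exp(-((i - bmu) ** 2) / (2.0 * radius ** 2))
                for d in range(dim):
                    nodes[i][d] += lr * h * (x[d] - nodes[i][d])
    return nodes

def classify(x, nodes):
    """Assign a sample to its nearest prototype (its electrofacies)."""
    return min(range(len(nodes)),
               key=lambda i: sum((nodes[i][d] - x[d]) ** 2
                                 for d in range(len(x))))

# Two synthetic facies: uniformly low vs uniformly high log responses.
rng = random.Random(1)
data = ([[0.1 + rng.gauss(0, 0.03) for _ in range(3)] for _ in range(20)]
        + [[0.9 + rng.gauss(0, 0.03) for _ in range(3)] for _ in range(20)])
nodes = train_som(data)
```

After training, samples from the two facies map to different nodes, which is the clustering the study reads off as EF1-EF5.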

  12. A network-based phenotype mapping approach to identify genes that modulate drug response phenotypes

    PubMed Central

    Cairns, Junmei; Ung, Choong Yong; da Rocha, Edroaldo Lummertz; Zhang, Cheng; Correia, Cristina; Weinshilboum, Richard; Wang, Liewei; Li, Hu

    2016-01-01

    To better address the problem of drug resistance during cancer chemotherapy and explore the possibility of manipulating drug response phenotypes, we developed a network-based phenotype mapping approach (P-Map) to identify gene candidates that, when perturbed, can alter sensitivity to drugs. We used basal transcriptomics data from a panel of human lymphoblastoid cell lines (LCLs) to infer drug response networks (DRNs) responsible for conferring response phenotypes for anthracycline and taxane, two anticancer agents in common clinical use. We further tested selected gene candidates that interact with phenotypic differentially expressed genes (PDEGs), which are up-regulated genes in LCLs for a given class of drug response phenotype, in triple-negative breast cancer (TNBC) cells. Our results indicate that it is possible to manipulate a drug response phenotype, from resistant to sensitive or vice versa, by perturbing gene candidates in DRNs, and they suggest plausible mechanisms regulating the directionality of drug response sensitivity. More importantly, the current work highlights a new way to formulate systems-based therapeutic design: supplementing therapeutics that target disease culprits with phenotypic modulators capable of altering DRN properties, with the goal of re-sensitizing resistant phenotypes. PMID:27841317

  13. An efficient unsupervised index based approach for mapping urban vegetation from IKONOS imagery

    NASA Astrophysics Data System (ADS)

    Anchang, Julius Y.; Ananga, Erick O.; Pu, Ruiliang

    2016-08-01

    Despite the increased availability of high resolution satellite image data, their operational use for mapping urban land cover in Sub-Saharan Africa continues to be limited by lack of computational resources and technical expertise. As such, there is need for simple and efficient image classification techniques. Using Bamenda in North West Cameroon as a test case, we investigated two completely unsupervised pixel based approaches to extract tree/shrub (TS) and ground vegetation (GV) cover from an IKONOS derived soil adjusted vegetation index. These included: (1) a simple Jenks Natural Breaks classification and (2) a two-step technique that combined the Jenks algorithm with agglomerative hierarchical clustering. Both techniques were compared with each other and with a non-linear support vector machine (SVM) for classification performance. While overall classification accuracy was generally high for all techniques (>90%), One-Way Analysis of Variance tests revealed the two step technique to outperform the simple Jenks classification in terms of predicting the GV class. It also outperformed the SVM in predicting the TS class. We conclude that the unsupervised methods are technically as good and practically superior for efficient urban vegetation mapping in budget and technically constrained regions such as Sub-Saharan Africa.
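
    The simple Jenks classification used above reduces, for two classes, to finding the single break that minimizes within-class squared deviation. A minimal sketch on invented vegetation-index values (not the Bamenda IKONOS data):

```python
def jenks_two_class_break(values):
    """Best single break for two classes by minimizing the total
    within-class sum of squared deviations: the two-class case of the
    Jenks natural breaks algorithm, applied here to threshold a soil
    adjusted vegetation index into vegetation vs non-vegetation."""
    def ssd(group):
        m = sum(group) / len(group)
        return sum((v - m) ** 2 for v in group)

    xs = sorted(values)
    best_cost, best_break = float("inf"), None
    for i in range(1, len(xs)):
        cost = ssd(xs[:i]) + ssd(xs[i:])
        if cost < best_cost:
            best_cost, best_break = cost, (xs[i - 1] + xs[i]) / 2
    return best_break

# SAVI-like values: bare ground near 0.1, vegetation near 0.6.
vals = [0.08, 0.10, 0.12, 0.11, 0.55, 0.60, 0.58, 0.62]
break_pt = jenks_two_class_break(vals)
```

The study's two-step variant then refines classes like these with agglomerative hierarchical clustering to separate tree/shrub from ground vegetation.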

  14. Policy, Research and Residents' Perspectives on Built Environments Implicated in Heart Disease: A Concept Mapping Approach.

    PubMed

    Stankov, Ivana; Howard, Natasha J; Daniel, Mark; Cargo, Margaret

    2017-02-09

    An underrepresentation of stakeholder perspectives within urban health research arguably limits our understanding of what is a multi-dimensional and complex relationship between the built environment and health. By engaging a wide range of stakeholders using a participatory concept mapping approach, this study aimed to achieve a more holistic and nuanced understanding of the built environments shaping disease risk, specifically cardiometabolic risk (CMR). Moreover, this study aimed to ascertain the importance and changeability of identified environments through government action. Through the concept mapping process, community members, researchers, government and non-government stakeholders collectively identified eleven clusters encompassing 102 built environmental domains related to CMR, a number of which are underrepresented within the literature. Among the identified built environments, open space, public transportation and pedestrian environments were highlighted as key targets for policy intervention. Whilst there was substantive convergence in stakeholder groups' perspectives concerning the built environment and CMR, there were disparities in the level of importance government stakeholders and community members respectively assigned to pedestrian environments and street connectivity. These findings support the role of participatory methods in strengthening how urban health issues are understood and in affording novel insights into points of action for public health and policy intervention.

  15. An automated approach for tone mapping operator parameter adjustment in security applications

    NASA Astrophysics Data System (ADS)

    Krasula, Lukáš; Narwaria, Manish; Le Callet, Patrick

    2014-05-01

    High Dynamic Range (HDR) imaging has been gaining popularity in recent years. Unlike traditional low dynamic range (LDR) content, HDR content tends to be visually more appealing and realistic, as it can represent the dynamic range of the visual stimuli present in the real world. As a result, more scene details can be faithfully reproduced and visual quality tends to improve. HDR can also be directly exploited for new applications such as video surveillance and other security tasks. Since more scene details are available in HDR, it can help in identifying and tracking visual information which might otherwise be difficult to recover from typical LDR content due to factors such as lack or excess of illumination, extreme contrast in the scene, etc. On the other hand, HDR may raise issues related to increased privacy intrusion. To display HDR content on a regular screen, tone-mapping operators (TMOs) are used. In this paper, we present a universal method for tuning TMO parameters in order to retain as much detail as possible, which is desirable in security applications. The method's performance is verified on several TMOs by comparing the outcomes of tone-mapping with default and with optimized parameters. The results suggest that the proposed approach preserves more information, which could be advantageous for security surveillance but, on the other hand, raises the question of a possible increase in privacy intrusion.
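
    To make the notion of a tunable TMO parameter concrete, here is the global Reinhard operator, a well-known TMO whose key value `a` controls overall brightness. It serves only as an illustration of the kind of parameter the paper's method would tune; the paper evaluates several existing TMOs, not necessarily this one:

```python
import math

def reinhard_tmo(luminance, a=0.18):
    """Global Reinhard tone mapping: scale each luminance by the key
    value `a` relative to the log-average (geometric mean) luminance,
    then compress with L / (1 + L) into the displayable range [0, 1)."""
    n = len(luminance)
    log_avg = math.exp(sum(math.log(max(l, 1e-9)) for l in luminance) / n)
    scaled = [a * l / log_avg for l in luminance]
    return [s / (1.0 + s) for s in scaled]

# A scene spanning five orders of magnitude of luminance.
hdr = [0.01, 0.1, 1.0, 10.0, 100.0]
ldr = reinhard_tmo(hdr)
```

Raising `a` brightens the mapped image and lifts shadow detail; an automated tuner like the one proposed would search such parameters to maximize retained detail.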

  16. An Approach for Predicting Essential Genes Using Multiple Homology Mapping and Machine Learning Algorithms

    PubMed Central

    Hua, Hong-Li; Zhang, Fa-Zhan; Labena, Abraham Alemayehu; Dong, Chuan; Jin, Yan-Ting

    2016-01-01

    Investigation of essential genes is important for understanding the minimal gene set of a cell and for discovering potential drug targets. In this study, a novel approach based on multiple homology mapping and machine learning was introduced to predict essential genes. We focused on 25 bacteria with characterized essential genes. The predictions yielded the highest area under the receiver operating characteristic (ROC) curve (AUC) of 0.9716 in a tenfold cross-validation test. Proper features were utilized to construct models for making predictions in distantly related bacteria. The accuracy of the predictions was evaluated via their consistency with the known essential genes of the target species. The highest AUC of 0.9552 and an average AUC of 0.8314 were achieved when making predictions across organisms. A recently released independent dataset from Synechococcus elongatus was obtained for further assessment of the performance of our model; the AUC of these predictions is 0.7855, which is higher than that of other methods. This research shows that features obtained by homology mapping alone can achieve results as good as, or better than, integrated features. The work also indicates that machine learning-based methods can assign more effective weight coefficients than an empirical formula based on biological knowledge. PMID:27660763

  17. Geographical information system approaches for hazard mapping of dilute lahars on Montserrat, West Indies

    NASA Astrophysics Data System (ADS)

    Darnell, A. R.; Barclay, J.; Herd, R. A.; Phillips, J. C.; Lovett, A. A.; Cole, P.

    2012-08-01

    Many research tools for lahar hazard assessment have proved wholly unsuitable for practical application to an active volcanic system where field measurements are challenging to obtain. Two simple routing models, with minimal data demands and implemented in a geographical information system (GIS), were applied to dilute lahars originating from Soufrière Hills Volcano, Montserrat. Single-direction flow routing by path of steepest descent, commonly used for simulating normal stream-flow, was tested against LAHARZ, an established lahar model calibrated for debris flows, for its ability to replicate the main flow routes. Comparing how these models capture observed changes, and how the different modelled paths deviate, can also indicate where dilute lahars do not follow the behaviour expected from single-phase flow models. Data were collected over two field seasons and provide (1) an overview of gross morphological change after one rainy season, (2) details of dominant channels at the time of measurement, and (3) order-of-magnitude estimates of individual flow volumes. Modelling results suggested that both GIS-based predictive tools have associated benefits. Dominant flow routes observed in the field were generally well predicted using the hydrological approach with a consideration of elevation error, while LAHARZ was comparatively more successful at mapping lahar dispersion and better suited to long-term hazard assessment. This research suggests that end-member models can have utility for first-order dilute lahar hazard mapping.
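
    Single-direction steepest-descent routing (often called D8) is simple enough to sketch directly: from each cell, step to the lowest of the eight neighbours until reaching a pit or the grid edge. The tiny tilted DEM below is illustrative, not Montserrat topography:

```python
def steepest_descent_path(dem, start):
    """Route a flow down a DEM grid by repeatedly stepping to the lowest
    of the eight neighbours (D8 single-direction routing), stopping at a
    pit (no lower neighbour) or the grid edge."""
    rows, cols = len(dem), len(dem[0])
    path = [start]
    r, c = start
    while True:
        neighbours = [(r + dr, c + dc)
                      for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                      if not (dr == 0 and dc == 0)
                      and 0 <= r + dr < rows and 0 <= c + dc < cols]
        nr, nc = min(neighbours, key=lambda p: dem[p[0]][p[1]])
        if dem[nr][nc] >= dem[r][c]:   # pit: nowhere lower to go
            break
        r, c = nr, nc
        path.append((r, c))
        if r in (0, rows - 1) or c in (0, cols - 1):
            break
    return path

# Tilted 4x4 surface: flow should run from the high corner to the low one.
dem = [[10 - r - c for c in range(4)] for r in range(4)]
path = steepest_descent_path(dem, (0, 0))
```

Because the route follows a single cell-wide path, this approach predicts channels well but cannot represent the lateral dispersion that LAHARZ captures, which is the trade-off the study reports.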

  18. An Automatic Approach for Satisfying Dose-Volume Constraints in Linear Fluence Map Optimization for IMPT

    PubMed Central

    Zaghian, Maryam; Lim, Gino; Liu, Wei; Mohan, Radhe

    2014-01-01

    Prescriptions for radiation therapy are given in terms of dose-volume constraints (DVCs). Solving the fluence map optimization (FMO) problem while satisfying DVCs often requires a tedious trial-and-error process for selecting appropriate dose control parameters on various organs. In this paper, we propose an iterative approach to satisfy DVCs using a multi-objective linear programming (LP) model for solving beamlet intensities. This algorithm, starting from arbitrary initial parameter values, gradually updates the values through an iterative solution process toward an optimal solution. This method finds appropriate parameter values through the trade-off between OAR sparing and target coverage to improve the solution. We compared the plan quality and the satisfaction of the DVCs by the proposed algorithm with two nonlinear approaches: a nonlinear FMO model solved by using the L-BFGS algorithm and another approach solved by a commercial treatment planning system (Eclipse 8.9). We retrospectively selected from our institutional database five patients with lung cancer and one patient with prostate cancer for this study. Numerical results show that our approach successfully improved target coverage to meet the DVCs, while trying to keep corresponding OAR DVCs satisfied. The L-BFGS algorithm for solving the nonlinear FMO model successfully satisfied the DVCs in three out of five test cases. However, there is no recourse in the nonlinear FMO model for correcting unsatisfied DVCs other than manually changing some parameter values through trial and error to derive a solution that more closely meets the DVC requirements. The LP-based heuristic algorithm outperformed the current treatment planning system in terms of DVC satisfaction. A major strength of the LP-based heuristic approach is that it is not sensitive to the starting condition. PMID:25506501

  19. An Automatic Approach for Satisfying Dose-Volume Constraints in Linear Fluence Map Optimization for IMPT.

    PubMed

    Zaghian, Maryam; Lim, Gino; Liu, Wei; Mohan, Radhe

    2014-02-01

    Prescriptions for radiation therapy are given in terms of dose-volume constraints (DVCs). Solving the fluence map optimization (FMO) problem while satisfying DVCs often requires a tedious trial-and-error process for selecting appropriate dose control parameters on various organs. In this paper, we propose an iterative approach to satisfy DVCs using a multi-objective linear programming (LP) model for solving beamlet intensities. This algorithm, starting from arbitrary initial parameter values, gradually updates the values through an iterative solution process toward an optimal solution. This method finds appropriate parameter values through the trade-off between OAR sparing and target coverage to improve the solution. We compared the plan quality and the satisfaction of the DVCs by the proposed algorithm with two nonlinear approaches: a nonlinear FMO model solved by using the L-BFGS algorithm and another approach solved by a commercial treatment planning system (Eclipse 8.9). We retrospectively selected from our institutional database five patients with lung cancer and one patient with prostate cancer for this study. Numerical results show that our approach successfully improved target coverage to meet the DVCs, while trying to keep corresponding OAR DVCs satisfied. The L-BFGS algorithm for solving the nonlinear FMO model successfully satisfied the DVCs in three out of five test cases. However, there is no recourse in the nonlinear FMO model for correcting unsatisfied DVCs other than manually changing some parameter values through trial and error to derive a solution that more closely meets the DVC requirements. The LP-based heuristic algorithm outperformed the current treatment planning system in terms of DVC satisfaction. A major strength of the LP-based heuristic approach is that it is not sensitive to the starting condition.
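
    The quantity driving such an iterative scheme is how badly each dose-volume constraint is violated. A minimal sketch of that check (the dose values and limits are invented; the paper's parameter-update and LP solve are not reproduced here):

```python
def dvc_violation(dose, limit_dose, max_fraction):
    """Amount by which a dose-volume constraint is violated: at most
    `max_fraction` of the structure's voxels may receive more than
    `limit_dose`. Returns 0.0 when the DVC is met. An iterative scheme
    like the paper's would raise the penalty parameter of any structure
    whose violation stays positive, then re-solve the LP."""
    over = sum(1 for d in dose if d > limit_dose) / len(dose)
    return max(0.0, over - max_fraction)

# Hypothetical OAR DVC: "no more than 30% of voxels may exceed 20 Gy".
doses = [5, 12, 18, 21, 25, 30, 8, 15, 19, 22]
violation = dvc_violation(doses, 20.0, 0.30)
```

Here 4 of 10 voxels exceed 20 Gy, so the constraint is violated by 10 percentage points and the corresponding dose control parameter would be tightened on the next iteration.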

  20. Mapping the progress and impacts of public health approaches to palliative care: a scoping review protocol

    PubMed Central

    Archibald, Daryll; Patterson, Rebecca; Haraldsdottir, Erna; Hazelwood, Mark; Fife, Shirley; Murray, Scott A

    2016-01-01

    Introduction Public health palliative care is a term that can be used to encompass a variety of approaches that involve working with communities to improve people's experience of death, dying and bereavement. Recently, public health palliative care approaches have gained recognition and momentum within UK health policy and palliative care services. There is general consensus that public health palliative care approaches can complement and go beyond the scope of formal service models of palliative care. However, there is no clarity about how these approaches can be undertaken in practice or how evidence can be gathered relating to their effectiveness. Here we outline a scoping review protocol that will systematically map and categorise the variety of activities and programmes that could be classified under the umbrella term ‘public health palliative care’ and highlight the impact of these activities where measured. Methods and analysis This review will be guided by Arksey and O'Malley's scoping review methodology and incorporate insights from more recent innovations in scoping review methodology. Sensitive searches of 9 electronic databases from 1999 to 2016 will be supplemented by grey literature searches. Eligible studies will be screened independently by two reviewers using a data charting tool developed for this scoping review. Ethics and dissemination This scoping review will undertake a secondary analysis of data already collected and does not require ethical approval. The results will facilitate better understanding of the practical application of public health approaches to palliative care, the impacts these activities can have and how to build the evidence base for this work in future. The results will be disseminated through traditional academic routes such as conferences and journals and also policy and third sector seminars. PMID:27417201

  1. Duality of boson and fermion: New intermediate-statistics

    NASA Astrophysics Data System (ADS)

    Chung, Won Sang; Algin, Abdullah

    2017-10-01

    In this work, we propose a new model for describing a system of intermediate-statistics particles. Starting with a deformed grand partition function, we investigate several thermodynamical and statistical properties of a gas model of two-parameter deformed particles. We specifically focus on the low-temperature behavior of the model and discuss the conditions under which either boson condensation or fermion condensation would occur. Our results reveal that the present deformed gas model exhibits a duality of boson and fermion, and can be useful for approaching the thermostatistics of condensation characteristics in quantum systems.

  2. Two-dimensional thermofield bosonization II: Massive fermions

    SciTech Connect

    Amaral, R.L.P.G.

    2008-11-15

    We consider the perturbative computation of the N-point function of chiral densities of massive free fermions at finite temperature within the thermofield dynamics approach. The infinite series in the mass parameter for the N-point functions are computed in the fermionic formulation and compared with the corresponding perturbative series in the interaction parameter in the bosonized thermofield formulation. Thereby we establish in thermofield dynamics the formal equivalence of the massive free fermion theory with the sine-Gordon thermofield model for a particular value of the sine-Gordon parameter. We extend the thermofield bosonization to include the massive Thirring model.

  3. Mind-mapping for lung cancer: Towards a personalized therapeutics approach

    PubMed Central

    Mollberg, N; Surati, M; Demchuk, C; Fathi, R; Salama, AK; Husain, AN; Hensing, T; Salgia, R

    2011-01-01

    There will be over 220,000 people diagnosed with lung cancer and over 160,000 dying of lung cancer this year alone in the United States. In order to arrive at better control, prevention, diagnosis, and therapeutics for lung cancer, we must be able to personalize the approach towards lung cancer. Mind-mapping has existed for centuries for physicians to properly think about various “flows” of personalized medicine. We include here the epidemiology, diagnosis, histology, and treatment of lung cancer—specifically, non-small cell lung cancer. As we have new molecular signatures for lung cancer, this is further detailed. This review is not meant to be a comprehensive review, but rather its purpose is to highlight important aspects of lung cancer diagnosis, management, and personalized treatment options. PMID:21337123

  4. Multipoint linkage analysis using sib pairs: An interval mapping approach for dichotomous outcomes

    SciTech Connect

    Olson, J.M.

    1995-03-01

    I propose an interval mapping approach suitable for a dichotomous outcome, with emphasis on samples of affected sib pairs. The method computes a lod score for each of a set of locations in the interval between two flanking markers and takes as its estimate of trait-locus location the maximum lod score in the interval, provided it exceeds the prespecified critical value. Use of the method depends on prior knowledge of the genetic model for the disease only through available estimates of recurrence risk to relatives of affected individuals. The method gives an unbiased estimate of location, provided the recurrence risks are correctly specified and provided the marker identity-by-descent probabilities are jointly, rather than individually, estimated. I also discuss use of the method for traits determined by two loci and give an approximation that has good power for a wide range of two-locus models. 25 refs., 2 figs., 9 tabs.
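    The scan-and-threshold logic described above (compute a lod score at each candidate location in the interval, then take the maximum if it exceeds the critical value) can be sketched as follows. The lod profile and critical value here are illustrative placeholders, not the paper's likelihood model.

```python
import numpy as np

def scan_interval(lod_at, positions, critical=3.0):
    """Scan candidate trait-locus positions between two flanking
    markers; return the location of the maximum lod score if it
    exceeds the critical value, otherwise None."""
    lods = np.array([lod_at(p) for p in positions])
    i = int(np.argmax(lods))
    if lods[i] > critical:
        return positions[i], float(lods[i])
    return None

# Toy lod profile peaking at 0.3 of the marker interval (illustration only).
positions = np.linspace(0.0, 1.0, 101)
result = scan_interval(lambda p: 5.0 * np.exp(-((p - 0.3) ** 2) / 0.02),
                       positions)
print(result)  # location near 0.3 with lod near 5
```

    If no location clears the critical value, the scan reports no linkage for that interval, mirroring the paper's decision rule.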

  5. Mind-mapping for lung cancer: towards a personalized therapeutics approach.

    PubMed

    Mollberg, N; Surati, M; Demchuk, C; Fathi, R; Salama, A K; Husain, A N; Hensing, T; Salgia, R

    2011-03-01

    There were over 220,000 people diagnosed with lung cancer and over 160,000 people dying of lung cancer during 2010 alone in the United States. In order to arrive at better control, prevention, diagnosis, and therapeutics for lung cancer, we must be able to personalize the approach towards the disease. Mind-mapping has existed for centuries for physicians to properly think about various "flows" of personalized medicine. We include here the epidemiology, diagnosis, histology, and treatment of lung cancer-in particular, non-small cell lung cancer. As we have new molecular signatures for lung cancer, this is further detailed. This review is not meant to be a comprehensive review, but rather its purpose is to highlight important aspects of lung cancer diagnosis, management, and personalized treatment options.

  6. Contemporary approaches to neural circuit manipulation and mapping: focus on reward and addiction

    PubMed Central

    Saunders, Benjamin T.; Richard, Jocelyn M.; Janak, Patricia H.

    2015-01-01

    Tying complex psychological processes to precisely defined neural circuits is a major goal of systems and behavioural neuroscience. This is critical for understanding adaptive behaviour, and also how neural systems are altered in states of psychopathology, such as addiction. Efforts to relate psychological processes relevant to addiction to activity within defined neural circuits have been complicated by neural heterogeneity. Recent advances in technology allow for manipulation and mapping of genetically and anatomically defined neurons, which when used in concert with sophisticated behavioural models, have the potential to provide great insight into neural circuit bases of behaviour. Here we discuss contemporary approaches for understanding reward and addiction, with a focus on midbrain dopamine and cortico-striato-pallidal circuits. PMID:26240425

  7. Segmentation of angiodysplasia lesions in WCE images using a MAP approach with Markov Random Fields.

    PubMed

    Vieira, Pedro M; Goncalves, Bruno; Goncalves, Carla R; Lima, Carlos S

    2016-08-01

    This paper deals with the segmentation of angiodysplasias in wireless capsule endoscopy images. These lesions are the cause of almost 10% of all gastrointestinal bleeding episodes, and their detection using the available software shows low sensitivity. This work proposes automatic selection of a ROI using an image segmentation module based on the MAP approach, where an accelerated version of the EM algorithm is used to iteratively estimate the model parameters. Spatial context is modeled in the prior probability density function using Markov Random Fields. The CIELab color space was used, especially the a* component, which highlights this type of lesion most strongly. The proposed method is the first to address this specific type of lesion and, when compared to other state-of-the-art segmentation methods, it almost doubles their results.
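    The EM parameter-estimation step at the core of such a MAP segmentation can be illustrated with a two-component Gaussian mixture on a single color channel. This sketch omits the Markov Random Field spatial prior and the acceleration described in the paper; the data are synthetic stand-ins for a* channel values.

```python
import numpy as np

def em_gmm_1d(x, iters=50):
    """Two-component 1D Gaussian mixture fitted by EM -- the parameter-
    estimation core of a MAP segmentation (MRF spatial prior omitted)."""
    mu = np.array([x.min(), x.max()], dtype=float)
    var = np.array([x.var(), x.var()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel
        d = (x[:, None] - mu) ** 2
        lik = pi * np.exp(-d / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = lik / lik.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        n = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n + 1e-6
        pi = n / len(x)
    return mu, var, pi

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(10, 1, 500),   # background-like values
                    rng.normal(25, 2, 100)])  # lesion-like values
mu, var, pi = em_gmm_1d(x)
print(np.sort(mu))  # component means near 10 and 25
```

    In the full method, the per-pixel responsibilities would be combined with an MRF prior before the MAP labeling decision rather than thresholded directly.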

  8. Target attractor tracking of relative phase in Bosonic Josephson junction

    NASA Astrophysics Data System (ADS)

    Borisenok, Sergey

    2016-06-01

    The relative phase of a Bosonic Josephson junction in the Josephson regime of the Bose-Hubbard model is tracked via the target attractor ('synergetic') feedback algorithm, with the inter-well coupling parameter serving as a control function. The efficiency of our approach is demonstrated numerically for Gaussian and harmonic types of target phases.

  9. Measure of tripartite entanglement in bosonic and fermionic systems

    SciTech Connect

    Buscemi, Fabrizio

    2011-08-15

    We describe an efficient theoretical criterion suitable for the evaluation of the tripartite entanglement of any mixed three-boson or three-fermion state, based on the notion of the entanglement of particles for bipartite systems of identical particles. Our approach allows one to quantify the accessible number of quantum correlations in the systems without any violation of the local particle number superselection rule. A generalization of the tripartite negativity is here applied to some correlated systems including the continuous-time quantum walks of identical particles (for both bosons and fermions) and compared with other criteria recently proposed in the literature. Our results show the dependence of the entanglement dynamics upon the quantum statistics: The bosonic bunching results in a low number of quantum correlations while Fermi-Dirac statistics allows for higher values of the entanglement.

  10. Nonlocal Symmetry Reductions for Bosonized Supersymmetric Burgers Equation

    NASA Astrophysics Data System (ADS)

    Ren, Bo; Lin, Ji; Le, Jia-Yi; Wang, Sheng; Dai, Tian-Zhao

    2017-08-01

    Based on the bosonization approach, the supersymmetric Burgers (SB) system is transformed to a coupled bosonic system. By solving the bosonized SB (BSB) equation, the difficulties caused by the anticommutative fermionic field of the SB equation can be avoided. The nonlocal symmetry for the BSB equation is obtained by the truncated Painlevé method. By introducing multiple new fields, the finite symmetry transformation for the BSB equation is derived by solving Lie's first principle of the prolonged systems. Some group invariant solutions are obtained with the similarity reductions related to the nonlocal symmetry. Supported by the National Natural Science Foundation of China under Grant Nos. 11675146, 11305106, 11472177, 11275129, and the Natural Science Foundation of Zhejiang Province of China under Grant No. LZ15A050001

  11. An Improved Map-Matching Technique Based on the Fréchet Distance Approach for Pedestrian Navigation Services

    PubMed Central

    Bang, Yoonsik; Kim, Jiyoung; Yu, Kiyun

    2016-01-01

    Wearable and smartphone technology innovations have propelled the growth of Pedestrian Navigation Services (PNS). PNS need a map-matching process to project a user’s locations onto maps. Many map-matching techniques have been developed for vehicle navigation services. These techniques are inappropriate for PNS because pedestrians move, stop, and turn in different ways compared to vehicles. In addition, the base map data for pedestrians are more complicated than for vehicles. This article proposes a new map-matching method for locating Global Positioning System (GPS) trajectories of pedestrians onto road network datasets. The theory underlying this approach is based on the Fréchet distance, one of the measures of geometric similarity between two curves. The Fréchet distance approach can provide reasonable matching results because two linear trajectories are parameterized with the time variable. Then we improved the method to be adaptive to the positional error of the GPS signal. We used an adaptation coefficient to adjust the search range for every input signal, based on the assumption of auto-correlation between consecutive GPS points. To reduce errors in matching, the reliability index was evaluated in real time for each match. To test the proposed map-matching method, we applied it to GPS trajectories of pedestrians and the road network data. We then assessed the performance by comparing the results with reference datasets. Our proposed method performed better with test data when compared to a conventional map-matching technique for vehicles. PMID:27782091
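    The geometric core of this method, the Fréchet distance between a GPS trace and a candidate road polyline, is commonly approximated by its discrete variant, which is computable by dynamic programming over coupled point pairs. A minimal sketch with illustrative coordinates:

```python
import numpy as np

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between two polylines P and Q,
    computed by dynamic programming over coupled point pairs."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    n, m = len(P), len(Q)
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=2)
    ca = np.full((n, m), np.inf)
    ca[0, 0] = d[0, 0]
    for i in range(1, n):                    # first column
        ca[i, 0] = max(ca[i - 1, 0], d[i, 0])
    for j in range(1, m):                    # first row
        ca[0, j] = max(ca[0, j - 1], d[0, j])
    for i in range(1, n):
        for j in range(1, m):
            ca[i, j] = max(min(ca[i - 1, j], ca[i - 1, j - 1], ca[i, j - 1]),
                           d[i, j])
    return ca[-1, -1]

# A noisy GPS trace versus a straight candidate road segment
trace = [(0, 0.1), (1, -0.2), (2, 0.1), (3, 0.0)]
road = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(discrete_frechet(trace, road))  # 0.2
```

    In a map-matching pipeline, the candidate road with the smallest Fréchet distance to the trace (within the adaptive search range) would be selected as the match.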

  12. Effect of delayed feedback on the dynamics of a scalar map via a frequency-domain approach.

    PubMed

    Gentile, Franco S; Bel, Andrea L; Belén D'Amico, M; Moiola, Jorge L

    2011-06-01

    The effect of delayed feedback on the dynamics of a scalar map is studied by using a frequency-domain approach. Explicit conditions for the occurrence of period-doubling and Neimark-Sacker bifurcations in the controlled map are found analytically. The appearance of a 1:2 resonance for certain values of the delay is also formalized, revealing that this phenomenon is independent of the system parameters. A detailed study of the well-known logistic map under delayed feedback is included for illustration.
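    The bifurcation conditions described above can be checked numerically by linearizing the controlled map about its fixed point. This sketch uses the logistic map with an assumed delayed-feedback term k·(x_{n-1} - x_n) and illustrative parameter values; it is not the paper's frequency-domain derivation.

```python
import numpy as np

def multipliers(r, k):
    """Eigenvalues (multipliers) of the logistic map with delayed
    feedback u_n = k*(x_{n-1} - x_n), linearized at the fixed point
    x* = 1 - 1/r, written in companion form over (x_n, x_{n-1})."""
    a = 2.0 - r                      # f'(x*) for the logistic map
    J = np.array([[a - k, k],
                  [1.0, 0.0]])
    return np.linalg.eigvals(J)

# Without control the fixed point of the r = 3.3 logistic map is
# unstable (|f'(x*)| = 1.3 > 1); a suitable feedback gain moves
# both multipliers inside the unit circle.
print(max(abs(multipliers(3.3, 0.0))))   # 1.3 -> unstable
print(max(abs(multipliers(3.3, -0.6))))  # < 1 -> stabilized
```

    A multiplier crossing -1 marks a period-doubling bifurcation, while a complex pair crossing the unit circle marks a Neimark-Sacker bifurcation; in this linearization the period-doubling boundary sits at k = (3 - r)/2.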

  13. A universal airborne LiDAR approach for tropical forest carbon mapping.

    PubMed

    Asner, Gregory P; Mascaro, Joseph; Muller-Landau, Helene C; Vieilledent, Ghislain; Vaudry, Romuald; Rasamoelina, Maminiaina; Hall, Jefferson S; van Breugel, Michiel

    2012-04-01

    Airborne light detection and ranging (LiDAR) is fast turning the corner from demonstration technology to a key tool for assessing carbon stocks in tropical forests. With its ability to penetrate tropical forest canopies and detect three-dimensional forest structure, LiDAR may prove to be a major component of international strategies to measure and account for carbon emissions from and uptake by tropical forests. To date, however, basic ecological information such as height-diameter allometry and stand-level wood density have not been mechanistically incorporated into methods for mapping forest carbon at regional and global scales. A better incorporation of these structural patterns in forests may reduce the considerable time needed to calibrate airborne data with ground-based forest inventory plots, which presently necessitate exhaustive measurements of tree diameters and heights, as well as tree identifications for wood density estimation. Here, we develop a new approach that can facilitate rapid LiDAR calibration with minimal field data. Throughout four tropical regions (Panama, Peru, Madagascar, and Hawaii), we were able to predict aboveground carbon density estimated in field inventory plots using a single universal LiDAR model (r² = 0.80, RMSE = 27.6 Mg C ha⁻¹). This model is comparable in predictive power to locally calibrated models, but relies on limited inputs of basal area and wood density information for a given region, rather than on traditional plot inventories. With this approach, we propose to radically decrease the time required to calibrate airborne LiDAR data and thus increase the output of high-resolution carbon maps, supporting tropical forest conservation and climate mitigation policy.
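    A model of this general power-law form (carbon density as a product of LiDAR canopy height, regional basal area, and regional wood density raised to fitted exponents) can be calibrated by log-linear least squares. The sketch below fits synthetic data with assumed exponents; it is not the paper's fitted universal equation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
tch = rng.uniform(5, 40, n)     # top-of-canopy height from LiDAR (m)
ba = rng.uniform(10, 50, n)     # regional basal area (m^2/ha)
wd = rng.uniform(0.4, 0.8, n)   # regional wood density (g/cm^3)
# Synthetic "true" carbon density following an assumed power-law form
acd = 2.0 * tch**0.8 * ba**0.9 * wd**1.1 * rng.lognormal(0, 0.05, n)

# Log-linear least squares: ln ACD = ln a + b1 ln TCH + b2 ln BA + b3 ln WD
X = np.column_stack([np.ones(n), np.log(tch), np.log(ba), np.log(wd)])
coef, *_ = np.linalg.lstsq(X, np.log(acd), rcond=None)
print(np.round(coef[1:], 2))  # recovered exponents near [0.8, 0.9, 1.1]
```

    Because the regression needs only plot-level basal area and wood density summaries rather than full stem inventories, calibration against new regions requires far less field effort.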

  14. Soil mapping in northern Thailand based on a radiometrically calibrated maximum likelihood approach

    NASA Astrophysics Data System (ADS)

    Schuler, U.; Herrmann, L.; Rangnugpit, W.; Stahr, K.

    2009-04-01

    area with low background radiation showed different gamma-ray spectra for the respective reference soil groups, so that these points can be used as secondary training data. In conclusion, the calibration of the Maximum likelihood approach with airborne radiometric data offers a promising possibility for efficient soil mapping of larger regions in northern Thailand.

  15. An Improved Approach for Mapping Quantitative Trait Loci in a Pseudo-Testcross: Revisiting a Poplar Mapping Study

    SciTech Connect

    Tuskan, Gerald A; Yin, Tongming; Wullschleger, Stan D; Yang, Jie; Huang, Youjun; Li, Yao; Wu, Rongling

    2010-01-01

    A pseudo-testcross pedigree is widely used for mapping quantitative trait loci (QTL) in outcrossing species, but the model for analyzing pseudo-testcross data borrowed from the inbred backcross design can only detect those QTLs that are heterozygous only in one parent. In this study, an intercross model that incorporates the high heterozygosity and phase uncertainty of outcrossing species was used to reanalyze a published data set on QTL mapping in poplar trees. Several intercross QTLs that are heterozygous in both parents were detected, which are responsible not only for biomass traits, but also for their genetic correlations. This study provides a more complete identification of QTLs responsible for economically important biomass traits in poplars.

  16. An Improved Approach for Mapping Quantitative Trait loci in a Pseudo-Testcross: Revisiting a Poplar Mapping Study

    SciTech Connect

    Wullschleger, Stan D; Wu, Song; Wu, Rongling; Yang, Jie; Li, Yao; Yin, Tongming; Tuskan, Gerald A

    2010-01-01

    A pseudo-testcross pedigree is widely used for mapping quantitative trait loci (QTL) in outcrossing species, but the model for analyzing pseudo-testcross data borrowed from the inbred backcross design can only detect those QTLs that are heterozygous only in one parent. In this study, an intercross model that incorporates the high heterozygosity and phase uncertainty of outcrossing species was used to reanalyze a published data set on QTL mapping in poplar trees. Several intercross QTLs that are heterozygous in both parents were detected, which are responsible not only for biomass traits, but also for their genetic correlations. This study provides a more complete identification of QTLs responsible for economically important biomass traits in poplars.

  17. An improved approach for mapping quantitative trait Loci in a pseudo-testcross: revisiting a poplar mapping study.

    PubMed

    Wu, Song; Yang, Jie; Huang, Youjun; Li, Yao; Yin, Tongming; Wullschleger, Stan D; Tuskan, Gerald A; Wu, Rongling

    2010-02-04

    A pseudo-testcross pedigree is widely used for mapping quantitative trait loci (QTL) in outcrossing species, but the model for analyzing pseudo-testcross data borrowed from the inbred backcross design can only detect those QTLs that are heterozygous only in one parent. In this study, an intercross model that incorporates the high heterozygosity and phase uncertainty of outcrossing species was used to reanalyze a published data set on QTL mapping in poplar trees. Several intercross QTLs that are heterozygous in both parents were detected, which are responsible not only for biomass traits, but also for their genetic correlations. This study provides a more complete identification of QTLs responsible for economically important biomass traits in poplars.

  18. A coherent geostatistical approach for combining choropleth map and field data in the spatial interpolation of soil properties.

    PubMed

    Goovaerts, P

    2011-06-01

    Information available for mapping continuous soil attributes often includes point field data and choropleth maps (e.g. soil or geological maps) that model the spatial distribution of soil attributes as the juxtaposition of polygons (areas) with constant values. This paper presents two approaches to incorporate both point and areal data in the spatial interpolation of continuous soil attributes. In the first instance, area-to-point kriging is used to map the variability within soil units while ensuring the coherence of the prediction so that the average of disaggregated estimates is equal to the original areal datum. The resulting estimates are then used as local means in residual kriging. The second approach proceeds in one step and capitalizes on: 1) a general formulation of kriging that allows the combination of both point and areal data through the use of area-to-area, area-to-point, and point-to-point covariances in the kriging system, 2) the availability of GIS to discretize polygons of irregular shape and size, and 3) knowledge of the point-support variogram model that can be inferred directly from point measurements, thereby eliminating the need for deconvolution procedures. The two approaches are illustrated using the geological map and heavy metal concentrations recorded in the topsoil of the Swiss Jura. Sensitivity analysis indicates that the new procedures improve prediction over ordinary kriging and traditional residual kriging based on the assumption that the local mean is constant within each mapping unit.
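    The point-data building block of both approaches, kriging with a known covariance model, reduces to solving a small linear system with a unit-sum constraint on the weights. This minimal ordinary-kriging sketch uses point data only and an assumed exponential covariance; the paper's area-to-area and area-to-point covariances extend the same linear system.

```python
import numpy as np

def ordinary_kriging(xs, zs, x0, sill=1.0, rng_=10.0):
    """Ordinary kriging prediction at x0 from 1D point data, using an
    exponential covariance model; a Lagrange multiplier enforces that
    the weights sum to one."""
    cov = lambda h: sill * np.exp(-np.abs(h) / rng_)
    n = len(xs)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(xs[:, None] - xs[None, :])  # data-to-data covariances
    A[n, n] = 0.0                               # Lagrange-multiplier corner
    b = np.ones(n + 1)
    b[:n] = cov(xs - x0)                        # data-to-target covariances
    w = np.linalg.solve(A, b)[:n]
    return float(w @ zs)

xs = np.array([0.0, 4.0, 10.0])
zs = np.array([1.2, 2.5, 0.7])   # point measurements of a soil attribute
print(ordinary_kriging(xs, zs, 5.0))   # interpolated value between data
print(ordinary_kriging(xs, zs, 4.0))   # exact at a datum location
```

    Kriging is an exact interpolator: predicting at a sampled location returns the measured value, which is the coherence property the area-to-point variant preserves at the areal scale.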

  19. A coherent geostatistical approach for combining choropleth map and field data in the spatial interpolation of soil properties

    PubMed Central

    Goovaerts, P.

    2011-01-01

    Summary Information available for mapping continuous soil attributes often includes point field data and choropleth maps (e.g. soil or geological maps) that model the spatial distribution of soil attributes as the juxtaposition of polygons (areas) with constant values. This paper presents two approaches to incorporate both point and areal data in the spatial interpolation of continuous soil attributes. In the first instance, area-to-point kriging is used to map the variability within soil units while ensuring the coherence of the prediction so that the average of disaggregated estimates is equal to the original areal datum. The resulting estimates are then used as local means in residual kriging. The second approach proceeds in one step and capitalizes on: 1) a general formulation of kriging that allows the combination of both point and areal data through the use of area-to-area, area-to-point, and point-to-point covariances in the kriging system, 2) the availability of GIS to discretize polygons of irregular shape and size, and 3) knowledge of the point-support variogram model that can be inferred directly from point measurements, thereby eliminating the need for deconvolution procedures. The two approaches are illustrated using the geological map and heavy metal concentrations recorded in the topsoil of the Swiss Jura. Sensitivity analysis indicates that the new procedures improve prediction over ordinary kriging and traditional residual kriging based on the assumption that the local mean is constant within each mapping unit. PMID:22308075

  20. Labelling plants the Chernobyl way: A new approach for mapping rhizodeposition and biopore reuse

    NASA Astrophysics Data System (ADS)

    Banfield, Callum; Kuzyakov, Yakov

    2016-04-01

    A novel approach for mapping root distribution and rhizodeposition using 137Cs and 14C was applied. By immersing cut leaves into vials containing 137CsCl solution, the 137Cs label is taken up and partly released into the rhizosphere, where it strongly binds to soil particles, thus labelling the distribution of root channels in the long term. Reuse of root channels in crop rotations can be determined by labelling the first crop with 137Cs and the following crop with 14C. Imaging of the β- radiation with strongly differing energies differentiates active roots growing in existing root channels (14C + 137Cs activity) from roots growing in bulk soil (14C activity only). The feasibility of the approach was shown in a pot experiment with ten plants of two species, Cichorium intybus L., and Medicago sativa L. The same plants were each labelled with 100 kBq of 137CsCl and after one week with 500 kBq of 14CO2. 96 h later pots were cut horizontally at 6 cm depth. After the first 137Cs + 14C imaging of the cut surface, imaging was repeated with three layers of plastic film between the cut surface and the plate for complete shielding of 14C β- radiation to the background level, producing an image of the 137Cs distribution. Subtracting the second image from the first gave the 14C image. Both species allocated 18 - 22% of the 137Cs and about 30 - 40% of 14C activity below ground. Intensities far above the detection limit suggest that this approach is applicable to map the root system by 137Cs and to obtain root size distributions through image processing. The rhizosphere boundary was defined by the point at which rhizodeposited 14C activity declined to 5% of the activity of the root centre. Medicago showed 25% smaller rhizosphere extension than Cichorium, demonstrating that plant-specific rhizodeposition patterns can be distinguished. 
Our new approach is appropriate to visualise processes and hotspots on multiple scales: Heterogeneous rhizodeposition, as well as size and counts
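    The two-exposure subtraction described above (the shielded image records 137Cs only; combined minus shielded recovers the 14C image) amounts to simple array arithmetic plus thresholding. The pixel counts and detection threshold below are illustrative numbers, not measured activities.

```python
import numpy as np

# Toy strips of beta-imaging counts for four pixels (illustrative)
combined = np.array([100.0, 60.0, 47.0, 5.0])  # 14C + 137Cs image
shielded = np.array([40.0, 2.0, 45.0, 3.0])    # 137Cs only (14C shielded)
c14 = combined - shielded                      # recovered 14C image

thr = 10.0  # detection threshold above background (assumed)
reused = (c14 > thr) & (shielded > thr)   # active root in an old channel
new = (c14 > thr) & (shielded <= thr)     # root growing in bulk soil
print(c14, reused, new)
```

    Pixel 1 shows both labels (a root reusing a 137Cs-marked channel), pixel 2 shows 14C only (a root in bulk soil), and pixel 3 shows 137Cs only (an old channel without an active root).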

  1. Auxiliary-Field Monte Carlo Method to Tackle Strong Interactions and Frustration in Lattice Bosons

    NASA Astrophysics Data System (ADS)

    Malpetti, Daniele; Roscilde, Tommaso

    2017-07-01

    We introduce a new numerical technique, the bosonic auxiliary-field Monte Carlo method, which allows us to calculate the thermal properties of large lattice-boson systems within a systematically improvable semiclassical approach, and which is virtually applicable to any bosonic model. Our method amounts to a decomposition of the lattice into clusters, and to an ansatz for the density matrix of the system in the form of a cluster-separable state—with nonentangled, yet classically correlated clusters. This approximation eliminates any sign problem, and can be systematically improved upon by using clusters of growing size. Extrapolation in the cluster size allows us to reproduce numerically exact results for the superfluid transition of hard-core bosons on the square lattice, and to provide a solid quantitative prediction for the superfluid and chiral transition of hardcore bosons on the frustrated triangular lattice.

  2. A Stochastic Simulation Framework for the Prediction of Strategic Noise Mapping and Occupational Noise Exposure Using the Random Walk Approach

    PubMed Central

    Haron, Zaiton; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri

    2015-01-01

    Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels. Also, it assesses the impact of noise on the workers and the surrounding environment. For data validation, three case studies were conducted to check the accuracy of the prediction data and to determine the efficiency and effectiveness of this approach. The results showed high accuracy of prediction results together with a majority of absolute differences of less than 2 dBA; also, the predicted noise doses were mostly in the range of measurement. Therefore, the random walk approach was effective in dealing with environmental noises. It could predict strategic noise mapping to facilitate noise monitoring and noise control in the workplaces. PMID:25875019
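    A random-walk noise-exposure simulation of this kind can be sketched in a few lines: a worker wanders on a grid while a fixed machine emits noise, and the equivalent continuous level Leq is the energy average of the received levels. The spherical-spreading attenuation and all parameters below are illustrative assumptions, not the RW-eNMS model.

```python
import numpy as np

rng = np.random.default_rng(2)

def leq_random_walk(source=(0.0, 0.0), lw=100.0, steps=500):
    """Random-walk sketch of occupational noise exposure: Leq is the
    energy average of levels received along the walk (assumed
    point-source spherical-spreading attenuation)."""
    pos = np.array([10.0, 10.0])
    levels = []
    for _ in range(steps):
        pos += rng.choice([-1.0, 1.0], size=2)      # one random-walk step
        r = max(np.linalg.norm(pos - np.asarray(source)), 1.0)
        levels.append(lw - 20 * np.log10(r) - 11)   # free-field attenuation
    levels = np.array(levels)
    return 10 * np.log10(np.mean(10 ** (levels / 10)))

print(round(leq_random_walk(), 1))  # equivalent continuous level in dB
```

    Averaging many such walks over a grid of receiver cells would yield a stochastic noise map and a distribution of predicted noise doses, which is the quantity the framework validates against measurements.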

  3. Interhemispheric transfalcine approach and awake cortical mapping for resection of peri-atrial gliomas associated with the central lobule.

    PubMed

    Malekpour, Mahdi; Cohen-Gadol, Aaron A

    2015-02-01

    Medial posterior frontal and parietal gliomas extending to the peri-atrial region are difficult to reach surgically because of the working angle required to expose the lateral aspect of the tumor and the proximity of the tumor to the sensorimotor lobule; retraction of the sensorimotor cortex may lead to morbidity. The interhemispheric transfalcine approach is favorable and safe for resection of medial hemispheric tumors adjacent to the falx cerebri, but the literature on this approach is scarce. Awake cortical mapping using this operative route for tumors associated with the sensorimotor cortex has not been previously reported to our knowledge. We present the first case of a right medial posterior frontoparietal oligoastrocytoma that was resected through the interhemispheric transfalcine approach using awake cortical and subcortical mapping. Through a contralateral frontoparietal craniotomy, we excised a section of the falx and exposed the contralateral medial hemisphere. Cortical stimulation allowed localization of the supplementary motor cortex, and suprathreshold stimulation mapping excluded the primary motor cortex corresponding to the leg area. Gross total tumor resection was accomplished without any intraoperative or postoperative deficits. Awake cortical mapping using the contralateral transfalcine approach allows a "cross-court" operative route to map functional cortices and resect peri-atrial low-grade gliomas. This technique can minimize the otherwise necessary retraction on the ipsilateral hemisphere through an ipsilateral craniotomy.

  4. A stochastic simulation framework for the prediction of strategic noise mapping and occupational noise exposure using the random walk approach.

    PubMed

    Han, Lim Ming; Haron, Zaiton; Yahya, Khairulzan; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri

    2015-01-01

    Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels. Also, it assesses the impact of noise on the workers and the surrounding environment. For data validation, three case studies were conducted to check the accuracy of the prediction data and to determine the efficiency and effectiveness of this approach. The results showed high accuracy of prediction results together with a majority of absolute differences of less than 2 dBA; also, the predicted noise doses were mostly in the range of measurement. Therefore, the random walk approach was effective in dealing with environmental noises. It could predict strategic noise mapping to facilitate noise monitoring and noise control in the workplaces.

  5. Proposal for Microwave Boson Sampling

    NASA Astrophysics Data System (ADS)

    Peropadre, Borja; Guerreschi, Gian Giacomo; Huh, Joonsuk; Aspuru-Guzik, Alán

    2016-09-01

    Boson sampling, the task of sampling the probability distribution of photons at the output of a photonic network, is believed to be hard for any classical device. Unlike other models of quantum computation that require thousands of qubits to outperform classical computers, boson sampling requires only a handful of single photons. However, a scalable implementation of boson sampling is missing. Here, we show how superconducting circuits provide such a platform. Our proposal differs radically from traditional quantum-optical implementations: rather than injecting photons in waveguides, making them pass through optical elements like phase shifters and beam splitters, and finally detecting their output mode, we prepare the required multiphoton input state in a superconducting resonator array, control its dynamics via tunable and dispersive interactions, and measure it with nondemolition techniques.
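    The classical hardness of boson sampling stems from the fact that each output probability is the squared modulus of a matrix permanent, itself #P-hard to compute. A minimal sketch using Ryser's exponential-time formula, applied to the two-photon beam-splitter case:

```python
import numpy as np

def permanent(M):
    """Matrix permanent via Ryser's inclusion-exclusion formula
    (exponential time -- feasible only for small matrices)."""
    n = len(M)
    total = 0.0 + 0.0j
    for subset in range(1, 1 << n):
        cols = [j for j in range(n) if subset >> j & 1]
        prod = np.prod([sum(M[i][j] for j in cols) for i in range(n)])
        total += (-1) ** len(cols) * prod
    return (-1) ** n * total

# For single photons entering modes S of a unitary U, the amplitude of
# detecting one photon in each mode of T is Perm(U[T, S]).
U = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # 50:50 beam splitter
amp = permanent(U[np.ix_([0, 1], [0, 1])])
print(abs(amp) ** 2)  # 0 -- Hong-Ou-Mandel: the photons never exit separately
```

    For n photons in m modes the relevant submatrices are n x n, so the classical cost grows exponentially in photon number, which is precisely why a modest superconducting-circuit implementation could already be classically intractable.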

  6. Working Group Report: Higgs Boson

    SciTech Connect

    Dawson, Sally; Gritsan, Andrei; Logan, Heather; Qian, Jianming; Tully, Chris; Van Kooten, Rick

    2013-10-30

    This report summarizes the work of the Energy Frontier Higgs Boson working group of the 2013 Community Summer Study (Snowmass). We identify the key elements of a precision Higgs physics program and document the physics potential of future experimental facilities as elucidated during the Snowmass study. We study Higgs couplings to gauge boson and fermion pairs, double Higgs production for the Higgs self-coupling, its quantum numbers and $CP$-mixing in Higgs couplings, the Higgs mass and total width, and prospects for direct searches for additional Higgs bosons in extensions of the Standard Model. Our report includes projections of measurement capabilities from detailed studies of the Compact Linear Collider (CLIC), a Gamma-Gamma Collider, the International Linear Collider (ILC), the Large Hadron Collider High-Luminosity Upgrade (HL-LHC), Very Large Hadron Colliders up to 100 TeV (VLHC), a Muon Collider, and a Triple-Large Electron Positron Collider (TLEP).

  7. A new approach of mapping soils in the Alps - Challenges of deriving soil information and creating soil maps for sustainable land use. An example from South Tyrol (Italy)

    NASA Astrophysics Data System (ADS)

    Baruck, Jasmin; Gruber, Fabian E.; Geitner, Clemens

    2015-04-01

    Nowadays sustainable land use management is gaining importance because intensive land use leads to increasing soil degradation. Especially in mountainous regions like the Alps, sustainable land use management is important, as topography limits land use. Therefore, a database containing detailed information on soil characteristics is required. However, information on soil properties is far from comprehensive. The project "ReBo - Terrain classification based on airborne laser scanning data to support soil mapping in the Alps", funded by the Autonomous Province of Bolzano, aims at developing a methodical framework for obtaining soil data. The approach combines geomorphometric analysis and soil mapping to generate modern medium-scale soil maps in a time- and cost-efficient way. In this study the open source GRASS GIS extension module r.geomorphon (Jasciewicz and Stepinski, 2013) is used to derive topographically homogeneous landform units out of high-resolution DTMs at a scale of 1:5,000. Furthermore, for terrain segmentation and classification we additionally use medium-scale data sets (geology, parent material, land use etc.). As the Alps are characterized by a great variety of topography, parent material, and a wide range of moisture regimes, getting reliable soil data is difficult. Additionally, geomorphic activity (debris flow, landslide etc.) leads to natural disturbances. Thus, soil properties are highly diverse and largely scale dependent. Furthermore, getting soil information on anthropogenically influenced soils is an added challenge. Due to intensive cultivation techniques the natural link between the soil forming factors is often broken. In South Tyrol we find the largest pome-producing area in Europe. Normally, the annual precipitation is not enough for intensive orcharding. Thus, irrigation strategies are in use. However, as knowledge about the small-scale heterogeneous soil properties is mostly lacking, overwatering and modifications of the

  8. Nonequilibrium functional bosonization of quantum wire networks

    SciTech Connect

    Ngo Dinh, Stephane; Bagrets, Dmitry A.; Mirlin, Alexander D.

    2012-11-15

    We develop a general approach to nonequilibrium nanostructures formed by one-dimensional channels coupled by tunnel junctions and/or by impurity scattering. The formalism is based on a nonequilibrium version of functional bosonization. A central role in this approach is played by the Keldysh action, which has a form reminiscent of the theory of full counting statistics. To proceed with the evaluation of physical observables, we assume the weak-tunneling regime and develop a real-time instanton method. A detailed exposition of the formalism is supplemented by two important applications: (i) tunneling into a biased Luttinger liquid with an impurity, and (ii) quantum Hall Fabry-Perot interferometry. Highlights: (1) A nonequilibrium functional bosonization framework for quantum wire networks is developed. (2) For the study of observables in the weak-tunneling regime, a real-time instanton method is elaborated. (3) We consider tunneling into a biased Luttinger liquid with an impurity. (4) We analyze electronic Fabry-Perot interferometers in the integer quantum Hall regime.

  9. Exotic Gauge Bosons in the 331 Model

    SciTech Connect

    Romero, D.; Ravinez, O.; Diaz, H.; Reyes, J.

    2009-04-30

    We analyze the bosonic sector of the 331 model, which contains exotic leptons, quarks and bosons (E, J, U, V) in order to satisfy the weak gauge SU(3){sub L} invariance. We develop the Feynman rules of the entire kinetic bosonic sector, which will allow us to compute some of the Z(0)' decay modes.

  10. Using a conceptual approach with a concept map of psychosis as an exemplar to promote critical thinking.

    PubMed

    Vacek, Jenny E

    2009-01-01

    Teaching students to think critically is an important component of nursing education. A literature review suggests that a conceptual approach and a concept map may help facilitate critical thinking in the nursing student. Currently, there are patients in various health care settings who manifest psychosis and need treatment for the disorder. This article proposes using both a concept map and a conceptual approach to teach the concept of psychosis instead of focusing on content. If students understand the general concept of psychosis, they can identify and implement nursing actions for patients with psychosis regardless of the etiology or health care setting.

  11. A fast and cost-effective approach to develop and map EST-SSR markers: oak as a case study

    PubMed Central

    2010-01-01

    Background Expressed Sequence Tags (ESTs) are a source of simple sequence repeats (SSRs) that can be used to develop molecular markers for genetic studies. The availability of ESTs for Quercus robur and Quercus petraea provided a unique opportunity to develop microsatellite markers to accelerate research aimed at studying adaptation of these long-lived species to their environment. As a first step toward the construction of a SSR-based linkage map of oak for quantitative trait locus (QTL) mapping, we describe the mining and survey of EST-SSRs as well as a fast and cost-effective approach (bin mapping) to assign these markers to an approximate map position. We also compared the level of polymorphism between genomic and EST-derived SSRs and address the transferability of EST-SSRs in Castanea sativa (chestnut). Results A catalogue of 103,000 Sanger ESTs was assembled into 28,024 unigenes, of which 18.6% presented one or more SSR motifs. More than 42% of these SSRs corresponded to trinucleotides. Primer pairs were designed for 748 putative unigenes. Overall 37.7% (283) were found to amplify a single polymorphic locus in a reference full-sib pedigree of Quercus robur. The usefulness of these loci for establishing a genetic map was assessed using a bin mapping approach. Bin maps were constructed for the male and female parental tree for which framework linkage maps based on AFLP markers were available. The bin set consisted of 14 highly informative offspring selected based on the number and position of crossover sites. The female and male maps comprised 44 and 37 bins, with an average bin length of 16.5 cM and 20.99 cM, respectively. A total of 256 EST-SSRs were assigned to bins and their map position was further validated by linkage mapping. EST-SSRs were found to be less polymorphic than genomic SSRs, but their transferability rate to chestnut, a phylogenetically related species to oak, was higher. Conclusion We have generated a bin map for oak comprising 256 EST
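    The bin-mapping logic described above can be sketched simply: genotype each marker only on the small set of highly informative offspring, and markers sharing an identical genotype signature across that set fall into the same map bin. The marker names, the six-offspring panel, and the allele codes below are invented for illustration.

```python
# Sketch of bin mapping: group markers by their genotype signature across a
# small "bin set" of informative offspring (names and codes are hypothetical).
from collections import defaultdict

# Hypothetical genotypes of EST-SSR markers across 6 bin-set offspring
# ('a'/'b' = which parental allele each offspring inherited).
genotypes = {
    "ssr01": "aabbab",
    "ssr02": "aabbab",   # same signature as ssr01 -> same bin
    "ssr03": "ababab",
    "ssr04": "bbaaba",
    "ssr05": "ababab",
}

bins = defaultdict(list)
for marker, signature in genotypes.items():
    bins[signature].append(marker)  # identical signatures share a bin
```

Because the bin-set offspring are chosen for their crossover positions, each distinct signature corresponds to a short chromosomal interval, which is what makes the approach fast and cost-effective compared with genotyping the full pedigree.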

  12. Multiconfigurational Time-Dependent Hartree Methods for Bosonic Systems: Theory and Applications

    NASA Astrophysics Data System (ADS)

    Alon, Ofir E.; Streltsov, Alexej I.; Sakmann, Kaspar; Cederbaum, Lorenz S.

    2013-02-01

    We review the multiconfigurational time-dependent Hartree method for bosons, which is a formally exact many-body theory for the propagation of the time-dependent Schrödinger equation of N interacting identical bosons. In this approach, the time-dependent many-boson wavefunction is written as a sum of all permanents assembled from M orthogonal orbitals, where both the expansion coefficients and the permanents (orbitals) themselves are time-dependent and determined according to the Dirac-Frenkel time-dependent variational principle. In this way, a much larger effective subspace of the many-boson Hilbert space can be spanned in practice, in contrast to multiconfigurational expansions with time-independent configurations. We also briefly discuss the extension of this method to bosonic mixtures and resonantly coupled bosonic atoms and molecules. Two applications in one dimension are presented: (i) the numerically exact solution of the time-dependent many-boson Schrödinger equation for the population dynamics in a repulsive bosonic Josephson junction is shown to deviate significantly from the predictions of the commonly used Gross-Pitaevskii equation and Bose-Hubbard model; and (ii) the many-body dynamics of a soliton train in an attractive Bose-Einstein condensate is shown to deviate substantially from the widely accepted predictions of the Gross-Pitaevskii mean-field theory.
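    The ansatz described in the abstract can be written compactly in its standard form:

```latex
\left|\Psi(t)\right\rangle \;=\; \sum_{\vec n} C_{\vec n}(t)\,\left|\vec n;t\right\rangle,
\qquad \vec n = (n_1,\dots,n_M), \quad \sum_{k=1}^{M} n_k = N ,
```

    where |n;t> denotes the permanent with n_k bosons occupying the time-dependent orbital phi_k(t), and both the coefficients C_n(t) and the orbitals are determined from the Dirac-Frenkel variational principle, delta<Psi|(i d/dt - H)|Psi> = 0. Allowing the orbitals themselves to evolve is what lets a modest M span the large effective subspace mentioned above.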

  13. Seeing the whole picture: A comprehensive imaging approach to functional mapping of circuits in behaving zebrafish.

    PubMed

    Feierstein, C E; Portugues, R; Orger, M B

    2015-06-18

    In recent years, the zebrafish has emerged as an appealing model system to tackle questions relating to the neural circuit basis of behavior. This can be attributed not just to the growing use of genetically tractable model organisms, but also in large part to the rapid advances in optical techniques for neuroscience, which are ideally suited for application to the small, transparent brain of the larval fish. Many characteristic features of vertebrate brains, from gross anatomy down to particular circuit motifs and cell-types, as well as conserved behaviors, can be found in zebrafish even just a few days post fertilization, and, at this early stage, the physical size of the brain makes it possible to analyze neural activity in a comprehensive fashion. In a recent study, we used a systematic and unbiased imaging method to record the pattern of activity dynamics throughout the whole brain of larval zebrafish during a simple visual behavior, the optokinetic response (OKR). This approach revealed the broadly distributed network of neurons that were active during the behavior and provided insights into the fine-scale functional architecture in the brain, inter-individual variability, and the spatial distribution of behaviorally relevant signals. Combined with mapping anatomical and functional connectivity, targeted electrophysiological recordings, and genetic labeling of specific populations, this comprehensive approach in zebrafish provides an unparalleled opportunity to study complete circuits in a behaving vertebrate animal.

  14. Wide-field optical mapping of neural activity and brain haemodynamics: considerations and novel approaches

    PubMed Central

    Ma, Ying; Shaik, Mohammed A.; Kozberg, Mariel G.; Thibodeaux, David N.; Zhao, Hanzhi T.; Yu, Hang

    2016-01-01

    Although modern techniques such as two-photon microscopy can now provide cellular-level three-dimensional imaging of the intact living brain, the speed and fields of view of these techniques remain limited. Conversely, two-dimensional wide-field optical mapping (WFOM), a simpler technique that uses a camera to observe large areas of the exposed cortex under visible light, can detect changes in both neural activity and haemodynamics at very high speeds. Although WFOM may not provide single-neuron or capillary-level resolution, it is an attractive and accessible approach to imaging large areas of the brain in awake, behaving mammals at speeds fast enough to observe widespread neural firing events, as well as their dynamic coupling to haemodynamics. Although such wide-field optical imaging techniques have a long history, the advent of genetically encoded fluorophores that can report neural activity with high sensitivity, as well as modern technologies such as light emitting diodes and sensitive and high-speed digital cameras have driven renewed interest in WFOM. To facilitate the wider adoption and standardization of WFOM approaches for neuroscience and neurovascular coupling research, we provide here an overview of the basic principles of WFOM, considerations for implementation of wide-field fluorescence imaging of neural activity, spectroscopic analysis and interpretation of results. This article is part of the themed issue ‘Interpreting BOLD: a dialogue between cognitive and cellular neuroscience’. PMID:27574312

  15. Bosonic cascades of indirect excitons

    NASA Astrophysics Data System (ADS)

    Nalitov, A. V.; De Liberato, S.; Lagoudakis, P.; Savvidis, P. G.; Kavokin, A. V.

    2017-08-01

    Recently, the concept of the terahertz bosonic cascade laser (BCL) based on a parabolic quantum well (PQW) embedded in a microcavity was proposed. We refine this proposal by suggesting transitions between indirect exciton (IX) states as a source of terahertz emission. We explicitly propose a structure containing a narrow square QW and a wide parabolic QW for the realisation of a bosonic cascade. The advantages of this type of structure are the large dipole matrix elements for terahertz transitions and the long exciton radiative lifetimes, which are crucial for the threshold and quantum efficiency of BCLs.

  16. Andreev Reflection in Bosonic Condensates

    SciTech Connect

    Zapata, I.; Sols, F.

    2009-05-08

    We study the bosonic analog of Andreev reflection at a normal-superfluid interface where the superfluid is a boson condensate. We model the normal region as a zone where nonlinear effects can be neglected. Against the background of a decaying condensate, we identify a novel contribution to the current of reflected atoms. The group velocity of this Andreev reflected component differs from that of the normally reflected one. For a three-dimensional planar or two-dimensional linear interface Andreev reflection is neither specular nor conjugate.

  18. Tagging b jets associated with heavy neutral MSSM Higgs bosons

    NASA Astrophysics Data System (ADS)

    Heikkinen, A.; Lehti, S.

    2006-04-01

    Since a neural network (NN) approach has been shown to be applicable to the problem of Higgs boson detection at the LHC [I. Iashvili, A. Kharchilava, CMS TN-1996/100; M. Mjahed, Nucl. Phys. B 140 (2005) 799], we study the use of NNs for tagging b jets in pp →bb¯HSUSY, HSUSY→ττ in the Compact Muon Solenoid experiment [F. Hakl, et al., Nucl. Instr. and Meth. A 502 (2003) 489; S. Lehti, CMS NOTE-2001/019; G. Segneri, F. Palla, CMS NOTE-2002/046]. B tagging is an important tool for separating the Higgs events with associated b jets from the Drell-Yan background Z,γ*→ττ, for which the associated jets are mostly light quark and gluon jets. We train the multi-layer perceptrons (MLPs) available in the object-oriented implementation of the data analysis framework ROOT [ROOT—An Object Oriented Data Analysis Framework, in: Proceedings of the AIHENP'96 Workshop, Lausanne, September 1996, Nucl. Instr. and Meth. A 389 (1997) 81]. The following learning methods are evaluated: the steepest descent algorithm, the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm, and variants of conjugate gradients. The ROOT code generation feature for standalone C++ classifiers is utilized. We compare the b tagging performance of the MLPs with another ROOT-based feed-forward NN tool, NeuNet [J.P. Ernenwein, NeuNet software for ROOT], which uses a common back-propagation learning method. In addition, we demonstrate the use of the self-organizing map program package (SOM_PAK) and the learning vector quantization program package (LVQ_PAK) [T. Kohonen, et al., SOM_PAK: the self-organizing map program package, Technical Report A31; T. Kohonen, et al., LVQ_PAK: the learning vector quantization program package, Technical Report A30, Laboratory of Computer and Information Science, Helsinki University of Technology, FIN-02150 Espoo, Finland, 1996] in the b tagging problem.
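    As a minimal, self-contained analogue of the MLP training discussed above (not the ROOT TMultiLayerPerceptron implementation), the sketch below trains a tiny 2-3-1 perceptron by online steepest descent on an invented two-feature "jet" dataset; all data, initial weights, and hyperparameters are hypothetical.

```python
# Toy MLP trained by steepest descent (illustrative stand-in for the ROOT tools).
import math


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


# Hypothetical training set: two jet-shape features, label 1 = "b jet".
X = [(0.0, 0.0), (0.2, 0.1), (1.0, 1.0), (0.9, 0.8)]
y = [0, 0, 1, 1]

# Fixed small weights (2 inputs -> 3 hidden -> 1 output) keep the run deterministic.
W1 = [[0.5, -0.3], [0.2, 0.4], [-0.4, 0.1]]
b1 = [0.0, 0.0, 0.0]
W2 = [0.3, -0.2, 0.5]
b2 = 0.0

lr = 0.5
for _ in range(2000):                       # steepest-descent epochs
    for (x1, x2), t in zip(X, y):
        h = [sigmoid(w[0] * x1 + w[1] * x2 + b) for w, b in zip(W1, b1)]
        o = sigmoid(sum(wi * hi for wi, hi in zip(W2, h)) + b2)
        do = (o - t) * o * (1 - o)          # squared-error output gradient
        for j in range(3):
            dh = do * W2[j] * h[j] * (1 - h[j])   # backpropagated hidden gradient
            W2[j] -= lr * do * h[j]
            W1[j][0] -= lr * dh * x1
            W1[j][1] -= lr * dh * x2
            b1[j] -= lr * dh
        b2 -= lr * do


def predict(x1, x2):
    h = [sigmoid(w[0] * x1 + w[1] * x2 + b) for w, b in zip(W1, b1)]
    return int(sigmoid(sum(wi * hi for wi, hi in zip(W2, h)) + b2) > 0.5)
```

The same gradient rule, scaled up and with BFGS or conjugate-gradient updates in place of plain steepest descent, is what the evaluated ROOT learning methods implement.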

  19. Reliable Radiation Hybrid Maps: An Efficient Scalable Clustering-based Approach

    USDA-ARS?s Scientific Manuscript database

    The process of mapping markers from radiation hybrid mapping (RHM) experiments is equivalent to the traveling salesman problem and therefore has combinatorial complexity. As an additional problem, experiments typically result in some unreliable markers that reduce the overall quality of the map. We ...

  20. The influence of mapped hazards on risk beliefs: a proximity-based modeling approach.

    PubMed

    Severtson, Dolores J; Burt, James E

    2012-02-01

    Interview findings suggest perceived proximity to mapped hazards influences risk beliefs when people view environmental hazard maps. For dot maps, four attributes of mapped hazards influenced beliefs: hazard value, proximity, prevalence, and dot patterns. In order to quantify the collective influence of these attributes for viewers' perceived or actual map locations, we present a model to estimate proximity-based hazard or risk (PBH) and share study results that indicate how modeled PBH and map attributes influenced risk beliefs. The randomized survey study among 447 university students assessed risk beliefs for 24 dot maps that systematically varied by the four attributes. Maps depicted water test results for a fictitious hazardous substance in private residential wells and included a designated "you live here" location. Of the nine variables that assessed risk beliefs, the numerical susceptibility variable was most consistently and strongly related to map attributes and PBH. Hazard value, location in or out of a clustered dot pattern, and distance had the largest effects on susceptibility. Sometimes, hazard value interacted with other attributes, for example, distance had stronger effects on susceptibility for larger than smaller hazard values. For all combined maps, PBH explained about the same amount of variance in susceptibility as did attributes. Modeled PBH may have utility for studying the influence of proximity to mapped hazards on risk beliefs, protective behavior, and other dependent variables. Further work is needed to examine these influences for more realistic maps and representative study samples.
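    The exact weighting used in the PBH model is not specified in this abstract; a hypothetical inverse-distance formalization, for illustration only, might look like the following, where each mapped dot contributes its hazard value attenuated by distance to the viewer's location.

```python
# Hypothetical proximity-based hazard (PBH) score: hazard values of mapped dots
# weighted by inverse distance to the viewer (weighting scheme is assumed, not
# taken from Severtson & Burt).
import math


def pbh(viewer, dots):
    """viewer: (x, y) map location; dots: list of ((x, y), hazard_value)."""
    total = 0.0
    for (x, y), value in dots:
        d = math.hypot(x - viewer[0], y - viewer[1])
        total += value / (1.0 + d)   # closer and higher-valued dots weigh more
    return total
```

Under such a formalization, the four attributes from the study map naturally onto the score: hazard value enters directly, proximity through the distance term, and prevalence and clustering through the number and spatial arrangement of dots summed over.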

  1. Sequencing the Pig Genome Using a Mapped BAC by BAC Approach

    USDA-ARS?s Scientific Manuscript database

    We have generated a highly contiguous physical map covering >98% of the pig genome in just 176 contigs. The map is localised to the genome through integration with the UIUC RH map as well as BAC end sequence alignments to the human genome. Over 265k HindIII restriction digest fingerprints totalling 1

  2. Use of mapping and spatial and space-time modeling approaches in operational control of Aedes aegypti and dengue.

    PubMed

    Eisen, Lars; Lozano-Fuentes, Saul

    2009-01-01

    The aims of this review paper are to 1) provide an overview of how mapping and spatial and space-time modeling approaches have been used to date to visualize and analyze mosquito vector and epidemiologic data for dengue; and 2) discuss the potential for these approaches to be included as routine activities in operational vector and dengue control programs. Geographical information system (GIS) software are becoming more user-friendly and now are complemented by free mapping software that provide access to satellite imagery and basic feature-making tools and have the capacity to generate static maps as well as dynamic time-series maps. Our challenge is now to move beyond the research arena by transferring mapping and GIS technologies and spatial statistical analysis techniques in user-friendly packages to operational vector and dengue control programs. This will enable control programs to, for example, generate risk maps for exposure to dengue virus, develop Priority Area Classifications for vector control, and explore socioeconomic associations with dengue risk.

  4. Towards a virtual hub approach for landscape assessment and multimedia ecomuseum using multitemporal-maps

    NASA Astrophysics Data System (ADS)

    Brumana, R.; Santana Quintero, M.; Barazzetti, L.; Previtali, M.; Banfi, F.; Oreni, D.; Roels, D.; Roncoroni, F.

    2015-08-01

    Landscapes are dynamic entities, stretching and transforming across space and time, and need to be safeguarded as living places for the future, with interaction of human, social and economic dimensions. A comprehensive landscape evaluation requires several open data sets, each characterized by its own protocol and service interface, which limits or even impedes interoperability and integration. Indeed, nowadays the development of websites targeted at landscape assessment and touristic purposes requires many resources in terms of time, cost and IT skills, so such applications are limited to a few cases, mainly focusing on world-famous touristic sites. The future capability to spread the development of web-based multimedia virtual museums based on geospatial data relies on the possibility of discovering the needed geo-spatial data through a single point of access in a homogeneous way. The innovative approach proposed in this paper may facilitate access to open data in a homogeneous way by means of specific components (the brokers) performing the interoperability actions required to interconnect heterogeneous data sources. In the specific case study analysed here, an interface has been implemented to migrate a geo-swat chart based on local and regional geographic information into a user-friendly Google Earth©-based infrastructure, integrating ancient cadastres and modern cartography, accessible by professionals and tourists via the web and also via portable devices like tablets and smartphones. The general aim of this work on the case study of the Lake of Como (Tremezzina municipality) is to boost the integration of assessment methodologies with digital geo-based technologies of map correlation for the multimedia ecomuseum system accessible via the web. The developed WebGIS system integrates multi-scale and multi-temporal maps with different information (cultural, historical, landscape levels

  5. Mapping Natural Terroir Units using a multivariate approach and legacy data

    NASA Astrophysics Data System (ADS)

    Priori, Simone; Barbetti, Roberto; L'Abate, Giovanni; Bucelli, Piero; Storchi, Paolo; Costantini, Edoardo A. C.

    2014-05-01

    was then subdivided into 9 NTUs, statistically differentiated with respect to the variables used. The study demonstrated the strength of a multivariate approach for NTU mapping at the province scale (1:125,000) using viticultural legacy data. Identification and mapping of terroir diversity within the DOC and DOCG at the province scale suggest the adoption of viticultural subzones. The subzones, based on the NTUs, could lead to different wine-production systems that enhance the peculiarities of the terroir.

  6. An innovative multimodality approach for sentinel node mapping and biopsy in head and neck malignancies.

    PubMed

    Borbón-Arce, M; Brouwer, O R; van den Berg, N S; Mathéron, H; Klop, W M C; Balm, A J M; van Leeuwen, F W B; Valdés-Olmos, R A

    2014-01-01

    Recent innovations such as preoperative SPECT/CT, intraoperative imaging using portable devices, and a hybrid tracer were evaluated in a multimodality approach for sentinel node (SN) mapping and biopsy in head and neck malignancies. The evaluation included 25 consecutive patients with head and neck malignancies (16 melanomas and 9 oral cavity squamous cell carcinomas). Patients were peritumorally injected with the hybrid tracer ICG-(99m)Tc-nanocolloid. SNs were initially identified with lymphoscintigraphy followed by single photon emission computed tomography (SPECT/CT) 2 hours after tracer administration. During surgery, a portable gamma camera combined with a near-infrared fluorescence camera was used, in addition to a handheld gamma ray detection probe, to locate the SNs. In all patients, conventional lymphoscintigraphy, SPECT/CT and, in one case, the additional help of the portable gamma camera depicted a total of 67 SNs (55 visualized on planar images, 11 additional on SPECT/CT and 1 additional with the portable gamma camera). A total of 67 of the preoperatively defined SNs, together with 22 additional SNs, were removed intraoperatively; 12 of the 22 additional SNs found during the operation were located in the vicinity of the injection site in anatomical areas such as the periauricular or submental regions. The other 10 additional SNs were found by radioguided post-resection control of the SN excision site. In the present series, 26% additional SNs were found using the multimodal approach, which incorporates SPECT/CT and intraoperative imaging into the conventional procedure. This approach appears to be useful in malignancies located close to the area of lymphatic drainage, such as the periauricular area and the oral cavity. Copyright © 2013 Elsevier España, S.L.U. and SEMNIM. All rights reserved.

  7. Mapping behavioral landscapes for animal movement: a finite mixture modeling approach

    USGS Publications Warehouse

    Tracey, Jeff A.; Zhu, Jun; Boydston, Erin E.; Lyren, Lisa M.; Fisher, Robert N.; Crooks, Kevin R.

    2013-01-01

    Because of its role in many ecological processes, movement of animals in response to landscape features is an important subject in ecology and conservation biology. In this paper, we develop models of animal movement in relation to objects or fields in a landscape. We take a finite mixture modeling approach in which the component densities are conceptually related to different choices for movement in response to a landscape feature, and the mixing proportions are related to the probability of selecting each response as a function of one or more covariates. We combine particle swarm optimization and an Expectation-Maximization (EM) algorithm to obtain maximum likelihood estimates of the model parameters. We use this approach to analyze data for movement of three bobcats in relation to urban areas in southern California, USA. A behavioral interpretation of the models revealed similarities and differences in bobcat movement response to urbanization. All three bobcats avoided urbanization by moving either parallel to urban boundaries or toward less urban areas as the proportion of urban land cover in the surrounding area increased. However, one bobcat, a male with a dispersal-like large-scale movement pattern, avoided urbanization at lower densities and responded strictly by moving parallel to the urban edge. The other two bobcats, which were both residents and occupied similar geographic areas, avoided urban areas using a combination of movements parallel to the urban edge and movement toward areas of less urbanization. However, the resident female appeared to exhibit greater repulsion at lower levels of urbanization than the resident male, consistent with empirical observations of bobcats in southern California. Using the parameterized finite mixture models, we mapped behavioral states to geographic space, creating a representation of a behavioral landscape. 
This approach can provide guidance for conservation planning based on analysis of animal movement data using
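    The EM machinery underlying such finite mixture models can be illustrated on a stripped-down example: fitting a two-component 1-D Gaussian mixture. This sketch omits the movement-specific component densities, the covariate-dependent mixing proportions, and the particle swarm step described in the abstract; the data are invented.

```python
# Illustrative EM fit of a two-component 1-D Gaussian mixture (a stand-in for
# the finite mixture estimation in the paper, not its actual model).
import math

# Hypothetical observations clustered near two "behavioral modes" at ~0 and ~5.
data = [-0.3, -0.1, 0.0, 0.2, 0.4, 4.6, 4.8, 5.0, 5.1, 5.4]


def gauss(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)


mu = [1.0, 4.0]      # initial component means
var = [1.0, 1.0]     # initial variances
pi = [0.5, 0.5]      # mixing proportions

for _ in range(50):
    # E-step: responsibility of each component for each observation.
    resp = []
    for x in data:
        p = [pi[k] * gauss(x, mu[k], var[k]) for k in range(2)]
        s = sum(p)
        resp.append([pk / s for pk in p])
    # M-step: re-estimate parameters from the responsibilities.
    for k in range(2):
        nk = sum(r[k] for r in resp)
        mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
        var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
        var[k] = max(var[k], 1e-6)   # guard against variance collapse
        pi[k] = nk / len(data)
```

In the paper, each mixture component corresponds to a movement choice (e.g. parallel to versus away from an urban edge), and the fitted responsibilities are what get mapped back to geographic space as a behavioral landscape.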

  9. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    PubMed Central

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluated its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map. PMID:26742857
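    The travelling-salesman formulation can be illustrated with a toy deletion panel: order the markers so that consecutive markers have similar presence/absence patterns across the deletion mutants. A greedy nearest-neighbour heuristic stands in here for the exact solver used in the paper, and the marker names and retention patterns are invented.

```python
# Sketch of deletion mapping as a travelling-salesman-style ordering problem.

# Hypothetical retention patterns: rows = markers, columns = deletion mutants
# (1 = marker retained in that mutant, 0 = deleted). Nested terminal deletions
# make physically adjacent markers differ by a single bit.
patterns = {
    "m1": (1, 1, 1, 1),
    "m2": (1, 1, 1, 0),
    "m3": (1, 1, 0, 0),
    "m4": (1, 0, 0, 0),
    "m5": (0, 0, 0, 0),
}


def hamming(a, b):
    """Number of mutants in which two markers disagree."""
    return sum(x != y for x, y in zip(a, b))


def order_markers(start):
    """Greedy nearest-neighbour tour through the marker patterns."""
    tour, left = [start], set(patterns) - {start}
    while left:
        # next marker = the unplaced one with the most similar deletion pattern
        nxt = min(left, key=lambda m: (hamming(patterns[tour[-1]], patterns[m]), m))
        tour.append(nxt)
        left.remove(nxt)
    return tour
```

On real data the tour length is minimised globally (the TSP proper) rather than greedily, and, unlike radiation hybrid mapping, the patterns may be pooled from mutants generated in different experiments.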

  11. Number of spin I states for bosons

    SciTech Connect

    Zhang, L. H.; Zhao, Y. M.; Jia, L. Y.; Arima, A.

    2008-01-15

    We study the number of spin I states for bosons in this article. We extend Talmi's recursion formulas for the number of states with given spin I to boson systems, and we prove empirical formulas for five bosons by using these recursions. We obtain the number of states with given spin I and F spin for three and four bosons by using sum rules of six-j and nine-j symbols. We also present empirical formulas for the number of states of d bosons with given spin I and F=F{sub max}-1 and F{sub max}-2.
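    The quantities such formulas compute can be cross-checked by brute force with the elementary M-scheme identity N(I) = c(M=I) - c(M=I+1), where c(M) counts symmetric configurations with total projection M. The sketch below does this for small systems; it enumerates configurations directly and is not Talmi's recursion.

```python
# Brute-force M-scheme count of spin-I states for N identical bosons of spin l.
from collections import Counter
from itertools import combinations_with_replacement


def spin_state_counts(n_bosons, l):
    """Return {I: number of states} using N(I) = c(M=I) - c(M=I+1)."""
    m_values = range(-l, l + 1)
    c = Counter()
    # symmetric configurations = multisets of single-particle m values
    for combo in combinations_with_replacement(m_values, n_bosons):
        c[sum(combo)] += 1
    max_I = n_bosons * l
    return {I: c[I] - c[I + 1] for I in range(max_I + 1) if c[I] - c[I + 1] > 0}
```

For example, two d bosons (l = 2) yield one state each of I = 0, 2, 4, and three d bosons yield I = 0, 2, 3, 4, 6, in agreement with the textbook results that the recursion formulas reproduce in closed form.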

  12. Using a constructivist approach with online concept maps: relationship between theory and nursing education.

    PubMed

    Conceição, Simone C O; Taylor, Linda D

    2007-01-01

    Concept maps have been used in nursing education as a method for students to organize and analyze data. This article describes an online course that used concept maps and self-reflective journals to assess students' thinking processes. The self-reflective journals of 21 students collected over two semesters were qualitatively examined. Three major themes emerged from students' use of concept maps: 1) factors influencing the map creation, 2) developmental learning process over time, and 3) validation of existing knowledge and construction of new knowledge. The use of concept maps with reflective journaling provided a learning experience that allowed students to integrate content consistent with a constructivist paradigm. This integration is a developmental process influenced by the personal preferences of students, concept map design, and content complexity. This developmental process provides early evidence that the application of concept mapping in the online environment, along with reflective journaling, allows students to make new connections, integrate previous knowledge, and validate existing knowledge.

  13. Mapping irrigation potential from renewable groundwater in Africa - a quantitative hydrological approach

    NASA Astrophysics Data System (ADS)

    Altchenko, Y.; Villholth, K. G.

    2015-02-01

    Groundwater provides an important buffer to climate variability in Africa. Yet, groundwater irrigation contributes only a relatively small share of cultivated land, approximately 1% (about 2 × 10^6 hectares) as compared to 14% in Asia. While groundwater is over-exploited for irrigation in many parts of Asia, previous assessments indicate an underutilized potential in parts of Africa. As opposed to previous country-based estimates, this paper derives a continent-wide, distributed (0.5° spatial resolution) map of groundwater irrigation potential, indicated in terms of fractions of cropland potentially irrigable with renewable groundwater. The method builds on an annual groundwater balance approach using 41 years of hydrological data, allocating only that fraction of groundwater recharge that is in excess after satisfying other present human needs and environmental requirements, while disregarding socio-economic and physical constraints in access to the resource. Due to high uncertainty of groundwater environmental needs, three scenarios, leaving 30, 50 and 70% of recharge for the environment, were implemented. Current dominating crops and cropping rotations and associated irrigation requirements in a zonal approach were applied in order to convert recharge excess to potential irrigated cropland. Results show an inhomogeneously distributed groundwater irrigation potential across the continent, even within individual countries, mainly reflecting recharge patterns and presence or absence of cultivated cropland. Results further show that average annual renewable groundwater availability for irrigation ranges from 692 to 1644 km^3 depending on scenario. The total area of cropland irrigable with renewable groundwater ranges from 44.6 to 105.3 × 10^6 ha, corresponding to 20.5 to 48.6% of the cropland over the continent. In particular, significant potential exists in the semi-arid Sahel and eastern African regions which could support poverty alleviation if developed
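
    The per-cell balance logic reads as a few lines of arithmetic. This is an illustrative sketch of the approach described above; all numbers and parameter names are hypothetical, not taken from the study.

```python
def irrigable_fraction(recharge_mm, env_share, other_uses_mm,
                       crop_requirement_mm, cropland_fraction):
    """Fraction of a grid cell's area irrigable with the recharge left
    after reserving a share for the environment and subtracting other
    present human uses, capped at the cropland actually in the cell."""
    surplus = max(recharge_mm * (1.0 - env_share) - other_uses_mm, 0.0)
    supported = surplus / crop_requirement_mm   # area fraction supported
    return min(supported, cropland_fraction)

# three environmental scenarios, as in the study (30/50/70 % reserved)
scenarios = {s: irrigable_fraction(120.0, s, 10.0, 300.0, 0.4)
             for s in (0.3, 0.5, 0.7)}
```

    As expected, the irrigable fraction shrinks monotonically as more recharge is reserved for the environment.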

  14. Non-linear dynamics of operant behavior: a new approach via the extended return map.

    PubMed

    Li, Jay-Shake; Huston, Joseph P

    2002-01-01

    Previous efforts to apply non-linear dynamic tools to the analysis of operant behavior revealed some promise for this kind of approach, but also some doubts, since the complexity of animal behavior seemed to be beyond the analyzing ability of the available tools. We here outline a series of studies based on a novel approach. We modified the so-called 'return map' and developed a new method, the 'extended return map' (ERM) to extract information from the highly irregular time series data, the inter-response time (IRT) generated by Skinner-box experiments. We applied the ERM to operant lever pressing data from rats using the four fundamental reinforcement schedules: fixed interval (FI), fixed ratio (FR), variable interval (VI) and variable ratio (VR). Our results revealed interesting patterns in all experiment groups. In particular, the FI and VI groups exhibited well-organized clusters of data points. We calculated the fractal dimension out of these patterns and compared experimental data with surrogate data sets that were generated by randomly shuffling the sequential order of original IRTs. This comparison supported the finding that patterns in ERM reflect the dynamics of the operant behaviors under study. We then built two models to simulate the functional mechanisms of the FI schedule. Both models can produce similar distributions of IRTs and the stereotypical 'scalloped' curve characteristic of FI responding. However, they differ in one important feature in their formulation: while one model uses a continuous function to describe the probability of occurrence of an operant behavior, the other one employs an abrupt switch of behavioral state. Comparison of ERMs showed that only the latter was able to produce patterns similar to the experimental results, indicative of the operation of an abrupt switch from one behavioral state to another over the course of the inter-reinforcement period. This example demonstrated the ERM to be a useful tool for the analysis of
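
    The underlying construction can be sketched with the ordinary first-return map (the paper's extended variant adds further processing): each inter-response time is plotted against the next, so clustering in the (IRT_n, IRT_{n+lag}) plane reveals temporal structure invisible in the raw series. The IRT values below are hypothetical.

```python
def return_map(irts, lag=1):
    """Pairs (IRT_n, IRT_{n+lag}) for a sequence of inter-response times."""
    return list(zip(irts, irts[lag:]))

# a stereotyped fixed-interval-like series: a long post-reinforcement
# pause followed by a burst of short IRTs (made-up numbers)
irts = [8.0, 0.5, 0.4, 0.5, 7.9, 0.6, 0.5, 0.4]
points = return_map(irts)
```

    On FI-like data the points separate into long-to-short, short-to-short, and short-to-long clusters, which is the kind of pattern the ERM analysis quantifies.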

  15. Crater Mapping in the Pluto-Charon System: Considerations, Approach, and Progress

    NASA Astrophysics Data System (ADS)

    Robbins, S. J.; Singer, K. N.; Bray, V. J.; Schenk, P.; Zangari, A. M.; McKinnon, W. B.; Young, L. A.; Runyon, K. D.; Beyer, R. A.; Porter, S.; Lauer, T.; Weaver, H. A., Jr.; Olkin, C.; Ennico Smith, K.; Stern, A.

    2015-12-01

    NASA's New Horizons mission successfully made its closest approach to Pluto on July 14, 2015, at 11:49 A.M. UTC. The flyby nature of the mission, distance to the system, and multiple planetary bodies to observe with a diverse instrument set required a complex imaging campaign marked by numerous trade-offs; these led to more complicated crater population mapping than for a basic orbital mission. The Pluto and Charon imaging campaigns were full-disk or mosaics of the full disk until ≈3.5 hrs before closest approach when the pixel scale was 0.9 km/px. After this, several LORRI-specific imaging campaigns were conducted of the partial disk and later the full crescent, while additional strips were ride-alongs with other instruments. These should supply partial coverage at up to 70-80 m/px for Pluto and 160 m/px for Charon. The LORRI coverage at ≈0.4 km/px does not cover the entire encounter hemisphere, but the MVIC instrument provided comparable full-disk coverage (0.5 km/px) and partial disk at 0.3 km/px. The best images of the non-encounter hemispheres of Pluto and Charon are ≈21 km/px (taken midnight July 10-11). As with any single flyby mission, we are constrained by the best pixel scales and incidence angles at which images were taken during the flyby. While most high-resolution imaging by quantity has been done over areas of variable solar incidence as the spacecraft passed by Pluto and Charon, these cover a relatively small fraction of the bodies and most coverage has been at near-noon sun, which makes crater identification difficult. Numerous team members are independently using a variety of crater mapping tools and image products, which will be reconciled and merged to make a more robust final database. We will present our consensus crater database to-date of both plutonian and charonian impact craters as well as correlations with preliminary geologic units. We will also discuss how the crater population compares with predictions and modeled Kuiper Belt

  16. An approach for mapping large-area impervious surfaces: Synergistic use of Landsat-7 ETM+ and high spatial resolution imagery

    USGS Publications Warehouse

    Yang, L.; Huang, C.; Homer, C.G.; Wylie, B.K.; Coan, M.J.

    2003-01-01

    A wide range of urban ecosystem studies, including urban hydrology, urban climate, land use planning, and resource management, require current and accurate geospatial data of urban impervious surfaces. We developed an approach to quantify urban impervious surfaces as a continuous variable by using multisensor and multisource datasets. Subpixel percent impervious surfaces at 30-m resolution were mapped using a regression tree model. The utility, practicality, and affordability of the proposed method for large-area imperviousness mapping were tested over three spatial scales (Sioux Falls, South Dakota; Richmond, Virginia; and the Chesapeake Bay areas of the United States). Average error of predicted versus actual percent impervious surface ranged from 8.8 to 11.4%, with correlation coefficients from 0.82 to 0.91. The approach is being implemented to map impervious surfaces for the entire United States as one of the major components of the circa 2000 national land cover database.
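
    The core of a regression tree is recursive splitting on predictor values to minimize squared error of the predicted percentage. A minimal single-split ("stump") sketch makes the idea concrete; the reflectance/label pairs below are hypothetical, and a production model like the one in the study would recurse and use many spectral predictors.

```python
def best_split(xs, ys):
    """Find the single threshold on x minimizing the summed squared
    error of per-side mean predictions; returns (threshold,
    left_mean, right_mean)."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        score = sse(left) + sse(right)
        if best is None or score < best[0]:
            best = (score, t, sum(left) / len(left),
                    sum(right) / len(right))
    _, t, lm, rm = best
    return t, lm, rm

xs = [0.10, 0.12, 0.15, 0.40, 0.45, 0.50]   # e.g. one band's reflectance
ys = [5.0, 8.0, 10.0, 70.0, 80.0, 85.0]     # training % impervious
threshold, low_pred, high_pred = best_split(xs, ys)
```

    The stump separates the vegetated-looking pixels from the built-up ones; a full tree repeats this split inside each partition.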

  17. Higgs Boson Discovery and Properties

    SciTech Connect

    Rowson, Peter C.

    2003-06-02

    We outline issues examined and progress made by the Light Higgs Snowmass 1996 working group regarding discovering Higgs bosons and measuring their detailed properties. We focused primarily on what could be learned at LEP2, the Tevatron (after upgrade), the LHC, a next linear e{sup +}e{sup -} collider and a {mu}{sup +}{mu}{sup -} collider.

  18. Boson sampling with Gaussian measurements

    NASA Astrophysics Data System (ADS)

    Chakhmakhchyan, L.; Cerf, N. J.

    2017-09-01

    We develop an alternative boson sampling model operating on single-photon states followed by linear interferometry and Gaussian measurements. The hardness proof for simulating such continuous-variable measurements is established in two main steps, making use of the symmetry of quantum evolution under time reversal. Namely, we first construct a twofold version of scattershot boson sampling in which, as opposed to the original proposal, both legs of a collection of two-mode squeezed vacuum states undergo parallel linear-optical transformations. This twofold scattershot model yields, as a corollary, an instance of boson sampling from Gaussian states where photon counting is hard to simulate. Then, a time-reversed setup is used to exhibit a boson sampling model in which the simulation of Gaussian measurements—namely the outcome of eight-port homodyne detection—is proven to be computationally hard. These results illustrate how the symmetry of quantum evolution under time reversal may serve as a tool for analyzing the computational complexity of novel physically motivated computational problems.
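
    The computational hardness referenced above ultimately traces back to matrix permanents, which give boson sampling output amplitudes and have no known efficient classical algorithm. A compact (exponential-time) evaluation via Ryser's formula, for illustration only:

```python
from itertools import combinations

def permanent(A):
    """Permanent of an n x n matrix via Ryser's inclusion-exclusion
    formula: O(2^n * n^2), the best known style of exact algorithm."""
    n = len(A)
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in A:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** (n - r) * prod
    return total

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # permanent 1
ones3 = [[1, 1, 1]] * 3                  # permanent 3! = 6
```

    Unlike the determinant, no row-reduction shortcut exists, which is exactly why sampling from these distributions is believed classically hard.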

  19. Tailoring online information retrieval to user's needs based on a logical semantic approach to natural language processing and UMLS mapping.

    PubMed

    Kossman, Susan; Jones, Josette; Brennan, Patricia Flatley

    2007-10-11

    Depression can derail teenagers' lives and cause serious chronic health problems. Acquiring pertinent knowledge and skills supports care management, but retrieving appropriate information can be difficult. This poster presents a strategy to tailor online information to user attributes using a logical semantic approach to natural language processing (NLP) and mapping propositions to UMLS terms. This approach capitalizes on existing NLM resources and presents a potentially sustainable plan for meeting consumers' and providers' information needs.

  20. Stimulating Graphical Summarization in Late Elementary Education: The Relationship between Two Instructional Mind-Map Approaches and Student Characteristics

    ERIC Educational Resources Information Center

    Merchie, Emmelien; Van Keer, Hilde

    2016-01-01

    This study examined the effectiveness of two instructional mind-mapping approaches to stimulate fifth and sixth graders' graphical summarization skills. Thirty-five fifth- and sixth-grade teachers and 644 students from 17 different elementary schools participated. A randomized quasi-experimental repeated-measures design was set up with two…

  1. The Effect of Concept Mapping-Guided Discovery Integrated Teaching Approach on Chemistry Students' Achievement and Retention

    ERIC Educational Resources Information Center

    Fatokun, K. V. F.; Eniayeju, P. A.

    2014-01-01

    This study investigates the effects of Concept Mapping-Guided Discovery Integrated Teaching Approach on the achievement and retention of chemistry students. The sample comprised 162 Senior Secondary two (SS 2) students drawn from two Science Schools in Nasarawa State, Central Nigeria with equivalent mean scores of 9.68 and 9.49 in their pre-test.…

  2. Making sense of human ecology mapping: an overview of approaches to integrating socio-spatial data into environmental planning

    Treesearch

    Rebecca McLain; Melissa R. Poe; Kelly Biedenweg; Lee K. Cerveny; Diane Besser; Dale J. Blahna

    2013-01-01

    Ecosystem-based planning and management have stimulated the need to gather sociocultural values and human uses of land in formats accessible to diverse planners and researchers. Human Ecology Mapping (HEM) approaches offer promising spatial data gathering and analytical tools, while also addressing important questions about human-landscape connections. This article...

  3. Using an empirical and rule-based modeling approach to map cause of disturbance in U.S

    Treesearch

    Todd A. Schroeder; Gretchen G. Moisen; Karen Schleeweis; Chris Toney; Warren B. Cohen; Zhiqiang Yang; Elizabeth A. Freeman

    2015-01-01

    Recently completing over a decade of research, the NASA/NACP funded North American Forest Dynamics (NAFD) project has led to several important advancements in the way U.S. forest disturbance dynamics are mapped at regional and continental scales. One major contribution has been the development of an empirical and rule-based modeling approach which addresses two of the...

  4. A dominance-based approach to map risks of ecological invasions in the presence of severe uncertainty

    Treesearch

    Denys Yemshanov; Frank H. Koch; D. Barry Lyons; Mark Ducey; Klaus Koehler

    2012-01-01

    Aim Uncertainty has been widely recognized as one of the most critical issues in predicting the expansion of ecological invasions. The uncertainty associated with the introduction and spread of invasive organisms influences how pest management decision makers respond to expanding incursions. We present a model-based approach to map risk of ecological invasions that...

  7. Mapping Trends in Pedagogical Approaches and Learning Technologies: Perspectives from the Canadian, International, and Military Education Contexts

    ERIC Educational Resources Information Center

    Scoppio, Grazia; Covell, Leigha

    2016-01-01

    Increased technological advances, coupled with new learners' needs, have created new realities for higher education contexts. This study explored and mapped trends in pedagogical approaches and learning technologies in postsecondary education and identified how these innovations are affecting teaching and learning practices in higher education…

  8. Agricultural Land Use mapping by multi-sensor approach for hydrological water quality monitoring

    NASA Astrophysics Data System (ADS)

    Brodsky, Lukas; Kodesova, Radka; Kodes, Vit

    2010-05-01

    The main objective of this study is to demonstrate the potential of operational use of high and medium resolution remote sensing data for hydrological water quality monitoring by mapping agriculture intensity and crop structures; in particular, the use of remote sensing mapping to optimize pesticide monitoring. The agricultural mapping task is tackled by means of medium spatial and high temporal resolution ESA Envisat MERIS FR images together with a single high spatial resolution IRS AWiFS image covering the whole area of interest (the Czech Republic). High resolution data (e.g. SPOT, ALOS, Landsat) are often used for agricultural land use classification, but usually only at regional or local level due to data availability and financial constraints. AWiFS data (nominal spatial resolution 56 m), due to the wide satellite swath, seem more suitable for use at national level. Nevertheless, one of the critical issues for such a classification is to have sufficient image acquisitions over the whole vegetation period to describe crop development in an appropriate way. ESA MERIS middle-resolution data were used in several studies for crop classification. The high temporal and also spectral resolution of MERIS data has an indisputable advantage for crop classification. However, the spatial resolution of 300 m results in a mixed signal within a single pixel. AWiFS-MERIS data synergy brings new perspectives in agricultural Land Use mapping. Also, the developed methodology is fully compatible with future use of ESA (GMES) Sentinel satellite images. The applied hybrid multi-sensor approach consists of these main stages: a/ parcel segmentation and spectral pre-classification of the high resolution image (AWiFS); b/ ingestion of middle resolution (MERIS) vegetation spectro-temporal features; c/ vegetation signatures unmixing; and d/ semantic object-oriented classification of vegetation classes into the final classification scheme. These crop groups were selected to be

  9. The derivation of tropospheric column ozone using the TOR approach and mapping technique

    NASA Astrophysics Data System (ADS)

    Yang, Qing

    2007-12-01

    Tropospheric ozone columns (TCOs) derived from differences between the Dutch-Finnish Aura Ozone Monitoring Instrument (OMI) measurements of the total atmospheric ozone column and the Aura Microwave Limb Sounder (MLS) measurements of stratospheric ozone columns are discussed. Because the measurements by these two instruments are not spatially coincident, interpolation techniques, with emphasis on mapping the stratospheric columns in space and time using the relationships between lower stratospheric ozone and potential vorticity (PV) and geopotential heights (Z), are evaluated at mid-latitudes. It is shown that this PV mapping procedure produces somewhat better agreement in comparisons with ozonesonde measurements, particularly in winter, than does simple linear interpolation of the MLS stratospheric columns or the use of typical coincidence criteria at mid-latitudes. The OMI/MLS derived tropospheric columns are calculated to be 4 Dobson units (DU) smaller than the sonde measured columns at mid-latitudes. This mean difference is consistent with the MLS (version 1.5) stratospheric ozone columns being high relative to Stratospheric Aerosol and Gas Experiment (SAGE II) columns by 3 DU. Standard deviations between the derived tropospheric columns and those measured by ozonesondes are 9 DU (30%) annually but they are just 6 DU (15%) in summer. Uncertainties in the interpolated MLS stratospheric columns are likely to be the primary cause of these standard deviations. An important advantage of the PV mapping approach is that it works well when MLS data are missing (e.g., when an orbit of measurements is missing). In the comparisons against ozonesonde measurements, it provides up to twice as many comparisons compared to the other techniques. The OMI/MLS derived tropospheric ozone columns have been compared with corresponding columns based on the Tropospheric Emission Spectrometer (TES) measurements, and Regional chEmical trAnsport Model (REAM) simulations. The variability of
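
    The residual method itself is a subtraction: tropospheric column = total column (OMI) minus a stratospheric column mapped to the OMI footprint. A minimal sketch using plain linear interpolation in time stands in for the study's PV-based mapping; all column values are hypothetical.

```python
def interp(t, t0, v0, t1, v1):
    """Linear interpolation between (t0, v0) and (t1, v1)."""
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

def tropospheric_column(total_du, t, mls_samples):
    """Residual TCO in Dobson units; mls_samples is a sorted list of
    (time, stratospheric column in DU) pairs bracketing time t."""
    for (t0, v0), (t1, v1) in zip(mls_samples, mls_samples[1:]):
        if t0 <= t <= t1:
            return total_du - interp(t, t0, v0, t1, v1)
    raise ValueError("time outside MLS coverage")

# total OMI column 300 DU, MLS stratospheric columns 262 and 266 DU
tco = tropospheric_column(300.0, 1.5, [(1.0, 262.0), (2.0, 266.0)])
```

    The PV mapping replaces the interpolation step with a dynamically informed estimate, which is why it degrades more gracefully when MLS orbits are missing.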

  10. Dynamics of open bosonic quantum systems in coherent state representation

    SciTech Connect

    Dalvit, D. A. R.; Berman, G. P.; Vishik, M.

    2006-01-15

    We consider the problem of decoherence and relaxation of open bosonic quantum systems from a perspective alternative to the standard master equation or quantum trajectories approaches. Our method is based on the dynamics of expectation values of observables evaluated in a coherent state representation. We examine a model of a quantum nonlinear oscillator with a density-density interaction with a collection of environmental oscillators at finite temperature. We derive the exact solution for dynamics of observables and demonstrate a consistent perturbation approach.

  11. Mapping Agricultural Fields in Sub-Saharan Africa with a Computer Vision Approach

    NASA Astrophysics Data System (ADS)

    Debats, S. R.; Luo, D.; Estes, L. D.; Fuchs, T.; Caylor, K. K.

    2014-12-01

    Sub-Saharan Africa is an important focus for food security research, because it is experiencing unprecedented population growth, agricultural activities are largely dominated by smallholder production, and the region is already home to 25% of the world's undernourished. One of the greatest challenges to monitoring and improving food security in this region is obtaining an accurate accounting of the spatial distribution of agriculture. Households are the primary units of agricultural production in smallholder communities and typically rely on small fields of less than 2 hectares. Field sizes are directly related to household crop productivity, management choices, and adoption of new technologies. As population and agriculture expand, it becomes increasingly important to understand both the distribution of field sizes as well as how agricultural communities are spatially embedded in the landscape. In addition, household surveys, a common tool for tracking agricultural productivity in Sub-Saharan Africa, would greatly benefit from spatially explicit accounting of fields. Current gridded land cover data sets do not provide information on individual agricultural fields or the distribution of field sizes. Therefore, we employ cutting edge approaches from the field of computer vision to map fields across Sub-Saharan Africa, including semantic segmentation, discriminative classifiers, and automatic feature selection. Our approach aims to not only improve the binary classification accuracy of cropland, but also to isolate distinct fields, thereby capturing crucial information on size and geometry. Our research focuses on the development of descriptive features across scales to increase the accuracy and geographic range of our computer vision algorithm. Relevant data sets include high-resolution remote sensing imagery and Landsat (30-m) multi-spectral imagery. Training data for field boundaries is derived from hand-digitized data sets as well as crowdsourcing.

  12. Development of Return Period Inundation Maps In A Changing Climate Using a Systems of Systems Approach

    NASA Astrophysics Data System (ADS)

    Bilskie, M. V.; Hagen, S. C.; Alizad, K.; Passeri, D. L.; Irish, J. L.

    2016-12-01

    Worldwide, coastal land margins are experiencing increased vulnerability from natural and manmade disasters [Nicholls et al., 1999]. Specifically, coastal flooding is expected to increase due to the effects of climate change, and sea level rise (SLR) in particular. A systems of systems (SoS) approach has been implemented to better understand the dynamic and nonlinear effects of SLR on tropical cyclone-induced coastal flooding along a low-gradient coastal landscape (northern Gulf of Mexico [NGOM]). The backbone of the SoS framework is a high-resolution, physics-based, tide, wind-wave, and hurricane storm surge model [Bilskie et al., 2016a] that includes systems of SLR scenarios [Parris et al., 2012], shoreline morphology [Passeri et al., 2016; Plant et al., 2016], marsh evolution [Alizad et al., 2016], and population dynamics driven by carbon emission scenarios [Bilskie et al., 2016b]. Prior to considering future conditions, the storm surge model was comprehensively validated for present-day conditions [Bilskie et al., 2016a]. The present-day model was then modified to represent the potential future landscape based on four SLR scenarios prescribed by Parris et al. [2012] linked to carbon emissions scenarios for the year 2100. Numerical simulations forced by hundreds of synthetic tropical cyclones were performed and the results facilitate the development of return period inundation maps across the NGOM that reflect changes to the coastal landscape. The SoS approach allows new patterns and properties to emerge (i.e. nonlinear and dynamic effects of SLR) that would otherwise be unobserved using a static SLR model.

  13. 3D mapping of airway wall thickening in asthma with MSCT: a level set approach

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Brillet, Pierre-Yves; Hartley, Ruth; Grenier, Philippe A.; Brightling, Christopher

    2014-03-01

    Assessing the airway wall thickness in multi slice computed tomography (MSCT) as an image marker for airway disease phenotyping, such as asthma and COPD, is a current trend and challenge for the scientific community working in lung imaging. This paper addresses the same problem from a different point of view: considering the expected wall thickness-to-lumen-radius ratio for a normal subject as known and constant throughout the whole airway tree, the aim is to build up a 3D map of airway wall regions of larger thickness and to define an overall score able to highlight a pathological status. In this respect, the local dimension (caliber) of the previously segmented airway lumen is obtained at each point by exploiting the granulometry morphological operator. A level set function is defined based on this caliber information and on the expected wall thickness ratio, which yields a good estimate of the airway wall throughout all segmented lumen generations. Next, the vascular (or mediastinal dense tissue) contact regions are automatically detected and excluded from analysis. For the remaining airway wall border points, the real wall thickness is estimated based on tissue density analysis in the airway radial direction; thick wall points are highlighted on a 3D representation of the airways and several quantification scores are defined. The proposed approach is fully automatic and was evaluated (proof of concept) on a selection of patients from different databases, including mild and severe asthmatics and normal cases. This preliminary evaluation confirms the discriminative power of the proposed approach regarding different phenotypes and is currently being extended to larger cohorts.

  14. Improving semi-automated glacier mapping with a multi-method approach: applications in central Asia

    NASA Astrophysics Data System (ADS)

    Smith, T.; Bookhagen, B.; Cannon, F.

    2015-09-01

    Studies of glaciers generally require precise glacier outlines. Where these are not available, extensive manual digitization in a geographic information system (GIS) must be performed, as current algorithms struggle to delineate glacier areas with debris cover or other irregular spectral profiles. Although several approaches have improved upon spectral band ratio delineation of glacier areas, none have entered wide use due to complexity or computational intensity. In this study, we present and apply a glacier mapping algorithm in Central Asia which delineates both clean glacier ice and debris-covered glacier tongues. The algorithm is built around the unique velocity and topographic characteristics of glaciers and further leverages spectral and spatial relationship data. We found that the algorithm misclassifies between 2 and 10 % of glacier areas, as compared to a ~ 750 glacier control data set, and can reliably classify a given Landsat scene in 3-5 min. The algorithm does not completely solve the difficulties inherent in classifying glacier areas from remotely sensed imagery but does represent a significant improvement over purely spectral-based classification schemes, such as the band ratio of Landsat 7 bands three and five or the normalized difference snow index. The main caveats of the algorithm are (1) classification errors at an individual glacier level, (2) reliance on manual intervention to separate connected glacier areas, and (3) dependence on fidelity of the input Landsat data.
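
    The spectral baseline the algorithm improves on is simple: clean ice is bright in the visible/near-infrared and dark in the shortwave infrared, so a band ratio (e.g. Landsat red/SWIR) or NDSI threshold flags ice pixels but fails over debris cover. The thresholds and reflectances below are illustrative, not the study's values.

```python
def is_ice_band_ratio(red, swir, threshold=2.0):
    """Classic band-ratio test: clean ice gives a large red/SWIR ratio."""
    return swir > 0 and red / swir > threshold

def ndsi(green, swir):
    """Normalized difference snow index."""
    return (green - swir) / (green + swir)

clean_ice = is_ice_band_ratio(red=0.60, swir=0.08)   # bright ice passes
debris = is_ice_band_ratio(red=0.25, swir=0.20)      # debris cover fails
```

    Because debris-covered tongues fail both tests, the algorithm above supplements spectral evidence with velocity and topographic cues.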

  15. A Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) Determined from Phased Microphone Arrays

    NASA Technical Reports Server (NTRS)

    Brooks, Thomas F.; Humphreys, William M., Jr.

    2004-01-01

    Current processing of acoustic array data is burdened with considerable uncertainty. This study reports an original methodology that serves to demystify array results, reduce misinterpretation, and accurately quantify position and strength of acoustic sources. Traditional array results represent noise sources that are convolved with array beamform response functions, which depend on array geometry, size (with respect to source position and distributions), and frequency. The Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) method removes beamforming characteristics from output presentations. A unique linear system of equations accounts for reciprocal influence at different locations over the array survey region. It makes no assumption beyond the traditional processing assumption of statistically independent noise sources. The full rank equations are solved with a new robust iterative method. DAMAS is quantitatively validated using archival data from a variety of prior high-lift airframe component noise studies, including flap edge/cove, trailing edge, leading edge, slat, and calibration sources. Presentations are explicit and straightforward, as the noise radiated from a region of interest is determined by simply summing the mean-squared values over that region. DAMAS can fully replace existing array processing and presentations methodology in most applications. It appears to dramatically increase the value of arrays to the field of experimental acoustics.
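
    The DAMAS inversion can be sketched as solving Y = A X, where Y is the beamform map, A collects the array point-spread responses, and X holds the source strengths, using an iterative sweep with a non-negativity clamp. This toy 2x2 Gauss-Seidel version uses a made-up point-spread matrix and is not the paper's full formulation.

```python
def damas_solve(A, Y, iters=200):
    """Gauss-Seidel iteration for Y = A X with X >= 0 enforced
    after each update (the non-negativity of source power)."""
    n = len(Y)
    X = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            r = Y[i] - sum(A[i][j] * X[j] for j in range(n) if j != i)
            X[i] = max(r / A[i][i], 0.0)
    return X

A = [[1.0, 0.3],   # hypothetical beamforming point-spread matrix
     [0.3, 1.0]]
X_true = [4.0, 1.0]                                   # true source powers
Y = [sum(a * x for a, x in zip(row, X_true)) for row in A]  # beamform map
X = damas_solve(A, Y)
```

    After the iteration converges, summing X over a region of interest gives the radiated noise from that region, which is the straightforward presentation the abstract describes.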

  16. A Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) Determined from Phased Microphone Arrays

    NASA Technical Reports Server (NTRS)

    Brooks, Thomas F.; Humphreys, William M.

    2006-01-01

    Current processing of acoustic array data is burdened with considerable uncertainty. This study reports an original methodology that serves to demystify array results, reduce misinterpretation, and accurately quantify position and strength of acoustic sources. Traditional array results represent noise sources that are convolved with array beamform response functions, which depend on array geometry, size (with respect to source position and distributions), and frequency. The Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) method removes beamforming characteristics from output presentations. A unique linear system of equations accounts for reciprocal influence at different locations over the array survey region. It makes no assumption beyond the traditional processing assumption of statistically independent noise sources. The full rank equations are solved with a new robust iterative method. DAMAS is quantitatively validated using archival data from a variety of prior high-lift airframe component noise studies, including flap edge/cove, trailing edge, leading edge, slat, and calibration sources. Presentations are explicit and straightforward, as the noise radiated from a region of interest is determined by simply summing the mean-squared values over that region. DAMAS can fully replace existing array processing and presentations methodology in most applications. It appears to dramatically increase the value of arrays to the field of experimental acoustics.

  17. Fractionation profiling: a fast and versatile approach for mapping vesicle proteomes and protein–protein interactions

    PubMed Central

    Borner, Georg H. H.; Hein, Marco Y.; Hirst, Jennifer; Edgar, James R.; Mann, Matthias; Robinson, Margaret S.

    2014-01-01

    We developed “fractionation profiling,” a method for rapid proteomic analysis of membrane vesicles and protein particles. The approach combines quantitative proteomics with subcellular fractionation to generate signature protein abundance distribution profiles. Functionally associated groups of proteins are revealed through cluster analysis. To validate the method, we first profiled >3500 proteins from HeLa cells and identified known clathrin-coated vesicle proteins with >90% accuracy. We then profiled >2400 proteins from Drosophila S2 cells, and we report the first comprehensive insect clathrin-coated vesicle proteome. Of importance, the cluster analysis extends to all profiled proteins and thus identifies a diverse range of known and novel cytosolic and membrane-associated protein complexes. We show that it also allows the detailed compositional characterization of complexes, including the delineation of subcomplexes and subunit stoichiometry. Our predictions are presented in an interactive database. Fractionation profiling is a universal method for defining the clathrin-coated vesicle proteome and may be adapted for the analysis of other types of vesicles and particles. In addition, it provides a versatile tool for the rapid generation of large-scale protein interaction maps. PMID:25165137
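
    The cluster-analysis step can be illustrated with a minimal correlation-based grouping of fractionation profiles. The protein names, number of fractions, and the 0.99 correlation threshold below are invented for illustration and are not from the paper:

```python
import numpy as np

# toy abundance profiles across 5 subcellular fractions (hypothetical names)
profiles = {
    "clathrin_hc": [0.05, 0.10, 0.50, 0.25, 0.10],
    "clathrin_lc": [0.06, 0.11, 0.48, 0.25, 0.10],
    "ap2_mu":      [0.05, 0.12, 0.49, 0.24, 0.10],
    "ribosome_l":  [0.40, 0.30, 0.10, 0.10, 0.10],
}

def corr(a, b):
    """Pearson correlation between two abundance profiles."""
    return float(np.corrcoef(np.asarray(a, float), np.asarray(b, float))[0, 1])

# greedy grouping: a protein joins the first cluster whose seed profile
# it correlates with above the threshold, otherwise it seeds a new cluster
clusters = []
for name, prof in profiles.items():
    for c in clusters:
        if corr(prof, profiles[c[0]]) > 0.99:
            c.append(name)
            break
    else:
        clusters.append([name])
```

    Proteins with near-identical distribution signatures (here the three coat components) end up in one cluster, while the dissimilar profile is left on its own; the paper's actual analysis uses far more fractions and proteins.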

  18. Flux Optimization in Human Specific Map-Kinase Pathways: A Systems Biology Approach to Study Cancer

    NASA Astrophysics Data System (ADS)

    Sahu, Sombeet

    2010-10-01

    Mitogen-Activated Protein Kinases (MAP kinases) transduce signals involved in a multitude of cellular pathways and functions in response to a variety of ligands and cell stimuli. Aberrant or inappropriate MAPK function has now been identified in diseases ranging from cancer to Alzheimer's disease to leishmaniasis; however, the pathway map is still growing, and little is known about its dynamics. Here we model the MAPK metabolic pathways to find the key metabolites and reactions whose perturbation affects the transcription factors. The approach we used for modeling this pathway is Flux Balance Analysis (FBA). We further established that the growth factors EGF and PDGF are also responsible for determining downstream species concentrations. Tuning the parameters gave the optimum growth-factor kinetics for which the downstream events were at a minimum. The Ras and Braf steady-state concentrations were also significantly affected when the growth-factor kinetics were tuned. This type of study can shed light on controlling various diseases and may help identify important drug targets.
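
    Flux Balance Analysis reduces to a linear program: maximize an objective flux subject to steady-state stoichiometry (S v = 0) and flux bounds. A minimal sketch on a toy three-reaction network (not the authors' MAPK model), using SciPy's `linprog`:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network (illustrative only):
#   v1: uptake -> A   (bounded at 10 flux units)
#   v2: A -> B
#   v3: B -> output   (the objective flux)
# Stoichiometric matrix S, rows = metabolites A and B
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

# FBA: maximize v3 subject to S v = 0 and bounds.
# linprog minimizes, so the objective coefficient on v3 is negated.
res = linprog(c=[0.0, 0.0, -1.0],
              A_eq=S, b_eq=[0.0, 0.0],
              bounds=[(0, 10), (0, None), (0, None)],
              method="highs")
optimal_flux = res.x  # the uptake bound propagates: all three fluxes hit 10
```

    Perturbation studies of the kind the abstract describes then amount to re-solving the program with altered bounds or stoichiometry and watching how the optimal flux distribution shifts.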

  19. Densely mapping the phase diagram of cuprate superconductors using a spatial composition spread approach

    NASA Astrophysics Data System (ADS)

    Saadat, Mehran; George, A. E.; Hewitt, Kevin C.

    2010-12-01

    A novel spatial composition spread approach was used successfully to deposit a 52-member library of La2-xSrxCuO4 (0 ⩽ x ⩽ 0.18) using magnetron sputtering combined with physical masking techniques. Two homemade targets of La2CuO4 and La1.82Sr0.18CuO4 were sputtered at a power of 41 W RF and 42 W DC, respectively, in a process gas of 15 mTorr argon. The libraries were sputtered onto LaSrAlO4 (0 0 1), SrTiO3 (1 0 0) and MgO (1 0 0) substrates through a 52-slot shadow mask; a -20 V substrate bias was applied to prevent resputtering. The resulting amorphous films were post-annealed (800 °C for 1 h, then 950 °C for 2 h) in a tube sealed with oxygen gas. Wavelength Dispersive Spectroscopy (WDS) analysis revealed the expected linear variation of Sr content from 0 to 0.18, with an approximate change of 0.003 per library member. Transport measurements revealed superconducting transitions as well as changes in the quasiparticle scattering rate. These transitions and scattering-rate changes were mapped to produce the temperature versus hole-concentration phase diagram.
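
    The reported gradient follows from simple interpolation: spreading x linearly over the 52 mask slots between the two target compositions gives a step of 0.18/51 ≈ 0.0035 per member, close to the approximate 0.003 quoted. A sketch, assuming the endpoints are fixed by the two targets:

```python
# Sr content per member of a 52-slot La2-xSrxCuO4 spread, assuming a
# linear gradient from x = 0 (La2CuO4 target) to x = 0.18 (doped target)
n_members = 52
x_values = [0.18 * i / (n_members - 1) for i in range(n_members)]
step = x_values[1] - x_values[0]   # composition change per library member
```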

  20. Mapping and structural dissection of human 20 S proteasome using proteomic approaches.

    PubMed

    Claverol, Stephane; Burlet-Schiltz, Odile; Girbal-Neuhauser, Elisabeth; Gairin, Jean Edouard; Monsarrat, Bernard

    2002-08-01

    The proteasome, a proteolytic complex present in all eukaryotic cells, is part of the ATP-dependent ubiquitin/proteasome pathway. It plays a critical role in the regulation of many physiological processes. The 20 S proteasome, the catalytic core of the 26 S proteasome, is made of four stacked rings of seven subunits each (alpha7beta7beta7alpha7). Here we studied the human 20 S proteasome using proteomics. This led to the establishment of a fine subunit reference map and to the identification of post-translational modifications. We found that the human 20 S proteasome, purified from erythrocytes, exhibited a high degree of structural heterogeneity, characterized by the presence of multiple isoforms for most of the alpha and beta subunits, including the catalytic ones, resulting in a total of at least 32 visible spots after Coomassie Blue staining. The different isoforms of a given subunit displayed shifted pI values, suggesting that they likely resulted from post-translational modifications. We then took advantage of the efficiency of complementary mass spectrometric approaches to investigate further these protein modifications at the structural level. In particular, we focused our efforts on the alpha7 subunit and characterized its N-acetylation and its phosphorylation site localized on Ser(250).