Science.gov

Sample records for boson mapping approach

  1. A general approach to bosonization

    NASA Astrophysics Data System (ADS)

    Setlur, Girish S.; Meera, V.

    2007-10-01

    We summarize recent developments in higher-dimensional bosonization made by Setlur and collaborators and propose a general formula for the field operator in one dimension in terms of currents and densities, using a new ingredient known as a `singular complex number'. Using this formalism, we compute the Green function of the homogeneous electron gas in one spatial dimension, both with short-range interactions, which lead to the Luttinger liquid, and with long-range interactions, which lead to a Wigner crystal whose recently computed momentum distribution exhibits essential singularities. We generalize the formalism to finite temperature by combining it with the author's hydrodynamic approach. The one-particle Green function of this system with essential singularities cannot easily be computed using the traditional approach to bosonization, which involves introducing momentum cutoffs; hence the more general formalism presented here is proposed as a suitable alternative.

  2. Schematic microscopic approach to the description of M1 transitions between mixed-symmetry and fully symmetric collective states in γ-soft nuclei based on RPA-IBM boson mapping

    SciTech Connect

    Jolos, R. V.; Shirikova, N. Yu.; Voronov, V. V.; Pietralla, N.

    2011-07-15

    A schematic microscopic method is developed to calculate the M1 transition probabilities between the mixed-symmetry and the fully symmetric states in γ-soft nuclei. The method is based on the random-phase approximation-interacting boson model (RPA-IBM) boson mapping of the most collective isoscalar boson. All other boson modes with higher excitation energies, including the mixed-symmetry boson, are described in the framework of the RPA. As an example, the M1 transition probabilities are calculated for the 124-134Xe isotopes and compared with the experimental data. The results agree well with the data for the ratio B(M1; 1+_ms → 2+_2)/B(M1; 1+_ms → 0+_1). However, the calculated ratio B(M1; 2+_ms → 2+_1)/B(M1; 1+_ms → 0+_1) shows a significantly weaker dependence on the mass number than the experimental data.

  3. Similarity-transformed Dyson mapping and the sdg-interacting boson Hamiltonian

    NASA Astrophysics Data System (ADS)

    Navrátil, P.; Dobeš, J.

    1991-10-01

    The sdg-interacting boson Hamiltonian is constructed from fermion shell-model input. The seniority boson mapping, as given by the similarity-transformed Dyson boson mapping, is used. The s, d, and g collective boson amplitudes are determined consistently from the mapped Hamiltonian. The influence of the starting shell-model parameters is discussed. Calculations for the Sm isotopic chain and for the 148Sm, 150Nd, and 196Pt nuclei are presented. Calculated energy levels as well as E2 and E4 properties agree rather well with experimental ones. To obtain such agreement, the input shell-model parameters cannot be fixed at a single constant set for several nuclei but have to be varied somewhat, especially in the deformed region. Possible reasons for this variation are discussed. Effects of the explicit inclusion of the g boson are shown.

  4. Bosonization approach for "atomic collapse" in graphene

    NASA Astrophysics Data System (ADS)

    Kagimura, Aya; Onogi, Tetsuya

    2016-02-01

    We study quantum electrodynamics with a (2+1)-dimensional massless Dirac fermion around a Coulomb impurity. Around a large charge with atomic number Z > 137, the QED vacuum is expected to collapse due to the strong Coulomb force. While relativistic quantum mechanics fails to make reliable predictions for the fate of the vacuum, heavy-ion collision experiments have also not given a clear understanding of this system. Recently, "atomic collapse" resonances were observed on graphene, where artificial nuclei can be made. In this paper, we present a nonperturbative study, using the bosonization method, of the vacuum structure of the quasiparticles in graphene with a charge impurity, including many-body effects.

  5. TFD Approach to Bosonic Strings and Dp-Branes

    NASA Astrophysics Data System (ADS)

    Abdalla, M. C. B.; Gadelha, A. L.; Vancea, I. V.

    In this work we explain the construction of the thermal vacuum for the bosonic string, as well as that of the thermal boundary state interpreted as a Dp-brane at finite temperature. In both cases we calculate the respective entropy using the entropy operator of Thermo Field Dynamics. We show that the contribution of the thermal string entropy is explicitly present in the Dp-brane entropy. Furthermore, we show that the Thermo Field approach is suitable for introducing temperature in boundary states.

  6. Bosonic Dp-branes at finite temperature in TFD approach

    NASA Astrophysics Data System (ADS)

    Abdalla, M. C. B.; Gadelha, A. L.; Vancea, I. V.

    2004-02-01

    A general formulation of Thermo Field Dynamics using transformation generators that form the SU(1, 1) group is presented and applied to the closed bosonic string and to a bosonic Dp-brane with an external field.

  7. Self-consistent Hartree-Fock approach for interacting bosons in optical lattices

    NASA Astrophysics Data System (ADS)

    Lü, Qin-Qin; Patton, Kelly R.; Sheehy, Daniel E.

    2014-12-01

    A theoretical study of interacting bosons in a periodic optical lattice is presented. Instead of the commonly used tight-binding approach (applicable near the Mott-insulating regime of the phase diagram), the present work starts from the exact single-particle states of bosons in a cubic optical lattice, satisfying the Mathieu equation, an approach that can be particularly useful at large boson fillings. The effects of short-range interactions are incorporated using a self-consistent Hartree-Fock approximation, and predictions for experimental observables such as the superfluid transition temperature, condensate fraction, and boson momentum distribution are presented.

  8. Map Projections: Approaches and Themes

    ERIC Educational Resources Information Center

    Steward, H. J.

    1970-01-01

    Map projections take on new meaning with the location systems needed for satellites, other planets, and space. A classroom approach deals first with the relationship between the earth and the globe, then with transformations to flat maps. Problems of preserving geometric qualities (distance, angles, directions) are dealt with in some detail, as are…

  9. Boson mapping techniques applied to constant gauge fields in QCD

    NASA Technical Reports Server (NTRS)

    Hess, Peter Otto; Lopez, J. C.

    1995-01-01

    Pairs of coordinates and derivatives of the constant gluon modes are mapped to new gluon-pair fields and their derivatives. Applying this mapping to the Hamiltonian of constant gluon fields results, for large coupling constants, in an effective Hamiltonian which separates into one part describing a scalar field and another for a field with spin two. The ground state is dominated by pairs of gluons coupled to color and spin zero, with slight admixtures of color-zero, spin-two pairs. As the color group we used SU(2).

  10. Mean-field plus various types of pairing models and an exact boson mapping of the standard pairing model

    SciTech Connect

    Pan Feng; Wang Yin; Guan Xin; Jia Lu; Chen Xiangrong; Draayer, J. P.

    2011-06-28

    Exact solutions of Nilsson mean-field with various pairing interactions are reviewed. Some even-odd mass differences and moments of inertia of low-lying states for rare earth and actinide nuclei are calculated for the nearest-orbit pairing approximation as well as for the extended pairing model and compared to available experimental data. An exact boson mapping of the standard pairing Hamiltonian is also reported. Under the mapping, fermion pair operators are mapped exactly onto corresponding bosons. The image of the mapping is a Bose-Hubbard model with orbit-dependent hopping.

  11. W± gauge boson production in the quantum statistical parton distributions approach

    NASA Astrophysics Data System (ADS)

    Bourrely, Claude; Buccella, Franco; Soffer, Jacques

    2013-10-01

    We consider W± gauge boson production in connection with recent results from BNL-RHIC and FNAL-Tevatron and interesting predictions from the statistical parton distributions. These concern relevant aspects of the structure of the nucleon sea and the high-x region of the valence quark distributions. We also give predictions in view of future proton-neutron collision experiments at BNL-RHIC.

  12. Nonunitary and unitary approaches to the eigenvalue problem of boson operators and squeezed coherent states

    NASA Technical Reports Server (NTRS)

    Wunsche, A.

    1993-01-01

    The eigenvalue problem of the operator a + ζa† (where a† is the boson creation operator) is solved for arbitrary complex ζ by applying a nonunitary operator to the vacuum state. This nonunitary approach is compared with the unitary approach, which for |ζ| < 1 leads to squeezed coherent states.
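
As a sketch of the nonunitary construction (standard boson notation assumed, with a the annihilation operator and |0⟩ the vacuum; the normalization is omitted):

```latex
\bigl(a + \zeta a^{\dagger}\bigr)\,\lvert\psi_{\lambda}\rangle
  = \lambda\,\lvert\psi_{\lambda}\rangle ,
\qquad
\lvert\psi_{\lambda}\rangle \;\propto\;
  \exp\!\Bigl(\lambda a^{\dagger} - \tfrac{\zeta}{2}\,a^{\dagger\,2}\Bigr)\lvert 0\rangle .
```

Since a f(a†)|0⟩ = f′(a†)|0⟩, acting with a on this state gives (λ − ζa†)|ψ_λ⟩, so the eigenvalue equation holds for any complex ζ; the state is normalizable, and coincides with a squeezed coherent state, only for |ζ| < 1.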

  13. Coherent state approach to the interacting boson model: Test of its validity in the transitional region

    SciTech Connect

    Inci, I.; Alonso, C. E.; Arias, J. M.; Fortunato, L.; Vitturi, A.

    2009-09-15

    The predictive power of the coherent state (CS) approach to the interacting boson model (IBM) is tested far from the IBM dynamical symmetry limits. The transitional region along the γ-unstable path from U(5) to O(6) is considered. The excitation energy of the excited β band and the intraband and interband transitions obtained within the CS approach are compared with the exact results as a function of the boson number N. We find that the CS formalism provides approximations to the exact results that are correct up to order 1/N in the transitional region, except in a narrow region close to the critical point.

  14. Mapping between the classical and pseudoclassical models of a relativistic spinning particle in external bosonic and fermionic fields. I

    NASA Astrophysics Data System (ADS)

    Markov, Yu. A.; Markova, M. A.

    2015-06-01

    The problem of mapping between two Lagrangian descriptions (using a commuting c-number spinor ψα, or anticommuting pseudovector ξμ and pseudoscalar ξ5 variables) of the spin degrees of freedom of a color-spinning massive particle interacting with a background non-Abelian gauge field is considered. A general analysis of the mapping between a pair of Majorana spinors (ψα, θα) (θα being an auxiliary anticommuting spinor) and a real anticommuting tensor aggregate (S, Vμ, T*μν, Aμ, P) is presented, and a complete system of bilinear relations between the tensor quantities is obtained. This analysis is then applied to the above problem of the equivalence of the two ways of describing the spin degrees of freedom of the relativistic particle. The mapping of the kinetic term (iħ/2)(θ̄θ)(ψ̄̇ψ − ψ̄ψ̇), of the term (1/e)(θ̄θ)ẋμ(ψ̄γμψ) that couples the spinning variable ψ to the particle velocity ẋμ, and of the interaction term ħ(θ̄θ)Qa Faμν(ψ̄σμνψ) with an external non-Abelian gauge field is considered in detail. In the first case a corresponding system of bilinear identities including both the tensor variables and their derivatives (Ṡ, V̇μ, Ṫ*μν, Ȧμ, Ṗ) is defined. A detailed analysis of the local bosonic symmetry of the Lagrangian with the commuting spinor ψα is carried out, and the connection of this symmetry with the local SUSY transformation of the Lagrangian containing anticommuting pseudovector and pseudoscalar variables is considered. An approach to obtaining a supersymmetric Lagrangian in terms of the even spinor ψα and the odd spinor θα is offered.

  15. Usage-Oriented Topic Maps Building Approach

    NASA Astrophysics Data System (ADS)

    Ellouze, Nebrasse; Lammari, Nadira; Métais, Elisabeth; Ben Ahmed, Mohamed

    In this paper, we present a collaborative and incremental approach to constructing multilingual Topic Maps, based on enrichment and merging techniques. In recent years, several Topic Map building approaches with different characteristics have been proposed. Generally, they are dedicated to particular data types such as text, semi-structured data, or relational data. We note also that most of these approaches take monolingual documents as input to build the Topic Map. The problem is that the large majority of resources available today are written in various languages, and these resources can be relevant even to non-native speakers. Our work is therefore directed towards a collaborative and incremental method for Topic Map construction from textual documents available in different languages. To enrich the Topic Map, we take a domain thesaurus as input, and we also propose to explore Topic Map usage, that is, the potential questions related to the source documents.

  16. Variational cluster approach for strongly correlated lattice bosons in the superfluid phase

    SciTech Connect

    Knap, Michael; Arrigoni, Enrico; Linden, Wolfgang von der

    2011-04-01

    We extend the variational cluster approach to deal with strongly correlated lattice bosons in the superfluid phase. To this end, we reformulate the approach within a pseudoparticle formalism, whereby cluster excitations are described by particlelike excitations. The approximation amounts to solving a multicomponent noninteracting bosonic system by means of a multimode Bogoliubov approximation. A source-and-drain term is introduced in order to break U(1) symmetry at the cluster level. We provide an expression for the grand potential, the single-particle normal and anomalous Green's functions, the condensate density, and other static quantities. As a first nontrivial application of the method we choose the two-dimensional Bose-Hubbard model and evaluate results in both the Mott and the superfluid phases. Our results show an excellent agreement with quantum Monte Carlo calculations.

  17. A Tangible Approach to Concept Mapping

    NASA Astrophysics Data System (ADS)

    Tanenbaum, Karen; Antle, Alissa N.

    2009-05-01

    The Tangible Concept Mapping project investigates using a tangible user interface to engage learners in concept map creation. This paper describes a prototype implementation of the system, presents some preliminary analysis of its ease of use and effectiveness, and discusses how elements of tangible interaction support concept mapping by helping users organize and structure their knowledge about a domain. The role of physical engagement and embodiment in supporting the mental activity of creating the concept map is explored as one of the benefits of a tangible approach to learning.

  18. Reprint of: Scattering theory approach to bosonization of non-equilibrium mesoscopic systems

    NASA Astrophysics Data System (ADS)

    Sukhorukov, Eugene V.

    2016-08-01

    Among the many prominent contributions of Markus Büttiker to mesoscopic physics, the scattering theory approach to electron transport and noise stands out for its elegance, simplicity, universality, and popularity among theorists working in this field. It offers an efficient way to theoretically investigate open electron systems far from equilibrium. However, this method is limited to situations where interactions between electrons can be ignored or treated perturbatively. Fortunately, this is the case in a broad class of metallic systems, which are commonly described by Fermi liquid theory. Yet there exists another broad class of electron systems of reduced dimensionality, the so-called Tomonaga-Luttinger liquids, where interactions are effectively strong and cannot be neglected even at low energies. Nevertheless, strong interactions can be accounted for exactly using the bosonization technique, which exploits the free-bosonic character of collective excitations in these systems. In the present work, we use this fact to develop a scattering theory approach to the bosonization of open quasi-one-dimensional electron systems far from equilibrium.

  19. The Higgs boson masses and mixings of the complex MSSM in the Feynman-diagrammatic approach

    NASA Astrophysics Data System (ADS)

    Frank, Meikel; Hahn, Thomas; Heinemeyer, Sven; Hollik, Wolfgang; Rzehak, Heidi; Weiglein, Georg

    2007-02-01

    New results for the complete one-loop contributions to the masses and mixing effects in the Higgs sector are obtained for the MSSM with complex parameters using the Feynman-diagrammatic approach. The full dependence on all relevant complex phases is taken into account, and all the imaginary parts appearing in the calculation are treated in a consistent way. The renormalization is discussed in detail, and a hybrid on-shell/DR-bar scheme is adopted. We also derive the wave function normalization factors needed in processes with external Higgs bosons and discuss effective couplings incorporating leading higher-order effects. The complete one-loop corrections, supplemented by the available two-loop corrections in the Feynman-diagrammatic approach for the MSSM with real parameters and a resummation of the leading (s)bottom corrections for complex parameters, are implemented in the public Fortran code FeynHiggs 2.5. In our numerical analysis the full results for the Higgs-boson masses and couplings are compared with various approximations, and CP-violating effects in the mixing of the heavy Higgs bosons are analyzed in detail. We find sizable deviations in comparison with the approximations often made in the literature.

  20. A new approach to shortest paths on networks based on the quantum bosonic mechanism

    NASA Astrophysics Data System (ADS)

    Jiang, Xin; Wang, Hailong; Tang, Shaoting; Ma, Lili; Zhang, Zhanli; Zheng, Zhiming

    2011-01-01

    This paper presents quantum bosonic shortest path searching (QBSPS), a natural, practical, and highly heuristic physical algorithm for recognizing network structure via quantum dynamics. QBSPS is based on an Anderson-like itinerant bosonic system in which a boson's Green function is used as a navigation pointer that steers a walker accurately toward the target. QBSPS is supported by rigorous mathematical and physical arguments and by extensive simulations, which show how it can be used as a greedy routing scheme to find the shortest path between different locations. Methodologically, it is an interesting new algorithm rooted in quantum mechanics rather than combinatorics. In practice, for the all-pairs shortest-path problem in a random scale-free network with N vertices, QBSPS runs in O(μ(N) ln ln N) time. As for applications, we suggest that experimental realizations are feasible in quantum optical communication networks; in this setting, the method performs a purely local search, without requiring the global structure that current graph algorithms need.
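
The navigation idea can be illustrated with a small toy sketch (ours, not the authors' algorithm): on a graph, the resolvent (E·I − A)⁻¹ of the adjacency matrix, a discrete analogue of a lattice Green function, decays with graph distance when E lies above the spectrum, so greedily hopping to the neighbor with the largest Green-function entry toward the target tends to trace a shortest path.

```python
import numpy as np

def greedy_green_route(adj, source, target, energy=None):
    """Greedy routing steered by a graph Green function (resolvent).

    adj: symmetric 0/1 adjacency matrix of a connected graph.
    With E above the spectral radius, G = (E*I - A)^(-1) decays with
    graph distance, so the neighbor with the largest G[n, target]
    points along a shortest path (heuristically).
    """
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    if energy is None:
        energy = np.abs(np.linalg.eigvalsh(A)).max() + 1.0  # stay off the spectrum
    G = np.linalg.inv(energy * np.eye(n) - A)
    path, current, visited = [source], source, {source}
    while current != target:
        neighbors = [v for v in range(n) if A[current, v] and v not in visited]
        if not neighbors:
            return None  # greedy walk got stuck
        current = max(neighbors, key=lambda v: G[v, target])
        path.append(current)
        visited.add(current)
    return path
```

On a 5-cycle, routing from node 0 to node 4 correctly takes the one-hop side rather than the three-hop side.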

  1. Non-equilibrium slave bosons approach to quantum pumping in interacting quantum dots

    NASA Astrophysics Data System (ADS)

    Citro, Roberta; Romeo, Francesco

    2016-03-01

    We review a time-dependent slave-boson approach within the non-equilibrium Green's function technique to analyze charge and spin pumping in a strongly interacting quantum dot. We study the pumped current as a function of the pumping phase and of the dot energy level and show that a parasitic current arises, beyond the pure pumping one, as an effect of the dynamical constraints. We finally illustrate an all-electrical means of spin pumping and discuss its relevance for spintronics applications.

  2. Double occupancy in dynamical mean-field theory and the dual boson approach

    NASA Astrophysics Data System (ADS)

    van Loon, Erik G. C. P.; Krien, Friedrich; Hafermann, Hartmut; Stepanov, Evgeny A.; Lichtenstein, Alexander I.; Katsnelson, Mikhail I.

    2016-04-01

    We discuss the calculation of the double occupancy using dynamical mean-field theory in finite dimensions. The double occupancy can be determined from the susceptibility of the auxiliary impurity model or from the lattice susceptibility. The former method typically overestimates, whereas the latter underestimates the double occupancy. We illustrate this for the square-lattice Hubbard model. We propose an approach for which both methods lead to identical results by construction and which resolves this ambiguity. This self-consistent dual boson scheme results in a double occupancy that is numerically close to benchmarks available in the literature.

  3. A Statistical Approach for Ambiguous Sequence Mappings

    Technology Transfer Automated Retrieval System (TEKTRAN)

    When attempting to map RNA sequences to a reference genome, high percentages of short sequence reads are often assigned to multiple genomic locations. One approach to handling these “ambiguous mappings” has been to discard them. This results in a loss of data, which can sometimes be as much as 45% o...
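
One common alternative to discarding ambiguous mappings is an EM-style rescue, sketched here under our own assumptions (the abstract does not specify the statistical model): multi-mapped reads are fractionally assigned in proportion to locus abundances, initialized from uniquely mapped reads, and the abundances are then re-estimated until convergence.

```python
def em_rescue(unique, ambiguous, iters=50):
    """Fractionally assign multi-mapped reads via a simple EM loop.

    unique: dict locus -> count of uniquely mapped reads (assumed input).
    ambiguous: list of candidate-locus lists, one list per multi-mapped read.
    Returns estimated total read counts per locus.
    """
    abund = {k: v + 1.0 for k, v in unique.items()}  # initialize from unique counts
    for _ in range(iters):
        assigned = {k: float(v) for k, v in unique.items()}
        for loci in ambiguous:
            total = sum(abund[l] for l in loci)
            for l in loci:
                assigned[l] += abund[l] / total  # E-step: split the read by abundance
        abund = assigned                          # M-step: updated abundance estimates
    return abund
```

For example, a read hitting loci A and B is credited mostly to A when A has far more unique support, instead of being thrown away.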

  4. An automated approach to flood mapping

    NASA Astrophysics Data System (ADS)

    Sun, Weihua; Mckeown, Donald M.; Messinger, David W.

    2012-10-01

    Heavy rain from Tropical Storm Lee resulted in a major flood event for the southern tier of New York State in early September 2011, causing the evacuation of approximately 20,000 people in and around the city of Binghamton. In support of the New York State Office of Emergency Management, a high-resolution multispectral airborne sensor (WASP) developed by RIT was deployed over the flooded area to collect aerial images. One key benefit of these images is that they support flood inundation mapping. However, they require a significant amount of storage space, and inundation mapping is conventionally carried out by manual digitization. In this paper, we design an automated approach to flood inundation mapping from WASP airborne images. The method applies the Spectral Angle Mapper (SAM) to color RGB or multispectral aerial images to extract a binary flood map, then uses a set of morphological operations and a boundary vectorization technique to convert the binary map into a shapefile. The technique is relatively fast and only requires the operator to select one pixel on the image. The generated shapefile is much smaller than the original image and can be imported into most GIS software packages. This enables critical flood information to be shared with and by disaster response managers very rapidly, even over cellular phone networks.
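
A minimal sketch of the first two stages of such a pipeline (SAM classification against a single operator-selected reference spectrum, followed by morphological cleanup); the threshold and spectra below are illustrative choices of ours, not the paper's values.

```python
import numpy as np
from scipy import ndimage

def spectral_angle(img, ref):
    """Per-pixel spectral angle (radians) between img and a reference spectrum.

    img: (H, W, bands) array; ref: (bands,) spectrum of the selected water pixel.
    """
    dot = np.tensordot(img, ref, axes=([2], [0]))
    norm = np.linalg.norm(img, axis=2) * np.linalg.norm(ref)
    return np.arccos(np.clip(dot / (norm + 1e-12), -1.0, 1.0))

def flood_mask(img, ref, angle_thresh=0.1):
    """Binary flood map: small spectral angle to water, then cleanup."""
    mask = spectral_angle(img, ref) < angle_thresh
    mask = ndimage.binary_opening(mask)   # remove isolated speckle
    mask = ndimage.binary_closing(mask)   # fill small holes
    return mask
```

The resulting binary mask would then be vectorized (e.g., by tracing region boundaries) into shapefile polygons, which is the step that shrinks the data for distribution.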

  5. Microscopic calculation of interacting boson model parameters by potential-energy surface mapping

    SciTech Connect

    Bentley, I.; Frauendorf, S.

    2011-06-15

    A coherent state technique is used to generate an interacting boson model (IBM) Hamiltonian energy surface, which is adjusted to match a mean-field energy surface. This technique allows the calculation of IBM Hamiltonian parameters, the prediction of properties of low-lying collective states, and the generation of probability distributions of various shapes in the ground states of transitional nuclei, the last two of which are of astrophysical interest. The results for krypton, molybdenum, palladium, cadmium, gadolinium, dysprosium, and erbium nuclei are compared with experiment.

  6. Analytical approach to a bosonic ladder subject to a magnetic field

    NASA Astrophysics Data System (ADS)

    Uchino, Shun

    2016-05-01

    We examine a bosonic two-leg ladder model subject to a magnetic flux and especially focus on a regime where the lower-energy band has two minima. By using a low-energy field theory approach, we study several issues discussed in the system: the existence of local patterns in density and current, chiral-current reversal, and the effect of a nearest-neighbor interaction along the rung direction. In our formalism, the local patterns are interpreted as a result of breaking of discrete symmetry. The chiral-current reversal occurs through a competition between a current component determined at a commensurate vortex density causing an enlargement of the unit cell and another component, which is proportional to the magnetic-field doping from the corresponding commensurate flux. The nearest-neighbor interaction along the rung direction available with the technique on a synthetic dimension is shown to favor a population-imbalance solution in an experimentally relevant regime.

  7. Constructing Linkage Disequilibrium Map with Iterative Approach

    NASA Astrophysics Data System (ADS)

    Ao, S. I.

    2008-05-01

    With recent advances in large-scale, high-density genotyping of single nucleotide polymorphisms (SNPs) in candidate regions of the human genome, linkage disequilibrium (LD) analysis can offer much higher resolution of biological samples than traditional linkage maps. We formulate this LD mapping problem as a constrained unidimensional scaling problem. Our method, which is based directly on the measurement of LD among SNPs, is non-parametric, and therefore differs from LD maps derived from the Malecot model. We solve the constrained unidimensional scaling problem with a quadratic programming approach. Unlike the classical metric unidimensional scaling problem, the constrained problem is not an NP-hard combinatorial problem, and the optimal solution can be determined with a quadratic programming solver. However, because the solver's large memory requirements can cause out-of-memory failures and its running time is high, an iterative algorithm has been developed for solving this LD constrained unidimensional scaling problem.
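
A gap parametrization makes the constraint structure concrete: with loci in a fixed map order, positions are cumulative sums of nonnegative gaps, and the least-squares fit of all pairwise distances reduces to a nonnegative least-squares problem. This is our own sketch of one constrained unidimensional scaling step, not the paper's exact program.

```python
import numpy as np
from scipy.optimize import nnls

def ld_positions(D):
    """Constrained unidimensional scaling via nonnegative least squares.

    D: symmetric matrix of target pairwise distances (e.g., LD-derived)
    for loci in fixed map order. Positions are x_0 = 0 plus cumulative
    nonnegative gaps g, with x_j - x_i = sum(g[i:j]); we fit all pairs.
    """
    n = len(D)
    rows, targets = [], []
    for i in range(n):
        for j in range(i + 1, n):
            row = np.zeros(n - 1)
            row[i:j] = 1.0            # x_j - x_i = sum of gaps i..j-1
            rows.append(row)
            targets.append(D[i][j])
    gaps, _ = nnls(np.array(rows), np.array(targets))  # least squares, g >= 0
    return np.concatenate([[0.0], np.cumsum(gaps)])
```

Because the design matrix has one row per locus pair, memory grows quadratically in the number of SNPs, which is exactly the pressure that motivates the iterative variant described in the abstract.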

  8. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

    Recent proposals of boson sampling and the corresponding experiments point toward a possible disproof of the extended Church-Turing Thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. So far, however, only small-scale experiments with a few photons have been performed. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A recent theoretical proposal for large-scale boson sampling using microwave photons is highly promising because of its deterministic photon sources and scalability. A certification protocol for large-scale boson sampling experiments is therefore needed to complete this exciting story. In this presentation we propose a computational protocol for the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional, with its Fourier components, can reveal the fingerprint of large-scale boson sampling. This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication], and a Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.
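
The hardness behind boson sampling comes from the standard result that output probabilities are proportional to squared absolute values of matrix permanents, which any brute-force check of a sampler must confront. As a minimal illustration (not part of the certification protocol above), here is the permanent via Ryser's inclusion-exclusion formula, which is exponential-time but far faster than naive expansion:

```python
from itertools import combinations
import numpy as np

def permanent(M):
    """Matrix permanent via Ryser's formula:
    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * prod_i sum_{j in S} a_ij.
    """
    n = M.shape[0]
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            # product over rows of the row sums restricted to columns S
            prod = np.prod(M[:, list(cols)].sum(axis=1))
            total += (-1) ** k * prod
    return (-1) ** n * total
```

Even this O(2^n)-subset formula becomes infeasible around 20-30 modes, which is precisely why indirect fingerprints such as mode-pair correlations are attractive for certification.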

  9. Hydrochromic Approaches to Mapping Human Sweat Pores.

    PubMed

    Park, Dong-Hoon; Park, Bum Jun; Kim, Jong-Man

    2016-06-21

    colorimetric change near body temperature. This feature enables the use of this technique to generate high-quality images of sweat pores. This Account also focuses on the results of the most recent phase of this investigation, which led to the development of a simple yet efficient and reliable technique for sweat pore mapping. The method utilizes a hydrophilic polymer composite film containing fluorescein, a commercially available dye that undergoes a fluorometric response as a result of water-dependent interconversion between its ring-closed spirolactone (nonfluorescent) and ring-opened fluorone (fluorescent) forms. Surface-modified carbon nanodots (CDs) have also been found to be efficient for hydrochromic mapping of human sweat pores. The results discovered by Lou et al. [Adv. Mater. 2015, 27, 1389] are also included in this Account. Sweat pore maps obtained from fingertips using these materials were found to be useful for fingerprint analysis. In addition, this hydrochromism-based approach is sufficiently sensitive to enable differentiation between sweat-secreting active pores and inactive pores. As a result, the techniques can be applied to clinical diagnosis of malfunctioning sweat pores. The directions that future research in this area will follow are also discussed. PMID:27159417

  10. Self-consistent dual boson approach to single-particle and collective excitations in correlated systems

    NASA Astrophysics Data System (ADS)

    Stepanov, E. A.; van Loon, E. G. C. P.; Katanin, A. A.; Lichtenstein, A. I.; Katsnelson, M. I.; Rubtsov, A. N.

    2016-01-01

    We propose an efficient dual boson scheme, which extends the dynamical mean-field theory paradigm to collective excitations in correlated systems. The theory is fully self-consistent both on the one- and on the two-particle level, thus describing the formation of collective modes as well as the renormalization of electronic and bosonic spectra on equal footing. The method employs an effective impurity model comprising both fermionic and bosonic hybridization functions. Only single- and two-electron Green's functions of the reference problem enter the theory, due to the optimal choice of the self-consistency condition for the effective bosonic bath. We show that the theory is naturally described by a dual Luttinger-Ward functional and obeys the relevant conservation laws.

  11. Learning topological maps: An alternative approach

    SciTech Connect

    Buecken, A.; Thrun, S.

    1996-12-31

    Our goal is autonomous real-time control of a mobile robot. In this paper we show how topological maps of a large-scale indoor environment can be learned autonomously. The literature offers two paradigms for storing information about a robot's environment: grid-based (geometric) maps and topological maps. While grid-based maps are comparatively easy to learn and maintain, topological maps are compact and facilitate fast motion planning.
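
As a toy illustration of the grid-to-topological idea (our own construction, not the paper's algorithm): erode free space so that narrow passages disappear, label the surviving regions as topological nodes, and connect two regions wherever their territories touch through free space.

```python
import numpy as np
from scipy import ndimage

def topological_graph(grid, erode_iters=1):
    """Extract a coarse topological graph from an occupancy grid.

    grid: 2D bool array, True = free space.
    Returns (number of regions, set of undirected region-adjacency edges).
    """
    core = ndimage.binary_erosion(grid, iterations=erode_iters)  # corridors vanish
    labels, n = ndimage.label(core)                              # regions = nodes
    # assign every free cell to its nearest region via a distance transform
    idx = ndimage.distance_transform_edt(labels == 0, return_indices=True)[1]
    nearest = np.where(grid, labels[tuple(idx)], 0)
    edges = set()
    for axis in range(2):  # adjacent cells with different region labels => edge
        a = np.take(nearest, range(nearest.shape[axis] - 1), axis=axis)
        b = np.take(nearest, range(1, nearest.shape[axis]), axis=axis)
        pair = (a > 0) & (b > 0) & (a != b)
        for u, v in zip(a[pair].ravel(), b[pair].ravel()):
            edges.add((int(min(u, v)), int(max(u, v))))
    return n, edges
```

On a map of two rooms joined by a one-cell-wide corridor, this yields two nodes linked by a single edge, which is the compact structure that makes topological motion planning fast.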

  12. One-loop perturbative unitarity and the Higgs-boson mass: A new approach

    NASA Astrophysics Data System (ADS)

    Durand, Loyal; Johnson, James M.; Lopez, Jorge L.

    1990-03-01

    We reexamine the unitarity constraints on the high-energy scattering of longitudinal W's and Z's and Higgs bosons in the standard model including one-loop corrections, and make an Argand-diagram analysis of the j=0 scattering amplitudes. We find that the theory is approximately unitary and weakly interacting at O(λ²) for Higgs-boson couplings λ < λ_c = 1.5-2 (equivalent to M_H < 350-400 GeV), but that O(λ³) or higher corrections must be included to restore perturbative unitarity for larger values of λ or M_H.

  13. Boson core compressibility

    NASA Astrophysics Data System (ADS)

    Khorramzadeh, Y.; Lin, Fei; Scarola, V. W.

    2012-04-01

    Strongly interacting atoms trapped in optical lattices can be used to explore phase diagrams of Hubbard models. Spatial inhomogeneity due to trapping typically obscures distinguishing observables. We propose that measures using boson double occupancy avoid trapping effects to reveal two key correlation functions. We define a boson core compressibility and core superfluid stiffness in terms of double occupancy. We use quantum Monte Carlo on the Bose-Hubbard model to empirically show that these quantities intrinsically eliminate edge effects to reveal correlations near the trap center. The boson core compressibility offers a generally applicable tool that can be used to experimentally map out phase transitions between compressible and incompressible states.

  14. Julia-Toulouse approach to (d+1)-dimensional bosonized Schwinger model with an application to large N QCD

    NASA Astrophysics Data System (ADS)

    Guimaraes, M. S.; Rougemont, R.; Wotzasek, C.; Zarro, C. A. D.

    2012-12-01

    The Julia-Toulouse approach for condensation of charges and defects is used to show that the bosonized Schwinger model can be obtained through a condensation of electric charges in 1+1 dimensions. The massive model is derived by taking into account the presence of vortices over the electric condensate, while the massless model is obtained when these vortices are absent. This construction is then straightforwardly generalized for arbitrary d+1 spacetime dimensions. The d=3 case corresponds to the large N chiral dynamics of SU(N) QCD in the limit N→∞.

  15. Quantitative Genetic Interaction Mapping Using the E-MAP Approach

    PubMed Central

    Collins, Sean R.; Roguev, Assen; Krogan, Nevan J.

    2010-01-01

    Genetic interactions represent the degree to which the presence of one mutation modulates the phenotype of a second mutation. In recent years, approaches for measuring genetic interactions systematically and quantitatively have proven to be effective tools for unbiased characterization of gene function and have provided valuable data for analyses of evolution. Here, we present protocols for systematic measurement of genetic interactions with respect to organismal growth rate for two yeast species. PMID:20946812

  16. A Canonical Ensemble Approach to the Fermion/Boson Random Point Processes and Its Applications

    NASA Astrophysics Data System (ADS)

    Tamura, H.; Ito, K. R.

    2006-04-01

    We introduce the boson and fermion point processes from an elementary quantum mechanical point of view. That is, we consider the quantum statistical mechanics of the canonical ensemble for a fixed number of particles which obey Bose-Einstein or Fermi-Dirac statistics, respectively, in a finite volume. Focusing on the distribution of the positions of the particles, we obtain point processes with a fixed number of points in a bounded domain. By taking the thermodynamic limit such that the particle density converges to a finite value, the boson/fermion processes are obtained. This argument is a realization of the equivalence of ensembles, since the resulting processes are considered to describe a grand canonical ensemble of points. Random point processes corresponding to para-particles of order two are discussed as an application of the formulation. Statistics of a system of composite particles at zero temperature are also considered as a model of determinantal random point processes.

  17. Energy fluctuation of a finite number of interacting bosons: A correlated many-body approach

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Satadal; Lekala, M. L.; Chakrabarti, Barnali; Rampho, G. J.

    2016-03-01

    We calculate the energy fluctuation of a truly finite number of interacting bosons and study the role of the interaction. Although the ideal Bose gas in the thermodynamic limit is an exactly solvable problem and analytic expressions for various fluctuation measures exist, experimental Bose-Einstein condensation (BEC) is a nontrivial many-body problem. We employ a two-body correlated basis function and utilize the realistic van der Waals interaction. We calculate the energy fluctuation ΔE² of the interacting trapped bosons and plot ΔE²/(k_B²T²) as a function of T/T_c. In the classical limit ΔE² is related to the specific heat per particle c_v through the relation ΔE² = k_B T² c_v. We obtain a distinct hump in ΔE²/(k_B²T²) around the condensation point for a three-dimensional harmonically trapped Bose gas when the particle number is N ≃ 5000 and above, which corresponds to a second-order phase transition. However, for finite-size interacting bosons (N ≃ a few hundred) the hump is not sharp, and the maximum in ΔE²/(k_B²T²) can be interpreted as a smooth increase in the scaled fluctuation below T_c followed by a decrease above T_c. To support this interpretation we also calculate c_v, which exhibits the same feature, leading to the conjecture that for finite-sized interacting bosons the phase transition is ruled out.
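    The relation ΔE² = k_B T² c_v quoted above is an exact identity of the canonical ensemble and can be checked numerically for any discrete spectrum. The following is a minimal sketch, not the paper's correlated many-body method, using a truncated harmonic-oscillator spectrum in units where k_B = ħω = 1:

```python
import numpy as np

def canonical_stats(levels, T, kB=1.0):
    """Mean energy and energy fluctuation <E^2> - <E>^2 in the canonical ensemble."""
    beta = 1.0 / (kB * T)
    w = np.exp(-beta * (levels - levels.min()))  # shift for numerical stability
    Z = w.sum()
    E = (levels * w).sum() / Z
    E2 = (levels**2 * w).sum() / Z
    return E, E2 - E**2

# Truncated harmonic-oscillator spectrum E_n = n (hbar*omega = 1)
levels = np.arange(200, dtype=float)
T = 5.0
E, dE2 = canonical_stats(levels, T)

# Heat capacity c_v = d<E>/dT from a central finite difference
h = 1e-4
cv = (canonical_stats(levels, T + h)[0] - canonical_stats(levels, T - h)[0]) / (2 * h)

# Canonical-ensemble identity: dE2 equals kB * T**2 * cv
print(dE2, T**2 * cv)
```

    The two printed numbers agree to the accuracy of the finite-difference derivative, illustrating why a hump in the scaled fluctuation tracks a hump in c_v.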

  18. An integrative approach to genomic introgression mapping

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Near-Isogenic Lines (NILs) are valuable genetic resources for many crop species, including soybean. The development of new molecular platforms promises to accelerate the mapping of genetic introgressions in these materials. Here we compare some existing and emerging methodologies for genetic intro...

  19. FEASIBILITY AND APPROACH FOR MAPPING RADON POTENTIALS IN FLORIDA

    EPA Science Inventory

    The report gives results of an analysis of the feasibility and approach for developing statewide maps of radon potentials in Florida. The maps would provide a geographic basis for implementing new radon-protective building construction standards to reduce public health risks from ...

  20. ModMAP: A Systematic Approach to Individualized Teacher Education

    ERIC Educational Resources Information Center

    Kranyik, Robert D.; Kielty, Joseph W.

    1974-01-01

    A description of a competency-based, individualized graduate degree program, the Modular Multiple Alternative Program (ModMAP). The program focuses on the training of elementary teachers and offers an alternative approach to graduate studies. (Author)

  1. Recent developments in MAP - MODULAR APPROACH to PHYSICS

    NASA Astrophysics Data System (ADS)

    Rae, Jennifer; Austen, Dave; Brouwer, Wytze

    2002-05-01

    We present recent developments in MAP - MODULAR APPROACH to PHYSICS - JAVA enhanced modules to be used as aids in teaching the first 3 terms of university physics. The MAP project is very comprehensive and consists of a modular approach to physics that utilizes JAVA applets, FLASH animations and HTML based tutorials. The overall instructional philosophy of MAP is constructivist and the project emphasizes active learner participation. In this talk we will provide a quick overview of the project and the results of recent pilot testing at several Canadian universities. It will also include a discussion of the VIDEO LAB aspect of MAP. This is a component that is integrated into MAP and permits students to capture and evaluate otherwise difficult to study phenomena on video.

  2. Improving long time behavior of Poisson bracket mapping equation: A non-Hamiltonian approach

    SciTech Connect

    Kim, Hyun Woo; Rhee, Young Min

    2014-05-14

    Understanding nonadiabatic dynamics in complex systems is a challenging subject. A series of semiclassical approaches have been proposed to tackle the problem in various settings. The Poisson bracket mapping equation (PBME) utilizes a partial Wigner transform and a mapping representation for its formulation, and has been developed to describe nonadiabatic processes in an efficient manner. Operationally, it is expressed as a set of Hamilton's equations of motion, similar to more conventional classical molecular dynamics. However, this original Hamiltonian PBME sometimes suffers from a large deviation in accuracy, especially in the long time limit. Here, we propose a non-Hamiltonian variant of PBME to improve its behavior in that limit. As a benchmark, we simulate spin-boson and photosynthetic model systems and find that it consistently outperforms the original PBME and its Ehrenfest-style variant. We explain the source of this improvement by decomposing the components of the mapping Hamiltonian and by assessing the energy flow between the system and the bath. We discuss the strengths and weaknesses of our scheme with a view toward future prospects.

  3. Multilocal bosonization

    NASA Astrophysics Data System (ADS)

    Anguelova, Iana I.

    2015-12-01

    We present a bilocal isomorphism between the algebra generated by a single real twisted boson field and the algebra of the boson βγ ghost system. As a consequence of this twisted vertex algebra isomorphism, we show that each of these two algebras possesses both untwisted and twisted Heisenberg bosonic currents, as well as three separate families of Virasoro fields. We show that this bilocal isomorphism generalizes to an isomorphism between the algebra generated by the twisted boson field with 2n points of localization and the algebra of the 2n symplectic bosons.

  4. Combinatorial approach to generalized Bell and Stirling numbers and boson normal ordering problem

    SciTech Connect

    Mendez, M.A.; Blasiak, P.; Penson, K.A.

    2005-08-01

    We consider the numbers arising in the problem of normal ordering of expressions in the boson creation a† and annihilation a operators ([a, a†] = 1). We treat a general form of a boson string (a†)^{r_n} a^{s_n} ... (a†)^{r_2} a^{s_2} (a†)^{r_1} a^{s_1}, which is shown to be associated with generalizations of Stirling and Bell numbers. The recurrence relations and closed-form expressions (Dobinski-type formulas) are obtained for these quantities by both algebraic and combinatorial methods. By extensive use of methods of combinatorial analysis we prove the equivalence of the aforementioned problem to the enumeration of special families of graphs. This link provides a combinatorial interpretation of the numbers arising in this normal ordering problem.
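    For the simplest boson string, (a†a)^n, the numbers in question reduce to the ordinary Stirling numbers of the second kind via (a†a)^n = Σ_k S(n,k) (a†)^k a^k, with the Bell numbers as their row sums. A minimal sketch of the classical recurrence and the Dobinski formula, for illustration only (the paper treats the generalized case):

```python
from math import exp, factorial

def stirling2_row(n):
    """Row S(n, 0..n) of Stirling numbers of the second kind, built from the
    recurrence S(m, k) = k*S(m-1, k) + S(m-1, k-1)."""
    row = [1] + [0] * n          # row for m = 0: S(0, 0) = 1
    for m in range(1, n + 1):
        new = [0] * (n + 1)
        for k in range(1, m + 1):
            new[k] = k * row[k] + row[k - 1]
        row = new
    return row

def bell(n):
    """Bell number B(n) as the row sum of S(n, k)."""
    return sum(stirling2_row(n))

def bell_dobinski(n, terms=60):
    """Dobinski's formula: B(n) = e^{-1} * sum_{k>=0} k^n / k!."""
    return exp(-1) * sum(k**n / factorial(k) for k in range(terms))

print(bell(5))                  # 52
print(round(bell_dobinski(5)))  # 52
```

    The agreement of the recurrence and the Dobinski series is the one-mode special case of the closed-form expressions mentioned in the abstract.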

  5. Exact results in a slave boson saddle point approach for a strongly correlated electron model

    SciTech Connect

    Fresard, Raymond; Kopp, Thilo

    2008-08-15

    We revisit the Kotliar-Ruckenstein (KR) slave boson saddle point evaluation for a two-site correlated electron model. As the model can be solved analytically, it is possible to compare the KR saddle point results with the exact many-particle levels. The considered two-site cluster mimics an infinite-U single-impurity Anderson model with a nearest-neighbor Coulomb interaction: one site is strongly correlated with an infinite local Coulomb repulsion, and it hybridizes with the second site, on which the local Coulomb repulsion vanishes. Making use of the flexibility of the representation, we introduce appropriate weight factors in the KR saddle point scheme. The ground state and all excitation levels agree with the exact diagonalization results. Thermodynamics and correlation functions may be recovered in a suitably renormalized saddle point evaluation.

  6. Two-fluid behavior of the Kondo lattice in the 1/N slave boson approach

    NASA Astrophysics Data System (ADS)

    Barzykin, Victor

    2006-03-01

    It has been recently shown by Nakatsuji, Pines, and Fisk [S. Nakatsuji, D. Pines, and Z. Fisk, Phys. Rev. Lett. 92, 016401 (2004)] from a phenomenological analysis of experiments on Ce1-xLaxCoIn5 and CeIrIn5 that the thermodynamic and transport properties of Kondo lattices below the coherence temperature can be very successfully described in terms of a two-fluid model, with Kondo impurity and heavy-electron Fermi liquid contributions. We analyze the thermodynamic properties of Kondo lattices using a 1/N slave boson treatment of the periodic Anderson model and show that these two contributions indeed arise below the coherence temperature. We find that the Kondo impurity contribution to thermodynamics corresponds to thermal excitations into the flat portion of the energy spectrum.

  7. Look before you leap: a new approach to mapping QTL.

    PubMed

    Huang, B Emma; George, Andrew W

    2009-09-01

    In this paper, we present an innovative and powerful approach for mapping quantitative trait loci (QTL) in experimental populations. This deviates from the traditional approach of (composite) interval mapping which uses a QTL profile to simultaneously determine the number and location of QTL. Instead, we look before we leap by employing separate detection and localization stages. In the detection stage, we use an iterative variable selection process coupled with permutation to identify the number and synteny of QTL. In the localization stage, we position the detected QTL through a series of one-dimensional interval mapping scans. Results from a detailed simulation study and real analysis of wheat data are presented. We achieve impressive increases in the power of QTL detection compared to composite interval mapping. We also accurately estimate the size and position of QTL. An R library, DLMap, implements the methods described here and is freely available from CRAN ( http://cran.r-project.org/ ). PMID:19585099

  8. An Incremental Map Building Approach via Static Stixel Integration

    NASA Astrophysics Data System (ADS)

    Muffert, M.; Anzt, S.; Franke, U.

    2013-10-01

    This paper presents a stereo-vision-based incremental mapping approach for urban regions. As input, we use the 3D representation called the multi-layered Stixel World, which is computed from dense disparity images. More and more, researchers of driver assistance systems rely on efficient and compact 3D representations like the Stixel World. The developed mapping approach takes into account the motion state of obstacles, as well as free-space information obtained from the Stixel World. The presented work is based on the well-known occupancy grid mapping technique and is formulated with evidential theory. A detailed sensor model is described which is used to determine whether a grid cell is occupied, free or has an unknown state. The map update is solved in a time-recursive manner using Dempster's Rule of Combination. 3D results for complex inner-city regions are shown and compared with Google Earth images.
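    Dempster's Rule of Combination on the two-element frame {occupied, free} can be sketched as follows. The mass values and sensor readings below are hypothetical, for illustration only; they are not taken from the paper's sensor model:

```python
def dempster_combine(m1, m2):
    """Dempster's rule on the frame {O, F}, with the ignorance mass assigned
    to 'U' (the whole frame {O, F}). Each mass function is a dict with keys
    'O' (occupied), 'F' (free) and 'U' (unknown), summing to 1."""
    K = m1['O'] * m2['F'] + m1['F'] * m2['O']   # conflicting mass
    if K >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    s = 1.0 - K                                  # normalization
    return {
        'O': (m1['O'] * m2['O'] + m1['O'] * m2['U'] + m1['U'] * m2['O']) / s,
        'F': (m1['F'] * m2['F'] + m1['F'] * m2['U'] + m1['U'] * m2['F']) / s,
        'U': (m1['U'] * m2['U']) / s,
    }

# Time-recursive update of one grid cell with two measurements leaning 'occupied'
cell = {'O': 0.0, 'F': 0.0, 'U': 1.0}            # initially unknown
for meas in ({'O': 0.6, 'F': 0.1, 'U': 0.3},
             {'O': 0.5, 'F': 0.2, 'U': 0.3}):
    cell = dempster_combine(cell, meas)
print(cell)
```

    Each update folds one measurement into the running cell belief, which is the time-recursive scheme the abstract describes; the residual 'U' mass is what distinguishes "unknown" from "equally likely occupied or free".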

  9. A Nonparametric Approach for Mapping Quantitative Trait Loci

    PubMed Central

    Kruglyak, L.; Lander, E. S.

    1995-01-01

    Genetic mapping of quantitative trait loci (QTLs) is performed typically by using a parametric approach, based on the assumption that the phenotype follows a normal distribution. Many traits of interest, however, are not normally distributed. In this paper, we present a nonparametric approach to QTL mapping applicable to any phenotypic distribution. The method is based on a statistic Z(w), which generalizes the nonparametric Wilcoxon rank-sum test to the situation of whole-genome search by interval mapping. We determine the appropriate significance level for the statistic Z(w), by showing that its asymptotic null distribution follows an Ornstein-Uhlenbeck process. These results provide a robust, distribution-free method for mapping QTLs. PMID:7768449
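    The Wilcoxon rank-sum statistic that Z(w) generalizes can be computed directly. A minimal sketch of its normal-approximation z-score, assuming no ties and using hypothetical marker-genotype groups (this is the classical single-locus statistic, not the whole-genome Z(w) process itself):

```python
import numpy as np

def wilcoxon_z(group_a, group_b):
    """Normal-approximation z-score of the Wilcoxon rank-sum statistic.
    group_a / group_b: phenotype values of the two marker-genotype classes."""
    a = np.asarray(group_a, dtype=float)
    b = np.asarray(group_b, dtype=float)
    n, m = len(a), len(b)
    # Double argsort yields 0-based ranks for distinct values; +1 makes them 1-based
    ranks = np.argsort(np.argsort(np.concatenate([a, b]))) + 1
    W = ranks[:n].sum()                          # rank sum of group A
    mu = n * (n + m + 1) / 2.0                   # null mean of W
    sigma = np.sqrt(n * m * (n + m + 1) / 12.0)  # null standard deviation of W
    return (W - mu) / sigma

# Hypothetical phenotypes: group B shifted upward relative to group A
print(wilcoxon_z([1.2, 0.8, 1.1, 0.9], [1.9, 2.1, 1.7, 2.3]))
```

    Because only ranks enter the statistic, the test is valid for any phenotypic distribution, which is the point of the nonparametric approach.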

  10. Concept maps and nursing theory: a pedagogical approach.

    PubMed

    Hunter Revell, Susan M

    2012-01-01

    Faculty seek to teach nursing students how to link clinical and theoretical knowledge with the intent of improving patient outcomes. The author discusses an innovative 9-week concept mapping activity as a pedagogical approach to teach nursing theory in a graduate theory course. Weekly concept map building increased student engagement and fostered theoretical thinking. Unexpectedly, this activity also benefited students through group work and its ability to enhance theory-practice knowledge. PMID:22513774

  11. Tank Update System: A novel asset mapping approach for verifying and updating lakes using Google Maps

    NASA Astrophysics Data System (ADS)

    Reddy Pulsani, Bhaskar

    2016-06-01

    Mission Kakatiya is one of the prestigious programs of the Telangana state government, under which the restoration of tanks across ten districts is being implemented. As part of the program, the government plans to restore about 9,000 lakes. Therefore, to compile a comprehensive list of the lakes existing in Telangana state, the Samagra Tank Survey was carried out. The data collected in this survey covered about 45,000 tanks. Since the data were collected in a nonstandard format using Excel, a web interface was created to fill the gaps and standardise the data. A new approach for spatially identifying the lakes through Google Maps was successfully implemented by developing a web interface. This approach is uncommon in that it applies asset mapping to the lakes of Telangana state, and it demonstrates the advantages of using online mapping applications such as Google Maps for identifying and cross-checking lakes that already exist on it.

  12. Experimental scattershot boson sampling

    PubMed Central

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J.; Galvão, Ernesto F.; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-01-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in the speed of the quantum device by using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of quantum computational supremacy. PMID:26601164

  13. Quasiparticle random-phase approximation and {beta}-decay physics: Higher-order approximations in a boson formalism

    SciTech Connect

    Sambataro, M.; Suhonen, J.

    1997-08-01

    The quasiparticle random-phase approximation (QRPA) is reviewed and higher-order approximations are discussed with reference to β-decay physics. The approach is fully developed in a boson formalism. Working within a schematic model, we first illustrate a fermion-boson mapping procedure and apply it to construct boson images of the fermion Hamiltonian at different levels of approximation. The quality of these images is tested through a comparison between approximate and exact spectra. Standard QRPA equations are derived in correspondence with the quasi-boson limit of the first-order boson Hamiltonian. The use of higher-order Hamiltonians is seen to improve considerably the stability of the approximate solutions. The mapping procedure is also applied to Fermi β operators: exact and approximate transition amplitudes are discussed together with the Ikeda sum rule. The range of applicability of the QRPA formalism is analyzed. © 1997 The American Physical Society

  14. Dark matter coupling to electroweak gauge and Higgs bosons: An effective field theory approach

    NASA Astrophysics Data System (ADS)

    Chen, Jing-Yuan; Kolb, Edward W.; Wang, Lian-Tao

    2013-12-01

    If dark matter is a new species of particle produced in the early universe as a cold thermal relic (a weakly interacting massive particle, or WIMP), its present abundance, its scattering with matter in direct-detection experiments, its present-day annihilation signature in indirect-detection experiments, and its production and detection at colliders depend crucially on the WIMP coupling to standard-model (SM) particles. It is usually assumed that the WIMP couples to the SM sector through its interactions with quarks and leptons. In this paper we explore the possibility that the WIMP couples to the SM sector via electroweak gauge and Higgs bosons. In the absence of an ultraviolet-complete particle-physics model, we employ effective field theory to describe the WIMP-SM coupling. We consider both scalars and Dirac fermions as possible dark-matter candidates. Starting with an exhaustive list of operators up to dimension 8, we present detailed calculations of dark-matter annihilation to all possible final states, including γγ, γZ, γh, ZZ, Zh, W+W-, hh, and ff̄, and demonstrate the correlations among them. We compute the mass scale of the effective field theory necessary to obtain the correct dark-matter mass density, as well as the resulting photon line signals.

  15. Dynamics of hadronic molecule in one-boson exchange approach and possible heavy flavor molecules

    SciTech Connect

    Ding Guijun; Liu Jiafeng; Yan Mulin

    2009-03-01

    We extend the one-pion-exchange model at the quark level to include the short-distance contributions coming from η, σ, ρ and ω exchange. This formalism is applied to discuss the possible molecular states of DD*/DD*, BB*/BB*, DD*, BB*, and the pseudoscalar-vector systems with C=B=1 and C=-B=1, respectively. The "δ function" term contribution and the S-D mixing effects have been taken into account. We find that the conclusions reached after including the heavier meson exchanges are qualitatively the same as those in the one-pion-exchange model. The previous suggestion that a 1⁺⁺ BB*/BB* molecule should exist is confirmed in the one-boson-exchange model, whereas a DD* bound state should not exist. The DD*/DD* system can accommodate a 1⁺⁺ molecule close to the threshold; the mixing between the molecule and the conventional charmonium has to be considered to identify this state with X(3872). For the BB* system and the pseudoscalar-vector systems with C=B=1 and C=-B=1, near-threshold molecular states may exist. These bound states should be rather narrow; isospin is violated and the I=0 component is dominant. Experimental search channels for these states are suggested.

  16. Hyperspherical approach to a three-boson problem in two dimensions with a magnetic field

    NASA Astrophysics Data System (ADS)

    Rittenhouse, Seth T.; Wray, Andrew; Johnson, B. L.

    2016-01-01

    We examine a system of three bosons confined to two dimensions in the presence of a perpendicular magnetic field within the framework of the adiabatic hyperspherical method. For the case of zero-range, regularized pseudopotential interactions, we find that the system is nearly separable in hyperspherical coordinates and that, away from a set of narrow avoided crossings, the full energy eigenspectrum as a function of the two-dimensional (2D) s-wave scattering length is well described by ignoring coupling between adiabatic hyperradial potentials. In the case of weak attractive or repulsive interactions, we find that the lowest three-body energy states exhibit even-odd parity oscillations as a function of the total internal 2D angular momentum and that, for weak repulsive interactions, the universal lowest-energy interacting state has an internal angular momentum of M = 3. With the inclusion of repulsive higher-angular-momentum interactions, we surmise that a set of "magic number" states (states with anomalously low energy) might emerge as the result of a combination of even-odd parity oscillations and the pattern of degeneracy in the noninteracting lowest Landau level states.

  17. Technology Mapping: An Approach for Developing Technological Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    Technology mapping [TM] is proposed as an approach for developing technological pedagogical content knowledge (TPCK). The study discusses in detail instructional design guidelines in relation to the enactment of TM, and reports on empirical findings from a study with 72 pre-service primary teachers within the context of teaching them how to teach…

  18. Mapping between the classical and pseudoclassical models of a relativistic spinning particle in external bosonic and fermionic fields. II

    NASA Astrophysics Data System (ADS)

    Markov, Yu. A.; Markova, M. A.

    2016-06-01

    The exact solution of a system of bilinear identities derived in the first part of our work [1] for the case of a real Grassmann-odd tensor aggregate of the type (S, Vμ, *Tμν, Aμ, P) is obtained. The consistency of the solution with a corresponding system of bilinear identities including both the tensor variables and their derivatives (Ṡ, V̇μ, *Ṫμν, Ȧμ, Ṗ) is considered. An alternative approach to solving the algebraic system, based on introducing complex tensor quantities, is discussed. This solution is used in constructing the mapping of the interaction terms of a spinning particle with a background (Majorana) fermion field ΨMαi(x). A way of extending the obtained results to the case of the Dirac spinors (ψDα, θDα) and a background Dirac field ΨDαi(x) is suggested. It is shown that for the construction of a one-to-one correspondence between the most general spinors and the tensor variables, we need a four-fold increase in the number of tensor variables. A connection with higher-order derivative Lagrangians for a point particle, and in particular with the Lagrangian suggested by A.M. Polyakov, is proposed.

  19. One-Boson Approach to Dilepton Production in Nucleon-Nucleon Collisions.

    NASA Astrophysics Data System (ADS)

    Haglin, Kevin Lee

    1990-01-01

    We calculate energy-dependent nucleon-nucleon total elastic cross sections and invariant-mass-dependent electron-positron pair production differential cross sections for the processes pp → pp, np → np and pp → pp e⁺e⁻, pn → pn e⁺e⁻ at laboratory kinetic energies in the 1-5 GeV range. These calculations are based on relativistic quantum field theory in the one-boson-exchange (π, ρ, ω, σ, δ, η) approximation to the nucleon-nucleon scattering problem. There are several independent Feynman diagrams for each process--twenty-five for the case np → np e⁺e⁻ and forty-eight for the case pp → pp e⁺e⁻--which, for evaluation, require taking the trace of as many as ten gamma matrices and evaluating an angular integral of a quotient of polynomial functions of initial and final energies, particle masses, coupling constants and so on. These mathematical operations are carried out with the aid of the following algebraic manipulators: for the trace operations we use REDUCE 3.3 on the VAX at the ACS facility, and for testing the angular integration algorithms we use MAPLE on the Cray-2 at the Minnesota Supercomputer Institute. Finally, we use Cray-2 Fortran for the resulting numerical substitutions. Gauge invariance is strictly observed while including strong and electromagnetic form factors. The numerical results for these calculations are compared with existing data from the Particle Data Group Booklet and with recently released data from the Dilepton Spectrometer (DLS) at the Bevalac for protons on beryllium. For the latter comparison, the spectrometer's finite acceptance function is introduced before a rapidity and transverse momentum integration.

  20. An improved probability mapping approach to assess genome mosaicism

    PubMed Central

    Zhaxybayeva, Olga; Gogarten, J Peter

    2003-01-01

    Background Maximum likelihood and posterior probability mapping are useful visualization techniques that are used to ascertain the mosaic nature of prokaryotic genomes. However, posterior probabilities, especially when calculated for four-taxon cases, tend to overestimate the support for tree topologies. Furthermore, because of poor taxon sampling, four-taxon analyses suffer from sensitivity to the long branch attraction artifact. Here we extend the probability mapping approach by improving taxon sampling of the analyzed datasets, and by using bootstrap support values, a more conservative tool for assessing reliability. Results Quartets of orthologous proteins were complemented with homologs from selected reference genomes. The mapping of bootstrap support values from these extended datasets gives results similar to the original maximum likelihood and posterior probability mapping. The more conservative nature of the plotted support values allows further analyses to focus on those protein families that strongly disagree with the majority or plurality of genes present in the analyzed genomes. Conclusion Posterior probability is a non-conservative measure of support, and posterior probability mapping only provides a quick estimation of the phylogenetic information content of four genomes. This approach can be utilized as a pre-screen to select genes that might have been horizontally transferred. Better taxon sampling combined with subtree analyses prevents the inconsistencies associated with four-taxon analyses but retains the power of visual representation. Nevertheless, a case-by-case inspection of individual multi-taxon phylogenies remains necessary to differentiate unrecognized paralogy and shared phylogenetic reconstruction artifacts from horizontal gene transfer events. PMID:12974984

  1. An automated approach to mapping corn from Landsat imagery

    USGS Publications Warehouse

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process, resulting in a map identifying land cover as 'highly likely corn,' 'likely corn' or 'unlikely corn.' To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data. © 2003 Elsevier B.V. All rights reserved.

  2. Comparison of mapping approaches of design annual maximum daily precipitation

    NASA Astrophysics Data System (ADS)

    Szolgay, J.; Parajka, J.; Kohnová, S.; Hlavčová, K.

    2009-05-01

    In this study, 2-year and 100-year annual maximum daily precipitation were mapped for rainfall-runoff studies and flood hazard estimation. Daily precipitation measurements at 23 climate stations from 1961-2000 in the upper Hron basin in central Slovakia were used. The choice of data preprocessing and interpolation methods was guided by their practical applicability and acceptance in the engineering hydrologic community. The main objective was to discuss the quality and properties of maps of design precipitation with a given return period with respect to the expectations of the end user. Four approaches to the preprocessing of annual maximum 24-hour precipitation data were used, and three interpolation methods were employed. The first approach is the direct mapping of at-site estimates of distribution function quantiles; the second is the direct mapping of local estimates of the three parameters of the GEV distribution. In the third, the daily precipitation totals were interpolated onto a regular grid, and the time series of the maximum daily precipitation totals at each grid point of the selected region were then statistically analysed. In the fourth, the spatial distribution of the design precipitation was modeled by quantiles predicted by regional precipitation frequency analysis using the Hosking and Wallis procedure. The three interpolation methods used were inverse distance weighting, nearest neighbor, and kriging. Visual inspection and jackknife cross-validation were used to compare the combinations of approaches.
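    Of the three interpolation methods named, inverse distance weighting is the simplest to sketch. A minimal implementation follows; the station coordinates and design precipitation values are hypothetical, made up for illustration:

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse distance weighting: estimate values at query points from
    scattered station data (e.g., 100-year design precipitation)."""
    xy_known = np.asarray(xy_known, dtype=float)
    values = np.asarray(values, dtype=float)
    out = np.empty(len(xy_query))
    for i, q in enumerate(np.asarray(xy_query, dtype=float)):
        d = np.linalg.norm(xy_known - q, axis=1)
        if d.min() < eps:                 # query point coincides with a station
            out[i] = values[d.argmin()]
        else:
            w = 1.0 / d**power            # weight decays with distance
            out[i] = (w * values).sum() / w.sum()
    return out

# Hypothetical station coordinates (km) and 100-year design precipitation (mm)
stations = [(0, 0), (10, 0), (0, 10)]
p100 = [60.0, 80.0, 70.0]
print(idw(stations, p100, [(5, 5), (0, 0)]))
```

    The estimate is an exact interpolator (it reproduces station values at station locations), which is one reason jackknife cross-validation, leaving each station out in turn, is used to compare such methods.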

  3. Noise pollution mapping approach and accuracy on landscape scales.

    PubMed

    Iglesias Merchan, Carlos; Diaz-Balteiro, Luis

    2013-04-01

    Noise mapping allows the characterization of environmental variables, such as noise pollution or soundscape, depending on the task. Strategic noise mapping (as per Directive 2002/49/EC, 2002) is a tool intended for the assessment of noise pollution at the European level every five years. These maps are based on common methods and procedures intended for human exposure assessment in the European Union that could also be adapted for assessing environmental noise pollution in natural parks. However, given the size of such areas, there could be an alternative approach to soundscape characterization rather than using human noise exposure procedures. It is possible to optimize the size of the mapping grid used for such work by taking into account the attributes of the area to be studied and the desired outcome, thereby optimizing the mapping time and cost. This type of optimization is important in noise assessment as well as in the study of other environmental variables. This study compares 15 models, using different grid sizes, to assess the accuracy of road traffic noise mapping at a landscape scale, with respect to noise and landscape indicators. In a study area located in the Manzanares High River Basin Regional Park in Spain, different accuracy levels (Kappa index values from 0.725 to 0.987) were obtained depending on the terrain and noise source properties. The time taken for the calculations and the noise mapping accuracy results reveal the potential for setting the map resolution in line with decision-makers' criteria and budget considerations. PMID:23416205
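
The Kappa index cited above (Cohen's kappa) corrects observed map agreement for the agreement expected by chance; a minimal sketch from a confusion matrix (the example matrix is hypothetical):

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix (list of rows):
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = sum(sum(row) for row in confusion)
    po = sum(confusion[i][i] for i in range(len(confusion))) / n
    # chance agreement: product of marginal row/column proportions
    pe = sum(sum(row) * sum(col)
             for row, col in zip(confusion, zip(*confusion))) / n ** 2
    return (po - pe) / (1 - pe)

cm = [[45, 5],
      [5, 45]]
print(round(cohens_kappa(cm), 2))  # -> 0.8
```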

  4. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    NASA Astrophysics Data System (ADS)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

    Spatial information on physical soil properties is in great demand for supporting environment-related and land use management decisions. One of the most widely used properties for characterizing soils physically is particle size distribution (PSD), which determines soil water management and cultivability. According to their size, particles are categorized as clay, silt, or sand; the size intervals are defined by national or international textural classification systems. The relative percentages of sand, silt, and clay in the soil determine its textural class, which is likewise defined differently across various national and/or specialty systems. The most commonly used is the classification system of the United States Department of Agriculture (USDA). Soil texture information is essential input data in meteorological, hydrological and agricultural prediction modelling. Although Hungary has a great deal of legacy soil maps and other relevant soil information, it often occurs that maps do not exist for a certain characteristic with the required thematic and/or spatial representation. The recent developments in digital soil mapping (DSM), however, provide wide opportunities for the elaboration of object-specific soil maps (OSSM) with predefined parameters (resolution, accuracy, reliability etc.). Due to the simultaneous richness of available Hungarian legacy soil data, spatial inference methods and auxiliary environmental information, there is a high versatility of possible approaches for the compilation of a given soil map. This suggests an opportunity for optimization. For the creation of an OSSM one might intend to identify the optimum set of soil data, methods and auxiliary co-variables, optimized for the resources (data costs, computation requirements etc.). We started a comprehensive analysis of the effects of the various DSM components on the accuracy of the output maps on pilot areas. The aim of this study is to compare and evaluate different
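
As an illustration of USDA-style texture classification from PSD, the sketch below encodes a few unambiguous corners of the USDA texture triangle; the full key has twelve classes, so only a subset is implemented here and everything else falls through to a placeholder:

```python
def usda_texture_partial(sand, silt, clay):
    """Partial USDA texture key (a few corner classes only).
    Inputs are percentages that must sum to 100. Classes not covered
    here would need the full twelve-class USDA key."""
    assert abs(sand + silt + clay - 100) < 1e-6, "fractions must sum to 100"
    if silt + 1.5 * clay < 15:
        return "sand"
    if clay >= 40 and sand <= 45 and silt < 40:
        return "clay"
    if silt >= 80 and clay < 12:
        return "silt"
    return "other (see full USDA key)"

print(usda_texture_partial(92, 4, 4))  # silt + 1.5*clay = 10 -> sand
```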

  5. Spiral magnetism in the single-band Hubbard model: the Hartree-Fock and slave-boson approaches

    NASA Astrophysics Data System (ADS)

    Igoshev, P. A.; Timirgazin, M. A.; Gilmutdinov, V. F.; Arzhnikov, A. K.; Irkhin, V. Yu

    2015-11-01

    The ground-state magnetic phase diagram is investigated within the single-band Hubbard model for square and different cubic lattices. The results of employing the generalized non-correlated mean-field (Hartree-Fock) approximation and generalized slave-boson approach by Kotliar and Ruckenstein with correlation effects included are compared. We take into account commensurate ferromagnetic, antiferromagnetic, and incommensurate (spiral) magnetic phases, as well as phase separation into magnetic phases of different types, which was often lacking in previous investigations. It is found that the spiral states and especially ferromagnetism are generally strongly suppressed up to unrealistically large Hubbard U by the correlation effects if nesting is absent and van Hove singularities are well away from the paramagnetic phase Fermi level. The magnetic phase separation plays an important role in the formation of magnetic states, the corresponding phase regions being especially wide in the vicinity of half-filling. The details of non-collinear and collinear magnetic ordering for different cubic lattices are discussed.
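
For reference, the single-band Hubbard Hamiltonian underlying both approximations is

```latex
H = -t \sum_{\langle ij \rangle,\sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right) + U \sum_{i} n_{i\uparrow} n_{i\downarrow} ,
```

where $t$ is the nearest-neighbour hopping amplitude, $U$ the on-site Coulomb repulsion, and $n_{i\sigma} = c^{\dagger}_{i\sigma} c_{i\sigma}$ the number operator for spin $\sigma$ on site $i$.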

  6. Spiral magnetism in the single-band Hubbard model: the Hartree-Fock and slave-boson approaches.

    PubMed

    Igoshev, P A; Timirgazin, M A; Gilmutdinov, V F; Arzhnikov, A K; Irkhin, V Yu

    2015-11-11

    The ground-state magnetic phase diagram is investigated within the single-band Hubbard model for square and different cubic lattices. The results of employing the generalized non-correlated mean-field (Hartree-Fock) approximation and generalized slave-boson approach by Kotliar and Ruckenstein with correlation effects included are compared. We take into account commensurate ferromagnetic, antiferromagnetic, and incommensurate (spiral) magnetic phases, as well as phase separation into magnetic phases of different types, which was often lacking in previous investigations. It is found that the spiral states and especially ferromagnetism are generally strongly suppressed up to unrealistically large Hubbard U by the correlation effects if nesting is absent and van Hove singularities are well away from the paramagnetic phase Fermi level. The magnetic phase separation plays an important role in the formation of magnetic states, the corresponding phase regions being especially wide in the vicinity of half-filling. The details of non-collinear and collinear magnetic ordering for different cubic lattices are discussed. PMID:26465091

  7. Uncertainty in Coastal Inundation Mapping: A Probabilistic Approach

    NASA Astrophysics Data System (ADS)

    Leon, J. X.; Callaghan, D. P.; Heuvelink, G.; Mills, M.; Phinn, S. R.

    2014-12-01

    Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly as extreme high sea levels and associated erosion are forecasted to increase in magnitude. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and spatial analysis propagate into the inundation mapping. Error propagation within spatial modelling can be appropriately analysed using, for example, a probabilistic framework based on geostatistical simulations. Geostatistical modelling takes the spatial correlation in elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The aim of this study was to produce probability maps incorporating the impacts of spatially variable and spatially correlated elevation errors in high-resolution DEMs, combined with sea level rise uncertainties. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. Sea level rise uncertainty was non-parametrically modelled using 1000 Monte Carlo estimations, which were processed to provide the probability density function numerically. The sea level rise uncertainties were modelled using a Weibull distribution with 0.95 scale and 2.2 shape parameters. These uncertainties were combined through addition (i.e., assuming they are independent), which, in terms of probability density functions, requires a convolution. This probabilistic approach can be used in a risk-averse decision making process by planning for
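
The combination of elevation error and SLR uncertainty can be sketched per DEM cell as a Monte Carlo estimate. The sketch below draws Gaussian elevation errors independently per cell (ignoring the spatial correlation that sequential Gaussian simulation captures) and SLR values from the Weibull(scale = 0.95, shape = 2.2) distribution given above; the cell elevation and error standard deviation are hypothetical:

```python
import random

def inundation_probability(cell_elev, elev_err_sd, n=10000, seed=42):
    """Monte Carlo estimate of P(inundation) for one DEM cell.
    True elevation = cell_elev + Gaussian error (spatial correlation of
    errors is ignored here for brevity); SLR ~ Weibull(0.95, 2.2).
    The cell floods in a draw when SLR exceeds its true elevation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        elev = cell_elev + rng.gauss(0.0, elev_err_sd)
        slr = rng.weibullvariate(0.95, 2.2)  # (scale, shape)
        if slr > elev:
            hits += 1
    return hits / n

p = inundation_probability(cell_elev=0.8, elev_err_sd=0.15)
print(round(p, 2))  # probability near 0.5 for a cell near the SLR median
```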

  8. Wildfire susceptibility mapping: comparing deterministic and stochastic approaches

    NASA Astrophysics Data System (ADS)

    Pereira, Mário; Leuenberger, Michael; Parente, Joana; Tonini, Marj

    2016-04-01

    Estimating the probability of wildfire occurrence in a certain area under particular environmental conditions represents a modern tool to support forest protection plans and to reduce fire consequences. This can be performed by the implementation of wildfire susceptibility mapping, normally achieved employing more or less sophisticated models which combine the predisposing variables (as raster datasets) within a geographic information system (GIS). The selection of the appropriate variables includes the evaluation of success and the implementation of prediction curves, as well as independent probabilistic validations for different scenarios. These methods make it possible to define the spatial pattern of wildfire occurrences, characterize the susceptibility of the territory, namely for specific fire causes/types, and can also account for other factors such as human behavior and social aspects. We selected Portugal as the study region which, due to its favorable climatic, topographic and vegetation conditions, is by far the European country most affected by wildfires. In addition, Verde and Zêzere (2010) performed a first assessment and validation of wildfire susceptibility and hazard in Portugal which can be used as a benchmark. The objectives of the present study comprise: (1) assessing the structural forest fire risk in Portugal using updated datasets, namely, with higher spatial resolution (80 m to 25 m), most recent vegetation cover (Corine Land Cover), and longer fire history (1975-2013); and (2) comparing linear vs non-linear approaches for wildfire susceptibility mapping. The data we used includes: (i) a DEM derived from the Shuttle Radar Topographic Mission at a resolution of 1 arc-second (DEM-SRTM 25 m) to assess elevation and slope; (ii) the Corine Land Cover inventory provided by the European Environment Agency (http://www.eea.europa.eu/pt) to produce the land use land cover map; (iii) the National Mapping Burnt Areas (NMBA) provided by the Institute for the

  9. Extended self-energy functional approach for strongly correlated lattice bosons in the superfluid phase

    SciTech Connect

    Arrigoni, Enrico; Knap, Michael; Linden, Wolfgang von der

    2011-07-01

    Among the various numerical techniques to study the physics of strongly correlated quantum many-body systems, the self-energy functional approach (SFA) has become increasingly important. In its previous form, however, SFA is not applicable to Bose-Einstein condensation or superfluidity. In this paper, we show how to overcome this shortcoming. To this end, we identify an appropriate quantity, which we term D, that represents the correlation correction of the condensate order parameter, as the self-energy does for the Green's function. An appropriate functional is derived, which is stationary at the exact physical realization of D and of the self-energy. Its derivation is based on a functional-integral representation of the grand potential followed by an appropriate sequence of Legendre transformations. The approach is not perturbative and, therefore, applicable to a wide range of models with local interactions. We show that the variational cluster approach based on the extended self-energy functional is equivalent to the "pseudoparticle" approach proposed in Phys. Rev. B 83, 134507 (2011). We present results for the superfluid density in the two-dimensional Bose-Hubbard model, which show remarkable agreement with those of quantum Monte Carlo calculations.

  10. A covariance fitting approach for correlated acoustic source mapping.

    PubMed

    Yardibi, Tarik; Li, Jian; Stoica, Petre; Zawodny, Nikolas S; Cattafesta, Louis N

    2010-05-01

    Microphone arrays are commonly used for noise source localization and power estimation in aeroacoustic measurements. The delay-and-sum (DAS) beamformer, which is the most widely used beamforming algorithm in practice, suffers from low resolution and high sidelobe level problems. Therefore, deconvolution approaches, such as the deconvolution approach for the mapping of acoustic sources (DAMAS), are often used for extracting the actual source powers from the contaminated DAS results. However, most deconvolution approaches assume that the sources are uncorrelated. Although deconvolution algorithms that can deal with correlated sources, such as DAMAS for correlated sources, do exist, these algorithms are computationally impractical even for small scanning grid sizes. This paper presents a covariance fitting approach for the mapping of acoustic correlated sources (MACS), which can work with uncorrelated, partially correlated or even coherent sources with a reasonably low computational complexity. MACS minimizes a quadratic cost function in a cyclic manner by making use of convex optimization and sparsity, and is guaranteed to converge at least locally. Simulations and experimental data acquired at the University of Florida Aeroacoustic Flow Facility with a 63-element logarithmic spiral microphone array in the absence of flow are used to demonstrate the performance of MACS. PMID:21117743
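
The DAS beamformer that DAMAS and MACS post-process projects the microphone cross-spectral (covariance) matrix onto a steering vector at each scan point; a toy NumPy sketch (the rank-one covariance of a single unit-power source is synthetic, not measured data):

```python
import numpy as np

def das_power(R, steering):
    """Delay-and-sum beamformer output power for one scan point.
    R: M x M microphone cross-spectral matrix;
    steering: length-M steering vector toward the scan point."""
    v = steering / len(steering)          # conventional 1/M weighting
    return float(np.real(v.conj() @ R @ v))

# Toy example: one unit-power source whose steering vector is known exactly
M = 8
phases = np.random.default_rng(0).random(M)
v_true = np.exp(1j * 2 * np.pi * phases)  # unit-modulus steering vector
R = np.outer(v_true, v_true.conj())       # rank-one CSM of a single source
print(round(das_power(R, v_true), 3))     # -> 1.0 when steered at the source
```

Steering away from the true source yields a lower output; it is this smeared response (the array point spread function) that deconvolution methods such as DAMAS and MACS attempt to remove.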

  11. Teaching Map Skills: An Inductive Approach. Part Three.

    ERIC Educational Resources Information Center

    Anderson, Jeremy

    1985-01-01

    These learning activities involve secondary geography students in making a turf map, using map grids, solving problems dealing with map scale, and making a map scale. Complete teacher instructions are provided. (RM)

  12. Phase diagram of ultracold atoms in optical lattices: Comparative study of slave fermion and slave boson approaches to Bose-Hubbard model

    SciTech Connect

    Yu Yue; Chui, S. T.

    2005-03-01

    We perform a comparative study of the finite temperature behavior of ultracold Bose atoms in optical lattices by the slave fermion and the slave boson approaches to the Bose-Hubbard model. The phase diagram of the system is presented. Although both approaches are equivalent without approximations, the mean field theory based on the slave fermion technique is quantitatively more appropriate. Conceptually, the slave fermion approach automatically excludes the double occupancy of two identical fermions on the same lattice site. By comparing to known results in limiting cases, we find the slave fermion approach better than the slave boson approach. For example, in the non-interacting limit, the critical temperature of the superfluid-normal liquid transition calculated by the slave fermion approach is closer to the well-known ideal Bose gas result. In the zero-temperature limit, the critical interaction strength from the slave fermion approach is also closer to that from a direct calculation using a zero-temperature mean field theory.

  13. The Effects of Reciprocal Teaching and Direct Instruction Approaches on Knowledge Map (K-Map) Generation Skill

    ERIC Educational Resources Information Center

    Görgen, Izzet

    2014-01-01

    The primary purpose of the present study is to investigate whether reciprocal teaching approach or direct instruction approach is more effective in the teaching of k-map generation skill. Secondary purpose of the study is to determine which of the k-map generation principles are more challenging for students to apply. The results of the study…

  14. The Effects of Reciprocal Teaching and Direct Instruction Approaches on Knowledge Map (k-map) Generation Skill

    ERIC Educational Resources Information Center

    Gorgen, Izzet

    2014-01-01

    The primary purpose of the present study is to investigate whether reciprocal teaching approach or direct instruction approach is more effective in the teaching of k-map generation skill. Secondary purpose of the study is to determine which of the k-map generation principles are more challenging for students to apply. The results of the study…

  15. Canonical map approach to channeling stability in crystals. II

    NASA Astrophysics Data System (ADS)

    Sáenz, A. W.

    1987-11-01

    A nonrelativistic and a relativistic classical Hamiltonian model of two degrees of freedom are considered describing the plane motion of a particle in a potential V(x1,x2)[(x1,x2) =Cartesian coordinates]. Suppose V(x1,x2) is real analytic in its arguments in a neighborhood of the line x2=0, one-periodic in x1 there, and such that the average value of ∂V(x1,0)/∂x2 vanishes. It is proved that, under these conditions and provided that the particle energy E is sufficiently large, there exist for all time two distinguished solutions, one satisfying the equations of motion of the nonrelativistic model and the other those of the relativistic model, whose corresponding configuration-space orbits are one-periodic in x1 and approach the line x2=0 as E→∞. The main theorem is that these solutions are (future) orbitally stable at large enough E if V satisfies the above conditions, as well as natural requirements of linear and nonlinear stability. To prove their existence, one uses a well-known theorem, for which a new and simpler proof is provided, and properties of certain natural canonical maps appropriate to these respective models. It is shown that such solutions are orbitally stable by reducing the maps in question to Birkhoff canonical form and then applying a version of the Moser twist theorem. The approach used here greatly lightens the labor of deriving key estimates for the above maps, these estimates being needed to effect this reduction. The present stability theorem is physically interesting because it is the first rigorous statement on the orbital stability of certain channeling motions of fast charged particles in rigid two-dimensional lattices, within the context of models of the stated degree of generality.

  16. A filtering approach to edge preserving MAP estimation of images.

    PubMed

    Humphrey, David; Taubman, David

    2011-05-01

    The authors present a computationally efficient technique for maximum a posteriori (MAP) estimation of images in the presence of both blur and noise. The image is divided into statistically independent regions, each modelled with a wide-sense stationary (WSS) Gaussian prior. Classical Wiener filter theory is used to generate a set of convex sets in the solution space, with the solution to the MAP estimation problem lying at their intersection. The proposed algorithm uses an underlying segmentation of the image, and a means of determining and refining that segmentation is described. The algorithm is suitable for a range of image restoration problems, as it provides a computationally efficient means of dealing with the shortcomings of Wiener filtering without sacrificing the computational simplicity of the filtering approach. The algorithm is also of interest from a theoretical viewpoint, as it provides a continuum of solutions between Wiener filtering and inverse filtering depending upon the segmentation used. We do not attempt to show here that the proposed method is the best general approach to the image reconstruction problem. However, related work referenced herein shows excellent performance in the specific problem of demosaicing. PMID:21078580
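
Classical Wiener filtering, the building block of the proposed approach, can be sketched in the frequency domain; this is a generic 1-D illustration with a constant noise-to-signal ratio, not the authors' region-based algorithm:

```python
import numpy as np

def wiener_deconvolve(y, h, nsr):
    """Frequency-domain Wiener deconvolution of a 1-D blurred, noisy signal.
    y: observed signal; h: blur kernel (same length, zero-padded);
    nsr: noise-to-signal power ratio (taken constant for simplicity).
    nsr -> 0 recovers the inverse filter; large nsr suppresses noisy bands."""
    Y, H = np.fft.fft(y), np.fft.fft(h)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)  # Wiener transfer function
    return np.real(np.fft.ifft(W * Y))

x = np.array([1.0, 2.0, 3.0, 4.0])
h = np.array([1.0, 0.0, 0.0, 0.0])  # identity "blur" for the sanity check
print(np.allclose(wiener_deconvolve(x, h, 0.0), x))  # -> True
```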

  17. From the shell model to the interacting boson model

    SciTech Connect

    Ginocchio, J.N.; Johnson, C.W.

    1994-07-01

    Starting from a general, microscopic fermion-pair-to-boson mapping of a complete fermion space that preserves Hermitian conjugation, we show that the resulting infinite and non-convergent boson Hamiltonian can be factored into a finite image Hamiltonian (e.g., a 1 + 2-body fermion Hamiltonian is mapped to a 1 + 2-body boson Hamiltonian) times the norm operator; it is the norm operator that is infinite and non-convergent. We then truncate to a collective boson space and give conditions under which the exact boson images of finite fermion operators are also finite in the truncated basis.

  18. Einstein's Gravitational Field Approach to Dark Matter and Dark Energy-Geometric Particle Decay into the Vacuum Energy Generating Higgs Boson and Heavy Quark Mass

    NASA Astrophysics Data System (ADS)

    Christensen, Walter James

    2015-08-01

    During an interview at the Niels Bohr Institute, David Bohm stated that, "according to Einstein, particles should eventually emerge as singularities, or very strong regions of stable pulses of (the gravitational) field" [1]. Starting from this premise, we show that spacetime indeed manifests stable pulses (n-valued gravitons) that decay into the vacuum energy to generate all three boson masses (including the Higgs), as well as heavy-quark mass, all in precise agreement with the 2010 CODATA report on fundamental constants. Furthermore, our relativized quantum physics (RQP) approach addresses the mysteries surrounding dark energy, dark matter, accelerated spacetime, and why ordinary matter dominates over antimatter.

  19. Teaching population health: a competency map approach to education.

    PubMed

    Kaprielian, Victoria S; Silberberg, Mina; McDonald, Mary Anne; Koo, Denise; Hull, Sharon K; Murphy, Gwen; Tran, Anh N; Sheline, Barbara L; Halstater, Brian; Martinez-Bianchi, Viviana; Weigle, Nancy J; de Oliveira, Justine Strand; Sangvai, Devdutta; Copeland, Joyce; Tilson, Hugh H; Scutchfield, F Douglas; Michener, J Lloyd

    2013-05-01

    A 2012 Institute of Medicine report is the latest in the growing number of calls to incorporate a population health approach in health professionals' training. Over the last decade, Duke University, particularly its Department of Community and Family Medicine, has been heavily involved with community partners in Durham, North Carolina, to improve the local community's health. On the basis of these initiatives, a group of interprofessional faculty began tackling the need to fill the curriculum gap to train future health professionals in public health practice, community engagement, critical thinking, and team skills to improve population health effectively in Durham and elsewhere. The Department of Community and Family Medicine has spent years in care delivery redesign and curriculum experimentation, design, and evaluation to distinguish the skills trainees and faculty need for population health improvement and to integrate them into educational programs. These clinical and educational experiences have led to a set of competencies that form an organizational framework for curricular planning and training. This framework delineates which learning objectives are appropriate and necessary for each learning level, from novice through expert, across multiple disciplines and domains. The resulting competency map has guided Duke's efforts to develop, implement, and assess training in population health for learners and faculty. In this article, the authors describe the competency map development process as well as examples of its application and evaluation at Duke and limitations to its use with the hope that other institutions will apply it in different settings. PMID:23524919

  20. Teaching Population Health: A Competency Map Approach to Education

    PubMed Central

    Kaprielian, Victoria S.; Silberberg, Mina; McDonald, Mary Anne; Koo, Denise; Hull, Sharon K.; Murphy, Gwen; Tran, Anh N.; Sheline, Barbara L.; Halstater, Brian; Martinez-Bianchi, Viviana; Weigle, Nancy J.; de Oliveira, Justine Strand; Sangvai, Devdutta; Copeland, Joyce; Tilson, Hugh H.; Scutchfield, F. Douglas; Michener, J. Lloyd

    2013-01-01

    A 2012 Institute of Medicine report is the latest in the growing number of calls to incorporate a population health approach in health professionals’ training. Over the last decade, Duke University, particularly its Department of Community and Family Medicine, has been heavily involved with community partners in Durham, North Carolina to improve the local community’s health. Based on these initiatives, a group of interprofessional faculty began tackling the need to fill the curriculum gap to train future health professionals in public health practice, community engagement, critical thinking, and team skills to improve population health effectively in Durham and elsewhere. The Department of Community and Family Medicine has spent years in care delivery redesign and curriculum experimentation, design, and evaluation to distinguish the skills trainees and faculty need for population health improvement and to integrate them into educational programs. These clinical and educational experiences have led to a set of competencies that form an organizational framework for curricular planning and training. This framework delineates which learning objectives are appropriate and necessary for each learning level, from novice through expert, across multiple disciplines and domains. The resulting competency map has guided Duke’s efforts to develop, implement, and assess training in population health for learners and faculty. In this article, the authors describe the competency map development process as well as examples of its application and evaluation at Duke and limitations to its use with the hope that other institutions will apply it in different settings. PMID:23524919

  1. Mapping African animal trypanosomosis risk: the landscape approach.

    PubMed

    Guerrini, Laure; Bouyer, Jérémy

    2007-01-01

    African animal trypanosomosis (AAT) is a major hindrance to cattle breeding in the Mouhoun River Basin of Burkina Faso. The authors describe a landscape approach that enables the mapping of tsetse densities and AAT risk along the Mouhoun River loop (702 km long) in Burkina Faso. Three epidemiological landscapes were described: the first and most dangerous corresponded to protected forests and their border areas, with an apparent density of infectious flies per trap per day (ADTi) of 0.74; the second to a partially disturbed vegetal formation, with an ADTi of 0.20; and the third to a completely disturbed landscape, with an ADTi of 0.08. By this risk indicator, the first landscape was 3.92 times riskier than the second, which in turn was 3.13 times riskier than the third. Similar infection rates were found in all landscapes (approximately 8%), but tsetse apparent densities dropped significantly (p<0.001) in the half-disturbed (2.66) and disturbed (0.80) landscapes in comparison to the natural and border landscapes (11.77). Females were significantly younger (mean physiological age of 29 days) in the most disturbed landscape (p<0.05) than in the other two (41 days). Based on these results, the practical implications of stratifying AAT risk and mapping tsetse densities for vector control campaigns are discussed. PMID:20422544

  2. Mapping the distribution of malaria: current approaches and future directions

    USGS Publications Warehouse

    Johnson, Leah R.; Lafferty, Kevin D.; McNally, Amy; Mordecai, Erin A.; Paaijmans, Krijn P.; Pawar, Samraat; Ryan, Sadie J.

    2015-01-01

    Mapping the distribution of malaria has received substantial attention because the disease is a major source of illness and mortality in humans, especially in developing countries. It also has a defined temporal and spatial distribution. The distribution of malaria is most influenced by its mosquito vector, which is sensitive to extrinsic environmental factors such as rainfall and temperature. Temperature also affects the development rate of the malaria parasite in the mosquito. Here, we review the range of approaches used to model the distribution of malaria, from spatially explicit to implicit, mechanistic to correlative. Although current methods have significantly improved our understanding of the factors influencing malaria transmission, significant gaps remain, particularly in incorporating nonlinear responses to temperature and temperature variability. We highlight new methods to tackle these gaps and to integrate new data with models.
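
The nonlinear temperature responses the review highlights are often modelled with unimodal functions such as the Brière form, c·T·(T − T0)·√(Tm − T), which is zero outside a lower and upper thermal limit; a sketch with purely illustrative parameter values (not fitted to any trait):

```python
import math

def briere(T, c, T0, Tm):
    """Briere thermal response: zero outside (T0, Tm), unimodal inside.
    c is a scaling constant; T0/Tm are lower/upper thermal limits.
    Parameter values used below are purely illustrative."""
    if T <= T0 or T >= Tm:
        return 0.0
    return c * T * (T - T0) * math.sqrt(Tm - T)

# Illustrative curve: trait rises above T0 = 14 C, peaks, drops to 0 at Tm = 40 C
rate = [round(briere(T, 2e-4, 14.0, 40.0), 4) for T in (10, 20, 30, 38)]
print(rate)
```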

  3. A statistical approach for validating eSOTER and digital soil maps in front of traditional soil maps

    NASA Astrophysics Data System (ADS)

    Bock, Michael; Baritz, Rainer; Köthe, Rüdiger; Melms, Stephan; Günther, Susann

    2015-04-01

    During the European research project eSOTER, three different Digital Soil Maps (DSM) were developed for the pilot area Chemnitz 1:250,000 (FP7 eSOTER project, grant agreement nr. 211578). The core task of the project was to revise the SOTER method for the interpretation of soil and terrain data. It was one of the working hypotheses that eSOTER does not only provide terrain data with typical soil profiles, but that the new products actually perform like a conceptual soil map. The three eSOTER maps for the pilot area considerably differed in spatial representation and content of soil classes. In this study we compare the three eSOTER maps against existing reconnaissance soil maps, keeping in mind that traditional soil maps have many subjective issues and an intended bias regarding the overestimation and emphasis of certain features. Hence, a true validation of the proper representation of modeled soil maps is hardly possible; rather, a statistical comparison between modeled and empirical approaches is. If eSOTER data represent conceptual soil maps, then different eSOTER, DSM and conventional maps from various sources and different regions could be harmonized towards consistent new data sets for large areas, including the whole European continent. One of the eSOTER maps has been developed closely to the traditional SOTER method: terrain classification data (derived from the SRTM DEM) were combined with lithology data (a re-interpreted geological map); the corresponding terrain units were then extended with soil information: a very dense regional soil profile data set was used to define soil mapping units based on a statistical grouping of terrain units. The second map is a pure DSM map using continuous terrain parameters instead of terrain classification; radiospectrometric data were used to supplement parent material information from geology maps. The classification method Random Forest was used. The third approach predicts soil diagnostic properties based on

  4. A multi-model ensemble approach to seabed mapping

    NASA Astrophysics Data System (ADS)

    Diesing, Markus; Stephens, David

    2015-06-01

    Seabed habitat mapping based on swath acoustic data and ground-truth samples is an emergent and active marine science discipline. Significant progress could be achieved by transferring techniques and approaches that have been successfully developed and employed in such fields as terrestrial land cover mapping. One such promising approach is the multiple classifier system, which aims at improving classification performance by combining the outputs of several classifiers. Here we present results of a multi-model ensemble applied to multibeam acoustic data covering more than 5000 km2 of seabed in the North Sea with the aim to derive accurate spatial predictions of seabed substrate. A suite of six machine learning classifiers (k-Nearest Neighbour, Support Vector Machine, Classification Tree, Random Forest, Neural Network and Naïve Bayes) was trained with ground-truth sample data classified into seabed substrate classes and their prediction accuracy was assessed with an independent set of samples. The three and five best performing models were combined to classifier ensembles. Both ensembles led to increased prediction accuracy as compared to the best performing single classifier. The improvements were however not statistically significant at the 5% level. Although the three-model ensemble did not perform significantly better than its individual component models, we noticed that the five-model ensemble did perform significantly better than three of the five component models. A classifier ensemble might therefore be an effective strategy to improve classification performance. Another advantage is the fact that the agreement in predicted substrate class between the individual models of the ensemble could be used as a measure of confidence. We propose a simple and spatially explicit measure of confidence that is based on model agreement and prediction accuracy.
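
A multiple classifier system of the kind described can be as simple as per-cell majority voting across the component models, with the agreement fraction serving as the spatially explicit confidence measure the authors propose; a minimal sketch with hypothetical substrate predictions:

```python
from collections import Counter

def ensemble_vote(predictions):
    """Majority vote across classifiers for each map cell.
    predictions: list of per-classifier lists of predicted classes.
    Returns (voted class per cell, agreement fraction per cell),
    the latter usable as a simple confidence measure."""
    voted, confidence = [], []
    for cell in zip(*predictions):
        cls, count = Counter(cell).most_common(1)[0]
        voted.append(cls)
        confidence.append(count / len(cell))
    return voted, confidence

# Hypothetical predictions from three classifiers over four cells
preds = [["sand", "mud", "gravel", "mud"],
         ["sand", "mud", "sand",   "mud"],
         ["sand", "rock", "sand",  "mud"]]
classes, conf = ensemble_vote(preds)
print(classes)                      # -> ['sand', 'mud', 'sand', 'mud']
print([round(c, 2) for c in conf])  # -> [1.0, 0.67, 0.67, 1.0]
```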

  5. Symmetry-improved 2PI approach to the Goldstone-boson IR problem of the SM effective potential

    NASA Astrophysics Data System (ADS)

    Pilaftsis, Apostolos; Teresi, Daniele

    2016-05-01

The effective potential of the Standard Model (SM), from three-loop order onward, suffers from infrared (IR) divergences arising from quantum effects due to massless would-be Goldstone bosons associated with the longitudinal polarizations of the W± and Z bosons. Such IR pathologies also hinder accurate evaluation of the two-loop threshold corrections to electroweak quantities, such as the vacuum expectation value of the Higgs field. However, these divergences are an artifact of perturbation theory and therefore need to be consistently resummed in order to obtain an IR-safe effective potential. The so-called Two-Particle-Irreducible (2PI) effective action provides a rigorous framework to consistently perform such resummations, without the need to resort to ad hoc subtractions or run the risk of over-counting contributions. By considering the recently proposed symmetry-improved 2PI formalism, we address the problem of the Goldstone-boson IR divergences of the SM effective potential in the gaugeless limit of the theory. In the same limit, we evaluate the IR-safe symmetry-improved 2PI effective potential, after taking into account quantum loops of chiral fermions, as well as the renormalization of spurious custodially breaking effects triggered by fermionic Yukawa interactions. Finally, we compare our results with those obtained with other methods presented in the literature.

  6. Pure P2P mediation system: A mappings discovery approach

    NASA Astrophysics Data System (ADS)

    selma, El yahyaoui El idrissi; Zellou, Ahmed; Idri, Ali

    2015-02-01

Information integration systems offer a uniform interface providing access to a set of autonomous and distributed information sources. Their most important advantage is that they allow users to specify what they want, rather than how to obtain the responses. Work in this area has led to two major classes of integration systems: mediation systems based on the mediator/adapter paradigm, and peer-to-peer (P2P) systems. Combining the two has produced a third type: P2P mediation systems. P2P systems are large-scale, self-organized and distributed, and allow resource management in a completely decentralized way. However, integrating structured, heterogeneous and distributed information sources remains a complex problem. The objective of this work is to propose an approach to resolve conflicts and establish mappings between the heterogeneous elements. The approach is based on clustering, which groups similar peers that share common information into the same subnet. To cope with heterogeneity, we introduce three additional layers into our hierarchy of peers: internal schema, external schema and schema directory peer. We use linguistic techniques, specifically the name correspondence technique, which proposes a mapping based on the similarity of names.

  7. Uncertainty propagation in a cascade modelling approach to flood mapping

    NASA Astrophysics Data System (ADS)

    Rodríguez-Rincón, J. P.; Pedrozo-Acuña, A.; Breña Naranjo, J. A.

    2014-07-01

The purpose of this investigation is to study the propagation of meteorological uncertainty within a cascade modelling approach to flood mapping. The methodology comprises a Numerical Weather Prediction (NWP) model, a distributed rainfall-runoff model and a standard 2-D hydrodynamic model. The cascade of models is used to reproduce an extreme flood event that took place in the southeast of Mexico in November 2009. The event was selected because high-quality field data (e.g. rain gauges; discharge) and satellite imagery are available. Uncertainty in the meteorological model (Weather Research and Forecasting model) is evaluated through the use of a multi-physics ensemble technique, which considers twelve parameterization schemes for determining a given precipitation. The resulting precipitation fields are used as input to a distributed hydrological model, enabling the determination of different hydrographs associated with the event. Lastly, by means of a standard 2-D hydrodynamic model, the hydrographs are used as forcing conditions to study the propagation of the meteorological uncertainty into an estimated flooded area. Results show the utility of the selected modelling approach for investigating error propagation within a cascade of models. Moreover, the error associated with the determination of the runoff is shown to be lower than that obtained in the precipitation estimation, suggesting that uncertainty does not necessarily increase within a model cascade.

  8. A geostatistical approach to mapping site response spectral amplifications

    USGS Publications Warehouse

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Tanaka, Y.; Tanaka, H.

    2010-01-01

If quantitative estimates of the seismic properties do not exist at a location of interest then the site response spectral amplifications must be estimated from data collected at other locations. Currently, the most common approach employs correlations of site class with maps of surficial geology. Analogously, correlations of site class with topographic slope can be employed where the surficial geology is unknown. Our goal is to identify and validate a method to estimate site response with greater spatial resolution and accuracy for regions where additional effort is warranted. This method consists of three components: region-specific data collection, a spatial model for interpolating seismic properties, and a theoretical method for computing spectral amplifications from the interpolated seismic properties. We consider three spatial interpolation schemes: correlations with surficial geology, termed the geologic trend (GT), ordinary kriging (OK), and kriging with a trend (KT). We estimate the spectral amplifications from seismic properties using the square root of impedance method, thereby linking the frequency-dependent spectral amplifications to the depth-dependent seismic properties. Thus, the range of periods for which this method is applicable is limited by the depth of exploration. A dense survey of near-surface S-wave slowness (Ss) throughout Kobe, Japan shows that the geostatistical methods give more accurate estimates of Ss than the topographic slope and GT methods, and the OK and KT methods perform equally well. We prefer the KT model because it can be seamlessly integrated with geologic maps that cover larger regions. Empirical spectral amplifications show that the region-specific data achieve more accurate estimates of observed median short-period amplifications than the topographic slope method. © 2010 Elsevier B.V.
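The square-root-of-impedance idea referenced above can be illustrated with a minimal sketch: amplification is approximated by the square root of the ratio of seismic impedance (density times S-wave velocity) between a reference bedrock column and the site column. The function name and the numerical values below are illustrative assumptions, not taken from the paper:

```python
import math

def sri_amplification(rho_ref, v_ref, rho_site, v_site):
    """Square-root-of-impedance (SRI) estimate of site amplification:
    the ratio of seismic impedance (density * S-wave velocity) between a
    reference bedrock column and the site column, under a square root.
    A frequency-independent sketch of the basic relation."""
    return math.sqrt((rho_ref * v_ref) / (rho_site * v_site))

# Soft-soil site (1700 kg/m^3, 200 m/s) under reference rock conditions
# (2500 kg/m^3, 1500 m/s) -- purely illustrative numbers:
amp = sri_amplification(2500.0, 1500.0, 1700.0, 200.0)
# amp ~ 3.32: lower impedance at the site implies larger amplification
```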

  9. Teaching Map Skills: An Inductive Approach. Part Four.

    ERIC Educational Resources Information Center

    Anderson, Jeremy

    1985-01-01

    Satisfactory completion of this self-contained map exercise will demonstrate student ability to use symbols, legends, scale, orientation, index, and grid in map reading and map use to give directions for way-finding. The exercise should take one class period to complete. (RM)

  10. A Probabilistic Approach for Improved Sequence Mapping in Metatranscriptomic Studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

Mapping millions of short DNA sequences to a reference genome is a necessary step in many experiments designed to investigate the expression of genes involved in disease resistance. This is a difficult task in which several challenges often arise, resulting in a suboptimal mapping. This mapping process ...

  11. High School Biology: A Group Approach to Concept Mapping.

    ERIC Educational Resources Information Center

    Brown, David S.

    2003-01-01

    Explains concept mapping as an instructional method in cooperative learning environments, and describes a study investigating the effectiveness of concept mapping on student learning during a photosynthesis and cellular respiration unit. Reports on the positive effects of concept mapping in the experimental group. (Contains 16 references.) (YDS)

  12. Comparison of Mixed-Model Approaches for Association Mapping

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Association-mapping methods promise to overcome the limitations of linkage-mapping methods. The main objectives of this study were to (i) evaluate various methods for association mapping in the autogamous species wheat using an empirical data set, (ii) determine a marker-based kinship matrix using a...

  13. Mapping diffusion in a living cell via the phasor approach.

    PubMed

    Ranjit, Suman; Lanzano, Luca; Gratton, Enrico

    2014-12-16

Diffusion of a fluorescent protein within a cell has been measured using either fluctuation-based techniques (fluorescence correlation spectroscopy (FCS) or raster-scan image correlation spectroscopy) or particle tracking. However, none of these methods enables measurement of the diffusion of the fluorescent particle at each pixel of the image. Measurement using conventional single-point FCS at every individual pixel results in continuous long exposure of the cell to the laser and eventual bleaching of the sample. To overcome this limitation, we have developed what we believe to be a new method of scanning with simultaneous construction of a fluorescent image of the cell. In this modified raster-scanning method, the laser scans each individual line multiple times before moving to the next line as it acquires the image. This continues until the entire area is scanned, and differs from the original raster-scan image correlation spectroscopy approach, where data are acquired by scanning each frame once and then scanning the image multiple times. The total time of data acquisition needed for this method is much shorter than that required for traditional FCS analysis at each pixel. However, at a single pixel the acquired intensity time sequence is short, requiring nonconventional analysis of the correlation function to extract information about the diffusion. These correlation data have been analyzed using the phasor approach, a fit-free method originally developed for the analysis of FLIM images. Analysis using this method results in an estimate of the average diffusion coefficient of the fluorescent species at each pixel of an image, and thus a detailed diffusion map of the cell can be created. PMID:25517145

  14. Current Approaches Toward Quantitative Mapping of the Interactome

    PubMed Central

    Buntru, Alexander; Trepte, Philipp; Klockmeier, Konrad; Schnoegl, Sigrid; Wanker, Erich E.

    2016-01-01

    Protein–protein interactions (PPIs) play a key role in many, if not all, cellular processes. Disease is often caused by perturbation of PPIs, as recently indicated by studies of missense mutations. To understand the associations of proteins and to unravel the global picture of PPIs in the cell, different experimental detection techniques for PPIs have been established. Genetic and biochemical methods such as the yeast two-hybrid system or affinity purification-based approaches are well suited to high-throughput, proteome-wide screening and are mainly used to obtain qualitative results. However, they have been criticized for not reflecting the cellular situation or the dynamic nature of PPIs. In this review, we provide an overview of various genetic methods that go beyond qualitative detection and allow quantitative measuring of PPIs in mammalian cells, such as dual luminescence-based co-immunoprecipitation, Förster resonance energy transfer or luminescence-based mammalian interactome mapping with bait control. We discuss the strengths and weaknesses of different techniques and their potential applications in biomedical research. PMID:27200083

  15. Interacting boson model from energy density functionals: {gamma}-softness and the related topics

    SciTech Connect

    Nomura, K.

    2012-10-20

    A comprehensive way of deriving the Hamiltonian of the interacting boson model (IBM) is described. Based on the fact that the multi-nucleon induced surface deformation in finite nucleus is simulated by effective boson degrees of freedom, the potential energy surface calculated with self-consistent mean-field method employing a given energy density functional (EDF) is mapped onto the IBM analog, and thereby the excitation spectra and transition rates with good symmetry quantum numbers are calculated. Recent applications of the proposed approach are reported: (i) an alternative robust interpretation of the {gamma}-soft nuclei and (ii) shape coexistence in lead isotopes.

  16. A faster and economical approach to floodplain mapping using the SSURGO soil database

    NASA Astrophysics Data System (ADS)

    Sangwan, N.; Merwade, V.

    2014-12-01

    Floods are the most damaging of all natural disasters, adversely affecting millions of lives and causing financial losses worth billions of dollars every year across the globe. Flood inundation maps play a key role in the assessment and mitigation of potential flood hazards. However, there are several communities in the United States where flood risk maps are not available due to the lack of the resources needed to create such maps through the conventional modeling approach. The objective of this study is to develop and examine an economical alternative approach to floodplain mapping using widely available SSURGO soil data in the United States. By using the state of Indiana as a test case, floodplain maps are developed for the entire state by identifying the flood-prone soil map units based on their attributes recorded in the SSURGO database. For validation, the flood extents obtained from the soil data are compared with the extents predicted by other floodplain maps, including the Federal Emergency Management Agency (FEMA) issued Flood Insurance Rate Maps (FIRM), flood extents observed during past floods, and other flood maps derived using Digital Elevation Models (DEMs). In general, SSURGO based floodplain maps are found to be largely in agreement with flood inundation maps created by FEMA. Comparison between the FEMA maps and the SSURGO derived floodplain maps show an overlap ranging from 65 to 90 percent. Similar results are also found when the SSURGO derived floodplain maps are compared with FEMA maps for recent flood events in other states including Minnesota, Washington and Wisconsin. Although not in perfect conformance with reference flood maps, the SSURGO soil data approach offers an economical and faster alternative to floodplain mapping in areas where detailed flood modeling and mapping has not been conducted.
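The 65-90 percent overlap figures quoted above amount to comparing two binary flood-extent rasters cell by cell. A minimal sketch of that comparison (the grids and the FEMA/SSURGO labels below are toy assumptions, not the study's data):

```python
def overlap_percent(map_a, map_b):
    """Percentage of cells flooded in map_a that are also flooded in
    map_b. Maps are equal-sized grids of 0/1 flood-extent values."""
    flooded_a = flooded_both = 0
    for row_a, row_b in zip(map_a, map_b):
        for a, b in zip(row_a, row_b):
            if a:
                flooded_a += 1
                if b:
                    flooded_both += 1
    return 100.0 * flooded_both / flooded_a if flooded_a else 0.0

# Hypothetical 2 x 4 extract of a reference (e.g. FEMA) map and a
# soil-derived (e.g. SSURGO-based) map:
fema   = [[0, 1, 1, 1],
          [0, 1, 1, 0]]
ssurgo = [[0, 1, 1, 0],
          [0, 1, 0, 0]]
# 3 of the 5 reference-flooded cells are also flagged -> 60.0
print(overlap_percent(fema, ssurgo))  # 60.0
```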

  17. A Hierarchical and Distributed Approach for Mapping Large Applications to Heterogeneous Grids using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Sanyal, Soumya; Jain, Amit; Das, Sajal K.; Biswas, Rupak

    2003-01-01

    In this paper, we propose a distributed approach for mapping a single large application to a heterogeneous grid environment. To minimize the execution time of the parallel application, we distribute the mapping overhead to the available nodes of the grid. This approach not only provides a fast mapping of tasks to resources but is also scalable. We adopt a hierarchical grid model and accomplish the job of mapping tasks to this topology using a scheduler tree. Results show that our three-phase algorithm provides high quality mappings, and is fast and scalable.
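The core genetic-algorithm step of such a mapper can be sketched as chromosomes that assign each task to a node, evolved to minimise the makespan on heterogeneous resources. This is a flat, single-population toy (the hierarchical scheduler tree and distributed mapping overhead of the paper are not reproduced), and all task costs and node speeds are made-up values:

```python
import random

def ga_map(task_costs, node_speeds, pop=30, gens=60, seed=1):
    """Toy GA mapping tasks to heterogeneous nodes, minimising makespan
    (finish time of the busiest node). Chromosome = node index per task."""
    rng = random.Random(seed)
    n_nodes = len(node_speeds)

    def makespan(assign):
        load = [0.0] * n_nodes
        for cost, node in zip(task_costs, assign):
            load[node] += cost / node_speeds[node]
        return max(load)

    popn = [[rng.randrange(n_nodes) for _ in task_costs] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=makespan)
        survivors = popn[:pop // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)  # one-point crossover
            cut = rng.randrange(1, len(task_costs))
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:           # occasional mutation
                child[rng.randrange(len(child))] = rng.randrange(n_nodes)
            children.append(child)
        popn = survivors + children
    best = min(popn, key=makespan)
    return best, makespan(best)

tasks = [4, 8, 3, 7, 2, 5]       # task work units (hypothetical)
nodes = [1.0, 2.0, 4.0]          # relative node speeds (hypothetical)
mapping, span = ga_map(tasks, nodes)
```

Even this toy version should beat the naive "everything on the fastest node" assignment, whose makespan here would be 7.25.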

  18. Quantitative Architectural Analysis: A New Approach to Cortical Mapping

    ERIC Educational Resources Information Center

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-01-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological…

  19. Mapping dominant runoff processes: an evaluation of different approaches using similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, Manuel; Buss, Rahel; Scherrer, Simon; Margreth, Michael; Zappa, Massimiliano

    2016-07-01

    The identification of landscapes with similar hydrological behaviour is useful for runoff and flood predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP-mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexities of automatic DRP-mapping approaches affect hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison, and a deviation map were derived. The automatically derived DRP maps were used in synthetic runoff simulations with an adapted version of the PREVAH hydrological model, and simulation results compared with those from simulations using the reference maps. The DRP maps derived with the automatic approach with highest complexity and data requirement were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking. 
Furthermore, we argue not to use

  20. Mapping dominant runoff processes: an evaluation of different approaches using similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, M.; Buss, R.; Scherrer, S.; Margreth, M.; Zappa, M.

    2015-12-01

    The identification of landscapes with similar hydrological behaviour is useful for runoff predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexity of automatic DRP mapping approaches affects hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison and a deviation map were derived. The automatically derived DRP-maps were used in synthetic runoff simulations with an adapted version of the hydrological model PREVAH, and simulation results compared with those from simulations using the reference maps. The DRP-maps derived with the automatic approach with highest complexity and data requirement were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP-maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking. 
We therefore recommend not only using expert

  1. Structure-mapping approach to analogy and metaphor

    SciTech Connect

    Gentner, D.

    1982-01-01

    The structure-mapping theory of analogy describes a set of principles by which the interpretation of an analogy is derived from the meanings of its parts. These principles are characterized as implicit rules for mapping knowledge about a base domain into a target domain. Two important features of the theory are that the rules depend only on syntactic properties of the knowledge representation, and not on the specific content of the domains; and the theoretical framework allows analogies to be distinguished cleanly from literal similarity statements, applications of general laws, and other kinds of comparisons. Two mapping principles are described: relations between objects, rather than attributes of objects, are mapped from base to target; and the particular relations mapped are determined by systematicity, as defined by the existence of higher-order relations. 4 references.

  2. Exploiting Surroundedness for Saliency Detection: A Boolean Map Approach.

    PubMed

    Zhang, Jianming; Sclaroff, Stan

    2016-05-01

We demonstrate the usefulness of surroundedness for eye fixation prediction by proposing a Boolean Map based Saliency model (BMS). In our formulation, an image is characterized by a set of binary images, which are generated by randomly thresholding the image's feature maps in a whitened feature space. Based on a Gestalt principle of figure-ground segregation, BMS computes a saliency map by discovering surrounded regions via topological analysis of Boolean maps. Furthermore, we draw a connection between BMS and the Minimum Barrier Distance to provide insight into why and how BMS can properly capture the surroundedness cue via Boolean maps. The strength of BMS is verified by its simplicity, efficiency and superior performance compared with 10 state-of-the-art methods on seven eye tracking benchmark datasets. PMID:26336114
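The surroundedness cue at the heart of BMS can be sketched in a few lines: threshold a feature map at several levels, and in each resulting Boolean map mark the foreground regions that do not touch the image border as salient. This is a deliberately simplified single-channel sketch; the whitened feature space and randomized threshold sampling of the full model are omitted:

```python
from collections import deque

def bms_saliency(image, thresholds):
    """Crude surroundedness-based saliency: average, over thresholds, an
    indicator of foreground cells not connected to the image border."""
    h, w = len(image), len(image[0])
    saliency = [[0.0] * w for _ in range(h)]
    for th in thresholds:
        boolmap = [[image[y][x] > th for x in range(w)] for y in range(h)]
        # Flood-fill the True cells reachable from the border...
        reached = [[False] * w for _ in range(h)]
        queue = deque((y, x) for y in range(h) for x in range(w)
                      if boolmap[y][x] and (y in (0, h - 1) or x in (0, w - 1)))
        for y, x in queue:
            reached[y][x] = True
        while queue:
            y, x = queue.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and boolmap[ny][nx] \
                        and not reached[ny][nx]:
                    reached[ny][nx] = True
                    queue.append((ny, nx))
        # ...whatever stays True is surrounded, hence salient at this level.
        for y in range(h):
            for x in range(w):
                if boolmap[y][x] and not reached[y][x]:
                    saliency[y][x] += 1.0 / len(thresholds)
    return saliency

img = [[0, 0, 0, 0, 0],
       [0, 9, 9, 9, 0],
       [0, 9, 9, 9, 0],
       [0, 0, 0, 0, 0]]
sal = bms_saliency(img, thresholds=[4])
# The enclosed bright blob is surrounded: sal[1][1] == 1.0, border cells 0.0
```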

  3. Space Borne Swath Mapping Laser Altimeters - Comparison of Measurement Approaches

    NASA Astrophysics Data System (ADS)

    Sun, X.; Abshire, J. B.; Harding, D. J.

    2007-12-01

Laser altimetry is an important technique for studying the surface topography of the planets and the Earth from orbit. Presently, orbital laser altimeters profile surface height along a single ground track, such as the Geoscience Laser Altimeter System (GLAS) on the Ice, Cloud, and land Elevation Satellite (ICESat). NASA is developing new technologies for an orbiting swath mapping laser altimeter with a faster pulse rate and smaller footprint size to provide an instantaneous three-dimensional measurement of ice sheets, land topography and vegetation structure. The goal is to provide a greater than 200 m wide swath with 5 to 10 m diameter laser footprints from a 400 km altitude orbit. To achieve these goals, we have to use more efficient laser transmitters and more sensitive detectors to allow simultaneous multi-channel measurement with a reasonable instrument size and electrical power requirement. The measurement efficiency, in terms of electrical energy needed per laser ranging measurement, needs to be improved by more than an order of magnitude. Several different approaches were considered, including the use of fiber lasers, shorter laser pulse widths, lower-noise analog detectors and photon counting detectors. The receiver sensitivity was further improved by averaging the results from a number of laser pulse measurements. Different laser pulse modulation formats, such as the pseudo-random noise code modulation used in the Global Positioning System (GPS), were investigated to give more flexibility in laser selection and to further improve the ranging performance. We have analyzed and compared measurement performance for several different approaches using receiver models that were validated with GLAS in-orbit measurement data. We compared the measurement performance of traditional high-power low-pulse-rate laser transmitters with that of low-energy high-pulse-rate laser transmitters.
For this work we considered laser characteristics representative of Microchip lasers
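The GPS-style pseudo-random noise ranging mentioned above reduces, in its simplest form, to circularly correlating the received sequence against a local replica of the code and reading the delay off the correlation peak. The 8-chip code below is a toy example, not an actual ranging code:

```python
def pn_range_delay(code, received):
    """Estimate the delay (in chips) of a pseudo-random noise ranging
    code: circularly correlate the received sequence against the local
    replica and return the lag with the highest correlation score.
    Codes are +/-1 chip sequences; noise-free toy illustration."""
    n = len(code)

    def corr(lag):
        return sum(code[i] * received[(i + lag) % n] for i in range(n))

    return max(range(n), key=corr)

code = [1, -1, 1, 1, -1, -1, 1, -1]
delay = 3
received = code[-delay:] + code[:-delay]   # echo delayed by 3 chips
print(pn_range_delay(code, received))      # 3
```

In practice the code is far longer, the received signal is noisy, and the correlation is done at sub-chip resolution, but the peak-finding principle is the same.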

  4. Study of hole pair condensation based on the SU(2) Slave-Boson approach to the t-J Hamiltonian: Temperature, momentum and doping dependences of spectral functions

    SciTech Connect

    Salk, S.H.S.; Lee, S.S.

    1999-11-01

Based on the U(1) and SU(2) slave-boson approaches to the t-J Hamiltonian, the authors evaluate the one-electron spectral functions for the hole-doped high-Tc cuprates for comparison with angle-resolved photoemission spectroscopy (ARPES) data. They find that the observed quasiparticle peak in the superconducting state is correlated with the hump which exists in the normal state. They find that the spectral weight of the quasiparticle peak increases as the doping rate increases, which is consistent with observation. As a consequence of the phase fluctuation effects of the spinon and holon pairing order parameters, the spectral weight of the peak predicted by the SU(2) theory is found to be smaller than that predicted by the U(1) mean-field theory.

  5. Transboundary aquifer mapping and management in Africa: a harmonised approach

    NASA Astrophysics Data System (ADS)

    Altchenko, Yvan; Villholth, Karen G.

    2013-11-01

Recent attention to transboundary aquifers (TBAs) in Africa reflects the growing importance of these resources for development in the continent. However, relatively little research on these aquifers and their best management strategies has been published. This report recapitulates progress on mapping and management frameworks for TBAs in Africa. The world map on transboundary aquifers presented at the 6th World Water Forum in 2012 identified 71 TBA systems in Africa. This report presents an updated African TBA map including 80 shared aquifers and aquifer systems superimposed on 63 international river basins. Furthermore, it proposes a new nomenclature for the mapping based on three sub-regions, reflecting the leading regional development communities. The map shows that TBAs represent approximately 42 % of the continental area and 30 % of the population. Finally, a brief review of current international law, specific bi- or multilateral treaties, and TBA management practice in Africa reveals few documented international conflicts over TBAs. The existing or upcoming international river and lake basin organisations offer a harmonised institutional base for TBA management, while alternative or supportive models involving the regional development communities are also required. The proposed map and geographical classification scheme for TBAs facilitates identification of options for joint institutional setups.

  6. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    PubMed

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is proposed here for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and understanding complex relationships. PMID:22325583

  7. Decoherence of spin-deformed bosonic model

    SciTech Connect

    Dehdashti, Sh.; Mahdifar, A.; Bagheri Harouni, M.; Roknizadeh, R.

    2013-07-15

    The decoherence rate and some parameters affecting it are investigated for the generalized spin-boson model. We consider the spin-bosonic model when the bosonic environment is modeled by the deformed harmonic oscillators. We show that the state of the environment approaches a non-linear coherent state. Then, we obtain the decoherence rate of a two-level system which is in contact with a deformed bosonic environment which is either in thermal equilibrium or in the ground state. By using some recent realization of f-deformed oscillators, we show that some physical parameters strongly affect the decoherence rate of a two-level system. -- Highlights: •Decoherence of the generalized spin-boson model is considered. •In this model the environment consists of f-oscillators. •Via the interaction, the state of the environment approaches non-linear coherent states. •Effective parameters on decoherence are considered.

  8. How Albot0 finds its way home: a novel approach to cognitive mapping using robots.

    PubMed

    Yeap, Wai K

    2011-10-01

Much of what we know about cognitive mapping comes from observing how biological agents behave in their physical environments, and several of these ideas were implemented on robots, imitating such a process. In this paper a novel approach to cognitive mapping is presented whereby robots are treated as a species of their own and their cognitive mapping is investigated. Such robots are referred to as Albots. The design of the first Albot, Albot0, is presented. Albot0 computes an imprecise map and employs a novel method to find its way home. Both the map and the return-home algorithm exhibited characteristics commonly found in biological agents. What we have learned from Albot0's cognitive mapping is discussed. One major lesson is that the spatiality in a cognitive map affords us rich and useful information, and this argues against recent suggestions that the notion of a cognitive map is not a useful one. PMID:25164506

  9. Wormholes and Goldstone bosons

    SciTech Connect

    Lee, K.

    1988-07-18

    The quantum theory of a complex scalar field coupled to gravity is considered. A formalism for the semiclassical approach in Euclidean time is developed and used to study wormhole physics. The conserved global charge plays an essential role. Wormhole physics turns on only after the symmetry is spontaneously broken. An effective self-interaction for Goldstone bosons due to wormholes and child universes is shown to be a cosine potential, whose vacuum energy will be reduced by the cosmic expansion. Some implications and questions are discussed.

  10. Eulerian Mapping Closure Approach for Probability Density Function of Concentration in Shear Flows

    NASA Technical Reports Server (NTRS)

    He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Eulerian mapping closure approach is developed for uncertainty propagation in computational fluid mechanics. The approach is used to study the Probability Density Function (PDF) of the concentration of species advected by a random shear flow. An analytical argument shows that fluctuations of the concentration field at one point in space are non-Gaussian and exhibit a stretched-exponential form. An Eulerian mapping approach provides an appropriate approximation to both the convection and diffusion terms and leads to a closed mapping equation. The results obtained describe the evolution of the initially Gaussian field and are in agreement with direct numerical simulations.

  11. Partnering with Youth to Map Their Neighborhood Environments: A Multi-Layered GIS Approach

    PubMed Central

    Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T.; Raleigh, Kevin; Miller-Francis, Jenni

    2014-01-01

    Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multi-layered approach for gaining local knowledge of neighborhood environments that engages youth as co-researchers and active knowledge producers. By integrating geographic information systems (GIS) with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youth. Youth report safety and a lack of recreational resources as inhibiting physical activity. Maps reflecting youth perceptions aid policy-makers in making place-based improvements for youth neighborhood environments. PMID:25423245

  12. Mapping Sustainability Initiatives across a Region: An Innovative Survey Approach

    ERIC Educational Resources Information Center

    Somerville, Margaret; Green, Monica

    2012-01-01

    The project of mapping sustainability initiatives across a region is part of a larger program of research about place and sustainability education for the Anthropocene, the new geological age of human-induced planetary changes (Zalasiewicz, Williams, Steffen, & Crutzen, 2010). The study investigated the location, nature and type of sustainability…

  13. The Facebook influence model: a concept mapping approach.

    PubMed

    Moreno, Megan A; Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M

    2013-07-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts. PMID:23621717

  14. Trans-ethnic study design approaches for fine-mapping.

    PubMed

    Asimit, Jennifer L; Hatzikotoulas, Konstantinos; McCarthy, Mark; Morris, Andrew P; Zeggini, Eleftheria

    2016-08-01

    Studies that traverse ancestrally diverse populations may increase power to detect novel loci and improve fine-mapping resolution of causal variants by leveraging linkage disequilibrium differences between ethnic groups. The inclusion of African ancestry samples may yield further improvements because of low linkage disequilibrium and high genetic heterogeneity. We investigate the fine-mapping resolution of trans-ethnic fixed-effects meta-analysis for five type II diabetes loci, under various settings of ancestral composition (European, East Asian, African), allelic heterogeneity, and causal variant minor allele frequency. In particular, three settings of ancestral composition were compared: (1) single ancestry (European), (2) moderate ancestral diversity (European and East Asian), and (3) high ancestral diversity (European, East Asian, and African). Our simulations suggest that the European/Asian and European ancestry-only meta-analyses consistently attain similar fine-mapping resolution. The inclusion of African ancestry samples in the meta-analysis leads to a marked improvement in fine-mapping resolution. PMID:26839038

  15. Raman mapping of oral buccal mucosa: a spectral histopathology approach

    NASA Astrophysics Data System (ADS)

    Behl, Isha; Kukreja, Lekha; Deshmukh, Atul; Singh, S. P.; Mamgain, Hitesh; Hole, Arti R.; Krishna, C. Murali

    2014-12-01

    Oral cancer is one of the most common cancers worldwide. One-fifth of the world's oral cancer subjects are from India and other South Asian countries. The present Raman mapping study was carried out to understand biochemical variations in normal and malignant oral buccal mucosa. Data were acquired using a WITec alpha 300R instrument from 10 normal and 10 tumor unstained tissue sections. Raman maps of normal sections could resolve the layers of the epithelium, i.e. basal, intermediate, and superficial. Inflammatory, tumor, and stromal regions are distinctly depicted on Raman maps of tumor sections. Mean and difference spectra of basal and inflammatory cells suggest an abundance of DNA and carotenoid features. Strong cytochrome bands are observed in the intermediate layers of normal sections and the stromal regions of tumors. Epithelium and stromal regions of normal sections are classified by principal component analysis. Classification among the cellular components of normal and tumor sections is also observed. Thus, the findings of the study further support the applicability of Raman mapping for providing molecular-level insights into normal and malignant conditions.

  16. Computer-Assisted Argument Mapping: A "Rationale" Approach

    ERIC Educational Resources Information Center

    Davies, W. Martin

    2009-01-01

    Computer-Assisted Argument Mapping (CAAM) is a new way of understanding arguments. While still embryonic in its development and application, CAAM is being used increasingly as a training and development tool in the professions and government. Inroads are also being made in its application within education. CAAM claims to be helpful in an…

  17. Job Seekers' Perceptions of Teleworking: A Cognitive Mapping Approach.

    ERIC Educational Resources Information Center

    Kerrin, Maire; Hone, Kate

    2001-01-01

    College students (n=40) and nonstudent job seekers (n=20) rated four dimensions of telework. Results were plotted in cognitive maps. Students preferred office work to telework, citing lack of social interaction. Nonstudents, slightly older and more likely to be parents, slightly preferred telework. Targeting recruitment to account for these…

  18. Cognitions of Expert Supervisors in Academe: A Concept Mapping Approach

    ERIC Educational Resources Information Center

    Kemer, Gülsah; Borders, L. DiAnne; Willse, John

    2014-01-01

    Eighteen expert supervisors reported their thoughts while preparing for, conducting, and evaluating their supervision sessions. Concept mapping (Kane & Trochim, 2007) yielded 195 cognitions classified into 25 cognitive categories organized into 5 supervision areas: conceptualization of supervision, supervisee assessment,…

  19. The Facebook Influence Model: A Concept Mapping Approach

    PubMed Central

    Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M.

    2013-01-01

    Abstract Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts. PMID:23621717

  20. A New Approach for Constructing the Concept Map

    ERIC Educational Resources Information Center

    Tseng, Shian-Shyong; Sue, Pei-Chi; Su, Jun-Ming; Weng, Jui-Feng; Tsai, Wen-Nung

    2007-01-01

    In recent years, e-learning system has become more and more popular and many adaptive learning environments have been proposed to offer learners customized courses in accordance with their aptitudes and learning results. For achieving the adaptive learning, a predefined concept map of a course is often used to provide adaptive learning guidance…

  1. Higgs boson at LHC: a diffractive opportunity

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2009-03-23

    An alternative process is presented for diffractive Higgs boson production in peripheral pp collisions, where the particles interact through the Double Pomeron Exchange. The event rate is computed as a central-rapidity distribution for Tevatron and LHC energies leading to a result around 0.6 pb, higher than the predictions from previous approaches. Therefore, this result arises as an enhanced signal for the detection of the Higgs boson in hadron colliders. The predictions for the Higgs boson photoproduction are compared to the ones obtained from a similar approach proposed by the Durham group, enabling an analysis of the future developments of its application to pp and AA collisions.

  2. Topographic Mapping of Mars: Approaching the Human Scale

    NASA Astrophysics Data System (ADS)

    Kirk, R. L.; Howington-Kraus, E.; Soderblom, L. A.; Archinal, B. A.

    2002-12-01

    In only three decades, topographic mapping of Mars has progressed from the planetary to the personal scale. The first crude contour maps of the early 1970s, based on Earth-based radar and atmospheric occultation and sounding data, revealed such continental-scale features as the Tharsis bulge. Stereoanalysis of Mariner 9 and Viking Orbiter images filled in some of the details, yielding by the late 1980s a global digital elevation model (DEM) interpolated from 1-km contours and containing systematic errors of many km. This DEM was superseded in the 1990s by data from the Mars Orbiter Laser Altimeter (MOLA), with an accuracy <10 m vertically and ~ 100 m horizontally. MOLA has provided the definitive global map of Mars for the foreseeable future; its most significant weakness is its sample spacing (300 m along-track, with many gaps >1 km and a few up to 10 km between orbit tracks). Stereoanalysis of images from the narrow-angle Mars Orbiter Camera (MOC-NA) can be used to produce local DEMs with a vertical precision similar to MOLA (e.g., ~ 3 m for 3 m/pixel images with ~ 10° convergence), horizontal resolution of 3 pixels (~ 10 m for 3 m images), and control to MOLA for absolute accuracy comparable to the latter. Over 150 MOC-NA stereopairs have been identified, and more continue to be obtained. We will describe our use of the USGS cartographic system ISIS with commercial photogrammetric software SOCET SET (© BAE Systems) to produce DEMs from such pairs. This and similar work by other groups brings topographic mapping close to the scale of features seen from the ground and processes active at the present day. We are also using high-resolution stereo DEMs (and, in some cases, altimetry) as the starting point for calibration of photoclinometry, which yields DEMs with a horizontal resolution of one pixel and a local vertical precision of a small fraction of a pixel. The techniques we describe are directly applicable to other Mars imagers both present (THEMIS) and

  3. The Higgs Boson.

    ERIC Educational Resources Information Center

    Veltman, Martinus J. G.

    1986-01-01

    Reports recent findings related to the particle Higgs boson and examines its possible contribution to the standard mode of elementary processes. Critically explores the strengths and uncertainties of the Higgs boson and proposed Higgs field. (ML)

  4. Using a Linkage Mapping Approach to Identify QTL for Day-Neutrality in the Octoploid Strawberry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A linkage mapping approach was used to identify quantitative trait loci (QTL) associated with day-neutrality in the commercial strawberry, Fragaria ×ananassa (Duch ex Rozier). Amplified Fragment Length Polymorphic (AFLP) markers were used to build a genetic map with a population of 127 lines develo...

  5. Concept Map Engineering: Methods and Tools Based on the Semantic Relation Approach

    ERIC Educational Resources Information Center

    Kim, Minkyu

    2013-01-01

    The purpose of this study is to develop a better understanding of technologies that use natural language as the basis for concept map construction. In particular, this study focuses on the semantic relation (SR) approach to drawing rich and authentic concept maps that reflect students' internal representations of a problem situation. The…

  6. A Time Sequence-Oriented Concept Map Approach to Developing Educational Computer Games for History Courses

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Yang, Kai-Hsiang; Chen, Jing-Hong

    2015-01-01

    Concept maps have been recognized as an effective tool for students to organize their knowledge; however, in history courses, it is important for students to learn and organize historical events according to the time of their occurrence. Therefore, in this study, a time sequence-oriented concept map approach is proposed for developing a game-based…

  7. A Knowledge Intensive Approach to Mapping Clinical Narrative to LOINC

    PubMed Central

    Fiszman, Marcelo; Shin, Dongwook; Sneiderman, Charles A.; Jin, Honglan; Rindflesch, Thomas C.

    2010-01-01

    Many natural language processing systems are being applied to clinical text, yet clinically useful results are obtained only by honing a system to a particular context. We suggest that concentration on the information needed for this processing is crucial and present a knowledge intensive methodology for mapping clinical text to LOINC. The system takes published case reports as input and maps vital signs and body measurements and reports of diagnostic procedures to fully specified LOINC codes. Three kinds of knowledge are exploited: textual, ontological, and pragmatic (including information about physiology and the clinical process). Evaluation on 4809 sentences yielded precision of 89% and recall of 93% (F-score 0.91). Our method could form the basis for a system to provide semi-automated help to human coders. PMID:21346974
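The F-score reported above follows from the stated precision and recall by the standard harmonic mean; a quick check (the 89%/93% figures come from the abstract, the formula is standard):

```python
def f_score(precision: float, recall: float) -> float:
    """F1 score: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Figures reported in the abstract: precision 89%, recall 93%.
print(round(f_score(0.89, 0.93), 2))  # → 0.91
```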

  8. Transfer map approach to the beam-beam interaction

    NASA Astrophysics Data System (ADS)

    Dragt, Alex J.

    1980-01-01

    A study is made of a model for the beam-beam interaction in ISABELLE using numerical methods and the recently developed method of Transfer Maps. It is found that analytical transfer map calculations account qualitatively for all the features of the model observed numerically, and show promise of giving quantitative agreement as well. They may also provide a kind of ''magnifying glass'' for examining numerical results in fine detail to ascertain the presence of small-scale stochastic motion that might lead to eventual particle loss. Preliminary evidence is presented to the effect that, within the model employed, the beam-beam interaction at its contemplated strengths should not lead to particle loss in ISABELLE.

  9. About measurements of Higgs boson parity

    NASA Astrophysics Data System (ADS)

    Ginzburg, I. F.

    2016-02-01

    Recently CMS and ATLAS announced that they had measured the Higgs boson parity. In this note we show that their approach can determine this parity only under the additional assumption that an extension of the Standard Model of some special type is realized in Nature. We show that the approach used gives no information about the Higgs boson parity under most other extensions of the Standard Model.

  10. Comparison of Sub-pixel Classification Approaches for Crop-specific Mapping

    EPA Science Inventory

    The Moderate Resolution Imaging Spectroradiometer (MODIS) data has been increasingly used for crop mapping and other agricultural applications. Phenology-based classification approaches using the NDVI (Normalized Difference Vegetation Index) 16-day composite (250 m) data product...

  11. An approach to reduce mapping errors in the production of landslide inventory maps

    NASA Astrophysics Data System (ADS)

    Santangelo, M.; Marchesini, I.; Bucci, F.; Cardinali, M.; Fiorucci, F.; Guzzetti, F.

    2015-07-01

    Landslide inventory maps (LIMs) show where landslides have occurred in an area, and provide information useful to different types of landslide studies, including susceptibility and hazard modelling and validation, risk assessment, erosion analyses, and evaluation of relationships between landslides and geological settings. Despite recent technological advancements, visual interpretation of aerial photographs (API) remains the most common method to prepare LIMs. In this work, we present a new semi-automatic procedure that exploits GIS technology for the digitization of landslide data obtained through API. To test the procedure, and to compare it to a consolidated landslide mapping method, we prepared two LIMs starting from the same set of landslide API data, which were digitized (a) manually, adopting a consolidated visual transfer method, and (b) adopting our new semi-automatic procedure. Results indicate that the new semi-automatic procedure is more efficient and results in a more accurate LIM. With the new procedure, the landslide positional error decreases with increasing landslide size, following a power law. We expect that our work will help establish standards for transferring landslide information from aerial photographs to a digital landslide map, contributing to the production of accurate landslide maps.

  12. An approach to reduce mapping errors in the production of landslide inventory maps

    NASA Astrophysics Data System (ADS)

    Santangelo, M.; Marchesini, I.; Bucci, F.; Cardinali, M.; Fiorucci, F.; Guzzetti, F.

    2015-09-01

    Landslide inventory maps (LIMs) show where landslides have occurred in an area, and provide information useful to different types of landslide studies, including susceptibility and hazard modelling and validation, risk assessment, erosion analyses, and evaluation of relationships between landslides and geological settings. Despite recent technological advancements, visual interpretation of aerial photographs (API) remains the most common method to prepare LIMs. In this work, we present a new semi-automatic procedure that makes use of GIS technology for the digitization of landslide data obtained through API. To test the procedure, and to compare it to a consolidated landslide mapping method, we prepared two LIMs starting from the same set of landslide API data, which were digitized (a) manually, adopting a consolidated visual transfer method, and (b) adopting our new semi-automatic procedure. Results indicate that the new semi-automatic procedure (a) increases the interpreter's overall efficiency by a factor of 2, and (b) significantly reduces the subjectivity introduced by the visual (manual) transfer of the landslide information to the digital database, resulting in more accurate LIMs. With the new procedure, the landslide positional error decreases with increasing landslide size, following a power law. We expect that our work will help establish standards for transferring landslide information from aerial photographs to a digital landslide map, contributing to the production of accurate landslide maps.
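A power-law relation between positional error and landslide size, like the one reported above, can be recovered from mapped data by a least-squares fit in log-log space; a minimal sketch on synthetic data (the areas, errors, and exponent below are hypothetical, not the paper's values):

```python
import numpy as np

rng = np.random.default_rng(0)
area = np.logspace(2, 6, 50)  # hypothetical landslide areas, m^2

# Synthetic errors following error = a * area**b with multiplicative noise (b < 0,
# i.e. error decreases with size, as the abstract reports).
error = 50.0 * area**-0.3 * rng.lognormal(0.0, 0.05, area.shape)

# Fit log(error) = b*log(area) + log(a) by ordinary least squares.
b, log_a = np.polyfit(np.log(area), np.log(error), 1)
print(round(b, 2))  # exponent close to the -0.3 used to generate the data
```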

  13. Comparative Performance Analysis of a Hyper-Temporal Ndvi Analysis Approach and a Landscape-Ecological Mapping Approach

    NASA Astrophysics Data System (ADS)

    Ali, A.; de Bie, C. A. J. M.; Scarrott, R. G.; Ha, N. T. T.; Skidmore, A. K.

    2012-07-01

    Both agricultural area expansion and intensification are necessary to cope with the growing demand for food, and the growing threat of food insecurity which is rapidly engulfing poor and under-privileged sections of the global population. It is therefore of paramount importance to be able to estimate crop area and spatial distribution accurately. Remote sensing has become a valuable tool for estimating and mapping cropland areas, useful in food security monitoring. This work contributes to addressing this broad issue, focusing on a comparative performance analysis of two mapping approaches: (i) a hyper-temporal Normalized Difference Vegetation Index (NDVI) analysis approach and (ii) a landscape-ecological approach. The hyper-temporal NDVI analysis approach utilized SPOT 10-day NDVI imagery from April 1998 to December 2008, whilst the landscape-ecological approach used multi-temporal Landsat-7 ETM+ imagery acquired intermittently between 1992 and 2002. Pixels in the time-series NDVI dataset were clustered using an ISODATA clustering algorithm adapted to determine the optimal number of pixel clusters for successfully generalizing hyper-temporal datasets. Clusters were then characterized with crop cycle and flooding information to produce an NDVI unit map of rice classes with flood regime and NDVI profile information. A landscape-ecological map was generated using a combination of digitized homogeneous map units in the Landsat-7 ETM+ imagery, a 2005 land use map of the Mekong delta, and supplementary datasets on the region's terrain, geomorphology and flooding depths. The output maps were validated using reported crop statistics, and regression analyses were used to ascertain the relationship between land use areas estimated from the maps and those reported in district crop statistics. The regression analysis showed that the hyper-temporal NDVI analysis approach explained 74% and 76% of the variability in reported crop statistics in two rice crop and three
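The "percentage of variability explained" figures quoted above are the R² of an ordinary least-squares regression of map-derived areas against reported statistics; a minimal sketch on placeholder numbers (the areas below are invented for illustration, not the study's data):

```python
import numpy as np

# Hypothetical paired observations: district crop statistics vs. map estimates.
reported = np.array([120.0, 85.0, 60.0, 150.0, 95.0, 70.0])
estimated = np.array([111.0, 73.0, 64.0, 139.0, 92.0, 58.0])

# Ordinary least-squares line and coefficient of determination R^2.
slope, intercept = np.polyfit(reported, estimated, 1)
residuals = estimated - (slope * reported + intercept)
r2 = 1.0 - residuals.var() / estimated.var()
print(round(r2, 2))  # fraction of variability "explained" by the regression
```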

  14. Ray mapping approach for the efficient design of continuous freeform surfaces.

    PubMed

    Bösel, Christoph; Gross, Herbert

    2016-06-27

    The efficient design of continuous freeform surfaces, which map a given light source to an arbitrary target illumination pattern, remains a challenging problem and is considered here for collimated input beams. A common approach is ray-mapping methods, where first a ray mapping between the source and the irradiance distribution on the target plane is calculated and in a subsequent step the surface is constructed. The challenging aspect of this approach is to find an integrable mapping ensuring a continuous surface. Based on the law of reflection/refraction and an integrability condition, we derive a general condition on the surface and ray mapping for a collimated input beam. It is shown that in a small-angle approximation a proper mapping can be calculated via optimal mass transport, a mathematical framework for the calculation of a mapping between two positive density functions. We show that the surface can be constructed by solving a linear advection equation with appropriate boundary conditions. The results imply that the optimal mass transport mapping is approximately integrable over a wide range of distances between the freeform surface and the target plane, and they offer an efficient way to construct the surface by solving standard integrals. The efficiency is demonstrated by applying the method to two challenging design examples, which shows the ability of the presented approach to handle target illumination patterns with steep irradiance gradients and numerous gray levels. PMID:27410583
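In one dimension, optimal mass transport between two densities reduces to matching cumulative distributions, T = G⁻¹∘F, and the resulting map is monotone, hence trivially integrable; a small numerical sketch of this special case (the Gaussian source and uniform target below are illustrative choices, not the paper's design examples):

```python
import numpy as np

x = np.linspace(-3.0, 3.0, 601)
source = np.exp(-x**2 / 2.0)  # illustrative source irradiance (Gaussian)
target = np.ones_like(x)      # illustrative target irradiance (uniform)

# Cumulative distributions, each normalized to 1.
F = np.cumsum(source); F /= F[-1]
G = np.cumsum(target); G /= G[-1]

# 1-D optimal transport map T(x) = G^{-1}(F(x)), via interpolation of G's inverse.
T = np.interp(F, G, x)

# Monotonicity is what guarantees integrability in this 1-D setting.
assert np.all(np.diff(T) >= 0)
```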

  15. Stationkeeping Approach for the Microwave Anisotropy Probe (MAP)

    NASA Technical Reports Server (NTRS)

    Rohrbaugh, Dave; Schiff, Conrad

    2002-01-01

    The Microwave Anisotropy Probe was successfully launched on June 30, 2001 and placed into a Lissajous orbit about the L2 Sun-Earth-Moon libration point. However, the L2 libration point is unstable, which necessitates occasional stationkeeping maneuvers in order to maintain the spacecraft's Lissajous orbit. Analyses were performed in order to develop a feasible L2 stationkeeping strategy for the MAP mission. The resulting strategy meets the allotted fuel budget, allowing for enough fuel to handle additional fuel taxes, while meeting the attitude requirements for the maneuvers. Results from the first two stationkeeping maneuvers are included.

  16. Approaches to digital snow mapping with LANDSAT-1 data

    NASA Technical Reports Server (NTRS)

    Itten, K. I.

    1975-01-01

    Applying the same LANDSAT-1 data to three substantially different image processing systems, a snow mapping task was performed. LARSYS Ver.3, STANSORT-2, and General Electric Image-100 all performed the jobs of detecting the snowline in forested mountainous terrain and determining the snow-covered area. While the control and accuracy achieved with LARSYS are remarkable, the time and effort required to perform the processing favor the STANSORT and Image-100 systems. The experiences and results demonstrate the need for a fast interactive system for operational snow mapping with multispectral satellite data.

  17. Aerial Terrain Mapping Using Unmanned Aerial Vehicle Approach

    NASA Astrophysics Data System (ADS)

    Tahar, K. N.

    2012-08-01

    This paper looks into the latest achievements of low-cost Unmanned Aerial Vehicle (UAV) technology in its capacity to map semi-developed areas. The objectives of this study are to establish a new methodology, or a new algorithm, for image registration during the interior orientation process, and to determine the accuracy of the photogrammetric products derived from UAV images. Recently, UAV technology has been used in several applications such as mapping, agriculture and surveillance. The aim of this study is to scrutinize the usage of UAVs to map semi-developed areas. The performance of the low-cost UAV mapping study was established on a study area with two image processing methods so that the results could be compared. A non-metric camera was attached to the bottom of the UAV and used to capture images at both sites after going through several calibration steps. Calibration was carried out to determine the focal length, principal distance, radial lens distortion, tangential lens distortion and affinity. A new method of image registration for a non-metric camera is discussed in this paper as part of the new methodology of this study. This method uses the UAV's onboard Global Positioning System (GPS) to register the UAV image for the interior orientation process. Check points were established randomly at both sites using rapid static GPS. Ground control points are used for the exterior orientation process, and check points are used for the accuracy assessment of the photogrammetric products. All acquired images were processed in photogrammetric software. Two methods of image registration were applied in this study, namely GPS onboard registration and ground control point registration. Both registrations were processed using photogrammetric software and the results are discussed. Two products were generated in this study: the digital orthophoto and the digital terrain model. These results were analyzed by using the root mean square error.

  18. High-resolution habitat mapping on mud fields: new approach to quantitative mapping of Ocean quahog.

    PubMed

    Isachenko, Artem; Gubanova, Yana; Tzetlin, Alexander; Mokievsky, Vadim

    2014-12-01

    During 2009-2012, stocks of the bivalve Arctica islandica (Linnaeus, 1767) (Ocean quahog) in Kandalaksha Bay (the White Sea) were assessed using a side-scan sonar, grab sampling and underwater photo imaging. Structurally uniform localities were highlighted on the basis of the side-scan signal. Each type of signal reflects a combination of sediment type, microtopography and structural characteristics of the benthic community. The distribution of A. islandica was the predominant factor in determining community structure. The seabed attributes considered most significant were defined for each substrate type. Relationships between the sonar signal and sediment type were used for landscape mapping based on sonar data. Community characteristics at known localities were reliably interpolated to the area of survey using statistical processing of the geophysical data. A method of integrated sonar and sampling data interpretation was developed for high-resolution mapping of A. islandica by biomass groups, benthic faunal groups and associated habitats. PMID:24954748

  19. An integrated approach for automated cover-type mapping of large inaccessible areas in Alaska

    USGS Publications Warehouse

    Fleming, Michael D.

    1988-01-01

    The lack of any detailed cover-type maps in the state necessitated that a rapid and accurate approach be employed to develop maps for 329 million acres of Alaska within a seven-year period. This goal has been addressed by using an integrated approach to computer-aided analysis which combines efficient use of field data with the only consistent statewide spatial data sets available: Landsat multispectral scanner data, digital elevation data derived from 1:250 000-scale maps, and 1:60 000-scale color-infrared aerial photographs.

  20. A FISH approach for mapping the human genome using Bacterial Artificial Chromosomes (BACs)

    SciTech Connect

    Hubert, R.S.; Chen, X.N.; Mitchell, S.

    1994-09-01

    As the Human Genome Project progresses, large insert cloning vectors such as BACs, P1, and P1 Artificial Chromosomes (PACs) will be required to complement the YAC mapping efforts. The value of the BAC vector for physical mapping lies in the stability of the inserts, the lack of chimerism, the length of inserts (up to 300 kb), the ability to obtain large amounts of pure clone DNA and the ease of BAC manipulation. These features helped us design two approaches for generating physical mapping reagents for human genetic studies. The first approach is a whole genome strategy in which randomly selected BACs are mapped, using FISH, to specific chromosomal bands. To date, 700 BACs have been mapped to single chromosome bands at a resolution of 2-5 Mb in addition to BACs mapped to 14 different centromeres. These BACs represent more than 90 Mb of the genome and include >70% of all human chromosome bands at the 350-band level. These data revealed that >97% of the BACs were non-chimeric and have a genomic distribution covering most gaps in the existing YAC map with excellent coverage of gene-rich regions. In the second approach, we used YACs to identify BACs on chromosome 21. A 1.5 Mb contig between D21S339 and D21S220 nears completion within the Down syndrome congenital heart disease (DS-CHD) region. Seventeen BACs ranging in size from 80 kb to 240 kb were ordered using 14 STSs with FISH confirmation. We have also used 40 YACs spanning 21q to identify, on average, >1 BAC/Mb to provide molecular cytogenetic reagents and anchor points for further mapping. The contig generated on chromosome 21 will be helpful in isolating the genes for DS-CHD. The physical mapping reagents generated using the whole genome approach will provide cytogenetic markers and mapped genomic fragments that will facilitate positional cloning efforts and the identification of genes within most chromosomal bands.

  1. Complementarity between nonstandard Higgs boson searches and precision Higgs boson measurements in the MSSM

    SciTech Connect

    Carena, Marcela; Haber, Howard E.; Low, Ian; Shah, Nausheen R.; Wagner, Carlos E. M.

    2015-02-03

    Precision measurements of the Higgs boson properties at the LHC provide relevant constraints on possible weak-scale extensions of the Standard Model (SM). In the context of the minimal supersymmetric Standard Model (MSSM) these constraints seem to suggest that all the additional, non-SM-like Higgs bosons should be heavy, with masses larger than about 400 GeV. This article shows that such results do not hold when the theory approaches the conditions for “alignment independent of decoupling,” where the lightest CP-even Higgs boson has SM-like tree-level couplings to fermions and gauge bosons, independently of the nonstandard Higgs boson masses. In addition, the combination of current bounds from direct Higgs boson searches at the LHC, along with the alignment conditions, have a significant impact on the allowed MSSM parameter space yielding light additional Higgs bosons. In particular, after ensuring the correct mass for the lightest CP-even Higgs boson, we find that precision measurements and direct searches are complementary and may soon be able to probe the region of non-SM-like Higgs boson with masses below the top quark pair mass threshold of 350 GeV and low to moderate values of tanβ.

  3. Bicomponent Trend Maps: A Multivariate Approach to Visualizing Geographic Time Series

    PubMed Central

    Schroeder, Jonathan P.

    2012-01-01

    The most straightforward approaches to temporal mapping cannot effectively illustrate all potentially significant aspects of spatio-temporal patterns across many regions and times. This paper introduces an alternative approach, bicomponent trend mapping, which employs a combination of principal component analysis and bivariate choropleth mapping to illustrate two distinct dimensions of long-term trend variations. The approach also employs a bicomponent trend matrix, a graphic that illustrates an array of typical trend types corresponding to different combinations of scores on two principal components. This matrix is useful not only as a legend for bicomponent trend maps but also as a general means of visualizing principal components. To demonstrate and assess the new approach, the paper focuses on the task of illustrating population trends from 1950 to 2000 in census tracts throughout major U.S. urban cores. In a single static display, bicomponent trend mapping is not able to depict as wide a variety of trend properties as some other multivariate mapping approaches, but it can make relationships among trend classes easier to interpret, and it offers some unique flexibility in classification that could be particularly useful in an interactive data exploration environment. PMID:23504193

  4. Engineering a robotic approach to mapping exposed volcanic fissures

    NASA Astrophysics Data System (ADS)

    Parcheta, C. E.; Parness, A.; Mitchell, K. L.

    2014-12-01

Field geology provides a framework for advanced computer models and theoretical calculations of volcanic systems. Some field terrains, though, are poorly preserved or accessible, making documentation, quantification, and investigation impossible. Over 200 volcanologists at the 2012 Kona Chapman Conference on volcanology agreed that an important step forward in the field over the next 100 years should address the realistic size and shape of volcanic conduits. The 1969 Mauna Ulu eruption of Kīlauea provides a unique opportunity to document volcanic fissure conduits, giving us an ideal location to begin addressing this topic and provide data on these geometries. Exposed fissures can be mapped with robotics using machine vision. In order to test the hypothesis that fissures have irregularities with depth that will influence their fluid dynamical behavior, we must first map the fissure vents and shallow conduit to deci- or centimeter scale. We have designed, constructed, and field-tested the first version of a robotic device that will image an exposed volcanic fissure in three dimensions. The design phase included three steps: 1) create the payload harness and protective shell to prevent damage to the electronics and robot, 2) construct a circuit board to let the electronics communicate with a surface-based computer, and 3) prototype wheel shapes that can handle a variety of volcanic rock textures. The robot's mechanical parts were built using 3D printing, milling, casting and laser cutting techniques, and the electronics were assembled from off-the-shelf components. The testing phase took place at Mauna Ulu, Kīlauea, Hawai'i, from May 5-9, 2014. Many valuable design lessons were learned during the week, and the first-ever 3D maps from inside a volcanic fissure were successfully collected. Three vents had between 25% and 95% of their internal surfaces imaged. A fourth location, a non-eruptive crack (possibly a fault line), had two transects imaging the textures

  5. Algorithms for SU(n) boson realizations and D -functions

    NASA Astrophysics Data System (ADS)

    Dhand, Ish; Sanders, Barry C.; de Guise, Hubert

    2015-11-01

Boson realizations map operators and states of groups to transformations and states of bosonic systems. We devise a graph-theoretic algorithm to construct the boson realizations of the canonical SU(n) basis states, which reduce the canonical subgroup chain, for arbitrary n. The boson realizations are employed to construct D-functions, which are the matrix elements of arbitrary irreducible representations of SU(n), in the canonical basis. We demonstrate that our D-function algorithm offers a significant advantage over the two competing procedures, namely, factorization and exponentiation.

  6. Endoscopic fluorescence mapping of the left atrium: A novel experimental approach for high resolution endocardial mapping in the intact heart

    PubMed Central

    Kalifa, Jérôme; Klos, Matthew; Zlochiver, Sharon; Mironov, Sergey; Tanaka, Kazuhiko; Ulahannan, Netha; Yamazaki, Masatoshi; Jalife, José; Berenfeld, Omer

    2007-01-01

    Background Despite availability of several mapping technologies to investigate the electrophysiological mechanisms of atrial fibrillation (AF), an experimental tool enabling high resolution mapping of electrical impulse on the endocardial surface of the left atrium is still lacking. Objective To present a new optical mapping approach implementing a steerable cardio-endoscope in isolated hearts. Methods The system consists of a direct or side-view endoscope coupled to a 532 nm excitation Laser for illumination, and to a CCD camera for imaging of potentiometric dye fluorescence (DI-4-ANEPPS, 80×80 pixels, 200–800 frames/sec). The cardio-endoscope was aimed successively at diverse posterior left atrial (PLA) locations to obtain high resolution movies of electrical wave propagation, as well as detailed endocardial anatomical features, in the presence and the absence of atrial stretch. Results We present several examples of high resolution endoscopic PLA recordings of wave propagation patterns during both sinus rhythm and AF with signal-to-noise ratio similar to conventional optical mapping systems. We demonstrate the endoscope’s ability to visualize highly organized AF sources (rotors) at specific locations on the PLA and PLA-pulmonary vein junctions, and present video images of waves emanating from such sources as they propagate into pectinate muscles in the LA appendage. In particular, we demonstrate this approach to be ideally suited for studying the effects of atrial stretch on AF dynamics. Conclusions In isolated hearts, cardio-endoscopic optical mapping of electrical activity should enable comprehensive evaluation of atrial fibrillatory activity in the PLA, of the role of the local anatomy on AF dynamics and of the efficacy of pharmacological and ablative interventions. PMID:17599678

  7. Slave boson theories of correlated electron systems

    SciTech Connect

    Woelfle, P.

    1995-05-01

    Slave boson theories of various models of correlated fermions are critically reviewed and several new results are presented. In the example of the Anderson impurity model the limitations of slave boson mean field theory are discussed. Self-consistent conserving approximations are compared with results obtained from the numerical renormalization group. The gauge field theory of the t-J-model is considered in the quasistatic approximation. It is shown that weak localization effects can give valuable information on the existence of gauge fields. Applications of the slave-boson approach due to Kotliar and Ruckenstein to the Hubbard model are also discussed.

  8. Mind Map Marketing: A Creative Approach in Developing Marketing Skills

    ERIC Educational Resources Information Center

    Eriksson, Lars Torsten; Hauer, Amie M.

    2004-01-01

    In this conceptual article, the authors describe an alternative course structure that joins learning key marketing concepts to creative problem solving. The authors describe an approach using a convergent-divergent-convergent (CDC) process: key concepts are first derived from case material to be organized in a marketing matrix, which is then used…

  9. Conjecture Mapping: An Approach to Systematic Educational Design Research

    ERIC Educational Resources Information Center

    Sandoval, William

    2014-01-01

    Design research is strongly associated with the learning sciences community, and in the 2 decades since its conception it has become broadly accepted. Yet within and without the learning sciences there remains confusion about how to do design research, with most scholarship on the approach describing what it is rather than how to do it. This…

  10. Mapping.

    ERIC Educational Resources Information Center

    Kinney, Douglas M.; McIntosh, Willard L.

    1979-01-01

    The area of geological mapping in the United States in 1978 increased greatly over that reported in 1977; state geological maps were added for California, Idaho, Nevada, and Alaska last year. (Author/BB)

  11. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    ERIC Educational Resources Information Center

    Dhindsa, Harkirat S.; Makarimi-Kasim; Anderson, O. Roger

    2011-01-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample…

  12. Thermofield-based chain-mapping approach for open quantum systems

    NASA Astrophysics Data System (ADS)

    de Vega, Inés; Bañuls, Mari-Carmen

    2015-11-01

    We consider a thermofield approach to analyze the evolution of an open quantum system coupled to an environment at finite temperature. In this approach, the finite-temperature environment is exactly mapped onto two virtual environments at zero temperature. These two environments are then unitarily transformed into two different chains of oscillators, leading to a one-dimensional structure that can be numerically studied using tensor network techniques. Compared to previous approaches using a single chain mapping, our strategy offers the advantage of an exact description of the initial state at arbitrary temperatures, which results in a gain in computational efficiency and a reduced truncation error.
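The chain mapping described above is usually carried out by tridiagonalizing the bath coupling matrix (equivalently, via orthogonal polynomials of the spectral density), turning a star-coupled set of oscillators into a nearest-neighbour chain. A minimal Lanczos-based sketch, assuming a discretized bath with mode frequencies `freqs` and system-mode couplings `couplings` (illustrative names, not taken from the paper):

```python
import numpy as np

def star_to_chain(freqs, couplings, n_chain):
    """Map a star-coupled bath (diagonal mode frequencies, one coupling per
    mode) onto a 1D chain of oscillators via Lanczos tridiagonalization.
    Returns chain site energies (alphas) and nearest-neighbour couplings
    (betas); the system couples to chain site 0 with strength norm(couplings)."""
    H = np.diag(np.asarray(freqs, dtype=float))
    v = np.asarray(couplings, dtype=float)
    v = v / np.linalg.norm(v)          # Lanczos start vector ~ couplings
    alphas, betas = [], []
    v_prev = np.zeros_like(v)
    beta = 0.0
    for _ in range(n_chain):
        w = H @ v
        alpha = v @ w                   # chain site energy
        alphas.append(alpha)
        w = w - alpha * v - beta * v_prev
        beta = np.linalg.norm(w)        # coupling to the next chain site
        betas.append(beta)
        if beta < 1e-14:                # bath exhausted
            break
        v_prev, v = v, w / beta
    return np.array(alphas), np.array(betas[:-1])
```

One useful sanity check: the eigenvalues of the tridiagonal chain Hamiltonian built from `alphas` and `betas` are Ritz values of the original bath and therefore stay inside the original frequency range, which is what makes the chain a faithful one-dimensional representation for tensor-network simulation.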

  13. Three-site Bose-Hubbard model subject to atom losses: Boson-pair dissipation channel and failure of the mean-field approach

    SciTech Connect

    Shchesnovich, V. S.; Mogilevtsev, D. S.

    2010-10-15

We employ the perturbation series expansion for derivation of the reduced master equations for the three-site Bose-Hubbard model subject to strong atom losses from the central site. The model describes a condensate trapped in a triple-well potential subject to externally controlled removal of atoms. We find that the π-phase state of the coherent superposition between the side wells decays via two dissipation channels, the single-boson channel (similar to the externally applied dissipation) and the boson-pair channel. The quantum derivation is compared to the classical adiabatic elimination within the mean-field approximation. We find that the boson-pair dissipation channel is not captured by the mean-field model, whereas the single-boson channel is described by it. Moreover, there is a matching condition between the zero-point energy bias of the side wells and the nonlinear interaction parameter which separates the regions where either the single-boson or the boson-pair dissipation channel dominates. Our results indicate that the M-site Bose-Hubbard models, for M>2, subject to atom losses may require an analysis which goes beyond the usual mean-field approximation for correct description of their dissipative features. This is an important result in view of the recent experimental works on the single-site addressability of condensates trapped in optical lattices.
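For reference, a three-site Bose-Hubbard model with loss from the central site is commonly written in the following generic form (conventions and symbols here are illustrative, not necessarily those used by the authors):

```latex
H = \sum_{j=1}^{3}\left[\varepsilon_j \hat{n}_j
      + \frac{U}{2}\,\hat{n}_j(\hat{n}_j-1)\right]
    - J\left(\hat{a}_1^\dagger \hat{a}_2 + \hat{a}_2^\dagger \hat{a}_3
      + \mathrm{h.c.}\right),
\qquad
\dot{\rho} = -i[H,\rho]
    + \Gamma\left(\hat{a}_2\,\rho\,\hat{a}_2^\dagger
      - \tfrac{1}{2}\{\hat{n}_2,\rho\}\right),
```

where ε_j are the well biases, U the nonlinear interaction, J the tunneling amplitude, and Γ the externally controlled loss rate from the central well; eliminating the strongly damped central site then yields effective dissipation channels for the side-well superposition of the kind discussed in the abstract.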

  14. Comparison of four Vulnerability Approaches to Mapping of Shallow Aquifers of Eastern Dahomey Basin of Nigeria

    NASA Astrophysics Data System (ADS)

    Oke, Saheed; Vermeulen, Danie

    2016-04-01

This study presents the outcome of vulnerability mapping of the shallow aquifers of the eastern Dahomey Basin of southwestern Nigeria. The basin is a coastal transboundary aquifer extending from eastern Ghana to southwestern Nigeria. The study aimed to identify the most suitable method for mapping the basin's shallow aquifers by comparing the results of four different vulnerability approaches. This is important because vulnerability assessment parameters, approaches and results can differ markedly for the same aquifer. The methodology involves vulnerability techniques that assess the intrinsic properties of the aquifer: two methods from the travel-time approach (AVI and RTt) and two from the index approach (DRASTIC and PI) were employed in mapping the basin. The results show the AVI has the fewest mapping parameters, with 75% of the basin classified as very high vulnerability and 25% as high vulnerability. The DRASTIC mapping shows 18% as low vulnerability, 61% as moderate vulnerability and 21% as high vulnerability. Mapping with the PI method, which has the most parameters, shows 66% of the aquifer as low vulnerability and 34% as moderate vulnerability. The RTt method shows 18% as very high vulnerability, 8% as high vulnerability, 64% as moderate vulnerability and 10% as very low vulnerability. Further analysis involving correlation plots shows the highest correlation, 62%, between the RTt and DRASTIC methods, higher than between any other pair of methods. The analysis shows that the PI method is the mildest of all the vulnerability methods, while the AVI method is the strictest of the methods considered in this mapping. Using four different approaches to map the shallow aquifers of the eastern Dahomey Basin will guide the recommendation of the best vulnerability method for future assessments of this and other shallow aquifers.
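Comparisons of vulnerability methods like those above reduce to two operations on the index rasters: tabulating the percentage of cells in each vulnerability class, and correlating pairs of maps cell by cell. A minimal sketch (array names and class breaks are illustrative, not from the study):

```python
import numpy as np

def class_percentages(index_map, breaks, labels):
    """Tabulate the percentage of raster cells falling into each
    vulnerability class, given class break values and class labels."""
    classes = np.digitize(index_map.ravel(), breaks)
    return {lab: 100.0 * np.mean(classes == k) for k, lab in enumerate(labels)}

def map_correlation(map_a, map_b):
    """Cell-by-cell Pearson correlation between two vulnerability rasters."""
    return np.corrcoef(map_a.ravel(), map_b.ravel())[0, 1]
```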
Keywords: Aquifer vulnerability, Dahomey Basin

  15. Making of 3D extinction maps from population synthesis approach

    NASA Astrophysics Data System (ADS)

    Robin, A. C.; Marshall, D.; Reylé, C.; Montillaud, J.

Interstellar extinction is critical when studying stellar populations and Galactic structure. By taking into account all the information on stellar populations along a given line of sight, the population synthesis approach is an efficient tool to derive the distribution of extinction. This approach has been shown to give reliable estimates in regions where the stars are numerous enough and well distributed in distance. The method has some limits due to its dependence on model hypotheses. As with other methods, biases can appear close to the limiting magnitude and to the maximum distance of detection, because the detection limits of the stars depend on the extinction itself. We present the successes of this method as well as its limitations, and compare with results of other methods.

  16. Geomatics Approach for Assessment of respiratory disease Mapping

    NASA Astrophysics Data System (ADS)

    Pandey, M.; Singh, V.; Vaishya, R. C.

    2014-11-01

Air quality is an important subject at the present time because air is a prime resource for the sustenance of life, especially human health. Large volumes of ambient air quality data are now generated, and technological advances make it possible to characterize how good or bad the air environment is. This study supplies a reliable method for assessing the Air Quality Index (AQI) using fuzzy logic. The fuzzy logic model is designed to predict an AQI that reports monthly air quality. With the aid of the air quality index, we can evaluate the suitability of an area's environment with regard to human health. To appraise human health status in an industrial area, information from a health survey questionnaire is used to obtain a respiratory risk map by applying IDW interpolation and the Getis statistical technique. The Getis statistic identifies spatial clustering patterns, such as hot spots (high risk) and cold spots, over the entire study area with statistical significance.
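The IDW step mentioned above interpolates the point-based survey scores onto a continuous risk surface: each target cell takes a weighted mean of the sample values, with weights decaying as an inverse power of distance. A minimal sketch (the power parameter and array layout are illustrative choices, not from the paper):

```python
import numpy as np

def idw_interpolate(sample_xy, sample_vals, target_xy, power=2.0):
    """Inverse distance weighting: each target point is a weighted mean of
    the sample values, weighted by 1/d**power for distance d to each sample."""
    d = np.linalg.norm(target_xy[:, None, :] - sample_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)            # avoid division by zero at sample sites
    w = 1.0 / d**power
    return (w * sample_vals).sum(axis=1) / w.sum(axis=1)
```

A target point midway between two samples receives the average of their values, while a point coinciding with a sample essentially reproduces that sample's value.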

  17. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    NASA Astrophysics Data System (ADS)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

For many years, the scientific community has been concerned with increasing the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data generated every day by remote sensors raises further challenges to be overcome. In this work, a tool within the scope of InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
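The MapReduce pattern the tool relies on can be illustrated with a toy supervised classifier: each mapper scores its own data partition with a pre-trained model, and the reducer merges the partial results. A plain-Python sketch of the idea (the linear model and partitioning are placeholders, not the ICP: Data Mining Package API):

```python
from functools import reduce

def map_phase(partition, weights, bias):
    """Mapper: classify one data partition with a pre-trained linear model."""
    return [(rid, 1 if sum(w * x for w, x in zip(weights, feats)) + bias > 0 else 0)
            for rid, feats in partition]

def reduce_phase(left, right):
    """Reducer: merge partial classification lists."""
    return left + right

def classify_distributed(partitions, weights, bias):
    # In Hadoop the map calls run in parallel across the cluster;
    # here they run sequentially for illustration.
    partials = [map_phase(p, weights, bias) for p in partitions]
    return dict(reduce(reduce_phase, partials, []))
```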

  18. A Digital Soil Mapping approach using neural networks for peat depth mapping in Scotland

    NASA Astrophysics Data System (ADS)

    Aitkenhead, Matt; Saunders, Matt; Yeluripati, Jagadeesh

    2014-05-01

Spatially explicit and accurate peat depth estimates are required for carbon stock assessment, carbon management strategies, hydrological modelling, ecosystem service assessment and land management (e.g. wind farms). In Scotland, a number of surveys have taken place over the years that have produced data on peat depth, and while many of these surveys have focussed on specific locations or peat bogs, a substantial proportion of the data produced is relatively old and has not been digitised, thus limiting its visibility and utility in new research activities, policy development and land management decision making. Here we describe ongoing work whose key objective is to integrate multiple peat survey datasets with existing spatial datasets of climate, vegetation, topography and geology. The dataset produced is generated from a small number of isolated surveys, and while it is not representative of all of Scotland's soils, it is sufficient to demonstrate the conceptual basis for model development. It has been used to develop a neural network model of peat depth that has been applied across Scotland's peat bogs at 100 m resolution. The resulting map gives an early indication of the variation of peat depth across the country, and allows us to produce an estimate of mean peat bog depth nationally. This estimate will improve with additional data and will contribute to improving our ability to undertake activities that depend on this kind of information. We have identified data gaps that need to be addressed in order to improve this model, in particular peat depth survey data from a wider range of peat types across the country, especially blanket bog and upland peat areas. Ongoing work to identify and integrate additional peat bog depth data is described. We also identify potential uses for the existing maps of peat depth, and areas of future model development.
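A neural network model of the kind described is, at its simplest, a small regression network over per-cell covariates. A minimal one-hidden-layer sketch trained by gradient descent on synthetic data (the architecture, covariate names and target relationship are illustrative, not the model used in the Scottish study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-cell covariates: slope, elevation, wetness index
X = rng.normal(size=(200, 3))
true_depth = 2.0 + 0.8 * X[:, 2] - 0.5 * X[:, 0]   # wetter, flatter -> deeper peat
y = true_depth + 0.05 * rng.normal(size=200)

# One hidden layer of tanh units, trained by full-batch gradient descent
W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, (h @ W2 + b2).ravel()

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)          # MSE before training
for _ in range(2000):
    h, pred = forward(X)
    err = (pred - y)[:, None] / len(y)     # dL/dpred for the MSE loss
    dh = (err @ W2.T) * (1 - h ** 2)       # backprop through tanh (old W2)
    W2 -= lr * (h.T @ err); b2 -= lr * err.sum(0)
    W1 -= lr * (X.T @ dh);  b1 -= lr * dh.sum(0)
_, pred = forward(X)
loss = np.mean((pred - y) ** 2)            # MSE after training
```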

  19. Exploring teacher's perceptions of concept mapping as a teaching strategy in science: An action research approach

    NASA Astrophysics Data System (ADS)

    Marks Krpan, Catherine Anne

    In order to promote science literacy in the classroom, students need opportunities in which they can personalize their understanding of the concepts they are learning. Current literature supports the use of concept maps in enabling students to make personal connections in their learning of science. Because they involve creating explicit connections between concepts, concept maps can assist students in developing metacognitive strategies and assist educators in identifying misconceptions in students' thinking. The literature also notes that concept maps can improve student achievement and recall. Much of the current literature focuses primarily on concept mapping at the secondary and university levels, with limited focus on the elementary panel. The research rarely considers teachers' thoughts and ideas about the concept mapping process. In order to effectively explore concept mapping from the perspective of elementary teachers, I felt that an action research approach would be appropriate. Action research enabled educators to debate issues about concept mapping and test out ideas in their classrooms. It also afforded the participants opportunities to explore their own thinking, reflect on their personal journeys as educators and play an active role in their professional development. In an effort to explore concept mapping from the perspective of elementary educators, an action research group of 5 educators and myself was established and met regularly from September 1999 until June 2000. All of the educators taught in the Toronto area. These teachers were interested in exploring how concept mapping could be used as a learning tool in their science classrooms. In summary, this study explores the journey of five educators and myself as we engaged in collaborative action research. This study sets out to: (1) Explore how educators believe concept mapping can facilitate teaching and student learning in the science classroom. (2) Explore how educators implement concept

  20. Anomalous gauge boson couplings

    SciTech Connect

    Barklow, T.; Rizzo, T.; Baur, U.

    1997-01-13

The measurement of anomalous gauge boson self couplings is reviewed for a variety of present and planned accelerators. Sensitivities are compared for these accelerators using models based on the effective Lagrangian approach. The sensitivities described here are for measurement of "generic" parameters κ_V, λ_V, etc., defined in the text. Pre-LHC measurements will not probe these coupling parameters to precision better than O(10⁻¹). The LHC should be sensitive to better than O(10⁻²), while a future NLC should achieve sensitivity of O(10⁻³) to O(10⁻⁴) for center of mass energies ranging from 0.5 to 1.5 TeV.

  1. Multidata remote sensing approach to regional geologic mapping in Venezuela

    SciTech Connect

    Baker, R.N.

    1996-08-01

Remote sensing played an important role in evaluating the exploration potential of selected lease blocks in Venezuela. Data sets used ranged from regional Landsat and airborne radar (SLAR) surveys to high-quality cloud-free air photos for local but largely inaccessible terrains. The resulting data base provided a framework for the conventional analyses of surface and subsurface information available to the project team. (1) Regional surface geology and major structural elements were interpreted from Landsat MSS imagery supplemented by TM and a regional 1:250,000 airborne radar (SLAR) survey. Evidence of dextral offset, en echelon folds and major thoroughgoing faults suggests a regional transpressional system modified by local extension and readjustment between small-scale crustal blocks. Surface expression of the major structural elements diminishes to the east, but can often be extended beneath the coastal plain by drainage anomalies and subtle geomorphic trends. (2) Environmental conditions were mapped using the high resolution airborne radar images, which were used to relate vegetation types to surface texture and elevation, and wetlands, outcrop and cultural features to image brightness. Additional work using multispectral TM or SPOT imagery is planned to more accurately define environmental conditions and provide a baseline for monitoring future trends. (3) Offshore oil seeps were detected using ERS-1 satellite radar (SAR) and known seeps in the Gulf of Paria as analogs. While partially successful, natural surfactants, wind shadow and a surprising variety of other phenomena created "false alarms" which required other supporting data and field sampling to verify the results. Key elements of the remote sensing analyses will be incorporated into a comprehensive geographic information system (GIS) which will eventually include all of Venezuela.

  2. Flood Hazard Mapping over Large Regions using Geomorphic Approaches

    NASA Astrophysics Data System (ADS)

    Samela, Caterina; Troy, Tara J.; Manfreda, Salvatore

    2016-04-01

Historically, man has always preferred to settle and live near water. This tendency has not changed throughout time, and today nineteen of the twenty most populated agglomerations of the world (Demographia World Urban Areas, 2015) are located along watercourses or at the mouth of a river. On one hand, these locations are advantageous from many points of view. On the other hand, they expose significant populations and economic assets to a certain degree of flood hazard. Knowing the location and the extent of the areas exposed to flood hazards is essential to any strategy for minimizing the risk. Unfortunately, in data-scarce regions the use of traditional floodplain mapping techniques is prevented by the lack of the extensive data required, and this scarcity is generally most pronounced in developing countries. The present work aims to overcome this limitation by defining an alternative simplified procedure for a preliminary, but efficient, floodplain delineation. To validate the method in a data-rich environment, eleven flood-related morphological descriptors derived from DEMs have been used as linear binary classifiers over the Ohio River basin and its sub-catchments, measuring their performance in identifying the floodplains as the topography and the size of the calibration area change. The best performing classifiers among those analysed have been applied and validated across the continental U.S. The results suggest that the classifier based on the index ln(hr/H), named the Geomorphic Flood Index (GFI), is the most suitable to detect the flood-prone areas in data-scarce environments and for large-scale applications, providing good accuracy with low requirements in terms of data and computational costs. Keywords: flood hazard, data-scarce regions, large-scale studies, binary classifiers, DEM, USA.
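The GFI classifier described above compares a water depth h_r, inferred from the contributing area of the nearest stream element through a hydraulic scaling law, with the elevation difference H to that element, and thresholds the log-ratio. A minimal sketch (the scaling constants `a`, `n` and the threshold `tau` are illustrative; in practice they are calibrated against known flood maps):

```python
import numpy as np

def geomorphic_flood_index(upstream_area, H, a=0.1, n=0.3):
    """GFI = ln(h_r / H), with h_r ~ a * A**n estimated from the
    contributing area A of the hydrologically nearest stream element."""
    h_r = a * upstream_area**n
    return np.log(h_r / H)

def flood_prone(upstream_area, H, tau=0.0, a=0.1, n=0.3):
    """Linear binary classifier: a cell is flagged flood-prone when GFI >= tau."""
    return geomorphic_flood_index(upstream_area, H, a, n) >= tau
```

Cells close to a large river (large contributing area, small elevation difference) score high and are flagged; elevated cells far from small streams score low.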

  3. Evaluation of current statistical approaches for predictive geomorphological mapping

    NASA Astrophysics Data System (ADS)

    Miska, Luoto; Jan, Hjort

    2005-04-01

Predictive models are increasingly used in geomorphology, but systematic evaluations of novel statistical techniques are still limited. The aim of this study was to compare the accuracy of generalized linear models (GLM), generalized additive models (GAM), classification tree analysis (CTA), neural networks (ANN) and multiple adaptive regression splines (MARS) in predictive geomorphological modelling. Five different distribution models both for non-sorted and sorted patterned ground were constructed on the basis of four terrain parameters and four soil variables. To evaluate the models, the original data set of 9997 squares of 1 ha in size was randomly divided into model training (70%, n=6998) and model evaluation sets (30%, n=2999). In general, active sorted patterned ground is clearly defined in upper fell areas with high slope angle and till soils. Active non-sorted patterned ground is more common in valleys with higher soil moisture and fine-scale concave topography. The predictive performance of each model was evaluated using the area under the receiver operating characteristic curve (AUC) and the Kappa value. The relatively high discrimination capacity of all models, AUC=0.85–0.88 and Kappa=0.49–0.56, implies that the models' predictions provide an acceptable index of sorted and non-sorted patterned ground occurrence. The best performance for model calibration data for both data sets was achieved by the CTA. However, when the predictive mapping ability was explored through the evaluation data set, the model accuracies of CTA decreased markedly compared to the other modelling techniques. For the model evaluation data, MARS performed marginally best. Our results show that the digital elevation model and soil data can be used to predict relatively robustly the activity of patterned ground at fine scale in a subarctic landscape. This indicates that predictive geomorphological modelling has the advantage of providing relevant and useful information on earth surface
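The two evaluation scores used above are straightforward to compute on a held-out set: AUC is the probability that a randomly chosen presence cell receives a higher predicted score than a randomly chosen absence cell, and Cohen's Kappa compares observed agreement with the agreement expected by chance. A self-contained sketch:

```python
def auc(scores, labels):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def kappa(pred, labels):
    """Cohen's Kappa for binary predictions: chance-corrected accuracy."""
    n = len(labels)
    p_obs = sum(p == y for p, y in zip(pred, labels)) / n
    p1, y1 = sum(pred) / n, sum(labels) / n
    p_exp = p1 * y1 + (1 - p1) * (1 - y1)   # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)
```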

  4. Boson representations of fermion systems: Proton-neutron systems

    NASA Astrophysics Data System (ADS)

    Sambataro, M.

    1988-05-01

    Applications of a recently proposed procedure for constructing boson images of fermion Hamiltonians are shown for proton-neutron systems. First, the mapping from SD fermion onto sd boson spaces is discussed and a Qπ.Qν interaction is investigated. A Hermitian one-body Q boson operator is derived and analytical expressions for its coefficients are obtained. A (Qπ+Qν).(Qπ+Qν) interaction is then studied for particle-hole systems, and the connections with the SU*(3) dynamical symmetry of the neutron-proton interacting boson model are discussed. Finally, an example of mapping from SDG onto sdg spaces is analyzed. Fermion spectra and E2 matrix elements are well reproduced in the boson spaces.

  5. Novel approaches to map small molecule-target interactions.

    PubMed

    Kapoor, Shobhna; Waldmann, Herbert; Ziegler, Slava

    2016-08-01

    The quest for small molecule perturbators of protein function or a given cellular process lies at the heart of chemical biology and pharmaceutical research. Bioactive compounds need to be extensively characterized in the context of the modulated protein(s) or process(es) in living systems to unravel and confirm their mode of action. A crucial step in this workflow is the identification of the molecular targets for these small molecules, for which a generic methodology is lacking. Herein we summarize recently developed approaches for target identification spurred by advances in omics techniques and chemo- and bioinformatics analysis. PMID:27240466

  6. Inverse field-based approach for simultaneous B₁ mapping at high fields - a phantom based study.

    PubMed

    Jin, Jin; Liu, Feng; Zuo, Zhentao; Xue, Rong; Li, Mingyan; Li, Yu; Weber, Ewald; Crozier, Stuart

    2012-04-01

    Based on computational electromagnetics and multi-level optimization, an inverse approach of attaining accurate mapping of both transmit and receive sensitivity of radiofrequency coils is presented. This paper extends our previous study of inverse methods of receptivity mapping at low fields to allow accurate mapping of RF magnetic fields (B(1)) for high-field applications. Accurate receive sensitivity mapping is essential to image domain parallel imaging methods, such as sensitivity encoding (SENSE), to reconstruct high quality images. Accurate transmit sensitivity mapping will facilitate RF-shimming and parallel transmission techniques that directly address the RF inhomogeneity issue, arguably the most challenging issue of high-field magnetic resonance imaging (MRI). The inverse field-based approach proposed herein is based on computational electromagnetics and iterative optimization. It fits an experimental image to the numerically calculated signal intensity by iteratively optimizing the coil-subject geometry to better resemble the experiments. Accurate transmit and receive sensitivities are derived as intermediate results of the optimization process. The method is validated by imaging studies using a homogeneous saline phantom at 7T. A simulation study at 300 MHz demonstrates that the proposed method is able to obtain receptivity mapping with errors an order of magnitude smaller than those of the conventional method. The more accurate receptivity mapping and simultaneously obtained transmit sensitivity mapping could enable artefact-reduced and intensity-corrected image reconstructions. It is hoped that by providing an approach to the accurate mapping of both transmit and receive sensitivity, the proposed method will facilitate a range of applications in high-field MRI and parallel imaging. PMID:22391489
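    For contrast with the inverse approach, the standard double-angle method (assumed here to be representative of the "conventional method" the abstract compares against) estimates the actual flip angle per pixel from two acquisitions at nominal flip angles alpha and 2*alpha:

```python
import math

def flip_angle_map(img_alpha, img_2alpha):
    """Double-angle B1 mapping sketch: with signals
    S1 = M*sin(theta) and S2 = M*sin(2*theta) = 2*M*sin(theta)*cos(theta),
    the actual flip angle per pixel is theta = arccos(S2 / (2*S1))."""
    return [math.acos(s2 / (2.0 * s1))
            for s1, s2 in zip(img_alpha, img_2alpha)]
```

    The ratio of the recovered theta to the nominal flip angle gives the relative transmit (B1+) field; the inverse field-based approach instead obtains both transmit and receive sensitivities from the optimization itself.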

  7. Mapping Transcription Factors on Extended DNA: A Single Molecule Approach

    NASA Astrophysics Data System (ADS)

    Ebenstein, Yuval; Gassman, Natalie; Weiss, Shimon

    The ability to determine the precise loci and distribution of nucleic acid binding proteins is instrumental to our detailed understanding of cellular processes such as transcription, replication, and chromatin reorganization. Traditional molecular biology approaches, and above all chromatin immunoprecipitation (ChIP)-based methods, have provided a wealth of information regarding protein-DNA interactions. Nevertheless, existing techniques can only provide average properties of these interactions, since they are based on the accumulation of data from numerous protein-DNA complexes analyzed at the ensemble level. We propose a single-molecule approach for direct visualization of DNA-binding proteins bound specifically to their recognition sites along a long stretch of DNA such as genomic DNA. Fluorescent quantum dots are used to tag proteins bound to DNA, and the complex is deposited on a glass substrate by extending the DNA to a linear form. The sample is then imaged optically to determine the precise location of the protein binding site. The method is demonstrated by detecting individual quantum-dot-tagged T7 RNA polymerase enzymes on the bacteriophage T7 genomic DNA and assessing the relative occupancy of the different promoters.
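    A toy sketch of the localization step: find labelled peaks along the extended DNA's fluorescence profile and convert pixel positions to genomic coordinates (the threshold and the pixel-to-base-pair stretch factor are hypothetical):

```python
def binding_sites(profile, threshold, bp_per_pixel):
    """Locate quantum-dot peaks along a linearized DNA intensity profile:
    a pixel is a candidate site if it exceeds `threshold` and is a local
    maximum relative to both neighbours."""
    sites = []
    for i in range(1, len(profile) - 1):
        if (profile[i] > threshold
                and profile[i] > profile[i - 1]
                and profile[i] >= profile[i + 1]):
            sites.append(i * bp_per_pixel)   # pixel index -> base pairs
    return sites
```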

  8. A taxonomy of behaviour change methods: an Intervention Mapping approach.

    PubMed

    Kok, Gerjo; Gottlieb, Nell H; Peters, Gjalt-Jorn Y; Mullen, Patricia Dolan; Parcel, Guy S; Ruiter, Robert A C; Fernández, María E; Markham, Christine; Bartholomew, L Kay

    2016-09-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. 
Ideally, the taxonomy should be readily usable for this goal; but alternatively, it should be

  9. A taxonomy of behaviour change methods: an Intervention Mapping approach

    PubMed Central

    Kok, Gerjo; Gottlieb, Nell H.; Peters, Gjalt-Jorn Y.; Mullen, Patricia Dolan; Parcel, Guy S.; Ruiter, Robert A.C.; Fernández, María E.; Markham, Christine; Bartholomew, L. Kay

    2016-01-01

    ABSTRACT In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. 
Ideally, the taxonomy should be readily usable for this goal; but alternatively, it

  10. Semi-automatic classification of glaciovolcanic landforms: An object-based mapping approach based on geomorphometry

    NASA Astrophysics Data System (ADS)

    Pedersen, G. B. M.

    2016-02-01

    A new object-oriented approach is developed to classify glaciovolcanic landforms (Procedure A) and their landform element boundaries (Procedure B). It utilizes the principle that glaciovolcanic edifices are geomorphometrically distinct from lava shields and plains (Pedersen and Grosse, 2014), and the approach is tested on data from Reykjanes Peninsula, Iceland. The outlined procedures utilize slope and profile curvature attribute maps (20 m/pixel) and the classified results are evaluated quantitatively through error matrix maps (Procedure A) and visual inspection (Procedure B). In procedure A, the highest obtained accuracy is 94.1%, but even simple mapping procedures provide good results (> 90% accuracy). Successful classification of glaciovolcanic landform element boundaries (Procedure B) is also achieved and this technique has the potential to delineate the transition from intraglacial to subaerial volcanic activity in orthographic view. This object-oriented approach based on geomorphometry overcomes issues with vegetation cover, which has been typically problematic for classification schemes utilizing spectral data. Furthermore, it handles complex edifice outlines well and is easily incorporated into a GIS environment, where results can be edited or fused with other mapping results. The approach outlined here is designed to map glaciovolcanic edifices within the Icelandic neovolcanic zone but may also be applied to similar subaerial or submarine volcanic settings, where steep volcanic edifices are surrounded by flat plains.
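    The slope attribute map underlying Procedure A can be derived from a DEM by finite differences. A minimal sketch with a hypothetical slope-threshold rule (the actual procedures combine slope and profile curvature at 20 m/pixel):

```python
import math

def slope_deg(dem, cellsize):
    """Central-difference slope (in degrees) for interior cells of a DEM
    given as a list of rows; border cells are left as None."""
    rows, cols = len(dem), len(dem[0])
    out = [[None] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            gx = (dem[r][c + 1] - dem[r][c - 1]) / (2 * cellsize)
            gy = (dem[r + 1][c] - dem[r - 1][c]) / (2 * cellsize)
            out[r][c] = math.degrees(math.atan(math.hypot(gx, gy)))
    return out

def classify(slope, steep=15.0):
    """Toy landform rule: steep cells -> 'edifice', gentle -> 'plain'."""
    return 'edifice' if slope >= steep else 'plain'
```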

  11. Differential Analysis of 2-D Maps by Pixel-Based Approaches.

    PubMed

    Marengo, Emilio; Robotti, Elisa; Quasso, Fabio

    2016-01-01

    Two approaches to the analysis of 2-D maps are available: the first one involves a step of spot detection on each gel image; the second one is based instead on the direct differential analysis of 2-D map images, following a pixel-based procedure. Both approaches strongly depend on the proper alignment of the gel images, but the pixel-based approach makes it possible to overcome important drawbacks of the spot-volume procedure, namely the problems of missing data and overlapping spots. However, this approach is quite computationally intensive and requires the use of algorithms able to separate the information (i.e., spot-related information) from the background. Here, the most recent pixel-based approaches are described. PMID:26611422
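    A minimal sketch of the pixel-based differential step, assuming the two gel images are already aligned and ignoring the background-separation stage the abstract emphasizes:

```python
def pixel_difference(img_a, img_b, threshold):
    """Pixel-based differential analysis of two aligned gel images:
    normalize each image to unit total intensity, then flag pixels whose
    absolute normalized difference exceeds `threshold`."""
    tot_a = sum(sum(row) for row in img_a)
    tot_b = sum(sum(row) for row in img_b)
    flags = []
    for ra, rb in zip(img_a, img_b):
        flags.append([abs(a / tot_a - b / tot_b) > threshold
                      for a, b in zip(ra, rb)])
    return flags
```

    Because every pixel is compared, missing spots and overlapping spots never produce the missing-value problem of the spot-volume procedure.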

  12. Mapping paths: new approaches to dissect eukaryotic signaling circuitry

    PubMed Central

    Mutlu, Nebibe; Kumar, Anuj

    2016-01-01

    Eukaryotic cells are precisely “wired” to coordinate changes in external and intracellular signals with corresponding adjustments in the output of complex and often interconnected signaling pathways. These pathways are critical in understanding cellular growth and function, and several experimental trends are emerging with applicability toward more fully describing the composition and topology of eukaryotic signaling networks. In particular, recent studies have implemented CRISPR/Cas-based screens in mouse and human cell lines for genes involved in various cell growth and disease phenotypes. Proteomic methods using mass spectrometry have enabled quantitative and dynamic profiling of protein interactions, revealing previously undiscovered complexes and allele-specific protein interactions. Methods for the single-cell study of protein localization and gene expression have been integrated with computational analyses to provide insight into cell signaling in yeast and metazoans. In this review, we present an overview of exemplary studies using the above approaches, relevant for the analysis of cell signaling and indeed, more broadly, for many modern biological applications. PMID:27540473

  13. Mapping water quality and substrate cover in optically complex coastal and reef waters: an integrated approach.

    PubMed

    Phinn, S R; Dekker, A G; Brando, V E; Roelfsema, C M

    2005-01-01

    Sustainable management of coastal and coral reef environments requires regular collection of accurate information on recognized ecosystem health indicators. Satellite image data and derived maps of water column and substrate biophysical properties provide an opportunity to develop baseline mapping and monitoring programs for coastal and coral reef ecosystem health indicators. A significant challenge for satellite image data in coastal and coral reef water bodies is the mixture of both clear and turbid waters. A new approach is presented in this paper to enable production of water quality and substrate cover type maps, linked to a field based coastal ecosystem health indicator monitoring program, for use in turbid to clear coastal and coral reef waters. An optimized optical domain method was applied to map selected water quality (Secchi depth, Kd PAR, tripton, CDOM) and substrate cover type (seagrass, algae, sand) parameters. The approach is demonstrated using commercially available Landsat 7 Enhanced Thematic Mapper image data over a coastal embayment exhibiting the range of substrate cover types and water quality conditions commonly found in sub-tropical and tropical coastal environments. Spatially extensive and quantitative maps of selected water quality and substrate cover parameters were produced for the study site. These map products were refined by interactions with management agencies to suit the information requirements of their monitoring and management programs. PMID:15757744

  14. Force scanning: A rapid, high-resolution approach for spatial mechanical property mapping

    PubMed Central

    Darling, E M

    2011-01-01

    Atomic force microscopy (AFM) can be used to co-localize mechanical properties and topographical features through property mapping techniques. The most common approach for testing biological materials at the micro- and nano-scales is force mapping, which involves taking individual force curves at discrete sites across a region of interest. Limitations of force mapping include long testing times and low resolution. While newer AFM methodologies, like modulated scanning and torsional oscillation, circumvent this problem, their adoption for biological materials has been limited. This could be due to their need for specialized software algorithms and/or hardware. The objective of this study is to develop a novel force scanning technique using AFM to rapidly capture high-resolution topographical images of soft biological materials while simultaneously quantifying their mechanical properties. Force scanning is a straightforward methodology applicable to a wide range of materials and testing environments, requiring no special modification to standard AFMs. Essentially, if a contact mode image can be acquired, then force scanning can be used to produce a spatial modulus map. The current study first validates this technique using agarose gels, comparing results to the standard force mapping approach. Biologically relevant demonstrations are then presented for high-resolution modulus mapping of individual cells, cell-cell interfaces, and articular cartilage tissue. PMID:21411911
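    Converting a force-indentation sample to a modulus, as a per-pixel modulus map requires, commonly uses the Hertzian spherical-contact model; a one-sample sketch (the parameter values in the test are hypothetical, and the actual study's fitting procedure may differ):

```python
def hertz_modulus(force, indentation, radius, poisson=0.5):
    """Invert the Hertzian spherical-contact model
        F = (4/3) * E / (1 - v^2) * sqrt(R) * d^(3/2)
    to obtain Young's modulus E from a single force-indentation sample
    taken with a spherical tip of radius R."""
    return (3.0 * force * (1.0 - poisson ** 2)
            / (4.0 * radius ** 0.5 * indentation ** 1.5))
```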

  15. Benthic habitat mapping in a Portuguese Marine Protected Area using EUNIS: An integrated approach

    NASA Astrophysics Data System (ADS)

    Henriques, Victor; Guerra, Miriam Tuaty; Mendes, Beatriz; Gaudêncio, Maria José; Fonseca, Paulo

    2015-06-01

    Demand for seabed and habitat mapping has grown over the past years to support the maritime integrated policies at EU and national levels aiming at the sustainable use of sea resources. This study presents the results of applying the hierarchical European Nature Information System (EUNIS) to classify and map the benthic habitats of the Luiz Saldanha Marine Park, a marine protected area (MPA), located in the mainland Portuguese southwest coast, in the Iberian Peninsula. The habitat map was modelled by applying a methodology based on EUNIS to merge biotic and abiotic key habitat drivers. The modelling in this approach focused on predicting the association of different data types: substrate, bathymetry, light intensity, waves and currents energy, sediment grain size and benthic macrofauna into a common framework. The resulting seamless medium scale habitat map discriminates twenty six distinct sublittoral habitats, including eight with no match in the current classification, which may be regarded as new potential habitat classes and therefore will be submitted to EUNIS. A discussion is provided examining the suitability of the current EUNIS scheme as a standardized approach to classify marine benthic habitats and map their spatial distribution at medium scales in the Portuguese coast. In addition the factors that most affected the results available in the predictive habitat map and the role of the environmental factors on macrofaunal assemblage composition and distribution are outlined.

  16. Atom-atom correlations in time-of-flight imaging of ultracold bosons in optical lattices

    SciTech Connect

    Zaleski, T. A.; Kopec, T. K.

    2011-11-15

    We study the spatial correlations of strongly interacting bosons in the ground state, confined in a two-dimensional square and a three-dimensional cubic lattice. Using the combined Bogoliubov method and the quantum rotor approach, we map the Hamiltonian of strongly interacting bosons onto a U(1) phase action in order to calculate the decay of atom-atom correlations with distance along a principal axis and a diagonal of the lattice plane. Lower tunneling rates lead to quicker decays of the correlations, whose character becomes exponential. Finally, correlation functions allow us to calculate quantities that are directly bound to experimental outcomes, namely time-of-flight absorption images and the resulting visibility. Our results contain all the characteristic features present in experimental data (transition from a Mott insulating blob to superfluid peaks, etc.), emphasizing the usability of the proposed approach.

  17. Quantum criticality in disordered bosonic optical lattices

    SciTech Connect

    Cai Xiaoming; Chen Shu; Wang Yupeng

    2011-04-15

    Using the exact Bose-Fermi mapping, we study universal properties of ground-state density distributions and finite-temperature quantum critical behavior of one-dimensional hard-core bosons in trapped incommensurate optical lattices. Through the analysis of universal scaling relations in the quantum critical regime, we demonstrate that the superfluid-to-Bose-glass transition and the general phase diagram of disordered hard-core bosons can be uniquely determined from finite-temperature density distributions of the trapped disordered system.
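    The exact Bose-Fermi (Girardeau) mapping invoked above states that the density of one-dimensional hard-core bosons equals that of free spinless fermions. The incommensurate lattice and trap of the study require numerical diagonalization; as a minimal self-contained illustration of the mapping itself, the clean open-chain limit has analytic fermion orbitals:

```python
import math

def hardcore_boson_density(L, N):
    """Girardeau mapping sketch: N hard-core bosons on an open chain of L
    sites share the density profile of N free spinless fermions. For a
    clean chain the fermion orbitals are known in closed form:
        psi_k(i) = sqrt(2/(L+1)) * sin(pi*k*i/(L+1)),  k = 1..N,
    so the site density is the sum of |psi_k(i)|^2 over occupied k."""
    dens = []
    for i in range(1, L + 1):
        rho = sum((2.0 / (L + 1)) * math.sin(math.pi * k * i / (L + 1)) ** 2
                  for k in range(1, N + 1))
        dens.append(rho)
    return dens
```

    Adding the incommensurate potential and trap turns the closed-form orbitals into eigenvectors of a tridiagonal Hamiltonian, but the density formula is unchanged.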

  18. A new approach to mapping permafrost and change incorporating uncertainties in ground conditions and climate projections

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Olthof, I.; Fraser, R.; Wolfe, S. A.

    2014-11-01

    Spatially detailed information on permafrost distribution and change with climate is important for land use planning, infrastructure development, and environmental assessments. However, the required soil and surficial geology maps in the North are coarse, and projected climate scenarios vary widely. Considering these uncertainties, we propose a new approach to mapping permafrost distribution and change by integrating remote sensing data, field measurements, and a process-based model. Land cover types from satellite imagery are used to capture the general land conditions and to improve the resolution of existing permafrost maps. For each land cover type, field observations are used to estimate the probabilities of different ground conditions. A process-based model is used to quantify the evolution of permafrost for each ground condition under three representative climate scenarios (low, medium, and high warming). From the model results, the probability of permafrost occurrence and the most likely permafrost conditions are determined. We apply this approach at 20 m resolution to a large area in Northwest Territories, Canada. Mapped permafrost conditions are in agreement with field observations and other studies. The data requirements, model robustness, and computation time are reasonable, and this approach may serve as a practical means to mapping permafrost and changes at high resolution in other regions.
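    The probability-combination step described above can be sketched as a weighted sum over ground conditions (condition names are hypothetical, and reducing the model output to a per-condition verdict is a simplification of the three-scenario ensemble):

```python
def permafrost_probability(ground_cond_probs, model_has_permafrost):
    """Combine P(ground condition | land-cover type), estimated from field
    observations, with the process model's verdict for each condition:
        P(permafrost) = sum over conditions g of P(g) * 1[permafrost | g]."""
    return sum(p for g, p in ground_cond_probs.items()
               if model_has_permafrost[g])
```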

  19. A New Approach to Mapping Permafrost and Change Incorporating Uncertainties in Ground Conditions and Climate Projections

    NASA Astrophysics Data System (ADS)

    Zhang, Y.

    2014-12-01

    Spatially detailed information on permafrost distribution and change with climate is important for land-use planning, infrastructure development and environmental assessments. However, the required soil and surficial geology maps in the North are coarse, and projected climate scenarios vary widely. Considering these uncertainties, we propose a new approach to mapping permafrost distribution and change by integrating remote sensing data, field measurements, and a process-based model. Land-cover types from satellite imagery are used to capture the general land conditions and to improve the resolution of existing permafrost maps. For each land-cover type, field observations are used to estimate the probability of different ground conditions. A process-based model is used to quantify the evolution of permafrost for each ground condition under three representative climate scenarios (low, medium and high warming). From the model results, the probability of permafrost occurrence and the most likely permafrost conditions are determined (Fig. 1). We apply this approach at 20 m resolution to a large area in Northwest Territories, Canada. Mapped permafrost conditions are in agreement with field observations and other studies. The data requirements, model robustness and computation time are reasonable, and this approach may serve as a practical means to mapping permafrost and changes at high resolution in other regions.

  20. A new approach to mapping permafrost and change incorporating uncertainties in ground conditions and climate projections

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Olthof, I.; Fraser, R.; Wolfe, S. A.

    2014-04-01

    Spatially detailed information on permafrost distribution and change with climate is important for land-use planning and for environmental and ecological assessments. However, the required soil and surficial geology maps in the north are coarse, and projected climate scenarios vary widely. Considering these uncertainties, we propose a new approach to mapping permafrost distribution and change by integrating remote sensing data, field measurements, and a process-based model. Land-cover types from satellite imagery are used to capture the general land conditions and to improve the resolution of existing permafrost maps. For each land-cover type, field observations are used to estimate the probability of different ground conditions. A process-based model is used to quantify the evolution of permafrost for each ground condition under three representative climate scenarios (low, medium and high warming). From the model results, the probability of permafrost occurrence and the most likely permafrost conditions are determined. We apply this approach at 20 m resolution to a large area in Northwest Territories, Canada. Mapped permafrost conditions are in agreement with field observations and other studies. The data requirements, model robustness and computation time are reasonable, and this approach may serve as a practical means to mapping permafrost and changes at high resolution in other regions.

  1. An effective trace-guided wavefront navigation and map-building approach for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Luo, Chaomin; Krishnan, Mohan; Paulik, Mark; Jan, Gene Eu

    2013-12-01

    This paper addresses a trace-guided real-time navigation and map-building approach for an autonomous mobile robot. A wave-front-based global path planner is developed to generate a global trajectory for the robot. A modified Vector Field Histogram (M-VFH) method, driven by LIDAR sensor information, guides the robot locally so that it traverses autonomously with obstacle avoidance while following the traces provided by the global path planner. A local map composed of square grids is created by the local navigator while the robot traverses with limited LIDAR sensory information. From the measured sensory information, a map of the robot's immediate surroundings is dynamically built for navigation. The real-time wave-front-based navigation and map-building methodology has been successfully demonstrated in a Player/Stage simulation environment. With the wave-front-based global path planner and the M-VFH local navigator, a safe, short, and reasonable trajectory is planned in a majority of situations without any templates, without explicitly optimizing any global cost functions, and without any learning procedures. The effectiveness, feasibility, efficiency, and simplicity of the proposed real-time navigation and map-building approach have been validated by simulation and comparison studies. Comparisons with other path-planning approaches demonstrate that the proposed method plans more reasonable and shorter collision-free trajectories autonomously.
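    The wave-front global planner can be sketched as a breadth-first "wave" expanded from the goal over a grid map, followed by descent on the resulting distance field (a generic sketch of the technique, not the authors' implementation):

```python
from collections import deque

def wavefront_path(grid, start, goal):
    """Wave-front planner: BFS from the goal labels every free cell
    (grid value 0) with its distance to the goal; the path then descends
    that field from the start. Returns None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    dist = {goal: 0}
    q = deque([goal])
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    while q:                                  # wave expansion
        r, c = q.popleft()
        for dr, dc in steps:
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in dist):
                dist[nxt] = dist[(r, c)] + 1
                q.append(nxt)
    if start not in dist:
        return None
    path, cur = [start], start
    while cur != goal:                        # steepest descent to the goal
        cur = min(((cur[0] + dr, cur[1] + dc) for dr, dc in steps
                   if (cur[0] + dr, cur[1] + dc) in dist),
                  key=dist.get)
        path.append(cur)
    return path
```

    Because BFS labels each free cell with its exact grid distance to the goal, the descent step always finds a neighbour one unit closer, so the planner needs no templates, global cost optimization, or learning, which matches the properties claimed above.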

  2. A new computer approach to map mixed forest features and postprocess multispectral data

    NASA Technical Reports Server (NTRS)

    Kan, E. P.

    1976-01-01

    A computer technique for mapping mixed softwood and hardwood stands in multispectral satellite imagery of forest regions is described. The purpose of the technique is to obtain smoother resource maps useful in timber harvesting operations. The computer program relies on an algorithm which assesses the size and similarity of adjacent sections on satellite imagery (Landsat-1 data is used) and constructs, through an iteration of the basic algorithm, a more general map of timber mixtures, eliminating the mottled appearance of the raw imagery. Despite difficulties in the experimental analysis of a Texas forest, apparently due to relatively low resolution of the Landsat data, the computer classification approach outlined is suggested as a generally applicable method of creating serviceable maps from multispectral imagery.

  3. Integrated environmental mapping and monitoring, a methodological approach to optimise knowledge gathering and sampling strategy.

    PubMed

    Nilssen, Ingunn; Ødegård, Øyvind; Sørensen, Asgeir J; Johnsen, Geir; Moline, Mark A; Berge, Jørgen

    2015-07-15

    New technology has led to new opportunities for a holistic environmental monitoring approach adjusted to purpose and object of interest. The proposed integrated environmental mapping and monitoring (IEMM) concept, presented in this paper, describes the different steps in such a system, from survey mission to selection of parameters, sensors, sensor platforms, data collection, data storage and analysis, through to data interpretation for reliable decision making. The system is generic; it can be used by authorities, industry and academia and is useful in both planning and operational phases. In the planning process the systematic approach is also ideal for identifying areas with knowledge gaps. The critical stages of the concept are discussed and exemplified by two case studies, one on environmental mapping and one on monitoring. As an operational system, the IEMM concept can contribute to optimised integrated environmental mapping and monitoring for knowledge generation as a basis for decision making. PMID:25956441

  4. Two-dimensional thermofield bosonization

    SciTech Connect

    Amaral, R.L.P.G.

    2005-12-15

    The main objective of this paper was to obtain an operator realization for the bosonization of fermions in 1 + 1 dimensions, at finite, non-zero temperature T. This is achieved in the framework of the real-time formalism of Thermofield Dynamics. Formally, the results parallel those of the T = 0 case. The well-known two-dimensional Fermion-Boson correspondences at zero temperature are shown to hold also at finite temperature. To emphasize the usefulness of the operator realization for handling a large class of two-dimensional quantum field-theoretic problems, we contrast this global approach with the cumbersome calculation of the fermion-current two-point function in the imaginary-time and real-time formalisms. The calculations also illustrate the very different ways in which the transmutation from Fermi-Dirac to Bose-Einstein statistics is realized.

  5. Shape-to-String Mapping: A Novel Approach to Clustering Time-Index Biomics Data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Herein we describe a qualitative approach for clustering time-index biomics data. The data are transformed into angles from the intensity-ratios between adjacent time-points. A code is used to map a qualitative representation of the numerical time-index data which captures the features in the data ...
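    The described transformation can be sketched as follows, using slope angles between adjacent time points (differences rather than the abstract's intensity-ratios, and hypothetical 'U'/'D'/'F' letter codes), so that profiles cluster by simple string equality:

```python
import math

def shape_string(series, flat_deg=15.0):
    """Map a time-index series to a qualitative shape string: the angle of
    the segment between adjacent time points becomes 'U' (up), 'D' (down)
    or 'F' (flat, within +/- flat_deg degrees of horizontal)."""
    code = []
    for a, b in zip(series, series[1:]):
        angle = math.degrees(math.atan2(b - a, 1.0))
        if angle > flat_deg:
            code.append('U')
        elif angle < -flat_deg:
            code.append('D')
        else:
            code.append('F')
    return ''.join(code)
```

    Two profiles with the same qualitative shape yield identical strings even when their absolute intensities differ, which is the basis for the clustering.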

  6. Does Constructivist Approach Applicable through Concept Maps to Achieve Meaningful Learning in Science?

    ERIC Educational Resources Information Center

    Jena, Ananta Kumar

    2012-01-01

    This study deals with the application of constructivist approach through individual and cooperative modes of spider and hierarchical concept maps to achieve meaningful learning on science concepts (e.g. acids, bases & salts, physical and chemical changes). The main research questions were: Q (1): is there any difference in individual and…

  7. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    ERIC Educational Resources Information Center

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  8. Concept Maps in the Classroom: A New Approach to Reveal Students' Conceptual Change

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Liefländer, Anne K.; Bogner, Franz X.

    2015-01-01

    When entering the classroom, adolescents already hold various conceptions on science topics. Concept maps may function as useful tools to reveal such conceptions although labor-intensive analysis often prevents application in typical classroom situations. The authors aimed to provide teachers with an appropriate approach to analyze students'…

  9. The Criterion-Related Validity of a Computer-Based Approach for Scoring Concept Maps

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Koul, Ravinder; Salehi, Roya

    2006-01-01

    This investigation seeks to confirm a computer-based approach that can be used to score concept maps (Poindexter & Clariana, 2004) and then describes the concurrent criterion-related validity of these scores. Participants enrolled in two graduate courses (n=24) were asked to read about and research online the structure and function of the heart…

  10. Mapping the genomic architecture of adaptive traits with interspecific introgressive origin: a coalescent-based approach.

    PubMed

    Hejase, Hussein A; Liu, Kevin J

    2016-01-01

    Recent studies of eukaryotes including humans and Neandertals, mice, and butterflies have highlighted the major role that interspecific introgression has played in adaptive trait evolution. A common question arises in each case: what is the genomic architecture of the introgressed traits? One common approach to addressing this question is association mapping, which looks for genotypic markers that have a significant statistical association with a trait. It is well understood that sample relatedness can be a confounding factor in association mapping studies if not properly accounted for. Introgression and other evolutionary processes (e.g., incomplete lineage sorting) typically introduce variation among local genealogies, which can also differ from the global sample structure measured across all genomic loci. In contrast, state-of-the-art association mapping methods assume fixed sample relatedness across the genome, which can lead to spurious inference. We therefore propose a new association mapping method called Coal-Map, which uses coalescent-based models to capture local genealogical variation alongside global sample structure. Using simulated and empirical data reflecting a range of evolutionary scenarios, we compare the performance of Coal-Map against EIGENSTRAT, a leading association mapping method in terms of popularity, comparing statistical power and type I error control. Our empirical data make use of hundreds of mouse genomes for which adaptive interspecific introgression has recently been described. We found that Coal-Map's performance is comparable to or better than EIGENSTRAT's in terms of statistical power and false positive rate. Coal-Map's performance advantage was greatest on model conditions that most closely resembled empirically observed scenarios of adaptive introgression. These conditions had: (1) causal SNPs contained in one or a few introgressed genomic loci and (2) varying rates of gene flow - from high rates to very low rates where incomplete lineage
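
    The baseline association step (marker-trait statistical association) can be illustrated with a plain genotype-trait correlation. This is only a minimal stand-in: the coalescent-based correction for local genealogical variation that distinguishes Coal-Map is not shown, and the dosage/trait numbers are invented.

```python
def pearson(xs, ys):
    """Plain Pearson correlation between genotype dosage and trait value,
    used here as a minimal association statistic.  Real association
    mapping methods (EIGENSTRAT, Coal-Map) additionally correct for
    sample structure, which this sketch omits."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# invented example: allele dosage (0/1/2 copies) vs. a quantitative trait
dosage = [0, 0, 1, 1, 2, 2]
trait = [1.0, 1.2, 2.1, 1.9, 3.0, 3.2]
print(round(pearson(dosage, trait), 3))  # prints 0.991
```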

  11. Determination of contact maps in proteins: A combination of structural and chemical approaches

    NASA Astrophysics Data System (ADS)

    Wołek, Karol; Gómez-Sicilia, Àngel; Cieplak, Marek

    2015-12-01

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.
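
    The rCSU validity rule stated in the abstract (a residue-residue contact is kept only if attractive atomic contacts outnumber repulsive ones) can be sketched directly. Classifying each atomic pair by the sign of a product of partial charges is a simplification of the paper's chemical typing, and the charge values below are hypothetical.

```python
def rcsu_contact_valid(atomic_pairs):
    """rCSU-style validity test for a residue-residue contact, as stated
    in the abstract: keep the contact only if attractive atomic contacts
    outnumber repulsive ones.  Each atomic pair in geometric contact is
    classified here by the sign of the product of (hypothetical) partial
    charges -- a simplification of the CSU/rCSU chemical typing."""
    attractive = sum(1 for qi, qj in atomic_pairs if qi * qj < 0)
    repulsive = sum(1 for qi, qj in atomic_pairs if qi * qj > 0)
    return attractive > repulsive

# two salt-bridge-like pairs and one like-charge clash: contact is kept
print(rcsu_contact_valid([(+0.5, -0.5), (+0.3, -0.4), (+0.5, +0.5)]))  # True
```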

  12. Determination of contact maps in proteins: A combination of structural and chemical approaches.

    PubMed

    Wołek, Karol; Gómez-Sicilia, Àngel; Cieplak, Marek

    2015-12-28

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones - a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles. PMID:26723590

  13. Determination of contact maps in proteins: A combination of structural and chemical approaches

    SciTech Connect

    Wołek, Karol; Cieplak, Marek

    2015-12-28

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.

  14. Supersymmetric Higgs Bosons in Weak Boson Fusion

    SciTech Connect

    Hollik, Wolfgang; Plehn, Tilman; Rauch, Michael; Rzehak, Heidi

    2009-03-06

    We compute the complete supersymmetric next-to-leading-order corrections to the production of a light Higgs boson in weak-boson fusion. The size of the electroweak corrections is of similar order as the next-to-leading-order corrections in the standard model. The supersymmetric QCD corrections turn out to be significantly smaller than expected and than their electroweak counterparts. These corrections are an important ingredient to a precision analysis of the (supersymmetric) Higgs sector at the LHC, either as a known correction factor or as a contribution to the theory error.

  15. Toward real-time three-dimensional mapping of surficial aquifers using a hybrid modeling approach

    NASA Astrophysics Data System (ADS)

    Friedel, Michael J.; Esfahani, Akbar; Iwashita, Fabio

    2016-02-01

    A hybrid modeling approach is proposed for near real-time three-dimensional (3D) mapping of surficial aquifers. First, airborne frequency-domain electromagnetic (FDEM) measurements are numerically inverted to obtain subsurface resistivities. Second, a machine-learning (ML) algorithm is trained using the FDEM measurements, the inverted resistivity profiles, and borehole geophysical and hydrogeologic data. Third, the trained ML algorithm is used together with independent FDEM measurements to map the spatial distribution of the aquifer system. The efficacy of the hybrid approach is demonstrated by mapping a heterogeneous surficial aquifer and confining unit in northwestern Nebraska, USA. For this case, independent performance testing reveals that the aquifer mapping is unbiased, with a strong correlation (0.94) between numerically inverted and ML-estimated binary (clay-silt or sand-gravel) layer resistivities (5-20 ohm-m or 21-5,000 ohm-m), and an intermediate correlation (0.74) for heterogeneous (clay, silt, sand, gravel) layer resistivities (5-5,000 ohm-m). The reduced correlation for the heterogeneous model is attributed to over-estimation of the under-sampled high-resistivity gravels (about 0.5 % of the training data); when these are removed, the correlation increases to 0.87. Independent analysis of the numerically inverted and ML-estimated resistivities finds that the hybrid procedure preserves both the univariate and the spatial statistics of each layer. Following training, the algorithms can map 3D surficial aquifers as fast as leveled FDEM measurements are presented to the ML network.
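
    The third step (a trained learner maps new FDEM-derived resistivities to layer properties) can be caricatured with a 1-nearest-neighbour learner. The abstract does not specify the actual ML algorithm, and the resistivity values and class labels below are invented from the ranges quoted above.

```python
def train_1nn(features, labels):
    """Memorize (feature, label) pairs -- a toy stand-in for the ML step
    trained on co-located inverted resistivities and borehole lithologies."""
    return list(zip(features, labels))

def predict_1nn(model, x):
    """Label of the nearest training feature (here: resistivity in ohm-m)."""
    return min(model, key=lambda pair: abs(pair[0] - x))[1]

# hypothetical training data: resistivity (ohm-m) -> binary lithology class
model = train_1nn([8.0, 15.0, 60.0, 400.0],
                  ["clay-silt", "clay-silt", "sand-gravel", "sand-gravel"])
print(predict_1nn(model, 12.0))   # clay-silt
print(predict_1nn(model, 900.0))  # sand-gravel
```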

  16. A hierarchical Bayesian-MAP approach to inverse problems in imaging

    NASA Astrophysics Data System (ADS)

    Raj, Raghu G.

    2016-07-01

    We present a novel approach to inverse problems in imaging based on a hierarchical Bayesian-MAP (HB-MAP) formulation. In this paper we specifically focus on the difficult and basic inverse problem of multi-sensor (tomographic) imaging, wherein the source object of interest is viewed from multiple directions by independent sensors. Given the measurements recorded by these sensors, the problem is to reconstruct the image of the object with a high degree of fidelity. We employ a probabilistic graphical modeling extension of the compound Gaussian distribution as a global image prior within a hierarchical Bayesian inference procedure. Since the prior employed by our HB-MAP algorithm is general enough to subsume a wide class of priors, including those typically employed in compressive sensing (CS) algorithms, the HB-MAP algorithm offers a vehicle for extending the capabilities of current CS algorithms to include truly global priors. After rigorously deriving the regression algorithm for solving our inverse problem from first principles, we demonstrate the performance of the HB-MAP algorithm on Monte Carlo trials and on real empirical data (natural scenes). In all cases we find that our algorithm outperforms previous approaches in the literature, including filtered back-projection and a variety of state-of-the-art CS algorithms. We conclude with directions for future research emanating from this work.
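
    The Bayesian-MAP idea in its simplest form: with a Gaussian likelihood and a Gaussian prior, the MAP estimate has a ridge-like closed form. This is a minimal sketch of the principle only; the paper's hierarchical compound-Gaussian prior and tomographic forward model are well beyond it, and the numbers below are invented.

```python
def map_estimate(y, a, noise_var, prior_var):
    """MAP estimate of a scalar x in the linear model y_i = a_i * x + n_i,
    with noise n_i ~ N(0, noise_var) and prior x ~ N(0, prior_var).
    Maximizing the posterior yields the ridge-like closed form below;
    hierarchical priors such as the compound Gaussian generalize this."""
    num = sum(ai * yi for ai, yi in zip(a, y))
    den = sum(ai * ai for ai in a) + noise_var / prior_var
    return num / den

# a weak prior recovers least squares; a strong prior shrinks toward zero
print(map_estimate([2.0, 4.0], [1.0, 2.0], 1.0, 1e9))  # ~2.0
print(map_estimate([2.0, 4.0], [1.0, 2.0], 1.0, 0.1))  # shrunk toward 0
```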

  17. Engineering geological mapping in Wallonia (Belgium) : present state and recent computerized approach

    NASA Astrophysics Data System (ADS)

    Delvoie, S.; Radu, J.-P.; Ruthy, I.; Charlier, R.

    2012-04-01

    An engineering geological map can be defined as a geological map with a generalized representation of all the components of a geological environment that are strongly required for spatial planning, design, construction and maintenance of civil engineering works. In Wallonia (Belgium), 24 engineering geological maps were developed between the 70s and the 90s at 1/5,000 or 1/10,000 scale, covering some areas of the most industrialized and urbanized cities (Liège, Charleroi and Mons). They were based on soil and subsoil point data (borings, drillings, penetration tests, geophysical tests, outcrops…). Some maps display the depth (with isoheights) or the thickness (with isopachs) of the different subsoil layers down to about 50 m depth. Information about the geomechanical properties of each subsoil layer, useful for engineers and urban planners, is also synthesized. However, these maps were produced only on paper and progressively needed to be updated with new soil and subsoil data. The Public Service of Wallonia and the University of Liège have recently initiated a study to evaluate the feasibility of developing engineering geological mapping with a computerized approach. Numerous and varied data (about soil and subsoil) are stored in a georelational database (the geotechnical database, using Access, Microsoft®). All the data are geographically referenced. The database is linked to a GIS project (using ArcGIS, ESRI®). Together, the database and the GIS project constitute a powerful tool for spatial data management and analysis. This approach involves a methodology using interpolation methods to update the previous maps and to extend the coverage to new areas. The location (x, y, z) of each subsoil layer is then computed from the point data. The geomechanical data of these layers are synthesized in an explanatory booklet accompanying the maps.

  18. Mapping susceptibility of rainfall-triggered shallow landslides using a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Liu, Chia-Nan; Wu, Chia-Chen

    2008-08-01

    A landslide susceptibility map is essential for identifying hazardous regions, constructing appropriate mitigation facilities, and planning emergency measures in a region prone to landslides triggered by rainfall. Conventional mapping methods require extensive information about past landslide records and the contributing terrain and rainfall conditions. They also rely heavily on the quantity and quality of accessible information and on the subjectivity of the map builder. This paper contributes a systematic and quantitative assessment for mapping landslide hazards over a region. A Geographical Information System is used to retrieve relevant parameters from data layers, including the spatial distribution of transient fluid pressures, which is estimated using the TRIGRS program. The factor of safety of each pixel in the study region is calculated analytically. Monte Carlo simulation of the random variables is used to repeat the estimation of fluid pressure and factor of safety many times, and the failure probability of each pixel is thus estimated. These procedures for mapping landslide potential are demonstrated on a case history. The analysis results reveal a positive correlation between landslide probability and accumulated rainfall. The simulation results compare well with field records: the location and size of actual landslides are well predicted. An explanation for some of the inconsistencies is also provided, emphasizing the importance of site information for the accuracy of the mapping results.
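
    The per-pixel Monte Carlo step reduces to: draw random parameter realizations, compute the factor of safety for each, and report the fraction of draws with FS < 1 as the failure probability. The factor-of-safety model and parameter distributions below are invented placeholders (not TRIGRS output).

```python
import random

def failure_probability(fs_sampler, n_trials=20000, seed=42):
    """Fraction of Monte Carlo realizations with factor of safety < 1.
    fs_sampler draws one random factor of safety for the pixel."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n_trials) if fs_sampler(rng) < 1.0)
    return failures / n_trials

def toy_fs(rng):
    """Placeholder infinite-slope-style factor of safety with random
    cohesion and pore pressure (hypothetical distributions and units;
    in the study, transient pore pressures come from TRIGRS)."""
    cohesion = rng.gauss(8.0, 2.0)          # kPa
    pore_pressure = rng.uniform(0.0, 10.0)  # kPa
    normal_stress, tan_phi, shear = 20.0, 0.6, 18.0
    return (cohesion + (normal_stress - pore_pressure) * tan_phi) / shear

print(round(failure_probability(toy_fs), 3))
```

    Repeating this for every pixel yields the probabilistic susceptibility map.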

  19. Single-molecule approach to bacterial genomic comparisons via optical mapping.

    SciTech Connect

    Zhou, Shiguo; Kile, A.; Bechner, M.; Kvikstad, E.; Deng, W.; Wei, J.; Severin, J.; Runnheim, R.; Churas, C.; Forrest, D.; Dimalanta, E.; Lamers, C.; Burland, V.; Blattner, F. R.; Schwartz, David C.

    2004-01-01

    Modern comparative genomics has been established, in part, by the sequencing and annotation of a broad range of microbial species. To gain further insights, new sequencing efforts are now dealing with the variety of strains or isolates that gives a species its definition and range; however, their number vastly outstrips our ability to sequence them. Given the availability of a large number of microbial species, new whole-genome approaches must be developed to fully leverage this information at the level of strain diversity and to maximize discovery. Here, we describe how optical mapping, a single-molecule system, was used to identify and annotate chromosomal alterations between bacterial strains of several species. Since whole-genome optical maps are ordered restriction maps, sequenced strains of Shigella flexneri serotype 2a (2457T and 301), Yersinia pestis (CO 92 and KIM), and Escherichia coli were aligned as maps to identify regions of homology and to further characterize them as possible insertions, deletions, inversions, or translocations. Importantly, an unsequenced Shigella flexneri strain (serotype Y strain AMC[328Y]) was optically mapped and aligned with two sequenced ones to reveal one novel locus implicated in serotype conversion and several other loci containing insertion sequence elements or phage-related gene insertions. Our results suggest that genomic rearrangements and chromosomal breakpoints are readily identified and annotated against a prototypic sequenced strain by using the tools of optical mapping.

  20. Improved spatial accuracy of functional maps in the rat olfactory bulb using supervised machine learning approach.

    PubMed

    Murphy, Matthew C; Poplawsky, Alexander J; Vazquez, Alberto L; Chan, Kevin C; Kim, Seong-Gi; Fukuda, Mitsuhiro

    2016-08-15

    Functional MRI (fMRI) is a popular and important tool for noninvasive mapping of neural activity. As fMRI measures the hemodynamic response, the resulting activation maps do not perfectly reflect the underlying neural activity. The purpose of this work was to design a data-driven model to improve the spatial accuracy of fMRI maps in the rat olfactory bulb. This system is an ideal choice for this investigation since the bulb circuit is well characterized, allowing for an accurate definition of activity patterns in order to train the model. We generated models for both cerebral blood volume weighted (CBVw) and blood oxygen level dependent (BOLD) fMRI data. The results indicate that the spatial accuracy of the activation maps is either significantly improved or at worst not significantly different when using the learned models compared to a conventional general linear model approach, particularly for BOLD images and activity patterns involving deep layers of the bulb. Furthermore, the activation maps computed by CBVw and BOLD data show increased agreement when using the learned models, lending more confidence to their accuracy. The models presented here could have an immediate impact on studies of the olfactory bulb, but perhaps more importantly, demonstrate the potential for similar flexible, data-driven models to improve the quality of activation maps calculated using fMRI data. PMID:27236085

  1. Using Concept Mapping in Community-Based Participatory Research: A Mixed Methods Approach

    PubMed Central

    Windsor, Liliane Cambraia

    2015-01-01

    Community-based participatory research (CBPR) has been identified as a useful approach to increasing community involvement in research. Developing rigorous methods in conducting CBPR is an important step in gaining more support for this approach. The current article argues that concept mapping, a structured mixed methods approach, is useful in the initial development of a rigorous CBPR program of research aiming to develop culturally tailored and community-based health interventions for vulnerable populations. A research project examining social dynamics and consequences of alcohol and substance use in Newark, New Jersey, is described to illustrate the use of concept mapping methodology in CBPR. A total of 75 individuals participated in the study. PMID:26561484

  2. Large-extent digital soil mapping approaches for total soil depth

    NASA Astrophysics Data System (ADS)

    Mulder, Titia; Lacoste, Marine; Saby, Nicolas P. A.; Arrouays, Dominique

    2015-04-01

    Total soil depth (SDt) plays a key role in supporting various ecosystem services and properties, including plant growth, water availability and carbon stocks. Therefore, predictive mapping of SDt has been included as one of the deliverables within the GlobalSoilMap project. In this work, SDt was predicted for France following the directions of GlobalSoilMap, which requires modelling at 90 m resolution. The first method, further referred to as DM, consisted of modelling the deterministic trend in SDt using data mining, followed by a bias correction and ordinary kriging of the residuals. Considering the total surface area of France, about 540,000 km2, the employed methods must be able to deal with large data sets. Therefore, a second method, multi-resolution kriging (MrK) for large datasets, was implemented. This method consisted of modelling the deterministic trend by a linear model, followed by interpolation of the residuals. For both methods, the general trend was assumed to be explained by the biotic and abiotic environmental conditions, as described by the soil-landscape paradigm. The mapping accuracy was evaluated by an internal validation and by its concordance with previous soil maps. In addition, the prediction interval for DM and the confidence interval for MrK were determined. Finally, the opportunities and limitations of both approaches were evaluated. The results showed consistency in the mapped spatial patterns and a good prediction of the mean values. DM was better at predicting extreme values thanks to the bias correction, and was more powerful in capturing the deterministic trend than the linear model of the MrK approach. However, MrK was found to be more straightforward and more flexible in delivering spatially explicit uncertainty measures. The validation indicated that DM was more accurate than MrK. Improvements for DM may be expected by predicting soil depth classes. MrK shows potential for modelling beyond the country level, at high
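
    Both DM and MrK share the same "deterministic trend plus interpolated residuals" structure. The sketch below uses inverse-distance weighting as a simple stand-in for the kriging step, in one dimension; the trend model, locations and residual values are hypothetical.

```python
def idw(known, x, power=2.0):
    """Inverse-distance weighting of residuals at known 1-D locations --
    a simple stand-in for the (ordinary or multi-resolution) kriging of
    trend residuals; exact at observed sites."""
    num = den = 0.0
    for xi, ri in known:
        d = abs(x - xi)
        if d == 0.0:
            return ri
        w = d ** -power
        num += w * ri
        den += w
    return num / den

def predict_depth(trend, residuals, x):
    """Trend-plus-residual prediction of total soil depth at location x."""
    return trend(x) + idw(residuals, x)

trend = lambda x: 100.0 - 0.5 * x        # hypothetical trend model (cm)
residuals = [(0.0, 5.0), (10.0, -3.0)]   # trend errors at observed sites
print(predict_depth(trend, residuals, 5.0))  # 98.5
```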

  3. Flood inundation mapping uncertainty introduced by topographic data accuracy, geometric configuration and modeling approach

    NASA Astrophysics Data System (ADS)

    Papaioannou, G.; Loukas, Athanasios

    2010-05-01

    Floodplain modeling is a relatively new method in the river engineering discipline and is essential for the prediction of flood hazards. Flood inundation of upland environments with topographically complex floodplains is an understudied subject. In most areas of the U.S.A., the use of topographic information derived from Light Detection and Ranging (LIDAR) has improved the quality of river flood inundation predictions. However, such high-quality topographical data are not available in most countries, where the necessary information is obtained by topographical survey and/or topographical maps. Furthermore, the optimum dimensionality of hydraulic models, cross-section configuration in one-dimensional (1D) models, mesh resolution in two-dimensional (2D) models, and the choice of modeling approach are not well studied or documented. All these factors introduce significant uncertainty in the evaluation of floodplain zoning. This study addresses some of these issues by comparing flood inundation maps developed using different topography, geometric descriptions and modeling approaches. The methodology involves the use of topographic datasets with different horizontal resolutions, vertical accuracies and bathymetry details. Each topographic dataset is used to create a flood inundation map for different cross-section configurations using a 1D model (HEC-RAS), and for different mesh resolutions using 2D models, under steady-state and unsteady-state conditions. Comparison of the resulting maps indicates the uncertainty introduced into floodplain modeling by the horizontal resolution and vertical accuracy of the topographic data and by the different modeling approaches.

  4. DREAM--a novel approach for robust, ultrafast, multislice B₁ mapping.

    PubMed

    Nehrke, Kay; Börnert, Peter

    2012-11-01

    A novel multislice B₁-mapping method dubbed dual refocusing echo acquisition mode (DREAM) is proposed, able to cover the whole transmit coil volume in only one second, which is more than an order of magnitude faster than established approaches. The DREAM technique employs a stimulated echo acquisition mode (STEAM) preparation sequence followed by a tailored single-shot gradient echo sequence, measuring simultaneously the stimulated echo and the free induction decay as gradient-recalled echoes, and determining the actual flip angle of the STEAM preparation radiofrequency pulses from the ratio of the two measured signals. Owing to an elaborate timing scheme, the method is insensitive to susceptibility/chemical shift effects and can deliver a B₀ phase map and a transceive phase map for free. The approach has only a weak T₁ and T₂ dependence and, moreover, imposes only a low specific absorption rate (SAR) burden. The accuracy of the method with respect to systematic and statistical errors is investigated both theoretically and in experiments on phantoms. In addition, the performance of the approach is demonstrated in vivo in B₁-mapping and radiofrequency shimming experiments on the abdomen, the legs, and the head on an eight-channel parallel transmit 3 T MRI system. PMID:22252850
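
    The flip-angle determination from the signal ratio can be sketched as follows. The commonly quoted DREAM relation is α = arctan√(2·S_STE/S_FID); the prefactor depends on sequence conventions, so treat it as an assumption here rather than the paper's exact expression.

```python
import math

def dream_flip_angle_deg(s_ste, s_fid):
    """Flip angle (degrees) of the STEAM preparation pulses from the
    ratio of stimulated-echo and FID signals.  Uses the commonly quoted
    DREAM relation alpha = arctan(sqrt(2*STE/FID)); the factor 2 is an
    assumption tied to a particular sequence convention."""
    return math.degrees(math.atan(math.sqrt(2.0 * s_ste / s_fid)))

# round trip: synthesize signals for a 50-degree preparation pulse
# (FID ~ cos^2(alpha), STE ~ sin^2(alpha)/2, common scale dropped)
alpha = math.radians(50.0)
s_fid = math.cos(alpha) ** 2
s_ste = 0.5 * math.sin(alpha) ** 2
print(round(dream_flip_angle_deg(s_ste, s_fid), 1))  # 50.0
```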

  5. Mutual Composite Fermion and Composite Boson approaches to balanced and imbalanced bi-layer quantum Hall system: An electronic analogy of the Helium 4 system

    SciTech Connect

    Ye Jinwu

    2008-03-15

    We use both the Mutual Composite Fermion (MCF) and Composite Boson (CB) approaches to study balanced and imbalanced bi-layer quantum Hall systems (BLQH) and make critical comparisons between the two approaches. We find the CB approach superior to the MCF approach for studying ground states with different kinds of broken symmetries. In the phase representation of the CB theory, we first study the excitonic superfluid (ESF) state. The theory puts the spin and charge degrees of freedom on the same footing, explicitly brings out the spin-charge connection, and classifies all the possible excitations in a systematic way. Then, in the dual density representation of the CB theory, we study possible intermediate phases as the layer distance increases. We propose that there are two critical distances d{sub c1} < d{sub c2} and three phases as the distance increases. When 0 < d < d{sub c1}, the system is in the ESF state, which breaks the internal U(1) symmetry; when d{sub c1} < d < d{sub c2}, the system is in a pseudo-spin density wave (PSDW) state, which breaks the translational symmetry; there is a first-order transition at d{sub c1} driven by the collapse of the magneto-roton minimum at a finite wavevector in the pseudo-spin channel. When d{sub c2} < d < {infinity}, the system becomes two weakly coupled {nu} = 1/2 Composite Fermion Fermi Liquid (FL) states. There is also a first-order transition at d = d{sub c2}. We construct a quantum Ginzburg-Landau action to describe the transition from the ESF to the PSDW, which break two completely different symmetries. Using the QGL action, we explicitly show that the PSDW takes a square lattice and analyze in detail the properties of the PSDW at zero and finite temperature. We also suggest that the correlated hopping of vacancies in the active and passive layers in the PSDW state leads to the very large and temperature-dependent drag consistent with the experimental data. Then we study the effects of imbalance on both the ESF and the PSDW. On the ESF side, the system supports

  6. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster worldwide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic changes from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually, and both they and the joint estimate of flood risk are affected by multiple uncertainties. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure for estimating the flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the flood the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and the corresponding flood volumes are variables of the same phenomenon, they are directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtaining flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from: a) a classical univariate approach, and b) a bivariate statistical analysis through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
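
    The bivariate step (jointly sampling flood peak and volume) can be sketched with a Gaussian copula and Gumbel marginals. The abstract only says copulas are used; the specific copula family, the dependence parameter, and the marginal parameters below are illustrative assumptions.

```python
import math
import random

def gaussian_copula_uniforms(rho, rng):
    """One (u, v) pair of uniforms with Gaussian-copula dependence rho,
    standing in for whichever copula family the study fitted."""
    z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    x2 = rho * z1 + math.sqrt(1.0 - rho * rho) * z2
    cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return cdf(z1), cdf(x2)

def gumbel_quantile(u, loc, scale):
    """Inverse Gumbel CDF, a common marginal choice for flood peaks/volumes."""
    return loc - scale * math.log(-math.log(u))

# hypothetical marginals: peak (m3/s) and volume (10^6 m3), rho = 0.8
rng = random.Random(7)
for _ in range(5):
    u, v = gaussian_copula_uniforms(0.8, rng)
    peak = gumbel_quantile(u, 300.0, 80.0)
    volume = gumbel_quantile(v, 12.0, 4.0)
    print(round(peak, 1), round(volume, 2))
```

    Each sampled (peak, volume) pair defines one synthetic design hydrograph to feed into the 2D hydraulic model.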

  7. An Integrative Network Approach to Map the Transcriptome to the Phenome

    PubMed Central

    Mehan, Michael R.; Nunez-Iglesias, Juan; Kalakrishnan, Mrinal; Waterman, Michael S.

    2009-01-01

    Although many studies have been successful in the discovery of cooperating groups of genes, mapping these groups to phenotypes has proved a much more challenging task. In this article, we present the first genome-wide mapping of gene coexpression modules onto the phenome. We annotated coexpression networks from 136 microarray datasets with phenotypes from the Unified Medical Language System (UMLS). We then designed an efficient graph-based simulated annealing approach to identify coexpression modules frequently and specifically occurring in datasets related to individual phenotypes. By requiring phenotype-specific recurrence, we ensure the robustness of our findings. We discovered 118,772 modules specific to 42 phenotypes, and developed validation tests combining Gene Ontology, GeneRIF and UMLS. Our method is generally applicable to any kind of abundant network data with defined phenotype association, and thus paves the way for genome-wide, gene network-phenotype maps. PMID:19630539

  8. Mapping quantitative trait loci in complex pedigrees: a two-step variance component approach.

    PubMed Central

    George, A W; Visscher, P M; Haley, C S

    2000-01-01

There is a growing need for the development of statistical techniques capable of mapping quantitative trait loci (QTL) in general outbred animal populations. Presently used variance component methods, which correctly account for the complex relationships that may exist between individuals, are challenged by the difficulties incurred through unknown marker genotypes, inbred individuals, partially known or unknown marker phases, and multigenerational data. In this article, a two-step variance component approach that enables practitioners to routinely map QTL in populations with the aforementioned difficulties is explored. The performance of the QTL mapping methodology is assessed via its application to simulated data. The capacity of the technique to accurately estimate parameters is examined for a range of scenarios. PMID:11102397

  9. Automated mapping of glacial overdeepenings beneath contemporary ice sheets: Approaches and potential applications

    NASA Astrophysics Data System (ADS)

    Patton, Henry; Swift, Darrel A.; Clark, Chris D.; Livingstone, Stephen J.; Cook, Simon J.; Hubbard, Alun

    2015-03-01

Awareness is growing of the significance of overdeepenings in ice sheet systems. However, a complete understanding of overdeepening formation is lacking, meaning that observations of overdeepening location and morphometry are urgently required to motivate process understanding. Subject to the development of appropriate mapping approaches, high resolution subglacial topography data sets covering the whole of Antarctica and Greenland offer significant potential to acquire such observations and to relate overdeepening characteristics to ice sheet parameters. We explore a possible method for mapping overdeepenings beneath the Antarctic and Greenland ice sheets and illustrate a potential application of this approach by testing a possible relationship between overdeepening elongation ratio and ice sheet flow velocity. We find that hydrological and terrain filtering approaches are unsuited to mapping overdeepenings and develop a novel rule-based GIS methodology that delineates overdeepening perimeters by analysis of closed-contour properties. We then develop GIS procedures that provide information on overdeepening morphology and topographic context. Limitations in the accuracy and resolution of bed-topography data sets mean that application to glaciological problems requires consideration of quality-control criteria to (a) remove potentially spurious depressions and (b) reduce uncertainties that arise from the inclusion of depressions of nonglacial origin, or those in regions where empirical data are sparse. To address the problem of overdeepening elongation, potential quality-control criteria are introduced, and discussion of this example serves to highlight the limitations that mapping approaches - and applications of such approaches - must confront. We predict that improvements in bed-data quality will reduce the need for quality-control procedures and facilitate increasingly robust insights from empirical data.

  10. A practical and automated approach to large area forest disturbance mapping with remote sensing.

    PubMed

    Ozdogan, Mutlu

    2014-01-01

    In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: i) creating masks for water, non-forested areas, clouds, and cloud shadows; ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of mis-classification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. 
Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for forest cover
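Step ii of the procedure above can be illustrated with a simplified version that uses global rather than local-window statistics; the threshold factor `k` and the stable-pixel rule below are assumptions, not the paper's exact settings:

```python
import numpy as np

def label_training_pixels(swir_diff, k=2.0):
    """Label candidate training pixels from a SWIR difference image.

    Pixels more than k standard deviations above the mean are taken as
    disturbance seeds (canopy loss raises SWIR reflectance); pixels close
    to the mean are taken as stable-forest seeds. Global statistics stand
    in for the paper's local-window histograms.
    """
    mu, sigma = swir_diff.mean(), swir_diff.std()
    disturbed = swir_diff > mu + k * sigma
    stable = np.abs(swir_diff - mu) < 0.5 * sigma
    return disturbed, stable
```

The two boolean masks would then seed the cross-validated classifier filtering described in step iii, with mislabeled seeds removed before the final supervised classification.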

  11. A Practical and Automated Approach to Large Area Forest Disturbance Mapping with Remote Sensing

    PubMed Central

    Ozdogan, Mutlu

    2014-01-01

    In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: i) creating masks for water, non-forested areas, clouds, and cloud shadows; ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of mis-classification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. 
Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for forest cover

  12. High-resolution geologic mapping of the inner continental shelf: Boston Harbor and approaches, Massachusetts

    USGS Publications Warehouse

    Ackerman, Seth D.; Butman, Bradford; Barnhardt, Walter A.; Danforth, William W.; Crocker, James M.

    2006-01-01

This report presents the surficial geologic framework data and information for the sea floor of Boston Harbor and Approaches, Massachusetts (fig. 1.1). This mapping was conducted as part of a cooperative program between the U.S. Geological Survey (USGS), the Massachusetts Office of Coastal Zone Management (CZM), and the National Oceanic and Atmospheric Administration (NOAA). The primary objective of this project was to provide sea floor geologic information and maps of Boston Harbor to aid resource management, scientific research, industry and the public. A secondary objective was to test the feasibility of using NOAA hydrographic survey data, normally collected to update navigation charts, to create maps of the sea floor suitable for geologic and habitat interpretations. Defining sea-floor geology is the first step toward managing ocean resources and assessing environmental changes due to natural or human activity. The geophysical data for these maps were collected as part of hydrographic surveys carried out by NOAA in 2000 and 2001 (fig. 1.2). Bottom photographs, video, and samples of the sediments were collected in September 2004 to help in the interpretation of the geophysical data. Included in this report are high-resolution maps of the sea floor, at a scale of 1:25,000; the data used to create these maps in Geographic Information Systems (GIS) format; a GIS project; and a gallery of photographs of the sea floor. Companion maps of the sea floor to the north of Boston Harbor and Approaches are presented by Barnhardt and others (2006) and to the east by Butman and others (2003a,b,c). See Butman and others (2004) for a map of Massachusetts Bay at a scale of 1:125,000. The sections of this report are listed in the navigation bar along the left-hand margin of this page. Section 1 (this section) introduces the report. Section 2 presents the large-format map sheets. Section 3 describes data collection, processing, and analysis. 
Section 4 summarizes the geologic history of

  13. Global land cover mapping at 30 m resolution: A POK-based operational approach

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Chen, Jin; Liao, Anping; Cao, Xin; Chen, Lijun; Chen, Xuehong; He, Chaoying; Han, Gang; Peng, Shu; Lu, Miao; Zhang, Weiwei; Tong, Xiaohua; Mills, Jon

    2015-05-01

Global Land Cover (GLC) information is fundamental for environmental change studies, land resource management, sustainable development, and many other societal benefits. Although GLC data exist at spatial resolutions of 300 m and 1000 m, a 30 m resolution mapping approach is now a feasible option for the next generation of GLC products. Since most significant human impacts on the land system can be captured at this scale, a number of researchers are focusing on such products. This paper reports the operational approach used in such a project, which aims to deliver reliable data products. Over 10,000 Landsat-like satellite images are required to cover the entire Earth at 30 m resolution. To derive a GLC map from such a large volume of data necessitates the development of effective, efficient, economic and operational approaches. Automated approaches usually provide higher efficiency and thus more economic solutions, yet existing automated classification has been deemed ineffective because of the low classification accuracy achievable (typically below 65%) at global scale at 30 m resolution. As a result, an approach based on the integration of pixel- and object-based methods with knowledge (POK-based) has been developed. To handle the classification process of 10 land cover types, a split-and-merge strategy was employed, i.e., each class is first identified in a prioritized sequence and the results are then merged. For the identification of each class, a robust integration of pixel- and object-based classification was developed. To improve the quality of the classification results, a knowledge-based interactive verification procedure was developed with the support of web service technology. The performance of the POK-based approach was tested using eight selected areas with differing landscapes from five different continents. An overall classification accuracy of over 80% was achieved. This indicates that the developed POK-based approach is effective and feasible.
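The split-and-merge idea, classifying each land cover type independently and then merging in a prioritized sequence, can be sketched as a raster merge in which earlier classes win where masks overlap. The class codes and the priority ordering below are hypothetical:

```python
import numpy as np

def merge_by_priority(class_masks):
    """Merge per-class boolean masks in a prioritized sequence.

    class_masks: ordered list of (class_id, boolean mask) pairs, highest
    priority first. A pixel keeps the first class whose mask claims it;
    0 marks pixels no class claimed.
    """
    out = np.zeros(class_masks[0][1].shape, dtype=np.int16)
    for class_id, mask in class_masks:
        out[(out == 0) & mask] = class_id     # only fill still-unclassified pixels
    return out
```

In the real workflow each mask would come from the integrated pixel- and object-based classifier for that class; the merge step itself is this simple priority fill.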

  14. Elastographic mapping in optical coherence tomography using an unconventional approach based on correlation stability.

    PubMed

    Zaitsev, Vladimir Y; Matveev, Lev A; Matveyev, Alexandr L; Gelikonov, Grigory V; Gelikonov, Valentin M

    2014-02-01

An approach to elastographic mapping in optical coherence tomography (OCT) using comparison of correlation stability of sequentially obtained intensity OCT images of the studied strained tissue is discussed. The basic idea is that for stiffer regions, the OCT image is distorted to a smaller degree. Consequently, cross-correlation maps obtained with compensation of trivial translational motion of the image parts using a sliding correlation window can represent the spatial distribution of the relative tissue stiffness. An important advantage of the proposed approach is that it allows one to avoid the stage of local-strain reconstruction via error-sensitive numerical differentiation of experimentally determined displacements. Another advantage is that the correlation stability (CS) approach intrinsically implies that for deformed softer tissue regions, cross-correlation should already be strongly decreased, in contrast to the approaches based on initial reconstruction of displacements. This feature gives the proposed approach a much wider strain range of operability and is favorable for its free-hand implementation using the OCT probe itself to deform the tissue. The CS approach can be implemented using either image elements reflecting the morphological structure of the tissue or speckle-level cross-correlation. Examples of numerical simulations and experimental demonstrations using both phantom samples and in vivo obtained OCT images are presented. PMID:24042446
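The core of the correlation-stability idea, a sliding-window cross-correlation map between two sequential intensity images, can be sketched as follows. The window size and the zero-normalized correlation form are assumptions for illustration, not the authors' implementation (which also compensates translational motion before correlating):

```python
import numpy as np

def correlation_stability_map(img_a, img_b, win=8):
    """Windowed zero-normalized cross-correlation between two images.

    High values indicate regions whose speckle/structure survived the
    applied strain (stiffer tissue); low values indicate decorrelation
    (softer, more deformed tissue). Returns one value per win x win block.
    """
    h, w = img_a.shape
    out = np.zeros((h // win, w // win))
    for i in range(h // win):
        for j in range(w // win):
            a = img_a[i*win:(i+1)*win, j*win:(j+1)*win].ravel()
            b = img_b[i*win:(i+1)*win, j*win:(j+1)*win].ravel()
            a = a - a.mean()
            b = b - b.mean()
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            out[i, j] = (a @ b) / denom if denom > 0 else 0.0
    return out
```

Because the map is built directly from correlation values, no numerical differentiation of displacement fields is needed, which is the error-sensitive step the CS approach avoids.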

  15. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce.

    PubMed

    Idris, Muhammad; Hussain, Shujaat; Siddiqi, Muhammad Hameed; Hassan, Waseem; Syed Muhammad Bilal, Hafiz; Lee, Sungyoung

    2015-01-01

Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real time and streaming data in a variety of formats. These characteristics give rise to challenges in its modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework using the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223
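The multi-key idea, running several algorithms in one job by tagging intermediate pairs with an algorithm id so a single shuffle/reduce phase keeps their data separate, can be modeled in a few lines of single-process Python. This is a toy model of the concept, not Hadoop code, and the tuple-based API is invented for illustration:

```python
from collections import defaultdict

def multi_algorithm_mapreduce(records, algorithms):
    """Run several map/reduce algorithm pairs over one pass of the data.

    algorithms: {algo_id: (map_fn, reduce_fn)} where map_fn(record) yields
    (key, value) pairs. Intermediate pairs are grouped under the composite
    key (algo_id, key) so one shuffle serves all algorithms.
    """
    shuffle = defaultdict(list)
    for rec in records:                                  # single map phase
        for algo_id, (map_fn, _) in algorithms.items():
            for k, v in map_fn(rec):
                shuffle[(algo_id, k)].append(v)          # composite multi-key
    results = {}
    for (algo_id, k), vals in shuffle.items():           # single reduce phase
        reduce_fn = algorithms[algo_id][1]
        results[(algo_id, k)] = reduce_fn(k, vals)
    return results
```

The data is read once but mapped by every algorithm, which is the compute-intensive trade the abstract describes; real MRPack adds dynamic task assignment and skew mitigation on top of this keying scheme.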

  16. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    PubMed Central

    2015-01-01

Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real time and streaming data in a variety of formats. These characteristics give rise to challenges in its modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework using the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223

  17. Turkers in Africa: A Crowdsourcing Approach to Improving Agricultural Landcover Maps

    NASA Astrophysics Data System (ADS)

    Estes, L. D.; Caylor, K. K.; Choi, J.

    2012-12-01

In the coming decades a substantial portion of Africa is expected to be transformed to agriculture. The scale of this conversion may match or exceed that which occurred in the Brazilian Cerrado and Argentinian Pampa in recent years. Tracking the rate and extent of this conversion will depend on having an accurate baseline of the current extent of croplands. Continent-wide baseline data do exist, but the accuracy of these relatively coarse resolution, remotely sensed assessments is suspect in many regions. To develop more accurate maps of the distribution and nature of African croplands, we develop a distributed "crowdsourcing" approach that harnesses human eyeballs and image interpretation capabilities. Our initial goal is to assess the accuracy of existing agricultural land cover maps, but ultimately we aim to generate "wall-to-wall" cropland maps that can be revisited and updated to track agricultural transformation. Our approach utilizes the freely available, high-resolution satellite imagery provided by Google Earth, combined with Amazon.com's Mechanical Turk platform, an online service that provides a large, global pool of workers (known as "Turkers") who perform "Human Intelligence Tasks" (HITs) for a fee. Using open-source R and python software, we select a random sample of 1 km2 cells from a grid placed over our study area, stratified by field density classes drawn from one of the coarse-scale land cover maps, and send these in batches to Mechanical Turk for processing. Each Turker is required to conduct an initial training session, on the basis of which they are assigned an accuracy score that determines whether the Turker is allowed to proceed with mapping tasks. Completed mapping tasks are automatically retrieved and processed on our server, and subject to two further quality control measures. The first of these is a measure of the spatial accuracy of Turker mapped areas compared to "gold standard" maps from selected locations that are randomly
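The stratified sampling of grid cells described above might be sketched as follows; the function name and data layout are hypothetical stand-ins for the authors' R/python tooling:

```python
import random

def stratified_cell_sample(cells, n_per_stratum, seed=1):
    """Stratified random sample of grid cells.

    cells: list of (cell_id, density_class) pairs, one per 1 km^2 cell,
    where density_class comes from a coarse-scale cropland map.
    Returns {density_class: [sampled cell ids]}, up to n_per_stratum each,
    mimicking the batches sent to Mechanical Turk.
    """
    by_class = {}
    for cell_id, dclass in cells:
        by_class.setdefault(dclass, []).append(cell_id)
    rng = random.Random(seed)
    return {d: rng.sample(ids, min(n_per_stratum, len(ids)))
            for d, ids in by_class.items()}
```

Stratifying by field-density class ensures sparse-cropland strata are not swamped by the far more numerous low-density cells in a simple random draw.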

  18. A quasi-classical mapping approach to vibrationally coupled electron transport in molecular junctions

    SciTech Connect

    Li, Bin; Miller, William H.; Wilner, Eli Y.; Thoss, Michael

    2014-03-14

    We develop a classical mapping approach suitable to describe vibrationally coupled charge transport in molecular junctions based on the Cartesian mapping for many-electron systems [B. Li and W. H. Miller, J. Chem. Phys. 137, 154107 (2012)]. To properly describe vibrational quantum effects in the transport characteristics, we introduce a simple transformation rewriting the Hamiltonian in terms of occupation numbers and use a binning function to facilitate quantization. The approach provides accurate results for the nonequilibrium Holstein model for a range of bias voltages, vibrational frequencies, and temperatures. It also captures the hallmarks of vibrational quantum effects apparent in step-like structure in the current-voltage characteristics at low temperatures as well as the phenomenon of Franck-Condon blockade.

  19. Conceptualizing Stakeholders' Perceptions of Ecosystem Services: A Participatory Systems Mapping Approach

    NASA Astrophysics Data System (ADS)

    Lopes, Rita; Videira, Nuno

    2015-12-01

A participatory system dynamics modelling approach is advanced to support conceptualization of feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes a systems mapping workshop and follow-up tasks aimed at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

  20. Putting people on the map through an approach that integrates social data in conservation planning.

    PubMed

    Stephanson, Sheri L; Mascia, Michael B

    2014-10-01

    Conservation planning is integral to strategic and effective operations of conservation organizations. Drawing upon biological sciences, conservation planning has historically made limited use of social data. We offer an approach for integrating data on social well-being into conservation planning that captures and places into context the spatial patterns and trends in human needs and capacities. This hierarchical approach provides a nested framework for characterizing and mapping data on social well-being in 5 domains: economic well-being, health, political empowerment, education, and culture. These 5 domains each have multiple attributes; each attribute may be characterized by one or more indicators. Through existing or novel data that display spatial and temporal heterogeneity in social well-being, conservation scientists, planners, and decision makers may measure, benchmark, map, and integrate these data within conservation planning processes. Selecting indicators and integrating these data into conservation planning is an iterative, participatory process tailored to the local context and planning goals. Social well-being data complement biophysical and threat-oriented social data within conservation planning processes to inform decisions regarding where and how to conserve biodiversity, provide a structure for exploring socioecological relationships, and to foster adaptive management. Building upon existing conservation planning methods and insights from multiple disciplines, this approach to putting people on the map can readily merge with current planning practices to facilitate more rigorous decision making. PMID:25102957

  1. An approach to mapping haplotype-specific recombination sites in human MHC class III

    SciTech Connect

    Levo, A.; Westman, P.; Partanen, J.

    1996-12-31

Studies of the major histocompatibility complex (MHC) in mouse indicate that the recombination sites are not randomly distributed and their occurrence is haplotype-dependent. No data concerning haplotype-specific recombination sites in humans are available due to the low number of informative families. To investigate haplotype-specific recombination sites in human MHC, we describe an approach based on identification of recombinant haplotypes derived from one conserved haplotype at the population level. The recombination sites were mapped by comparing polymorphic markers between the recombinant and assumed original haplotypes. We tested this approach on the extended haplotype HLA A3; B47; Bf{sup *}F; C4A{sup *}1; C4B{sup *}Q0; DR7, which is most suitable for this analysis. First, it carries a number of rare markers, and second, the haplotype, albeit rare in the general population, is frequent in patients with 21-hydroxylase (21OH) defect. We observed recombinants derived from this haplotype in patients with 21OH defect. All these haplotypes had the centromeric part (from Bf to DR) identical to the original haplotype, but they differed in HLA A and B. We therefore assumed that they underwent recombinations in the segment that separates the Bf and HLA B genes. Polymorphic markers indicated that all break points mapped to two segments near the TNF locus. This approach makes possible the mapping of preferential recombination sites in different haplotypes. 20 refs., 1 fig., 1 tab.

  2. A simple approach to using an amorphous silicon EPID to verify IMRT planar dose maps.

    PubMed

    Lee, Christopher; Menk, Fred; Cadman, Patrick; Greer, Peter B

    2009-03-01

A simplified method of verifying intensity modulated radiation therapy (IMRT) fields using a Varian aS500 amorphous silicon electronic portal imaging device (EPID) is demonstrated. Unlike previous approaches, it does not involve time-consuming or complicated analytical processing of the data. The central axis pixel response of the EPID, as well as the profile characteristics obtained from images acquired with a 6 MV photon beam, was examined as a function of field size. Ion chamber measurements at various depths in a water phantom were then collected and it was found that at a specific depth d(ref), the dose response and profile characteristics closely matched the results of the EPID analysis. The only manipulation required to be performed on the EPID images was the multiplication of a matrix of off axis ratio values to remove the effect of the flood field calibration. Similarly, d(ref) was found for 18 MV. Planar dose maps at d(ref) in a water phantom for a bar pattern, a strip pattern, and 14 clinical IMRT fields from two patient cases each being from a separate anatomical region, i.e., head and neck as well as the pelvis, for both energies were generated by the Pinnacle planning system (V7.4). EPID images of these fields were acquired and converted to planar dose maps and compared directly with the Pinnacle planar dose maps. Radiographic film dosimetry and a MapCHECK dosimetry device (Sun Nuclear Corporation, Melbourne, FL) were used as an independent verification of the dose distribution. Gamma analysis of the EPID, film, and Pinnacle planar dose maps generated for the clinical IMRT fields showed that approximately 97% of all points passed using a 3% dose/3 mm DTA tolerance test. Based on the range of fields studied, the authors' results appear to justify using this approach as a method to verify dose distributions calculated on a treatment planning system, including complex intensity modulated fields. PMID:19378759
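The 3%/3 mm gamma test used for the comparison can be sketched in simplified 1-D form. Real gamma analysis is 2-D and interpolates between sample points; this sketch uses global normalization and discrete points only:

```python
import numpy as np

def gamma_index_1d(ref_dose, eval_dose, positions, dose_tol=0.03, dist_tol=3.0):
    """Simplified 1-D gamma analysis (default 3% global dose / 3 mm DTA).

    For each reference point, minimize the combined dose-difference /
    distance metric over all evaluated points; gamma <= 1 passes.
    positions are in mm; dose_tol is a fraction of the maximum dose.
    """
    d_norm = dose_tol * ref_dose.max()
    gammas = np.empty(len(ref_dose), dtype=float)
    for i, (r, x) in enumerate(zip(ref_dose, positions)):
        dd = (eval_dose - r) / d_norm          # normalized dose differences
        dx = (positions - x) / dist_tol        # normalized distances
        gammas[i] = np.sqrt(dd**2 + dx**2).min()
    return gammas
```

The pass rate reported in the abstract corresponds to `np.mean(gamma_index_1d(...) <= 1.0)` evaluated over the planar dose map.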

  3. An assessment of a collaborative mapping approach for exploring land use patterns for several European metropolises

    NASA Astrophysics Data System (ADS)

    Jokar Arsanjani, Jamal; Vaz, Eric

    2015-03-01

Until recently, land surveys and digital interpretation of remotely sensed imagery have been used to generate land use inventories. These techniques, however, are often cumbersome and costly, incurring large technical and temporal costs. The technological advances of web 2.0 have brought a wide array of technological achievements, stimulating the participatory role in collaborative and crowdsourced mapping products. This has been fostered by GPS-enabled devices, and accessible tools that enable visual interpretation of high resolution satellite images/air photos provided in collaborative mapping projects. Such technologies offer an integrative approach to geography by means of promoting public participation and allowing accurate assessment and classification of land use as well as geographical features. OpenStreetMap (OSM) has supported the evolution of such techniques, contributing to the existence of a large inventory of spatial land use information. This paper explores the introduction of this novel participatory phenomenon for land use classification in Europe's metropolitan regions. We adopt a positivistic approach to assess comparatively the accuracy of these contributions of OSM for land use classifications in seven large European metropolitan regions. Thematic accuracy and degree of completeness of OSM data was compared to available Global Monitoring for Environment and Security Urban Atlas (GMESUA) datasets for the chosen metropolises. We further extend our findings of land use within a novel framework for geography, justifying that volunteered geographic information (VGI) sources are of great benefit for land use mapping depending on location and degree of VGI dynamism and offer a great alternative to traditional mapping techniques for metropolitan regions throughout Europe. Evaluation of several land use types at the local level suggests that a number of OSM classes (such as anthropogenic land use, agricultural and some natural environment

  4. Mapping Ds insertions in barley using a sequence-based approach.

    PubMed

    Cooper, L D; Marquez-Cedillo, L; Singh, J; Sturbaum, A K; Zhang, S; Edwards, V; Johnson, K; Kleinhofs, A; Rangel, S; Carollo, V; Bregitzer, P; Lemaux, P G; Hayes, P M

    2004-09-01

    A transposon tagging system, based upon maize Ac/Ds elements, was developed in barley (Hordeum vulgaresubsp. vulgare). The long-term objective of this project is to identify a set of lines with Ds insertions dispersed throughout the genome as a comprehensive tool for gene discovery and reverse genetics. AcTPase and Ds-bar elements were introduced into immature embryos of Golden Promise by biolistic transformation. Subsequent transposition and segregation of Ds away from AcTPase and the original site of integration resulted in new lines, each containing a stabilized Ds element in a new location. The sequence of the genomic DNA flanking the Ds elements was obtained by inverse PCR and TAIL-PCR. Using a sequence-based mapping strategy, we determined the genome locations of the Ds insertions in 19 independent lines using primarily restriction digest-based assays of PCR-amplified single nucleotide polymorphisms and PCR-based assays of insertions or deletions. The principal strategy was to identify and map sequence polymorphisms in the regions corresponding to the flanking DNA using the Oregon Wolfe Barley mapping population. The mapping results obtained by the sequence-based approach were confirmed by RFLP analyses in four of the lines. In addition, cloned DNA sequences corresponding to the flanking DNA were used to assign map locations to Morex-derived genomic BAC library inserts, thus integrating genetic and physical maps of barley. BLAST search results indicate that the majority of the transposed Ds elements are found within predicted or known coding sequences. Transposon tagging in barley using Ac/Ds thus promises to provide a useful tool for studies on the functional genomics of the Triticeae. PMID:15449176

  5. A geomorphological mapping approach for the assessment of seabed geohazards and risk

    NASA Astrophysics Data System (ADS)

    Hough, Gayle; Green, Jennifer; Fish, Paul; Mills, Andy; Moore, Roger

    2011-03-01

Exploration and development of offshore hydrocarbon resources has advanced into remote deepwater regions over the last decade and poses significant technical challenges for the design and installation of wells and facilities at extreme water depths. Seafloor and shallow subsurface processes and conditions in these areas are complex and generally poorly understood, and the geohazards to development are larger scale and fundamentally different to those encountered onshore; consequently the geohazard risk to deepwater development projects is potentially significant and requires careful evaluation and mitigation during the front-end planning and engineering design stages of projects. There are no established industry standards or methods for the assessment of geohazards and engineering-quality geophysical data at the scale of development. The paper describes an integrated and systematic map-based approach for the assessment and mitigation of seabed geohazards and risk to proposed deepwater development. The approach employs a multi-disciplinary team working with engineering-quality field calibrated data to accurately map and assess seafloor ground conditions and ensure that development proposals are not exposed to intolerable geohazard risk. The approach taken is very similar to the practice of establishing geological models for land-based engineering projects, in which the complete geological history of the site is used to characterise and predict the performance of the ground. Such an approach is routine for major projects on land but so far does not seem to be common practice in the offshore industry. The paper illustrates the seafloor geomorphological mapping approach developed. The products are being used to optimise development layouts to avoid geohazards where possible and to support site-specific engineering design of facilities based on a detailed understanding of the potential geohazard loadings and associated risk.

  6. Simulating spin-boson models with matrix product states

    NASA Astrophysics Data System (ADS)

    Wall, Michael; Safavi-Naini, Arghavan; Rey, Ana Maria

    2016-05-01

    The global coupling of few-level quantum systems ("spins") to a discrete set of bosonic modes is a key ingredient for many applications in quantum science, including large-scale entanglement generation, quantum simulation of the dynamics of long-range interacting spin models, and hybrid platforms for force and spin sensing. In many situations, the bosons are integrated out, leading to effective long-range interactions between the spins; however, strong spin-boson coupling invalidates this approach, and spin-boson entanglement degrades the fidelity of quantum simulation of spin models. We present a general numerical method for treating the out-of-equilibrium dynamics of spin-boson systems based on matrix product states. While most efficient for weak coupling or small numbers of boson modes, our method applies for any spatial and operator dependence of the spin-boson coupling. In addition, our approach allows straightforward computation of many quantities of interest, such as the full counting statistics of collective spin measurements and quantum simulation infidelity due to spin-boson entanglement. We apply our method to ongoing trapped ion quantum simulator experiments in analytically intractable regimes. This work is supported by JILA-NSF-PFC-1125844, NSF-PIF- 1211914, ARO, AFOSR, AFOSR-MURI, and the NRC.
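
    The paper's method is MPS-based; as a much simpler point of reference, the kind of spin-boson Hamiltonian it targets can be diagonalized by brute force for a single spin coupled to one truncated boson mode. The sketch below assumes a generic Hamiltonian H = ω a†a + (Δ/2)σx + g σz(a + a†) with hypothetical parameter values; it is not the authors' algorithm.

```python
import numpy as np

def spin_boson_hamiltonian(omega=1.0, delta=0.5, g=0.2, nmax=10):
    """H = omega a†a + (delta/2) sigma_x + g sigma_z (a + a†),
    with the boson Hilbert space truncated at nmax levels."""
    a = np.diag(np.sqrt(np.arange(1, nmax)), k=1)   # boson annihilation operator
    n = a.T @ a                                     # number operator
    sx = np.array([[0, 1], [1, 0]], dtype=float)
    sz = np.array([[1, 0], [0, -1]], dtype=float)
    I2, Ib = np.eye(2), np.eye(nmax)
    return (omega * np.kron(I2, n)
            + 0.5 * delta * np.kron(sx, Ib)
            + g * np.kron(sz, a + a.T))

H = spin_boson_hamiltonian()
E0 = np.linalg.eigvalsh(H)[0]   # ground-state energy of the coupled system
```

For many modes or spins this exact approach scales exponentially, which is exactly the regime where the MPS method of the abstract becomes necessary.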

  7. Progress in landslide susceptibility mapping over Europe using Tier-based approaches

    NASA Astrophysics Data System (ADS)

    Günther, Andreas; Hervás, Javier; Reichenbach, Paola; Malet, Jean-Philippe

    2010-05-01

    The European Thematic Strategy for Soil Protection aims, among other objectives, to ensure a sustainable use of soil. The legal instrument of the strategy, the proposed Framework Directive, suggests identifying priority areas of several soil threats including landslides using a coherent and compatible approach based on the use of common thematic data. In a first stage, this can be achieved through landslide susceptibility mapping using geographically nested, multi-step tiered approaches, where areas identified as of high susceptibility by a first, synoptic-scale Tier ("Tier 1") can then be further assessed and mapped at larger scale by successive Tiers. In order to identify areas prone to landslides at European scale ("Tier 1"), a number of thematic terrain and environmental data sets already available for the whole of Europe can be used as input for a continental scale susceptibility model. However, since no coherent landslide inventory data is available at the moment over the whole continent, qualitative heuristic zonation approaches are proposed. For "Tier 1" a preliminary, simplified model has been developed. It consists of an equally weighted combination of a reduced, continent-wide common dataset of landslide conditioning factors including soil parent material, slope angle and land cover, to derive a landslide susceptibility index using raster mapping units consisting of 1 x 1 km pixels. A preliminary European-wide susceptibility map has thus been produced at 1:1 Million scale, since this is compatible with that of the datasets used. The map has been validated by means of a ratio of effectiveness using samples from landslide inventories in Italy, Austria, Hungary and the United Kingdom. Although not differentiated for specific geomorphological environments or specific landslide types, the experimental model reveals a relatively good performance in many European regions at a 1:1 Million scale.
An additional "Tier 1" susceptibility map at the same scale and using
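
    The equally weighted heuristic combination described above can be sketched as a raster overlay. The class scores below are hypothetical illustrations, not the project's actual reclassification scheme:

```python
import numpy as np

# Hypothetical reclassified factor rasters on a common 1 x 1 km grid:
# 0 = low, 1 = medium, 2 = high contribution to landslide susceptibility.
parent_material = np.array([[0, 1], [2, 2]])
slope_angle     = np.array([[0, 2], [1, 2]])
land_cover      = np.array([[1, 1], [0, 2]])

# Equal-weight combination into a susceptibility index, rescaled to [0, 1].
layers = np.stack([parent_material, slope_angle, land_cover])
index = layers.mean(axis=0) / 2.0   # divide by the maximum class score
```

Each pixel's index is simply the mean of its factor scores, so the cell where all three factors are "high" receives the maximum index of 1.0.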

  8. Quantum contextuality in N-boson systems

    SciTech Connect

    Benatti, Fabio; Floreanini, Roberto; Genovese, Marco; Olivares, Stefano

    2011-09-15

    Quantum contextuality in systems of identical bosonic particles is explicitly exhibited via the maximum violation of a suitable inequality of Clauser-Horne-Shimony-Holt type. Unlike the approaches considered so far, which make use of single-particle observables, our analysis involves collective observables constructed using multiboson operators. An exemplifying scheme to test this violation with a quantum optical setup is also discussed.
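
    The collective multiboson observables of the paper are not reproduced here, but the structure of a maximal Clauser-Horne-Shimony-Holt (CHSH) violation can be illustrated with the standard two-qubit singlet computation, which saturates the Tsirelson bound 2√2:

```python
import numpy as np

def pauli(nx, nz):
    """Spin observable n·sigma for a measurement direction in the x-z plane."""
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    return nx * sx + nz * sz

def chsh(a, a2, b, b2, psi):
    """CHSH combination E(a,b) + E(a,b') + E(a',b) - E(a',b')."""
    def E(A, B):
        return np.real(psi.conj() @ np.kron(A, B) @ psi)
    return E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)

# Singlet state (|01> - |10>)/sqrt(2) and the standard optimal angles.
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
A, A2 = pauli(0, 1), pauli(1, 0)                      # Z and X
B  = pauli(-1 / np.sqrt(2), -1 / np.sqrt(2))          # -(Z + X)/sqrt(2)
B2 = pauli(1 / np.sqrt(2), -1 / np.sqrt(2))           # (X - Z)/sqrt(2)

S = chsh(A, A2, B, B2, singlet)   # equals 2*sqrt(2), above the classical bound 2
```

Any local non-contextual model obeys |S| ≤ 2, so S = 2√2 exhibits the quantum violation; the paper's contribution is obtaining an analogous violation with collective observables on identical bosons.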

  9. Higgs boson photoproduction at the LHC

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2011-07-15

    We present the current development of the photoproduction approach for the Higgs boson with its application to pp and pA collisions at the LHC. We perform a different analysis for the Gap Survival Probability, where we consider a probability of 3% and also a more optimistic value of 10% based on the HERA data for dijet production. As a result, the cross section for the exclusive Higgs boson production is about 2 fb and 6 fb in pp collisions and 617 and 2056 fb for pPb collisions, considering the gap survival factor of 3% and 10%, respectively.

  10. Geologic Map of the Olympia Cavi Region of Mars (MTM 85200): A Summary of Tactical Approaches

    NASA Technical Reports Server (NTRS)

    Skinner, J. A., Jr.; Herkenhoff, K.

    2010-01-01

    The 1:500K-scale geologic map of MTM 85200 - the Olympia Cavi region of Mars - has been submitted for peer review [1]. Physiographically, the quadrangle includes portions of Olympia Rupes, a set of sinuous scarps which elevate Planum Boreum 800 meters above Olympia Planum. The region includes the high-standing, spiral troughs of Boreales Scopuli, the rugged and deep depressions of Olympia Cavi, and the vast dune fields of Olympia Undae. Geologically, the mapped units and landforms reflect the recent history of repeated accumulation and degradation. The widespread occurrence of both weakly and strongly stratified units implicates the drape-like accumulation of ice, dust, and sand through climatic variations. Similarly, the occurrence of layer truncations, particularly at unit boundaries, implicates punctuated periods of both localized and regional erosion and surface deflation whereby underlying units were exhumed and their material transported and re-deposited. Herein, we focus on the iterative mapping approaches that allowed not only the accommodation of the burgeoning variety and volume of data sets, but also facilitated the efficient presentation of map information. Unit characteristics and their geologic history are detailed in past abstracts [2-3].

  11. A branch-and-cut approach to physical mapping with end-probes

    SciTech Connect

    Christof, T.; Reinelt, G.; Juenger, M.

    1997-12-01

    A fundamental problem in computational biology is the construction of physical maps of chromosomes from hybridization experiments between unique probes and clones of chromosome fragments in the presence of error. Alizadeh, Karp, Weisser and Zweig [AKWZ94] first considered a maximum-likelihood model of the problem that is equivalent to finding an ordering of the probes that minimizes a weighted sum of errors, and developed several effective heuristics. We show that by exploiting information about the end-probes of clones, this model can be formulated as a weighted Betweenness Problem. This affords the significant advantage of allowing the well-developed tools of integer linear-programming and branch-and-cut algorithms to be brought to bear on physical mapping, enabling us for the first time to solve small mapping instances to optimality even in the presence of high error. We also show that by combining the optimal solution of many small overlapping Betweenness Problems, one can effectively screen errors from larger instances, and solve the edited instance to optimality as a Hamming-Distance Traveling Salesman Problem. This suggests a new combined approach to physical map construction. 18 refs., 13 figs.
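
    The branch-and-cut machinery of the paper is not reproduced here, but the weighted Betweenness objective it optimizes can be sketched by brute force for a tiny hypothetical instance (probe names, triples, and weights below are invented for illustration):

```python
from itertools import permutations

def betweenness_cost(order, triples):
    """Sum the weights of violated betweenness constraints: triple (j, i, k)
    with weight w asks that probe i lie between probes j and k."""
    pos = {p: i for i, p in enumerate(order)}
    cost = 0.0
    for (j, i, k), w in triples.items():
        if not (pos[j] < pos[i] < pos[k] or pos[k] < pos[i] < pos[j]):
            cost += w
    return cost

def best_order(probes, triples):
    """Exhaustive search over probe orderings; only feasible for tiny instances."""
    return min(permutations(probes), key=lambda o: betweenness_cost(o, triples))

# Hypothetical clones: each end-probe pair (j, k) should bracket probe i.
triples = {("a", "b", "c"): 2.0, ("b", "c", "d"): 1.0, ("a", "c", "d"): 1.5}
order = best_order("abcd", triples)
```

The integer-programming formulation in the paper replaces this factorial search with linear relaxations and cutting planes, which is what makes optimal solutions tractable in the presence of high error rates.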

  12. Oxidative status interactome map: towards novel approaches in experiment planning, data analysis, diagnostics and therapy.

    PubMed

    Zolotukhin, Peter; Kozlova, Yulia; Dovzhik, Anastasiya; Kovalenko, Konstantin; Kutsyn, Kseniya; Aleksandrova, Anzhela; Shkurat, Tatyana

    2013-08-01

    Experimental evidence suggests an immense variety of processes associated with and aimed at producing reactive oxygen and/or nitrogen species. Clinical studies implicate an enormous range of pathologies associated with reactive oxygen/nitrogen species metabolism deregulation, particularly oxidative stress. Recent advances in biochemistry, proteomics and molecular biology/biophysics of cells suggest oxidative stress to be an endpoint of complex dysregulation events of conjugated pathways consolidated under the term, proposed here, "oxidative status". The oxidative status concept, in order to allow for novel diagnostic and therapeutic approaches, requires elaboration of a new logic system comprehending all the features, versatility and complexity of cellular pro- and antioxidative components of different nature. We have developed a curated and regularly updated interactive interactome map of human cellular-level oxidative status allowing for systematization of the related most up-to-date experimental data. A total of more than 600 papers were selected for the initial creation of the map. The map comprises more than 300 individual factors with respective interactions, all subdivided hierarchically for logical analysis purposes. The pilot application of the interactome map suggested several points for further development of oxidative status-based technologies. PMID:23698602

  13. A reciprocal space approach for locating symmetry elements in Patterson superposition maps

    SciTech Connect

    Hendrixson, T.

    1990-09-21

    A method for determining the location and possible existence of symmetry elements in Patterson superposition maps has been developed. A comparison of the original superposition map and a superposition map operated on by the symmetry element gives possible translations to the location of the symmetry element. A reciprocal space approach using structure factor-like quantities obtained from the Fourier transform of the superposition function is then used to determine the "best" location of the symmetry element. Constraints based upon the space group requirements are also used as a check on the locations. The locations of the symmetry elements are used to modify the Fourier transform coefficients of the superposition function to give an approximation of the structure factors, which are then refined using the EG relation. The analysis of several compounds using this method is presented. Reciprocal space techniques for locating multiple images in the superposition function are also presented, along with methods to remove the effect of multiple images in the Fourier transform coefficients of the superposition map. In addition, crystallographic studies of the extended chain structure of (NHC{sub 5}H{sub 5})SbI{sub 4} and of the twinning method of the orthorhombic form of the high-T{sub c} superconductor YBa{sub 2}Cu{sub 3}O{sub 7-x} are presented. 54 refs.

  14. An optimization approach for mapping and measuring the divergence and correspondence between paths.

    PubMed

    Mueller, Shane T; Perelman, Brandon S; Veinott, Elizabeth S

    2016-03-01

    Many domains of empirical research produce or analyze spatial paths as a measure of behavior. Previously, approaches for measuring the similarity or deviation between two paths have either required timing information or have used ad hoc or manual coding schemes. In this paper, we describe an optimization approach for robustly measuring the area-based deviation between two paths, which we call ALCAMP (Algorithm for finding the Least-Cost Areal Mapping between Paths). ALCAMP measures the deviation between two paths and produces a mapping between corresponding points on the two paths. The method is robust to a number of aspects in real path data, such as crossovers, self-intersections, differences in path segmentation, and partial or incomplete paths. Unlike similar algorithms that produce distance metrics between trajectories (i.e., paths that include timing information), this algorithm uses only the order of observed path segments to determine the mapping. We describe the algorithm and show its results on a number of sample problems and data sets, and demonstrate its effectiveness for assessing human memory for paths. We also describe available software code written in the R statistical computing language that implements the algorithm to enable data analysis. PMID:25737420
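
    ALCAMP itself handles crossovers, self-intersections, and partial paths via an optimized point mapping. For the much simpler special case of two non-crossing paths that share endpoints, the area-based deviation reduces to a shoelace computation over the polygon the two paths enclose, which this sketch illustrates:

```python
def area_between(path1, path2):
    """Shoelace area of the polygon bounded by path1 and reversed path2.
    Only valid when the paths share endpoints and do not cross; the general
    case is what ALCAMP's least-cost mapping is needed for."""
    ring = list(path1) + list(reversed(path2))
    s = 0.0
    for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# A straight path vs. a detour around the unit square: deviation area 1.
p1 = [(0, 0), (1, 0)]
p2 = [(0, 0), (0, 1), (1, 1), (1, 0)]
print(area_between(p1, p2))  # → 1.0
```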

  15. Mapping Variable Width Riparian Zones Utilizing Open Source Data: A Robust New Approach

    NASA Astrophysics Data System (ADS)

    Abood, S. A.; Maclean, A.

    2013-12-01

    Riparian buffers are dynamic, transitional ecosystems between aquatic and terrestrial ecosystems with well-defined vegetation and soil characteristics. Previous approaches to riparian buffer delineation have primarily utilized fixed width buffers. However, these methodologies only take the watercourse into consideration and ignore critical geomorphology, associated vegetation and soil characteristics. Utilizing spatial data readily available from government agencies and geospatial clearinghouses, such as DEMs and the National Hydrography Dataset, the Riparian Buffer Delineation Model (RBDM) offers advantages by harnessing the geospatial modeling capabilities of ArcMap GIS, incorporating a statistically valid sampling technique along the watercourse to accurately map the critical 50-year plain, and delineating a variable width riparian buffer. Options within the model allow incorporation of National Wetlands Inventory (NWI), Soil Survey Data (SSURGO), National Land Cover Data (NLCD) and/or Cropland Data Layer (CDL) to improve the accuracy and utility of the riparian buffers attributes. This approach recognizes the dynamic and transitional natures of riparian buffers by accounting for hydrologic, geomorphic and vegetation data as inputs into the delineation process. By allowing the incorporation of land cover data, decision makers acquire a useful tool to assist in managing riparian buffers. The model is formatted as an ArcMap toolbox for easy installation but does require a Spatial Analyst license. [Figure: variable width riparian buffer utilizing 50-year flood height and 10 m DEM; RBDM inputs.]

  16. A new multicriteria risk mapping approach based on a multiattribute frontier concept.

    PubMed

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Downing, Marla; Sapio, Frank; Siltanen, Marty

    2013-09-01

    Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that typically require prior knowledge of the risk components' importance. Such information is often nonexistent for many invasive pests. This study proposes a new approach for building integrated risk maps using the principle of a multiattribute efficient frontier and analyzing the partial order of elements of a risk map as distributed in multidimensional criteria space. The integrated risks are estimated as subsequent multiattribute frontiers in dimensions of individual risk criteria. We demonstrate the approach with the example of Agrilus biguttatus Fabricius, a high-risk pest that may threaten North American oak forests in the near future. Drawing on U.S. and Canadian data, we compare the performance of the multiattribute ranking against a multicriteria linear weighted averaging technique in the presence of uncertainties, using the concept of robustness from info-gap decision theory. The results show major geographic hotspots where the consideration of tradeoffs between multiple risk components changes integrated risk rankings. Both methods delineate similar geographical regions of high and low risks. Overall, aggregation based on a delineation of multiattribute efficient frontiers can be a useful tool to prioritize risks for anticipated invasive pests, which usually have an extremely poor prior knowledge base. PMID:23339716
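
    The "subsequent multiattribute frontiers" idea ranks map cells by peeling successive Pareto-efficient sets in the space of risk criteria. The sketch below is a generic frontier-peeling illustration with invented cell scores, not the authors' implementation:

```python
def dominates(u, v):
    """u dominates v if u is >= v in every risk criterion and > in at least one."""
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

def frontier_ranks(points):
    """Peel successive multiattribute frontiers; rank 1 = highest integrated risk."""
    remaining = dict(enumerate(points))
    ranks, rank = {}, 1
    while remaining:
        front = [i for i, p in remaining.items()
                 if not any(dominates(q, p) for j, q in remaining.items() if j != i)]
        for i in front:
            ranks[i] = rank
            del remaining[i]
        rank += 1
    return ranks

# Hypothetical map cells scored on (climatic suitability, host abundance).
cells = [(0.9, 0.8), (0.5, 0.9), (0.4, 0.4), (0.1, 0.2)]
ranks = frontier_ranks(cells)
```

Note that the first two cells are mutually non-dominated and share rank 1 even though neither is best on both criteria; this is exactly the tradeoff behavior that distinguishes frontier ranking from weighted averaging.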

  17. Approximate gauge symmetry of composite vector bosons

    NASA Astrophysics Data System (ADS)

    Suzuki, Mahiko

    2010-08-01

    It can be shown in a solvable field theory model that the couplings of the composite vector bosons made of a fermion pair approach the gauge couplings in the limit of strong binding. Although this phenomenon may appear accidental and special to the vector bosons made of a fermion pair, we extend it to the case in which the constituents themselves are bosons and find that the same phenomenon occurs in a more intriguing way. The functional formalism not only facilitates computation but also provides us with a better insight into the generating mechanism of approximate gauge symmetry, in particular, how the strong binding and global current conservation conspire to generate such an approximate symmetry. Remarks are made on its possible relevance or irrelevance to electroweak and higher symmetries.

  18. Improved effective vector boson approximation revisited

    NASA Astrophysics Data System (ADS)

    Bernreuther, Werner; Chen, Long

    2016-03-01

    We reexamine the improved effective vector boson approximation which is based on two-vector-boson luminosities L_pol for the computation of weak gauge-boson hard scattering subprocesses V1 V2 → W in high-energy hadron-hadron or e⁻e⁺ collisions. We calculate these luminosities for the nine combinations of the transverse and longitudinal polarizations of V1 and V2 in the unitary and axial gauge. For these two gauge choices the quality of this approach is investigated for the reactions e⁻e⁺ → W⁻W⁺ νe ν̄e and e⁻e⁺ → t t̄ νe ν̄e using appropriate phase-space cuts.
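
    For orientation, the textbook leading-log effective-W distributions (the approximation whose quality the paper scrutinizes) and a two-boson luminosity built from them can be sketched as follows. The coupling prefactor and logarithm are left as symbolic inputs, since their precise values depend on the process and scale choice:

```python
import math

def f_T(x, ln_factor, coup=1.0):
    """Leading-log transverse vector-boson distribution in a fermion
    (textbook effective-W form); `coup` collects the electroweak couplings."""
    return coup / (8 * math.pi ** 2) * (1 + (1 - x) ** 2) / x * ln_factor

def f_L(x, coup=1.0):
    """Longitudinal distribution; no collinear logarithm at leading order."""
    return coup / (4 * math.pi ** 2) * (1 - x) / x

def luminosity(tau, f1, f2, n=2000):
    """Two-boson luminosity L(tau) = ∫_tau^1 dx/x f1(x) f2(tau/x), midpoint rule."""
    total, h = 0.0, (1.0 - tau) / n
    for i in range(n):
        x = tau + (i + 0.5) * h
        total += f1(x) * f2(tau / x) / x * h
    return total

# Transverse-transverse luminosity at a (hypothetical) log factor of 5.
LTT = luminosity(0.01, lambda x: f_T(x, 5.0), lambda x: f_T(x, 5.0))
```

The paper computes exact polarized luminosities in two gauges precisely because such leading-log expressions can be a poor approximation in realistic kinematic regions.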

  19. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state are assessed. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, but not when or within what time intervals MCE may occur. MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG) which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example, dams and nuclear power plants, continued to challenge the map(s). 
The second edition map was completed in 1985 incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published

  20. Scaling of noise correlations in one-dimensional lattice hard-core boson systems

    NASA Astrophysics Data System (ADS)

    He, Kai; Rigol, Marcos

    2011-03-01

    Noise correlations are studied for systems of hard-core bosons in one-dimensional lattices. We use an exact numerical approach based on the Bose-Fermi mapping and properties of Slater determinants. We focus on the scaling of the noise correlations with system size in superfluid and insulating phases, which are generated in the homogeneous lattice, with period-two superlattices, and with uniformly distributed random diagonal disorder. For the superfluid phases, the leading contribution is shown to exhibit a density independent scaling proportional to the system size, while the first subleading term exhibits a density dependent power-law exponent.
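
    The Bose-Fermi (Jordan-Wigner) mapping underlying the paper's exact approach means that energies and densities of 1D hard-core bosons coincide with those of free spinless fermions. A minimal sketch of that correspondence, computing the ground-state energy on an open chain (the paper's actual observable, noise correlations, requires the Slater-determinant machinery and is not reproduced here):

```python
import numpy as np

def hcb_ground_energy(L, N, t=1.0):
    """Ground-state energy of N hard-core bosons hopping on an open L-site chain.
    Via the Jordan-Wigner mapping this equals the free-fermion result: fill the
    N lowest single-particle modes of the tridiagonal hopping matrix."""
    hop = -t * (np.eye(L, k=1) + np.eye(L, k=-1))
    eps = np.linalg.eigvalsh(hop)   # eps_k = -2 t cos(k pi / (L + 1))
    return eps[:N].sum()

E = hcb_ground_energy(L=10, N=5)
```

The single-particle energies agree with the analytic open-chain dispersion −2t cos(kπ/(L+1)), which provides a direct check on the numerics.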

  2. A direct approach to generalised multiple mapping conditioning for selected turbulent diffusion flame cases

    NASA Astrophysics Data System (ADS)

    Sundaram, Brruntha; Klimenko, Alexander Yuri; Cleary, Matthew John; Ge, Yipeng

    2016-07-01

    This work presents a direct and transparent interpretation of two concepts for modelling turbulent combustion: generalised Multiple Mapping Conditioning (MMC) and sparse-Lagrangian Large Eddy Simulation (LES). The MMC approach is presented as a hybrid between the Probability Density Function (PDF) method and approaches based on conditioning (e.g. Conditional Moment Closure, flamelet, etc.). The sparse-Lagrangian approach, which allows for a dramatic reduction of computational cost, is viewed as an alternative interpretation of the Filtered Density Function (FDF) methods. This work presents simulations of several turbulent diffusion flame cases and discusses the universality of the localness parameter between these cases and the universality of sparse-Lagrangian FDF methods with MMC.

  3. A sib-pair approach to interval mapping of quantitative trait loci

    SciTech Connect

    Fulker, D.W.; Cardon, L.R.

    1994-06-01

    An interval mapping procedure based on the sib-pair method of Haseman and Elston is developed, and simulation studies are carried out to explore its properties. The procedure is analogous to other interval mapping procedures used with experimental material, such as plants and animals, and yields very similar results in terms of the location and effect size of a quantitative trait locus (QTL). The procedure offers an advantage over the conventional Haseman and Elston approach in terms of power, and provides useful information concerning the location of a QTL. Because of its simplicity, the method readily lends itself to the analysis of selected samples for increased power and the evaluation of multilocus models of complex phenotypes. 26 refs., 4 figs., 5 tabs.
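
    The underlying Haseman-Elston idea can be sketched in a few lines: regress squared sib-pair trait differences on the estimated proportion of alleles shared identical by descent (IBD) at a marker; a negative slope indicates linkage. The data below are invented for illustration, and this sketch omits the interval-mapping extension developed in the paper:

```python
import numpy as np

def haseman_elston(pi_hat, sq_diff):
    """Regress squared sib-pair trait differences on marker IBD sharing.
    A significantly negative slope indicates linkage to a QTL."""
    X = np.column_stack([np.ones_like(pi_hat), pi_hat])
    beta, *_ = np.linalg.lstsq(X, sq_diff, rcond=None)
    return beta  # [intercept, slope]

# Hypothetical sib pairs: squared differences shrink as IBD sharing rises.
pi_hat  = np.array([0.0, 0.0, 0.5, 0.5, 1.0, 1.0])
sq_diff = np.array([4.1, 3.9, 2.2, 1.8, 0.6, 0.4])
intercept, slope = haseman_elston(pi_hat, sq_diff)
```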

  4. A GIS based method for soil mapping in Sardinia, Italy: a geomatic approach.

    PubMed

    Vacca, A; Loddo, S; Melis, M T; Funedda, A; Puddu, R; Verona, M; Fanni, S; Fantola, F; Madrau, S; Marrone, V A; Serra, G; Tore, C; Manca, D; Pasci, S; Puddu, M R; Schirru, P

    2014-06-01

    A new project was recently initiated for the realization of the "Land Unit and Soil Capability Map of Sardinia" at a scale of 1:50,000 to support land use planning. In this study, we outline the general structure of the project and the methods used in the activities that have been thus far conducted. A GIS approach was used. We used the soil-landscape paradigm for the prediction of soil classes and their spatial distribution or the prediction of soil properties based on landscape features. The work is divided into two main phases. In the first phase, the available digital data on land cover, geology and topography were processed and classified according to their influence on weathering processes and soil properties. The methods used in the interpretation are based on consolidated and generalized knowledge about the influence of geology, topography and land cover on soil properties. The existing soil data (areal and point data) were collected, reviewed, validated and standardized according to international and national guidelines. Point data considered to be usable were input into a specific database created for the project. Using expert interpretation, all digital data were merged to produce a first draft of the Land Unit Map. During the second phase, this map will be implemented with the existing soil data and verified in the field, with new soil data collected where needed, and the final Land Unit Map will be produced. The Land Unit and Soil Capability Map will be produced by classifying the land units using a reference matching table of land capability classes created for this project. PMID:24315681

  5. Evaluation of various modelling approaches in flood routing simulation and flood area mapping

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Loukas, Athanasios; Vasiliades, Lampros; Aronica, Giuseppe

    2016-04-01

    An essential process of flood hazard analysis and mapping is floodplain modelling. The selection of the modelling approach, especially in complex riverine topographies such as urban and suburban areas and ungauged watersheds, may affect the accuracy of the outcomes in terms of flood depths and flood inundation area. In this study, a sensitivity analysis was implemented using several hydraulic-hydrodynamic modelling approaches (1D, 2D, 1D/2D), and the effect of the modelling approach on flood modelling and flood mapping was investigated. The digital terrain model (DTM) used in this study was generated from Terrestrial Laser Scanning (TLS) point cloud data. The modelling approaches included 1-dimensional hydraulic-hydrodynamic models (1D), 2-dimensional hydraulic-hydrodynamic models (2D) and the coupled 1D/2D. The 1D hydraulic-hydrodynamic models used were: HECRAS, MIKE11, LISFLOOD, XPSTORM. The 2D hydraulic-hydrodynamic models used were: MIKE21, MIKE21FM, HECRAS (2D), XPSTORM, LISFLOOD and FLO2d. The coupled 1D/2D models employed were: HECRAS(1D/2D), MIKE11/MIKE21(MIKE FLOOD platform), MIKE11/MIKE21 FM(MIKE FLOOD platform), XPSTORM(1D/2D). Validation of the flood extent was achieved with the use of 2x2 contingency tables between the simulated and observed flooded area for an extreme historical flash flood event. The skill score Critical Success Index was used in the validation process. The modelling approaches have also been evaluated for simulation time and required computing power. The methodology has been implemented in a suburban ungauged watershed of Xerias river at Volos-Greece. The results of the analysis indicate the necessity of sensitivity analysis application with the use of different hydraulic-hydrodynamic modelling approaches especially for areas with complex terrain.
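
    The Critical Success Index used for validation is a standard skill score derived from the 2x2 contingency table of simulated versus observed flooded cells; the counts below are hypothetical:

```python
def critical_success_index(hits, false_alarms, misses):
    """CSI = hits / (hits + false alarms + misses), from a 2x2 contingency
    table of simulated vs. observed flooded cells (correct negatives ignored)."""
    return hits / (hits + false_alarms + misses)

# Hypothetical cell counts for one model run against the observed flood extent.
csi = critical_success_index(hits=30, false_alarms=10, misses=10)
print(csi)  # → 0.6
```

CSI ranges from 0 (no skill) to 1 (perfect overlap), and unlike simple accuracy it is not inflated by the typically large number of correctly predicted dry cells.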

  6. A Probabilistic Approach to Receptive Field Mapping in the Frontal Eye Fields

    PubMed Central

    Mayo, J. Patrick; Morrison, Robert M.; Smith, Matthew A.

    2016-01-01

    Studies of the neuronal mechanisms of perisaccadic vision often lack the resolution needed to determine important changes in receptive field (RF) structure. Such limited analytical power can lead to inaccurate descriptions of visuomotor processing. To address this issue, we developed a precise, probabilistic technique that uses a generalized linear model (GLM) for mapping the visual RFs of frontal eye field (FEF) neurons during stable fixation (Mayo et al., 2015). We previously found that full-field RF maps could be obtained using 1–8 dot stimuli presented at frame rates of 10–150 ms. FEF responses were generally robust to changes in the number of stimuli presented or the rate of presentation, which allowed us to visualize RFs over a range of spatial and temporal resolutions. Here, we compare the quality of RFs obtained over different stimulus and GLM parameters to facilitate future work on the detailed mapping of FEF RFs. We first evaluate the interactions between the number of stimuli presented per trial, the total number of trials, and the quality of RF mapping. Next, we vary the spatial resolution of our approach to illustrate the tradeoff between visualizing RF sub-structure and sampling at high resolutions. We then evaluate local smoothing as a possible correction for situations where under-sampling occurs. Finally, we provide a preliminary demonstration of the usefulness of a probabilistic approach for visualizing full-field perisaccadic RF shifts. Our results present a powerful, and perhaps necessary, framework for studying perisaccadic vision that is applicable to FEF and possibly other visuomotor regions of the brain. PMID:27047352
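
    The core of such an approach is a Poisson GLM relating binary dot stimuli to spike counts, with the fitted weights read back out as the RF map. The sketch below is a generic Poisson GLM fit by iteratively reweighted least squares on synthetic data, not the authors' exact model or parameterization:

```python
import numpy as np

def poisson_glm_irls(X, y, iters=30, ridge=1e-3):
    """Fit spike counts y ~ Poisson(exp(X w)) by iteratively reweighted least
    squares; the fitted w maps back onto stimulus pixels as the RF estimate."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ w)
        z = X @ w + (y - mu) / np.maximum(mu, 1e-12)   # working response
        A = X.T @ (mu[:, None] * X) + ridge * np.eye(X.shape[1])
        w = np.linalg.solve(A, X.T @ (mu * z))
    return w

# Synthetic experiment: 16-pixel sparse dot stimuli, true RF at pixel 5.
rng = np.random.default_rng(0)
X = rng.binomial(1, 0.2, size=(2000, 16)).astype(float)
true_w = np.zeros(16); true_w[5] = 1.5
y = rng.poisson(np.exp(X @ true_w - 1.0))   # -1.0 is a baseline log-rate
w_hat = poisson_glm_irls(np.column_stack([np.ones(2000), X]), y)
# w_hat[0] estimates the baseline; the argmax of w_hat[1:] recovers the RF pixel
```

Varying the number of dots per frame and the frame rate in such simulations is one way to explore the sampling tradeoffs the abstract describes.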

  7. Bosonization of Weyl Fermions

    NASA Astrophysics Data System (ADS)

    Marino, Eduardo

    The electron, discovered by Thomson at the end of the nineteenth century, was the first experimentally observed particle. The Weyl fermion, though theoretically predicted long ago, was observed in a condensed matter environment only in an experiment reported a few weeks ago. Is there any thread connecting the first and the most recently observed fermionic (quasi)particles? The answer is positive. By generalizing the method known as bosonization, for the first time in its full form, to a spacetime with 3+1 dimensions, we are able to show that both electrons and Weyl fermions can be expressed in terms of the same boson field, namely the Kalb-Ramond antisymmetric tensor gauge field. The bosonized form of the Weyl chiral currents leads to the angle-dependent magneto-conductance behavior observed in these systems.

  8. Coulomb problem for vector bosons

    SciTech Connect

    Kuchiev, M.Yu.; Flambaum, V.V.

    2006-05-01

    The Coulomb problem for vector bosons W{sup {+-}} incorporates a well-known difficulty: the charge of the boson localized in a close vicinity of the attractive Coulomb center proves to be infinite. The paradox is shown to be resolved by the QED vacuum polarization, which brings in a strong effective repulsion that eradicates the infinite charge of the boson on the Coulomb center. This property allows one to define the Coulomb problem for vector bosons properly.

  9. Functional Connectivity-Based Parcellation of Amygdala Using Self-Organized Mapping: A Data Driven Approach

    PubMed Central

    Mishra, Arabinda; Rogers, Baxter P.; Chen, Li Min; Gore, John C.

    2013-01-01

    The overall goal of this work is to demonstrate how resting state functional magnetic resonance imaging (fMRI) signals may be used to objectively parcellate functionally heterogeneous subregions of the human amygdala into structures characterized by similar patterns of functional connectivity. We hypothesize that similarity of functional connectivity of subregions with other parts of the brain can be a potential basis to segment and cluster voxels using data driven approaches. In this work, self-organizing map (SOM) was implemented to cluster the connectivity maps associated with each voxel of the human amygdala, thereby defining distinct subregions. The functional separation was optimized by evaluating the overall differences in functional connectivity between the subregions at group level. Analysis of 25 resting state fMRI data sets suggests that SOM can successfully identify functionally independent nuclei based on differences in their inter subregional functional connectivity, evaluated statistically at various confidence levels. Although amygdala contains several nuclei whose distinct roles are implicated in various functions, our objective approach discerns at least two functionally distinct volumes comparable to previous parcellation results obtained using probabilistic tractography and cytoarchitectonic analysis. Association of these nuclei with various known functions and a quantitative evaluation of their differences in overall functional connectivity with lateral orbital frontal cortex and temporal pole confirms the functional diversity of amygdala. The data driven approach adopted here may be used as a powerful indicator of structure–function relationships in the amygdala and other functionally heterogeneous structures as well. PMID:23418140
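
    The clustering step can be illustrated with a minimal one-dimensional SOM: each unit holds a weight vector in "connectivity fingerprint" space, and the best-matching unit (BMU) and its neighbours are pulled toward each presented sample. This is a generic SOM sketch on synthetic data, not the authors' pipeline or parameters:

```python
import numpy as np

def train_som(data, grid=5, iters=200, lr0=0.5, sigma0=2.0, seed=0):
    """Train a 1-D self-organizing map: move the BMU and its neighbours toward
    each sample, with learning rate and neighbourhood width shrinking over time."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=(grid, data.shape[1]))
    for t in range(iters):
        x = data[rng.integers(len(data))]
        b = np.argmin(((w - x) ** 2).sum(axis=1))          # best-matching unit
        lr = lr0 * (1 - t / iters)
        sigma = sigma0 * (1 - t / iters) + 0.5
        h = np.exp(-((np.arange(grid) - b) ** 2) / (2 * sigma ** 2))
        w += lr * h[:, None] * (x - w)
    return w

def bmu(w, x):
    return int(np.argmin(((w - x) ** 2).sum(axis=1)))

# Two well-separated synthetic "connectivity fingerprints": voxels from the two
# clusters should map to different SOM units, defining two subregions.
rng = np.random.default_rng(1)
cluster_a = rng.normal(0.0, 0.1, size=(50, 4))
cluster_b = rng.normal(3.0, 0.1, size=(50, 4))
w = train_som(np.vstack([cluster_a, cluster_b]))
```

In the paper, the data vectors are whole-brain connectivity maps per amygdala voxel rather than 4-dimensional toy points, but the clustering logic is the same.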

  10. Toward a Materials Genome Approach for ionic liquids: synthesis guided by ab initio property maps.

    PubMed

    Yan, Fangyong; Lartey, Michael; Jariwala, Kuldeep; Bowser, Sage; Damodaran, Krishnan; Albenze, Erik; Luebke, David R; Nulwala, Hunaid B; Smit, Berend; Haranczyk, Maciej

    2014-11-26

    The Materials Genome Approach (MGA) aims to accelerate the development of new materials by incorporating computational and data-driven approaches to reduce the cost of identifying optimal structures for a given application. Here, we use the MGA to guide the synthesis of triazolium-based ionic liquids (ILs). Our approach involves an IL property-mapping tool, which merges combinatorial structure enumeration, descriptor-based structure representation and sampling, and property prediction using molecular simulations. The simulated properties such as density, diffusivity, and gas solubility obtained for a selected set of representative ILs were used to build neural network models and map properties for all enumerated species. Herein, a family of ILs based on ca. 200,000 triazolium-based cations paired with the bis(trifluoromethanesulfonyl)amide anion was investigated using our MGA. Fourteen representative ILs spanning the entire range of predicted properties were subsequently synthesized and then characterized, confirming the predicted density, diffusivity, and CO2 Henry's Law coefficient. Moreover, the property (CO2, CH4, and N2 solubility) trends associated with exchanging the bis(trifluoromethanesulfonyl)amide anion for one of 32 other anions were explored and quantified. PMID:25356930

  11. An internal state variable mapping approach for Li-Plating diagnosis

    NASA Astrophysics Data System (ADS)

    Bai, Guangxing; Wang, Pingfeng

    2016-08-01

    Li-ion battery failure has become one of the major challenges for reliable battery applications, as it can have catastrophic consequences. Compared with capacity fading resulting from calendar effects, Li-plating induced battery failures are more difficult to identify, as they cause sudden capacity loss, leaving limited time for failure diagnosis. This paper presents a new internal state variable (ISV) mapping approach to identify values of immeasurable battery ISVs, considering changes in the inherent parameters of battery system dynamics, for Li-plating diagnosis. Employing the developed ISV mapping approach, an explicit functional relationship model between measurable battery signals and immeasurable battery ISVs can be developed. The developed model can then be used to identify ISVs from an online battery system in order to detect the occurrence of Li-plating. Employing multiphysics-based simulation of Li-plating using COMSOL, the proposed diagnosis approach is implemented under different conditions in the case studies to demonstrate its efficacy in diagnosing Li-plating onset timings.

  12. Target fishing of glycopentalone using integrated inverse docking and reverse pharmacophore mapping approach.

    PubMed

    Gurung, A B; Ali, M A; Bhattacharjee, A; Al-Anazi, K M; Farah, M A; Al-Hemaid, F M; Abou-Tarboush, F M; Lee, J; Kim, S Y; Al-Anazi, F S M

    2016-01-01

    Glycopentalone, isolated from Glycosmis pentaphylla (family Rutaceae), has cytotoxic and apoptosis-inducing effects in various human cancer cell lines; however, its mode of action is not known. Therefore, target fishing using a combined approach of inverse docking and reverse pharmacophore mapping was performed to identify potential targets of glycopentalone and to gain insight into its binding modes against the selected molecular targets, viz., CDK-2, CDK-6, Topoisomerase I, Bcl-2, VEGFR-2, Telomere:G-quadruplex and Topoisomerase II. These targets were chosen based on their key roles in the progression of cancer via regulation of the cell cycle and DNA replication. Molecular docking analysis revealed that glycopentalone displayed binding energies ranging from -6.38 to -8.35 kcal/mol and inhibition constants ranging from 0.758 to 20.90 μM. Further, the binding affinities of glycopentalone to the targets were in the order: Telomere:G-quadruplex > VEGFR-2 > CDK-6 > CDK-2 > Topoisomerase II > Topoisomerase I > Bcl-2. Binding mode analysis revealed critical hydrogen bonds as well as hydrophobic interactions with the targets. The targets were validated by reverse pharmacophore mapping of glycopentalone against a set of 2241 known human target proteins, which revealed CDK-2 and VEGFR-2 as the most favorable targets. Glycopentalone mapped well to CDK-2 and VEGFR-2, which involve six pharmacophore features (two hydrophobic centers and four hydrogen bond acceptors) and nine pharmacophore features (five hydrophobic, two hydrogen bond acceptors and two hydrogen bond donors), respectively. The present computational approach may aid in the rational identification of targets for small molecules against a large set of candidate macromolecules before bioassay validation. PMID:27525951
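    The reported inhibition constants follow from the docking binding energies via the relation Ki = exp(ΔG/RT) commonly used by docking codes such as AutoDock. As a sanity check (assuming T = 298.15 K, which is not stated in the abstract), the strongest binding energy quoted above reproduces the reported sub-micromolar Ki:

```python
import math

# Back-calculate an inhibition constant from a docking binding energy,
# assuming the usual AutoDock-style relation Ki = exp(dG / RT) at 298.15 K.
# The -8.35 kcal/mol value is taken from the abstract.
R = 1.987e-3   # gas constant in kcal/(mol*K)
T = 298.15     # assumed temperature in K

def inhibition_constant(dG_kcal_per_mol):
    """Ki in mol/L from a binding free energy in kcal/mol."""
    return math.exp(dG_kcal_per_mol / (R * T))

ki = inhibition_constant(-8.35)
print(f"Ki = {ki * 1e6:.3f} uM")   # ~0.76 uM, close to the reported 0.758 uM
```

    The agreement suggests the abstract's energies and constants were paired through exactly this thermodynamic relation.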

  13. Higgs boson hunting

    SciTech Connect

    Dawson, S.; Haber, H.E.; Rindani, S.D.

    1989-05-01

    This is the summary report of the Higgs Boson Working Group. We discuss a variety of search techniques for a Higgs boson which is lighter than the Z. The processes K → πH, η′ → ηH, Υ → Hγ and e⁺e⁻ → ZH are examined with particular attention paid to theoretical uncertainties in the calculations. We also briefly examine new features of Higgs phenomenology in a model which contains Higgs triplets as well as the usual doublet of scalar fields. 33 refs., 6 figs., 1 tab.

  14. Anomalous gauge boson interactions

    SciTech Connect

    Aihara, H.; Barklow, T.; Baur, U. |

    1995-03-01

    We discuss the direct measurement of the trilinear vector boson couplings in present and future collider experiments. The major goals of such experiments will be the confirmation of the Standard Model (SM) predictions and the search for signals of new physics. We review our current theoretical understanding of anomalous trilinear gauge-boson self-interactions. If the energy scale of the new physics is approximately 1 TeV, these low energy anomalous couplings are expected to be no larger than O(10⁻²). Constraints from high precision measurements at LEP and low energy charged and neutral current processes are critically reviewed.

  15. A self organizing map approach to physiological data analysis for enhanced group performance.

    SciTech Connect

    Doser, Adele Beatrice; Merkle, Peter Benedict

    2004-10-01

    A Self Organizing Map (SOM) approach was used to analyze physiological data taken from a group of subjects participating in a cooperative video shooting game. The ultimate aim was to discover signatures of group cooperation, conflict, leadership, and performance. Such information could be fed back to participants in a meaningful way, and ultimately increase group performance in national security applications, where the consequences of a poor group decision can be devastating. Results demonstrated that a SOM can be a useful tool in revealing individual and group signatures from physiological data, and could ultimately be used to heighten group performance.

  16. Determination of the transverse momentum of W bosons in hadronic collisions via forward folding techniques

    NASA Astrophysics Data System (ADS)

    Cuth, Jakub; Merkotan, Kyrylo; Schott, Matthias; Webb, Samuel

    2016-04-01

    The measurement of the transverse momentum of W bosons in hadron collisions not only provides an important test of quantum chromodynamics (QCD) calculations, but is also a crucial input for the precision measurement of the W boson mass. While the measurement of the Z boson transverse momentum is experimentally well under control, the available unfolding techniques for W boson final states generically lead to relatively large uncertainties. In this paper, we present a new methodology to estimate the W boson transverse momentum spectrum, significantly improving on the systematic uncertainties of current approaches.

  17. A Novel Approach on Designing Augmented Fuzzy Cognitive Maps Using Fuzzified Decision Trees

    NASA Astrophysics Data System (ADS)

    Papageorgiou, Elpiniki I.

    This paper proposes a new methodology for designing fuzzy cognitive maps (FCMs) using crisp decision trees that have been fuzzified. A fuzzy cognitive map is a knowledge-based technique that works as an artificial cognitive network, inheriting the main aspects of cognitive maps and artificial neural networks. Decision trees, on the other hand, are well-known intelligent techniques that extract rules from both symbolic and numeric data. Fuzzy theoretical techniques are used to fuzzify crisp decision trees in order to soften the decision boundaries at the decision nodes inherent in this type of tree. Comparisons between crisp and fuzzified decision trees suggest that the latter are significantly more robust and produce more balanced decision making. The approach proposed in this paper can incorporate any type of fuzzy decision tree. Through this methodology, new linguistic weights are determined for the FCM model, producing an augmented FCM tool. The framework consists of a new fuzzy algorithm that generates, from induced fuzzy decision trees, the linguistic weights describing the cause-effect relationships among the concepts of the FCM model.

  18. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach

    PubMed Central

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Duguleana, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems. PMID:26167533

  19. An object-oriented approach to automated landform mapping: A case study of drumlins

    NASA Astrophysics Data System (ADS)

    Saha, Kakoli; Wells, Neil A.; Munro-Stasiuk, Mandy

    2011-09-01

    This paper details an automated object-oriented approach to mapping landforms from digital elevation models (DEMs), using the example of drumlins in the Chautauqua drumlin field in NW Pennsylvania and upstate New York. Object-oriented classification is highly desirable as it can identify specific shapes in datasets based on both the pixel values in a raster dataset and the contextual information between pixels and extracted objects. The methodology is built specifically for application to the USGS 30 m resolution DEM data, which are freely available to the public and of sufficient resolution to map medium-scale landforms. Using the raw DEM data, as well as derived aspect and slope, Definiens Developer (v.7) was used to perform multiresolution segmentation, followed by rule-based classification, in order to extract individual polygons representing drumlins. Drumlins obtained by automated extraction were visually and statistically compared to those identified via manual digitization. Detailed morphometric descriptive statistics such as means, ranges, and standard deviations were inspected and compared for length, width, elongation ratio, area, and perimeter. Although the manual and automated results were not always statistically identical, a more detailed comparison of just the drumlins identified by both procedures showed that the automated method easily matched the manual digitization. Differences between the two methods related to mapping compound drumlins, and smaller and larger drumlins. The automated method generally identified more features in these categories than the manual method and thus outperformed it.
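    The morphometric comparison step can be illustrated with a short sketch that computes the descriptive statistics named above (length, width, and the elongation ratio, defined as length/width) for two hypothetical sets of extracted drumlin polygons; all measurements below are invented:

```python
import numpy as np

# Hypothetical per-drumlin lengths and widths (metres) for a "manual" and an
# "automated" mapping; the elongation ratio is length divided by width.
def summarize(lengths, widths):
    lengths = np.asarray(lengths, dtype=float)
    widths = np.asarray(widths, dtype=float)
    elong = lengths / widths
    return {
        "length mean": lengths.mean(), "length std": lengths.std(),
        "width mean": widths.mean(), "width std": widths.std(),
        "elongation mean": elong.mean(),
    }

manual = summarize([820, 640, 910], [260, 240, 300])
auto = summarize([805, 655, 930, 410], [250, 250, 310, 180])
print(manual["elongation mean"], auto["elongation mean"])
```

    In the study such statistics were computed per method and compared, with the automated extraction tending to find additional small, large, and compound features.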

  20. An entropy-driven matrix completion (E-MC) approach to complex network mapping

    NASA Astrophysics Data System (ADS)

    Koochakzadeh, Ali; Pal, Piya

    2016-05-01

    Mapping the topology of a complex network in a resource-efficient manner is a challenging problem with applications in internet mapping, social network inference, and so forth. We propose a new entropy-driven algorithm, leveraging ideas from matrix completion, to map the network using monitors (or sensors) which, when placed on judiciously selected nodes, are capable of discovering their immediate neighbors. The main challenge is to maximize the portion of the network discovered using only a limited number of available monitors. To this end, (i) a new measure of entropy or uncertainty is associated with each node, in terms of the currently discovered edges incident on that node, and (ii) a greedy algorithm is developed to select a candidate node for monitor placement based on its entropy. Utilizing the fact that many complex networks of interest (such as social networks) have a low-rank adjacency matrix, a matrix completion algorithm, namely 1-bit matrix completion, is combined with the greedy algorithm to further boost its performance. The low-rank property of the network adjacency matrix can be used to extrapolate a portion of the missing edges and consequently update the node entropies, so as to efficiently guide the network discovery algorithm towards placing monitors on the nodes that can turn out to be more informative. Simulations performed on a variety of real-world networks such as social networks and peer networks demonstrate the superior performance of the matrix-completion-guided approach in discovering the network topology.
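    A toy version of the entropy-guided greedy selection (without the 1-bit matrix-completion refinement, and with a uniform edge prior, under which a node's entropy reduces to the count of its still-unknown incident pairs) might look like the sketch below; the graph, prior, and monitor budget are illustrative:

```python
import numpy as np

# Toy greedy monitor placement: a monitor on node v reveals the edge /
# no-edge status of every pair (v, u). Graph and parameters are invented.
rng = np.random.default_rng(1)
n, budget, p = 30, 5, 0.15
A = (rng.random((n, n)) < p).astype(int)
A = np.triu(A, 1)
A = A + A.T                                    # random undirected graph

known = np.zeros((n, n), dtype=bool)           # which pair-statuses are known
np.fill_diagonal(known, True)

def h(q):
    """Binary entropy in bits of an edge with prior probability q."""
    return 0.0 if q in (0.0, 1.0) else -q * np.log2(q) - (1 - q) * np.log2(1 - q)

monitors = []
for _ in range(budget):
    # node entropy = summed uncertainty over its still-unknown incident pairs
    ent = np.array([h(p) * (~known[v]).sum() for v in range(n)])
    ent[monitors] = -1.0                       # never re-pick a monitor
    v = int(ent.argmax())
    monitors.append(v)
    known[v, :] = known[:, v] = True           # monitor reveals all pairs at v

discovered = A[known].sum() / A.sum()          # fraction of edges discovered
print(monitors, f"{discovered:.0%} of edges discovered")
```

    The paper's full algorithm additionally updates per-edge probabilities using the low-rank completion step, so the entropies differentiate nodes beyond a simple unknown-pair count.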

  1. Large Deformation Multiresolution Diffeomorphic Metric Mapping for Multiresolution Cortical Surfaces: A Coarse-to-Fine Approach.

    PubMed

    Tan, Mingzhen; Qiu, Anqi

    2016-09-01

    Brain surface registration is an important tool for characterizing cortical anatomical variations and understanding their roles in normal cortical development and psychiatric diseases. However, surface registration remains challenging due to complicated cortical anatomy and its large differences across individuals. In this paper, we propose a fast coarse-to-fine algorithm for surface registration by adapting the large deformation diffeomorphic metric mapping (LDDMM) framework for surface mapping and show improvements in speed and accuracy via a multiresolution analysis of surface meshes and the construction of multiresolution diffeomorphic transformations. The proposed method constructs a family of multiresolution meshes that are used as natural sparse priors of the cortical morphology. At varying resolutions, these meshes act as anchor points where the parameterization of multiresolution deformation vector fields can be supported, allowing the construction of a bundle of multiresolution deformation fields, each originating from a different resolution. Using a coarse-to-fine approach, we show a potential reduction in computation cost along with improvements in sulcal alignment when compared with LDDMM surface mapping. PMID:27254865

  2. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach.

    PubMed

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Duguleana, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems. PMID:26167533

  3. An Integrated Spin-Labeling/Computational-Modeling Approach for Mapping Global Structures of Nucleic Acids

    PubMed Central

    Tangprasertchai, Narin S.; Zhang, Xiaojun; Ding, Yuan; Tham, Kenneth; Rohs, Remo; Haworth, Ian S.; Qin, Peter Z.

    2015-01-01

    The technique of site-directed spin labeling (SDSL) provides unique information on biomolecules by monitoring the behavior of a stable radical tag (i.e., spin label) using electron paramagnetic resonance (EPR) spectroscopy. In this chapter, we describe an approach in which SDSL is integrated with computational modeling to map conformations of nucleic acids. This approach builds upon a SDSL tool kit previously developed and validated, which includes three components: (i) a nucleotide-independent nitroxide probe, designated as R5, which can be efficiently attached at defined sites within arbitrary nucleic acid sequences; (ii) inter-R5 distances in the nanometer range, measured via pulsed EPR; and (iii) an efficient program, called NASNOX, that computes inter-R5 distances on given nucleic acid structures. Following a general framework of data mining, our approach uses multiple sets of measured inter-R5 distances to retrieve “correct” all-atom models from a large ensemble of models. The pool of models can be generated independently without relying on the inter-R5 distances, thus allowing a large degree of flexibility in integrating the SDSL-measured distances with a modeling approach best suited for the specific system under investigation. As such, the integrative experimental/computational approach described here represents a hybrid method for determining all-atom models based on experimentally-derived distance measurements. PMID:26477260

  4. Flight investigation of helicopter IFR approaches to oil rigs using airborne weather and mapping radar

    NASA Technical Reports Server (NTRS)

    Bull, J. S.; Hegarty, D. M.; Phillips, J. D.; Sturgeon, W. R.; Hunting, A. W.; Pate, D. P.

    1979-01-01

    Airborne weather and mapping radar is a near-term, economical method of providing 'self-contained' navigation information for approaches to offshore oil rigs, and its use has been rapidly expanding in recent years. A joint NASA/FAA flight test investigation of helicopter IFR approaches to offshore oil rigs in the Gulf of Mexico was initiated in June 1978 and conducted under contract to Air Logistics. Approximately 120 approaches were flown in a Bell 212 helicopter by 15 operational pilots during the months of August and September 1978. The purpose of the tests was to collect data to (1) support development of advanced radar flight director concepts by NASA and (2) aid the establishment of Terminal Instrument Procedures (TERPS) criteria by the FAA. The flight test objectives were to develop airborne radar approach procedures, measure tracking errors, determine acceptable weather minimums, and determine pilot acceptability. Data obtained will contribute significantly to improved helicopter airborne radar approach capability and to the support of exploration, development, and utilization of the Nation's offshore oil supplies.

  5. An Integrated Spin-Labeling/Computational-Modeling Approach for Mapping Global Structures of Nucleic Acids.

    PubMed

    Tangprasertchai, Narin S; Zhang, Xiaojun; Ding, Yuan; Tham, Kenneth; Rohs, Remo; Haworth, Ian S; Qin, Peter Z

    2015-01-01

    The technique of site-directed spin labeling (SDSL) provides unique information on biomolecules by monitoring the behavior of a stable radical tag (i.e., spin label) using electron paramagnetic resonance (EPR) spectroscopy. In this chapter, we describe an approach in which SDSL is integrated with computational modeling to map conformations of nucleic acids. This approach builds upon a SDSL tool kit previously developed and validated, which includes three components: (i) a nucleotide-independent nitroxide probe, designated as R5, which can be efficiently attached at defined sites within arbitrary nucleic acid sequences; (ii) inter-R5 distances in the nanometer range, measured via pulsed EPR; and (iii) an efficient program, called NASNOX, that computes inter-R5 distances on given nucleic acid structures. Following a general framework of data mining, our approach uses multiple sets of measured inter-R5 distances to retrieve "correct" all-atom models from a large ensemble of models. The pool of models can be generated independently without relying on the inter-R5 distances, thus allowing a large degree of flexibility in integrating the SDSL-measured distances with a modeling approach best suited for the specific system under investigation. As such, the integrative experimental/computational approach described here represents a hybrid method for determining all-atom models based on experimentally-derived distance measurements. PMID:26477260

  6. Dissection of a Complex Disease Susceptibility Region Using a Bayesian Stochastic Search Approach to Fine Mapping

    PubMed Central

    Wallace, Chris; Cutler, Antony J; Pontikos, Nikolas; Pekalski, Marcin L; Burren, Oliver S; Cooper, Jason D; García, Arcadio Rubio; Ferreira, Ricardo C; Guo, Hui; Walker, Neil M; Smyth, Deborah J; Rich, Stephen S; Onengut-Gumuscu, Suna; Sawcer, Stephen J; Ban, Maria

    2015-01-01

    Identification of candidate causal variants in regions associated with risk of common diseases is complicated by linkage disequilibrium (LD) and multiple association signals. Nonetheless, accurate maps of these variants are needed, both to fully exploit detailed cell-specific chromatin annotation data to highlight disease causal mechanisms and cells, and for design of the functional studies that will ultimately be required to confirm causal mechanisms. We adapted a Bayesian evolutionary stochastic search algorithm to the fine mapping problem, and demonstrated its improved performance over conventional stepwise and regularised regression through simulation studies. We then applied it to fine map the established multiple sclerosis (MS) and type 1 diabetes (T1D) associations in the IL-2RA (CD25) gene region. For T1D, both stepwise and stochastic search approaches identified four T1D association signals, with the major effect tagged by the single nucleotide polymorphism, rs12722496. In contrast, for MS, the stochastic search found two distinct competing models: a single candidate causal variant, tagged by rs2104286 and reported previously using stepwise analysis; and a more complex model with two association signals, one of which was tagged by the major T1D associated rs12722496 and the other by rs56382813. There is low to moderate LD between rs2104286 and both rs12722496 and rs56382813 (r² ≃ 0.3) and our two-SNP model could not be recovered through a forward stepwise search after conditioning on rs2104286. Both signals in the two-variant model for MS affect CD25 expression on distinct subpopulations of CD4+ T cells, which are key cells in the autoimmune process. The results support a shared causal variant for T1D and MS. Our study illustrates the benefit of using a purposely designed model search strategy for fine mapping and the advantage of combining disease and protein expression data. PMID:26106896
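    The advantage of scoring many candidate models rather than following one stepwise path can be illustrated with a toy model search: enumerate all one- and two-SNP models, weight each by an approximate (BIC-based) posterior, and read off per-SNP inclusion probabilities. The simulated data, effect size, and scoring rule below are invented stand-ins for the paper's Bayesian stochastic search, which explores far larger model spaces:

```python
import itertools
import numpy as np

# Simulated fine-mapping toy: 500 individuals, 6 SNPs, one causal SNP (index 2).
rng = np.random.default_rng(7)
n, n_snps = 500, 6
G = rng.binomial(2, 0.3, size=(n, n_snps)).astype(float)  # genotypes 0/1/2
y = 0.8 * G[:, 2] + rng.normal(size=n)                    # quantitative trait

def bic(X, y):
    """BIC of an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    rss = ((y - X1 @ beta) ** 2).sum()
    return len(y) * np.log(rss / len(y)) + X1.shape[1] * np.log(len(y))

# Score every one- and two-SNP model instead of one stepwise path
models = [m for r in (1, 2) for m in itertools.combinations(range(n_snps), r)]
scores = np.array([bic(G[:, list(m)], y) for m in models])
w = np.exp(-(scores - scores.min()) / 2)
w /= w.sum()                                   # approximate model posteriors

best = models[int(w.argmax())]
# per-SNP posterior inclusion probability: total weight of models containing it
pip = [float(sum(wi for wi, m in zip(w, models) if s in m)) for s in range(n_snps)]
print(best, [round(p, 2) for p in pip])
```

    Averaging over models in this way is what lets a stochastic search surface competing explanations (such as the paper's one-variant versus two-variant MS models) that a single forward path conditioned on its first pick cannot recover.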

  7. Dissection of a Complex Disease Susceptibility Region Using a Bayesian Stochastic Search Approach to Fine Mapping.

    PubMed

    Wallace, Chris; Cutler, Antony J; Pontikos, Nikolas; Pekalski, Marcin L; Burren, Oliver S; Cooper, Jason D; García, Arcadio Rubio; Ferreira, Ricardo C; Guo, Hui; Walker, Neil M; Smyth, Deborah J; Rich, Stephen S; Onengut-Gumuscu, Suna; Sawcer, Stephen J; Ban, Maria; Richardson, Sylvia; Todd, John A; Wicker, Linda S

    2015-06-01

    Identification of candidate causal variants in regions associated with risk of common diseases is complicated by linkage disequilibrium (LD) and multiple association signals. Nonetheless, accurate maps of these variants are needed, both to fully exploit detailed cell-specific chromatin annotation data to highlight disease causal mechanisms and cells, and for design of the functional studies that will ultimately be required to confirm causal mechanisms. We adapted a Bayesian evolutionary stochastic search algorithm to the fine mapping problem, and demonstrated its improved performance over conventional stepwise and regularised regression through simulation studies. We then applied it to fine map the established multiple sclerosis (MS) and type 1 diabetes (T1D) associations in the IL-2RA (CD25) gene region. For T1D, both stepwise and stochastic search approaches identified four T1D association signals, with the major effect tagged by the single nucleotide polymorphism, rs12722496. In contrast, for MS, the stochastic search found two distinct competing models: a single candidate causal variant, tagged by rs2104286 and reported previously using stepwise analysis; and a more complex model with two association signals, one of which was tagged by the major T1D associated rs12722496 and the other by rs56382813. There is low to moderate LD between rs2104286 and both rs12722496 and rs56382813 (r² ≃ 0.3) and our two-SNP model could not be recovered through a forward stepwise search after conditioning on rs2104286. Both signals in the two-variant model for MS affect CD25 expression on distinct subpopulations of CD4+ T cells, which are key cells in the autoimmune process. The results support a shared causal variant for T1D and MS. Our study illustrates the benefit of using a purposely designed model search strategy for fine mapping and the advantage of combining disease and protein expression data. PMID:26106896

  8. Comparing Point Count System and physically-based approaches to map aquifer vulnerability

    NASA Astrophysics Data System (ADS)

    Lagomarsino, D.; Martina, M. L. V.; Todini, E.

    2009-04-01

    Pollution vulnerability maps of aquifers are an important instrument for land and water management. These maps are generally based on simplified Point Count System Models (PCSMs), such as DRASTIC or SINTACS, without the use of physically based groundwater models, which may provide more accurate results. The present research aims at finding a trade-off between the accuracy provided by a physically based model, which inevitably involves higher computational complexity and data requirements, and the coarser, albeit simpler and easier to implement, approach provided by an indicator-based model such as one of the most important PCSMs, the DRASTIC model (Aller et al., 1987). The alluvial aquifer of the conoid of the Reno River, extending from the pedemountain hills of the Apennine chain towards the Po plain, is one of the main sources of drinking water for the city of Bologna. The parameters considered by DRASTIC (Depth of water, net Recharge, Aquifer media, Soil media, Topography, Impact of vadose zone and hydraulic Conductivity) represent the main hydrogeological and environmental parameters that influence pollution transport from the surface towards the groundwater. The real flow of the Reno aquifer was then simulated by means of a finite element model (FEFLOW) that takes into account the physical processes of water movement and the associated transport of contaminants in the environment. The results obtained by the model were compared with the DRASTIC vulnerability map. A preliminary analysis of the vulnerability map, based on chemical analyses of water, reveals that the concentration of nitrates is indeed higher in those zones where higher vulnerability values were found.
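    The DRASTIC index itself is simply a weighted sum of the seven rated parameters, index = Σᵢ wᵢ·rᵢ. A minimal sketch using the standard DRASTIC weights from Aller et al. (1987), with illustrative per-cell ratings:

```python
# Standard DRASTIC weights (Aller et al., 1987); each parameter of a grid
# cell is rated (usually 1-10) and the index is the weighted sum of ratings.
WEIGHTS = {
    "Depth to water": 5,
    "net Recharge": 4,
    "Aquifer media": 3,
    "Soil media": 2,
    "Topography": 1,
    "Impact of vadose zone": 5,
    "hydraulic Conductivity": 3,
}

def drastic_index(ratings):
    """Vulnerability index = sum of weight * rating over the 7 parameters."""
    return sum(WEIGHTS[k] * r for k, r in ratings.items())

cell = {k: 5 for k in WEIGHTS}        # hypothetical uniform ratings of 5
print(drastic_index(cell))            # (5+4+3+2+1+5+3) * 5 = 115
```

    Mapping the index over every grid cell produces the vulnerability map that the study compares against the FEFLOW transport simulation.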

  9. Teaching Map Skills: An Inductive Approach. Topics in Geography, No. 8.

    ERIC Educational Resources Information Center

    Anderson, Jeremy

    This document contains lessons designed for teaching geography and map skills to intermediate level students, although many may be adapted to other grade levels. The lessons place heavy reliance on children making a series of individual and collective maps. The 16 lessons included are: (1) mental maps; (2) conventional maps; (3) map elements; (4)…

  10. The field line map approach for simulations of magnetically confined plasmas

    NASA Astrophysics Data System (ADS)

    Stegmeir, Andreas; Coster, David; Maj, Omar; Hallatschek, Klaus; Lackner, Karl

    2016-01-01

    Prediction of plasma parameters in the edge and scrape-off layer of tokamaks is difficult since most modern tokamaks have a divertor, and the associated separatrix causes the usually employed field/flux-aligned coordinates to become singular on the separatrix/X-point. The presented field line map approach avoids such problems as it is based on a cylindrical grid: standard finite-difference methods can be used for the discretisation of perpendicular (w.r.t. the magnetic field) operators, and the characteristic flute mode property (k∥ ≪ k⊥) of structures is exploited computationally via a field line following discretisation of parallel operators, which leads to grid sparsification in the toroidal direction. This paper is devoted to the discretisation of the parallel diffusion operator (the approach taken is very similar to the flux-coordinate independent (FCI) approach, which has already been applied to a hyperbolic problem (Ottaviani, 2011; Hariri, 2013)). Based on the support operator method, schemes are derived which maintain the self-adjointness property of the parallel diffusion operator on the discrete level. These methods have very low numerical perpendicular diffusion compared to a naive discretisation, which is a critical issue since magnetically confined plasmas exhibit a very strong anisotropy. Two different versions of the discrete parallel diffusion operator are derived: the first is based on interpolation, where the order of interpolation and therefore the numerical diffusion is adjustable; the second is based on integration and is advantageous in cases where the field line map is strongly distorted. The schemes are implemented in the new code GRILLIX, and extensive benchmarks and numerous examples are presented which show the validity of the approach in general and of GRILLIX in particular.

  11. Non-invasive computation of aortic pressure maps: a phantom-based study of two approaches

    NASA Astrophysics Data System (ADS)

    Delles, Michael; Schalck, Sebastian; Chassein, Yves; Müller, Tobias; Rengier, Fabian; Speidel, Stefanie; von Tengg-Kobligk, Hendrik; Kauczor, Hans-Ulrich; Dillmann, Rüdiger; Unterhinninghofen, Roland

    2014-03-01

    Patient-specific blood pressure values in the human aorta are an important parameter in the management of cardiovascular diseases. A direct measurement of these values is only possible by invasive catheterization at a limited number of measurement sites. To overcome these drawbacks, two non-invasive approaches to computing patient-specific relative aortic blood pressure maps throughout the entire aortic vessel volume are investigated by our group. The first approach uses computations from complete time-resolved, three-dimensional flow velocity fields acquired by phase-contrast magnetic resonance imaging (PC-MRI), whereas the second approach relies on computational fluid dynamics (CFD) simulations with ultrasound-based boundary conditions. A detailed evaluation of these computational methods under realistic conditions is necessary in order to investigate their overall robustness and accuracy as well as their sensitivity to certain algorithmic parameters. We present a comparative study of the two blood pressure computation methods in an experimental phantom setup, which mimics a simplified thoracic aorta. The comparative analysis includes the investigation of the impact of algorithmic parameters on the MRI-based blood pressure computation and the impact of extracting pressure maps in a voxel grid from the CFD simulations. Overall, a very good agreement between the results of the two computational approaches can be observed despite the fact that both methods used completely separate measurements as input data. Therefore, the comparative study of the presented work indicates that both non-invasive pressure computation methods show an excellent robustness and accuracy and can therefore be used for research purposes in the management of cardiovascular diseases.

  12. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments in order to understand correctly the strength and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic understanding of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. PMID:25890086

  13. HAGR-D: A Novel Approach for Gesture Recognition with Depth Maps

    PubMed Central

    Santos, Diego G.; Fernandes, Bruno J. T.; Bezerra, Byron L. D.

    2015-01-01

    The hand is an important part of the body used to express information through gestures, and its movements can be used in dynamic gesture recognition systems based on computer vision with practical applications, such as medical, games and sign language. Although depth sensors have led to great progress in gesture recognition, hand gesture recognition still is an open problem because of its complexity, which is due to the large number of small articulations in a hand. This paper proposes a novel approach for hand gesture recognition with depth maps generated by the Microsoft Kinect Sensor (Microsoft, Redmond, WA, USA) using a variation of the CIPBR (convex invariant position based on RANSAC) algorithm and a hybrid classifier composed of dynamic time warping (DTW) and Hidden Markov models (HMM), called the hybrid approach for gesture recognition with depth maps (HAGR-D). The experiments show that the proposed model overcomes other algorithms presented in the literature in hand gesture recognition tasks, achieving a classification rate of 97.49% in the MSRGesture3D dataset and 98.43% in the RPPDI dynamic gesture dataset. PMID:26569262
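    The DTW half of the hybrid classifier described above can be illustrated with a minimal dynamic-time-warping distance; this is a generic textbook sketch, not the paper's implementation, and the scalar sequences stand in for the CIPBR feature vectors actually used by HAGR-D.

    ```python
    import numpy as np

    def dtw_distance(a, b):
        """Classic dynamic-time-warping distance between two 1-D sequences.

        Illustrates the DTW component of a DTW+HMM hybrid classifier:
        sequences of different lengths are aligned by the cheapest warping
        path through a cumulative-cost matrix.
        """
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                     cost[i, j - 1],      # deletion
                                     cost[i - 1, j - 1])  # match
        return cost[n, m]
    ```

    A gesture is then assigned to the template class with the smallest DTW distance, with the HMM stage refining the decision over the temporal sequence.
    
    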

  14. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    NASA Astrophysics Data System (ADS)

    Dhindsa, Harkirat S.; Makarimi-Kasim; Roger Anderson, O.

    2011-04-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent to which a constructivist learning environment (CLE) was created in their classes. The sample of the study consisted of six classes (140 Form 3 students aged 13-15 years) selected from a typical coeducational school in Brunei. Three classes (40 boys and 30 girls) were taught using the TTA while three other classes (41 boys and 29 girls) used the CMA, enriched with PowerPoint presentations. After the interventions (lessons on magnetism), the students in both groups were asked to describe in writing their understanding of magnetism accrued from the lessons. Their written descriptions were analyzed using flow map analyses to assess their content knowledge and its organisation in memory as evidence of cognitive structure. The extent of CLE was measured using a published CLE survey. The results showed that the cognitive structures of the CMA students were more extensive, thematically organised and richer in interconnectedness of thoughts than those of TTA students. Moreover, CMA students also perceived their classroom learning environment to be more constructivist than did their TTA counterparts. It is, therefore, recommended that teachers consider using the CMA teaching technique to help students enrich their understanding, especially for more complex or abstract scientific content.

  15. Multi-scale hierarchical approach for parametric mapping: assessment on multi-compartmental models.

    PubMed

    Rizzo, G; Turkheimer, F E; Bertoldo, A

    2013-02-15

    This paper investigates a new hierarchical method to apply basis functions to mono- and multi-compartmental models (Hierarchical-Basis Function Method, H-BFM) at the voxel level. This method identifies the parameters of the compartmental model in its nonlinearized version, integrating information derived at the region of interest (ROI) level by segmenting the cerebral volume based on anatomical definition or functional clustering. We present the results obtained by using a two-tissue, four-rate-constant model with two different tracers ([(11)C]FLB457 and [carbonyl-(11)C]WAY100635), one of the most complex models used in receptor studies, especially at the voxel level. H-BFM is robust and its application to both [(11)C]FLB457 and [carbonyl-(11)C]WAY100635 allows accurate and precise parameter estimates, good-quality parametric maps and a low percentage of voxels out of physiological bounds (<8%). The computational time depends on the number of basis functions selected and can be compatible with clinical use (~6 h for a single-subject analysis). The novel method is a robust approach for PET quantification using compartmental modeling at the voxel level. In particular, unlike other proposed approaches, this method can also be used when the linearization of the model is not appropriate. We expect that applying it to clinical data will generate reliable parametric maps. PMID:23220428

  17. Tectonic lineament mapping of the Thaumasia Plateau, Mars: Comparing results from photointerpretation and a semi-automatic approach

    NASA Astrophysics Data System (ADS)

    Vaz, David A.; Di Achille, Gaetano; Barata, Maria Teresa; Alves, Eduardo Ivo

    2012-11-01

    Photointerpretation is the technique generally used to map and analyze tectonic features on the Martian surface. In this study, we qualitatively and quantitatively compare two tectonic maps based on the interpretation of satellite imagery with a map derived semi-automatically. The comparison of the two photointerpreted datasets allowed us to infer some of the factors that can influence the process of lineament mapping on Mars. Comparing the manually mapped datasets with the semi-automatically mapped features allowed us to evaluate the accuracy of the semi-automatic mapping procedure, as well as to identify the main limitations of the semi-automatic approach to mapping tectonic structures from MOLA altimetry. Significant differences were found between the two photointerpretations. The qualitative and quantitative comparisons showed how mapping criteria, illumination conditions and scale of analysis can locally influence the interpretations. The semi-automatic mapping procedure proved to be mainly dependent on data quality; nevertheless, the methodology, when applied to MOLA data, is able to produce meaningful results at a regional scale.

  18. Mapping CO2 emission in highly urbanized region using standardized microbial respiration approach

    NASA Astrophysics Data System (ADS)

    Vasenev, V. I.; Stoorvogel, J. J.; Ananyeva, N. D.

    2012-12-01

    Urbanization is a major recent land-use change pathway. Land conversion to urban use has a tremendous and still unclear effect on soil cover and functions. Urban soil can act as a carbon sink, although its potential for CO2 emission is also very high. The main challenge in analyzing and mapping soil organic carbon (SOC) in urban environments is its high spatial heterogeneity and temporal dynamics. The urban environment provides a number of specific features and processes that influence soil formation and functioning and results in a unique spatial variability of carbon stocks and fluxes over short distances. Soil sealing, functional zoning, and settlement age and size are the predominant factors driving the heterogeneity of urban soil carbon. The combination of these factors creates a large number of contrasting clusters with abrupt borders, which are very difficult to account for in regional assessment and mapping of SOC stocks and soil CO2 emission. Most of the existing approaches to measuring CO2 emission in field conditions (eddy covariance, soil chambers) are very sensitive to soil moisture and temperature conditions. They require long-term sampling throughout the season in order to obtain relevant results, which makes them inapplicable for analyzing the spatial variability of CO2 emission at the regional scale. Measuring soil respiration (SR) under standardized laboratory conditions makes it possible to overcome this difficulty. SR is the predominant outgoing carbon flux, comprising autotrophic respiration of plant roots and heterotrophic respiration of soil microorganisms. Microbiota are responsible for 50-80% of total soil carbon outflow. The microbial respiration (MR) approach provides integral CO2 emission results, characterizing microbial CO2 production under optimal conditions and thus independent of initial differences in soil temperature and moisture. The current study aimed to combine digital soil mapping (DSM) techniques with the standardized microbial respiration approach in order to analyse and

  19. Resummation of Goldstone boson contributions to the MSSM effective potential

    NASA Astrophysics Data System (ADS)

    Kumar, Nilanjana; Martin, Stephen P.

    2016-07-01

    We discuss the resummation of the Goldstone boson contributions to the effective potential of the minimal supersymmetric Standard Model. This eliminates the formal problems of spurious imaginary parts and logarithmic singularities in the minimization conditions when the tree-level Goldstone boson squared masses are negative or approach zero. The numerical impact of the resummation is shown to be almost always very small. We also show how to write the two-loop minimization conditions so that Goldstone boson squared masses do not appear at all, and so that they can be solved without iteration.

  20. An integrated approach for updating cadastral maps in Pakistan using satellite remote sensing data

    NASA Astrophysics Data System (ADS)

    Ali, Zahir; Tuladhar, Arbind; Zevenbergen, Jaap

    2012-08-01

    Updating cadastral information is crucial for recording land ownership and property division changes in a timely manner. In most cases, the existing cadastral maps do not provide up-to-date information on land parcel boundaries. Such a situation demands that all the cadastral data and parcel-boundary information in these maps be updated in a timely fashion. The existing techniques for acquiring cadastral information are discipline-oriented, drawing on different disciplines such as geodesy, surveying, and photogrammetry. All these techniques require substantial manpower, time, and cost when they are carried out separately. There is a need to integrate these techniques for acquiring cadastral information to update the existing cadastral data and (re)produce cadastral maps in an efficient manner. To reduce the time and cost involved in cadastral data acquisition, this study develops an integrated approach combining global positioning system (GPS) data, remote sensing (RS) imagery, and existing cadastral maps. For this purpose, the panchromatic image with 0.6 m spatial resolution and the corresponding multi-spectral image with 2.4 m spatial resolution and 3 spectral bands from the QuickBird satellite were used. A digital elevation model (DEM) was extracted from SPOT-5 stereopairs and some ground control points (GCPs) were also used for ortho-rectifying the QuickBird images. After ortho-rectifying these images and registering the multi-spectral image to the panchromatic image, fusion between them was attained to get good-quality multi-spectral images of the two study areas with 0.6 m spatial resolution. Cadastral parcel boundaries were then identified on QuickBird images of the two study areas via visual interpretation using a participatory-GIS (PGIS) technique. The regions of study are the urban and rural areas of Peshawar and Swabi districts in the Khyber Pakhtunkhwa province of Pakistan. The results are the creation of updated cadastral maps with a

  1. Accurate multi-source forest species mapping using the multiple spectral-spatial classification approach

    NASA Astrophysics Data System (ADS)

    Stavrakoudis, Dimitris; Gitas, Ioannis; Karydas, Christos; Kolokoussis, Polychronis; Karathanassi, Vassilia

    2015-10-01

    This paper proposes an efficient methodology for combining multiple remotely sensed images, in order to increase the classification accuracy in complex forest species mapping tasks. The proposed scheme follows a decision fusion approach, whereby each image is first classified separately by means of a pixel-wise Fuzzy-Output Support Vector Machine (FO-SVM) classifier. Subsequently, the multiple results are fused according to the so-called multiple spectral-spatial classifier using the minimum spanning forest (MSSC-MSF) approach, which constitutes an effective post-regularization procedure for enhancing the result of a single pixel-based classification. For this purpose, the original MSSC-MSF has been extended in order to handle multiple classifications. In particular, the fuzzy outputs of the pixel-based classifiers are stacked and used to grow the MSF, whereas the markers are also determined considering both classifications. The proposed methodology has been tested on a challenging forest species mapping task in northern Greece, considering a multispectral (GeoEye) and a hyperspectral (CASI) image. The pixel-wise classifications resulted in overall accuracies (OA) of 68.71% for the GeoEye and 77.95% for the CASI images, respectively; both are characterized by high levels of speckle noise. Applying the proposed multi-source MSSC-MSF fusion, the OA climbs to 90.86%, which is attributed both to the ability of MSSC-MSF to tackle the salt-and-pepper effect and to the fact that the fusion approach exploits the relative advantages of both information sources.
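    The stacking step of a fuzzy-output decision fusion can be sketched as follows. This is a simplified illustration only: it averages the per-source class-membership maps and takes a per-pixel argmax, whereas the paper grows a minimum spanning forest from the stacked outputs rather than averaging them. The function name and shapes are assumptions for the sketch.

    ```python
    import numpy as np

    def fuse_fuzzy_outputs(prob_maps):
        """Fuse per-image fuzzy classifier outputs by stacking and averaging.

        prob_maps: list of arrays of shape (H, W, n_classes), one per source
        image (e.g. one multispectral, one hyperspectral classification).
        Returns a crisp label map and the consensus fuzzy membership map.
        """
        stacked = np.stack(prob_maps, axis=0)    # (n_sources, H, W, C)
        mean_membership = stacked.mean(axis=0)   # consensus fuzzy map
        labels = mean_membership.argmax(axis=-1) # crisp decision per pixel
        return labels, mean_membership
    ```

    In the actual MSSC-MSF, the stacked memberships would instead drive marker selection and the minimum-spanning-forest region growing, which is what suppresses the salt-and-pepper noise.
    
    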

  2. A stochastic approach to estimate the uncertainty of dose mapping caused by uncertainties in b-spline registration

    SciTech Connect

    Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P.

    2012-04-15

    Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging is becoming clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution which is to be mapped. Its performance is demonstrated in the context of dose mapping based on b-spline registration. It is based on evaluating the sensitivity of the dose mapping to variations of the b-spline coefficients, combined with evaluating the sensitivity of the registration metric with respect to the variations of the coefficients. It was evaluated on patient data deformed according to a breathing model, where the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image, where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in the context of other mapping tasks as well.
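    The core intuition above, that a given registration uncertainty matters most where the dose gradient is steep, can be sketched as a first-order error estimate. This is not the paper's method (which perturbs the b-spline coefficients and the registration metric); it is a simplified illustration, and the function name and arguments are assumptions.

    ```python
    import numpy as np

    def dose_mapping_uncertainty(dose, geom_sigma, spacing=1.0):
        """First-order estimate of dose-accumulation uncertainty.

        Approximates the dose error caused by a geometric registration
        uncertainty of geom_sigma (same units as spacing) as
        |grad D| * sigma on a 3-D dose grid: flat dose regions tolerate
        mapping errors, steep-gradient regions do not.
        """
        gz, gy, gx = np.gradient(dose, spacing)
        grad_mag = np.sqrt(gz**2 + gy**2 + gx**2)
        return grad_mag * geom_sigma
    ```

    Thresholding the returned map would flag the voxels where accumulated dose should be treated with caution.
    
    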

  3. The Contribution of GIS in Flood Mapping: Two Approaches Using Open Source Grass GIS Software

    NASA Astrophysics Data System (ADS)

    Marzocchi, R.; Federici, B.; Cannata, M.; Cosso, T.; Syriou, A.

    2013-01-01

    The first step of a risk assessment analysis is the evaluation of flood-prone areas. This is important both for managing and planning emergency activities, such as hydraulic risk reduction management, and for town planning. Nowadays, using GIS technology for risk assessment analysis is very common. However, it is not widely used for defining inundated areas. LiDAR-derived data, such as Digital Elevation Models (DEMs), make GIS numerical models attractive for delineating flooded areas automatically. Using GIS tools is beneficial for effective processing and accuracy assessment in comparison to the traditional methods, which are based on topographic maps and field surveys. A first approach (Federici and Sguerso, 2007; Marzocchi et al., 2009) is the use of a GIS module to create perifluvial flood maps, having as prerequisites (i) the conformation of the river floodplain from a high-resolution DEM and (ii) a water surface profile along the river axis calculated for a given water discharge through a generic one-dimensional (1D) hydraulic model (HEC-RAS, Basement, MIKE 11, etc.). A second approach is the use of a GIS-embedded 2D model to map flooded areas due to a dam break (Cannata & Marzocchi, 2012). This module solves the conservative form of the 2D Shallow Water Equations (SWE) using a Finite Volume Method (FVM). The intercell flux is computed by a one-sided upwind conservative scheme extended to the 2D problem (Ying et al., 2004). The newly developed GIS module gives as output maximum-intensity maps which can be directly used during the risk assessment process. Both models have been implemented in the GRASS GIS software (GRASS, 2013) as two new commands (r.inund.fluv and r.damflood). They are available on the official GRASS website and are distributed under the terms of the GNU General Public License (GPL). In this work we present a comparison between the two models mentioned above. We analyse the

  4. A Novel Chemical Biology Approach for Mapping of Polymyxin Lipopeptide Antibody Binding Epitopes.

    PubMed

    Velkov, Tony; Yun, Bo; Schneider, Elena K; Azad, Mohammad A K; Dolezal, Olan; Morris, Faye C; Nation, Roger L; Wang, Jiping; Chen, Ke; Yu, Heidi H; Wang, Lv; Thompson, Philip E; Roberts, Kade D; Li, Jian

    2016-05-13

    Polymyxins B and E (i.e., colistin) are a family of naturally occurring lipopeptide antibiotics that are our last line of defense against multidrug-resistant (MDR) Gram-negative pathogens. Unfortunately, nephrotoxicity is a dose-limiting factor for polymyxins that limits their clinical utility. Our recent studies demonstrate that polymyxin-induced nephrotoxicity is a result of their extensive accumulation in renal tubular cells. The design and development of safer, novel polymyxin lipopeptides is hampered by our limited understanding of their complex structure-nephrotoxicity relationships. This is the first study to employ a novel targeted chemical biology approach to map the polymyxin recognition epitope of a commercially available polymyxin mAb and demonstrate its utility for mapping the kidney distribution of a novel, less nephrotoxic polymyxin lipopeptide. Eighteen novel polymyxin lipopeptide analogues were synthesized with modifications in the polymyxin core domains, namely, the N-terminal fatty acyl region, tripeptide linear segment, and cyclic heptapeptide. Surface plasmon resonance epitope mapping revealed that the monoclonal antibody (mAb) recognition epitope consisted of the hydrophobic domain (N-terminal fatty acyl and position 6/7) and diaminobutyric acid (Dab) residues at positions 3, 5, 8, and 9 of the polymyxin molecule. Structural diversity within the hydrophobic domains and the Dab 3 position is tolerated. Equipped with an understanding of the structure-binding relationships between the polymyxin mAb and the core polymyxin scaffold, we can now rationally employ the mAb to probe the kidney distribution of novel polymyxin lipopeptides. This information will be vital in the design of novel, safer polymyxins through chemical tailoring of the core scaffold and exploration of the elusive, complex polymyxin structure-nephrotoxicity relationships. PMID:27627202

  5. Mapping a near surface variable geologic regime using an integrated geophysical approach

    SciTech Connect

    Rogers, N.T.; Sandberg, S.K.; Miller, P.; Powell, G.

    1997-10-01

    An integrated geophysical approach involving seismic, electromagnetic, and electrical methods was employed to map fluvial, colluvial and bedrock geology, to delineate bedrock channels, and to determine fracture and joint orientations that may influence migration of petroleum hydrocarbons at the Glenrock Oil Seep. Both P (primary)-wave and S (shear)-wave seismic refraction techniques were used to map the bedrock surface topography, bedrock minima, stratigraphic boundaries, and possible structure. S-wave data were preferred because of better vertical resolution due to the combination of slower velocities and a lower-frequency wave train. Azimuthal resistivity/IP (induced polarization) and azimuthal electromagnetics were used to determine fracture orientations and groundwater flow directions. Terrain conductivity was used to map the fluvial sedimentary sequences (e.g., paleochannel and overbank deposits) in the North Platte River floodplain. Conductivity measurements were also used to estimate bedrock depth and to assist in the placement and recording parameters of the azimuthal soundings. The geophysical investigation indicated that groundwater flow pathways were controlled by the fluvial paleochannels and bedrock erosional features. The primary groundwater flow direction in the bedrock and colluvial sediments was determined from the azimuthal measurements and confirmed by drilling to be N20-40W, along the measured strike of the bedrock and joint orientations. Joint/fracture orientations were measured at N20-40W and N10-30E from the azimuthal data and confirmed from measurements at a bedrock outcrop south of the site. The bedrock has an apparent N10E anisotropy in the seismic velocity profiles on the old refinery property that closely matches the measured joint/fracture orientations and may have a minor effect on groundwater flow.

  6. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    PubMed Central

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC
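    The weighted linear combination (WLC) step with Monte Carlo weight perturbation described above can be sketched in a few lines. This is a simplified illustration under stated assumptions: the weights are perturbed with Gaussian noise, whereas the paper derives weight probability density functions from the AHP-based criterion ranking; function names and the noise model are hypothetical.

    ```python
    import numpy as np

    def wlc_susceptibility(criteria, weights):
        """Weighted linear combination of normalized criterion rasters.

        criteria: array of shape (n_criteria, H, W); weights are
        renormalized to sum to one before combination.
        """
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()
        return np.tensordot(w, criteria, axes=1)  # (H, W) susceptibility map

    def monte_carlo_wlc(criteria, weights, sigma=0.05, n_runs=500, seed=0):
        """Propagate weight uncertainty through the WLC via Monte Carlo.

        Returns the per-pixel mean and standard deviation of the
        susceptibility score over n_runs perturbed weight vectors.
        """
        rng = np.random.default_rng(seed)
        runs = []
        for _ in range(n_runs):
            noisy = np.clip(np.asarray(weights, dtype=float)
                            + rng.normal(0.0, sigma, len(weights)), 1e-6, None)
            runs.append(wlc_susceptibility(criteria, noisy))
        runs = np.stack(runs)
        return runs.mean(axis=0), runs.std(axis=0)
    ```

    The per-pixel standard deviation map is what supports the uncertainty assessment: pixels whose susceptibility class flips under weight perturbation are the unreliable ones.
    
    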

  7. Mapping the progress and impacts of public health approaches to palliative care: a scoping review protocol

    PubMed Central

    Archibald, Daryll; Patterson, Rebecca; Haraldsdottir, Erna; Hazelwood, Mark; Fife, Shirley; Murray, Scott A

    2016-01-01

    Introduction Public health palliative care is a term that can be used to encompass a variety of approaches that involve working with communities to improve people's experience of death, dying and bereavement. Recently, public health palliative care approaches have gained recognition and momentum within UK health policy and palliative care services. There is general consensus that public health palliative care approaches can complement and go beyond the scope of formal service models of palliative care. However, there is no clarity about how these approaches can be undertaken in practice or how evidence can be gathered relating to their effectiveness. Here we outline a scoping review protocol that will systematically map and categorise the variety of activities and programmes that could be classified under the umbrella term ‘public health palliative care’ and highlight the impact of these activities where measured. Methods and analysis This review will be guided by Arksey and O'Malley's scoping review methodology and incorporate insights from more recent innovations in scoping review methodology. Sensitive searches of 9 electronic databases from 1999 to 2016 will be supplemented by grey literature searches. Eligible studies will be screened independently by two reviewers using a data charting tool developed for this scoping review. Ethics and dissemination This scoping review will undertake a secondary analysis of data already collected and does not require ethical approval. The results will facilitate better understanding of the practical application of public health approaches to palliative care, the impacts these activities can have and how to build the evidence base for this work in future. The results will be disseminated through traditional academic routes such as conferences and journals and also policy and third sector seminars. PMID:27417201

  8. Identifying Inhibitors of Epithelial-Mesenchymal Transition by Connectivity-Map Based Systems Approach

    PubMed Central

    Reka, Ajaya Kumar; Kuick, Rork; Kurapati, Himabindu; Standiford, Theodore J.; Omenn, Gilbert S.; Keshamouni, Venkateshwar G.

    2011-01-01

    Background Acquisition of mesenchymal phenotype by epithelial cells by means of epithelial mesenchymal transition (EMT) is considered as an early event in the multi-step process of tumor metastasis. Therefore, inhibition of EMT might be a rational strategy to prevent metastasis. Methods Utilizing the global gene expression profile from a cell culture model of TGF-β-induced EMT, we identified potential EMT inhibitors. We used a publicly available database (www.broad.mit.edu/cmap) comprising gene expression profiles obtained from multiple different cell lines in response to various drugs to derive negative correlations to EMT gene expression profile using Connectivity Map (C-Map), a pattern matching tool. Results Experimental validation of the identified compounds showed rapamycin as a novel inhibitor of TGF-β signaling along with 17-AAG, a known modulator of TGF-β pathway. Both of these compounds completely blocked EMT and the associated migratory and invasive phenotype. The other identified compound, LY294002, demonstrated a selective inhibition of mesenchymal markers, cell migration and invasion, without affecting the loss of E-cadherin expression or Smad phosphorylation. Conclusions Collectively, our data reveals that rapamycin is a novel modulator of TGF-β signaling, and along with 17-AAG and LY294002, could be used as therapeutic agent for inhibiting EMT. Also, this analysis demonstrates the potential of a systems approach in identifying novel modulators of a complex biological process. PMID:21964532

  9. An efficient unsupervised index based approach for mapping urban vegetation from IKONOS imagery

    NASA Astrophysics Data System (ADS)

    Anchang, Julius Y.; Ananga, Erick O.; Pu, Ruiliang

    2016-08-01

    Despite the increased availability of high-resolution satellite image data, their operational use for mapping urban land cover in Sub-Saharan Africa continues to be limited by lack of computational resources and technical expertise. As such, there is a need for simple and efficient image classification techniques. Using Bamenda in North West Cameroon as a test case, we investigated two completely unsupervised pixel-based approaches to extract tree/shrub (TS) and ground vegetation (GV) cover from an IKONOS-derived soil-adjusted vegetation index. These included: (1) a simple Jenks Natural Breaks classification and (2) a two-step technique that combined the Jenks algorithm with agglomerative hierarchical clustering. Both techniques were compared with each other and with a non-linear support vector machine (SVM) for classification performance. While overall classification accuracy was generally high for all techniques (>90%), One-Way Analysis of Variance tests revealed the two-step technique to outperform the simple Jenks classification in terms of predicting the GV class. It also outperformed the SVM in predicting the TS class. We conclude that the unsupervised methods are technically as good and practically superior for efficient urban vegetation mapping in budget- and technically-constrained regions such as Sub-Saharan Africa.
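    The Jenks Natural Breaks idea used above can be illustrated with a minimal two-class version: choose the break in a 1-D set of index values that minimizes the total within-class sum of squared deviations. The paper applies a full multi-class Jenks algorithm (and, in the two-step variant, hierarchical clustering on top); this exhaustive-search sketch and its function name are illustrative assumptions.

    ```python
    import numpy as np

    def jenks_break_2class(values):
        """Single Jenks-style natural break for a two-class split.

        Exhaustively searches for the break that minimizes the total
        within-class sum of squared deviations, e.g. to threshold a
        vegetation index into vegetated / non-vegetated pixels.
        Returns the smallest value assigned to the upper class.
        """
        v = np.sort(np.asarray(values, dtype=float))
        best_cost, best_break = np.inf, None
        for i in range(1, len(v)):
            left, right = v[:i], v[i:]
            cost = (((left - left.mean()) ** 2).sum()
                    + ((right - right.mean()) ** 2).sum())
            if cost < best_cost:
                best_cost, best_break = cost, v[i]
        return best_break
    ```

    Because the search only needs the sorted index values, it is cheap enough to run unsupervised on a whole scene, which is the practical appeal noted in the abstract.
    
    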

  10. Object-based approach to national land cover mapping using HJ satellite imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Li, Xiaosong; Yuan, Quanzhi; Liu, Yu

    2014-01-01

    To meet the needs of carbon storage estimation in ecosystems for a national carbon strategy, we introduce a consistent database of China land cover. The Chinese Huan Jing (HJ) satellite is proven efficient in the cloud-free acquisition of seasonal image series in a monsoon region and in vegetation identification for mesoscale land cover mapping. Thirty-eight classes of level II land cover are generated based on the Land Cover Classification System of the United Nations Food and Agriculture Organization, which follows a standard and quantitative definition. Twenty-four layers of derivative spectral, environmental, and spatial features compose the classification database. An object-based approach characterizing additional nonspectral features is applied throughout the mapping, and multiscale segmentations are used to match object boundaries to real-world conditions. This method fully exploits spatial information, in addition to spectral characteristics, to improve classification accuracy. A hierarchical classification algorithm is employed, with step-by-step procedures that effectively control classification quality. The algorithm is divided into dual structures of universal and local trees: consistent universal trees suitable for most regions are applied first, followed by local trees that depend on specific features of nine climate stratifications. An independent validation indicates that the overall accuracy reaches 86%.

  11. Mind-mapping for lung cancer: Towards a personalized therapeutics approach

    PubMed Central

    Mollberg, N; Surati, M; Demchuk, C; Fathi, R; Salama, AK; Husain, AN; Hensing, T; Salgia, R

    2011-01-01

    There will be over 220,000 people diagnosed with lung cancer and over 160,000 dying of lung cancer this year alone in the United States. In order to arrive at better control, prevention, diagnosis, and therapeutics for lung cancer, we must be able to personalize the approach towards lung cancer. Mind-mapping has existed for centuries for physicians to properly think about various “flows” of personalized medicine. We include here the epidemiology, diagnosis, histology, and treatment of lung cancer—specifically, non-small cell lung cancer. As we have new molecular signatures for lung cancer, this is further detailed. This review is not meant to be a comprehensive review, but rather its purpose is to highlight important aspects of lung cancer diagnosis, management, and personalized treatment options. PMID:21337123

  12. Multipoint linkage analysis using sib pairs: An interval mapping approach for dichotomous outcomes

    SciTech Connect

    Olson, J.M.

    1995-03-01

    I propose an interval mapping approach suitable for a dichotomous outcome, with emphasis on samples of affected sib pairs. The method computes a lod score for each of a set of locations in the interval between two flanking markers and takes as its estimate of trait-locus location the position of the maximum lod score in the interval, provided it exceeds the prespecified critical value. Use of the method depends on prior knowledge of the genetic model for the disease only through available estimates of recurrence risk to relatives of affected individuals. The method gives an unbiased estimate of location, provided the recurrence risks are correctly specified and the marker identity-by-descent probabilities are jointly, rather than individually, estimated. I also discuss use of the method for traits determined by two loci and give an approximation that has good power for a wide range of two-locus models. 25 refs., 2 figs., 9 tabs.
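The scan-and-threshold logic described in the abstract (maximise the lod over a grid of locations in the flanking-marker interval, and accept the location only if the lod clears a critical value) can be sketched as below. The lod curve here is a hypothetical stand-in: computing the true lod from sib-pair identity-by-descent probabilities is the substance of the paper and is not reproduced.

```python
import math

def scan_interval(lod, grid, critical=3.0):
    """Take the location with the maximum LOD in the interval, provided
    it exceeds the prespecified critical value; otherwise report no linkage."""
    x_hat = max(grid, key=lod)
    return (x_hat, lod(x_hat)) if lod(x_hat) >= critical else None

# Toy LOD curve peaking inside a 10 cM marker interval (hypothetical).
grid = [i * 0.5 for i in range(21)]                    # 0..10 cM in 0.5 steps
curve = lambda x: 4.2 * math.exp(-((x - 6.0) ** 2) / 4.0)
result = scan_interval(curve, grid)
print(result)
```

With this toy curve the estimate is 6.0 cM; a flat curve below the critical value would instead return None, i.e. no claimed linkage.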

  13. A universal airborne LiDAR approach for tropical forest carbon mapping.

    PubMed

    Asner, Gregory P; Mascaro, Joseph; Muller-Landau, Helene C; Vieilledent, Ghislain; Vaudry, Romuald; Rasamoelina, Maminiaina; Hall, Jefferson S; van Breugel, Michiel

    2012-04-01

    Airborne light detection and ranging (LiDAR) is fast turning the corner from demonstration technology to a key tool for assessing carbon stocks in tropical forests. With its ability to penetrate tropical forest canopies and detect three-dimensional forest structure, LiDAR may prove to be a major component of international strategies to measure and account for carbon emissions from and uptake by tropical forests. To date, however, basic ecological information such as height-diameter allometry and stand-level wood density have not been mechanistically incorporated into methods for mapping forest carbon at regional and global scales. A better incorporation of these structural patterns in forests may reduce the considerable time needed to calibrate airborne data with ground-based forest inventory plots, which presently necessitate exhaustive measurements of tree diameters and heights, as well as tree identifications for wood density estimation. Here, we develop a new approach that can facilitate rapid LiDAR calibration with minimal field data. Throughout four tropical regions (Panama, Peru, Madagascar, and Hawaii), we were able to predict aboveground carbon density estimated in field inventory plots using a single universal LiDAR model (r^2 = 0.80, RMSE = 27.6 Mg C ha^-1). This model is comparable in predictive power to locally calibrated models, but relies on limited inputs of basal area and wood density information for a given region, rather than on traditional plot inventories. With this approach, we propose to radically decrease the time required to calibrate airborne LiDAR data and thus increase the output of high-resolution carbon maps, supporting tropical forest conservation and climate mitigation policy. PMID:22033763
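A "universal" model of this kind is commonly written as a power law in top-of-canopy height (TCH), basal area, and wood density, fitted in log space. The sketch below uses synthetic plot data and ordinary least squares; the variable names, coefficients, and data are illustrative, not the paper's fitted values:

```python
import numpy as np

# Hypothetical plot data: top-of-canopy height (m), basal area (m^2/ha),
# basal-area-weighted wood density (g/cm^3), field carbon (Mg C/ha).
tch = np.array([10., 15., 20., 25., 30.])
ba  = np.array([12., 18., 25., 30., 38.])
wd  = np.array([0.55, 0.58, 0.60, 0.62, 0.65])
acd = 0.5 * tch**0.8 * ba**1.0 * wd**1.2       # synthetic "truth"

# Fit log(ACD) = log(a) + b1*log(TCH) + b2*log(BA) + b3*log(WD)
X = np.column_stack([np.ones_like(tch), np.log(tch), np.log(ba), np.log(wd)])
coef, *_ = np.linalg.lstsq(X, np.log(acd), rcond=None)
a, b1, b2, b3 = np.exp(coef[0]), coef[1], coef[2], coef[3]
print(round(a, 3), round(b1, 3), round(b2, 3), round(b3, 3))
```

Because the synthetic data come exactly from the power law, the log-linear fit recovers the generating coefficients; with real plots the residual scatter is what the reported RMSE summarises.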

  14. Effect of delayed feedback on the dynamics of a scalar map via a frequency-domain approach.

    PubMed

    Gentile, Franco S; Bel, Andrea L; Belén D'Amico, M; Moiola, Jorge L

    2011-06-01

    The effect of delayed feedback on the dynamics of a scalar map is studied by using a frequency-domain approach. Explicit conditions for the occurrence of period-doubling and Neimark-Sacker bifurcations in the controlled map are found analytically. The appearance of a 1:2 resonance for certain values of the delay is also formalized, revealing that this phenomenon is independent of the system parameters. A detailed study of the well-known logistic map under delayed feedback is included for illustration. PMID:21721759
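A minimal numerical illustration of delayed feedback acting on a scalar map, assuming a Pyragas-style control term k*(x_{n-tau} - x_n) added to the logistic map (the paper's exact control form may differ):

```python
def logistic_dfc(r, k, tau, x0=0.3, n=2000):
    """Iterate x_{n+1} = f(x_n) + k*(x_{n-tau} - x_n) with
    f(x) = r*x*(1-x): the logistic map under a delayed-feedback term
    (a sketch; the paper's control form may differ)."""
    xs = [x0] * (tau + 1)          # constant prehistory
    for _ in range(n):
        x, xd = xs[-1], xs[-1 - tau]
        xs.append(r * x * (1 - x) + k * (xd - x))
    return xs

# Free map at r = 3.2: stable period-2 orbit.
free = logistic_dfc(r=3.2, k=0.0, tau=1)
# With gain k = -0.5 the fixed point x* = 1 - 1/r = 0.6875 is stabilised.
controlled = logistic_dfc(r=3.2, k=-0.5, tau=1)
print(round(free[-2], 4), round(free[-1], 4), round(controlled[-1], 4))
```

At r = 3.2 the free map has already period-doubled; linearising the controlled two-dimensional map about x* and applying the Jury conditions gives stability for roughly -1 < k < -0.1, so k = -0.5 suppresses the period-2 oscillation and the orbit settles on the fixed point.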

  15. Labelling plants the Chernobyl way: A new approach for mapping rhizodeposition and biopore reuse

    NASA Astrophysics Data System (ADS)

    Banfield, Callum; Kuzyakov, Yakov

    2016-04-01

    A novel approach for mapping root distribution and rhizodeposition using 137Cs and 14C was applied. By immersing cut leaves into vials containing 137CsCl solution, the 137Cs label is taken up and partly released into the rhizosphere, where it strongly binds to soil particles, thus labelling the distribution of root channels in the long term. Reuse of root channels in crop rotations can be determined by labelling the first crop with 137Cs and the following crop with 14C. Imaging of the β- radiation with strongly differing energies differentiates active roots growing in existing root channels (14C + 137Cs activity) from roots growing in bulk soil (14C activity only). The feasibility of the approach was shown in a pot experiment with ten plants of two species, Cichorium intybus L., and Medicago sativa L. The same plants were each labelled with 100 kBq of 137CsCl and after one week with 500 kBq of 14CO2. 96 h later pots were cut horizontally at 6 cm depth. After the first 137Cs + 14C imaging of the cut surface, imaging was repeated with three layers of plastic film between the cut surface and the plate for complete shielding of 14C β- radiation to the background level, producing an image of the 137Cs distribution. Subtracting the second image from the first gave the 14C image. Both species allocated 18 - 22% of the 137Cs and about 30 - 40% of 14C activity below ground. Intensities far above the detection limit suggest that this approach is applicable to map the root system by 137Cs and to obtain root size distributions through image processing. The rhizosphere boundary was defined by the point at which rhizodeposited 14C activity declined to 5% of the activity of the root centre. Medicago showed 25% smaller rhizosphere extension than Cichorium, demonstrating that plant-specific rhizodeposition patterns can be distinguished. 
Our new approach is appropriate to visualise processes and hotspots on multiple scales: Heterogeneous rhizodeposition, as well as size and counts
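The shield-and-subtract imaging step can be expressed as simple array algebra. A toy sketch with hypothetical count grids, where the film-shielded exposure records only the 137Cs signal:

```python
import numpy as np

# Toy 5x5 "imaging plate" counts (hypothetical numbers).
# combined: first exposure records beta activity of both 14C and 137Cs.
# shielded: plastic film blocks the weak 14C betas, leaving 137Cs only.
rng = np.random.default_rng(1)
cs137 = rng.integers(0, 50, size=(5, 5))      # root-channel label (first crop)
c14   = rng.integers(0, 30, size=(5, 5))      # active-root label (second crop)
combined = cs137 + c14
shielded = cs137

# Subtracting the shielded image isolates the 14C signal.
c14_recovered = combined - shielded
# Pixels with both signals mark root channels reused by the new roots.
reuse = (c14_recovered > 0) & (cs137 > 0)
print(bool(np.array_equal(c14_recovered, c14)))   # prints True
```

With integer count arrays the subtraction is exact; real plate images would additionally need background correction and registration of the two exposures.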

  16. An Improved Approach for Mapping Quantitative Trait loci in a Pseudo-Testcross: Revisiting a Poplar Mapping Study

    SciTech Connect

    Wullschleger, Stan D; Wu, Song; Wu, Rongling; Yang, Jie; Li, Yao; Yin, Tongming; Tuskan, Gerald A

    2010-01-01

    A pseudo-testcross pedigree is widely used for mapping quantitative trait loci (QTL) in outcrossing species, but the model for analyzing pseudo-testcross data, borrowed from the inbred backcross design, can detect only those QTLs that are heterozygous in one parent. In this study, an intercross model that incorporates the high heterozygosity and phase uncertainty of outcrossing species was used to reanalyze a published data set on QTL mapping in poplar trees. Several intercross QTLs that are heterozygous in both parents were detected, which are responsible not only for biomass traits but also for their genetic correlations. This study provides a more complete identification of QTLs responsible for economically important biomass traits in poplars.

  17. An Improved Approach for Mapping Quantitative Trait Loci in a Pseudo-Testcross: Revisiting a Poplar Mapping Study

    SciTech Connect

    Tuskan, Gerald A; Yin, Tongming; Wullschleger, Stan D; Yang, Jie; Huang, Youjun; Li, Yao; Wu, Rongling

    2010-01-01

    A pseudo-testcross pedigree is widely used for mapping quantitative trait loci (QTL) in outcrossing species, but the model for analyzing pseudo-testcross data, borrowed from the inbred backcross design, can detect only those QTLs that are heterozygous in one parent. In this study, an intercross model that incorporates the high heterozygosity and phase uncertainty of outcrossing species was used to reanalyze a published data set on QTL mapping in poplar trees. Several intercross QTLs that are heterozygous in both parents were detected, which are responsible not only for biomass traits but also for their genetic correlations. This study provides a more complete identification of QTLs responsible for economically important biomass traits in poplars.

  18. Chiral anomaly, bosonization, and fractional charge

    SciTech Connect

    Mignaco, J.A.; Monteiro, M.A.R.

    1985-06-15

    We present a method to evaluate the Jacobian of chiral rotations, regulating determinants through the proper-time method and using Seeley's asymptotic expansion. With this method we easily compute the chiral anomaly for ν = 4,6 dimensions, discuss bosonization of some massless two-dimensional models, and handle the problem of charge fractionization. In addition, we comment on the general validity of Fujikawa's approach to regulating the Jacobian of chiral rotations with non-Hermitian operators.

  19. A stochastic simulation framework for the prediction of strategic noise mapping and occupational noise exposure using the random walk approach.

    PubMed

    Han, Lim Ming; Haron, Zaiton; Yahya, Khairulzan; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri

    2015-01-01

    Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels. Also, it assesses the impact of noise on the workers and the surrounding environment. For data validation, three case studies were conducted to check the accuracy of the prediction data and to determine the efficiency and effectiveness of this approach. The results showed high accuracy of prediction results together with a majority of absolute differences of less than 2 dBA; also, the predicted noise doses were mostly in the range of measurement. Therefore, the random walk approach was effective in dealing with environmental noises. It could predict strategic noise mapping to facilitate noise monitoring and noise control in the workplaces. PMID:25875019
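A stripped-down version of the idea, assuming a random-walking worker, free-field point sources (Lp = Lw - 20*log10(r) - 11) and an energy-averaged Leq; the names, geometry, and propagation model are illustrative, not RW-eNMS internals:

```python
import math, random

def noise_level_db(pos, sources):
    """Sound pressure level at pos from point sources, assuming free-field
    spherical spreading: Lp = Lw - 20*log10(r) - 11 (a textbook formula;
    the paper's propagation model may include more terms)."""
    total = 0.0
    for (sx, sy, lw) in sources:
        r = max(math.dist(pos, (sx, sy)), 1.0)   # clamp below 1 m
        lp = lw - 20 * math.log10(r) - 11
        total += 10 ** (lp / 10)                  # sum on an energy basis
    return 10 * math.log10(total)

def simulate_dose(sources, steps=1000, seed=42):
    """Random-walk a worker on a 50 m x 50 m floor; return the
    equivalent continuous level Leq over the walk."""
    rng = random.Random(seed)
    x, y = 25.0, 25.0
    energies = []
    for _ in range(steps):
        x = min(max(x + rng.choice([-1, 0, 1]), 0), 50)
        y = min(max(y + rng.choice([-1, 0, 1]), 0), 50)
        energies.append(10 ** (noise_level_db((x, y), sources) / 10))
    return 10 * math.log10(sum(energies) / len(energies))

machines = [(10, 10, 105), (40, 35, 100)]   # (x, y, sound power Lw in dB)
leq = simulate_dose(machines)
print(round(leq, 1))
```

Repeating the walk for many seeds yields a distribution of doses, which is the stochastic element the framework exploits; mapping noise_level_db over a grid of fixed positions gives the corresponding noise map.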

  20. A Stochastic Simulation Framework for the Prediction of Strategic Noise Mapping and Occupational Noise Exposure Using the Random Walk Approach

    PubMed Central

    Haron, Zaiton; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri

    2015-01-01

    Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels. Also, it assesses the impact of noise on the workers and the surrounding environment. For data validation, three case studies were conducted to check the accuracy of the prediction data and to determine the efficiency and effectiveness of this approach. The results showed high accuracy of prediction results together with a majority of absolute differences of less than 2 dBA; also, the predicted noise doses were mostly in the range of measurement. Therefore, the random walk approach was effective in dealing with environmental noises. It could predict strategic noise mapping to facilitate noise monitoring and noise control in the workplaces. PMID:25875019

  1. A new approach of mapping soils in the Alps - Challenges of deriving soil information and creating soil maps for sustainable land use. An example from South Tyrol (Italy)

    NASA Astrophysics Data System (ADS)

    Baruck, Jasmin; Gruber, Fabian E.; Geitner, Clemens

    2015-04-01

    Nowadays sustainable land use management is gaining importance because intensive land use leads to increasing soil degradation. Sustainable land use management is especially important in mountainous regions like the Alps, where topography limits land use. Therefore, a database containing detailed information on soil characteristics is required. However, information on soil properties is far from comprehensive. The project "ReBo - Terrain classification based on airborne laser scanning data to support soil mapping in the Alps", funded by the Autonomous Province of Bolzano, aims at developing a methodical framework for obtaining soil data. The approach combines geomorphometric analysis and soil mapping to generate modern medium-scale soil maps in a time- and cost-efficient way. In this study the open source GRASS GIS extension module r.geomorphon (Jasiewicz and Stepinski, 2013) is used to derive topographically homogeneous landform units from high-resolution DTMs at a scale of 1:5,000. Furthermore, for terrain segmentation and classification we additionally use medium-scale data sets (geology, parent material, land use etc.). As the Alps are characterized by a great variety of topography and parent material, a wide range of moisture regimes, etc., getting reliable soil data is difficult. Additionally, geomorphic activity (debris flows, landslides etc.) leads to natural disturbances. Thus, soil properties are highly diverse and largely scale dependent. Furthermore, obtaining information on anthropogenically influenced soils is an added challenge. Due to intensive cultivation techniques the natural link between the soil-forming factors is often severed. South Tyrol contains the largest pome-fruit-producing area in Europe. Normally, the annual precipitation is not enough for intensive orcharding. Thus, irrigation strategies are in use. However, as knowledge about the small-scale heterogeneous soil properties is mostly lacking, overwatering and modifications of the

  2. Wide-field optical mapping of neural activity and brain haemodynamics: considerations and novel approaches.

    PubMed

    Ma, Ying; Shaik, Mohammed A; Kim, Sharon H; Kozberg, Mariel G; Thibodeaux, David N; Zhao, Hanzhi T; Yu, Hang; Hillman, Elizabeth M C

    2016-10-01

    Although modern techniques such as two-photon microscopy can now provide cellular-level three-dimensional imaging of the intact living brain, the speed and fields of view of these techniques remain limited. Conversely, two-dimensional wide-field optical mapping (WFOM), a simpler technique that uses a camera to observe large areas of the exposed cortex under visible light, can detect changes in both neural activity and haemodynamics at very high speeds. Although WFOM may not provide single-neuron or capillary-level resolution, it is an attractive and accessible approach to imaging large areas of the brain in awake, behaving mammals at speeds fast enough to observe widespread neural firing events, as well as their dynamic coupling to haemodynamics. Although such wide-field optical imaging techniques have a long history, the advent of genetically encoded fluorophores that can report neural activity with high sensitivity, as well as modern technologies such as light-emitting diodes and sensitive, high-speed digital cameras, has driven renewed interest in WFOM. To facilitate the wider adoption and standardization of WFOM approaches for neuroscience and neurovascular coupling research, we provide here an overview of the basic principles of WFOM, considerations for implementation of wide-field fluorescence imaging of neural activity, spectroscopic analysis and interpretation of results. This article is part of the themed issue 'Interpreting BOLD: a dialogue between cognitive and cellular neuroscience'. PMID:27574312

  3. Identifying Potential Areas for Siting Interim Nuclear Waste Facilities Using Map Algebra and Optimization Approaches

    SciTech Connect

    Omitaomu, Olufemi A; Liu, Cheng; Cetiner, Sacit M; Belles, Randy; Mays, Gary T; Tuttle, Mark A

    2013-01-01

    The renewed interest in siting new nuclear power plants in the United States has brought to center stage the need to site interim facilities for long-term management of spent nuclear fuel (SNF). In this paper, a two-stage approach for identifying potential areas for siting interim SNF facilities is presented. In the first stage, the land area is discretized into grids of uniform size (e.g., 100 m x 100 m grids). For the continental United States, this process resulted in a data matrix of about 700 million cells. Each cell of the matrix is then characterized as a binary decision variable to indicate whether an exclusion criterion is satisfied or not. A binary data matrix is created for each of the 25 siting criteria considered in this study. Using a map algebra approach, cells that satisfy all criteria are clustered and regarded as potential siting areas. In the second stage, an optimization problem is formulated as a p-median problem on a rail network such that the sum of the shortest distances between nuclear power plants with SNF and the potential storage sites from the first stage is minimized. The implications of the obtained results for energy policies are presented and discussed.
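The first-stage overlay is classic map algebra: a logical AND across binary criterion rasters. A toy sketch with four random layers standing in for the study's 25 criteria:

```python
import numpy as np

# Each siting criterion becomes a binary raster over the discretised grid
# (True = the cell passes that criterion). A map-algebra AND across the
# stack keeps only cells that satisfy all criteria.
rng = np.random.default_rng(0)
shape = (6, 8)                                   # tiny stand-in grid
layers = rng.random((4, *shape)) < 0.8           # ~80% of cells pass each
suitable = np.logical_and.reduce(layers)         # overlay across criteria
print(int(suitable.sum()), "candidate cells of", suitable.size)
```

The surviving cells would then be clustered into contiguous candidate areas and fed to the second-stage p-median optimization on the rail network.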

  4. Interacting boson models for N ~ Z nuclei

    NASA Astrophysics Data System (ADS)

    Van Isacker, P.

    2011-05-01

    This contribution discusses the use of boson models in the description of N ~ Z nuclei. A brief review is given of earlier attempts, initiated by Elliott and co-workers, to extend the interacting boson model of Arima and Iachello by the inclusion of neutron-proton s and d bosons with T = 1 (IBM-3) as well as T = 0 (IBM-4). It is argued that for the N ~ Z nuclei that are currently studied experimentally, a different approach is needed which invokes aligned neutron-proton pairs with angular momentum J = 2j and isospin T = 0. This claim is supported by an analysis of shell-model wave functions in terms of pair states. Results of this alternative version of the interacting boson model are compared with shell-model calculations in the 1g9/2 shell.

  5. Bose-Einstein condensates of bosonic Thomson atoms

    NASA Astrophysics Data System (ADS)

    Schneider, Tobias; Blümel, Reinhold

    1999-10-01

    A system of charged particles in a harmonic trap is a realization of Thomson's raisin-cake model; therefore, we call it a Thomson atom. Bosonic, fermionic and mixed Thomson atoms exist. In this paper we focus on bosonic Thomson atoms in isotropic traps. Approximating the exact ground state by a condensate, we investigate ground-state properties at temperature T = 0 using Hartree-Fock theory for bosons. In order to assess the quality of our mean-field approach, we compare the Hartree-Fock results for bosonic Thomson helium with an exact diagonalization. In contrast to the weakly interacting Bose gas (alkali vapours), mean-field calculations are reliable in the limit of large particle density. The Wigner regime (low particle density) is discussed.

  6. Mapping behavioral landscapes for animal movement: a finite mixture modeling approach

    USGS Publications Warehouse

    Tracey, Jeff A.; Zhu, Jun; Boydston, Erin E.; Lyren, Lisa M.; Fisher, Robert N.; Crooks, Kevin R.

    2013-01-01

    Because of its role in many ecological processes, movement of animals in response to landscape features is an important subject in ecology and conservation biology. In this paper, we develop models of animal movement in relation to objects or fields in a landscape. We take a finite mixture modeling approach in which the component densities are conceptually related to different choices for movement in response to a landscape feature, and the mixing proportions are related to the probability of selecting each response as a function of one or more covariates. We combine particle swarm optimization and an Expectation-Maximization (EM) algorithm to obtain maximum likelihood estimates of the model parameters. We use this approach to analyze data for movement of three bobcats in relation to urban areas in southern California, USA. A behavioral interpretation of the models revealed similarities and differences in bobcat movement response to urbanization. All three bobcats avoided urbanization by moving either parallel to urban boundaries or toward less urban areas as the proportion of urban land cover in the surrounding area increased. However, one bobcat, a male with a dispersal-like large-scale movement pattern, avoided urbanization at lower densities and responded strictly by moving parallel to the urban edge. The other two bobcats, which were both residents and occupied similar geographic areas, avoided urban areas using a combination of movements parallel to the urban edge and movement toward areas of less urbanization. However, the resident female appeared to exhibit greater repulsion at lower levels of urbanization than the resident male, consistent with empirical observations of bobcats in southern California. Using the parameterized finite mixture models, we mapped behavioral states to geographic space, creating a representation of a behavioral landscape. 
This approach can provide guidance for conservation planning based on analysis of animal movement data using

  7. Towards a virtual hub approach for landscape assessment and multimedia ecomuseum using multitemporal-maps

    NASA Astrophysics Data System (ADS)

    Brumana, R.; Santana Quintero, M.; Barazzetti, L.; Previtali, M.; Banfi, F.; Oreni, D.; Roels, D.; Roncoroni, F.

    2015-08-01

    Landscapes are dynamic entities, stretching and transforming across space and time, and need to be safeguarded as living places for the future, with interaction of human, social and economic dimensions. A comprehensive landscape evaluation needs several open data sources, each characterized by its own protocol and service interface, which limits or impedes interoperability and their integration. Indeed, nowadays the development of websites targeted at landscape assessment and touristic purposes requires many resources in terms of time, cost and IT skills to be implemented at different scales. For this reason these applications are limited to a few cases, mainly focusing on worldwide-known touristic sites. The capability to spread the development of web-based multimedia virtual museums based on geospatial data relies, for the future, on the possibility to discover the needed geospatial data through a single point of access in a homogeneous way. The innovative approach proposed in this paper may facilitate access to open data in a homogeneous way by means of specific components (the brokers) performing the interoperability actions required to interconnect heterogeneous data sources. In the specific case study analysed here, an interface has been implemented to migrate a geo-swat chart based on local and regional geographic information into a user-friendly Google Earth©-based infrastructure, integrating ancient cadastres and modern cartography, accessible by professionals and tourists via the web and also via portable devices like tablets and smartphones. The general aim of this work on the case study of the Lake of Como (Tremezzina municipality) is to boost the integration of assessment methodologies with digital geo-based technologies of map correlation for the multimedia ecomuseum system accessible via web. The developed WebGIS system integrates multi-scale and multi-temporal maps with different information (cultural, historical, landscape levels

  8. The influence of mapped hazards on risk beliefs: A proximity-based modeling approach

    PubMed Central

    Severtson, Dolores J.; Burt, James E.

    2013-01-01

    Interview findings suggest perceived proximity to mapped hazards influences risk beliefs when people view environmental hazard maps. For dot maps, four attributes of mapped hazards influenced beliefs: hazard value, proximity, prevalence, and dot patterns. In order to quantify the collective influence of these attributes for viewers' perceived or actual map locations, we present a model to estimate proximity-based hazard or risk (PBH) and share study results that indicate how modeled PBH and map attributes influenced risk beliefs. The randomized survey study among 447 university students assessed risk beliefs for 24 dot maps that systematically varied by the four attributes. Maps depicted water test results for a fictitious hazardous substance in private residential wells and included a designated “you live here” location. Of the nine variables that assessed risk beliefs, the numerical susceptibility variable was most consistently and strongly related to map attributes and PBH. Hazard value, location in or out of a clustered dot pattern, and distance had the largest effects on susceptibility. Sometimes, hazard value interacted with other attributes, e.g. distance had stronger effects on susceptibility for larger than smaller hazard values. For all combined maps, PBH explained about the same amount of variance in susceptibility as did attributes. Modeled PBH may have utility for studying the influence of proximity to mapped hazards on risk beliefs, protective behavior, and other dependent variables. Further work is needed to examine these influences for more realistic maps and representative study samples. PMID:22053748
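One simple form of such a proximity-based hazard (PBH) score is a distance-weighted sum of mapped dot values around the viewer's location; the weighting function below is an illustrative assumption, not the model from the paper:

```python
import math

def pbh(viewer, dots, decay=1.0):
    """Proximity-based hazard: distance-weighted sum of mapped hazard
    values around the viewer's location (an illustrative weighting;
    the paper's exact model is not reproduced here)."""
    total = 0.0
    for (x, y, value) in dots:
        d = math.dist(viewer, (x, y))
        total += value / (1.0 + decay * d)
    return total

# Two maps with equal total hazard: dots clustered near the viewer
# versus the same values dispersed far away.
near = [(1, 0, 10), (0, 1, 10), (1, 1, 10)]
far  = [(8, 0, 10), (0, 8, 10), (8, 8, 10)]
print(pbh((0, 0), near) > pbh((0, 0), far))   # prints True
```

Such a score captures the study's point that hazard value, proximity, prevalence, and clustering jointly drive perceived susceptibility, since all four move the weighted sum.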

  9. Sequencing the Pig Genome Using a Mapped BAC by BAC Approach

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We have generated a highly contiguous physical map covering >98% of the pig genome in just 176 contigs. The map is localised to the genome through integration with the UIUC RH map as well as BAC end sequence alignments to the human genome. Over 265k HindIII restriction digest fingerprints totalling 1...

  10. AUTOMATED APPROACHES FOR REGIONAL RUNOFF MAPPING IN THE NORTHEASTERN UNITED STATES

    EPA Science Inventory

    Maps of runoff are useful tools in water resource planning and in regional scientific studies, but producing such maps can be a challenging task. The authors have used automated procedures to develop simple, yet accurate, ways to create runoff contour maps. The goal was to produce ...

  11. Reliable Radiation Hybrid Maps: An Efficient Scalable Clustering-based Approach

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The process of mapping markers from radiation hybrid mapping (RHM) experiments is equivalent to the traveling salesman problem and, thereby, has combinatorial complexity. As an additional problem, experiments typically result in some unreliable markers that reduce the overall quality of the map. We ...

  12. Mapping Natural Terroir Units using a multivariate approach and legacy data

    NASA Astrophysics Data System (ADS)

    Priori, Simone; Barbetti, Roberto; L'Abate, Giovanni; Bucelli, Piero; Storchi, Paolo; Costantini, Edoardo A. C.

    2014-05-01

    The vineyard area of Siena province was subdivided into 9 NTUs, statistically differentiated by the variables used. The study demonstrated the strength of a multivariate approach for NTU mapping at the province scale (1:125,000), using viticultural legacy data. Identification and mapping of terroir diversity within the DOC and DOCG areas at the province scale suggest the adoption of viticultural subzones. The subzones, based on the NTUs, could lead to different wine-production systems that enhance the peculiarities of the terroir.

  13. Use of Mapping and Spatial and Space-Time Modeling Approaches in Operational Control of Aedes aegypti and Dengue

    PubMed Central

    Eisen, Lars; Lozano-Fuentes, Saul

    2009-01-01

    The aims of this review paper are to 1) provide an overview of how mapping and spatial and space-time modeling approaches have been used to date to visualize and analyze mosquito vector and epidemiologic data for dengue; and 2) discuss the potential for these approaches to be included as routine activities in operational vector and dengue control programs. Geographical information system (GIS) software are becoming more user-friendly and now are complemented by free mapping software that provide access to satellite imagery and basic feature-making tools and have the capacity to generate static maps as well as dynamic time-series maps. Our challenge is now to move beyond the research arena by transferring mapping and GIS technologies and spatial statistical analysis techniques in user-friendly packages to operational vector and dengue control programs. This will enable control programs to, for example, generate risk maps for exposure to dengue virus, develop Priority Area Classifications for vector control, and explore socioeconomic associations with dengue risk. PMID:19399163

  14. Use of mapping and spatial and space-time modeling approaches in operational control of Aedes aegypti and dengue.

    PubMed

    Eisen, Lars; Lozano-Fuentes, Saul

    2009-01-01

    The aims of this review paper are to 1) provide an overview of how mapping and spatial and space-time modeling approaches have been used to date to visualize and analyze mosquito vector and epidemiologic data for dengue; and 2) discuss the potential for these approaches to be included as routine activities in operational vector and dengue control programs. Geographical information system (GIS) software is becoming more user-friendly and is now complemented by free mapping software that provides access to satellite imagery and basic feature-making tools and has the capacity to generate static maps as well as dynamic time-series maps. Our challenge is now to move beyond the research arena by transferring mapping and GIS technologies and spatial statistical analysis techniques in user-friendly packages to operational vector and dengue control programs. This will enable control programs to, for example, generate risk maps for exposure to dengue virus, develop Priority Area Classifications for vector control, and explore socioeconomic associations with dengue risk. PMID:19399163

  15. Higgs boson finder and mass estimator: The Higgs boson to WW to leptons decay channel at the LHC

    SciTech Connect

    Barger, Vernon; Huang, Peisi

    2011-11-01

    We exploit the spin and kinematic correlations in the decay of a scalar boson into a pair of real or virtual W-bosons, with both W-bosons decaying leptonically, for Higgs boson discovery at 7 TeV LHC energy with 10 fb{sup -1} luminosity. Without reconstruction of the events, we obtain estimators of the Higgs mass from the peak and width of the signal distribution in m{sub ll}. The separation of signal and background with other distributions, such as the azimuthal angle between the two W decay planes, the rapidity difference between the two leptons, missing E{sub T}, and the p{sub T} of the leptons, is also prescribed. Our approach identifies the salient Higgs to dilepton signatures that allow subtraction of the continuum W*W* background.

  16. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    NASA Astrophysics Data System (ADS)

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluate its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map.

  17. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    PubMed Central

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluate its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map. PMID:26742857

  18. Gauge invariants and bosonization

    NASA Astrophysics Data System (ADS)

    Kijowski, J.; Rudolph, G.; Rudolph, M.

    1998-12-01

    We present some results which are part of our program of analyzing gauge theories with fermions in terms of local gauge invariant fields. In the first part, the classical Dirac-Maxwell system is discussed. Next, we develop a procedure which leads to a reduction of the functional integral to an integral over (bosonic) gauge invariant fields. We apply this procedure to the case of QED and the Schwinger model. In the third part, we take some steps toward an analysis of the models considered. We construct effective (quantum) field theories which can be used to calculate vacuum expectation values of physical quantities.

  19. Crater Mapping in the Pluto-Charon System: Considerations, Approach, and Progress

    NASA Astrophysics Data System (ADS)

    Robbins, S. J.; Singer, K. N.; Bray, V. J.; Schenk, P.; Zangari, A. M.; McKinnon, W. B.; Young, L. A.; Runyon, K. D.; Beyer, R. A.; Porter, S.; Lauer, T.; Weaver, H. A., Jr.; Olkin, C.; Ennico Smith, K.; Stern, A.

    2015-12-01

    NASA's New Horizons mission successfully made its closest approach to Pluto on July 14, 2015, at 11:49 a.m. UTC. The flyby nature of the mission, the distance to the system, and the multiple planetary bodies to observe with a diverse instrument set required a complex imaging campaign marked by numerous trade-offs; these led to a more complicated crater population mapping effort than for a basic orbital mission. The Pluto and Charon imaging campaigns were full-disk or mosaics of the full disk until ≈3.5 hrs before closest approach, when the pixel scale was 0.9 km/px. After this, several LORRI-specific imaging campaigns were conducted of the partial disk and later the full crescent, while additional strips were ride-alongs with other instruments. These should supply partial coverage at up to 70-80 m/px for Pluto and 160 m/px for Charon. The LORRI coverage at ≈0.4 km/px does not cover the entire encounter hemisphere, but the MVIC instrument provided comparable full-disk coverage (0.5 km/px) and partial-disk coverage at 0.3 km/px. The best images of the non-encounter hemispheres of Pluto and Charon are ≈21 km/px (taken midnight July 10-11). As with any single flyby mission, we are constrained by the best pixel scales and incidence angles at which images were taken during the flyby. While most high-resolution imaging by quantity has been done over areas of variable solar incidence as the spacecraft passed by Pluto and Charon, these images cover a relatively small fraction of the bodies, and most coverage has been at near-noon sun, which makes crater identification difficult. Numerous team members are independently using a variety of crater mapping tools and image products, which will be reconciled and merged to make a more robust final database. We will present our consensus crater database to date of both plutonian and charonian impact craters as well as correlations with preliminary geologic units. We will also discuss how the crater population compares with predictions and modeled Kuiper Belt

  20. Mapping irrigation potential from renewable groundwater in Africa - a quantitative hydrological approach

    NASA Astrophysics Data System (ADS)

    Altchenko, Y.; Villholth, K. G.

    2015-02-01

    Groundwater provides an important buffer to climate variability in Africa. Yet, groundwater irrigation contributes only a relatively small share of cultivated land, approximately 1% (about 2 × 10^6 ha), as compared to 14% in Asia. While groundwater is over-exploited for irrigation in many parts of Asia, previous assessments indicate an underutilized potential in parts of Africa. As opposed to previous country-based estimates, this paper derives a continent-wide, distributed (0.5° spatial resolution) map of groundwater irrigation potential, indicated in terms of fractions of cropland potentially irrigable with renewable groundwater. The method builds on an annual groundwater balance approach using 41 years of hydrological data, allocating only that fraction of groundwater recharge that is in excess after satisfying other present human needs and environmental requirements, while disregarding socio-economic and physical constraints in access to the resource. Due to the high uncertainty of groundwater environmental needs, three scenarios, leaving 30, 50 and 70% of recharge for the environment, were implemented. Currently dominating crops and cropping rotations and associated irrigation requirements were applied in a zonal approach in order to convert recharge excess to potential irrigated cropland. Results show an inhomogeneously distributed groundwater irrigation potential across the continent, even within individual countries, mainly reflecting recharge patterns and the presence or absence of cultivated cropland. Results further show that average annual renewable groundwater availability for irrigation ranges from 692 to 1644 km^3 depending on the scenario. The total area of cropland irrigable with renewable groundwater ranges from 44.6 to 105.3 × 10^6 ha, corresponding to 20.5 to 48.6% of the cropland over the continent. In particular, significant potential exists in the semi-arid Sahel and eastern African regions, which could support poverty alleviation if developed
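
    The core water-balance allocation described above reduces, per grid cell, to a few lines of arithmetic. The sketch below illustrates that logic only; all numbers, and the assumption of a single lumped irrigation requirement per cell, are hypothetical rather than taken from the paper:

```python
# Per-cell sketch of the annual groundwater-balance allocation: reserve a
# scenario-dependent share of recharge for the environment, subtract other
# human uses, and convert the excess volume into irrigable cropland.
def irrigable_cropland_fraction(recharge_mm, other_use_mm, env_share,
                                cropland_ha, cell_area_ha, irrig_need_mm):
    """Fraction of a cell's cropland irrigable with renewable groundwater."""
    # Depth (mm over the whole cell) left after environmental and other uses.
    available_mm = max(recharge_mm * (1.0 - env_share) - other_use_mm, 0.0)
    # Volume in mm*ha divided by the per-hectare irrigation requirement.
    irrigable_ha = available_mm * cell_area_ha / irrig_need_mm
    return min(irrigable_ha / cropland_ha, 1.0) if cropland_ha else 0.0

# Three environmental-flow scenarios as in the paper (30/50/70% reserved);
# every other number here is invented for illustration.
for env in (0.3, 0.5, 0.7):
    f = irrigable_cropland_fraction(recharge_mm=120, other_use_mm=10,
                                    env_share=env, cropland_ha=2000,
                                    cell_area_ha=10000, irrig_need_mm=600)
    print(env, round(f, 2))
```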

  1. Non-linear dynamics of operant behavior: a new approach via the extended return map.

    PubMed

    Li, Jay-Shake; Huston, Joseph P

    2002-01-01

    Previous efforts to apply non-linear dynamic tools to the analysis of operant behavior revealed some promise for this kind of approach, but also some doubts, since the complexity of animal behavior seemed to be beyond the analyzing ability of the available tools. We here outline a series of studies based on a novel approach. We modified the so-called 'return map' and developed a new method, the 'extended return map' (ERM) to extract information from the highly irregular time series data, the inter-response time (IRT) generated by Skinner-box experiments. We applied the ERM to operant lever pressing data from rats using the four fundamental reinforcement schedules: fixed interval (FI), fixed ratio (FR), variable interval (VI) and variable ratio (VR). Our results revealed interesting patterns in all experiment groups. In particular, the FI and VI groups exhibited well-organized clusters of data points. We calculated the fractal dimension out of these patterns and compared experimental data with surrogate data sets, that were generated by randomly shuffling the sequential order of original IRTs. This comparison supported the finding that patterns in ERM reflect the dynamics of the operant behaviors under study. We then built two models to simulate the functional mechanisms of the FI schedule. Both models can produce similar distributions of IRTs and the stereotypical 'scalloped' curve characteristic of FI responding. However, they differ in one important feature in their formulation: while one model uses a continuous function to describe the probability of occurrence of an operant behavior, the other one employs an abrupt switch of behavioral state. Comparison of ERMs showed that only the latter was able to produce patterns similar to the experimental results, indicative of the operation of an abrupt switch from one behavioral state to another over the course of the inter-reinforcement period. 
This example demonstrated the ERM to be a useful tool for the analysis of
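
    The ERM is a modification of the classical first-return map, and the surrogate-data test is a shuffle of the IRT sequence. A minimal sketch of those two ingredients (toy IRT values; the ERM's actual extensions are not reproduced here):

```python
import random

def return_map(irts):
    """Points (IRT_n, IRT_{n+1}); clustering of these points reveals
    structure in the response dynamics."""
    return list(zip(irts, irts[1:]))

def surrogate(irts, seed=0):
    """Surrogate series: same IRT values, randomly shuffled order, used to
    check that map patterns reflect dynamics, not just the IRT distribution."""
    rng = random.Random(seed)
    shuffled = list(irts)
    rng.shuffle(shuffled)
    return shuffled

irts = [0.8, 0.9, 3.1, 0.7, 1.0, 2.9]    # seconds, hypothetical FI-like data
print(return_map(irts))
print(return_map(surrogate(irts)))
```

    Comparing the point clouds of the original and surrogate maps is the essence of the test described above: only sequential structure, not the marginal IRT distribution, survives the shuffle.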

  2. Using a constructivist approach with online concept maps: relationship between theory and nursing education.

    PubMed

    Conceição, Simone C O; Taylor, Linda D

    2007-01-01

    Concept maps have been used in nursing education as a method for students to organize and analyze data. This article describes an online course that used concept maps and self-reflective journals to assess students' thinking processes. The self-reflective journals of 21 students collected over two semesters were qualitatively examined. Three major themes emerged from students' use of concept maps: 1) factors influencing the map creation, 2) developmental learning process over time, and 3) validation of existing knowledge and construction of new knowledge. The use of concept maps with reflective journaling provided a learning experience that allowed students to integrate content consistent with a constructivist paradigm. This integration is a developmental process influenced by the personal preferences of students, concept map design, and content complexity. This developmental process provides early evidence that the application of concept mapping in the online environment, along with reflective journaling, allows students to make new connections, integrate previous knowledge, and validate existing knowledge. PMID:17944263

  3. Limits on light Higgs bosons

    SciTech Connect

    Dawson, S.

    1988-01-01

    Experimental limits on light Higgs bosons (M/sub H/ < 5 GeV) are examined. Particular attention is paid to the process K → πH. It is shown that there may be an allowed window for light Higgs bosons between about 100 and 210 MeV. 13 refs., 2 figs.

  4. An approach for mapping large-area impervious surfaces: Synergistic use of Landsat-7 ETM+ and high spatial resolution imagery

    USGS Publications Warehouse

    Yang, L.; Huang, C.; Homer, C.G.; Wylie, B.K.; Coan, M.J.

    2003-01-01

    A wide range of urban ecosystem studies, including urban hydrology, urban climate, land use planning, and resource management, require current and accurate geospatial data of urban impervious surfaces. We developed an approach to quantify urban impervious surfaces as a continuous variable by using multisensor and multisource datasets. Subpixel percent impervious surfaces at 30-m resolution were mapped using a regression tree model. The utility, practicality, and affordability of the proposed method for large-area imperviousness mapping were tested over three spatial scales (Sioux Falls, South Dakota, Richmond, Virginia, and the Chesapeake Bay areas of the United States). Average error of predicted versus actual percent impervious surface ranged from 8.8 to 11.4%, with correlation coefficients from 0.82 to 0.91. The approach is being implemented to map impervious surfaces for the entire United States as one of the major components of the circa 2000 national land cover database.
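
    The regression-tree idea behind mapping percent imperviousness as a continuous variable can be illustrated with a one-split tree (a "stump") in plain Python. This is a didactic sketch with invented reflectance values, not the model or data used in the study:

```python
# One-split regression tree: choose the threshold on a predictor band that
# minimizes the squared error of piecewise-mean predictions.
def best_stump(xs, ys):
    """Return (threshold, mean_left, mean_right) minimizing total SSE."""
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((y - ml) ** 2 for y in left)
               + sum((y - mr) ** 2 for y in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    return best[1:]

# Toy pixels: (band reflectance, % impervious) -- hypothetical values.
xs = [10, 12, 14, 40, 42, 44]
ys = [5.0, 8.0, 6.0, 70.0, 75.0, 80.0]
t, lo, hi = best_stump(xs, ys)

def predict(x):
    return lo if x <= t else hi
```

    A full regression tree applies this split search recursively and across many predictor bands, yielding continuous subpixel percent-impervious estimates.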

  5. Approximate gauge symmetry of composite vector bosons

    SciTech Connect

    Suzuki, Mahiko

    2010-06-01

    It can be shown in a solvable field theory model that the couplings of the composite vector mesons made of a fermion pair approach the gauge couplings in the limit of strong binding. Although this phenomenon may appear accidental and special to vector bosons made of a fermion pair, we extend it to the case of bosons as constituents and find that the same phenomenon occurs in a more intriguing way. The functional formalism not only facilitates computation but also provides us with better insight into the generating mechanism of approximate gauge symmetry, in particular, how strong binding and global current conservation conspire to generate such an approximate symmetry. Remarks are made on its possible relevance or irrelevance to electroweak and higher symmetries.

  6. The derivation of tropospheric column ozone using the TOR approach and mapping technique

    NASA Astrophysics Data System (ADS)

    Yang, Qing

    2007-12-01

    Tropospheric ozone columns (TCOs) derived from differences between the Dutch-Finnish Aura Ozone Monitoring Instrument (OMI) measurements of the total atmospheric ozone column and the Aura Microwave Limb Sounder (MLS) measurements of stratospheric ozone columns are discussed. Because the measurements by these two instruments are not spatially coincident, interpolation techniques, with emphasis on mapping the stratospheric columns in space and time using the relationships between lower stratospheric ozone and potential vorticity (PV) and geopotential heights (Z), are evaluated at mid-latitudes. It is shown that this PV mapping procedure produces somewhat better agreement in comparisons with ozonesonde measurements, particularly in winter, than does simple linear interpolation of the MLS stratospheric columns or the use of typical coincidence criteria at mid-latitudes. The OMI/MLS derived tropospheric columns are calculated to be 4 Dobson units (DU) smaller than the sonde-measured columns at mid-latitudes. This mean difference is consistent with the MLS (version 1.5) stratospheric ozone columns being high relative to Stratospheric Aerosol and Gas Experiment (SAGE II) columns by 3 DU. Standard deviations between the derived tropospheric columns and those measured by ozonesondes are 9 DU (30%) annually but just 6 DU (15%) in summer. Uncertainties in the interpolated MLS stratospheric columns are likely to be the primary cause of these standard deviations. An important advantage of the PV mapping approach is that it works well when MLS data are missing (e.g., when an orbit of measurements is missing). In the comparisons against ozonesonde measurements, it provides up to twice as many comparisons as the other techniques. The OMI/MLS derived tropospheric ozone columns have been compared with corresponding columns based on the Tropospheric Emission Spectrometer (TES) measurements, and Regional chEmical trAnsport Model (REAM) simulations. 
The variability of
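
    The residual step itself is a subtraction: tropospheric column = OMI total column minus a stratospheric column mapped to the OMI footprint. The sketch below uses the simple linear-interpolation variant discussed above (the PV/Z mapping is the more physical alternative); all DU values and coordinates are hypothetical:

```python
# Residual TCO: OMI total column minus an MLS stratospheric column
# linearly interpolated along the track to the OMI footprint.
def tropospheric_column(total_du, lon, mls_lons, mls_strat):
    samples = list(zip(mls_lons, mls_strat))
    for (x0, s0), (x1, s1) in zip(samples, samples[1:]):
        if x0 <= lon <= x1:
            w = (lon - x0) / (x1 - x0)
            return total_du - (s0 + w * (s1 - s0))
    raise ValueError("footprint outside the MLS sampling range")

# Hypothetical values: 300 DU total, strat column 260-270 DU nearby.
print(tropospheric_column(300.0, 5.0, [0.0, 10.0], [260.0, 270.0]))  # 35.0 DU
```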

  7. Agricultural Land Use mapping by multi-sensor approach for hydrological water quality monitoring

    NASA Astrophysics Data System (ADS)

    Brodsky, Lukas; Kodesova, Radka; Kodes, Vit

    2010-05-01

    The main objective of this study is to demonstrate the potential of operational use of high and medium resolution remote sensing data for hydrological water quality monitoring by mapping agricultural intensity and crop structures, in particular the use of remote sensing mapping to optimize pesticide monitoring. The agricultural mapping task is tackled by means of medium spatial and high temporal resolution ESA Envisat MERIS FR images together with a single high spatial resolution IRS AWiFS image covering the whole area of interest (the Czech Republic). High resolution data (e.g. SPOT, ALOS, Landsat) are often used for agricultural land use classification, but usually only at the regional or local level due to data availability and financial constraints. AWiFS data (nominal spatial resolution 56 m), due to the wide satellite swath, seem more suitable for use at the national level. Nevertheless, one of the critical issues for such a classification is to have sufficient image acquisitions over the whole vegetation period to describe crop development in an appropriate way. ESA MERIS middle-resolution data were used in several studies for crop classification. The high temporal and also spectral resolution of MERIS data has an indisputable advantage for crop classification. However, the spatial resolution of 300 m results in a mixed signal within a single pixel. AWiFS-MERIS data synergy brings new perspectives to agricultural land use mapping. Also, the developed methodology is fully compatible with future use of ESA (GMES) Sentinel satellite images. The applied methodology, a hybrid multi-sensor approach, consists of these main stages: a/ parcel segmentation and spectral pre-classification of the high resolution image (AWiFS); b/ ingestion of middle resolution (MERIS) vegetation spectro-temporal features; c/ vegetation signature unmixing; and d/ semantic object-oriented classification of vegetation classes into the final classification scheme. These crop groups were selected to be
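
    Stage c/, signature unmixing, can be illustrated for the simplest case of two endmembers: a MERIS-scale pixel is modeled as a linear mixture of two crop signatures, and the fraction is solved in closed form by least squares. The toy two-band spectra below are invented; the study's multi-class unmixing is more involved:

```python
# Two-endmember linear unmixing: pixel = a*e1 + (1-a)*e2, solved for a
# by least squares and clipped to the physical range [0, 1].
def unmix2(pixel, e1, e2):
    d = [a - b for a, b in zip(e1, e2)]
    num = sum((p - b) * x for p, b, x in zip(pixel, e2, d))
    den = sum(x * x for x in d)
    frac = num / den
    return max(0.0, min(1.0, frac))

e1 = [0.10, 0.40]   # hypothetical 2-band signature of crop 1
e2 = [0.30, 0.20]   # hypothetical 2-band signature of crop 2
print(unmix2([0.20, 0.30], e1, e2))   # a 50/50 mixture -> fraction ~0.5
```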

  8. The Effect of Concept Mapping-Guided Discovery Integrated Teaching Approach on Chemistry Students' Achievement and Retention

    ERIC Educational Resources Information Center

    Fatokun, K. V. F.; Eniayeju, P. A.

    2014-01-01

    This study investigates the effects of Concept Mapping-Guided Discovery Integrated Teaching Approach on the achievement and retention of chemistry students. The sample comprised 162 Senior Secondary two (SS 2) students drawn from two Science Schools in Nasarawa State, Central Nigeria with equivalent mean scores of 9.68 and 9.49 in their pre-test.…

  9. Stimulating Graphical Summarization in Late Elementary Education: The Relationship between Two Instructional Mind-Map Approaches and Student Characteristics

    ERIC Educational Resources Information Center

    Merchie, Emmelien; Van Keer, Hilde

    2016-01-01

    This study examined the effectiveness of two instructional mind-mapping approaches to stimulate fifth and sixth graders' graphical summarization skills. Thirty-five fifth- and sixth-grade teachers and 644 students from 17 different elementary schools participated. A randomized quasi-experimental repeated-measures design was set up with two…

  10. Factors Influencing Seasonal Influenza Vaccination Uptake in Emergency Medical Services Workers: A Concept Mapping Approach.

    PubMed

    Subramaniam, Dipti P; Baker, Elizabeth A; Zelicoff, Alan P; Elliott, Michael B

    2016-08-01

    Seasonal influenza has serious impacts on morbidity and mortality and takes a significant economic toll through lost workforce time and strains on the health system. Health workers, particularly emergency medical services (EMS) workers, have the potential to transmit influenza to those in their care, yet little is known of the factors that influence EMS workers' decisions regarding seasonal influenza vaccination (SIV) uptake, a key factor in reducing the potential for transmitting disease. This study utilizes a modified Theory of Planned Behavior (TPB) model as a guiding framework to explore the factors that influence SIV uptake in EMS workers. Concept mapping, which consists of six stages (preparation, generation, structuring, representation, interpretation, and utilization) combining quantitative and qualitative approaches, was used to identify participants' perspectives towards SIV. This study identified nine EMS-conceptualized factors that influence EMS workers' vaccination intent and behavior. The EMS-conceptualized factors align with the modified TPB model and suggest the need to consider community-wide approaches that were not initially conceptualized in the model. Additionally, the role of non-pharmaceutical measures extended beyond the original conceptualization. Overall, this study demonstrates the need to develop customized interventions, such as messages highlighting the importance of EMS workers receiving the SIV as the optimum solution. EMS workers who do not intend to receive the SIV should be provided with accurate information on the SIV to dispel misconceptions. Finally, EMS workers should also receive interventions which promote voluntary vaccination, encouraging them to be proactive in the health decisions they make for themselves. PMID:26721630

  11. Multidimensional chemistry coordinate mapping approach for combustion modelling with finite-rate chemistry

    NASA Astrophysics Data System (ADS)

    Jangi, Mehdi; Bai, Xue-Song

    2012-12-01

    A multidimensional chemistry coordinate mapping (CCM) approach is presented for efficient integration of chemical kinetics in numerical simulations of turbulent reactive flows. In CCM the flow transport is integrated in the computational cells in physical space, whereas the integration of the chemical reactions is carried out in a phase space made up of a few principal variables. Each cell in the phase space corresponds to several computational cells in the physical space, resulting in a speedup of the numerical integration. In reactive flows with small hydrocarbon fuels, two principal variables have been shown to be sufficient to construct the phase space: the temperature (T) and the specific element mass ratio of the H atom (J_H). A third principal variable, σ = ∇J_H · ∇J_H, which is related to the dissipation rate of J_H, is required to construct the phase space for combustion processes with an initially non-premixed mixture. For complex higher hydrocarbon fuels, e.g. n-heptane, care has to be taken in selecting the phase space in order to model the low-temperature chemistry and the ignition process. In this article, a multidimensional CCM algorithm is described for a systematic selection of the principal variables. The method is evaluated by simulating a laminar partially premixed pre-vaporised n-heptane jet ignition process. The CCM approach is then extended to simulate n-heptane spray combustion by coupling CCM with a Reynolds-averaged Navier-Stokes (RANS) code. It is shown that the computational time for the integration of chemical reactions can be reduced to only 3-7%, while the results from the CCM method are identical to those of direct integration of the chemistry in the computational cells.
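
    The cell-to-phase-space mapping described above can be sketched as a binning step: computational cells falling into the same (T, J_H) bin share a single chemistry integration. In the sketch below the `integrate` function is a stand-in for a stiff kinetics solver, and the bin widths and cell states are hypothetical:

```python
from collections import defaultdict

def ccm_step(cells, d_T=50.0, d_J=0.01, integrate=lambda T, J: (T + 1.0, J)):
    """One CCM-style update: bin cells by (T, J_H), integrate chemistry
    once per occupied bin at the bin-mean state, map the result back."""
    bins = defaultdict(list)
    for i, (T, J) in enumerate(cells):
        bins[(int(T // d_T), int(J // d_J))].append(i)
    out = list(cells)
    n_solves = 0
    for members in bins.values():
        T_mean = sum(cells[i][0] for i in members) / len(members)
        J_mean = sum(cells[i][1] for i in members) / len(members)
        result = integrate(T_mean, J_mean)   # one kinetics solve per bin
        n_solves += 1
        for i in members:
            out[i] = result
    return out, n_solves

cells = [(1000.0, 0.055), (1010.0, 0.056), (1200.0, 0.083)]
out, n_solves = ccm_step(cells)
print(n_solves)   # 2 solves for 3 cells: the first two share a bin
```

    The speedup reported in the paper comes from exactly this many-cells-to-one-bin ratio, which grows with the number of physical cells per occupied phase-space cell.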

  12. Mapping Agricultural Fields in Sub-Saharan Africa with a Computer Vision Approach

    NASA Astrophysics Data System (ADS)

    Debats, S. R.; Luo, D.; Estes, L. D.; Fuchs, T.; Caylor, K. K.

    2014-12-01

    Sub-Saharan Africa is an important focus for food security research, because it is experiencing unprecedented population growth, agricultural activities are largely dominated by smallholder production, and the region is already home to 25% of the world's undernourished. One of the greatest challenges to monitoring and improving food security in this region is obtaining an accurate accounting of the spatial distribution of agriculture. Households are the primary units of agricultural production in smallholder communities and typically rely on small fields of less than 2 hectares. Field sizes are directly related to household crop productivity, management choices, and adoption of new technologies. As population and agriculture expand, it becomes increasingly important to understand both the distribution of field sizes as well as how agricultural communities are spatially embedded in the landscape. In addition, household surveys, a common tool for tracking agricultural productivity in Sub-Saharan Africa, would greatly benefit from spatially explicit accounting of fields. Current gridded land cover data sets do not provide information on individual agricultural fields or the distribution of field sizes. Therefore, we employ cutting edge approaches from the field of computer vision to map fields across Sub-Saharan Africa, including semantic segmentation, discriminative classifiers, and automatic feature selection. Our approach aims to not only improve the binary classification accuracy of cropland, but also to isolate distinct fields, thereby capturing crucial information on size and geometry. Our research focuses on the development of descriptive features across scales to increase the accuracy and geographic range of our computer vision algorithm. Relevant data sets include high-resolution remote sensing imagery and Landsat (30-m) multi-spectral imagery. Training data for field boundaries is derived from hand-digitized data sets as well as crowdsourcing.

  13. 3D mapping of airway wall thickening in asthma with MSCT: a level set approach

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Brillet, Pierre-Yves; Hartley, Ruth; Grenier, Philippe A.; Brightling, Christopher

    2014-03-01

    Assessing airway wall thickness in multi slice computed tomography (MSCT) as an image marker for airway disease phenotyping, such as asthma and COPD, is a current trend and challenge for the scientific community working in lung imaging. This paper addresses the same problem from a different point of view: considering the expected wall-thickness-to-lumen-radius ratio for a normal subject as known and constant throughout the whole airway tree, the aim is to build up a 3D map of airway wall regions of larger thickness and to define an overall score able to highlight a pathological status. In this respect, the local dimension (caliber) of the previously segmented airway lumen is obtained at each point by exploiting the granulometry morphological operator. A level set function is defined based on this caliber information and on the expected wall thickness ratio, which allows obtaining a good estimate of the airway wall throughout all segmented lumen generations. Next, the vascular (or mediastinal dense tissue) contact regions are automatically detected and excluded from the analysis. For the remaining airway wall border points, the real wall thickness is estimated based on tissue density analysis in the airway radial direction; thick wall points are highlighted on a 3D representation of the airways and several quantification scores are defined. The proposed approach is fully automatic and was evaluated (proof of concept) on a patient selection coming from different databases including mild and severe asthmatics and normal cases. This preliminary evaluation confirms the discriminative power of the proposed approach regarding different phenotypes and is currently being extended to larger cohorts.
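
    The scoring idea, flagging wall points whose measured thickness exceeds what the assumed constant wall-thickness-to-lumen-radius ratio predicts, can be sketched in a few lines. The ratio k, the tolerance, and the sample values below are all hypothetical, and the paper's level set machinery is not reproduced:

```python
# Flag airway wall points thicker than expected from a constant
# wall-thickness-to-lumen-radius ratio k (with a tolerance factor).
def thickened_fraction(points, k=0.2, tol=1.5):
    """points: (lumen_radius_mm, wall_thickness_mm) samples along the tree.
    A point is flagged when its wall exceeds tol * k * radius."""
    flagged = sum(1 for r, w in points if w > tol * k * r)
    return flagged / len(points)

points = [(2.0, 0.4), (2.0, 0.9), (1.0, 0.2), (1.0, 0.5)]  # hypothetical
print(thickened_fraction(points))   # 0.5: two of four points flagged
```

    A global score of this kind is one way to summarize the per-point 3D thickening map into a single number per patient, as the overall scores in the paper do.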

  14. Multistate boson stars

    SciTech Connect

    Bernal, A.; Barranco, J.; Alic, D.; Palenzuela, C.

    2010-02-15

    Motivated by the increasing interest in models which consider scalar fields as viable dark matter candidates, we have constructed a generalization of relativistic boson stars (BS) composed of two coexisting states of the scalar field, the ground state and the first excited state. We have studied the dynamical evolution of these multistate boson stars (MSBS) under radial perturbations, using numerical techniques. We show that stable MSBS can be constructed when the number of particles in the first excited state, N{sup (2)}, is smaller than the number of particles in the ground state, N{sup (1)}. On the other hand, when N{sup (2)}>N{sup (1)}, the configurations are initially unstable. However, they evolve and settle down into stable configurations. In the stabilization process, the state that was initially the ground state is excited and ends as a first excited state, whereas the state that was initially the first excited state ends as a ground state. During this process, both states emit scalar field radiation, decreasing their number of particles. This behavior shows that even though BS in the first excited state are intrinsically unstable under finite perturbations, the configuration resulting from the combination of this state with the ground state produces stable objects. Finally, we show in a qualitative way that stable MSBS could be realistic models of dark matter galactic halos, as they produce rotation curves that are flatter at large radii than the rotation curves produced by BS with only one state.

  15. Fine-mapping the genetic basis of CRP regulation in African Americans: a Bayesian approach

    PubMed Central

    Rhodes, Benjamin; Morris, David L.; Subrahmanyan, Lakshman; Aubin, Cristin; Mendes de Leon, Carlos F.; Kelly, Jeremiah F.; Evans, Dennis A.; Whittaker, John C.; Oksenberg, Jorge R.; De Jager, Philip L.; Vyse, Tim

    2009-01-01

    Basal levels of C-reactive protein (CRP) have been associated with disease, particularly future cardiovascular events. Twin studies estimate 50% CRP heritability, so the identification of genetic variants influencing CRP expression is important. Existing studies in populations of European ancestry have identified numerous cis-acting variants but leave significant ambiguity over the identity of the key functional polymorphisms. We addressed this issue by typing a dense map of CRP single nucleotide polymorphisms (SNPs), and quantifying serum CRP in 594 unrelated African Americans. We used Bayesian model choice analysis to select the combination of SNPs best explaining basal CRP and found strong support for triallelic rs3091244 alone, with the T allele acting in an additive manner (Bayes factor >100 vs. null model), with additional support for a model incorporating both rs3091244 and rs12728740. Admixture analysis suggested SNP rs12728740 segregated with haplotypes predicted to be of recent European origin. Using a cladistic approach we confirmed the importance of rs3091244(T) by demonstrating a significant partition of haplotype effect based on the rs3091244(C/T) mutation (F=8.91, P=0.006). We argue that weaker linkage disequilibrium across the African American CRP locus compared with Europeans has allowed us to establish an unambiguous functional role for rs3091244(T), while also recognising the potential for additional functional mutations present in the European genome. PMID:18500540
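
    Bayesian model choice of this kind compares SNP models by their Bayes factor. A common rough device is the BIC approximation, exp((BIC_0 - BIC_1)/2); the snippet below illustrates only that approximation with invented residual sums of squares, not the authors' full Bayesian analysis:

```python
import math

def bic(rss, n, k):
    # Gaussian-likelihood BIC up to a model-independent constant:
    # n*ln(RSS/n) + k*ln(n), with k the number of fitted predictors.
    return n * math.log(rss / n) + k * math.log(n)

def approx_bayes_factor(rss0, k0, rss1, k1, n):
    """BIC approximation to the Bayes factor of model 1 over model 0."""
    return math.exp((bic(rss0, n, k0) - bic(rss1, n, k1)) / 2.0)

# Invented example: a 2-SNP model halving the residual variance of a
# 1-SNP model over n = 30 observations is strongly favoured.
print(approx_bayes_factor(100.0, 1, 50.0, 2, 30))
```

    The extra ln(n) penalty per parameter is what lets a Bayes factor, unlike a raw fit comparison, prefer the simpler SNP model when an added variant buys no real improvement.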

  16. MapRepeat: an approach for effective assembly of repetitive regions in prokaryotic genomes

    PubMed Central

    Mariano, Diego CB; Pereira, Felipe L; Ghosh, Preetam; Barh, Debmalya; Figueiredo, Henrique CP; Silva, Artur; Ramos, Rommel TJ; Azevedo, Vasco AC

    2015-01-01

    The newest technologies for DNA sequencing have led to the determination of the primary structure of the genomes of organisms, mainly prokaryotes, with high efficiency and at lower costs. However, the presence of regions with repetitive sequences, combined with the short reads produced by Next-Generation Sequencing (NGS) platforms, makes it difficult to reconstruct the original genome in silico. Thus, even today, genome assembly continues to be one of the major challenges in bioinformatics, specifically when repetitive sequences are considered. In this paper, we present an approach to assemble repetitive regions in prokaryotic genomes. Our methodology enables (i) the identification of these regions through visual tools, (ii) the characterization of sequences on the extremities of gaps and (iii) the extraction of consensus sequences based on mapping of raw data to a reference genome. We also present a case study on the assembly of regions that encode ribosomal RNAs (rRNA) in the genome of Corynebacterium ulcerans FRC11, in order to show the efficiency of the strategies presented here. The proposed methods and tools will help in finishing genome assemblies, besides reducing the running time and associated costs. Availability: All scripts are available at http://github.com/dcbmariano/maprepeat PMID:26229287

  17. Multimodality approach to optical early detection and mapping of oral neoplasia

    NASA Astrophysics Data System (ADS)

    Ahn, Yeh-Chan; Chung, Jungrae; Wilder-Smith, Petra; Chen, Zhongping

    2011-07-01

    Early detection of cancer remains the best way to ensure patient survival and quality of life. Squamous cell carcinoma is usually preceded by dysplasia presenting as white, red, or mixed red and white epithelial lesions on the oral mucosa (leukoplakia, erythroplakia). Dysplastic lesions in the form of erythroplakia can carry a risk for malignant conversion of 90%. A noninvasive diagnostic modality would enable monitoring of these lesions at regular intervals and detection of treatment needs at a very early, relatively harmless stage. The specific aim of this work was to test a multimodality approach [three-dimensional optical coherence tomography (OCT) and polarimetry] to noninvasive diagnosis of oral premalignancy and malignancy using the hamster cheek pouch model (nine hamsters). The results were compared to tissue histopathology. During carcinogenesis, epithelial downgrowth, eventual loss of basement membrane integrity, and subepithelial invasion were clearly visible with OCT. Polarimetry techniques identified four to five times greater retardance in sites with squamous cell carcinoma, and two to three times greater retardance in dysplastic sites than in normal tissues. These techniques were particularly useful for mapping areas of field cancerization with multiple lesions, as well as lesion margins.

  18. Multimodality approach to optical early detection and mapping of oral neoplasia

    PubMed Central

    Ahn, Yeh-Chan; Chung, Jungrae; Wilder-Smith, Petra; Chen, Zhongping

    2011-01-01

    Early detection of cancer remains the best way to ensure patient survival and quality of life. Squamous cell carcinoma is usually preceded by dysplasia presenting as white, red, or mixed red and white epithelial lesions on the oral mucosa (leukoplakia, erythroplakia). Dysplastic lesions in the form of erythroplakia can carry a risk for malignant conversion of 90%. A noninvasive diagnostic modality would enable monitoring of these lesions at regular intervals and detection of treatment needs at a very early, relatively harmless stage. The specific aim of this work was to test a multimodality approach [three-dimensional optical coherence tomography (OCT) and polarimetry] to noninvasive diagnosis of oral premalignancy and malignancy using the hamster cheek pouch model (nine hamsters). The results were compared to tissue histopathology. During carcinogenesis, epithelial downgrowth, eventual loss of basement membrane integrity, and subepithelial invasion were clearly visible with OCT. Polarimetry techniques identified four to five times greater retardance in sites with squamous cell carcinoma, and two to three times greater retardance in dysplastic sites than in normal tissues. These techniques were particularly useful for mapping areas of field cancerization with multiple lesions, as well as lesion margins. PMID:21806268

  19. Fine-mapping the genetic basis of CRP regulation in African Americans: a Bayesian approach.

    PubMed

    Rhodes, Benjamin; Morris, David L; Subrahmanyan, Lakshman; Aubin, Cristin; de Leon, Carlos F Mendes; Kelly, Jeremiah F; Evans, Dennis A; Whittaker, John C; Oksenberg, Jorge R; De Jager, Philip L; Vyse, Tim J

    2008-07-01

    Basal levels of C-reactive protein (CRP) have been associated with disease, particularly future cardiovascular events. Twin studies estimate 50% CRP heritability, so the identification of genetic variants influencing CRP expression is important. Existing studies in populations of European ancestry have identified numerous cis-acting variants but leave significant ambiguity over the identity of the key functional polymorphisms. We addressed this issue by typing a dense map of CRP single-nucleotide polymorphisms (SNPs), and quantifying serum CRP in 594 unrelated African Americans. We used Bayesian model choice analysis to select the combination of SNPs best explaining basal CRP and found strong support for triallelic rs3091244 alone, with the T allele acting in an additive manner (Bayes factor > 100 vs. null model), with additional support for a model incorporating both rs3091244 and rs12728740. Admixture analysis suggested SNP rs12728740 segregated with haplotypes predicted to be of recent European origin. Using a cladistic approach we confirmed the importance of rs3091244(T) by demonstrating a significant partition of haplotype effect based on the rs3091244(C/T) mutation (F = 8.91, P = 0.006). We argue that weaker linkage disequilibrium across the African American CRP locus compared with Europeans has allowed us to establish an unambiguous functional role for rs3091244(T), while also recognising the potential for additional functional mutations present in the European genome. PMID:18500540

  20. A Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) Determined from Phased Microphone Arrays

    NASA Technical Reports Server (NTRS)

    Brooks, Thomas F.; Humphreys, William M.

    2006-01-01

    Current processing of acoustic array data is burdened with considerable uncertainty. This study reports an original methodology that serves to demystify array results, reduce misinterpretation, and accurately quantify position and strength of acoustic sources. Traditional array results represent noise sources that are convolved with array beamform response functions, which depend on array geometry, size (with respect to source position and distributions), and frequency. The Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) method removes beamforming characteristics from output presentations. A unique linear system of equations accounts for reciprocal influence at different locations over the array survey region. It makes no assumption beyond the traditional processing assumption of statistically independent noise sources. The full rank equations are solved with a new robust iterative method. DAMAS is quantitatively validated using archival data from a variety of prior high-lift airframe component noise studies, including flap edge/cove, trailing edge, leading edge, slat, and calibration sources. Presentations are explicit and straightforward, as the noise radiated from a region of interest is determined by simply summing the mean-squared values over that region. DAMAS can fully replace existing array processing and presentation methodologies in most applications. It appears to dramatically increase the value of arrays to the field of experimental acoustics.
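
The "robust iterative method" can be sketched, in spirit, as a Gauss-Seidel sweep with a non-negativity constraint on the deconvolved source strengths. The matrix and source layout below are illustrative (a synthetic diagonally dominant system), not the paper's beamform model:

```python
import numpy as np

def damas_like_solve(A, Y, n_iter=200):
    """Solve Y = A @ X for non-negative source strengths X by a
    Gauss-Seidel sweep with a non-negativity clamp (DAMAS-style)."""
    n = len(Y)
    X = np.zeros(n)
    for _ in range(n_iter):
        for i in range(n):
            # Remove every contribution except the diagonal one, then solve.
            off_diag = A[i] @ X - A[i, i] * X[i]
            X[i] = max((Y[i] - off_diag) / A[i, i], 0.0)
    return X

# Synthetic check: two point sources blurred by a known, diagonally
# dominant response matrix (invented numbers, not real array data).
rng = np.random.default_rng(1)
n = 20
A = np.eye(n) + 0.02 * rng.random((n, n))
X_true = np.zeros(n)
X_true[[4, 12]] = [2.0, 1.0]
Y = A @ X_true
X_rec = damas_like_solve(A, Y)
print(np.allclose(X_rec, X_true, atol=1e-6))  # True
```

Because the synthetic system is strictly diagonally dominant, the clamped sweep converges to the exact non-negative source vector.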

  1. A Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) Determined from Phased Microphone Arrays

    NASA Technical Reports Server (NTRS)

    Brooks, Thomas F.; Humphreys, William M., Jr.

    2004-01-01

    Current processing of acoustic array data is burdened with considerable uncertainty. This study reports an original methodology that serves to demystify array results, reduce misinterpretation, and accurately quantify position and strength of acoustic sources. Traditional array results represent noise sources that are convolved with array beamform response functions, which depend on array geometry, size (with respect to source position and distributions), and frequency. The Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) method removes beamforming characteristics from output presentations. A unique linear system of equations accounts for reciprocal influence at different locations over the array survey region. It makes no assumption beyond the traditional processing assumption of statistically independent noise sources. The full rank equations are solved with a new robust iterative method. DAMAS is quantitatively validated using archival data from a variety of prior high-lift airframe component noise studies, including flap edge/cove, trailing edge, leading edge, slat, and calibration sources. Presentations are explicit and straightforward, as the noise radiated from a region of interest is determined by simply summing the mean-squared values over that region. DAMAS can fully replace existing array processing and presentations methodology in most applications. It appears to dramatically increase the value of arrays to the field of experimental acoustics.

  2. Improving semi-automated glacier mapping with a multi-method approach: applications in central Asia

    NASA Astrophysics Data System (ADS)

    Smith, T.; Bookhagen, B.; Cannon, F.

    2015-09-01

    Studies of glaciers generally require precise glacier outlines. Where these are not available, extensive manual digitization in a geographic information system (GIS) must be performed, as current algorithms struggle to delineate glacier areas with debris cover or other irregular spectral profiles. Although several approaches have improved upon spectral band ratio delineation of glacier areas, none have entered wide use due to complexity or computational intensity. In this study, we present and apply a glacier mapping algorithm in Central Asia which delineates both clean glacier ice and debris-covered glacier tongues. The algorithm is built around the unique velocity and topographic characteristics of glaciers and further leverages spectral and spatial relationship data. We found that the algorithm misclassifies between 2 and 10 % of glacier areas, as compared to a ~750-glacier control data set, and can reliably classify a given Landsat scene in 3-5 min. The algorithm does not completely solve the difficulties inherent in classifying glacier areas from remotely sensed imagery but does represent a significant improvement over purely spectral-based classification schemes, such as the band ratio of Landsat 7 bands three and five or the normalized difference snow index. The main caveats of the algorithm are (1) classification errors at an individual glacier level, (2) reliance on manual intervention to separate connected glacier areas, and (3) dependence on fidelity of the input Landsat data.
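
The purely spectral baselines the abstract compares against can be written in a few lines. A sketch with toy reflectance values; the thresholds are commonly cited rules of thumb, not the study's calibrated values:

```python
import numpy as np

# Toy per-pixel reflectances (invented values, not real Landsat data).
red   = np.array([0.60, 0.50, 0.20])  # Landsat 7 band 3 (red)
green = np.array([0.55, 0.50, 0.25])  # band 2 (green)
swir  = np.array([0.10, 0.08, 0.30])  # band 5 (shortwave infrared)

# Band ratio TM3/TM5: snow and ice reflect in the visible but absorb SWIR.
band_ratio = red / swir
ice_mask_ratio = band_ratio > 2.0  # threshold is a common rule of thumb

# Normalized difference snow index (NDSI).
ndsi = (green - swir) / (green + swir)
ice_mask_ndsi = ndsi > 0.4

print(ice_mask_ratio)  # [ True  True False]
```

Both classifiers flag the first two (bright, SWIR-dark) pixels as ice and reject the third, which mimics a debris-covered surface; debris shares the spectral signature of surrounding terrain, which is exactly where these schemes fail and the paper's velocity/topography cues help.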

  3. Modeling and mapping potential distribution of Crimean juniper (Juniperus excelsa Bieb.) using correlative approaches.

    PubMed

    Özkan, Kürşad; Şentürk, Özdemir; Mert, Ahmet; Negiz, Mehmet Güvenç

    2015-01-01

    Modeling and mapping potential distribution of living organisms has become an important component of conservation planning and ecosystem management in recent years. Various correlative and mechanistic methods can be applied to build predictive distributions of living organisms in terrestrial and marine ecosystems. Correlative methods used to predict species' potential distribution have been described as either group discrimination techniques or profile techniques. We attempted to determine whether group discrimination techniques could perform as well as profile techniques for predicting species potential distributions, using elevation (ELVN), parent material (ROCK), slope (SLOP), radiation index (RI) and topographic position index (TPI) as explanatory variables. We compared potential distribution predictions made for Crimean juniper (Juniperus excelsa Bieb.) in the Yukan Gokdere forest district of the Mediterranean region, Turkey, applying four group discrimination techniques (discriminant analysis (DA), logistic regression analysis (LR), generalized additive model (GAM) and classification tree technique (CT)) and two profile techniques (a maximum entropy approach to species distribution modeling (MAXENT) and the genetic algorithm for rule-set prediction (GARP)). Visual assessments of the potential distribution probability of the applied models for Crimean juniper were performed by using geographical information systems (GIS). Receiver-operating characteristic (ROC) curves were used to objectively assess model performance. The results suggested that group discrimination techniques are better than profile techniques and, among the group discrimination techniques, GAM indicated the best performance. PMID:26591876
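
Model performance here is judged by the area under the ROC curve, which can be computed directly from score ranks via the Mann-Whitney identity. A minimal sketch with made-up presence/absence scores (it assumes no tied scores; the model names are only labels, not the study's fitted models):

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity.
    Assumes no tied scores."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Toy presence (1) / absence (0) data with scores from two hypothetical models.
labels  = np.array([1, 1, 1, 0, 0, 0])
model_a = np.array([0.9, 0.8, 0.4, 0.35, 0.2, 0.1])  # separates classes well
model_b = np.array([0.6, 0.4, 0.5, 0.7, 0.3, 0.2])   # a weaker score

print(auc(model_a, labels) > auc(model_b, labels))  # True
```

The better-discriminating score ranks every presence above every absence (AUC = 1.0), while the weaker one mixes the classes (AUC ≈ 0.67); comparing AUCs like this is the objective criterion the study uses to rank GAM above the profile techniques.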

  4. Chern Simons bosonization along RG flows

    NASA Astrophysics Data System (ADS)

    Minwalla, Shiraz; Yokoyama, Shuichi

    2016-02-01

    It has previously been conjectured that the theory of free fundamental scalars minimally coupled to a Chern Simons gauge field is dual to the theory of critical fundamental fermions minimally coupled to a level rank dual Chern Simons gauge field. In this paper we study RG flows away from these two fixed points by turning on relevant operators. In the 't Hooft large N limit we compute the thermal partition function along each of these flows and find a map of parameters under which the two partition functions agree exactly with each other all the way from the UV to the IR. We conjecture that the bosonic and fermionic RG flows are dual to each other under this map of parameters. Our flows can be tuned to end at the gauged critical scalar theory and gauged free fermionic theories respectively. Assuming the validity of our conjecture, this tuned trajectory may be viewed as RG flow from the gauged theory of free bosons to the gauged theory of free fermions.

  5. Geometric phases and quantum correlations dynamics in spin-boson model

    SciTech Connect

    Wu, Wei; Xu, Jing-Bo

    2014-01-28

    We explore the dynamics of the spin-boson model for the Ohmic bath by employing the master equation approach and obtain an explicit expression for the reduced density matrix. We also calculate the geometric phases of the spin-boson model by making use of the analytical results and discuss how the dissipative bosonic environment affects geometric phases. Furthermore, we investigate the dynamics of quantum discord and entanglement of two qubits each locally interacting with its own independent bosonic environment. It is found that the decay properties of quantum discord and entanglement are sensitive to the choice of initial-state parameters and the coupling strength between system and bath.

  6. Superfluid transition temperature of the boson-fermion model on a lattice

    SciTech Connect

    Micnas, R.

    2007-11-01

    The properties of a mixture of mutually interacting bound electron pairs and itinerant fermions (the boson-fermion model) on a lattice are further studied. We determine the superconducting critical temperature from a pseudogap phase by applying a self-consistent T-matrix approach, which includes the pairing fluctuations and the boson self-energy effects. The analysis is made for a three dimensional cubic lattice with tight-binding dispersion for electrons and for both standard bosons and the case of hard-core bosons. The results describe the BCS-Bose-Einstein condensation crossover with varying position of the bosonic (local pair) level and give a further insight into the nature of resonance superfluidity in the boson-fermion model.

  7. Gauge bosons at zero and finite temperature

    NASA Astrophysics Data System (ADS)

    Maas, Axel

    2013-03-01

    Gauge theories of the Yang-Mills type are the single most important building block of the standard model of particle physics and beyond. They are an integral part of the strong and weak interactions, and, in their Abelian version, of electromagnetism. Since Yang-Mills theories are gauge theories, their elementary particles, the gauge bosons, cannot be described without fixing a gauge. Therefore, to obtain their properties a quantized and gauge-fixed setting is necessary. Beyond perturbation theory, gauge-fixing in non-Abelian gauge theories is obstructed by the Gribov-Singer ambiguity, which requires the introduction of non-local constraints. The construction and implementation of a method-independent gauge-fixing prescription to resolve this ambiguity is the single most important first step to describe gauge bosons beyond perturbation theory. Proposals for such a procedure, generalizing the perturbative Landau gauge, are described here. Their implementations are discussed for two example methods, lattice gauge theory and the quantum equations of motion. After gauge-fixing, it is possible to study gauge bosons in detail. The most direct access is provided by their correlation functions. The corresponding two- and three-point correlation functions are presented at all energy scales. These give access to the properties of the gauge bosons, like their absence from the asymptotic physical state space, particle-like properties at high energies, and the running coupling. Furthermore, auxiliary degrees of freedom are introduced during gauge-fixing, and their properties are discussed as well. These results are presented for two, three, and four dimensions, and for various gauge algebras. Finally, the modifications of the properties of gauge bosons at finite temperature are presented. Evidence is provided that these reflect the phase structure of Yang-Mills theory. However, it is found that the phase transition is not deconfining the gauge bosons, although the bulk

  8. Mapping the World - a New Approach for Volunteered Geographic Information in the Cloud

    NASA Astrophysics Data System (ADS)

    Moeller, M. S.; Furhmann, S.

    2015-05-01

    The OSM project provides a geodata basis for the entire world under the CC-SA licence agreement. However, some parts of the world are mapped more densely than others, and many less developed countries lack valid geo-information. Africa, for example, is a sparsely mapped continent. During a huge Ebola outbreak in 2014, the lack of data became apparent. Aid organizations like the American Red Cross and the Humanitarian OpenStreetMap Team organized mapping campaigns to fill the gaps with valid OSM geodata. This paper gives a short introduction to this mapping activity.

  9. A Hybrid Wetland Map for China: A Synergistic Approach Using Census and Spatially Explicit Datasets

    PubMed Central

    Ma, Kun; You, Liangzhi; Liu, Junguo; Zhang, Mingxiang

    2012-01-01

    Wetlands play important ecological, economic, and cultural roles in societies around the world. However, wetland degradation has become a serious ecological issue, raising global sustainability concerns. An accurate wetland map is essential for wetland management. Here we used a fuzzy method to create a hybrid wetland map for China through the combination of five existing wetlands datasets, including four spatially explicit wetland distribution datasets and one wetland census. Our results show the total wetland area is 384,864 km2, 4.08% of China’s national surface area. The hybrid wetland map also shows spatial distribution of wetlands with a spatial resolution of 1 km. The reliability of the map is demonstrated by comparing it with spatially explicit datasets on lakes and reservoirs. The hybrid wetland map is the first wetland map that is consistent with the statistical data at the national and provincial levels in China. It provides a benchmark map for research on wetland protection and management. The method presented here is applicable not only for wetland mapping but also for other thematic mapping in China and beyond. PMID:23110105
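
The fuzzy combination of overlapping wetland layers reduces, at its core, to a per-cell agreement score across datasets. The grid, layer values, and majority-rule cutoff below are invented for illustration; the paper calibrates its map against census totals rather than using a fixed threshold:

```python
import numpy as np

# Three toy binary wetland layers on the same grid (1 = wetland); the
# real inputs are national-scale maps, not these invented 2x3 grids.
layers = np.array([
    [[1, 1, 0],
     [0, 1, 0]],
    [[1, 0, 0],
     [0, 1, 1]],
    [[1, 1, 0],
     [0, 0, 1]],
])

# Fuzzy membership: fraction of datasets that agree a cell is wetland.
membership = layers.mean(axis=0)

# Simple majority rule as the defuzzification step (illustrative cutoff).
hybrid = membership >= 0.5
print(int(hybrid.sum()))  # 4 cells classified as wetland
```

Cells where all three layers agree get membership 1.0, partial agreement gives fractional membership, and the cutoff turns the fuzzy surface back into a crisp map that can be reconciled with statistical totals.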

  10. A Two-Layers Based Approach of an Enhanced-Map for Urban Positioning Support

    PubMed Central

    Piñana-Díaz, Carolina; Toledo-Moreo, Rafael; Toledo-Moreo, F. Javier; Skarmeta, Antonio

    2012-01-01

    This paper presents a two-layer based enhanced map that can support navigation in urban environments. One layer is dedicated to describing the drivable road with a special focus on the accurate description of its bounds. This feature can support positioning and advanced map-matching when compared with standard polyline-based maps. The other layer depicts building heights and locations, thus enabling the detection of non-line-of-sight signals coming from GPS satellites not in direct view. Both the concept and the methodology for creating these enhanced maps are shown in the paper. PMID:23202172
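
The building-height layer enables a simple geometric non-line-of-sight test: a satellite signal is flagged NLOS if the ray from the receiver at the satellite's elevation angle passes below a building's roofline. A 2-D sketch with made-up geometry, not the authors' implementation:

```python
import math

def is_nlos(receiver_xy, sat_elev_deg, building_x, building_height):
    """Does a building of the given height, at horizontal distance
    building_x toward the satellite, block the receiver's view of a
    satellite at elevation angle sat_elev_deg? (2-D simplification)"""
    dx = building_x - receiver_xy[0]
    if dx <= 0:
        return False  # building is behind the receiver w.r.t. the satellite
    ray_height = receiver_xy[1] + dx * math.tan(math.radians(sat_elev_deg))
    return ray_height < building_height

# Receiver antenna 1.5 m high; 25 m building 20 m away toward the satellite.
print(is_nlos((0.0, 1.5), 30.0, 20.0, 25.0))  # True: low satellite blocked
print(is_nlos((0.0, 1.5), 80.0, 20.0, 25.0))  # False: high satellite clears
```

Signals failing the test can be down-weighted or excluded from the position fix, which is the benefit the building layer provides in urban canyons.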

  11. A two-layers based approach of an enhanced-map for urban positioning support.

    PubMed

    Piñana-Díaz, Carolina; Toledo-Moreo, Rafael; Toledo-Moreo, F Javier; Skarmeta, Antonio

    2012-01-01

    This paper presents a two-layer based enhanced map that can support navigation in urban environments. One layer is dedicated to describing the drivable road with a special focus on the accurate description of its bounds. This feature can support positioning and advanced map-matching when compared with standard polyline-based maps. The other layer depicts building heights and locations, thus enabling the detection of non-line-of-sight signals coming from GPS satellites not in direct view. Both the concept and the methodology for creating these enhanced maps are shown in the paper. PMID:23202172

  12. Integrated Georeferencing of Stereo Image Sequences Captured with a Stereovision Mobile Mapping System - Approaches and Practical Results

    NASA Astrophysics Data System (ADS)

    Eugster, H.; Huber, F.; Nebiker, S.; Gisi, A.

    2012-07-01

    Stereovision based mobile mapping systems enable the efficient capturing of directly georeferenced stereo pairs. With today's camera and onboard storage technologies imagery can be captured at high data rates resulting in dense stereo sequences. These georeferenced stereo sequences provide a highly detailed and accurate digital representation of the roadside environment which builds the foundation for a wide range of 3d mapping applications and image-based geo web-services. Georeferenced stereo images are ideally suited for the 3d mapping of street furniture and visible infrastructure objects, pavement inspection, asset management tasks or image based change detection. As in most mobile mapping systems, the georeferencing of the mapping sensors and observations - in our case of the imaging sensors - normally relies on direct georeferencing based on INS/GNSS navigation sensors. However, in urban canyons the achievable direct georeferencing accuracy of the dynamically captured stereo image sequences is often insufficient or at least degraded. Furthermore, many of the mentioned application scenarios require homogeneous georeferencing accuracy within a local reference frame over the entire mapping perimeter. To meet these demands, georeferencing approaches are presented and cost-efficient workflows are discussed which allow validating and updating the INS/GNSS based trajectory with independently estimated positions in cases of prolonged GNSS signal outages in order to increase the georeferencing accuracy up to the project requirements.

  13. Educational Approach to Seismic Risk Mitigation in Indian Himalayas -Hazard Map Making Workshops at High Schools-

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Oki, S.; Kimura, M.; Chadha, R. K.; Davuluri, S.

    2014-12-01

    How can we encourage people to take preventive measures against damage risks and empower them to take the right actions in emergencies to save their lives? The conventional approach taken by scientists had been disseminating intelligible information on up-to-date seismological knowledge. However, it has been proven that knowledge alone does not have enough impact to modify people's behaviors in emergencies (Oki and Nakayachi, 2012). On the other hand, the conventional approach taken by practitioners had been to conduct emergency drills at schools or workplaces. The loss of many lives from the 2011 Tohoku earthquake has proven that these emergency drills were not enough to save people's lives, unless they were empowered to assess the given situation on their own and react flexibly. Our challenge is to bridge the gap between knowledge and practice. With reference to best practices observed in Tohoku, such as The Miracles of Kamaishi, our endeavor is to design an effective Disaster Preparedness Education Program that is applicable to other disaster-prone regions in the world, even with different geological, socio-economical and cultural backgrounds. The key concepts for this new approach are 1) empowering individuals to take preventive actions to save their lives, 2) granting community-based understanding of disaster risks and 3) building a sense of reality and relevancy to disasters. With these in mind, we held workshops at some high schools in the Lesser Himalayan Region, combining lectures with an activity called "Hazard Map Making" where students proactively identify and assess the hazards around their living areas and learn practical strategies on how to manage risks. We observed the change of awareness of the students by conducting a preliminary questionnaire survey and interviews after each session. Results strongly implied that the significant change in students' attitudes towards disaster preparedness occurred not through lectures on scientific knowledge, but

  14. Structure and Evolution of Mediterranean Forest Research: A Science Mapping Approach

    PubMed Central

    Nardi, Pierfrancesco; Di Matteo, Giovanni; Palahi, Marc; Scarascia Mugnozza, Giuseppe

    2016-01-01

    This study aims at conducting the first science mapping analysis of the Mediterranean forest research in order to elucidate its research structure and evolution. We applied a science mapping approach based on co-term and citation analyses to a set of scientific publications retrieved from the Elsevier’s Scopus database over the period 1980–2014. The Scopus search retrieved 2,698 research papers and reviews published by 159 peer-reviewed journals. The total number of publications was around 1% (N = 17) during the period 1980–1989 and they reached 3% (N = 69) in the time slice 1990–1994. Since 1995, the number of publications increased exponentially, thus reaching 55% (N = 1,476) during the period 2010–2014. Within the thirty-four years considered, the retrieved publications were published by 88 countries. Among them, Spain was the most productive country, publishing 44% (N = 1,178) of total publications followed by Italy (18%, N = 482) and France (12%, N = 336). These countries also host the ten most productive scientific institutions in terms of number of publications in Mediterranean forest subjects. Forest Ecology and Management and Annals of Forest Science were the most active journals in publishing research in Mediterranean forest. During the period 1980–1994, the research topics were poorly characterized, but they became better defined during the time slice 1995–1999. Since the 2000s, the clusters have become well defined by research topics. Current status of Mediterranean forest research (2009–2014) was represented by four clusters, in which different research topics such as biodiversity and conservation, land-use and degradation, climate change effects on ecophysiological responses and soil were identified. Basic research in Mediterranean forest ecosystems is mainly conducted by ecophysiological research. Applied research was mainly represented by land-use and degradation, biodiversity and conservation and fire research topics. The citation analyses
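
The co-term analysis underlying such science maps reduces to counting how often pairs of keywords appear in the same publication; the resulting co-occurrence matrix is what gets clustered. A toy sketch with invented keyword sets (the real input is Scopus metadata):

```python
from collections import Counter
from itertools import combinations

# Invented keyword sets standing in for per-publication Scopus keywords.
papers = [
    {"biodiversity", "conservation"},
    {"biodiversity", "conservation", "fire"},
    {"land-use", "degradation"},
    {"land-use", "degradation", "soil"},
]

# Co-term analysis: count co-occurrences of keyword pairs across papers.
pairs = Counter()
for keywords in papers:
    for a, b in combinations(sorted(keywords), 2):
        pairs[(a, b)] += 1

print(pairs[("biodiversity", "conservation")])  # 2
```

Frequently co-occurring pairs (here "biodiversity"/"conservation" and "degradation"/"land-use") are exactly the kind of links that make topic clusters emerge in the mapped network.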

  15. Assessment of Social Vulnerability Identification at Local Level around Merapi Volcano - A Self Organizing Map Approach

    NASA Astrophysics Data System (ADS)

    Lee, S.; Maharani, Y. N.; Ki, S. J.

    2015-12-01

    Applying a Self-Organizing Map (SOM) to analyze social vulnerability and recognize the resilience within sites is a challenging task. The aim of this study is to propose a computational method to identify the sites according to their similarity and to determine the most relevant variables to characterize the social vulnerability in each cluster. For this purpose, SOM is considered as an effective platform for analysis of high dimensional data. By considering the cluster structure, the characteristics of social vulnerability of the identified sites can be fully understood. In this study, the social vulnerability variable is constructed from 17 variables, i.e. 12 independent variables which represent the socio-economic concepts and 5 dependent variables which represent the damage and losses due to the Merapi eruption in 2010. These variables collectively represent the local situation of the study area, based on fieldwork conducted in September 2013. By using both independent and dependent variables, we can identify whether the social vulnerability is reflected in the actual situation, in this case, the Merapi eruption of 2010. However, social vulnerability analysis in the local communities consists of a number of variables that represent their socio-economic condition. Some of the variables employed in this study might be more or less redundant. Therefore, SOM is used to reduce the redundant variable(s) by selecting the representative variables using the component planes and correlation coefficients between variables in order to find the effective sample size. Then, the selected dataset was clustered according to similarity. Finally, this approach can produce reliable estimates of clustering, recognize the most significant variables and could be useful for social vulnerability assessment, especially for stakeholders as decision makers. This research was supported by a grant 'Development of Advanced Volcanic Disaster Response System considering
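
A Self-Organizing Map of the kind used here can be implemented in a few dozen lines. The sketch below trains a small SOM on two synthetic clusters of "sites"; the grid size, learning-rate schedule, and data are illustrative assumptions, not the study's configuration:

```python
import numpy as np

def train_som(data, grid=(4, 4), n_iter=500, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small Self-Organizing Map on the rows of `data`."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.random((rows * cols, data.shape[1]))
    # Grid coordinates of each unit, used by the neighborhood function.
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for t in range(n_iter):
        frac = t / n_iter
        lr = lr0 * (1 - frac)                # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5    # shrinking neighborhood radius
        x = data[rng.integers(len(data))]    # random training sample
        bmu = np.argmin(np.sum((weights - x) ** 2, axis=1))  # best unit
        d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2 * sigma ** 2))   # Gaussian neighborhood
        weights += lr * h[:, None] * (x - weights)
    return weights

# Two well-separated synthetic "site" clusters in a 5-variable space.
rng = np.random.default_rng(1)
a = rng.normal(0.0, 0.05, (50, 5))
b = rng.normal(1.0, 0.05, (50, 5))
data = np.vstack([a, b])

w = train_som(data)
bmu_a = np.argmin(np.sum((w - a.mean(0)) ** 2, axis=1))
bmu_b = np.argmin(np.sum((w - b.mean(0)) ** 2, axis=1))
print(bmu_a, bmu_b)  # the two clusters activate different map units
```

After training, each cluster's centroid maps to a unit whose weight vector resembles it; inspecting those weight vectors per input variable is the "component planes" view the abstract refers to.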

  16. Mapping soil vulnerability to floods under varying land use and climate: A new approach

    NASA Astrophysics Data System (ADS)

    Alaoui, Abdallah; Spiess, Pascal; Beyeler, Marcel

    2016-04-01

    the hydrological connectivity between zones of various predisposition to excess surface runoff under different land uses. These promising results indicate that the approach is suited for mapping soil vulnerability to floods under varying land use and climate at any scale.

  17. Structure and Evolution of Mediterranean Forest Research: A Science Mapping Approach.

    PubMed

    Nardi, Pierfrancesco; Di Matteo, Giovanni; Palahi, Marc; Scarascia Mugnozza, Giuseppe

    2016-01-01

    This study aims at conducting the first science mapping analysis of the Mediterranean forest research in order to elucidate its research structure and evolution. We applied a science mapping approach based on co-term and citation analyses to a set of scientific publications retrieved from the Elsevier's Scopus database over the period 1980-2014. The Scopus search retrieved 2,698 research papers and reviews published by 159 peer-reviewed journals. The total number of publications was around 1% (N = 17) during the period 1980-1989 and they reached 3% (N = 69) in the time slice 1990-1994. Since 1995, the number of publications increased exponentially, thus reaching 55% (N = 1,476) during the period 2010-2014. Within the thirty-four years considered, the retrieved publications were published by 88 countries. Among them, Spain was the most productive country, publishing 44% (N = 1,178) of total publications followed by Italy (18%, N = 482) and France (12%, N = 336). These countries also host the ten most productive scientific institutions in terms of number of publications in Mediterranean forest subjects. Forest Ecology and Management and Annals of Forest Science were the most active journals in publishing research in Mediterranean forest. During the period 1980-1994, the research topics were poorly characterized, but they became better defined during the time slice 1995-1999. Since the 2000s, the clusters have become well defined by research topics. Current status of Mediterranean forest research (2009-2014) was represented by four clusters, in which different research topics such as biodiversity and conservation, land-use and degradation, climate change effects on ecophysiological responses and soil were identified. Basic research in Mediterranean forest ecosystems is mainly conducted by ecophysiological research. Applied research was mainly represented by land-use and degradation, biodiversity and conservation and fire research topics. The citation analyses revealed highly

  18. Unravelling the impact of inheritance within the Wilson Cycle: a combined mapping and numerical modelling approach

    NASA Astrophysics Data System (ADS)

    Chenin, Pauline; Manatschal, Gianreto; Lavier, Luc

    2015-04-01

    Our study aims to unravel how structural, lithological and thermal heterogeneities may influence both orogenic and rift systems within the Wilson Cycle. To do this, we map first-order rift structural domains, the timing of the main rift events, and major heterogeneities and structures inherited from previous orogenies. In addition, we design numerical modelling experiments to investigate the relationships highlighted by the comparison of these maps. We apply this approach to the North Atlantic region, which underwent two major orogenic phases during the Palaeozoic: (1) the Caledonian orogeny - now extending from the United Kingdom to northern Norway and Eastern Greenland - resulted from the Late Ordovician closure of the large Iapetus Ocean (> 2 000 km) and the smaller Tornquist Seaway. It was followed by purely mechanical extensional orogenic collapse; (2) the Variscides of Southwestern Europe were essentially built from the Devono-Carboniferous suturing of several small oceanic basins (< 200 km) in addition to the large Rheic Ocean. The subsequent orogenic collapse was accompanied by significant magmatic activity, which resulted in mafic underplating and associated mantle depletion over the whole orogenic area. Our study is twofold. On the one hand, we investigate how the size and maturity of the intervening oceanic basins affect subduction and orogeny, considering two end-members: (a) immature oceanic basins, defined as hyperextended rift systems that never achieved steady-state seafloor spreading; and (b) mature oceans, characterized by a self-sustained magmatic system forming homogeneous oceanic crust. On the other hand, we study how post-orogenic collapse-related underplating and associated mantle depletion may impact subsequent rifting depending on the thermal state (e.g. the duration of the relaxation time between the magmatic episode and the onset of rifting). 
Our results highlight a very different behaviour of the North Atlantic rift with respect to the Caledonian and

  19. Molecular scene analysis: application of a topological approach to the automated interpretation of protein electron-density maps.

    PubMed

    Leherte, L; Fortier, S; Glasgow, J; Allen, F H

    1994-03-01

    Methods to assist in the spatial and visual analysis of electron-density maps have been investigated as part of a project in molecular scene analysis [Fortier, Castleden, Glasgow, Conklin, Walmsley, Leherte & Allen (1993). Acta Cryst. D49, 168-178]. In particular, the usefulness of the topological approach for the segmentation of medium-resolution (3 Å) maps of proteins and their interpretation in terms of structural motifs has been assessed. The approach followed is that proposed by Johnson [Johnson (1977). ORCRIT. The Oak Ridge Critical Point Network Program. Chemistry Division, Oak Ridge National Laboratory, USA] which provides a global representation of the electron-density distribution through the location, identification and linkage of its critical points. In the first part of the study, the topological approach was applied to calculated maps of three proteins of small to medium size so as to develop a methodology that could then be used for analyzing maps of medium resolution. The methodology was then applied to both calculated and experimental maps of penicillopepsin at 3 Å resolution. The study shows that the networks of critical points can provide a useful segmentation of the maps, tracing the protein main chains and capturing their conformation. In addition, these networks can be parsed in terms of secondary-structure motifs, through a geometrical analysis of the critical points. The procedure adopted for secondary-structure recognition, which was phrased in terms of geometry-based rules, provides a basis for a further automated implementation of a more complete set of recognition operations through the use of artificial-intelligence techniques. PMID:15299453

  20. Chemical Genetics Approach Reveals Importance of cAMP and MAP Kinase Signaling to Lipid and Carotenoid Biosynthesis in Microalgae.

    PubMed

    Choi, Yoon-E; Rhee, Jin-Kyu; Kim, Hyun-Soo; Ahn, Joon-Woo; Hwang, Hyemin; Yang, Ji-Won

    2015-05-01

    In this study, we attempted to understand signaling pathways behind lipid biosynthesis by employing a chemical genetics approach based on small molecule inhibitors. Specific signaling inhibitors of MAP kinase or modulators of cAMP signaling were selected to evaluate the functional roles of each of the key signaling pathways in three different microalgal species: Chlamydomonas reinhardtii, Chlorella vulgaris, and Haematococcus pluvialis. Our results clearly indicate that cAMP signaling pathways are indeed positively associated with microalgal lipid biosynthesis. In contrast, MAP kinase pathways in three microalgal species are all negatively implicated in both lipid and carotenoid biosynthesis. PMID:25563422

  1. Higgs in bosonic channels (CMS)

    NASA Astrophysics Data System (ADS)

    Gori, Valentina

    2015-05-01

    The main Higgs boson decays into bosonic channels will be considered, presenting and discussing results from the latest reprocessing of data collected by the CMS experiment at the LHC, using the full dataset recorded at centre-of-mass energies of 7 and 8 TeV. For this purpose, results from the final Run-I papers for the H → ZZ → 4ℓ, H → γγ and H → WW analyses are presented, focusing on Higgs boson properties such as the mass, the signal strength, the couplings to fermions and vector bosons, and the spin and parity properties. Furthermore, the Higgs boson width measurement exploiting the on-shell versus the off-shell cross section (in the H → ZZ → 4ℓ and H → ZZ → 2ℓ2ν decay channels) will be shown. All the investigated properties are found to be fully consistent with the SM predictions: the signal strength and the signal strength modifiers are consistent with unity in all the bosonic channels considered; the hypothesis of a scalar particle is strongly favored against the pseudoscalar, vector/pseudovector, and spin-2 boson hypotheses (all excluded at 99% CL or higher in the H → ZZ → 4ℓ channel). The Higgs boson mass measurement from the combination of the H → ZZ → 4ℓ and H → γγ channels gives a value mH = 125.03 +0.26/−0.27 (stat.) +0.13/−0.15 (syst.) GeV. An upper limit ΓH < 22 MeV can be put on the Higgs boson width thanks to the new indirect method.
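
    The indirect width constraint rests on the fact that the on-shell signal strength scales as couplings squared divided by ΓH, while the off-shell strength scales as couplings squared alone, so their ratio measures ΓH in units of the SM width. A worked sketch of the arithmetic (the SM width value is the standard prediction for mH ≈ 125 GeV; the function name is illustrative, not CMS code):

    ```python
    # On-shell rate ~ g^2 g'^2 / Gamma_H ; off-shell rate ~ g^2 g'^2.
    # Hence mu_offshell / mu_onshell = Gamma_H / Gamma_H_SM.
    GAMMA_H_SM_MEV = 4.07  # SM prediction for m_H ~ 125 GeV

    def width_from_ratio(mu_offshell_over_onshell):
        """Higgs width (MeV) implied by the off-shell/on-shell signal-strength ratio."""
        return mu_offshell_over_onshell * GAMMA_H_SM_MEV

    # A limit of ~5.4 on the ratio translates into Gamma_H < ~22 MeV
    print(round(width_from_ratio(5.4), 1))  # -> 22.0
    ```

    This is why a ratio constraint of a few units of the SM prediction yields the quoted 22 MeV upper limit, some three orders of magnitude below the direct experimental mass resolution.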

  2. A Constructivist Approach to Designing Computer Supported Concept-Mapping Environment

    ERIC Educational Resources Information Center

    Cheung, Li Siu

    2006-01-01

    In the past two decades, there has been a proliferation of research activities on exploring the use of concept maps to support teaching and learning of various knowledge disciplines which range from science to language subjects. MindNet, which is a collaborative concept mapping environment that supports both synchronous and asynchronous modes of…

  3. Ontology Mapping Neural Network: An Approach to Learning and Inferring Correspondences among Ontologies

    ERIC Educational Resources Information Center

    Peng, Yefei

    2010-01-01

    An ontology mapping neural network (OMNN) is proposed in order to learn and infer correspondences among ontologies. It extends the Identical Elements Neural Network (IENN)'s ability to represent and map complex relationships. The learning dynamics of simultaneous (interlaced) training of similar tasks interact at the shared connections of the…

  4. An Innovative Concept Map Approach for Improving Students' Learning Performance with an Instant Feedback Mechanism

    ERIC Educational Resources Information Center

    Wu, Po-Han; Hwang, Gwo-Jen; Milrad, Marcelo; Ke, Hui-Ru; Huang, Yueh-Min

    2012-01-01

    Concept maps have been widely employed for helping students organise their knowledge as well as evaluating their knowledge structures in a wide range of subject matters. Although researchers have recognised concept maps as being an important educational tool, past experiences have also revealed the difficulty of evaluating the correctness of a…

  5. MAPPING LONG-TERM REGIONAL RUNOFF IN THE EASTERN UNITED STATES USING AUTOMATED APPROACHES

    EPA Science Inventory

    The authors explored and evaluated nine automated procedures for mapping long-term runoff. The evaluations of their accuracy and acceptability are based on visual comparison of the contour maps and analysis of deviations of computed runoff from gaged runoff at 93 withheld sites…

  6. AUTOMATED APPROACHES FOR PRODUCING REGIONAL RUNOFF MAPS OF THE NORTHEASTERN UNITED STATES

    EPA Science Inventory

    We have conducted research to find a faster, simpler, less expensive way to create runoff (i.e., runoff-depth) contour maps using automated procedures. Our goal was to produce maps as accurate as those produced manually by the U.S. Geological Survey. We developed eight procedures b…

  7. Alternative approaches for measuring oat crown rust to improve resistance mapping

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Precise and accurate disease assessment can improve mapping of QTL conferring resistance. We have used two new methods to assess crown rust resistance in the Ogle/TAM O-301 oat mapping population: 1) single-race field trials in isolation chambers and 2) a molecular assay to measure the development...

  8. Bosonic edge states in gapped honeycomb lattices

    NASA Astrophysics Data System (ADS)

    Guo, Huaiming; Niu, Yuekun; Chen, Shu; Feng, Shiping

    2016-03-01

    By quantum Monte Carlo simulations of bosons in gapped honeycomb lattices, we show the existence of bosonic edge states. For a single-layer honeycomb lattice, bosonic edge states can be controlled to appear, cross the gap, and merge into bulk states by an on-site potential applied to the outermost sites of the boundary. On a bilayer honeycomb lattice, a bosonic edge state traversing the gap at half filling is demonstrated. The topological origin of the bosonic edge states is discussed in terms of the pseudo Berry curvature. The results should stimulate experimental studies of these exotic bosonic edge states with ultracold bosons trapped in honeycomb optical lattices.

  9. Dark light Higgs bosons.

    SciTech Connect

    Draper, P.; Liu, T.; Wagner, C. E. M.; Wang, L.-T.; Zhang, H.

    2011-03-24

    We study a limit of the nearly Peccei-Quinn-symmetric next-to-minimal supersymmetric standard model possessing novel Higgs and dark matter (DM) properties. In this scenario, there naturally coexist three light singlet-like particles: a scalar, a pseudoscalar, and a singlino-like DM candidate, all with masses of order 0.1-10 GeV. The decay of a standard model-like Higgs boson to pairs of the light scalars or pseudoscalars is generically suppressed, avoiding constraints from collider searches for these channels. For a certain parameter window, annihilation into the light pseudoscalar and exchange of the light scalar with nucleons allow the singlino to achieve the correct relic density and a large direct-detection cross section simultaneously consistent with the preferred regions of the DM direct-detection experiments CoGeNT and DAMA/LIBRA. This parameter space is consistent with experimental constraints from LEP, the Tevatron, ?, and flavor physics.

  10. Quartic gauge boson couplings

    NASA Astrophysics Data System (ADS)

    He, Hong-Jian

    1998-08-01

    We review recent progress in studying the anomalous electroweak quartic gauge boson couplings (QGBCs) at the LHC and the next generation of high-energy e+e− linear colliders (LCs). The main focus is on the strong electroweak symmetry breaking scenario, in which non-decoupling guarantees sizable new-physics effects in the QGBCs. After commenting on the current low-energy indirect bounds and summarizing the theoretical patterns of QGBCs predicted by typical resonance/non-resonance models, we review our systematic model-independent analysis of bounding them via WW-fusion and WWZ/ZZZ production. The interplay of the two production mechanisms and the important role of beam polarization at the LCs are emphasized. The same physics may be similarly and better studied at a multi-TeV muon collider with high luminosity.

  11. Mapping and monitoring cropland burning in European Russia: a multi-sensor approach

    NASA Astrophysics Data System (ADS)

    Hall, J.; Loboda, T. V.; Mccarty, G.; McConnell, L.; Woldemariam, T.

    2013-12-01

    Short-lived aerosols and pollutants transported from high northern latitudes have amplified the short-term warming in the Arctic region. Specifically, black carbon (BC) is recognized as the second most important human emission with regard to climate forcing, behind carbon dioxide, with a total climate forcing of +1.1 W m−2. Earlier studies have suggested that cropland burning may be a major contributor to the BC emissions that are directly deposited above the Arctic Circle. However, accurate monitoring of cropland burning from existing active fire and burned area products is limited: most existing algorithms are focused on mapping hotter and larger wildfire events, the timing of cropland burning differs from wildfire events, and the transient nature of cropland fires adds a further challenge to product development. In addition, an analysis of multi-year cloud cover over Russian croplands using Moderate Resolution Imaging Spectroradiometer (MODIS) daily surface reflectance data showed that, on average, early afternoon observations from MODIS/Aqua provided 68 clear views per growing period (defined as 1 March 2003 - 30 November 2012), with a range from 30 to 101 clear views, whereas MODIS/Terra provided 75 clear views per growing period (defined as 1 March 2001 - 30 November 2012), with a range from 37 to 113 clear views. Here we present a new approach to burned area mapping in croplands from satellite imagery. Our algorithm is designed to detect burned area only within croplands and is not required to perform well outside them. The algorithm focuses on tracking the natural intra-annual development curve specific to crops rather than natural vegetation, and works by identifying the subtle spectral nuances between varieties of cropland field categories. 
Using a combination of the high visual accuracy from very high resolution (VHR, defined as spatial resolution < 5m) imagery and the temporal trend of MODIS data, we are able to differentiate between burned and plowed

  12. An integrated approach to flood hazard assessment on alluvial fans using numerical modeling, field mapping, and remote sensing

    USGS Publications Warehouse

    Pelletier, J.D.; Mayer, L.; Pearthree, P.A.; House, P.K.; Demsey, K.A.; Klawon, J.K.; Vincent, K.R.

    2005-01-01

    Millions of people in the western United States live near the dynamic, distributary channel networks of alluvial fans where flood behavior is complex and poorly constrained. Here we test a new comprehensive approach to alluvial-fan flood hazard assessment that uses four complementary methods: two-dimensional raster-based hydraulic modeling, satellite-image change detection, field-based mapping of recent flood inundation, and surficial geologic mapping. Each of these methods provides spatial detail lacking in the standard method and each provides critical information for a comprehensive assessment. Our numerical model simultaneously solves the continuity equation and Manning's equation (Chow, 1959) using an implicit numerical method. It provides a robust numerical tool for predicting flood flows using the large, high-resolution Digital Elevation Models (DEMs) necessary to resolve the numerous small channels on the typical alluvial fan. Inundation extents and flow depths of historic floods can be reconstructed with the numerical model and validated against field- and satellite-based flood maps. A probabilistic flood hazard map can also be constructed by modeling multiple flood events with a range of specified discharges. This map can be used in conjunction with a surficial geologic map to further refine floodplain delineation on fans. To test the accuracy of the numerical model, we compared model predictions of flood inundation and flow depths against field- and satellite-based flood maps for two recent extreme events on the southern Tortolita and Harquahala piedmonts in Arizona. Model predictions match the field- and satellite-based maps closely. Probabilistic flood hazard maps based on the 10 yr, 100 yr, and maximum floods were also constructed for the study areas using stream gage records and paleoflood deposits. The resulting maps predict spatially complex flood hazards that strongly reflect small-scale topography and are consistent with surficial geology. In
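
    The hydraulic core of such a model is simple to state: Manning's equation relates flow velocity to channel roughness, hydraulic radius, and slope, and continuity conserves discharge between cells. A minimal sketch of the per-cell calculation (SI units; the function names and example values are illustrative, not the authors' code):

    ```python
    def manning_velocity(n, hydraulic_radius, slope):
        """Flow velocity in m/s from Manning's equation (SI units):
        V = (1/n) * R^(2/3) * S^(1/2)."""
        return (1.0 / n) * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

    def discharge(velocity, flow_area):
        """Continuity: discharge Q = V * A in m^3/s."""
        return velocity * flow_area

    # Illustrative values for a shallow distributary channel on an alluvial fan:
    # sandy bed (n ~ 0.035), 0.5 m hydraulic radius, 0.5% slope, 2 m^2 flow area
    v = manning_velocity(n=0.035, hydraulic_radius=0.5, slope=0.005)
    q = discharge(v, flow_area=2.0)
    print(round(v, 2), round(q, 2))  # -> 1.27 2.55
    ```

    A raster model repeats this calculation for every DEM cell while enforcing continuity implicitly across the grid, which is what lets it resolve the many small channels the abstract mentions.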

  13. Water-sanitation-hygiene mapping: an improved approach for data collection at local level.

    PubMed

    Giné-Garriga, Ricard; de Palencia, Alejandro Jiménez-Fernández; Pérez-Foguet, Agustí

    2013-10-01

    Strategic planning and appropriate development and management of water and sanitation services are strongly supported by accurate and accessible data. If adequately exploited, these data might assist water managers with performance monitoring, benchmarking comparisons, policy progress evaluation, resources allocation, and decision making. A variety of tools and techniques are in place to collect such information. However, some methodological weaknesses arise when developing an instrument for routine data collection, particularly at local level: i) comparability problems due to heterogeneity of indicators, ii) poor reliability of collected data, iii) inadequate combination of different information sources, and iv) statistical validity of produced estimates when disaggregated into small geographic subareas. This study proposes an improved approach for water, sanitation and hygiene (WASH) data collection at decentralised level in low income settings, as an attempt to overcome previous shortcomings. The ultimate aim is to provide local policymakers with strong evidences to inform their planning decisions. The survey design takes the Water Point Mapping (WPM) as a starting point to record all available water sources at a particular location. This information is then linked to data produced by a household survey. Different survey instruments are implemented to collect reliable data by employing a variety of techniques, such as structured questionnaires, direct observation and water quality testing. The collected data is finally validated through simple statistical analysis, which in turn produces valuable outputs that might feed into the decision-making process. In order to demonstrate the applicability of the method, outcomes produced from three different case studies (Homa Bay District, Kenya; Kibondo District, Tanzania; and the Municipality of Manhiça, Mozambique) are presented. PMID:23850660

  14. A Geospatial Approach to Mapping Bioenergy Potential of Perennial Crops in North American Tallgrass Prairie

    NASA Astrophysics Data System (ADS)

    Wang, S.; Fritschi, F. B.; Stacy, G.

    2009-12-01

    Biomass is the largest source of renewable energy in the United States and is expected to replace 30% of domestic petroleum consumption by 2030. Corn ethanol currently constitutes 99% of the country's biofuels. Extended annual crop planting for biofuel production, however, has raised concerns about long-term environmental, ecological and socio-economic consequences. More sustainable bioenergy resources might therefore be developed to meet energy demand, food security and climate policy. The DOE has identified switchgrass (Panicum virgatum L.) as a model bioenergy crop. Switchgrass, along with other warm-season grasses, is native to the pre-colonial tallgrass prairie in North America. This study maps the spatial distributions of prairie grasses and marginal croplands in the tallgrass prairie with remote sensing and GIS techniques. For 2000-2008, 8-day composite MODIS imagery was downloaded to calculate the normalized difference vegetation index (NDVI). With the pixel-level temporal trajectory of NDVI, time-series trend analysis was performed to identify native prairie grasses based on their phenological uniqueness. In a case study in southwest Missouri, this trajectory approach distinguished more than 80% of warm-season prairie grasslands from row crops and cool-season pastures (Figure 1). Warm-season grasses dominated the 19 public prairies in the study area, in a range of 45-98%. This study explores the geographic context of current and potential perennial bioenergy supplies in the tallgrass prairie. Beyond the current findings, it holds promise for further investigations to provide quantitative economic and environmental information to assist bioenergy policy decision-making. Figure 1: The distribution of grasslands in the study area. "WSG", "CSG" and "non-grass" represent warm-season prairie grasses, introduced cool-season grasses and crops, and other non-grasses.
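
    The NDVI driving the time-series trajectories is a simple band ratio of near-infrared and red reflectance. A minimal sketch (function name and reflectance values are illustrative, not the authors' processing chain):

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized difference vegetation index: (NIR - Red) / (NIR + Red).
        Accepts scalars or arrays of surface reflectance in [0, 1]."""
        nir = np.asarray(nir, dtype=float)
        red = np.asarray(red, dtype=float)
        return (nir - red) / (nir + red)

    # Healthy vegetation reflects strongly in NIR and absorbs red light
    print(ndvi(0.45, 0.05))  # -> 0.8
    ```

    Stacking this value per pixel across the 8-day composites of a growing season yields the phenological trajectory whose shape separates warm-season prairie grasses from row crops and cool-season pastures.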

  15. What is a Higgs Boson?

    ScienceCinema

    Lincoln, Don

    2014-08-12

    Fermilab scientist Don Lincoln describes the nature of the Higgs boson. Several large experimental groups are hot on the trail of this elusive subatomic particle which is thought to explain the origins of particle mass.

  16. Chiral Bosonization of Superconformal Ghosts

    NASA Technical Reports Server (NTRS)

    Shi, Deheng; Shen, Yang; Liu, Jinling; Xiong, Yongjian

    1996-01-01

    We explain the difference between the Hilbert space of the superconformal ghost (β, γ) system and that of its bosonized fields φ and χ. We calculate the chiral correlation functions of the φ and χ fields by inserting appropriate projectors.

  17. What is a Higgs Boson?

    SciTech Connect

    Lincoln, Don

    2011-07-07

    Fermilab scientist Don Lincoln describes the nature of the Higgs boson. Several large experimental groups are hot on the trail of this elusive subatomic particle which is thought to explain the origins of particle mass.

  18. A National Approach to Quantify and Map Biodiversity Conservation Metrics within an Ecosystem Services Framework

    EPA Science Inventory

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have be...

  19. Collaborative and Multilingual Approach to Learn Database Topics Using Concept Maps

    PubMed Central

    Calvo, Iñaki

    2014-01-01

    Authors report on a study using the concept mapping technique in computer engineering education for learning theoretical introductory database topics. In addition, the learning of multilingual technical terminology by means of the collaborative drawing of a concept map is also pursued in this experiment. The main characteristics of a study carried out in the database subject at the University of the Basque Country during the 2011/2012 academic year are described. This study contributes to the field of concept mapping, as these kinds of cognitive tools have proved to be valid to support learning in computer engineering education. It contributes to the field of computer engineering education, providing a technique that can be incorporated with several educational purposes within the discipline. Results reveal the potential that a collaborative concept map editor offers to fulfil the above-mentioned objectives. PMID:25538957

  20. Use of linkage disequilibrium approaches to map genes for bipolar disorder in the Costa Rican population

    SciTech Connect

    Escamilla, M.A.; Reus, V.I.; Smith, L.B.; Freimer, N.B.

    1996-05-31

    Linkage disequilibrium (LD) analysis provides a powerful means for screening the genome to map the location of disease genes, such as those for bipolar disorder (BP). As described in this paper, the population of the Central Valley of Costa Rica, which is descended from a small number of founders, should be suitable for LD mapping; this assertion is supported by reconstruction of extended haplotypes shared by distantly related individuals in this population suffering low-frequency hearing loss (LFHL1), which has previously been mapped by linkage analysis. A sampling strategy is described for applying LD methods to map genes for BP, and clinical and demographic characteristics of an initially collected sample are discussed. This sample will provide a complement to a previously collected set of Costa Rican BP families which is under investigation using standard linkage analysis. 42 refs., 4 figs., 2 tabs.

  1. A National Approach for Mapping and Quantifying Habitat-based Biodiversity Metrics Across Multiple Spatial Scales

    EPA Science Inventory

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national inte...

  2. AFSM sequencing approach: a simple and rapid method for genome-wide SNP and methylation site discovery and genetic mapping

    PubMed Central

    Xia, Zhiqiang; Zou, Meiling; Zhang, Shengkui; Feng, Binxiao; Wang, Wenquan

    2014-01-01

    We describe methods for the assessment of amplified-fragment single nucleotide polymorphism and methylation (AFSM) sites using a quick and simple molecular marker-assisted breeding strategy based on the use of two restriction enzyme pairs (EcoRI-MspI and EcoRI-HpaII) and a next-generation sequencing platform. Two sets of 85 adapter pairs were developed to concurrently identify SNPs, indels and methylation sites for 85 lines of a cassava population in this study. In addition to SNPs and indels, the simplicity of the AFSM protocol makes it particularly suitable for high-throughput full-methylation and hemi-methylation analyses. To further demonstrate the ease of this approach, a cassava genetic linkage map was constructed. This approach should be widely applicable for genetic mapping in a variety of organisms and will improve the application of crop genomics to assisted breeding. PMID:25466435

  3. Mapping land cover from satellite images: A basic, low cost approach

    NASA Technical Reports Server (NTRS)

    Elifrits, C. D.; Barney, T. W.; Barr, D. J.; Johannsen, C. J.

    1978-01-01

    Simple, inexpensive methodologies developed for mapping general land cover and land use categories from LANDSAT images are reported. One methodology, a stepwise, interpretive, direct tracing technique was developed through working with university students from different disciplines with no previous experience in satellite image interpretation. The technique results in maps that are very accurate in relation to actual land cover and relative to the small investment in skill, time, and money needed to produce the products.

  4. Isotopomer mapping approach to determine N_{2}O production pathways and N_{2}O reduction

    NASA Astrophysics Data System (ADS)

    Lewicka-Szczebak, Dominika; Well, Reinhard; Cardenas, Laura; Bol, Roland

    2016-04-01

    Stable isotopomer analyses of soil-emitted N2O (δ15N, δ18O, and SP = 15N site preference within the linear N2O molecule) may help to distinguish N2O production pathways and to quantify N2O reduction to N2. Different N2O-forming processes are characterised by distinct isotopic signatures: bacterial denitrification shows significantly lower SP and δ18O values when compared to fungal denitrification and nitrification processes. But SP and δ18O values are also altered during N2O reduction to N2, when the residual N2O is enriched in 18O and in centrally located 15N, resulting in increased δ18O and SP values. Hence, the interpretation of these isotope characteristics is not straightforward, because higher δ18O and SP values may be due to admixture of N2O from fungal denitrification or nitrification, or due to N2O reduction to N2. One of these processes, either admixture or reduction, can be quantified quite well if the other one is determined with independent methods. But usually both processes are unknown, and the ability to estimate both of them simultaneously would be very beneficial. Here we present an attempt to determine both the admixture and the reduction simultaneously using isotopomer mapping, i.e. the relation between δ18O and SP. The measured sample points are typically situated between two lines: the reduction line, with a typical slope of about 0.35, and the mixing line, with a higher slope of about 0.8. Combining the reduction and the mixing vectors allows for the determination of both processes based on the location of the sample point between the lines. We tested this new approach on laboratory incubation studies in which a reference method for N2O reduction quantification was applied, i.e. the 15N gas flux method or incubations in a He atmosphere. This allowed us to check how well the calculated amounts of N2O reduction agree with the results provided by the reference method. The general trend was quite well reflected in our calculated results, however, quite
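
    The decomposition described above amounts to solving a 2x2 linear system: the offset of a sample from the bacterial-denitrification endmember is split into a component along the mixing line and a component along the reduction line. A minimal sketch of that algebra (the endmember values are illustrative placeholders, not the study's calibration; the slopes are the typical values quoted in the abstract):

    ```python
    import numpy as np

    SLOPE_MIX = 0.8   # typical mixing-line slope (SP vs. d18O)
    SLOPE_RED = 0.35  # typical reduction-line slope

    def decompose(d18o, sp, d18o_bact, sp_bact):
        """Split the (d18O, SP) offset from the bacterial endmember into
        a mixing shift and a reduction shift (per-mil units):
        offset = mix_shift * (1, SLOPE_MIX) + red_shift * (1, SLOPE_RED)."""
        A = np.array([[1.0, 1.0],
                      [SLOPE_MIX, SLOPE_RED]])
        rhs = np.array([d18o - d18o_bact, sp - sp_bact])
        mix_shift, red_shift = np.linalg.solve(A, rhs)
        return mix_shift, red_shift

    # Hypothetical sample and endmember values, for illustration only
    mix, red = decompose(d18o=30.0, sp=1.0, d18o_bact=20.0, sp_bact=-5.0)
    ```

    Sample points lying between the two lines yield positive values for both shifts; the reduction shift can then be converted into a residual N2O fraction via the relevant fractionation factors.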

  5. A Multi - Disciplinary Approach Combining Geological, Geomorphological and Geophysical Data for Mapping the Susceptibility to Sinkholes

    NASA Astrophysics Data System (ADS)

    Margiotta, Stefano; Negri, Sergio; Quarta, Tatiana A. M.; Parise, Mario

    2013-04-01

    The Salento region of southern Italy has a great number of active sinkholes, related to both natural and anthropogenic cavities. The presence of sinkholes is at the origin of several problems for the built environment, due to increasing population growth and development pressures. In such a context, the detection of cavities, and therefore the assessment of sinkhole hazard, presents numerous difficulties. A multidisciplinary approach, comprising geological, geomorphological and geophysical analyses, is therefore necessary to obtain comprehensive knowledge of the complex phenomena in karstic areas. Geophysical methods can also be of great help in identifying and mapping the areas at higher risk of collapse. In this case it is important to identify the features related to underground voids likely to evolve into sinkholes, through contrasts in physical properties, such as density and electrical resistivity, with the surrounding sediments. At the same time, identification of the presence of sinkholes by geophysical methods has to adapt to the different geological conditions, so that it is not possible to use the same techniques everywhere. To this aim, the present paper illustrates the advantages of integrating geological and geomorphological surveys with surface geophysical techniques such as seismic, geoelectrical and ground-penetrating radar methods for the identification of sinkhole-prone areas. The present work illustrates the results concerning a sinkhole system at Nociglia (inland Salento, southeastern Italy), where shallow phreatic speleogenesis operates close to the water table level with the formation of karst conduits and proto-caves, whose evolution occurs through successive roof collapse, the formation of wide caverns, and sinkhole development at the surface. All of this creates serious problems for the nearby infrastructure, including a provincial road that has often been threatened by sinkhole development. Geological and geomorphological

  6. Excitations in disordered bosonic optical lattices

    SciTech Connect

    Knap, Michael; Arrigoni, Enrico; Linden, Wolfgang von der

    2010-11-15

    Spectral excitations of ultracold gases of bosonic atoms trapped in one-dimensional optical lattices with disorder are investigated by means of the variational cluster approach applied to the Bose-Hubbard model. Qualitatively different disorder distributions typically employed in experiments are considered. The computed spectra exhibit a strong dependence on the shape of the disorder distribution and the disorder strength. We compare alternative results for the Mott gap obtained from its formal definition and from the minimum peak distance, which is the quantity available from experiments.

  7. Geomorphons — a pattern recognition approach to classification and mapping of landforms

    NASA Astrophysics Data System (ADS)

    Jasiewicz, Jarosław; Stepinski, Tomasz F.

    2013-01-01

    We introduce a novel method for classification and mapping of landform elements from a DEM based on the principle of pattern recognition rather than differential geometry. At the core of the method is the concept of the geomorphon (geomorphologic phonotype) — a simple ternary pattern that serves as an archetype of a particular terrain morphology. A finite number of 498 geomorphons constitute a comprehensive and exhaustive set of all possible morphological terrain types, including standard elements of landscape as well as unfamiliar forms rarely found in natural terrestrial surfaces. A single scan of a DEM assigns an appropriate geomorphon to every cell in the raster using a procedure that self-adapts to identify the most suitable spatial scale at each location. As a result, the method classifies landform elements at a range of different spatial scales with unprecedented computational efficiency. A general purpose geomorphometric map — an interpreted map of topography — is obtained by generalizing all geomorphons to a small number of the most common landform elements. Due to the robustness and high computational efficiency of the method, high resolution geomorphometric maps having continental and even global extents can be generated from giga-cell DEMs. Such maps are a valuable new resource for both manual and automated geomorphometric analyses. In order to demonstrate a practical application of this new method, a 30 m cell⁻¹ geomorphometric map of the entire country of Poland is generated, and the features and potential usage of this map are briefly discussed. The computer implementation of the method is outlined. The code is available in the public domain.

  8. De-pinning of disordered bosonic chains

    NASA Astrophysics Data System (ADS)

    Vogt, N.; Cole, J. H.; Shnirman, A.

    2016-05-01

    We consider the onset of transport (de-pinning) in one-dimensional bosonic chains with a repulsive boson–boson interaction that decays exponentially on large length-scales. Our study is relevant for (i) de-pinning of Cooper pairs in Josephson junction arrays; (ii) de-pinning of magnetic flux quanta in quantum-phase-slip ladders, i.e. arrays of superconducting wires in a ladder configuration that allow for the coherent tunneling of flux quanta. In the low-frequency, long wave-length regime these chains can be mapped onto an effective model of a one-dimensional elastic field in a disordered potential. The standard de-pinning theories address infinitely long systems in two limiting cases: (a) uncorrelated disorder (zero correlation length); (b) long-range power-law correlated disorder (infinite correlation length). In this paper we numerically study chains of finite length in the intermediate case of a long but finite disorder correlation length. This regime is of relevance for, e.g., the experimental systems mentioned above. We study the interplay of three length scales: the system length, the interaction range, and the correlation length of the disorder. In particular, we observe the crossover from the solitonic onset of transport in arrays shorter than the disorder correlation length to the onset of transport by de-pinning in longer arrays.

  9. A new approach for creating customizable cytoarchitectonic probabilistic maps without a template.

    PubMed

    Tahmasebi, Amir M; Abolmaesumi, Purang; Geng, Xiujuan; Morosan, Patricia; Amunts, Katrin; Christensen, Gary E; Johnsrude, Ingrid S

    2009-01-01

    We present a novel technique for creating template-free probabilistic maps of cytoarchitectonic areas using groupwise registration. We use the technique to transform 10 human post-mortem structural MR data sets, together with their corresponding cytoarchitectonic information, to a common space. We have targeted the cytoarchitectonically defined subregions of the primary auditory cortex. Thanks to the template-free groupwise registration, the created maps are not macroanatomically biased towards a specific geometry/topology. The advantage of groupwise over pairwise registration in avoiding such anatomical bias is most apparent in studies with a small number of subjects and a high degree of variability among the individuals, such as post-mortem data. A leave-one-out cross-validation method was used to compare the sensitivity, specificity and positive predictive value of the proposed and published maps. We observe a significant improvement in localization of cytoarchitectonically defined subregions in primary auditory cortex using the proposed maps. The proposed maps can be tailored to any subject space by registering the subject image to the average of the groupwise-registered post-mortem images. PMID:20426184

  10. Assessing spatial uncertainty in predictive geomorphological mapping: A multi-modelling approach

    NASA Astrophysics Data System (ADS)

    Luoto, Miska; Marmion, Mathieu; Hjort, Jan

    2010-03-01

    Maps of earth surface processes and the potential distribution of landforms make an important contribution to theoretical and applied geomorphology. Because decision making often depends on information based on spatial models, there is a great need to develop methodology to evaluate the spatial uncertainty resulting from those models. In this study we developed a new method to produce maps of the uncertainty of predictions provided by ten state-of-the-art modelling techniques for sorted (SP) and non-sorted (NSP) patterned ground in subarctic Finland at a 1.0-ha resolution. Six uncertainty classes represent the modelling agreement between the different modelling techniques. The resulting uncertainty maps reflect the reliability of the estimates for the studied periglacial landforms in the modelled area. Our results showed a significant negative correlation between the degree of uncertainty and the accuracy of the modelling techniques. On average, when all ten models agreed, the mean area under the curve (AUC) values were 0.904 (NSP) and 0.896 (SP); these values decreased to 0.416 (NSP) and 0.518 (SP), respectively, when only five models agreed. Mapping the uncertainty of predictions in geomorphology can help scientists improve the reliability of their data and modelling results. The predictive maps can be interpreted together with the uncertainty information, improving understanding of the potential pitfalls of the modelling.

  11. Impact of a concept map teaching approach on nursing students' critical thinking skills.

    PubMed

    Kaddoura, Mahmoud; Van-Dyke, Olga; Yang, Qing

    2016-09-01

    Nurses confront complex problems and decisions that require critical thinking in order to identify patient needs and implement best practices. An active strategy for teaching students the skills to think critically is the concept map. This study explores the development of critical thinking among nursing students in a required pathophysiology and pharmacology course during the first year of a Bachelor of Science in Nursing program in response to concept mapping as an interventional strategy, using the Health Education Systems, Incorporated critical thinking test. A two-group experimental study with a pretest and posttest design was used. Participants were randomly divided into a control group (n = 42) taught by traditional didactic lecturing alone, and an intervention group (n = 41) taught by traditional didactic lecturing with concept mapping. Students in the concept mapping group performed much better on the Health Education Systems, Incorporated test than students in the control group. It is recommended that deans, program directors, and nursing faculty evaluate their curricula to integrate concept map teaching strategies into courses in order to develop critical thinking abilities in their students. PMID:26891960

  12. Brain Mapping in a Patient with Congenital Blindness – A Case for Multimodal Approaches

    PubMed Central

    Roland, Jarod L.; Hacker, Carl D.; Breshears, Jonathan D.; Gaona, Charles M.; Hogan, R. Edward; Burton, Harold; Corbetta, Maurizio; Leuthardt, Eric C.

    2013-01-01

    Recent advances in basic neuroscience research across a wide range of methodologies have contributed significantly to our understanding of human cortical electrophysiology and functional brain imaging. Translation of this research into clinical neurosurgery has opened doors for advanced mapping of functionality that previously was prohibitively difficult, if not impossible. Here we present the case of a unique individual with congenital blindness and medically refractory epilepsy who underwent neurosurgical treatment of her seizures. Pre-operative evaluation presented the challenge of accurately and robustly mapping the cerebral cortex of an individual with a high probability of significant cortical re-organization. Additionally, a blind individual has unique priorities, such as the ability to read Braille by touch and to sense the environment primarily by sound, compared with a non-vision-impaired person. For these reasons we mapped sensory, motor, speech, language, and auditory perception using a number of cortical electrophysiologic mapping and functional magnetic resonance imaging methods. Our data show promising results in the application of these adjunctive methods in the pre-operative mapping of otherwise difficult to localize, and highly variable, functional cortical areas. PMID:23914170

  13. Microzonation Mapping Of The Yanbu Industrial City, Western Saudi Arabia: A Multicriteria Decision Analysis Approach

    NASA Astrophysics Data System (ADS)

    Moustafa, Sayed, Sr.; Alarifi, Nassir S.; Lashin, Aref A.

    2016-04-01

    Urban areas along the western coast of Saudi Arabia are susceptible to natural disasters and environmental damage due to a lack of planning. To produce a site-specific microzonation map of the rapidly growing Yanbu industrial city, the spatial distributions of different hazard entities are assessed using the Analytic Hierarchy Process (AHP) together with a Geographical Information System (GIS). For this purpose six hazard parameter layers are considered, namely: fundamental frequency, site amplification, soil strength in terms of effective shear-wave velocity, overburden sediment thickness, seismic vulnerability index and peak ground acceleration. The weight and rank values determined during the AHP are assigned to each layer and its corresponding classes, respectively. An integrated seismic microzonation map was derived using the GIS platform. Based on the derived map, the study area is classified into five hazard categories: very low, low, moderate, high, and very high. The western and central parts of the study area, as indicated by the derived microzonation map, are categorized as a high hazard zone compared to the other surrounding areas. The produced microzonation map of the current study is envisaged as a first-level assessment of the site-specific hazards in the Yanbu city area, which can be used as a platform by different stakeholders in any future land-use planning and environmental hazard management.

  14. Remote sensing approach to map riparian vegetation of the Colorado River Ecosystem, Grand Canyon area, Arizona

    NASA Astrophysics Data System (ADS)

    Nguyen, U.; Glenn, E.; Nagler, P. L.; Sankey, J. B.

    2015-12-01

    Riparian zones in the southwestern U.S. are usually a mosaic of vegetation types at varying states of succession in response to past floods or droughts. Human impacts also affect riparian vegetation patterns. Human-induced changes include the introduction of exotic species, diversion of water for human use, channelization of the river to protect property, and other land use changes that can lead to deterioration of the riparian ecosystem. This study explored the use of remote sensing to map an iconic stretch of the Colorado River in Grand Canyon National Park, Arizona. The pre-dam riparian zone in the Grand Canyon was affected by annual floods from spring run-off from the watersheds of the Green River, the Colorado River and the San Juan River. A pixel-based vegetation map of the riparian zone in the Grand Canyon, Arizona, was produced from high-resolution aerial imagery. The map was calibrated and validated with ground survey data. A seven-step image processing and classification procedure was developed based on a suite of vegetation indices and classification subroutines available in the ENVI Image Processing and Analysis software. The result was a quantitative species-level vegetation map that could be more accurate than the qualitative, polygon-based maps presently used on the Lower Colorado River. The dominant woody species in the Grand Canyon are now saltcedar, arrowweed and mesquite, reflecting stress-tolerant forms adapted to the altered flow regimes associated with river regulation.

  15. Segmentation and automated measurement of chronic wound images: probability map approach

    NASA Astrophysics Data System (ADS)

    Ahmad Fauzi, Mohammad Faizal; Khansa, Ibrahim; Catignani, Karen; Gordillo, Gayle; Sen, Chandan K.; Gurcan, Metin N.

    2014-03-01

    An estimated 6.5 million patients in the United States are affected by chronic wounds, with more than 25 billion US dollars and countless hours spent annually on all aspects of chronic wound care. There is a need to develop software tools that analyze wound images, characterize wound tissue composition, measure wound size, and monitor changes over time. This process, when done manually, is time-consuming and subject to intra- and inter-reader variability. In this paper, we propose a method that can characterize chronic wounds containing granulation, slough and eschar tissues. First, we generate a Red-Yellow-Black-White (RYKW) probability map, which then guides the region growing segmentation process. The red, yellow and black probability maps are designed to handle the granulation, slough and eschar tissues, respectively, while the white probability map is designed to detect the white label card used for measurement calibration. The innovative aspects of this work include: 1) definition of a wound-specific probability map for segmentation; 2) computationally efficient region growing on a 4D map; 3) auto-calibration of measurements with the content of the image. The method was applied to 30 wound images provided by the Ohio State University Wexner Medical Center, with the ground truth independently generated by the consensus of two clinicians. While the inter-reader agreement between the readers is 85.5%, the computer achieves an accuracy of 80%.

  16. A recursive approach to the O(n) model on random maps via nested loops

    NASA Astrophysics Data System (ADS)

    Borot, G.; Bouttier, J.; Guitter, E.

    2012-02-01

    We consider the O(n) loop model on tetravalent maps and show how to rephrase it into a model of bipartite maps without loops. This follows from a combinatorial decomposition that consists in cutting the O(n) model configurations along their loops so that each elementary piece is a map that may have arbitrary even face degrees. In the induced statistics, these maps are drawn according to a Boltzmann distribution whose parameters (the face weights) are determined by a fixed point condition. In particular, we show that the dense and dilute critical points of the O(n) model correspond to bipartite maps with large faces (i.e. whose degree distribution has a fat tail). The re-expression of the fixed point condition in terms of linear integral equations allows us to explore the phase diagram of the model. In particular, we determine this phase diagram exactly for the simplest version of the model where the loops are ‘rigid’. Several generalizations of the model are discussed.

  17. Analytic boosted boson discrimination

    NASA Astrophysics Data System (ADS)

    Larkoski, Andrew J.; Moult, Ian; Neill, Duff

    2016-05-01

    Observables which discriminate boosted topologies from massive QCD jets are of great importance for the success of the jet substructure program at the Large Hadron Collider. Such observables, while both widely and successfully used, have been studied almost exclusively with Monte Carlo simulations. In this paper we present the first all-orders factorization theorem for a two-prong discriminant based on a jet shape variable, D_2, valid for both signal and background jets. Our factorization theorem simultaneously describes the production of both collinear and soft subjets, and we introduce a novel zero-bin procedure to correctly describe the transition region between these limits. By proving an all-orders factorization theorem, we enable a systematically improvable description, and allow for precision comparisons between data, Monte Carlo, and first principles QCD calculations for jet substructure observables. Using our factorization theorem, we present numerical results for the discrimination of a boosted Z boson from massive QCD background jets. We compare our results with Monte Carlo predictions which allows for a detailed understanding of the extent to which these generators accurately describe the formation of two-prong QCD jets, and informs their usage in substructure analyses. Our calculation also provides considerable insight into the discrimination power and calculability of jet substructure observables in general.

  18. The Cancer Experience Map: An Approach to Including the Patient Voice in Supportive Care Solutions

    PubMed Central

    2015-01-01

    The perspective of the patient, also called the “patient voice”, is an essential element in materials created for cancer supportive care. Identifying that voice, however, can be a challenge for researchers and developers. A multidisciplinary team at a health information company tasked with addressing this issue created a representational model they call the “cancer experience map”. This map, designed as a tool for content developers, offers a window into the complex perspectives inside the cancer experience. Informed by actual patient quotes, the map shows common overall themes for cancer patients, concerns at key treatment points, strategies for patient engagement, and targeted behavioral goals. In this article, the team members share the process by which they created the map as well as its first use as a resource for cancer support videos. The article also addresses the broader policy implications of including the patient voice in supportive cancer content, particularly with regard to mHealth apps. PMID:26022846

  19. Mapping and monitoring the health and vitality of coral reefs from satellite: a biospheric approach.

    PubMed

    Dustan, P; Chakrabarti, S; Alling, A

    2000-01-01

    Biospheric studies of coral reefs require a planetary perspective that only remote sensing from space can provide. This article reviews aspects of monitoring and mapping coral reefs using Landsat and Spot satellite images. It details design considerations for developing a sensor for equatorial orbiting spacecraft, including spectral characteristics of living corals and the spatial resolution required to map coral reef communities. Possible instrumentation choices include computer techniques, filtered imagers, push-broom spectral imagery, and a newly developed hyperspectral imaging scheme using tomographic reconstruction. We compare the salient features of each technique and describe concepts for a payload to conduct planetary-scale coral reef monitoring. PMID:11543553

  20. Mapped Grid Methods Applied to the Slow Variable Discretization-Enhanced Renormalized Numerov Approach.

    PubMed

    Blandon, Juan; Parker, Gregory A; Madrid, Christopher

    2016-02-11

    We introduce a hyperspherical coordinate mapping procedure to the slow variable discretization-enhanced renormalized Numerov method that allows for more accurate and cost-effective calculations of cold and ultracold atom-dimer scattering. The mapping procedure allows optimization of the numerical grid point spacing by adjusting to the shape of the interaction potential. We show results for elastic scattering in He-H2 and compare the results to previous MOLSCAT calculations by Forrey et al. [ Phys. Rev. A 1999, 59, 2146 ]. PMID:26797269

  1. Complex approach to long-term multi-agent mapping in low dynamic environments

    NASA Astrophysics Data System (ADS)

    Shvets, Evgeny A.; Nikolaev, Dmitry P.

    2015-12-01

    In this paper we consider the problem of multi-agent continuous mapping of a changing, low dynamic environment. The mapping problem is a well-studied one; however, the use of multiple agents and operation in a non-static environment complicate it and present a handful of challenges (e.g. double-counting, robust data association, memory and bandwidth limits). All these problems are interrelated, but are very rarely considered together, despite the fact that each has drawn the attention of researchers. In this paper we devise an architecture that solves the considered problems in an internally consistent manner.

  2. A multi-method approach for benthic habitat mapping of shallow coastal areas with high-resolution multibeam data

    NASA Astrophysics Data System (ADS)

    Micallef, Aaron; Le Bas, Timothy P.; Huvenne, Veerle A. I.; Blondel, Philippe; Hühnerbach, Veit; Deidun, Alan

    2012-05-01

    The coastal waters of the Maltese Islands, central Mediterranean Sea, sustain a diversity of marine habitats and support a wide range of human activities. The islands' shallow waters are characterised by a paucity of hydrographic and marine geo-environmental data, which is problematic in view of the requirements of the Maltese Islands to assess the state of their coastal waters by 2012 as part of the EU Marine Strategy Directive. Multibeam echosounder (MBES) systems are today recognised as one of the most effective tools to map the seafloor, although the quantitative characterisation of MBES data for seafloor and habitat mapping is still an underdeveloped field. The purpose of this study is to outline a semi-automated, Geographic Information System-based methodology to map the distribution of habitats in shallow coastal waters using high-resolution MBES data. What distinguishes our methodology from those proposed in previous studies is the combination of a suite of geomorphometric and textural analytical techniques to map specific types of seafloor morphologies and compositions; the selection of the techniques is based on identifying which geophysical parameter would be influenced by the seabed type under consideration. We tested our approach in a 28 km2 area of Maltese coastal waters. Three data sets were collected from this study area: (i) MBES bathymetry and backscatter data; (ii) Remotely Operated Vehicle imagery and (iii) photographs and sediment samples from dive surveys. The seabed was classified into five elementary morphological zones and features - flat and sloping zones, crests, depressions and breaks of slope - using morphometric derivatives, the Bathymetric Position Index and geomorphometric mapping. Segmentation of the study area into seagrass-covered and unvegetated seafloor was based on roughness estimation. Further subdivision of these classes into the four predominant types of composition - medium sand, maërl associated with sand and gravel

  3. Target attractor tracking of relative phase in Bosonic Josephson junction

    NASA Astrophysics Data System (ADS)

    Borisenok, Sergey

    2016-06-01

    The relative phase of a Bosonic Josephson junction in the Josephson regime of the Bose-Hubbard model is tracked via the target attractor ('synergetic') feedback algorithm, with the inter-well coupling parameter serving as the control function. The efficiency of our approach is demonstrated numerically for Gaussian and harmonic types of target phases.

  4. Two-dimensional thermofield bosonization II: Massive fermions

    SciTech Connect

    Amaral, R.L.P.G.

    2008-11-15

    We consider the perturbative computation of the N-point function of chiral densities of massive free fermions at finite temperature within the thermofield dynamics approach. The infinite series in the mass parameter for the N-point functions are computed in the fermionic formulation and compared with the corresponding perturbative series in the interaction parameter in the bosonized thermofield formulation. Thereby we establish in thermofield dynamics the formal equivalence of the massive free fermion theory with the sine-Gordon thermofield model for a particular value of the sine-Gordon parameter. We extend the thermofield bosonization to include the massive Thirring model.

  5. Spectral domains for bosonic pair creation in static electromagnetic fields

    NASA Astrophysics Data System (ADS)

    Lv, Q. Z.; Li, Y. J.; Grobe, R.; Su, Q.

    2016-04-01

    We study the emission spectrum of bosons created from the vacuum by combined static electric and magnetic fields. Depending on the spatial extension of the magnetic field, we find four regimes of pair creation, characterized by different growth behaviors of the number of the produced particles. We show that these regimes manifest themselves in the eigenenergy spectrum of the Klein-Gordon Hamiltonian. The regimes also lead to rather different kinetic energy spectra of the emitted bosons, whose peak positions can be obtained from a generalized Fano-like perturbative approach.

  6. A physical approach of vulnerability mapping in karst area based on a new interpretation of tracer breakthrough curves

    NASA Astrophysics Data System (ADS)

    Bailly-Comte, V.; Pistre, S.

    2011-12-01

    Strategies for groundwater protection mostly rely on maps of vulnerability to contamination; therefore, many methods have been developed since the 1990s, with particular attention to operational techniques. These easy-to-use methods are based on the superposition of relative rating systems applied to critical hydrogeological factors; their major drawback is the subjectivity of the determination of the rating scale and the weighting coefficients. Thus, in addition to vulnerability mapping, empirical results given by tracer tests are often needed to better assess groundwater vulnerability to accidental contamination in complex hydrosystems such as karst aquifers. This means that a lot of data on tracer breakthrough curves (BTCs) in karst areas are now available to hydrologists. In this context, we propose a physical approach to spatially distributed simulation of tracer BTCs based on macrodispersive transport in 1D. A new interpretation of tracer tests performed in various media is shown as a validation of our theoretical development. The vulnerability map is then given by the properties of the simulated tracer BTC (modal time, mean residence time, duration over a given concentration threshold, etc.). In this way, our method expresses vulnerability with units, which makes comparison from one system to another possible. In addition, previous or new tracer tests can be used to validate the map under the same hydrological conditions. Even if this methodology is not limited to karst hydrosystems, it seems particularly suitable for these complex environments, for which understanding the origin of accidental contamination is crucial.

  7. A combined segmentation and pixel-based classification approach of QuickBird imagery for land cover mapping

    NASA Astrophysics Data System (ADS)

    Wang, Jianmei; Li, Deren; Qin, Wenzhong

    2005-10-01

    Recent advances in remote-sensing technology suggest that satellite-based earth observation (EO) has great potential for providing and updating spatial information in a timely and cost-effective manner. However, with the improvement of the spatial resolution of satellite imagery, image detail has become more complicated. Even when texture features are included, conventional pixel-based classification methods have had limited success with multi-spectral high-resolution satellite imagery. In order to take better advantage of the spatial information in high-resolution satellite imagery, a combined segmentation and pixel-based classification approach is presented in this paper. First, a pixel-based multi-spectral maximum-likelihood classification produces an initial classification result. Second, an image segmentation is created by watershed transform and region merging. Finally, the final classification map is obtained from the proportions of each class present in each segment. A QuickBird image of a suburban area of Shanghai, China, is used to validate the proposed method. Experiments show that the classification map produced by the combined approach is free of visual noise, has clean borders, and has better classification accuracy than that produced by the pixel-based classification approach alone.

  8. A novel lidar-driven two-level approach for real-time unmanned ground vehicle navigation and map building

    NASA Astrophysics Data System (ADS)

    Luo, Chaomin; Krishnan, Mohan; Paulik, Mark; Cui, Bo; Zhang, Xingzhong

    2013-12-01

    In this paper, a two-level LIDAR-driven hybrid approach is proposed for real-time unmanned ground vehicle navigation and map building. The top level is a newly designed enhanced Voronoi Diagram (EVD) method that plans a global trajectory for an unmanned vehicle. The bottom level employs the Vector Field Histogram (VFH) algorithm, based on the LIDAR sensor information, to locally guide the vehicle through complicated workspaces, in which it autonomously traverses from one node to another within the planned EVD with obstacle avoidance. To find the least-cost path within the EVD, novel distance- and angle-based search heuristic algorithms are developed, in which the cost of an edge is the risk of traversing it. An EVD is first constructed based on the environment and used to generate the initial global trajectory with obstacle avoidance. The VFH algorithm is employed to guide the vehicle along the path locally. The effectiveness and efficiency of real-time navigation and map building for unmanned vehicles have been validated by simulation studies and experiments. The proposed approach was also tested on an actual unmanned vehicle to demonstrate its real-time navigation and map-building performance. The vehicle follows a very stable path while navigating through various obstacles.

  9. Decision and function problems based on boson sampling

    NASA Astrophysics Data System (ADS)

    Nikolopoulos, Georgios M.; Brougham, Thomas

    2016-07-01

    Boson sampling is a mathematical problem that is strongly believed to be intractable for classical computers, whereas passive linear interferometers can produce samples efficiently. So far, the problem remains a computational curiosity, and the possible usefulness of boson-sampling devices is mainly limited to the proof of quantum supremacy. The purpose of this work is to investigate whether boson sampling can be used as a resource of decision and function problems that are computationally hard, and may thus have cryptographic applications. After the definition of a rather general theoretical framework for the design of such problems, we discuss their solution by means of a brute-force numerical approach, as well as by means of nonboson samplers. Moreover, we estimate the sample sizes required for their solution by passive linear interferometers, and it is shown that they are independent of the size of the Hilbert space.

  10. An energy balance approach for mapping crop water stress and yield impacts over the Czech Republic

    Technology Transfer Automated Retrieval System (TEKTRAN)

    There is a growing demand for timely, spatially distributed information regarding crop condition and water use to inform agricultural decision making and yield forecasting efforts. Remote sensing of land-surface temperature has proven valuable for mapping evapotranspiration (ET) and crop stress from...

  11. Evaluation of data analytic approaches to generating cross-domain mappings of controlled science vocabularies

    NASA Astrophysics Data System (ADS)

    Zednik, S.

    2015-12-01

    Recent data publication practices have made increasing amounts of diverse datasets available online for the general research community to explore and integrate. Even with the abundance of data online, relevant data discovery and successful integration is still highly dependent upon the data being published with well-formed and understandable metadata. Tagging a dataset with well-known or controlled community terms is a common mechanism to indicate the intended purpose, subject matter, or other relevant facts of a dataset; however, controlled domain terminology can be difficult for cross-domain researchers to interpret and leverage. It is also a challenge for integration portals to successfully provide cross-domain search capabilities over data holdings described using many different controlled vocabularies. Mappings between controlled vocabularies can be challenging because communities frequently develop specialized terminologies and have highly specific and contextual usages of common words. Despite this specificity, it is highly desirable to produce cross-domain mappings to support data integration. In this contribution we evaluate the applicability of several data analytic techniques for the purpose of generating mappings between hierarchies of controlled science terms. We hope our efforts initiate more discussion on the topic and encourage future mapping efforts.

  12. Mapping a Mutation in "Caenorhabditis elegans" Using a Polymerase Chain Reaction-Based Approach

    ERIC Educational Resources Information Center

    Myers, Edith M.

    2014-01-01

    Many single nucleotide polymorphisms (SNPs) have been identified within the "Caenorhabditis elegans" genome. SNPs present in the genomes of two isogenic "C. elegans" strains have been routinely used as a tool in forward genetics to map a mutation to a particular chromosome. This article describes a laboratory exercise in which…

  13. A data fusion approach for mapping daily evapotranspiration at field scale

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The capability for mapping water consumption over cropped landscapes on a daily and seasonal basis is increasingly relevant given forecasted scenarios of reduced water availability. Prognostic modeling of water losses to the atmosphere, or evapotranspiration (ET), at field or finer scales in agricul...

  14. A Robust Approach for Mapping Group Marks to Individual Marks Using Peer Assessment

    ERIC Educational Resources Information Center

    Spatar, Ciprian; Penna, Nigel; Mills, Henny; Kutija, Vedrana; Cooke, Martin

    2015-01-01

    Group work can form a substantial component of degree programme assessments. To satisfy institutional and student expectations, students must often be assigned individual marks for their contributions to the group project, typically by mapping a single holistic group mark to individual marks using peer assessment scores. Since the early 1990s,…

  15. B1 mapping with a pure phase encode approach: Quantitative density profiling

    NASA Astrophysics Data System (ADS)

    Vashaee, S.; Newling, B.; MacMillan, B.; Balcom, B. J.

    2013-07-01

    In MRI, it is frequently observed that naturally uniform samples do not have uniform image intensities. In many cases this non-uniform image intensity is due to an inhomogeneous B1 field. The 'principle of reciprocity' states that the received signal is proportional to the local magnitude of the applied B1 field per unit current. Inhomogeneity in the B1 field results in signal intensity variations that limit the ability of MRI to yield quantitative information. In this paper a novel method is described for mapping B1 inhomogeneities, based on measurement of the B1 field employing centric-scan pure phase encode MRI measurements. The resultant B1 map may be employed to correct related non-uniformities in MR images. The new method is based on acquiring successive images with systematically incremented low flip angle excitation pulses. The local image intensity variation is proportional to B1^2, which ensures high sensitivity to B1 field variations. Pure phase encoding ensures the resultant B1 field maps are free from distortions caused by susceptibility variation, chemical shift and paramagnetic impurities. Hence, the method works well in regions of space that are not accessible to other methods, such as the vicinity of conductive metallic structures, including the RF probe itself. Quantitative density images result when the centric-scan pure phase encode measurement is corrected with a relative or absolute B1 field map. The new technique is simple, reliable and robust.
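
    A rough sketch of the fitting step, under the stated low-flip-angle assumption: pixel intensity grows linearly with the excitation increment and carries one factor of B1 from transmit and one from receive, so the fitted slope at each pixel is proportional to B1^2. The function name, inputs and peak normalisation are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    def b1_map(images, rel_flip):
        """Estimate a relative B1 map from low-flip-angle images.

        images: array (N, H, W), one image per excitation amplitude.
        rel_flip: length-N relative flip-angle settings (e.g. 1, 2, 3, ...).
        The pixelwise slope of intensity vs. flip-angle setting ~ B1**2,
        so sqrt(slope) gives a map proportional to B1.
        """
        x = np.asarray(rel_flip, float)
        arr = np.asarray(images, float)
        y = arr.reshape(len(x), -1)
        # least-squares slope through the origin, computed for every pixel
        slope = (x @ y) / (x @ x)
        b1 = np.sqrt(np.clip(slope, 0, None)).reshape(arr.shape[1:])
        return b1 / b1.max()  # relative map, peak normalised
    ```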

  16. Approaches for mapping and monitoring arid rangelands with object-based image analysis and hyperspatial imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    At the USDA Agricultural Research Service Jornada Experimental Range (JER) in southern New Mexico, remote sensing research is focused on finding new methods for mapping and monitoring rangelands, and on relating ground-based surveys to remotely sensed information. This presentation will give an over...

  17. Effects of Concept Mapping Instruction Approach on Students' Achievement in Basic Science

    ERIC Educational Resources Information Center

    Ogonnaya, Ukpai Patricia; Okafor, Gabriel; Abonyi, Okechukwu S.; Ugama, J. O.

    2016-01-01

    The study investigated the effects of concept mapping on students' achievement in basic science. The study was carried out in Ebonyi State of Nigeria. The study employed a quasi-experimental design. Specifically the pretest posttest non-equivalent control group research design was used. The sample was 122 students selected from two secondary…

  18. Mapping Patterns of Perceptions: A Community-Based Approach to Cultural Competence Assessment

    ERIC Educational Resources Information Center

    Davis, Tamara S.

    2007-01-01

    Unclear definitions and limited system-level assessment measures inhibit cultural responsiveness in children's mental health. This study explores an alternative method to conceptualize and assess cultural competence in four children's mental health systems of care communities from family and professional perspectives. Concept Mapping was used to…

  19. Daily field-scale ET mapping using a data fusion approach

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Land-surface evapotranspiration (ET) transfers water from soil and vegetation into atmosphere and is affected by drought, disease, insects, vegetation amount and phenology, and soil texture, fertility and salinity. The ability to accurately map daily ET at field scale will provide useful information...

  20. The Social Maps of Children Approaching Adolescence: Studying the Ecology of Youth Development.

    ERIC Educational Resources Information Center

    Garbarino, James; And Others

    This paper reports the first results of a three-year longitudinal study of the social maps of children beginning the transition to adolescence. This exploratory study is guided by Bronfenbrenner's conception of the ecology of human development stressing the importance of a phenomenological orientation to development in the context of ecological…

  1. The Social Maps of Children Approaching Adolescence: Studying the Ecology of Youth Development.

    ERIC Educational Resources Information Center

    Garbarino, James; And Others

    1978-01-01

    The social maps of preadolescent (grade 6) children were investigated. The focus was on characteristics of children's social networks as a function of neighborhood type, socioeconomic status, and level of physical maturation. The social heterogeneity of the network (peer vs adult salience) was of primary concern. (Author/RD)

  2. Putting vulnerability to climate change on the map: a review of approaches, benefits, and risks

    SciTech Connect

    Preston, Benjamin L

    2011-01-01

    There is growing demand among stakeholders across public and private institutions for spatially-explicit information regarding vulnerability to climate change at the local scale. However, the challenges associated with mapping the geography of climate change vulnerability are non-trivial, both conceptually and technically, suggesting the need for more critical evaluation of this practice. Here, we review climate change vulnerability mapping in the context of four key questions that are fundamental to assessment design. First, what are the goals of the assessment? A review of published assessments yields a range of objective statements that emphasize problem orientation or decision-making about adaptation actions. Second, how is the assessment of vulnerability framed? Assessments vary with respect to what values are assessed (vulnerability of what) and the underlying determinants of vulnerability that are considered (vulnerability to what). The selected frame ultimately influences perceptions of the primary driving forces of vulnerability as well as preferences regarding management alternatives. Third, what are the technical methods by which an assessment is conducted? The integration of vulnerability determinants into a common map remains an emergent and subjective practice associated with a number of methodological challenges. Fourth, who participates in the assessment and how will it be used to facilitate change? Assessments are often conducted under the auspices of benefiting stakeholders, yet many lack direct engagement with stakeholders. Each of these questions is reviewed in turn by drawing on an illustrative set of 45 vulnerability mapping studies appearing in the literature. A number of pathways for placing vulnerability

  3. An Innovative Approach to Scheme Learning Map Considering Tradeoff Multiple Objectives

    ERIC Educational Resources Information Center

    Lin, Yu-Shih; Chang, Yi-Chun; Chu, Chih-Ping

    2016-01-01

    An important issue in personalized learning is to provide learners with customized learning according to their learning characteristics. This paper focused attention on scheming learning map as follows. The learning goal can be achieved via different pathways based on alternative materials, which have the relationships of prerequisite, dependence,…

  4. Magnetoresistance of an Anderson insulator of bosons.

    PubMed

    Gangopadhyay, Anirban; Galitski, Victor; Müller, Markus

    2013-07-12

    We study the magnetoresistance of two-dimensional bosonic Anderson insulators. We describe the change in spatial decay of localized excitations in response to a magnetic field, which is given by an interference sum over alternative tunneling trajectories. The excitations become more localized with increasing field (in sharp contrast to generic fermionic excitations, which get weakly delocalized): the localization length ξ(B) is found to change as ξ^(-1)(B) - ξ^(-1)(0) ~ B^(4/5). The quantum interference problem maps onto the classical statistical mechanics of directed polymers in random media (DPRM). We explain the observed scaling using a simplified droplet model which incorporates the nontrivial DPRM exponents. Our results have implications for a variety of experiments on magnetic-field-tuned superconductor-to-insulator transitions observed in disordered films, granular superconductors, and Josephson junction arrays, as well as for cold atoms in artificial gauge fields. PMID:23889427
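
    As a quick numerical illustration (not from the paper), a power-law scaling of this form can be recovered from data by a log-log fit of the field-induced shift of the inverse localization length; the prefactor 0.3 and the field range below are arbitrary assumptions.

    ```python
    import numpy as np

    # Synthetic check of a scaling of the form xi^-1(B) - xi^-1(0) ~ B^(4/5):
    # generate data obeying the law, then recover the exponent by a log-log fit.
    B = np.linspace(0.1, 2.0, 40)          # magnetic field (arbitrary units)
    xi_inv0 = 1.0                          # zero-field inverse localization length
    xi_inv = xi_inv0 + 0.3 * B ** 0.8      # assumed prefactor 0.3, exponent 4/5

    exponent, log_prefactor = np.polyfit(np.log(B), np.log(xi_inv - xi_inv0), 1)
    print(round(exponent, 3))   # -> 0.8
    ```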

  5. A new approach for deriving Flood hazard maps from SAR data and global hydrodynamic models

    NASA Astrophysics Data System (ADS)

    Matgen, P.; Hostache, R.; Chini, M.; Giustarini, L.; Pappenberger, F.; Bally, P.

    2014-12-01

    With the flood consequences likely to amplify because of the growing population and ongoing accumulation of assets in flood-prone areas, global flood hazard and risk maps are needed for improving flood preparedness at large scale. At the same time, with the rapidly growing archives of SAR images of floods, there is a high potential of making use of these images for global and regional flood management. In this framework, an original method that integrates global flood inundation modeling and microwave remote sensing is presented. It takes advantage of the combination of the time and space continuity of a global inundation model with the high spatial resolution of satellite observations. The availability of model simulations over a long time period offers opportunities for estimating flood non-exceedance probabilities in a robust way. These probabilities can be attributed to historical satellite observations. Time series of SAR-derived flood extent maps and associated non-exceedance probabilities can then be combined to generate flood hazard maps with a spatial resolution equal to that of the satellite images, which is most of the time higher than that of a global inundation model. In principle, this can be done for any area of interest in the world, provided that a sufficient number of relevant remote sensing images are available. As a test case we applied the method on the Severn River (UK) and the Zambezi River (Mozambique), where large archives of Envisat flood images can be exploited. The global ECMWF flood inundation model is considered for computing the statistics of extreme events. A comparison with flood hazard maps estimated with in situ measured discharge is carried out. The first results confirm the potential of the method. However, further developments on two aspects are required to improve the quality of the hazard map and to ensure the acceptability of the product by potential end user organizations.
On the one hand, it is of paramount importance to

  6. Near Real-time Visualization of the Coastal Ocean: A Google Maps Approach

    NASA Astrophysics Data System (ADS)

    Terrill, E.; Reuter, P.; Hazard, L.; Otero, M.; Cook, T.; Bowen, J.

    2009-12-01

    The Coastal Observing R&D Center (CORDC) at Scripps Institution of Oceanography has developed and implemented real-time data management and display tools for use in the Google Maps environment. A primary use of these tools is for displaying data measured, aggregated, and distributed by a regional observing system. CORDC developed and continues to maintain these tools that are now in use by a broad suite of end users, including local, state and federal agencies, resource managers, industry, policy makers, educators, scientists and the general public for the Southern California Coastal Ocean Observing System (SCCOOS). These data feeds encompass the ongoing monitoring of a broad suite of ocean observing data including, but not limited to: surface currents, satellite imagery, wave conditions and forecasts, meteorological conditions and forecasts, water quality, bathymetry, ocean temperature, salinity, chlorophyll, and density in the form of data products and raw data. By leveraging Google Maps, this effort has achieved seamless integration of disparate datasets into a unifying, low latency interface for on-line visualization and interaction. The resulting interfaces have brought national attention to the public display of data that SCCOOS serves, notably for ease of use and navigation. While the Google Maps API provides basic capabilities for spatially zooming and panning, developers are able to extend the API to include customized temporal browsing of spatial maps, information displays, legends, dynamic color scaling and interactive data queries resulting in time series or other point/slice plots. Visualizations of products with more mature Google Maps interfaces will be presented, including statewide ocean surface currents, meteorological observations, ship tracking and output from operational ocean current models that pose the additional challenges of 4D data sets. Current developments involving product layering and API extensions will also be presented.

  7. A Karnaugh map based approach towards systemic reviews and meta-analysis.

    PubMed

    Hassan, Abdul Wahab; Hassan, Ahmad Kamal

    2016-01-01

    The study of meta-analysis and systemic reviews has long helped us reconcile numerous parallel or conflicting studies. Existing studies are presented in tabulated forms that contain appropriate information for specific cases yet are difficult to visualize. In meta-analysis of data, this can lead to absorption and subsumption errors, with the undesirable potential of consecutive misunderstandings in social and operational methodologies. The purpose of this study is to investigate an alternative forum for meta-data presentation that relies on humans' strong pictorial perception capability. Analysis of big data is assumed to be a complex and daunting task often reserved for the computational powers of machines, yet there exist mapping tools that can analyze such data in a hand-held manner. Data analysis on such a scale can benefit from the use of statistical tools like Karnaugh maps, where all studies can be put together in a graph-based mapping. Such a formulation can lead to more control in observing patterns in the research community and in analyzing further for uncertainty and reliability metrics. We present a methodological process of converting a well-established study in health care to its equivalent binary representation, followed by furnishing the values onto a Karnaugh map. The data used for the studies presented herein are from Burns et al. (J Publ Health 34(1):138-148, 2011), consisting of retrospectively collected data sets from various studies on clinical coding data accuracy. Using a customized filtration process, a total of 25 studies were selected for review, with no, partial, or complete knowledge of six independent variables, thus forming 64 independent cells on a Karnaugh map. The study concluded that this pictorial graphing, as expected, helped simplify the overview of meta-analyses and systemic reviews. PMID:27064957
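
    A minimal sketch of how six binary study attributes could be placed on an 8x8 Karnaugh map with standard Gray-coded axes (64 cells, as in the abstract). The split of the first three variables onto rows and the last three onto columns is an assumption for illustration, not the authors' layout.

    ```python
    def gray(n):
        """Binary-reflected Gray code of n."""
        return n ^ (n >> 1)

    def kmap_cell(bits):
        """Place a study with six binary attributes on an 8x8 Karnaugh map.

        bits: sequence of six 0/1 flags. The first three index the row,
        the last three the column. Gray ordering of the axes keeps
        adjacent cells differing in exactly one variable, as on a
        standard K-map.
        """
        row_val = bits[0] * 4 + bits[1] * 2 + bits[2]
        col_val = bits[3] * 4 + bits[4] * 2 + bits[5]
        # position on each axis is where the Gray-coded label equals the value
        row = [gray(i) for i in range(8)].index(row_val)
        col = [gray(i) for i in range(8)].index(col_val)
        return row, col
    ```

    With 25 studies, each study contributes a mark in one of the 64 cells, and clusters of marked adjacent cells become visually apparent.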

  8. Geospatial Predictive Modelling for Climate Mapping of Selected Severe Weather Phenomena Over Poland: A Methodological Approach

    NASA Astrophysics Data System (ADS)

    Walawender, Ewelina; Walawender, Jakub P.; Ustrnul, Zbigniew

    2016-02-01

    The main purpose of the study is to introduce methods for mapping the spatial distribution of the occurrence of selected atmospheric phenomena (thunderstorms, fog, glaze and rime) over Poland from 1966 to 2010 (45 years). Limited in situ observations, as well as the discontinuous and location-dependent nature of these phenomena, make traditional interpolation inappropriate. Spatially continuous maps were created with the use of geospatial predictive modelling techniques. For each given phenomenon, an algorithm identifying its favourable meteorological and environmental conditions was created on the basis of observations recorded at 61 weather stations in Poland. Annual frequency maps presenting the probability of a day with a thunderstorm, fog, glaze or rime were created with the use of a modelled, gridded dataset by implementing the predefined algorithms. Relevant explanatory variables were derived from the NCEP/NCAR reanalysis and downscaled with the use of a Regional Climate Model. The resulting maps of favourable meteorological conditions were found to be valuable and representative on the country scale, but with different correlation (r) strengths against in situ data (from r = 0.84 for thunderstorms to r = 0.15 for fog). A weak correlation between gridded estimates of fog occurrence and observation data indicated the very local nature of this phenomenon. For this reason, additional environmental predictors of fog occurrence were also examined. Topographic parameters derived from the SRTM elevation model and reclassified CORINE Land Cover data were used as the external explanatory variables for the multiple linear regression kriging used to obtain the final map. The regression model explained 89 % of the variability in annual fog frequency in the study area. Regression residuals were interpolated via simple kriging.

  9. Spin models and boson sampling

    NASA Astrophysics Data System (ADS)

    Garcia Ripoll, Juan Jose; Peropadre, Borja; Aspuru-Guzik, Alan

    Aaronson & Arkhipov showed that predicting the measurement statistics of random linear optics circuits (i.e. boson sampling) is a classically hard problem for highly non-classical input states. A typical boson-sampling circuit requires N single photon emitters and M photodetectors, and it is a natural idea to rely on few-level systems for both tasks. Indeed, we show that 2M two-level emitters at the input and output ports of a general M-port interferometer interact via an XY-model with collective dissipation and a large number of dark states that could be used for quantum information storage. More important is the fact that, when we neglect dissipation, the resulting long-range XY spin-spin interaction is equivalent to boson sampling under the same conditions that make boson sampling efficient. This allows efficient implementations of boson sampling using quantum simulators & quantum computers. We acknowledge support from Spanish Mineco Project FIS2012-33022, CAM Research Network QUITEMAD+ and EU FP7 FET-Open Project PROMISCE.

  10. Gaussian Multiple Instance Learning Approach for Mapping the Slums of the World Using Very High Resolution Imagery

    SciTech Connect

    Vatsavai, Raju

    2013-01-01

    In this paper, we present a computationally efficient algorithm based on multiple instance learning for mapping informal settlements (slums) using very high-resolution remote sensing imagery. From a remote sensing perspective, informal settlements share unique spatial characteristics that distinguish them from other urban structures like industrial, commercial, and formal residential settlements. However, regular pattern recognition and machine learning methods, which are predominantly single-instance or per-pixel classifiers, often fail to accurately map the informal settlements as they do not capture the complex spatial patterns. To overcome these limitations we employed a multiple instance based machine learning approach, where groups of contiguous pixels (image patches) are modeled as generated by a Gaussian distribution. We have conducted several experiments on very high-resolution satellite imagery, representing four unique geographic regions across the world. Our method showed consistent improvement in accurately identifying informal settlements.
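
    The patch-as-Gaussian idea can be sketched as follows: each class is summarised by the mean and covariance of its training pixels, and a whole patch (a bag of pixels) is assigned to the class giving the smallest average squared Mahalanobis distance. This is an illustrative reduction of the multiple-instance approach, not the paper's actual implementation.

    ```python
    import numpy as np

    def fit_gaussian(pixels):
        """Mean and inverse covariance of a class from training pixels (n, d)."""
        mu = pixels.mean(axis=0)
        # small ridge keeps the covariance invertible for degenerate samples
        cov = np.cov(pixels, rowvar=False) + 1e-6 * np.eye(pixels.shape[1])
        return mu, np.linalg.inv(cov)

    def classify_patch(patch, models):
        """Assign an image patch (n_pixels, d) to the class whose Gaussian
        gives the smallest mean squared Mahalanobis distance over the patch."""
        scores = {}
        for label, (mu, cov_inv) in models.items():
            diff = patch - mu
            # per-pixel quadratic form diff^T cov_inv diff, then averaged
            scores[label] = np.mean(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))
        return min(scores, key=scores.get)
    ```

    Scoring the patch as a unit, rather than each pixel independently, is what lets the classifier pick up the patch-level texture that per-pixel classifiers miss.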

  11. A Monte Carlo simulation based two-stage adaptive resonance theory mapping approach for offshore oil spill vulnerability index classification.

    PubMed

    Li, Pu; Chen, Bing; Li, Zelin; Zheng, Xiao; Wu, Hongjing; Jing, Liang; Lee, Kenneth

    2014-09-15

    In this paper, a Monte Carlo simulation based two-stage adaptive resonance theory mapping (MC-TSAM) model was developed to classify a given site into distinguished zones representing different levels of offshore Oil Spill Vulnerability Index (OSVI). It consisted of an adaptive resonance theory (ART) module, an ART Mapping module, and a centroid determination module. Monte Carlo simulation was integrated with the TSAM approach to address uncertainties that widely exist in site conditions. The applicability of the proposed model was validated by classifying a large coastal area, which was surrounded by potential oil spill sources, based on 12 features. Statistical analysis of the results indicated that the classification process was affected by multiple features instead of one single feature. The classification results also provided the least or desired number of zones which can sufficiently represent the levels of offshore OSVI in an area under uncertainty and complexity, saving time and budget in spill monitoring and response. PMID:25044043

  12. A multi-temporal fusion-based approach for land cover mapping in support of nuclear incident response

    NASA Astrophysics Data System (ADS)

    Sah, Shagan

    An increasingly important application of remote sensing is to provide decision support during emergency response and disaster management efforts. Land cover maps constitute one such useful application product during disaster events; if generated rapidly after any disaster, such map products can contribute to the efficacy of the response effort. In light of recent nuclear incidents, e.g., after the earthquake/tsunami in Japan (2011), our research focuses on constructing rapid and accurate land cover maps of the impacted area in case of an accidental nuclear release. The methodology involves integration of results from two different approaches, namely coarse spatial resolution multi-temporal and fine spatial resolution imagery, to increase classification accuracy. Although advanced methods have been developed for classification using high spatial or temporal resolution imagery, only a limited amount of work has been done on fusion of these two remote sensing approaches. The presented methodology thus involves integration of classification results from two different remote sensing modalities in order to improve classification accuracy. The data used included RapidEye and MODIS scenes over the Nine Mile Point Nuclear Power Station in Oswego (New York, USA). The first step in the process was the construction of land cover maps from freely available, high temporal resolution, low spatial resolution MODIS imagery using a time-series approach. We used the variability in the temporal signatures among different land cover classes for classification. The time series-specific features were defined by various physical properties of a pixel, such as variation in vegetation cover and water content over time. The pixels were classified into four land cover classes - forest, urban, water, and vegetation - using Euclidean and Mahalanobis distance metrics. On the other hand, a high spatial resolution commercial satellite, such as RapidEye, can be tasked to capture images over the

  13. Mapping shallow lakes in a large South American floodplain: A frequency approach on multitemporal Landsat TM/ETM data

    NASA Astrophysics Data System (ADS)

    Borro, Marta; Morandeira, Natalia; Salvia, Mercedes; Minotti, Priscilla; Perna, Pablo; Kandus, Patricia

    2014-05-01

    We propose a methodology to identify and map shallow lakes (SL) in the Paraná River floodplain, the largest freshwater wetland ecosystem in temperate South America. The presence and number of SL offer various ecosystem services and habitats for wildlife biodiversity. Our approach involved a frequency analysis over a 1987-2010 time series of the Normalized Difference Vegetation Index (NDVI), derived from Landsat 5 and 7 TM/ETM data. Through descriptive statistics of samples of pixels and field work in different types of SL, we established an NDVI threshold of 0.34 below which we assumed the presence of water in each pixel. The standard deviation of the estimated SL area decreases with the number of images in the analysis, being less than 10% when at least 30 images are used. The mean SL area for the whole period was 112,691 ha (10.9% of the study area). The influence of the hydrological conditions on the resulting SL map was evaluated by analyzing twelve sets of images, which were selected to span the whole period and different time frames according to multiannual dry and wet periods and to relative water level within each period. The Kappa index was then calculated between pairs of resulting SL maps. We compared our maps with the available national and international cartographic documents and with other published maps that used one or a few Landsat images. Landsat images time series provide an accurate spatial and temporal resolution for SL identification in floodplains, particularly in temperate zones with a good provision of cloud free images. The method evaluated in this paper considers the dynamics of SL and reduces the uncertainties of the fuzzy boundaries. Thus, it provides a robust database of SL and its temporal behavior to establish future monitoring programs based on the recent launch of Landsat 8 satellite.
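
    The frequency analysis can be sketched directly. The 0.34 NDVI threshold is taken from the abstract; the function names and the 0.5 frequency cut-off for the final mask are illustrative assumptions.

    ```python
    import numpy as np

    NDVI_THRESHOLD = 0.34  # below this, a pixel is assumed to hold water

    def water_frequency(ndvi_stack):
        """Fraction of scenes in which each pixel falls below the NDVI threshold.

        ndvi_stack: array (n_images, H, W) of NDVI values; NaN = cloud/no data.
        Returns the per-pixel water frequency in [0, 1] over the valid scenes.
        """
        wet = ndvi_stack < NDVI_THRESHOLD            # NaN compares as False
        valid = ~np.isnan(ndvi_stack)
        return wet.sum(axis=0) / np.maximum(valid.sum(axis=0), 1)

    def shallow_lake_mask(ndvi_stack, min_freq=0.5):
        """Map pixels flagged as water in at least min_freq of the scenes."""
        return water_frequency(ndvi_stack) >= min_freq
    ```

    Averaging over many scenes is what damps the fuzzy, flood-dependent lake boundaries that a single-image classification would inherit.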

  14. A spatial, statistical approach to map the risk of milk contamination by β-hexachlorocyclohexane in dairy farms.

    PubMed

    Battisti, Sabrina; Caminiti, Antonino; Ciotoli, Giancarlo; Panetta, Valentina; Rombolà, Pasquale; Sala, Marcello; Ubaldi, Alessandro; Scaramozzino, Paola

    2013-11-01

    In May 2005, beta-hexachlorocyclohexane (β-HCH) was found in a sample of bovine bulk milk from a farm in the Sacco River valley (Latium region, central Italy). The primary source of contamination was suspected to be industrial discharge into the environment, with the Sacco River as the main means of dispersion. Since then, a surveillance programme on bulk milk of the local farms was carried out by the veterinary services. In order to estimate the spatial probability of β-HCH contamination of milk produced in the Sacco River valley and draw probability maps of contamination, probability maps of β-HCH values in milk were estimated by indicator kriging (IK), a geo-statistical estimator, and by traditional logistic regression (LR) combined with a geographical information systems approach. The former technique produces a spatial view of probabilities above a specific threshold at non-sampled locations on the basis of observed values in the area, while LR gives the probabilities at specific locations on the basis of certain environmental predictors, namely the distance from the river, the distance from the pollution site, the elevation above the river level and the intrinsic vulnerability of hydro-geological formations. Based on the β-HCH data from 2005 in the Sacco River valley, the two techniques resulted in similar maps of high risk of milk contamination. However, unlike the IK method, the LR model was capable of estimating coefficients that could be used in case of future pollution episodes. The approach presented produces probability maps and defines high-risk areas already in the early stages of an emergency, before sampling operations have been carried out. PMID:24258885
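
    The LR side of the comparison can be sketched with a plain gradient-descent logistic regression over environmental predictors of the kind named in the abstract; the optimiser settings and function names are arbitrary assumptions, not the authors' actual model.

    ```python
    import numpy as np

    def fit_logistic(X, y, lr=0.1, steps=5000):
        """Plain gradient-descent logistic regression (no external libraries).

        X: (n, d) environmental predictors, e.g. distance from the river,
        distance from the pollution site, elevation above river level,
        hydro-geological vulnerability. y: 0/1 contamination indicator.
        """
        Xb = np.hstack([np.ones((len(X), 1)), X])   # add intercept column
        w = np.zeros(Xb.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-Xb @ w))       # predicted probabilities
            w -= lr * Xb.T @ (p - y) / len(y)       # mean log-loss gradient
        return w

    def predict_proba(X, w):
        """Contamination probability at new locations from fitted weights."""
        Xb = np.hstack([np.ones((len(X), 1)), X])
        return 1.0 / (1.0 + np.exp(-Xb @ w))
    ```

    The fitted weights are the transferable part the abstract highlights: unlike kriging, they can score a new location (or a future episode) from its predictors alone.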

  15. Nonequilibrium functional bosonization of quantum wire networks

    SciTech Connect

    Ngo Dinh, Stephane; Bagrets, Dmitry A.; Mirlin, Alexander D.

    2012-11-15

    We develop a general approach to nonequilibrium nanostructures formed by one-dimensional channels coupled by tunnel junctions and/or by impurity scattering. The formalism is based on a nonequilibrium version of functional bosonization. A central role in this approach is played by the Keldysh action, which has a form reminiscent of the theory of full counting statistics. To proceed with the evaluation of physical observables, we assume the weak-tunneling regime and develop a real-time instanton method. A detailed exposition of the formalism is supplemented by two important applications: (i) tunneling into a biased Luttinger liquid with an impurity, and (ii) quantum Hall Fabry-Perot interferometry. Highlights: A nonequilibrium functional bosonization framework for quantum wire networks is developed. For the study of observables in the weak-tunneling regime, a real-time instanton method is elaborated. We consider tunneling into a biased Luttinger liquid with an impurity. We analyze electronic Fabry-Perot interferometers in the integer quantum Hall regime.

  16. Mapping Habitats and Developing Baselines in Offshore Marine Reserves with Little Prior Knowledge: A Critical Evaluation of a New Approach

    PubMed Central

    Lawrence, Emma; Hayes, Keith R.; Lucieer, Vanessa L.; Nichol, Scott L.; Dambacher, Jeffrey M.; Hill, Nicole A.; Barrett, Neville; Kool, Johnathan; Siwabessy, Justy

    2015-01-01

    The recently declared Australian Commonwealth Marine Reserve (CMR) Network covers a total of 3.1 million km2 of continental shelf, slope, and abyssal habitat. Managing and conserving the biodiversity values within this network requires knowledge of the physical and biological assets that lie within its boundaries. Unfortunately, very little is known about the habitats and biological assemblages of the continental shelf within the network, where diversity is richest and anthropogenic pressures are greatest. Effective management of the CMR estate into the future requires this knowledge gap to be filled efficiently and quantitatively. The challenge is particularly great for the shelf because multibeam echosounder (MBES) mapping, a key tool for identifying and quantifying habitat distribution, is time-consuming at shallow depths, so full-coverage mapping of the CMR shelf assets is unrealistic in the medium term. Here we report on the results of a study undertaken in the Flinders Commonwealth Marine Reserve (southeast Australia) designed to test the benefits of two approaches to characterising shelf habitats: (i) MBES mapping of a continuous (~30 km2) area selected on the basis of its potential to include a range of seabed habitats representative of the wider area, versus (ii) a novel approach that uses targeted mapping of a greater number of smaller, but spatially balanced, locations using a Generalized Random Tessellation Stratified (GRTS) sample design. We present the first quantitative estimates of habitat type and sessile biological communities on the shelf of the Flinders reserve, the former based on three MBES analysis techniques. We contrast the quality of information that the two survey approaches offer in combination with the three MBES analysis methods. The GRTS approach enables design-based estimates of habitat types and sessile communities and also identifies potential biodiversity hotspots in the northwest corner of the reserve’s IUCN zone IV, and

  17. Mapping Habitats and Developing Baselines in Offshore Marine Reserves with Little Prior Knowledge: A Critical Evaluation of a New Approach.

    PubMed

    Lawrence, Emma; Hayes, Keith R; Lucieer, Vanessa L; Nichol, Scott L; Dambacher, Jeffrey M; Hill, Nicole A; Barrett, Neville; Kool, Johnathan; Siwabessy, Justy

    2015-01-01

    The recently declared Australian Commonwealth Marine Reserve (CMR) Network covers a total of 3.1 million km2 of continental shelf, slope, and abyssal habitat. Managing and conserving the biodiversity values within this network requires knowledge of the physical and biological assets that lie within its boundaries. Unfortunately, very little is known about the habitats and biological assemblages of the continental shelf within the network, where diversity is richest and anthropogenic pressures are greatest. Effective management of the CMR estate into the future requires this knowledge gap to be filled efficiently and quantitatively. The challenge is particularly great for the shelf because multibeam echosounder (MBES) mapping, a key tool for identifying and quantifying habitat distribution, is time-consuming at shallow depths, so full-coverage mapping of the CMR shelf assets is unrealistic in the medium term. Here we report on the results of a study undertaken in the Flinders Commonwealth Marine Reserve (southeast Australia) designed to test the benefits of two approaches to characterising shelf habitats: (i) MBES mapping of a continuous (~30 km2) area selected on the basis of its potential to include a range of seabed habitats representative of the wider area, versus (ii) a novel approach that uses targeted mapping of a greater number of smaller, but spatially balanced, locations using a Generalized Random Tessellation Stratified (GRTS) sample design. We present the first quantitative estimates of habitat type and sessile biological communities on the shelf of the Flinders reserve, the former based on three MBES analysis techniques. We contrast the quality of information that the two survey approaches offer in combination with the three MBES analysis methods. The GRTS approach enables design-based estimates of habitat types and sessile communities and also identifies potential biodiversity hotspots in the northwest corner of the reserve's IUCN zone IV, and in

  18. Density and temperature of bosons from quantum fluctuations

    NASA Astrophysics Data System (ADS)

    Zheng, Hua; Giuliani, Gianluca; Bonasera, Aldo

    2012-10-01

    A method to determine the density and temperature of a system is proposed based on quantum fluctuations typical of bosons in the limit where the temperature T is close to the critical temperature Tc for a Bose-Einstein condensate (BEC) at a given density ρ. Quadrupole and particle-multiplicity fluctuations are derived using Landau's theory of fluctuations near the critical point. As an example, we apply our approach to heavy-ion collisions using the Constrained Molecular Dynamics model (CoMD), which includes Fermi statistics. The model shows some clusterization into deuteron (d) and alpha (α) clusters, but not enough to reproduce the available experimental data. We propose a modification of the collision term in the approach to include the possibility of α-α collisions. The relevant Bose-Einstein factor in the collision term is properly taken into account. This approach increases the yields of bosons relative to fermions, bringing them closer to the data. Boson fluctuations become larger than 1, as expected. If confirmed, a new field of research could open up for mixtures of strongly interacting fermions and bosons, which require novel techniques both theoretically and experimentally.
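    The bosonic enhancement of fluctuations mentioned at the end of the abstract can be illustrated with the textbook single-mode result (not the CoMD model itself): a thermally occupied boson mode has a geometric number distribution, so its normalized variance is 1 + n̄ > 1, whereas classical (Poisson) counting gives exactly 1.

```python
import numpy as np

rng = np.random.default_rng(1)

# Thermal (chaotic) occupation of a single bosonic mode is geometrically
# distributed: P(n) = (1 - x) * x**n with x = exp(-(e - mu)/T).
# Mean nbar = x/(1-x); variance nbar*(1+nbar), so the normalized
# variance var/mean = 1 + nbar exceeds 1 -- the bosonic enhancement the
# abstract refers to.  Poisson (classical) counting would give exactly 1.
nbar = 2.0
x = nbar / (1 + nbar)                              # geometric parameter
samples = rng.geometric(1 - x, size=200_000) - 1   # numpy counts from 1

mean = samples.mean()
norm_var = samples.var() / mean
# Expected: mean ~ nbar = 2, normalized variance ~ 1 + nbar = 3.
```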

  19. Mapping the Monte Carlo scheme to Langevin dynamics: a Fokker-Planck approach.

    PubMed

    Cheng, X Z; Jalil, M B A; Lee, Hwee Kuan; Okabe, Yutaka

    2006-02-17

    We propose a general method of using the Fokker-Planck equation (FPE) to link the Monte Carlo (MC) and the Langevin micromagnetic schemes. We derive the drift and diffusion FPE terms corresponding to the MC method and show that it is analytically equivalent to the stochastic Landau-Lifshitz-Gilbert (LLG) equation of Langevin-based micromagnetics. Subsequent results such as the time-quantification factor for the Metropolis MC method can be rigorously derived from this mapping equivalence. The validity of the mapping is shown by the close numerical convergence between the MC method and the LLG equation for the case of a single magnetic particle as well as interacting arrays of particles. We also find that our Metropolis MC method is accurate for a large range of damping factors alpha, unlike previous time-quantified MC methods which break down at low alpha, where precessional motion dominates. PMID:16606044
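    As a toy instance of the MC side of this mapping (not the authors' micromagnetic code), the sketch below runs Metropolis Monte Carlo for a single uniaxial macrospin with energy E(θ) = K sin²θ in units of kT; the cone angle of the trial moves is the knob that time-quantification arguments relate to the Langevin/LLG time step.

```python
import numpy as np

rng = np.random.default_rng(7)

# Metropolis Monte Carlo for a single uniaxial macrospin.
# Energy (in units of kT): E(theta) = K*sin(theta)**2, easy axis along z.
K = 5.0          # anisotropy barrier height / kT
cone = 0.3       # trial-move cone half-angle; sets the effective time step

def energy(th):
    return K * np.sin(th) ** 2

theta = 0.5
samples = []
for step in range(60_000):
    trial = theta + rng.uniform(-cone, cone)
    # Reflect at the poles so theta stays in [0, pi] (symmetric proposal).
    if trial < 0.0:
        trial = -trial
    elif trial > np.pi:
        trial = 2 * np.pi - trial
    # Metropolis acceptance with the sin(theta) measure of the sphere.
    dE = energy(trial) - energy(theta)
    ratio = (np.sin(trial) / max(np.sin(theta), 1e-12)) * np.exp(-dE)
    if rng.random() < ratio:
        theta = trial
    if step >= 10_000:                      # discard burn-in
        samples.append(np.cos(theta) ** 2)

mean_cos2 = float(np.mean(samples))  # concentrates near 1 for large K/kT
```

    For K = 5 the equilibrium average of cos²θ is roughly 0.8, i.e. the moment stays mostly aligned with the easy axis, which the sampled chain reproduces.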

  20. Mapping the Monte Carlo Scheme to Langevin Dynamics: A Fokker-Planck Approach

    NASA Astrophysics Data System (ADS)

    Cheng, X. Z.; Jalil, M. B.; Lee, Hwee Kuan; Okabe, Yutaka

    2006-02-01

    We propose a general method of using the Fokker-Planck equation (FPE) to link the Monte Carlo (MC) and the Langevin micromagnetic schemes. We derive the drift and diffusion FPE terms corresponding to the MC method and show that it is analytically equivalent to the stochastic Landau-Lifshitz-Gilbert (LLG) equation of Langevin-based micromagnetics. Subsequent results such as the time-quantification factor for the Metropolis MC method can be rigorously derived from this mapping equivalence. The validity of the mapping is shown by the close numerical convergence between the MC method and the LLG equation for the case of a single magnetic particle as well as interacting arrays of particles. We also find that our Metropolis MC method is accurate for a large range of damping factors α, unlike previous time-quantified MC methods which break down at low α, where precessional motion dominates.

  1. A symbiotic approach to SETI observations: use of maps from the Westerbork Synthesis Radio Telescope

    NASA Technical Reports Server (NTRS)

    Tarter, J. C.; Israel, F. P.

    1982-01-01

    High-spatial-resolution continuum radio maps produced by the Westerbork Synthesis Radio Telescope (WSRT) of The Netherlands at frequencies near the 21 cm HI line have been examined for anomalous sources of emission coincident with the locations of nearby bright stars. From a total of 542 stellar positions investigated, no candidates for radio stars or ETI signals were discovered, to formal limits on the minimum detectable signal ranging from 7.7 x 10^-22 W/m^2 to 6.4 x 10^-24 W/m^2. This preliminary study has verified that data collected by radio astronomers at large synthesis arrays can profitably be analysed for SETI signals (in a non-interfering manner) provided only that the data are available in a more or less standard two-dimensional map format.

  2. Mapping a mutation in Caenorhabditis elegans using a polymerase chain reaction-based approach.

    PubMed

    Myers, Edith M

    2014-01-01

    Many single nucleotide polymorphisms (SNPs) have been identified within the Caenorhabditis elegans genome. SNPs present in the genomes of two isogenic C. elegans strains have been routinely used as a tool in forward genetics to map a mutation to a particular chromosome. This article describes a laboratory exercise in which undergraduate students use molecular biological techniques to map a mutation to a chromosome using a set of SNPs. Through this multi-week exercise, students perform genetic crosses, DNA extraction, polymerase chain reaction, restriction enzyme digests, agarose gel electrophoresis, and analysis of restriction fragment length polymorphisms. Students then analyze their results to deduce the chromosomal location of the mutation. Students also use bioinformatics websites to develop hypotheses that link the genotype to the phenotype. PMID:24615818

  3. Mass Movement Susceptibility in the Western San Juan Mountains, Colorado: A Preliminary 3-D Mapping Approach

    NASA Astrophysics Data System (ADS)

    Kelkar, K. A.; Giardino, J. R.

    2015-12-01

    Mass movement is a major process that impacts the lives of humans and their infrastructure. Human activity in steep, mountainous regions is especially at risk from this potential hazard. Thus, the identification and quantification of risk, by mapping and determining mass movement susceptibility, are fundamental to protecting lives and resources and to ensuring proper land use regulation and planning. Specific mass-movement processes including debris flows, rock falls, snow avalanches and landslides continuously modify the landscape of the San Juan Mountains. Historically, large-magnitude slope failures have repeatedly occurred in the region. Common triggers include intense, long-duration precipitation, freeze-thaw processes, human activity and various volcanic lithologies overlying weaker sedimentary formations. Predicting mass movement is challenging because of its episodic and spatially discontinuous occurrence. Landslides in mountain terrain are typically widespread, highly mobile and active over long durations. We developed a 3-D model for landslide susceptibility using Geographic Information Systems Technology (GIST). The study area encompasses eight USGS quadrangles: Ridgway, Dallas, Mount Sneffels, Ouray, Telluride, Ironton, Ophir and Silverton. Fieldwork consisted of field reconnaissance mapping at 1:5,000, focusing on surficial geomorphology. Field mapping was used to identify potential locations, which then received additional onsite investigation and photographic documentation of features indicative of slope failure. A GIS module was created using seven terrain spatial databases: geology, surficial geomorphology (digitized), slope aspect, slope angle, vegetation, soils and distance to infrastructure to map risk. The GIS database will help determine risk zonation for the study area. Correlations between terrain parameters leading to slope failure were determined through the GIS module. 
This 3-D model will provide a spatial perspective of the landscape to

  4. A Machine Learning Approach to Mapping Agricultural Fields Across Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Debats, S. R.; Fuchs, T. J.; Thompson, D. R.; Estes, L. D.; Evans, T. P.; Caylor, K. K.

    2013-12-01

    Food production in sub-Saharan Africa is dominated by smallholder agriculture. Rainfed farming practices and the prevailing dryland conditions render crop yields vulnerable to increasing climatic variability. As a result, smallholder farmers are among the poorest and most food-insecure groups in the region's population. Quantifying the distribution of smallholder agriculture across sub-Saharan Africa would greatly assist efforts to boost food security. Existing agricultural land cover data sets are limited to estimating the percentage of cropland within a coarse grid cell. The goal of this research is to develop a statistical machine learning algorithm to map individual agricultural fields, mirroring the accuracy of hand-digitization. For the algorithm, a random forest pixel-wise classifier learns by example from training data to distinguish between fields and non-fields. The algorithm then applies this training to classify previously unseen data. These classifications can then be smoothed into coherent regions corresponding to agricultural fields. Our training data set consists of hand-digitized boundaries of agricultural fields in South Africa, commissioned by its government in 2008. Working with 1 km x 1 km scenes across South Africa, we match the hand-digitized field boundaries with satellite images extracted from Google Maps. To highlight different information contained within the images, several image processing filters are applied. The inclusion of Landsat images for additional training information is also explored. After training and testing the algorithm in South Africa, we aim to expand our mapping efforts across sub-Saharan Africa. Through Princeton's Mapping Africa project, crowdsourcing will produce additional training data sets of hand-digitized field boundaries in new areas of interest. This algorithm and the resulting data sets will provide previously unavailable information at an unprecedented level of detail on the largest and most
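    A minimal stand-in for the pixel-wise classifier can be sketched with bagged decision stumps, which combine the two ingredients of a random forest (bootstrap resampling and random feature selection) without any ML library; the scene, features, and labels below are synthetic placeholders for the hand-digitized South African training data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy scene: a 32x32 "satellite image" with two per-pixel features
# (brightness and a texture-filter response, stand-ins for the image
# processing filters mentioned above).  Field pixels (label 1) occupy
# a square block and are brighter and smoother.
H = W = 32
labels = np.zeros((H, W), dtype=int)
labels[8:24, 8:24] = 1
brightness = labels * 0.8 + rng.normal(0, 0.25, (H, W))
texture = (1 - labels) * 0.6 + rng.normal(0, 0.25, (H, W))
X = np.column_stack([brightness.ravel(), texture.ravel()])
y = labels.ravel()

def fit_stump(X, y, feat):
    """Best single threshold on one feature (a depth-1 tree)."""
    best = (0.5, 1, -np.inf)                 # (threshold, sign, score)
    for t in np.quantile(X[:, feat], np.linspace(0.05, 0.95, 19)):
        for sign in (1, -1):
            pred = (sign * (X[:, feat] - t) > 0).astype(int)
            score = (pred == y).mean()
            if score > best[2]:
                best = (t, sign, score)
    return best[:2]

# "Forest": bootstrap the pixels and pick a random feature per stump --
# bagging plus feature randomness, the core of a random forest.
stumps = []
for _ in range(25):
    idx = rng.integers(0, len(y), len(y))
    feat = int(rng.integers(0, X.shape[1]))
    t, sign = fit_stump(X[idx], y[idx], feat)
    stumps.append((feat, t, sign))

# Majority vote over the ensemble, reshaped back into a field map.
votes = sum((s * (X[:, f] - t) > 0).astype(int) for f, t, s in stumps)
pred_map = (votes > len(stumps) / 2).astype(int).reshape(H, W)
accuracy = (pred_map.ravel() == y).mean()
```

    A real system would replace the stumps with full decision trees and follow the classification with the smoothing step described in the abstract.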

  5. Contemporary approaches to studying and mapping of active water exchange zone of ground water

    NASA Astrophysics Data System (ADS)

    Moraru, C. Ye

    2016-03-01

    The article deals with the zone of active ground water exchange. New principles for studying and mapping this zone under platform hydrogeological conditions are discussed. Assessment and distribution techniques are suggested for the active water exchange zone under conditions of hydrogeological parameterization uncertainty. The efficiency and significance of the suggested techniques are demonstrated using the example of ground water in the southwest of the Black Sea artesian basin.

  6. Assessment of Ice Shape Roughness Using a Self-Organizing Map Approach

    NASA Technical Reports Server (NTRS)

    Mcclain, Stephen T.; Kreeger, Richard E.

    2013-01-01

    Self-organizing maps are neural-network techniques for representing noisy, multidimensional data aligned along a lower-dimensional and nonlinear manifold. For a large set of noisy data, each element of a finite set of codebook vectors is iteratively moved in the direction of the data closest to the winning codebook vector. Through successive iterations, the codebook vectors begin to align with the trends of the higher-dimensional data. Prior investigations of ice shapes have focused on using self-organizing maps to characterize mean ice forms. The Icing Research Branch has recently acquired a high-resolution three-dimensional scanner system capable of resolving ice-shape surface roughness. A method is presented for the evaluation of surface roughness variations using high-resolution surface scans based on a self-organizing map representation of the mean ice shape. The new method is demonstrated for 1) an 18-in. NACA 23012 airfoil at 2° AOA just after initial ice coverage of the leading 5% of the suction surface of the airfoil, 2) a 21-in. NACA 0012 at 0° AOA following coverage of the leading 10% of the airfoil surface, and 3) a cold-soaked 21-in. NACA 0012 airfoil without ice. The SOM method resulted in descriptions of the statistical coverage limits and a quantitative representation of the early stages of ice roughness formation on the airfoils. Limitations of the SOM method are explored, and the uncertainty limits of the method are investigated using the non-iced NACA 0012 airfoil measurements.
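    The core SOM update described above (move the winning codebook vector and its neighbours toward each sample while the learning rate and neighbourhood shrink) can be sketched on synthetic data, here a noisy 1-D manifold standing in for roughness measurements.

```python
import numpy as np

rng = np.random.default_rng(3)

# Noisy 2-D data along a 1-D manifold (a sine arc), standing in for
# multidimensional roughness measurements.
t = rng.uniform(0, np.pi, 1000)
data = np.column_stack([t, np.sin(t)]) + rng.normal(0, 0.05, (1000, 2))

# A 1-D chain of codebook vectors, initialized along the data's x-extent.
n_codes = 12
codes = np.column_stack([np.linspace(0, np.pi, n_codes), np.zeros(n_codes)])

for epoch in range(40):
    lr = 0.5 * (1 - epoch / 40)                 # decaying learning rate
    width = max(2.0 * (1 - epoch / 40), 0.5)    # shrinking neighbourhood
    for x in data[rng.permutation(len(data))]:
        winner = np.argmin(((codes - x) ** 2).sum(axis=1))
        # Gaussian neighbourhood along the chain: the winner moves most,
        # its chain neighbours move less.
        h = np.exp(-((np.arange(n_codes) - winner) ** 2) / (2 * width ** 2))
        codes += lr * h[:, None] * (x - codes)

# After training, the codebook vectors should trace the sine arc.
resid = np.abs(codes[:, 1] - np.sin(codes[:, 0]))
```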

  7. a Virtual Hub Brokering Approach for Integration of Historical and Modern Maps

    NASA Astrophysics Data System (ADS)

    Bruno, N.; Previtali, M.; Barazzetti, L.; Brumana, R.; Roncella, R.

    2016-06-01

    Geospatial data are today more and more widespread. Many different institutions, such as geographical institutes, public administrations, collaborative communities (e.g., OSM) and web companies, nowadays make available a large number of maps. Besides this cartography, projects for digitizing, georeferencing and web publication of historical maps have increasingly spread in recent years. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required, and problems of interconnection between different data sources and their restricted interoperability limit a wide utilization of available geo-data. This paper describes some actions performed to assure interoperability between data, in particular spatial and geographic data, gathered from different data providers, with different features and referring to different historical periods. The article summarizes and exemplifies how, starting from projects of historical map digitizing and historical GIS implementation, for Lombardy and for the city of Parma respectively, interoperability is achieved in the framework of the ENERGIC OD project. The European project ENERGIC OD, thanks to a specific component - the virtual hub - based on a brokering framework, copes with the problems listed above and allows interoperability between different data sources.

  8. A thermodynamic approach to predict apparent contact angles on microstructures using surface polygonal maps.

    PubMed

    Calvimontes, A

    2014-11-01

    The thermodynamic model of wetting developed and tested in this work allows the understanding and prediction of apparent contact angles on topographic maps of real and digitally-generated microstructures. The model considers the solid component as a set of finite areal elements in the form of a polygonal map. Liquid and gas components are instead evaluated as continuous and incompressible volumes. In this study, the concept of the wetting topographic spectrum (WTS) is proposed to simulate the changes in the liquid-solid contact areas and of the interfacial energies while wetting the microstructure from the top to the bottom of the topographic map, passing through various states of metastable equilibrium, to find a stable configuration. The model was successfully applied to predict the wetting apparent contact angles on randomly micro-structured polypropylene (PP) surfaces and on a superhydrophobic and superoleophobic transparent polydimethylsiloxane (PDMS) microstructure previously presented as a communication in this journal by other authors. The method presented in this study can be used to design and predict the geometry of microstructures with special wetting characteristics. PMID:25192555
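    The paper's polygonal-map model is not reproduced here, but the classical Wenzel and Cassie-Baxter relations it generalizes make a useful baseline; the square-pillar geometry and intrinsic contact angle below are assumed values for illustration only.

```python
import numpy as np

# Classical baselines for apparent contact angles on microstructures
# (not the paper's polygonal-map model): Wenzel for fully wetted
# roughness, Cassie-Baxter for composite (air-trapping) contact.
theta_young = np.radians(104.0)   # assumed intrinsic angle (e.g. PDMS/water)

# Square-pillar texture: pillar width a, spacing b, height h (same units).
a, b, h = 10.0, 10.0, 15.0
phi_s = a**2 / (a + b)**2          # wetted solid fraction (Cassie)
r = 1 + 4 * a * h / (a + b)**2     # roughness ratio (Wenzel)

cos_wenzel = r * np.cos(theta_young)                 # cos(th*) = r cos(th)
cos_cassie = phi_s * (np.cos(theta_young) + 1) - 1   # cos(th*) = f(cos th + 1) - 1

theta_wenzel = np.degrees(np.arccos(np.clip(cos_wenzel, -1, 1)))
theta_cassie = np.degrees(np.arccos(np.clip(cos_cassie, -1, 1)))
```

    For an intrinsically hydrophobic surface both relations predict an amplified apparent angle, with the composite (Cassie) state the more repellent, which is the regime the superhydrophobic PDMS microstructure above exploits.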

  9. Compression map, functional groups and fossilization: A chemometric approach (Pennsylvanian neuropteroid foliage, Canada)

    USGS Publications Warehouse

    D'Angelo, J. A.; Zodrow, E.L.; Mastalerz, Maria

    2012-01-01

    Nearly all of the spectrochemical studies involving Carboniferous foliage of seed-ferns are based on a limited number of pinnules, mainly compressions. In contrast, in this paper we illustrate working with a larger pinnate segment, i.e., a 22-cm-long neuropteroid specimen, compression-preserved with cuticle: the compression map. The objective is to study preservation variability on a larger scale, where observation of the transparency/opacity of constituent pinnules is used as a first approximation for assessing the degree of pinnule coalification/fossilization. Spectrochemical methods by Fourier transform infrared (FTIR) spectrometry furnish semi-quantitative data for principal component analysis. The compression map shows a high degree of preservation variability, which ranges from comparatively more coalified pinnules to less coalified pinnules that resemble fossilized cuticles, noting that the pinnule midveins are preserved more like fossilized cuticles. A general overall trend from coalified pinnules towards fossilized cuticles, i.e., variable chemistry, is inferred from the semi-quantitative FTIR data, as higher contents of aromatic compounds occur in the visually more opaque upper location of the compression map. The latter also shows a higher condensation of the aromatic nuclei along with some variation in both ring size and degree of aromatic substitution. From principal component analysis we infer a correspondence between transparency/opacity observations and chemical information, which correlates to varying degrees with the fossilization/coalification among pinnules. © 2011 Elsevier B.V.

  10. An Approach to Optimize Size Parameters of Forging by Combining Hot-Processing Map and FEM

    NASA Astrophysics Data System (ADS)

    Hu, H. E.; Wang, X. Y.; Deng, L.

    2014-11-01

    The size parameters of a 6061 aluminum alloy rib-web forging were optimized by using a hot-processing map and the finite element method (FEM) based on high-temperature compression data. The results show that the stress level of the alloy can be represented by a Zener-Hollomon parameter in a hyperbolic sine-type equation with a hot-deformation activation energy of 343.7 kJ/mol. Dynamic recovery and dynamic recrystallization proceeded concurrently during high-temperature deformation of the alloy. Optimal hot-processing parameters for the alloy, corresponding to the peak efficiency value of 0.42, are 753 K and 0.001 s-1. The instability domain occurs at deformation temperatures lower than 653 K. FEM is an effective method to validate the hot-processing map in actual manufacture by analyzing the effect of corner radius, rib width, and web thickness on the workability of rib-web forging of the alloy. Size parameters of die forgings can be optimized conveniently by combining the hot-processing map and FEM.
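    The Zener-Hollomon parameter mentioned above, Z = ε̇ exp(Q/RT), is simple to evaluate; the sketch below uses the activation energy and the optimal condition quoted in the abstract (the extra strain-rate/temperature pairs are illustrative).

```python
import numpy as np

# Zener-Hollomon parameter Z = edot * exp(Q / (R*T)), with the hot
# deformation activation energy reported in the abstract.
R = 8.314            # gas constant, J/(mol K)
Q = 343.7e3          # activation energy from the abstract, J/mol

def zener_hollomon(strain_rate, T_kelvin):
    """Temperature-compensated strain rate (1/s)."""
    return strain_rate * np.exp(Q / (R * T_kelvin))

# Optimal hot-working condition quoted in the abstract: 753 K, 0.001 1/s.
Z_opt = zener_hollomon(1e-3, 753.0)

# Z rises when the temperature drops or the strain rate increases,
# i.e. deformation becomes harder (higher flow stress via the
# hyperbolic-sine law sigma ~ arcsinh((Z/A)**(1/n)) / alpha).
Z_cold = zener_hollomon(1e-3, 653.0)   # inside the instability domain
Z_fast = zener_hollomon(1.0, 753.0)
```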

  11. Exotic Gauge Bosons in the 331 Model

    SciTech Connect

    Romero, D.; Ravinez, O.; Diaz, H.; Reyes, J.

    2009-04-30

    We analyze the bosonic sector of the 331 model, which contains exotic leptons, quarks and bosons (E, J, U, V) in order to satisfy the weak gauge SU(3)_L invariance. We develop the Feynman rules of the entire kinetic bosonic sector, which will let us compute some of the Z(0)' decay modes.

  12. Hard-core lattice bosons: new insights from algebraic graph theory

    NASA Astrophysics Data System (ADS)

    Squires, Randall W.; Feder, David L.

    2014-03-01

    Determining the characteristics of hard-core lattice bosons is a problem of long-standing interest in condensed matter physics. While in one-dimensional systems the ground state can be formally obtained via a mapping to free fermions, various properties (such as correlation functions) are often difficult to calculate. In this work we discuss the application of techniques from algebraic graph theory to hard-core lattice bosons in one dimension. Graphs are natural representations of many-body Hamiltonians, with vertices representing Fock basis states and edges representing matrix elements. We prove that the graphs for hard-core bosons and non-interacting bosons have identical connectivity; the only difference is the existence of edge weights. A formal mapping between the two is therefore possible by manipulating the graph incidence matrices. We explore the implications of these insights, in particular the intriguing possibility that ground-state properties of hard-core bosons can be calculated directly from those of non-interacting bosons.
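    The graph construction described above is easy to make concrete: the sketch below builds the adjacency matrix for hard-core bosons hopping on a small open chain (the site and particle numbers are chosen arbitrarily) and checks that the resulting Fock-state graph is connected.

```python
import itertools
import numpy as np

# Vertices are hard-core Fock states (occupations 0/1) of N particles on
# L sites; edges connect states related by moving one particle to a
# neighbouring site (open boundary conditions).  The abstract's claim is
# that this connectivity matches the non-interacting model restricted to
# the same states; here we simply verify the hard-core graph is connected.
L, N = 6, 3
basis = [s for s in itertools.product((0, 1), repeat=L) if sum(s) == N]
index = {s: i for i, s in enumerate(basis)}

A = np.zeros((len(basis), len(basis)), dtype=int)
for s in basis:
    for i in range(L - 1):
        if s[i] == 1 and s[i + 1] == 0:      # hop right (reverse included
            t = list(s)                       # by symmetrizing A below)
            t[i], t[i + 1] = 0, 1
            j, k = index[s], index[tuple(t)]
            A[j, k] = A[k, j] = 1

# Connectivity check: (A + I)^n has no zero entry iff every state is
# reachable from every other state.
reach = np.linalg.matrix_power(A + np.eye(len(basis), dtype=int), len(basis))
connected = bool(np.all(reach > 0))
```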

  13. Dangers and Opportunities: A Conceptual Map of Information Literacy Assessment Approaches

    ERIC Educational Resources Information Center

    Oakleaf, Megan

    2008-01-01

    The culture of assessment in higher education requires academic librarians to demonstrate the impact of information literacy instruction on student learning. As a result, many librarians seek to gain knowledge about the information literacy assessment approaches available to them. This article identifies three major assessment approaches: (1)…

  14. A Concept Map Approach to Developing Collaborative Mindtools for Context-Aware Ubiquitous Learning

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Shi, Yen-Ru; Chu, Hui-Chun

    2011-01-01

    Recent advances in mobile and wireless communication technologies have enabled various new learning approaches which situate students in environments that combine real-world and digital-world learning resources; moreover, students are allowed to share knowledge or experiences with others during the learning process. Although such an approach seems…

  15. Modeling eye movements in visual agnosia with a saliency map approach: bottom-up guidance or top-down strategy?

    PubMed

    Foulsham, Tom; Barton, Jason J S; Kingstone, Alan; Dewhurst, Richard; Underwood, Geoffrey

    2011-08-01

    Two recent papers (Foulsham, Barton, Kingstone, Dewhurst, & Underwood, 2009; Mannan, Kennard, & Husain, 2009) report that neuropsychological patients with a profound object recognition problem (visual agnosic subjects) show differences from healthy observers in the way their eye movements are controlled when looking at images. The interpretation of these papers is that eye movements can be modeled as the selection of points on a saliency map, and that agnosic subjects show an increased reliance on visual saliency, i.e., on brightness and contrast in low-level stimulus features. Here we review this approach and present new data from our own experiments with an agnosic patient that quantify the relationship between saliency and fixation location. In addition, we consider whether the perceptual difficulties of individual patients might be modeled by selectively weighting the different features involved in a saliency map. Our data indicate that saliency is not always a good predictor of fixation in agnosia: even for our agnosic subject, as for normal observers, the saliency-fixation relationship varied as a function of the task. This means that top-down processes still have a significant effect on the earliest stages of scanning in the setting of visual agnosia, indicating severe limitations for the saliency map model. Top-down, active strategies, which are the hallmark of our human visual system, play a vital role in eye movement control, whether we know what we are looking at or not. PMID:21316191
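    One common way to quantify the saliency-fixation relationship discussed above is an ROC-style comparison of saliency values at fixated versus random control locations; everything below (the map, the fixations) is synthetic, not the patient data from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic saliency map on a 64x64 image: one bright (salient) blob.
H = W = 64
yy, xx = np.mgrid[0:H, 0:W]
saliency = np.exp(-(((yy - 20) ** 2 + (xx - 44) ** 2) / (2 * 8.0 ** 2)))

# Simulated fixations biased toward the blob, plus uniform controls.
fix_y = np.clip(rng.normal(20, 6, 200).astype(int), 0, H - 1)
fix_x = np.clip(rng.normal(44, 6, 200).astype(int), 0, W - 1)
ctrl_y = rng.integers(0, H, 200)
ctrl_x = rng.integers(0, W, 200)

s_fix = saliency[fix_y, fix_x]
s_ctrl = saliency[ctrl_y, ctrl_x]

# AUC: probability that a fixated point out-scores a random control
# point.  0.5 means saliency does not predict fixation at all; values
# near 1 mean strong saliency guidance.
auc = (s_fix[:, None] > s_ctrl[None, :]).mean()
```

    Repeating such an analysis per task and per subject is what reveals the task-dependence of the saliency-fixation relationship reported in the abstract.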

  16. Wavelet-based approaches for multiple hypothesis testing in activation mapping of functional magnetic resonance images of the human brain

    NASA Astrophysics Data System (ADS)

    Fadili, Jalal M.; Bullmore, Edward T.

    2003-11-01

    Wavelet-based methods for multiple hypothesis testing are described and their potential for activation mapping of human functional magnetic resonance imaging (fMRI) data is investigated. In this approach, we emphasize the convergence between methods of wavelet thresholding or shrinkage and the problem of multiple hypothesis testing in both classical and Bayesian contexts. Specifically, our interest is focused on ensuring a trade-off between type I error control and power. We describe a technique for controlling the false discovery rate at an arbitrary level of type I error in testing multiple wavelet coefficients generated by a 2D discrete wavelet transform (DWT) of spatial maps of fMRI time series statistics. We also describe and apply recursive testing methods that can be used to define a threshold unique to each level and orientation of the 2D-DWT. Bayesian methods, incorporating a formal model for the anticipated sparseness of wavelet coefficients representing the signal or true image, are also tractable. These methods are comparatively evaluated by analysis of "null" images (acquired with the subject at rest), in which case the number of positive tests should be exactly as predicted under the null hypothesis, and of an experimental dataset acquired from 5 normal volunteers during an event-related finger movement task. We show that all three wavelet-based methods of multiple hypothesis testing have good type I error control (the FDR method being most conservative) and generate plausible brain activation maps.
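    The FDR-controlling step can be sketched independently of the wavelet machinery: the Benjamini-Hochberg step-up procedure below is applied to simulated coefficient z-scores in which a sparse minority carry signal (the counts and effect sizes are invented, mirroring only the anticipated sparseness of a wavelet representation).

```python
from math import erfc

import numpy as np

rng = np.random.default_rng(5)

# Simulated "wavelet coefficients": mostly null noise, a sparse
# minority carrying signal.
n_null, n_signal = 2000, 60
z = np.concatenate([rng.normal(0, 1, n_null),       # null coefficients
                    rng.normal(4, 1, n_signal)])    # activated coefficients

# Two-sided p-values from the z-scores (normal survival via erfc).
p = np.array([erfc(abs(v) / np.sqrt(2)) for v in z])

# Benjamini-Hochberg step-up procedure at FDR level q: find the largest
# k with p_(k) <= q*k/m and reject the k smallest p-values.
q = 0.05
m = len(p)
order = np.argsort(p)
below = p[order] <= q * (np.arange(1, m + 1) / m)
k = np.max(np.where(below)[0]) + 1 if below.any() else 0
rejected = np.zeros(m, dtype=bool)
rejected[order[:k]] = True

n_discoveries = int(rejected.sum())
false_disc = int(rejected[:n_null].sum())       # nulls come first in z
fdp = false_disc / max(n_discoveries, 1)        # realized false-discovery
                                                # proportion, ~q on average
```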

  17. Approaches to vegetation mapping and ecophysiological hypothesis testing using combined information from TIMS, AVIRIS, and AIRSAR

    NASA Technical Reports Server (NTRS)

    Oren, R.; Vane, G.; Zimmermann, R.; Carrere, V.; Realmuto, V.; Zebker, Howard A.; Schoeneberger, P.; Schoeneberger, M.

    1991-01-01

    The Tropical Rainforest Ecology Experiment (TREE) had two primary objectives: (1) to design a method for mapping vegetation in tropical regions using remote sensing and determine whether the result improves on available vegetation maps; and (2) to test a specific hypothesis on plant/water relations. Both objectives were thought achievable with the combined information from the Thermal Infrared Multispectral Scanner (TIMS), Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), and Airborne Synthetic Aperture Radar (AIRSAR). Implicitly, two additional objectives were: (1) to ascertain that the range within each variable potentially measurable with the three instruments is large enough in the site, relative to the sensitivity of the instruments, so that differences between ecological groups may be detectable; and (2) to determine the ability of the three systems to quantify different variables and sensitivities. We found that the ranges in values of foliar nitrogen concentration, water availability, stand structure and species composition, and plant/water relations were large, even within the upland broadleaf vegetation type. The range was larger when other vegetation types were considered. Unfortunately, cloud cover and navigation errors compromised the utility of the TIMS and AVIRIS data. Nevertheless, the AIRSAR data alone appear to have improved on the available vegetation map for the study area. An example from an area converted to a farm is given to demonstrate how the combined information from AIRSAR, TIMS, and AVIRIS can uniquely identify distinct classes of land use. The example alludes to the potential utility of the three instruments for identifying vegetation at an ecological scale finer than vegetation types.

  18. A Quantitative Visual Mapping and Visualization Approach for Deep Ocean Floor Research

    NASA Astrophysics Data System (ADS)

    Hansteen, T. H.; Kwasnitschka, T.

    2013-12-01

    Geological fieldwork on the sea floor is still impaired by our inability to resolve features on a sub-meter scale resolution in a quantifiable reference frame and over an area large enough to reveal the context of local observations. In order to overcome these issues, we have developed an integrated workflow of visual mapping techniques leading to georeferenced data sets which we examine using state-of-the-art visualization technology to recreate an effective working style of field geology. We demonstrate a microbathymetrical workflow, which is based on photogrammetric reconstruction of ROV imagery referenced to the acoustic vehicle track. The advantage over established acoustical systems lies in the true three-dimensionality of the data as opposed to the perspective projection from above produced by downward looking mapping methods. A full color texture mosaic derived from the imagery allows studies at resolutions beyond the resolved geometry (usually one order of magnitude below the image resolution) while color gives additional clues, which can only be partly resolved in acoustic backscatter. The creation of a three-dimensional model changes the working style from the temporal domain of a video recording back to the spatial domain of a map. We examine these datasets using a custom developed immersive virtual visualization environment. The ARENA (Artificial Research Environment for Networked Analysis) features a (lower) hemispherical screen at a diameter of six meters, accommodating up to four scientists at once thus providing the ability to browse data interactively among a group of researchers. This environment facilitates (1) the development of spatial understanding analogue to on-land outcrop studies, (2) quantitative observations of seafloor morphology and physical parameters of its deposits, (3) more effective formulation and communication of working hypotheses.

  19. Dynamics of open bosonic quantum systems in coherent state representation

    SciTech Connect

    Dalvit, D. A. R.; Berman, G. P.; Vishik, M.

    2006-01-15

    We consider the problem of decoherence and relaxation of open bosonic quantum systems from a perspective alternative to the standard master equation or quantum trajectories approaches. Our method is based on the dynamics of expectation values of observables evaluated in a coherent state representation. We examine a model of a quantum nonlinear oscillator with a density-density interaction with a collection of environmental oscillators at finite temperature. We derive the exact solution for dynamics of observables and demonstrate a consistent perturbation approach.

  20. Aerial Mapping and Multi-Sensors Approaches from Remote Sensing Applied to the Roman Archaeological Heritage

    NASA Astrophysics Data System (ADS)

    Uribe, P.; Angás, J.; Pérez-Cabello, F.; de la Riva, J.; Bea, M.; Serreta, A.; Magallón, M. A.; Sáenz, C.; Martín-Bueno, M.

    2015-02-01

This report details the preliminary results of research focused on Roman archaeological heritage in the Middle Ebro Valley (Spain). The principal objective of this project was to obtain several different readings by means of a UAV equipped with different sensors. Firstly, accurate maps, 3D models and digital elevation models of the site were obtained. Secondly, archaeological remains still underground were investigated and defined via a new methodology utilising visible and near-infrared wavelengths.

  1. Mapping suitability of rice production systems for mitigation: Strategic approach for prioritizing improved irrigation management across scales

    NASA Astrophysics Data System (ADS)

    Wassmann, Reiner; Sander, Bjoern Ole

    2016-04-01

After the successful conclusion of COP21 in Paris, many developing countries are now embracing the task of reducing emissions with much more vigor than previously. In many countries of South and South-East Asia, the agriculture sector constitutes a vast share of the national GHG budget, which can mainly be attributed to methane emissions from flooded rice production. Thus, rice-growing countries are now looking for tangible and easily accessible information on how to reduce emissions from rice production in an efficient manner. Given present and future food demand, mitigation options will have to comply with the aim of increasing productivity. At the same time, limited financial resources demand strategic planning of potential mitigation projects based on cost-benefit ratios. At this point, the most promising approach for mitigating methane emissions from rice is an irrigation technique called Alternate Wetting and Drying (AWD). AWD was initially developed for saving water and consequently represents an adaptation strategy in its own right, helping farmers cope with less rainfall. Moreover, AWD reduces methane emissions by 30-70%. However, AWD is not universally suitable. It is attractive to farmers who have to pump water and may save fuel under AWD, but it offers limited incentives where water scarcity is not pressing. Thus, planning for AWD adoption at larger scale, e.g. for country-wide programs, should be based on a systematic prioritization of target environments. This presentation encompasses a new methodology for mapping the suitability of water-saving in rice production, as a means for planning adaptation and mitigation programs, alongside preliminary results. The latter comprise three new GIS maps on climate-driven suitability of AWD in major rice-growing countries (Philippines, Vietnam, Bangladesh). These maps have been derived from high-resolution data of the areal and temporal extent of rice production that are now

  2. A multi-pronged approach for compiling a global map of allosteric regulation in the apoptotic caspases

    PubMed Central

    Dagbay, Kevin; Eron, Scott J.; Serrano, Banyuhay P.; Velázquez-Delgado, Elih M.; Zhao, Yunlong; Lin, Di; Vaidya, Sravanti; Hardy, Jeanne A.

    2014-01-01

One of the most promising and as yet underutilized means of regulating protein function is exploitation of allosteric sites. All caspases catalyze the same overall reaction, but they perform different biological roles and are differentially regulated. It is our hypothesis that many allosteric sites exist on various caspases and that understanding both the distinct and overlapping mechanisms by which each caspase can be allosterically controlled should ultimately enable caspase-specific inhibition. Here we describe the ongoing work and methods for compiling a comprehensive map of apoptotic caspase allostery. Central to this approach are the use of i) the embedded record of naturally evolved allosteric sites that are sensitive to zinc-mediated inhibition, phosphorylation and other post-translational modifications, ii) structural and mutagenic approaches and iii) novel binding sites identified by both rationally designed and screening-derived small-molecule inhibitors. PMID:24974292

  3. An integrated remote sensing approach for landslide susceptibly mapping at the volcanic islands of Vulcano and Lipari (Eolian Island, Italy)

    NASA Astrophysics Data System (ADS)

    Scifoni, Silvia; Palenzuela Baena, José A.; Marsella, Maria; Pepe, Susi; Sansosti, Eugenio; Solaro, Giuseppe; Tizzani, Piero

    2015-10-01

Volcanic islands can be affected by instability phenomena such as landslides and partial collapse events, even during quiescent periods. Starting from data collected by an aerial laser scanning survey at cm-level accuracy, a GIS-based approach was implemented in order to perform a landslide-susceptibility analysis. The results of this analysis were compared and integrated with data derived from Differential Synthetic Aperture Radar Interferometry (DInSAR) analysis, able to identify the most active areas and quantify the ongoing deformation processes. The analysis is focused on the active volcanic edifice of Vulcano Island and on some areas of Lipari Island, both included in the Aeolian Islands in Sicily (Italy). The developed approach represents a step forward for the compilation of hazard maps, furnishing, in an overall context, updated and georeferenced quantitative data describing the morphology and the present behaviour of the slopes in the area of investigation.

  4. Scattering of stringy states in compactified closed bosonic string

    NASA Astrophysics Data System (ADS)

    Maharana, Jnanadeva

    2015-07-01

We present scattering of stringy states of the closed bosonic string compactified on a torus Td. We focus our attention on scattering of moduli and gauge bosons. These states appear when massless excitations such as the graviton and antisymmetric tensor field of the uncompactified theory are dimensionally reduced to lower dimensions. The toroidally compactified theory is endowed with the T-duality symmetry O(d, d). Therefore, it is expected that the amplitude for scattering of such states will be T-duality invariant. The formalism of Kawai-Lewellen-Tye is adopted and appropriately tailored to construct the vertex operators of moduli and gauge bosons. It is shown, in our approach, that the N-point amplitude is T-duality invariant. We present illustrative examples for the four-point amplitude to explicitly demonstrate the economy of our formalism when three spatial dimensions are compactified on T3. It is also shown that if we construct an amplitude with a set of 'initial' backgrounds, the T-duality operation transforms it into an amplitude associated with another set of backgrounds. We propose a modified version of the KLT approach to construct vertex operators for the nonabelian massless gauge bosons which appear in certain compactification schemes.

  5. Application of digital soil mapping in traditional soil survey - an approach used for the production of the national soil map of the United Arab Emirates

    NASA Astrophysics Data System (ADS)

    Abdelfattah, M. A.; Pain, C.

    2012-04-01

Digital soil maps are an essential part of the soil assessment framework which supports soil-related decisions and policy-making, and it is therefore of crucial importance that they are of known quality. Digital soil mapping is perhaps the next great advancement in soil survey information. Traditional soil survey has always struggled with the collection of data. The amount of soil data and information required to justify the mapping product, how to interpolate data to similar areas, and how to incorporate older data are all challenges that need further exploration. The present study used digital soil mapping to develop a generalized national soil map of the United Arab Emirates from available recent traditional soil surveys of Abu Dhabi Emirate (2006-2009) and the Northern Emirates (2010-2012), together with limited data from Dubai Emirate, an important part of the country. The map was developed by joining, generalizing, and correlating the information contained in the Soil Survey of Abu Dhabi Emirate, the soil map of Dubai with limited data, and the Soil Survey of the Northern Emirates. Because the soil surveys were completed at different times and with different standards and procedures, the original map lines and soil classifications had to be modified in order to integrate the three original maps and legends into this single national-level map. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model (GDEM) version 2 was used to guide line placement of the map units. It was especially helpful for the Torripsamments units, which are separated based on local landscape relief characteristics. A generalized soil map of the United Arab Emirates is produced, which consists of fifteen map units; twelve are named for the soil great group that dominates each unit, and three are named "Rock outcrop", "Mountains", or "Miscellaneous units". Statistical details are also presented. Soil great groups are appropriate taxa to use for soil

  6. Designing a workplace return-to-work program for occupational low back pain: an intervention mapping approach

    PubMed Central

    Ammendolia, Carlo; Cassidy, David; Steensta, Ivan; Soklaridis, Sophie; Boyle, Eleanor; Eng, Stephanie; Howard, Hamer; Bhupinder, Bains; Côté, Pierre

    2009-01-01

Background Despite over 2 decades of research, the ability to prevent work-related low back pain (LBP) and disability remains elusive. Recent research suggests that interventions that are focused at the workplace and incorporate the principles of participatory ergonomics and return-to-work (RTW) coordination can improve RTW and reduce disability following a work-related back injury. Workplace interventions or programs to improve RTW are difficult to design and implement given the various individuals and environments involved, each with their own unique circumstances. Intervention mapping provides a framework for designing and implementing complex interventions or programs. The objective of this study is to design a best-evidence RTW program for occupational LBP tailored to the Ontario setting using an intervention mapping approach. Methods We used a qualitative synthesis based on the intervention mapping methodology. Best evidence from systematic reviews, practice guidelines and key articles on the prognosis and management of LBP and improving RTW was combined with theoretical models for managing LBP and changing behaviour. This was then systematically operationalized into a RTW program using consensus among experts and stakeholders. The RTW program was further refined following feedback from nine focus groups with various stakeholders. Results A detailed five-step RTW program was developed. The key features of the program include: having trained personnel coordinate the RTW process, identifying and ranking barriers and solutions to RTW from the perspective of all important stakeholders, mediating practical solutions at the workplace, and empowering the injured worker in RTW decision-making. Conclusion Intervention mapping provided a useful framework to develop a comprehensive RTW program tailored to the Ontario setting. PMID:19508728

  7. A spinor boson AB chain

    NASA Astrophysics Data System (ADS)

    Cruz Reyes, Greis Julieth; Franco, Roberto; Silva Valencia, Jereson; Universidad Santo Tomas Collaboration; Universidad Nacional de Colombia Collaboration

Recent research has focused on superlattices arising from optical lattices, which allow a tunable environment. Experimentally, bosons exhibit transitions from superfluid to Mott insulator as the energy offset in the unit cell is changed [Nat. Commun. 5:5735 (2014)]. Many studies have shown that the ground state of spinless boson systems on superlattices presents superfluid, Mott insulator and an additional CDW phase created by the energy shift between the sites of the unit cell [Phys. Rev. A 83, 053621 (2011)]. The first confinement methods were magnetic traps, which freeze the spin; with optical lattices, the spin degree of freedom plays an important role. We consider bosons with spin S = 1 on a superlattice whose unit cell is made of two sites with an energy offset (AB chain). The Hamiltonian that describes the system is the Bose-Hubbard model with the superlattice potential (W) and the exchange interaction (V) parameters. This model supports CDW, Mott insulator and superfluid phases. For W near U, with V = 0, the Mott phase disappears, but as V increases a new CDW appears due to the spin interaction, while the half-integer CDW decreases. These results differ markedly from those for spinless bosons, where the CDW phases are stable.

  8. A NEW APPROACH TO CONSTRAIN BLACK HOLE SPINS IN ACTIVE GALAXIES USING OPTICAL REVERBERATION MAPPING

    SciTech Connect

    Wang, Jian-Min; Du, Pu; Li, Yan-Rong; Hu, Chen; Ho, Luis C.; Bai, Jin-Ming

    2014-09-01

A tight relation between the size of the broad-line region (BLR) and optical luminosity has been established in about 50 active galactic nuclei studied through reverberation mapping of the broad Hβ emission line. The R_BLR-L relation arises from simple photoionization considerations. Using a general relativistic model of an optically thick, geometrically thin accretion disk, we show that the ionizing luminosity jointly depends on black hole mass, accretion rate, and spin. The non-monotonic relation between the ionizing and optical luminosity gives rise to a complicated relation between the BLR size and the optical luminosity. We show that the reverberation lag of Hβ to the varying continuum depends very sensitively on black hole spin. For retrograde spins, the disk is so cold that there is a deficit of ionizing photons in the BLR, resulting in shrinkage of the hydrogen ionization front with increasing optical luminosity, and hence shortened Hβ lags. This effect is especially striking for luminous quasars undergoing retrograde accretion, manifesting in strong deviations from the canonical R_BLR-L relation. This could lead to a method to estimate black hole spins of quasars and to study their cosmic evolution. At the same time, the small scatter of the observed R_BLR-L relation for the current sample of reverberation-mapped active galaxies implies that the majority of these sources have rapidly spinning black holes.

  9. A minimalist approach to gene mapping: locating the gene for acheiropodia, by homozygosity analysis.

    PubMed Central

    Escamilla, M A; DeMille, M C; Benavides, E; Roche, E; Almasy, L; Pittman, S; Hauser, J; Lew, D F; Freimer, N B; Whittle, M R

    2000-01-01

Acheiropodia is an autosomal recessive disease that results in hemimelia (lack of formation of the distal extremities). We performed a complete genome screen of seven members of an extended pedigree that included three siblings with acheiropodia. Homozygosity mapping was used to identify the regions most likely to harbor the gene for acheiropodia in this pedigree. In two key regions (14p and 7q), further genotyping of one additional affected member of this pedigree plus seven additional unaffected siblings provided evidence, through linkage analysis, that the 7q36 region contains the acheiropodia gene. In this region, a maximum two-point LOD score of 3.81 (4.2 with multipoint analysis) was achieved, and a homozygous haplotype spanning a region of 11.7 cM was seen in all affected members of this pedigree. Finally, genotypic analysis of two additional cases of acheiropodia with no known relation to the other samples revealed homozygous sharing of a portion of the same haplotype on 7q36, which reduces the chromosomal location of the acheiropodia gene to an 8.6-cM region. Localization of this gene, at the screening level, by use of data from only three affected subjects provides an example of how certain genes may be mapped with a minimal number of affected cases. PMID:10780921
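
The core of homozygosity mapping, scanning an affected individual's marker genotypes for unusually long runs of homozygosity, can be sketched in a few lines. This is a generic illustration of the idea, not the authors' pipeline; the toy genotypes and the minimum run length are invented.

```python
def homozygous_runs(genotypes, min_len=3):
    """Return (start, end) index ranges of runs of consecutive homozygous
    markers at least min_len long. genotypes: (allele1, allele2) tuples
    ordered along one chromosome."""
    runs, start = [], None
    for i, (a, b) in enumerate(genotypes):
        if a == b:                      # homozygous marker
            if start is None:
                start = i
        else:                           # heterozygous marker ends the run
            if start is not None and i - start >= min_len:
                runs.append((start, i))
            start = None
    if start is not None and len(genotypes) - start >= min_len:
        runs.append((start, len(genotypes)))
    return runs

# Toy chromosome for one affected child of consanguineous parents.
g = [(1, 2), (3, 3), (3, 3), (1, 1), (2, 2), (1, 2), (2, 2)]
runs = homozygous_runs(g)
```

In practice such runs are weighted by marker informativeness and intersected across affected relatives before linkage analysis assigns a LOD score to the candidate region.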

  10. An approximate Bayesian approach for mapping paired-end DNA reads to a reference genome

    PubMed Central

    Shrestha, Anish Man Singh; Frith, Martin C.

    2013-01-01

    Summary: Many high-throughput sequencing experiments produce paired DNA reads. Paired-end DNA reads provide extra positional information that is useful in reliable mapping of short reads to a reference genome, as well as in downstream analyses of structural variations. Given the importance of paired-end alignments, it is surprising that there have been no previous publications focusing on this topic. In this article, we present a new probabilistic framework to predict the alignment of paired-end reads to a reference genome. Using both simulated and real data, we compare the performance of our method with six other read-mapping tools that provide a paired-end option. We show that our method provides a good combination of accuracy, error rate and computation time, especially in more challenging and practical cases, such as when the reference genome is incomplete or unavailable for the sample, or when there are large variations between the reference genome and the source of the reads. An open-source implementation of our method is available as part of Last, a multi-purpose alignment program freely available at http://last.cbrc.jp. Contact: martin@cbrc.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23413433
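
The probabilistic treatment of paired-end placement can be caricatured as follows: each mate's candidate alignments carry a likelihood, a prior on insert size ties the two mates together, and normalizing yields a posterior per candidate pair. This is a toy sketch of the idea, not the model implemented in LAST; the positions, likelihoods, and Gaussian insert-size parameters (mu, sigma) are invented.

```python
import math

def pair_posterior(cands1, cands2, mu=300.0, sigma=30.0):
    """Posterior over candidate alignment pairs: per-mate alignment
    likelihoods times a Gaussian prior on insert size, normalized."""
    def insert_prior(d):
        return math.exp(-0.5 * ((d - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))
    joint = {}
    for pos1, like1 in cands1:
        for pos2, like2 in cands2:
            joint[(pos1, pos2)] = like1 * like2 * insert_prior(abs(pos2 - pos1))
    z = sum(joint.values())
    return {pair: v / z for pair, v in joint.items()}

# Invented candidates: (reference position, alignment likelihood) per mate.
read1 = [(1000, 0.9), (5000, 0.8)]
read2 = [(1290, 0.9), (9000, 0.9)]
post = pair_posterior(read1, read2)
best = max(post, key=post.get)  # the pair separated by ~300 bp wins
```

Even though both mates have a plausible alternative placement, the insert-size prior makes the concordant pair dominate the posterior.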

  11. Exploring stereographic surface energy maps of cubic metals via an effective pair-potential approach

    NASA Astrophysics Data System (ADS)

    Yoo, Su-Hyun; Lee, Ji-Hwan; Jung, Young-Kwang; Soon, Aloysius

    2016-01-01

A fast and efficient way to calculate and generate an accurate surface energy database (of more than several million surface energy data points) for all bcc and fcc metals is proposed based on an effective pair-wise-potential model. The accuracy of this model is rigorously tested and verified by employing density functional theory calculations, which shows good agreement within a mean absolute error of 0.03 eV/atom. The surface energy database generated by this model is then visualized and mapped in various ways; namely, the surface energy as a function of relative orientation, an orientation-dependent stereographic projection (the so-called Wulff net), and Gibbs-Wulff construction of the equilibrium crystal shape, for comparison and analysis. The Wulff nets (drawn with several million surface energy data points) provide us with characteristic surface energy maps of these cubic metals. In an attempt to explain the surface energy anomaly in bcc Li, we demonstrate how our effective-pair-potential-derived Wulff net can clearly discriminate the strong influence of the second- and third-nearest-neighbor bonds on the high-Miller-index surface energetics of bcc Li.
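
In the pair-potential picture, a surface energy reduces to bond counting: cutting the crystal leaves each surface atom with broken neighbor bonds, and summing their pair energies (halved, since each bond is shared by two atoms) estimates the energy cost. A minimal sketch; the 0.5 eV and 0.1 eV bond energies are illustrative placeholders, not values fitted to any metal.

```python
def surface_energy(broken_bonds, eps):
    """Broken-bond estimate of surface energy per surface atom (eV).
    broken_bonds[k] = number of k-th-neighbor bonds cut by the surface;
    eps[k] = pair bond energy of neighbor shell k. The factor 1/2 accounts
    for each bond being shared between two atoms."""
    return 0.5 * sum(n * e for n, e in zip(broken_bonds, eps))

# Per surface atom, fcc(111) cuts 3 first- and 3 second-neighbor bonds;
# fcc(100) cuts 4 first- and 1 second-neighbor bond.
eps = [0.5, 0.1]                      # illustrative shell bond energies, eV
g111 = surface_energy([3, 3], eps)
g100 = surface_energy([4, 1], eps)    # close-packed (111) comes out lowest
```

Evaluating millions of orientations with such a model is cheap, which is what makes dense Wulff-net maps feasible; the role of second- and third-neighbor shells enters simply as extra terms in the sum.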

  12. Self-organizing maps as an approach to exploring spatiotemporal diffusion patterns

    PubMed Central

    2013-01-01

Background Self-organizing maps (SOMs) have been applied for a number of years to identify patterns in large datasets; yet their application in the spatiotemporal domain has been lagging. Here, we demonstrate how spatiotemporal disease diffusion patterns can be analysed using SOMs and Sammon's projection. Methods SOMs were applied to identify synchrony between spatial locations, to group epidemic waves based on similarity of diffusion pattern, and to construct a sequence of maps of synoptic states. Sammon's projection was used to create diffusion trajectories from the SOM output. These methods were demonstrated with a dataset of measles outbreaks that took place in Iceland in the period 1946–1970. The dataset reports the number of measles cases per month in 50 medical districts. Results Both stable and incidental synchronisation between medical districts were identified, as well as two distinct groups of epidemic waves: a uniformly structured, fast-developing group and a multiform, slow-developing group. Diffusion trajectories for the fast-developing group indicate a typical diffusion pattern from Reykjavik to the northern and eastern parts of the island. For the other group, diffusion trajectories are heterogeneous, deviating from the Reykjavik pattern. Conclusions This study demonstrates the applicability of SOMs (combined with Sammon's projection and GIS) in spatiotemporal diffusion analyses. It shows how to visualise diffusion patterns to identify (dis)similarity between individual waves, and between individual waves and an overall time series, performing an integrated analysis of synchrony and diffusion trajectories. PMID:24359538
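
The underlying machinery can be illustrated with a minimal one-dimensional SOM trained on toy "epidemic wave" vectors: waves with similar temporal shape should end up mapped to nearby units. This is a generic SOM sketch in plain Python with invented monthly case counts, not the study's implementation.

```python
import math, random

def train_som(data, n_units=4, epochs=200, lr0=0.5, sigma0=2.0, seed=1):
    """Minimal 1-D SOM: units sit on a line, each holding a weight vector.
    Learning rate and neighborhood radius decay linearly over the epochs."""
    rng = random.Random(seed)
    dim = len(data[0])
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 0.5
        for x in data:
            # best-matching unit = unit with the nearest weight vector
            bmu = min(range(n_units),
                      key=lambda u: sum((units[u][d] - x[d]) ** 2 for d in range(dim)))
            for u in range(n_units):
                h = math.exp(-((u - bmu) ** 2) / (2.0 * sigma ** 2))
                for d in range(dim):
                    units[u][d] += lr * h * (x[d] - units[u][d])
    return units

def best_unit(units, x):
    return min(range(len(units)),
               key=lambda u: sum((units[u][d] - x[d]) ** 2 for d in range(len(x))))

# Invented waves: monthly case counts, two fast-peaking and two slow-peaking.
waves = [[9, 3, 1, 0], [8, 4, 1, 0], [1, 3, 6, 8], [0, 2, 7, 9]]
units = train_som(waves)
labels = [best_unit(units, w) for w in waves]
```

Projecting the per-wave unit trajectories with a method like Sammon's mapping then turns these discrete labels into the diffusion trajectories described in the abstract.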

  13. Landslide susceptibility mapping of vicinity of Yaka Landslide (Gelendost, Turkey) using conditional probability approach in GIS

    NASA Astrophysics Data System (ADS)

    Ozdemir, Adnan

    2009-06-01

On 19 February 2007, a landslide occurred on the Alaardıç Slope, located 1.6 km south of the town of Yaka (Gelendost, Turkey). Subsequently, the displaced materials transformed into a mud flow in Eğlence Creek and continued 750 m downstream towards the town of Yaka. The mass poised for motion in the Yaka Landslide source area and its vicinity, which could be set in motion by triggers such as heavy or sustained rainfall and/or snowmelt, poses a danger of loss of life and property to Yaka and its population of 3,000. This study was undertaken to construct a susceptibility map of the vicinity of the Yaka Landslide's source area and to relate it to movement of the landslide mass, with the goal of preventing or mitigating loss of life and property. The landslide susceptibility map was produced by applying the conditional probability method to relate the factors that cause landslides, such as lithology, gradient, slope aspect, elevation, topographical moisture index, and stream power index, to the landslide map determined by terrain analysis. It was determined that the landslide-affected surface area of the Goksogut formation, which has the lithological character of clayey limestone with a broken and separated base, possesses an elevation of 1,100-1,300 m, a slope gradient of 15°-35° and a slope aspect between 0°-67.5° and 157°-247°. Loss of life and property may be avoided by the construction of structures to check the debris mass in Eğlence Creek, the cleaning of the canal which passes through Yaka, the broadening of the canal's base area, elevating the protective edges along the canal, and the establishment of a protective zone at least 10 m wide on each side of the canal against damage from probable future landslides and mud flows.
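
The conditional probability method at the heart of such susceptibility mapping is simple to state: for each class of each factor map, estimate P(landslide | class) as the fraction of cells in that class that coincide with mapped landslides, then combine the per-factor probabilities cell by cell. A toy sketch with invented slope-gradient classes:

```python
def conditional_probability(factor_classes, landslide_cells):
    """P(landslide | class): fraction of raster cells of each factor class
    that coincide with mapped landslide cells."""
    totals, slides = {}, {}
    for cls, slid in zip(factor_classes, landslide_cells):
        totals[cls] = totals.get(cls, 0) + 1
        slides[cls] = slides.get(cls, 0) + (1 if slid else 0)
    return {cls: slides[cls] / totals[cls] for cls in totals}

# Toy co-registered rasters flattened to 1-D: slope classes and the binary
# landslide inventory (invented values).
slope_classes = ["0-15", "15-35", "15-35", ">35", "15-35", "0-15"]
landslides = [False, True, True, False, False, False]
p = conditional_probability(slope_classes, landslides)
# A cell's susceptibility score combines the class probabilities from all
# factor layers (lithology, aspect, elevation, ...), e.g. by summation.
```

Repeating this for every factor layer and summing the resulting probabilities per cell yields the susceptibility surface that is then classified into the final map.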

  14. Analyzing the impact of social factors on homelessness: a Fuzzy Cognitive Map approach

    PubMed Central

    2013-01-01

Background The forces which affect homelessness are complex and often interactive in nature. Social forces such as addictions, family breakdown, and mental illness are compounded by structural forces such as lack of available low-cost housing, poor economic conditions, and insufficient mental health services. Together these factors impact levels of homelessness through their dynamic relations. Historical models, which are static in nature, have been only marginally successful in capturing these relationships. Methods Fuzzy logic (FL) and fuzzy cognitive maps (FCMs) are particularly suited to the modeling of complex social problems, such as homelessness, due to their inherent ability to model intricate, interactive systems often described in vague conceptual terms, and then to organize them into a specific, concrete form (i.e., the FCM) which can be readily understood by social scientists and others. Using FL we converted information, taken from recently published, peer-reviewed articles, for a select group of factors related to homelessness, and then calculated the strength of influence (weights) for pairs of factors. We then used these weighted relationships in an FCM to test the effects of increasing or decreasing individual or groups of factors. Results of these trials were explainable according to current empirical knowledge related to homelessness. Results Prior graphic maps of homelessness have been of limited use due to the dynamic nature of the concepts related to homelessness. The FCM technique captures greater degrees of dynamism and complexity than static models, allowing relevant concepts to be manipulated and their interactions explored. This, in turn, allows for a much more realistic picture of homelessness. Through network analysis of the FCM we determined that Education exerts the greatest force in the model and hence impacts the dynamism and complexity of a social problem such as homelessness. Conclusions The FCM built to model the complex social system of homelessness
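
The FCM machinery itself is compact: each concept holds an activation level in [0, 1], a signed weight matrix encodes causal influence between concepts, and the map is iterated with a squashing function until it settles. A minimal sketch with invented concepts and weights, not the paper's literature-derived values:

```python
import math

def fcm_step(state, weights):
    """One synchronous FCM update: each concept keeps its own activation as
    memory, adds the weighted influence of the other concepts, and squashes
    the result into [0, 1] with a logistic function."""
    n = len(state)
    new = []
    for i in range(n):
        s = state[i] + sum(weights[j][i] * state[j] for j in range(n) if j != i)
        new.append(1.0 / (1.0 + math.exp(-s)))
    return new

def run_fcm(state, weights, steps=100, tol=1e-6):
    """Iterate until the activation vector stops changing (or steps run out)."""
    for _ in range(steps):
        nxt = fcm_step(state, weights)
        if max(abs(a - b) for a, b in zip(nxt, state)) < tol:
            return nxt
        state = nxt
    return state

# Invented concepts and weights; order: [education, addiction, housing_cost,
# homelessness]. weights[j][i] is the influence of concept j on concept i.
W = [
    [0.0, -0.4, 0.0, -0.5],  # education dampens addiction and homelessness
    [0.0,  0.0, 0.0,  0.6],  # addiction drives homelessness
    [0.0,  0.0, 0.0,  0.7],  # housing cost drives homelessness
    [0.0,  0.3, 0.0,  0.0],  # homelessness feeds back into addiction
]
final = run_fcm([0.9, 0.2, 0.5, 0.5], W)
```

"What-if" trials of the kind the paper describes amount to clamping or perturbing one concept's activation and comparing the settled states.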

  15. The sensitivity of the Higgs boson branching ratios to the W boson width

    NASA Astrophysics Data System (ADS)

    Murray, William

    2016-07-01

The Higgs boson branching ratio into vector bosons is sensitive to the decay widths of those vector bosons because they are produced with at least one boson significantly off-shell. Γ(H → VV) is approximately proportional to the product of the Higgs boson coupling and the vector boson width. Γ_Z is well measured, but Γ_W gives an uncertainty on Γ(H → WW) which is not negligible. The ratio of branching ratios, BR(H → WW)/BR(H → ZZ), measured by a combination of ATLAS and CMS at the LHC is used herein to extract a width for the W boson of Γ_W = 1.8 (+0.4, −0.3) GeV by assuming Standard Model couplings of the Higgs boson. This dependence of the branching ratio on Γ_W is not discussed in most Higgs boson coupling analyses.
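
The extraction logic reduces to one line of arithmetic: if Γ(H → WW) scales linearly with Γ_W while Γ(H → ZZ) is pinned by the well-measured Γ_Z, then the measured-to-predicted ratio of BR(H → WW)/BR(H → ZZ) rescales the Standard Model W width. The numbers below are illustrative placeholders, not the paper's fit inputs:

```python
# Under the linear-scaling assumption, R = BR(H->WW)/BR(H->ZZ) is
# proportional to Gamma_W (with Gamma_Z and the couplings held at their
# Standard Model values), so a measured R rescales the SM W width.
GAMMA_W_SM = 2.085  # GeV, Standard Model W width
R_SM = 8.1          # illustrative SM prediction for BR(H->WW)/BR(H->ZZ)
R_OBS = 7.3         # illustrative "measured" ratio (placeholder)
gamma_w = GAMMA_W_SM * (R_OBS / R_SM)
```

The real analysis folds in the experimental uncertainty on the ratio, which is what produces the asymmetric error band quoted in the abstract.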

  16. Generating synthetic magnetic field intermittency using a Minimal Multiscale Lagrangian Mapping approach

    SciTech Connect

    Subedi, P.; Chhiber, R.; Tessein, J. A.; Wan, M.; Matthaeus, W. H.

    2014-12-01

    The Minimal Multiscale Lagrangian Mapping procedure developed in the context of neutral fluid turbulence is a simple method for generating synthetic vector fields. Using a sequence of low-pass filtered fields, fluid particles are displaced at their rms speed for some scale-dependent time interval, and then interpolated back to a regular grid. Fields produced in this way are seen to possess certain properties of real turbulence. This paper extends the technique to plasmas by taking into account the coupling between the velocity and magnetic fields. We examine several possible applications to plasma systems. One use is as initial conditions for simulations, wherein these synthetic fields may efficiently produce a strongly intermittent cascade. The intermittency properties of the synthetic fields are also compared with those of the solar wind. Finally, studies of cosmic ray transport and modulation in the test particle approximation may benefit from improved realism in synthetic fields produced in this way.
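
A one-dimensional caricature of a single mapping step (low-pass filter the field, advect grid "particles" at a speed normalized by the rms of the filtered field, deposit the values back on the grid) looks like this. It is a schematic reading of the procedure with periodic boundaries, linear deposition, and invented parameters, not the authors' multiscale implementation:

```python
import math

def lowpass(field, width):
    """Simple moving-average filter with periodic boundaries."""
    n = len(field)
    return [sum(field[(i + k) % n] for k in range(-width, width + 1)) / (2 * width + 1)
            for i in range(n)]

def lagrangian_map_step(field, width, dt):
    """One mapping step in 1-D: displace grid points by the low-pass
    filtered field (normalized by its rms), then interpolate the carried
    values back onto the regular grid."""
    n = len(field)
    smooth = lowpass(field, width)
    rms = math.sqrt(sum(v * v for v in smooth) / n) or 1.0
    # each particle starts at grid point i and moves dt*(smooth/rms) cells
    positions = [(i + dt * smooth[i] / rms) % n for i in range(n)]
    # deposit particle values back onto the grid by linear interpolation
    new, wsum = [0.0] * n, [0.0] * n
    for i, p in enumerate(positions):
        j = int(p) % n
        f = p - int(p)
        new[j] += (1.0 - f) * field[i]; wsum[j] += (1.0 - f)
        new[(j + 1) % n] += f * field[i]; wsum[(j + 1) % n] += f
    return [new[i] / wsum[i] if wsum[i] else smooth[i] for i in range(n)]

field = [math.sin(2.0 * math.pi * i / 16.0) for i in range(16)]
stepped = lagrangian_map_step(field, width=2, dt=0.5)
```

The full procedure repeats this over a sequence of filter scales, and the plasma extension advects the velocity and magnetic fields in a coupled fashion rather than independently.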

  17. Computational approach towards promoter sequence comparison via TF mapping using a new distance measure.

    PubMed

    Meera, A; Rangarajan, Lalitha; Bhat, Savithri

    2011-03-01

We propose a method for identifying transcription factor binding sites (TFBS) in a given promoter sequence and mapping the transcription factors (TFs). The proposed algorithm searches for the +1 transcription start site (TSS) in eukaryotic and prokaryotic sequences individually. The algorithm was tested with sequences from both eukaryotes and prokaryotes for at least 9 experimentally verified and validated functional TFs in promoter sequences. The order and type of TFs binding to the promoters of genes encoding central metabolic pathway (CMP) enzymes were tabulated. A new similarity measure was devised for scoring the similarity between a pair of promoter sequences based on the number and order of motifs. Further, these were grouped into clusters based on the scores between them. The distance between each of the clusters in each individual pathway was calculated and a phylogenetic tree was developed. This method is further applied to other pathways, such as lipid and amino acid biosynthesis, to retrieve and compare experimentally verified and conserved TFBS. PMID:21369887

  19. Lagrangian Mapping Approach to Generate Intermittency and its Application in Plasma Turbulence

    NASA Astrophysics Data System (ADS)

    Subedi, P.; Matthaeus, W. H.; Tessein, J.; Chhiber, R.; Wan, M.

    2014-12-01

The Minimal Lagrangian Mapping procedure developed in the context of neutral fluid turbulence (Rosales and Meneveau 2006) is a simple method to generate synthetic vector fields. Using a sequence of low-pass filtered fields, fluid particles are displaced at their rms speed for some scale-dependent time interval, and then interpolated back to a regular grid. Fields produced in this way are seen to possess certain properties of real turbulence. We extend the technique to plasmas by taking into account the coupling between the velocity and magnetic fields. We examine several possible applications to plasma systems. One use is as initial conditions for simulations, wherein these synthetic fields may efficiently produce a strongly intermittent cascade. The intermittency properties of the synthetic fields are also compared with those of the solar wind. Finally, studies of cosmic ray transport and modulation in the test particle approximation may benefit from improved realism in synthetic fields produced in this way.

  20. Chain mapping approach of Hamiltonian for FMO complex using associated, generalized and exceptional Jacobi polynomials

    NASA Astrophysics Data System (ADS)

    Mahdian, M.; Arjmandi, M. B.; Marahem, F.

    2016-06-01

The excitation energy transfer (EET) in photosynthetic complexes has been widely investigated in recent years. However, one of the main problems is the simulation of such complexes under realistic conditions. In this paper, using the associated, generalized and exceptional Jacobi polynomials, we first introduce the spectral density of the Fenna-Matthews-Olson (FMO) complex. Afterward, we obtain a map that transforms the Hamiltonian of the FMO complex, treated as an open quantum system, into a one-dimensional chain of oscillatory modes with only nearest-neighbor interactions, in which the system is coupled only to the first mode of the chain. The frequency and coupling strength of each mode can be obtained analytically from the recurrence coefficients of the mentioned orthogonal polynomials.
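The chain construction summarized above can be illustrated numerically: for a discretized spectral density, the chain frequencies and nearest-neighbour couplings are the recurrence coefficients of the associated orthogonal polynomials, obtainable by Lanczos tridiagonalization of the diagonal mode Hamiltonian. The mode grid and Ohmic-like density below are illustrative assumptions, not the paper's actual FMO spectral density:

```python
import numpy as np

def chain_map(omega, J, n_chain):
    """Map a discretized bath spectral density onto a nearest-neighbour chain.

    Chain frequencies (alphas) and couplings (betas) are the recurrence
    coefficients of the polynomials orthogonal w.r.t. J(omega), obtained
    here by Lanczos tridiagonalization started from the normalized
    system-bath coupling vector.
    """
    g = np.sqrt(J * np.gradient(omega))      # discretized mode couplings
    H = np.diag(omega)                       # diagonal bath Hamiltonian
    v = g / np.linalg.norm(g)
    v_prev = np.zeros_like(v)
    beta = 0.0
    alphas, betas = [], []
    for _ in range(n_chain):
        w = H @ v - beta * v_prev
        alpha = v @ w                        # chain-site frequency
        w -= alpha * v
        beta_next = np.linalg.norm(w)        # coupling to the next site
        alphas.append(alpha)
        betas.append(beta_next)
        v_prev, v = v, w / beta_next
        beta = beta_next
    return np.array(alphas), np.array(betas[:-1])

omega = np.linspace(0.01, 2.0, 500)          # hypothetical frequency grid
J = omega * np.exp(-omega)                   # illustrative Ohmic-like density
alphas, betas = chain_map(omega, J, 10)
```

The first chain frequency equals the spectral-density-weighted mean frequency, and all chain frequencies lie inside the support of the discretized bath, which provides a quick sanity check on the mapping.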

  1. Mapping permeability in low-resolution micro-CT images: A multiscale statistical approach

    NASA Astrophysics Data System (ADS)

    Botha, Pieter W. S. K.; Sheppard, Adrian P.

    2016-06-01

    We investigate the possibility of predicting permeability in low-resolution X-ray microcomputed tomography (µCT). Lower-resolution whole core images give greater sample coverage and are therefore more representative of heterogeneous systems; however, the lower resolution causes connecting pore throats to be represented by intermediate gray scale values and limits information on pore system geometry, rendering such images inadequate for direct permeability simulation. We present an imaging and computation workflow aimed at predicting absolute permeability for sample volumes that are too large to allow direct computation. The workflow involves computing permeability from high-resolution µCT images, along with a series of rock characteristics (notably open pore fraction, pore size, and formation factor) from spatially registered low-resolution images. Multiple linear regression models correlating permeability to rock characteristics provide a means of predicting and mapping permeability variations in larger scale low-resolution images. Results show excellent agreement between permeability predictions made from 16 and 64 µm/voxel images of 25 mm diameter 80 mm tall core samples of heterogeneous sandstone for which 5 µm/voxel resolution is required to compute permeability directly. The statistical model used at the lowest resolution of 64 µm/voxel (similar to typical whole core image resolutions) includes open pore fraction and formation factor as predictor characteristics. Although binarized images at this resolution do not completely capture the pore system, we infer that these characteristics implicitly contain information about the critical fluid flow pathways. Three-dimensional permeability mapping in larger-scale lower resolution images by means of statistical predictions provides input data for subsequent permeability upscaling and the computation of effective permeability at the core scale.
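As a sketch of the regression step described above, the following fits a multiple linear model relating log-permeability to two low-resolution predictor characteristics (open pore fraction and formation factor). The synthetic data, coefficient values, and log transforms are assumptions for illustration, not values from the study:

```python
import numpy as np

# Hypothetical training set: per-subvolume characteristics measured on
# low-resolution images, with "true" log-permeability from high-resolution
# direct simulation. All numbers are synthetic.
rng = np.random.default_rng(0)
n = 200
phi_open = rng.uniform(0.05, 0.25, n)            # open pore fraction
log_F = rng.uniform(1.0, 3.0, n)                 # log10 formation factor
log_k = 2.0 + 8.0 * phi_open - 0.9 * log_F + rng.normal(0, 0.05, n)

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), phi_open, log_F])
coef, *_ = np.linalg.lstsq(X, log_k, rcond=None)

def predict_log_k(phi, logF):
    """Predict log-permeability for a new low-resolution subvolume."""
    return coef[0] + coef[1] * phi + coef[2] * logF
```

Applied per subvolume, `predict_log_k` would populate a 3-D permeability map usable as input for subsequent upscaling, as the abstract describes.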

  2. A robotic approach to mapping post-eruptive volcanic fissure conduits

    NASA Astrophysics Data System (ADS)

    Parcheta, Carolyn E.; Pavlov, Catherine A.; Wiltsie, Nicholas; Carpenter, Kalind C.; Nash, Jeremy; Parness, Aaron; Mitchell, Karl L.

    2016-06-01

VolcanoBot was developed to map volcanic vents and their underlying conduit systems, which are rarely preserved and generally inaccessible to human exploration. It uses a PrimeSense Carmine 1.09 sensor for mapping and carries an IR temperature sensor, analog distance sensor, and an inertial measurement unit (IMU) inside a protective shell. The first field test succeeded in collecting valuable scientific data but revealed several needed improvements, including more rugged cable connections and mechanical couplers, increased ground clearance, and higher-torque motors for uphill mobility. The second field test significantly improved on all of these aspects, but gained electrical ruggedness at the cost of reduced data-collection speed. Data collected by the VolcanoBots, while intermittent, yield the first insights into the cm-scale geometry of volcanic fissures at depths of up to 25 m. VolcanoBot was deployed at the 1969 Mauna Ulu fissure system on Kīlauea volcano in Hawai'i. It collected first-of-its-kind data from inside the fissure system. We hypothesized that 1) fissure sinuosity should decrease with depth, 2) irregularity should be persistent with depth, 3) any blockages in the conduit should occur at the narrowest points, and 4) the fissure should narrow with depth until it is too narrow for VolcanoBot to pass or is plugged with solidified lava. Our field campaigns did not span enough lateral or vertical area to test sinuosity. The preliminary data indicate that 1) there were many irregularities along fissures at depth, 2) blockages occurred, but not at obviously narrow locations, and 3) the conduit width remained a consistent 0.4-0.5 m for most of the upper 10 m that we analyzed.

  3. 3D models mapping optimization through an integrated parameterization approach: case studies from Ravenna

    NASA Astrophysics Data System (ADS)

    Cipriani, L.; Fantini, F.; Bertacchi, S.

    2014-06-01

    Image-based modelling tools based on SfM algorithms have gained great popularity since several software houses provided applications able to achieve 3D textured models easily and automatically. The aim of this paper is to point out the importance of controlling the model parameterization process, considering that the automatic solutions included in these modelling tools can produce poor results in terms of texture utilization. In order to achieve a better quality of textured models from image-based modelling applications, this research presents a series of practical strategies aimed at providing a better balance between the geometric resolution of models from passive sensors and their corresponding (u,v) map reference systems. This aspect is essential for the achievement of a high-quality 3D representation, since "apparent colour" is a fundamental aspect in the field of Cultural Heritage documentation. Complex meshes without native parameterization have to be "flattened" or "unwrapped" into the (u,v) parameter space, with the main objective of mapping them with a single image. This result can be obtained by using two different strategies: the former automatic and faster, the latter manual and time-consuming. Reverse modelling applications provide automatic solutions based on splitting the models by means of different algorithms, producing a sort of "atlas" of the original model in the parameter space that is in many instances not adequate and negatively affects the overall quality of representation. By using different solutions in synergy, ranging from semantic-aware modelling techniques to quad-dominant meshes achieved using retopology tools, it is possible to obtain complete control of the parameterization process.

  4. Mixed linear model approach for mapping quantitative trait loci underlying crop seed traits

    PubMed Central

    Qi, T; Jiang, B; Zhu, Z; Wei, C; Gao, Y; Zhu, S; Xu, H; Lou, X

    2014-01-01

    The crop seed is a complex organ that may be composed of the diploid embryo, the triploid endosperm and the diploid maternal tissues. According to the genetic features of seed characters, two genetic models for mapping quantitative trait loci (QTLs) of crop seed traits are proposed, with inclusion of maternal effects, embryo or endosperm effects of QTL, environmental effects and QTL-by-environment (QE) interactions. The mapping population can be generated either from double back-cross of immortalized F2 (IF2) to the two parents, from random-cross of IF2 or from selfing of IF2 population. Candidate marker intervals potentially harboring QTLs are first selected through one-dimensional scanning across the whole genome. The selected candidate marker intervals are then included in the model as cofactors to control background genetic effects on the putative QTL(s). Finally, a QTL full model is constructed and model selection is conducted to eliminate false positive QTLs. The genetic main effects of QTLs, QE interaction effects and the corresponding P-values are computed by a Markov chain Monte Carlo algorithm for the Gaussian mixed linear model via Gibbs sampling. Monte Carlo simulations were performed to investigate the reliability and efficiency of the proposed method. The simulation results showed that the proposed method had higher power to accurately detect simulated QTLs and to properly estimate their effects. To demonstrate its usefulness, the proposed method was used to identify the QTLs underlying fiber percentage in an upland cotton IF2 population. A computer software package, QTLNetwork-Seed, was developed for QTL analysis of seed traits. PMID:24619175

  5. Self-Organizing Maps approaches to analyze extremes of multivariate wave climate

    NASA Astrophysics Data System (ADS)

    Barbariol, F.; Falcieri, F. M.; Scotton, C.; Benetazzo, A.; Carniel, S.; Sclavo, M.

    2015-08-01

    In this paper, the Self-Organizing Map (SOM) technique for assessing the multivariate sea wave climate at a site is analyzed and discussed, with the aim of a more complete representation that includes the most severe sea states, which would otherwise be missed by the standard SOM. Indeed, it is commonly recognized, and herein confirmed, that the SOM is a good regressor of a sample where the density of events is high (e.g. for low/moderate and frequent sea states), while it fails where the density is low (e.g. for severe and rare sea states). We have therefore considered a trivariate wave climate (composed of significant wave height, mean wave period, and mean wave direction) collected continuously at the Acqua Alta oceanographic tower (northern Adriatic Sea, Italy) during the period 1979-2008. Three different strategies derived from the standard SOM have been tested in order to widen its range of applicability to extreme events. The first strategy involves pre-processing the input dataset with the Maximum Dissimilarity Algorithm; the second and third strategies focus on post-processing of SOM outputs, resulting in a two-step SOM, where the first step is the standard SOM applied to the original dataset and the second step is an additional SOM on the events exceeding a threshold (taking either all events over the threshold or only the peaks of storms). Results suggest that the post-processing strategies are more effective than the pre-processing one in representing the extreme wave climate, in both the time series and probability density spaces. In addition, a complete graphical representation of the outcomes of the two-step SOM as double-sided maps is proposed.
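A minimal sketch of the two-step strategy, assuming a toy bivariate wave climate (significant wave height and mean period) and a hand-rolled 1-D Kohonen SOM rather than the authors' implementation:

```python
import numpy as np

def train_som(data, n_nodes=16, n_iter=2000, seed=0):
    """Minimal 1-D Kohonen SOM; returns the codebook vectors (n_nodes, dim)."""
    rng = np.random.default_rng(seed)
    w = data[rng.integers(0, len(data), n_nodes)].astype(float)  # init from data
    grid = np.arange(n_nodes)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        frac = t / n_iter
        lr = 0.5 * (1.0 - frac)                        # decaying learning rate
        sigma = max(n_nodes / 2 * (1.0 - frac), 0.5)   # shrinking neighbourhood
        bmu = np.argmin(((w - x) ** 2).sum(axis=1))    # best-matching unit
        h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
        w += lr * h[:, None] * (x - w)
    return w

# Step 1: standard SOM on the whole (synthetic) wave climate.
# Step 2: an additional SOM fitted only to threshold-exceeding sea states.
rng = np.random.default_rng(1)
hs = rng.weibull(1.5, 5000) * 1.2                        # significant wave height
tm = 3.0 + 2.0 * np.sqrt(hs) + rng.normal(0, 0.3, 5000)  # mean wave period
data = np.column_stack([hs, tm])

som_all = train_som(data)
threshold = np.quantile(hs, 0.95)
som_extreme = train_som(data[hs > threshold])
```

The second codebook concentrates its nodes in the sparse tail of the climate, which is the point of the post-processing strategies discussed in the abstract.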

  6. A measurement approach based on micro-Doppler maps for signature and motion analysis

    NASA Astrophysics Data System (ADS)

    Ricci, R.; Sona, A.

    2013-05-01

    In this paper, a novel and comprehensive measurement approach is proposed for the detection and analysis of human motion signature. The approach combines theoretical concepts and tools from micro-Doppler theory, image processing, and human modeling in an original way. The attention is primarily focused on the description of the most meaningful parameters influencing the accuracy of the obtained signature. The ultimate purpose is to provide a framework for organizing, comparing, and merging future research activities, ideas and results in the field of human motion signature analysis for security, health and disaster recovery purposes. Some simulation and experimental results underlining the feasibility and effectiveness of the measurement approach are also summarized and analyzed.

  7. Zoo of Quantum Phases and Excitations of Cold Bosonic Atoms in Optical Lattices

    SciTech Connect

    Alon, Ofir E.; Streltsov, Alexej I.; Cederbaum, Lorenz S.

    2005-07-15

    Quantum phases and phase transitions of weakly to strongly interacting bosonic atoms in deep to shallow optical lattices are described by a single multiorbital mean-field approach in real space. For weakly interacting bosons in one dimension, the critical value of the superfluid to Mott insulator (MI) transition found is in excellent agreement with many-body treatments of the Bose-Hubbard model. For strongly interacting bosons (i) additional MI phases appear, for which two (or more) atoms residing in each site undergo a Tonks-Girardeau-like transition and localize, and (ii) on-site excitation becomes the excitation lowest in energy. Experimental implications are discussed.

  8. An automatic approach for rice mapping in temperate region using time series of MODIS imagery: first results for Mediterranean environment

    NASA Astrophysics Data System (ADS)

    Boschetti, M.; Nelson, A.; Manfrom, G.; Brivio, P. A.

    2012-04-01

    Timely and accurate information on crop typology and status are required to support suitable action to better manage agriculture production and reduce food insecurity. More specifically, regional crop masking and phenological information are important inputs for spatialized crop growth models for yield forecasting systems. Digital cartographic data available at global/regional scale, such as GLC2000, GLOBCOVER or MODIS land cover products (MOD12), are often not adequate for this crop modeling application. For this reason, there is a need to develop and test methods that can provide such information for specific crops using automated classification techniques. In this framework, we focused our analysis on detecting rice cultivation areas, due to the importance of this crop. Rice is a staple food for half of the world's population (FAO 2004). Over 90% of the world's rice is produced and consumed in Asia, and the region is home to 70% of the world's poor, most of whom depend on rice for their livelihoods and/or food security. Several initiatives are being promoted at the international level to provide maps of rice cultivated areas in South and South East Asia using different approaches available in the literature for rice mapping in tropical regions. We contribute to these efforts by proposing an automatic method to detect rice cultivated areas in temperate regions exploiting the MODIS 8-day composite of surface reflectance at 500 m spatial resolution (MOD09A1 product). Temperate rice is cultivated worldwide in more than 20 countries covering around 16M ha for a total production of about 65M tons of paddy per year. The proposed method is based on a common approach available in the literature that first identifies flood conditions that can be related to rice agronomic practice and then checks for vegetation growth. 
The method presents innovative aspects related both to flood detection, exploiting Short Wave Infrared spectral information, and to crop growth monitoring, analyzing
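The flood-then-growth logic common to this family of methods can be sketched for a single pixel's time series; the LSWI/EVI flood test, margin and growth window below are illustrative assumptions, not the paper's actual criteria:

```python
import numpy as np

def flood_then_growth(evi, lswi, flood_margin=0.05, growth_window=5):
    """Sketch of a flood-then-transplant rice test on one pixel's time series.

    Flags a MODIS time step as flooded when LSWI + margin >= EVI (standing
    water raises the water index relative to the vegetation index), then
    requires EVI to rise substantially afterwards, indicating vegetation
    (rice) growth. All thresholds are illustrative only.
    """
    evi, lswi = np.asarray(evi, float), np.asarray(lswi, float)
    flooded = lswi + flood_margin >= evi
    for t in np.flatnonzero(flooded):
        end = min(t + growth_window, len(evi) - 1)
        if evi[end] > evi[t] + 0.2:      # green-up after the flood signal
            return True
    return False
```

Applied to every pixel of an 8-day composite stack, such a rule yields a binary rice mask for the season.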

  9. LANDSCAPE ECOLOGY APPROACHES FOR DETECTING, MAPPING, AND ASSESSING THE VULNERABILITY OF DEPRESSIONAL WETLANDS

    EPA Science Inventory

    U.S. EPA is using a landscape ecology approach to assess the ecological/hydrologic functions and related human values of depressional wetlands along coastal Texas, considered to be vulnerable to human disturbance. Many of those wetlands may be at high risk because of recent court...

  10. A multi-sensor approach to retrieving high resolution daily evapotranspiration maps

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The use of residual surface energy balance approaches to determine spatially distributed evapotranspiration (ET) over large areas has been considered an effective solution in the last years, especially due to the increasing availability of remotely-observed land-surface temperature (LST) data. Howev...

  11. Geospatial Approach to Regional Mapping of Research Library Holdings: Use of Arcinfo at IRANDOC

    ERIC Educational Resources Information Center

    Sedighi, Mehri-e-

    2007-01-01

    Purpose: The purpose of this paper is to provide a report on the application of a Geographic Information System (GIS), ArcInfo, in the cataloguing of geosciences documents held by IRANDOC. Design/methodology/approach: The steps involved in the application are described: gathering the data and required input including the attribute and spatial…

  12. Mapping trees outside forests using high-resolution aerial imagery: a comparison of pixel- and object-based classification approaches.

    PubMed

    Meneguzzo, Dacia M; Liknes, Greg C; Nelson, Mark D

    2013-08-01

    Discrete trees and small groups of trees in nonforest settings are considered an essential resource around the world and are collectively referred to as trees outside forests (ToF). ToF provide important functions across the landscape, such as protecting soil and water resources, providing wildlife habitat, and improving farmstead energy efficiency and aesthetics. Despite the significance of ToF, forest and other natural resource inventory programs and geospatial land cover datasets that are available at a national scale do not include comprehensive information regarding ToF in the United States. Additional ground-based data collection and acquisition of specialized imagery to inventory these resources are expensive alternatives. As a potential solution, we identified two remote sensing-based approaches that use free high-resolution aerial imagery from the National Agriculture Imagery Program (NAIP) to map all tree cover in an agriculturally dominant landscape. We compared the results obtained using an unsupervised per-pixel classifier (independent component analysis [ICA]) and an object-based image analysis (OBIA) procedure in Steele County, Minnesota, USA. Three types of accuracy assessments were used to evaluate how each method performed in terms of: (1) producing a county-level estimate of total tree-covered area, (2) correctly locating tree cover on the ground, and (3) how tree cover patch metrics computed from the classified outputs compared to those delineated by a human photo interpreter. Both approaches were found to be viable for mapping tree cover over a broad spatial extent and could serve to supplement ground-based inventory data. The ICA approach produced an estimate of total tree cover more similar to the photo-interpreted result, but the output from the OBIA method was more realistic in terms of describing the actual observed spatial pattern of tree cover. PMID:23255169

  13. An indicators' based approach to Drought and Water Scarcity Risk Mapping in Pinios River Basin, Greece.

    NASA Astrophysics Data System (ADS)

    Kossida, Maggie; Mimikou, Maria

    2013-04-01

    Assessing the vulnerability and the associated risk to water scarcity and drought is a complex multi-factor problem. The underlying exposure to climatic stresses may be similar even in quite different conditions, yet the vulnerability and prevailing risk are a function of the socio-economic state, the current policy and institutional setting, the adaptive capacity of the affected area and population, and the response strategies adopted (Kossida et al., 2012). Although flood risk assessment has been elaborated under the EU Floods Directive, there is currently a lack of analytical frameworks for the definition and assessment of drought and water scarcity related risk at European level. This can partially be attributed to the inherent complexity of such phenomena which lie at the crossroads between physical and anthropogenic drivers and pressures, operating on many scales, and with a variety of impacts on many sectors. The quantification of the various components of drought and water scarcity risk is challenging since data present limitations, relevant indicators that can represent or proxy the various components are still not clearly defined, while their relevant weights need to be determined in view of the prevailing regional conditions. The current study in Pinios River Basin, an area highly impacted by drought and water scarcity, proposes a methodology for drought and water scarcity risk assessment using blended indicators. Using the Standard Precipitation Index (SPI) as a base drought indicator, relevant sub-indicators reflecting the magnitude, severity, duration and recurrence of drought events from 1980-2011 have been produced. These sub-indicators have been assigned relevant scores and have been blended into a Drought Vulnerability Index (DVI) using different weights derived from an analytical hierarchy process (AHP). 
The resulting map of DVI has been then blended with additional socio-economic indicators of surface and groundwater exploitation, water deficit
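The AHP weighting step can be sketched as follows: weights are the normalized principal eigenvector of a pairwise-comparison matrix, and the blended index is the weighted sum of sub-indicator scores. The comparison matrix and scores below are hypothetical, not the study's actual values:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix
    (principal right eigenvector, normalized to sum to 1)."""
    A = np.asarray(pairwise, float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)               # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

# Hypothetical comparisons of four drought sub-indicators
# (magnitude, severity, duration, recurrence) on Saaty's 1-9 scale.
A = np.array([[1,   2,   3,   4],
              [1/2, 1,   2,   3],
              [1/3, 1/2, 1,   2],
              [1/4, 1/3, 1/2, 1]])
weights = ahp_weights(A)

# Blended vulnerability score for one map cell (illustrative sub-indicator
# scores on a 0-1 scale).
dvi = weights @ np.array([0.7, 0.5, 0.8, 0.3])
```

Repeating the blend cell by cell produces the vulnerability map that the abstract describes combining with socio-economic indicators.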

  14. A Novel Approach to Mapping Intertidal Areas Using Shore-Based X-band Marine Radar

    NASA Astrophysics Data System (ADS)

    Bird, Cai; Bell, Paul

    2014-05-01

    Monitoring the morphology of coastal zones in response to high energy weather events and changing patterns of erosion and deposition over time is vital in enabling effective decision-making at the coast. Common methods of mapping intertidal bathymetry currently include vessel-based sonar and airborne LiDAR surveys, which are expensive and thus not routinely collected on a continuous basis. Marine radar is a ubiquitous technology in the marine industry, and many ports operate a system to guide ships into port; this work aims to utilise this existing infrastructure to determine bathymetry over large intertidal areas, currently up to 4 km from the radar. Standard X-band navigational radar has been used in the marine industry to measure hydrodynamics and derive bathymetry using empirical techniques for several decades. Methods of depth mapping thus far have relied on the electromagnetic backscattering from the wind-roughened water surface, which allows a radar to gather sea surface image data but requires the waves to be clearly defined. The work presented here does not rely on identifying and measuring these spatial wave features, which increases the robustness of the method. Image data collected by a 9.4 GHz Kelvin Hughes radar from a weather station on Hilbre Island at the mouth of the River Dee estuary, UK were used in the development of this method. Image intensity at each pixel is a function of returned electromagnetic energy, which in turn can be related to the roughness of the sea surface. Images collected over time periods of 30 minutes show general patterns of wave breaking and mark the advance and retreat of the waterline in accordance with the tidal cycle and intertidal morphology. Each pixel value can be extracted from these mean images and analysed over the course of several days, giving a fluctuating time series of pixel intensity, the gradient of which gives a series of pulses representing transitions between wet and dry at each location. A tidal
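The pixel-wise transition detection described above can be sketched as follows; the step-like intensity series and pulse threshold are illustrative, not values from the Hilbre Island data:

```python
import numpy as np

def transition_times(intensity, pulse_thresh=0.5):
    """Locate wet/dry transitions in one pixel's mean-image intensity series.

    A wetting or drying event appears as a step in backscatter intensity;
    the time derivative turns each step into a pulse whose sign gives the
    direction of the transition. The threshold value is illustrative.
    """
    grad = np.gradient(np.asarray(intensity, float))
    wetting = np.flatnonzero(grad > pulse_thresh)    # intensity rises
    drying = np.flatnonzero(grad < -pulse_thresh)    # intensity falls
    return wetting, drying
```

Pairing each transition time with the local tidal elevation at that instant is what would then allow a height, and hence bathymetry, to be assigned to each pixel.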

  15. Empirical approach for estimating the ExB velocity from VTEC map

    NASA Astrophysics Data System (ADS)

    Ao, Xi

    The Earth's ionosphere is critical to the development of wireless communication. A Matlab program is designed to improve the techniques for monitoring and forecasting the conditions of the Earth's ionosphere. The work in this thesis aims to model the dependency between the equatorial anomaly gap (EAP) in the Earth's ionosphere and its crucial driver, the ExB velocity. In this thesis, we review the mathematics of the model in the eleventh generation of the International Geomagnetic Reference Field (IGRF) and an enhanced version of the Global Assimilative Ionospheric Model (GAIM), the GAIM++ Model. We then use the IGRF Model and a Vertical Total Electron Content (VTEC) map from the GAIM++ Model to determine the EAP in the Earth's ionosphere. Then, by changing the main parameters, the 10.7 cm solar radio flux (F10.7) and the planetary geomagnetic activity index (AP), we compare the different values of the EAP in the Earth's ionosphere and the ExB velocity of the Earth's ionosphere. Finally, we demonstrate that the program can be effective in determining the dependency between the EAP in the Earth's ionosphere and the ExB velocity of the Earth's ionosphere.

  16. An approach to improve the spatial resolution of a force mapping sensing system

    NASA Astrophysics Data System (ADS)

    Negri, Lucas Hermann; Manfron Schiefer, Elberth; Sade Paterno, Aleksander; Muller, Marcia; Luís Fabris, José

    2016-02-01

    This paper proposes a smart sensor system capable of detecting sparse forces applied to different positions of a metal plate. The sensing is performed with strain transducers based on fiber Bragg gratings (FBG) distributed under the plate. Forces actuating in nine squared regions of the plate, resulting from up to three different loads applied simultaneously to the plate, were monitored with seven transducers. The system determines the magnitude of the force/pressure applied on each specific area, even in the absence of a dedicated transducer for that area. The set of strain transducers with coupled responses and a compressive sensing algorithm are employed to solve the underdetermined inverse problem which emerges from mapping the force. In this configuration, experimental results have shown that the system is capable of recovering the value of the load distributed on the plate with a signal-to-noise ratio better than 12 dB, when the plate is submitted to three simultaneous test loads. The proposed method is a practical illustration of compressive sensing algorithms for the reduction of the number of FBG-based transducers used in a quasi-distributed configuration.
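The underdetermined inverse problem above (more plate regions than transducers) becomes solvable under a sparsity prior on the applied loads. The sketch below uses iterative shrinkage-thresholding (ISTA), with a hypothetical random sensing matrix standing in for the actual coupled FBG transducer responses:

```python
import numpy as np

def ista(A, y, lam=0.05, n_iter=500):
    """Iterative shrinkage-thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1.

    Recovers a sparse force vector x (loads on plate regions) from fewer
    transducer readings y than unknowns, the compressive-sensing setting
    of the abstract.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - (A.T @ (A @ x - y)) / L    # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

# Toy setting: 7 transducers sensing 9 plate regions (hypothetical mixing).
rng = np.random.default_rng(3)
A = rng.normal(size=(7, 9)) / np.sqrt(7)
x_true = np.zeros(9)
x_true[[1, 6]] = [1.0, -0.7]               # two sparse applied loads
y = A @ x_true
x_hat = ista(A, y, lam=0.01, n_iter=2000)
```

With only two active loads, the recovered vector concentrates its largest entries on the correct regions even though the system has fewer equations than unknowns.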

  17. Physical activity, physical fitness and academic achievement in adolescents: a self-organizing maps approach.

    PubMed

    Pellicer-Chenoll, Maite; Garcia-Massó, Xavier; Morales, Jose; Serra-Añó, Pilar; Solana-Tramunt, Mònica; González, Luis-Millán; Toca-Herrera, José-Luis

    2015-06-01

    The relationship among physical activity, physical fitness and academic achievement in adolescents has been widely studied; however, controversy concerning this topic persists. The methods used thus far to analyse the relationship between these variables have, according to the available literature, mostly included traditional linear analyses. The aim of this study was to perform a visual analysis of this relationship with self-organizing maps and to monitor the subjects' evolution during the 4 years of secondary school. Four hundred and forty-four students participated in the study. The physical activity and physical fitness of the participants were measured, and the participants' grade point averages were obtained from the five participant institutions. Four main clusters representing two primary student profiles with few differences between boys and girls were observed. The clustering demonstrated that students with higher energy expenditure and better physical fitness exhibited lower body mass index (BMI) and higher academic performance, whereas those adolescents with lower energy expenditure exhibited worse physical fitness, higher BMI and lower academic performance. With respect to the evolution of the students during the 4 years, ∼25% of the students originally clustered in a negative profile moved to a positive profile, and there was no movement in the opposite direction. PMID:25953972

  18. Critical phenomena in self-organizing feature maps: Ginzburg-Landau approach

    NASA Astrophysics Data System (ADS)

    Der, R.; Herrmann, M.

    1994-06-01

    Self-organizing feature maps (SOFM's) as generated by Kohonen's algorithm are prominent examples of the cross-fertilization between theoretical physics and neurobiology. SOFM's serve as high-fidelity models for the internal representation of the external world in the cortex. This is exploited for applications in the fields of data analysis, robotics, and for the data-driven coarse graining of state spaces of nonlinear dynamical systems. From the point of view of physics, Kohonen's algorithm may be viewed as a stochastic dynamical equation of motion for a many-particle system of high complexity, which may be analyzed by methods of nonequilibrium statistical mechanics. We present analytical and numerical studies of symmetry-breaking phenomena in Kohonen's SOFM that occur due to a topological mismatch between the input space and the neuron setup. We give a microscopic derivation of the time-dependent Ginzburg-Landau equations describing the behavior of the order parameter close to the critical point, where a topology-preserving second-order phase transition takes place. By extensive computer simulations we not only support our theoretical findings, but also discover a first-order transition leading to a topology-violating metastable state. Consequently, close to the critical point we observe a phase-coexistence regime.

  19. Approaches to strategic research and technology (R&T) analysis and road mapping

    NASA Astrophysics Data System (ADS)

    Mankins, John C.

    2002-07-01

    Increasingly, the timely and successful incorporation of innovative technologies into new systems is a critical factor in their success or failure. This is true for both commercial and government space missions. In addition, continuing progress in methodologies that may enable the effective identification of long-term technology needs and opportunities—and the guidance of ongoing research and technology (R&T) programs to address them—is vital to progress in space exploration and commercial development. NASA's long-standing use of technology readiness levels (TRLs) is one such approach. These technology discipline-independent metrics provide a valuable tool in technology management at all levels in an organization. However, TRLs provide only the basic guideposts for R&T management: information on the current and desired level of maturity of a technology for a particular application. In order to succeed over the longer term, additional methodologies are needed, including those which allow the identification of anticipated uncertainty in planned R&T programs, as well as approaches that permit the identification of overall technology-derived uncertainty in future space systems developments. This paper provides a preliminary discussion of this critical subject, including an overview of the history and the current practices of the TRL approach. In addition, the paper presents a recently-formulated strategic technology management approach that attempts to address the question of uncertainty in technology development and applications: the Integrated Technology Analysis Methodology (ITAM). The paper concludes with a discussion of future directions for space technology management, and how these tools might be used to facilitate coordination and discussions in an international setting.

  20. A Tetrahedron-Based Endmember Selection Approach for Urban Impervious Surface Mapping

    PubMed Central

    Wang, Wei; Yao, Xinfeng; Zhai, Junpeng; Ji, Minhe

    2014-01-01

    The pixel purity index (PPI) and two-dimensional (2-D) scatter plots are two popular techniques for endmember extraction in remote sensing spectral mixture analysis, yet both suffer from one major drawback, that is, the selection of a final set of endmembers has to endure a cumbersome process of iterative visual inspection and human intervention, especially when a spectrally-complex urban scene is involved. Within the conceptual framework of a V-H-L-S (vegetation-high albedo-low albedo-soil) model, which is expanded from the classic V-I-S (vegetation-impervious surface-soil) model, a tetrahedron-based endmember selection approach combined with a multi-objective optimization genetic algorithm (MOGA) was designed to identify urban endmembers from multispectral imagery. The tetrahedron defining the enclosing volume of MNF-transformed pixels in a three-dimensional (3-D) space was algorithmically sought, so that the tetrahedral vertices can ideally match the four components of the adopted model. A case study with Landsat Enhanced Thematic Mapper Plus (ETM+) satellite imagery in Shanghai, China was conducted to verify the validity of the method. The method performance was compared with those of the traditional PPI and 2-D scatter plots approaches. The results indicated that the tetrahedron-based endmember selection approach performed better in both accuracy and ease of identification for urban surface endmembers owing to the 3-D visualization analysis and use of the MOGA. PMID:24892938