Science.gov

Sample records for boson mapping approach

  1. Composite fermion-boson mapping for fermionic lattice models.

    PubMed

    Zhao, J; Jiménez-Hoyos, C A; Scuseria, G E; Huerga, D; Dukelsky, J; Rombouts, S M A; Ortiz, G

    2014-11-12

    We present a mapping of elementary fermion operators onto a quadratic form of composite fermionic and bosonic cluster operators. The mapping is an exact isomorphism as long as the physical constraint of one composite particle per cluster is satisfied. This condition is treated on average in a composite particle mean-field approach, which consists of an ansatz that decouples the composite fermionic and bosonic sectors. The theory is tested on the 1D and 2D Hubbard models. Using a Bogoliubov determinant for the composite fermions and either a coherent or Bogoliubov state for the bosons, we obtain a simple and accurate procedure for treating the Mott insulating phase of the Hubbard model with mean-field computational cost.

  2. A Spin-Boson Screening approach for unraveling dominant vibrational energy transfer pathways in molecular materials

    NASA Astrophysics Data System (ADS)

    Chuntonov, Lev; Peskin, Uri

    2017-01-01

    Vibrational energy transfer driven by anharmonicity is the major mechanism of energy dissipation in polyatomic molecules and in molecules embedded in condensed-phase environments. Energy transfer pathways are sensitive to the particular intramolecular structure as well as to specific interactions between the molecule and its environment, and their identification is a challenging many-body problem. This work introduces a theoretical approach that makes it possible to identify the dominant pathways for specified initial excitations by screening the different possible relaxation channels. For each channel, the many-body Hamiltonian is mapped onto a corresponding all-vibrational Spin-Boson Hamiltonian, expressed in terms of the harmonic frequencies and the anharmonic coupling parameters obtained from the electronic structure of the molecule in its environment. Focus is given to the formulation of the relaxation rates when different limits of perturbation theory apply. In these cases the proposed Spin-Boson Screening approach becomes especially powerful.

  3. X-boson cumulant approach to the topological Kondo insulators

    NASA Astrophysics Data System (ADS)

    Ramos, E.; Franco, R.; Silva-Valencia, J.; Foglio, M. E.; Figueira, M. S.

    2014-12-01

    In this work we present a generalization of our previous X-boson approach to the periodic Anderson model (PAM), adequate to study a novel class of intermetallic 4f- and 5f-orbital materials: the topological Kondo insulators, whose paradigmatic material is the compound SmB6. For simplicity, we consider a version of the PAM on a 2D square lattice, adequate to describe Ce-based compounds in two dimensions. The starting point of the model is the 4f orbitals of the Ce ions, with the J = 5/2 multiplet, in the presence of spin-orbit coupling. Our technique works well for all parameters of the model and avoids the unwanted phase transitions of the slave-boson mean-field theory. We present a critical comparison of our results with those of the usual slave-boson method, which has been used intensively to describe this class of materials. We also obtain a new first-order valence transition, which we attribute to the wave-vector (k) dependence of the hybridization.

  4. Self-consistent Hartree-Fock approach for interacting bosons in optical lattices

    NASA Astrophysics Data System (ADS)

    Lü, Qin-Qin; Patton, Kelly R.; Sheehy, Daniel E.

    2014-12-01

    A theoretical study of interacting bosons in a periodic optical lattice is presented. Instead of the commonly used tight-binding approach (applicable near the Mott-insulating regime of the phase diagram), the present work starts from the exact single-particle states of bosons in a cubic optical lattice, satisfying the Mathieu equation, an approach that can be particularly useful at large boson fillings. The effects of short-range interactions are incorporated using a self-consistent Hartree-Fock approximation, and predictions for experimental observables such as the superfluid transition temperature, condensate fraction, and boson momentum distribution are presented.
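
    The band-structure statement above can be sketched numerically: in one dimension, the exact single-particle states of a lattice of depth V0 satisfy Mathieu's equation, and the band energies follow from diagonalizing the Hamiltonian in a truncated plane-wave basis. The sketch below is illustrative, not the authors' code; the lattice depth (5 recoil energies) and the basis cutoff are assumptions.

```python
import numpy as np

def lattice_bands(V0, q, n_max=10):
    """Band energies of V(x) = V0 * sin^2(k_L x) (Mathieu's equation)
    by plane-wave diagonalization. Units: recoil energy E_R; the
    quasimomentum q is in units of the lattice wave vector k_L."""
    # Basis: plane waves exp(i (q + 2n) k_L x), n = -n_max .. n_max.
    n = np.arange(-n_max, n_max + 1)
    # sin^2 term: constant V0/2 on the diagonal, -V0/4 coupling n to n +/- 1.
    H = np.diag((q + 2.0 * n) ** 2 + V0 / 2.0)
    off = -V0 / 4.0 * np.ones(2 * n_max)
    H += np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(H)

# Lowest band of a 5 E_R deep lattice at the zone center and zone edge;
# the difference is the (narrowed) lowest-band width.
E0 = lattice_bands(5.0, 0.0)[0]
E1 = lattice_bands(5.0, 1.0)[0]
print(E0, E1)
```

    For a cubic 3D lattice the potential is separable, so the same one-dimensional spectrum enters each Cartesian direction independently.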

  5. Boson mapping techniques applied to constant gauge fields in QCD

    NASA Technical Reports Server (NTRS)

    Hess, Peter Otto; Lopez, J. C.

    1995-01-01

    Pairs of coordinates and derivatives of the constant gluon modes are mapped to new gluon-pair fields and their derivatives. Applying this mapping to the Hamiltonian of constant gluon fields results, for large coupling constants, in an effective Hamiltonian that separates into one part describing a scalar field and another describing a field with spin two. The ground state is dominated by pairs of gluons coupled to color and spin zero, with slight admixtures of color-zero, spin-two pairs. SU(2) was used as the color group.

  6. Map Projections: Approaches and Themes

    ERIC Educational Resources Information Center

    Steward, H. J.

    1970-01-01

    Map projections take on new meaning with location systems needed for satellites, other planets and space. A classroom approach deals first with the relationship between the earth and the globe, then with transformations to flat maps. Problems of preserving geometric qualities: distance, angles, directions are dealt with in some detail as are…

  7. X-slave boson approach to the periodic Anderson model

    NASA Astrophysics Data System (ADS)

    Franco, R.; Figueira, M. S.; Foglio, M. E.

    2001-05-01

    The periodic Anderson model (PAM) in the limit U=∞ can be studied by employing the Hubbard X operators to project out the unwanted states. In a previous work, we studied the cumulant expansion of this Hamiltonian employing the hybridization as a perturbation, but probability conservation of the local states (completeness) is not usually satisfied when partial expansions like the "chain approximation" (CHA) are employed. To address this problem, we use a technique similar to the one employed by Coleman to treat the same problem with slave bosons in the mean-field approximation. Assuming a particular renormalization for the hybridization, we obtain a description that avoids an unwanted phase transition that appears in the mean-field slave-boson method at intermediate temperatures.

  8. X-boson cumulant approach to the periodic Anderson model

    NASA Astrophysics Data System (ADS)

    Franco, R.; Figueira, M. S.; Foglio, M. E.

    2002-07-01

    The periodic Anderson model can be studied in the limit U=∞ by employing the Hubbard X operators to project out the unwanted states. We had already studied this problem by employing the cumulant expansion with the hybridization as perturbation, but the probability conservation of the local states (completeness) is not usually satisfied when partial expansions like the ``chain approximation'' (CHA) are employed. To rectify this situation, we modify the CHA by employing a procedure that was used in the mean-field approximation of Coleman's slave-boson method. Our technique reproduces the features of that method in its region of validity, but avoids the unwanted phase transition that appears in the same method both when μ>>Ef at low T and for all values of the parameters at intermediate temperatures. Our method also has a dynamic character that is absent from the mean-field slave-boson method.

  9. Nonunitary and unitary approach to Eigenvalue problem of Boson operators and squeezed coherent states

    NASA Technical Reports Server (NTRS)

    Wunsche, A.

    1993-01-01

    The eigenvalue problem of the operator a + ζa† (where a† is the boson creation operator) is solved for arbitrary complex ζ by applying a nonunitary operator to the vacuum state. This nonunitary approach is compared with the unitary approach, which, for |ζ| < 1, leads to squeezed coherent states.
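
    The unitary half of this statement follows from a standard Bogoliubov-transformation argument, sketched below. This is textbook material rather than the paper's own derivation, and the normalization conventions are assumptions.

```latex
% For |\zeta| < 1, the operator a + \zeta a^\dagger can be normalized
% into a proper bosonic mode operator:
\[
  a + \zeta a^\dagger = \sqrt{1 - |\zeta|^2}\, b,
  \qquad
  b = \mu a + \nu a^\dagger,
  \quad
  \mu = \frac{1}{\sqrt{1 - |\zeta|^2}},
  \quad
  \nu = \frac{\zeta}{\sqrt{1 - |\zeta|^2}},
\]
% so that $[b, b^\dagger] = |\mu|^2 - |\nu|^2 = 1$. Writing
% $\mu = \cosh r$ and $\nu = e^{i\theta}\sinh r$ (i.e. $\tanh r = |\zeta|$),
% $b$ is unitarily equivalent to $a$ via a squeeze operator $S$, so the
% eigenstates of $a + \zeta a^\dagger$ are squeezed coherent states
% $S\lvert\alpha\rangle$. For $|\zeta| \ge 1$ no such unitary exists,
% which is where the nonunitary construction is needed.
```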

  10. Study of molecular vibration by coupled cluster method: Bosonic approach

    NASA Astrophysics Data System (ADS)

    Banik, Subrata; Pal, Sourav; Prasad, M. Durga

    2015-01-01

    The vibrational coupled cluster method in a bosonic representation is formulated to describe molecular anharmonic vibrational spectra. The formalism is based on the Watson Hamiltonian in normal coordinates. The vibrational excited states are described using coupled cluster linear response theory (CCLRT). The quality of the coupled cluster wave function is analyzed. Specifically, the mean displacement values of the normal coordinates and the expectation values of the squares of the normal coordinates are calculated for different vibrational states. Good agreement between converged full CI results and coupled cluster results is found for the lower-lying vibrational states.

  11. Coherent state approach to the interacting boson model: Test of its validity in the transitional region

    SciTech Connect

    Inci, I.; Alonso, C. E.; Arias, J. M.; Fortunato, L.; Vitturi, A.

    2009-09-15

    The predictive power of the coherent state (CS) approach to the interacting boson model (IBM) is tested far from the IBM dynamical symmetry limits. The transitional region along the γ-unstable path from U(5) to O(6) is considered. The excitation energy of the excited β band and the intraband and interband transitions obtained within the CS approach are compared with the exact results as a function of the boson number N. We find that the CS formalism provides approximations to the exact results that are correct up to order 1/N in the transitional region, except in a narrow region close to the critical point.

  12. Correlated-pair approach to composite-boson scattering lengths

    NASA Astrophysics Data System (ADS)

    Shiau, Shiue-Yuan; Combescot, Monique; Chang, Yia-Chung

    2016-11-01

    We derive the scattering length of composite bosons (cobosons) within the framework of the composite-boson many-body formalism that uses correlated-pair states as a basis instead of free-fermion states. The integral equation constructed from this physically relevant basis makes transparent the role of fermion exchange in the coboson-coboson effective scattering. Three potentials used for Cooper pairs, fermionic-atom dimers, and semiconductor excitons are considered. While the s-wave scattering length for the BCS-like potential is just equal to its Born value, the other two are substantially smaller. For fermionic-atom dimers and semiconductor excitons, our results, calculated within a restricted correlated-pair basis, are in good agreement with those obtained from procedures numerically more demanding. We also propose model coboson-coboson scatterings that are separable and thus easily workable and that produce scattering lengths which match quantitatively well with the numerically obtained values for all fermion mass ratios. These separable model scatterings can facilitate future works on many-body effects in coboson gases.

  13. Perturbative Approaching for Boson Fields' System in a Lewis-Papapetrou Space-Time

    SciTech Connect

    Murariu, G.; Dariescu, M. A.; Dariescu, C.

    2010-08-04

    In this paper, first-order solutions of the coupled Klein-Gordon-Maxwell-Einstein system of equations are derived for boson fields in a Lewis-Papapetrou space-time. The results extend previous static solutions obtained in the literature. A main contribution is the symbolic script built for this approach.

  14. Mott transition in the dynamic Hubbard model within slave boson mean-field approach

    NASA Astrophysics Data System (ADS)

    Le, Duc-Anh

    2014-04-01

    At zero temperature, the Kotliar-Ruckenstein slave-boson mean-field approach is applied to the dynamic Hubbard model. In this paper, the influence of the dynamics of the auxiliary boson field on the Mott transition is investigated. At finite boson frequency, the Mott-type features of the Hubbard model are found to be enhanced by increasing the pseudospin coupling parameter g. For sufficiently large g, the Mott transition occurs even for modest values of the bare Hubbard interaction U. The lack of electron-hole symmetry is highlighted through the quasiparticle weight. Our results are in good agreement with those obtained by two-site dynamical mean-field theory and determinant quantum Monte Carlo simulation.

  15. Quantum phase transitions in the sub-Ohmic spin-boson model: failure of the quantum-classical mapping.

    PubMed

    Vojta, Matthias; Tong, Ning-Hua; Bulla, Ralf

    2005-02-25

    The effective theories for many quantum phase transitions can be mapped onto those of classical transitions. Here we show that the naive mapping fails for the sub-Ohmic spin-boson model, which describes a two-level system coupled to a bosonic bath with power-law spectral density J(ω) ∝ ω^s. Using an ε expansion we prove that this model has a quantum transition controlled by an interacting fixed point at small s, and support this by numerical calculations. In contrast, the corresponding classical long-range Ising model is known to display mean-field transition behavior for 0 < s < 1/2, controlled by a noninteracting fixed point. The failure of the quantum-classical mapping is argued to arise from the long-ranged interaction in imaginary time in the quantum model.
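
    For concreteness, the power-law bath can be written out explicitly. The snippet below adopts the common parametrization J(ω) = 2πα ω_c^(1-s) ω^s with a hard cutoff at ω_c; this prefactor and cutoff choice are conventions assumed here, not taken from the abstract.

```python
import numpy as np

def bath_spectral_density(omega, alpha, s, omega_c):
    """J(w) = 2*pi*alpha * omega_c**(1-s) * w**s for w <= omega_c, else 0.
    alpha is the dimensionless coupling, s the bath exponent."""
    w = np.asarray(omega, dtype=float)
    return np.where(w <= omega_c,
                    2.0 * np.pi * alpha * omega_c ** (1.0 - s) * w ** s,
                    0.0)

def bath_regime(s):
    """Classify the bath: s < 1 sub-Ohmic, s == 1 Ohmic, s > 1 super-Ohmic."""
    if s < 1:
        return "sub-Ohmic"
    return "Ohmic" if s == 1 else "super-Ohmic"
```

    The quantum-classical mapping discussed above fails precisely in the sub-Ohmic window of this classification.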

  16. Quantum Langevin approach for non-Markovian quantum dynamics of the spin-boson model

    NASA Astrophysics Data System (ADS)

    Zhou, Zheng-Yang; Chen, Mi; Yu, Ting; You, J. Q.

    2016-02-01

    One longstanding difficult problem in quantum dissipative dynamics is to solve the spin-boson model in a non-Markovian regime where a tractable systematic master equation does not exist. The spin-boson model is particularly important due to its crucial applications in quantum noise control and manipulation as well as its central role in developing quantum theories of open systems. Here we solve this important model by developing a non-Markovian quantum Langevin approach. By projecting the quantum Langevin equation onto the coherent states of the bath, we can derive a set of non-Markovian quantum Bloch equations containing no explicit noise variables. This special feature offers a tremendous advantage over the existing stochastic Schrödinger equations in numerical simulations. The physical significance and generality of our approach are briefly discussed.

  17. Verwey Metal-Insulator Transition in Magnetite from the Slave-Boson Approach

    NASA Astrophysics Data System (ADS)

    Sherafati, Mohammad; Satpathy, Sashi; Pettey, Dix

    2013-03-01

    We study the Verwey metal-insulator transition in magnetite (Ref.1) by solving a three-band extended Hubbard Hamiltonian for spinless fermions using the slave-boson approach, which also includes coupling to the local phonon modes. This model is suggested from the earlier density-functional studies of magnetite.(Ref.2) We first solve the 1D Hubbard model for the spinless fermions with nearest-neighbor interaction by both Gutzwiller variational and slave-boson methods and show that these two approaches yield different results unlike in the case of the standard Hubbard model, thereby clarifying some of the discrepancies in the literature (Ref.3), then we extend the formalism to three-band Hamiltonian for magnetite. The results suggest a metal-insulator transition at a critical value for the intersite interaction.

  18. The exhaustion problem in the periodic Anderson model: An X-boson approach

    NASA Astrophysics Data System (ADS)

    Franco, R.; Silva-Valencia, J.; Figueira, M. S.

    2006-10-01

    We study the thermodynamic properties of the periodic Anderson model (PAM) within the X-boson approach. The exhaustion problem is studied, and we calculate the entropy and the specific heat for the heavy-fermion Kondo regime (HF-K) of the PAM. We compute numerically the evolution of the Kondo lattice temperature TKL and the Fermi liquid temperature T* as functions of the conduction electron occupation number nc. The results obtained are consistent with others reported in the literature for the Kondo lattice.

  19. Supersymmetric Ito equation: Bosonization and exact solutions

    SciTech Connect

    Ren Bo; Yu Jun; Lin Ji

    2013-04-15

    Based on the bosonization approach, the N=1 supersymmetric Ito (sIto) system is changed to a system of coupled bosonic equations. The approach can effectively avoid difficulties caused by intractable fermionic fields which are anticommuting. By solving the coupled bosonic equations, the traveling wave solutions of the sIto system are obtained with the mapping and deformation method. Some novel types of exact solutions for the supersymmetric system are constructed with the solutions and symmetries of the usual Ito equation. In the meanwhile, the similarity reduction solutions of the model are also studied with the Lie point symmetry theory.

  20. Usage-Oriented Topic Maps Building Approach

    NASA Astrophysics Data System (ADS)

    Ellouze, Nebrasse; Lammari, Nadira; Métais, Elisabeth; Ben Ahmed, Mohamed

    In this paper, we present a collaborative and incremental approach to constructing multilingual Topic Maps based on enrichment and merging techniques. In recent years, several Topic Map building approaches with different characteristics have been proposed. Generally, they are dedicated to particular data types such as text, semi-structured data, relational data, etc. We note also that most of these approaches take monolingual documents as input to build the Topic Map. The problem is that the large majority of resources available today are written in various languages, and these resources can be relevant even to non-native speakers. Our work is thus directed towards a collaborative and incremental method for Topic Map construction from textual documents available in different languages. To enrich the Topic Map, we take a domain thesaurus as input, and we also propose to exploit Topic Map usage, that is, the available potential questions related to the source documents.

  1. The New Approach for Earthquake Hazard Mapping

    NASA Astrophysics Data System (ADS)

    Handayani, B.; Karnawati, D.; Anderson, R.

    2008-05-01

    It is a fact that hazard maps, such as earthquake hazard maps, are not always effectively implemented in mitigation efforts. Hazard maps are technical maps that are not always easy for communities living in vulnerable areas to understand and follow. Therefore, some effort must be made to guarantee the effectiveness of a hazard map. This paper discusses an approach and method for developing a more appropriate earthquake hazard map in Bantul Regency, Yogyakarta, Indonesia. Psychological mapping, which identifies the levels and distributions of community trauma, is proposed as an early reference for earthquake hazard mapping. By referring to this trauma zonation and combining it with seismicity and geological mapping, the earthquake hazard map can be established. Interestingly, this approach not only provides a more appropriate hazard map but also stimulates community empowerment in earthquake-vulnerable areas. Several training sessions for improving community awareness were also conducted as part of the mapping process.

  2. A new approach to shortest paths on networks based on the quantum bosonic mechanism

    NASA Astrophysics Data System (ADS)

    Jiang, Xin; Wang, Hailong; Tang, Shaoting; Ma, Lili; Zhang, Zhanli; Zheng, Zhiming

    2011-01-01

    This paper presents quantum bosonic shortest path searching (QBSPS), a natural, practical and highly heuristic physical algorithm for reasoning about the recognition of network structure via quantum dynamics. QBSPS is based on an Anderson-like itinerant bosonic system in which a boson's Green function is used as a navigation pointer to accurately approach the terminals. QBSPS is demonstrated by rigorous mathematical and physical proofs and numerous simulations, showing how it can be used as a greedy routing scheme to find the shortest path between different locations. Methodologically, it is an interesting new algorithm rooted in quantum mechanics rather than combinatorics. In practice, for the all-pairs shortest-path problem in a random scale-free network with N vertices, QBSPS runs in O(μ(N) ln ln N) time. In application, we suggest that the corresponding experimental realizations are feasible by considering path searching in quantum optical communication networks; in this situation, the method performs a pure local search on networks without requiring the global structure that is necessary for current graph algorithms.

  3. Supersymmetric transformation approach to pseudopotentials in condensed matter physics and bosonic superconductivity in two dimensions

    NASA Astrophysics Data System (ADS)

    Zhu, Wei

    This thesis is divided into two parts. The first part, "Supersymmetric Transformation Approach to Pseudopotentials in Condensed Matter Physics", provides a new method for obtaining pseudopotentials. The conventional methods of constructing pseudopotentials, based on the spirit of the Orthogonalized Plane Wave and Augmented Plane Wave methods, as well as the modern norm-conserving pseudopotentials of density functional theory, are first reviewed. Our new supersymmetric approach aims to eliminate some of the disadvantages while fully retaining advantages such as phase equivalence and the norm-conserving property of the pseudopotentials. Vast amounts of numerical computation can be eliminated compared to the old methods. Details and examples are given. Part two, "Bosonic Superconductivity in Two Dimensions", describes a theory of high-Tc superconductivity aimed at the current cuprate superconductors. The current status of the cuprates is first reviewed. A one-band Hubbard model is used to formulate the interaction among the holes doped into the layered compounds. Tightly bound pairs of size on the order of a few lattice spacings are obtained based on the Antiferromagnetic Background Approximation. They are shown to have d_{x^2-y^2} symmetry. Such boson-like pairs form the basis of charged-boson models. After reviewing the properties of an ideal charged Bose gas, including a perfect Meissner effect in 3D and a nearly perfect Meissner effect in 2D, we develop a theory of high-Tc superconductivity without interlayer coupling, adapted on the one hand from Friedberg and Lee's mixed boson-fermion model in 2D and, on the other hand, from May's work on two-dimensional ideal charged bosons. In addition to the critical temperature T_May for the transition to a phase exhibiting a near-perfect Meissner effect, a new transition temperature T_c, depending on the finite area of the system and the temperature-dependent coherence length, is introduced. The appearance

  4. Double occupancy in dynamical mean-field theory and the dual boson approach

    NASA Astrophysics Data System (ADS)

    van Loon, Erik G. C. P.; Krien, Friedrich; Hafermann, Hartmut; Stepanov, Evgeny A.; Lichtenstein, Alexander I.; Katsnelson, Mikhail I.

    2016-04-01

    We discuss the calculation of the double occupancy using dynamical mean-field theory in finite dimensions. The double occupancy can be determined from the susceptibility of the auxiliary impurity model or from the lattice susceptibility. The former method typically overestimates, whereas the latter underestimates the double occupancy. We illustrate this for the square-lattice Hubbard model. We propose an approach for which both methods lead to identical results by construction and which resolves this ambiguity. This self-consistent dual boson scheme results in a double occupancy that is numerically close to benchmarks available in the literature.

  5. Non-equilibrium slave bosons approach to quantum pumping in interacting quantum dots

    NASA Astrophysics Data System (ADS)

    Citro, Roberta; Romeo, Francesco

    2016-03-01

    We review a time-dependent slave-boson approach within the non-equilibrium Green's function technique to analyze charge and spin pumping in a strongly interacting quantum dot. We study the pumped current as a function of the pumping phase and of the dot energy level and show that a parasitic current arises, beyond the pure pumping one, as an effect of the dynamical constraints. We finally illustrate an all-electrical means of spin pumping and discuss its relevance for spintronics applications.

  6. A flexible approach to genome map assembly

    SciTech Connect

    Harley, E.; Bonner, A.J.

    1994-12-31

    A major goal of the Human Genome Project is to construct detailed physical maps of the human genome. A physical map is an assignment of DNA fragments to their locations on the genome. Complete maps of large genomes require the integration of many kinds of experimental data, each with its own forms of noise and experimental error. To facilitate this integration, we are developing a flexible approach to map assembly based on logic programming and data visualization. Logic programming provides a convenient and mathematically rigorous way of reasoning about data, while data visualization provides layout algorithms for assembling and displaying genome maps. To demonstrate the approach, this paper describes numerous rules for map assembly implemented in a data-visualization system called Hy+. Using these rules, we have successfully assembled contigs (partial maps) from real and simulated mapping data: data that is noisy, imprecise, and contradictory. The main advantage of the approach is that it allows a user to rapidly develop, implement, and test new rules for genome map assembly with a minimum of programming effort.

  7. Exciton-exciton scattering: Composite boson versus elementary boson

    NASA Astrophysics Data System (ADS)

    Combescot, M.; Betbeder-Matibet, O.; Combescot, R.

    2007-05-01

    This paper shows the necessity of introducing a quantum object, the “coboson,” to properly describe, through a fermion scheme, any composite particle, such as the exciton, which is made of two fermions. Although commonly dealt with as elementary bosons, these composite bosons—cobosons in short—differ from them due to their composite nature which makes the handling of their many-body effects quite different from the existing treatments valid for elementary bosons. As a direct consequence of this composite nature, there is no correct way to describe the interaction between cobosons as a potential V . This is rather dramatic because, with the Hamiltonian not written as H=H0+V , all the usual approaches to many-body effects fail. In particular, the standard form of the Fermi golden rule, written in terms of V , cannot be used to obtain the transition rates of two cobosons. To get them, we have had to construct an unconventional expression for this Fermi golden rule in which H only appears. Making use of this expression, we give here a detailed calculation of the time evolution of two excitons. We compare the results of this exact approach with the ones obtained by using an effective bosonic Hamiltonian in which the excitons are considered as elementary bosons with effective scatterings between them, these scatterings resulting from an elaborate mapping between the two-fermion space and the ideal boson space. We show that the relation between the inverse lifetime and the sum of the transition rates for elementary bosons differs from the one of the composite bosons by a factor of 1/2 , so that it is impossible to find effective scatterings between bosonic excitons giving these two physical quantities correctly, whatever the mapping from composite bosons to elementary bosons is. The present paper thus constitutes a strong mathematical proof that, in spite of a widely spread belief, we cannot forget the composite nature of these cobosons, even in the extremely low

  8. A Tangible Approach to Concept Mapping

    NASA Astrophysics Data System (ADS)

    Tanenbaum, Karen; Antle, Alissa N.

    2009-05-01

    The Tangible Concept Mapping project investigates using a tangible user interface to engage learners in concept map creation. This paper describes a prototype implementation of the system, presents some preliminary analysis of its ease of use and effectiveness, and discusses how elements of tangible interaction support concept mapping by helping users organize and structure their knowledge about a domain. The role of physical engagement and embodiment in supporting the mental activity of creating the concept map is explored as one of the benefits of a tangible approach to learning.

  9. Microscopic calculation of interacting boson model parameters by potential-energy surface mapping

    SciTech Connect

    Bentley, I.; Frauendorf, S.

    2011-06-15

    A coherent state technique is used to generate an interacting boson model (IBM) Hamiltonian energy surface which is adjusted to match a mean-field energy surface. This technique allows the calculation of IBM Hamiltonian parameters, prediction of properties of low-lying collective states, as well as the generation of probability distributions of various shapes in the ground state of transitional nuclei, the last two of which are of astrophysical interest. The results for krypton, molybdenum, palladium, cadmium, gadolinium, dysprosium, and erbium nuclei are compared with experiment.

  10. A Statistical Approach for Ambiguous Sequence Mappings

    Technology Transfer Automated Retrieval System (TEKTRAN)

    When attempting to map RNA sequences to a reference genome, high percentages of short sequence reads are often assigned to multiple genomic locations. One approach to handling these “ambiguous mappings” has been to discard them. This results in a loss of data, which can sometimes be as much as 45% o...

  11. Slave-boson mean-field theory versus variational-wave-function approach for the periodic Anderson model

    NASA Astrophysics Data System (ADS)

    Yang, Min-Fong; Sun, Shih-Jye; Hong, Tzay-Ming

    1993-12-01

    We show that a special kind of slave-boson mean-field approximation, which allows for the symmetry-broken states appropriate for a bipartite lattice, can give essentially the same results as those of the variational-wave-function approach proposed by Gulácsi, Strack, and Vollhardt [Phys. Rev. B 47, 8594 (1993)]. The advantages of our approach are briefly discussed.

  12. An automated approach to flood mapping

    NASA Astrophysics Data System (ADS)

    Sun, Weihua; Mckeown, Donald M.; Messinger, David W.

    2012-10-01

    Heavy rain from Tropical Storm Lee resulted in a major flood event for the southern tier of New York State in early September 2011 causing evacuation of approximately 20,000 people in and around the city of Binghamton. In support of the New York State Office of Emergency Management, a high resolution multispectral airborne sensor (WASP) developed by RIT was deployed over the flooded area to collect aerial images. One of the key benefits of these images is their provision for flood inundation area mapping. However, these images require a significant amount of storage space and the inundation mapping process is conventionally carried out using manual digitization. In this paper, we design an automated approach for flood inundation mapping from the WASP airborne images. This method employs Spectral Angle Mapper (SAM) for color RGB or multispectral aerial images to extract the flood binary map; then it uses a set of morphological processing and a boundary vectorization technique to convert the binary map into a shapefile. This technique is relatively fast and only requires the operator to select one pixel on the image. The generated shapefile is much smaller than the original image and can be imported to most GIS software packages. This enables critical flood information to be shared with and by disaster response managers very rapidly, even over cellular phone networks.
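
    The Spectral Angle Mapper step described above reduces, per pixel, to the angle between that pixel's spectrum and the spectrum of the single user-selected pixel, followed by a threshold. A minimal sketch follows; it is not the WASP pipeline, and the array layout and threshold value are illustrative assumptions.

```python
import numpy as np

def spectral_angle_map(image, reference):
    """Per-pixel spectral angle (radians) between an (H, W, B) image
    and a length-B reference spectrum."""
    img = image.astype(float)
    ref = np.asarray(reference, dtype=float)
    dot = np.tensordot(img, ref, axes=([2], [0]))           # (H, W)
    norms = np.linalg.norm(img, axis=2) * np.linalg.norm(ref)
    cos = np.clip(dot / np.maximum(norms, 1e-12), -1.0, 1.0)
    return np.arccos(cos)

def flood_mask(image, water_pixel, threshold=0.1):
    """Binary flood map: True where the spectral angle to the selected
    water pixel's spectrum is below the threshold (radians)."""
    return spectral_angle_map(image, water_pixel) < threshold
```

    The resulting binary mask is what the morphological cleanup and boundary vectorization steps would then convert into a shapefile.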

  13. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

Recent proposals of boson sampling and the corresponding experiments point toward a possible disproof of the extended Church-Turing thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. Until now, however, only small-scale experiments with a few photons have been performed successfully. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A recent theoretical proposal for large-scale boson sampling using microwave photons is highly promising because of its deterministic photon sources and scalability. A certification protocol for large-scale boson sampling experiments is therefore needed to complete the picture. In this presentation, we propose a computational protocol towards the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic function, together with its Fourier components, can reveal the fingerprint of large-scale boson sampling. This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and a Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.

  14. Thermodynamics of a One-Dimensional System of Point Bosons: Comparison of the Traditional Approach with a New One

    NASA Astrophysics Data System (ADS)

    Tomchenko, Maksim

    2017-01-01

We compare two approaches to the construction of the thermodynamics of a one-dimensional periodic system of spinless point bosons: the Yang-Yang approach and a new approach proposed by the author. In the latter, the elementary excitations are introduced so that there is only one type of excitation (as opposed to Lieb's approach with two types, particle-like and hole-like). At weak coupling, these are excitations of the Bogolyubov type. The equations for the thermodynamic quantities in the two approaches are different, but their solutions coincide (this is shown below and is the main result). Moreover, the new approach is simpler. An important point is that the thermodynamic formulae in the new approach are, for any values of the parameters, formulae for an ensemble of quasiparticles with Bose statistics, whereas the formulae in the traditional Yang-Yang approach have a Fermi-like one-particle form.

  15. Hydrochromic Approaches to Mapping Human Sweat Pores.

    PubMed

    Park, Dong-Hoon; Park, Bum Jun; Kim, Jong-Man

    2016-06-21

    colorimetric change near body temperature. This feature enables the use of this technique to generate high-quality images of sweat pores. This Account also focuses on the results of the most recent phase of this investigation, which led to the development of a simple yet efficient and reliable technique for sweat pore mapping. The method utilizes a hydrophilic polymer composite film containing fluorescein, a commercially available dye that undergoes a fluorometric response as a result of water-dependent interconversion between its ring-closed spirolactone (nonfluorescent) and ring-opened fluorone (fluorescent) forms. Surface-modified carbon nanodots (CDs) have also been found to be efficient for hydrochromic mapping of human sweat pores. The results discovered by Lou et al. [ Adv. Mater. 2015 , 27 , 1389 ] are also included in this Account. Sweat pore maps obtained from fingertips using these materials were found to be useful for fingerprint analysis. In addition, this hydrochromism-based approach is sufficiently sensitive to enable differentiation between sweat-secreting active pores and inactive pores. As a result, the techniques can be applied to clinical diagnosis of malfunctioning sweat pores. The directions that future research in this area will follow are also discussed.

  16. Composite-boson approach to molecular Bose-Einstein condensates in mixtures of ultracold Fermi gases

    NASA Astrophysics Data System (ADS)

    Bouvrie, P. Alexander; Tichy, Malte C.; Roditi, Itzhak

    2017-02-01

We show that an ansatz based on independent composite bosons [Phys. Rep. 463, 215 (2008), 10.1016/j.physrep.2007.11.003] accurately describes the condensate fraction of molecular Bose-Einstein condensates in ultracold Fermi gases. The entanglement between the fermionic constituents of a single Feshbach molecule then governs the many-particle statistics of the condensate, from the limit of strong interaction to close to unitarity. This result strengthens the role of entanglement as the indispensable driver of composite-boson behavior. The condensate fraction of fermion pairs at zero temperature that we compute agrees excellently with previous results obtained by means of fixed-node diffusion Monte Carlo methods and the Bogoliubov depletion approximation. This paves the way towards the exploration of the BEC-BCS crossover physics in mixtures of cold Fermi gases with an arbitrary number of fermion pairs as well as the implementation of Hong-Ou-Mandel-like interference experiments proposed within coboson theory.

  17. New approach for anti-normally and normally ordering bosonic-operator functions in quantum optics

    NASA Astrophysics Data System (ADS)

    Xu, Shi-Min; Zhang, Yun-Hai; Xu, Xing-Lei; Li, Hong-Qi; Wang, Ji-Suo

    2016-12-01

    In this paper, we provide a new kind of operator formula for anti-normally and normally ordering bosonic-operator functions in quantum optics, which can help us arrange a bosonic-operator function f(λQ̂ + νP̂) in its anti-normal and normal ordering conveniently. Furthermore, mutual transformation formulas between anti-normal ordering and normal ordering, which have good universality, are derived too. Based on these operator formulas, some new differential relations and some useful mathematical integral formulas are easily derived without really performing these integrations. Project supported by the Natural Science Foundation of Shandong Province, China (Grant No. ZR2015AM025) and the Natural Science Foundation of Heze University, China (Grant No. XY14PY02).

  18. Learning topological maps: An alternative approach

    SciTech Connect

    Buecken, A.; Thrun, S.

    1996-12-31

Our goal is autonomous real-time control of a mobile robot. In this paper we show how topological maps of a large-scale indoor environment can be learned autonomously. In the literature there are two paradigms for storing information about a robot's environment: as a grid-based (geometric) map or as a topological map. While grid-based maps are comparatively easy to learn and maintain, topological maps are quite compact and facilitate fast motion planning.

  19. New approach to the resummation of logarithms in Higgs-boson decays to a vector quarkonium plus a photon

    NASA Astrophysics Data System (ADS)

    Bodwin, Geoffrey T.; Chung, Hee Sok; Ee, June-Haak; Lee, Jungil

    2017-03-01

    We present a calculation of the rates for Higgs-boson decays to a vector heavy-quarkonium state plus a photon, where the heavy-quarkonium states are the J /ψ and the ϒ (n S ) states, with n =1 , 2, or 3. The calculation is carried out in the light-cone formalism, combined with nonrelativistic QCD factorization, and is accurate at leading order in mQ2/mH2, where mQ is the heavy-quark mass and mH is the Higgs-boson mass. The calculation contains corrections through next-to-leading order in the strong-coupling constant αs and the square of the heavy-quark velocity v , and includes a resummation of logarithms of mH2/mQ2 at next-to-leading logarithmic accuracy. We have developed a new method, which makes use of Abel summation, accelerated through the use of Padé approximants, to deal with divergences in the resummed expressions for the quarkonium light-cone distribution amplitudes. This approach allows us to make definitive calculations of the resummation effects. Contributions from the order-αs and order-v2 corrections to the light-cone distribution amplitudes that we obtain with this new method differ substantially from the corresponding contributions that one obtains from a model light-cone distribution amplitude [M. König and M. Neubert, J. High Energy Phys. 08 (2015) 012, 10.1007/JHEP08(2015)012]. Our results for the real parts of the direct-process amplitudes are considerably smaller than those from one earlier calculation [G. T. Bodwin, H. S. Chung, J.-H. Ee, J. Lee, and F. Petriello, Phys. Rev. D 90, 113010 (2014), 10.1103/PhysRevD.90.113010], reducing the sensitivity to the Higgs-boson-heavy-quark couplings, and are somewhat smaller than those from another earlier calculation [M. König and M. Neubert, J. High Energy Phys. 08 (2015) 012, 10.1007/JHEP08(2015)012]. However, our results for the standard-model Higgs-boson branching fractions are in good agreement with those in M. König and M. Neubert, J. High Energy Phys. 08 (2015) 012, 10.1007/JHEP08(2015)012.

  20. Friedel sum rules for one- and two-channel Kondo models and unitarity paradox via bosonization-refermionization approach

    NASA Astrophysics Data System (ADS)

    Kharitonov, Maxim; Andrei, Natan; Coleman, Piers

    2013-03-01

    We calculate the single-particle Green's functions and scattering amplitudes of the one-channel and channel-anisotropic two-channel Kondo models at the Toulouse and Emery-Kivelson lines, respectively, where exact solutions via the bosonization-refermionization approach are admitted. We demonstrate that in this approach the Friedel sum rules - the relations between the trapped spin and ``flavor'' moments and the scattering phase shifts in the Fermi-liquid regime - arise naturally, and we elucidate their subtleties. We also recover the ``unitarity paradox'' - the vanishing of the single-particle scattering amplitude at the channel-symmetric point of the two-channel Kondo model - stemming from non-Fermi-liquid behavior. We discuss the implications of these results for the development of composite pairing in heavy fermion systems. This work was supported by National Science Foundation grants DMR 0907179 (MK, PC) and DMR 1006684 (NA).

  1. Combinatorial approach to generalized Bell and Stirling numbers and boson normal ordering problem

    SciTech Connect

    Mendez, M.A.; Blasiak, P.; Penson, K.A.

    2005-08-01

    We consider the numbers arising in the problem of normal ordering of expressions in the boson creation a† and annihilation a operators ([a, a†] = 1). We treat a general form of a boson string (a†)^{r_n} a^{s_n} ... (a†)^{r_2} a^{s_2} (a†)^{r_1} a^{s_1}, which is shown to be associated with generalizations of the Stirling and Bell numbers. Recurrence relations and closed-form expressions (Dobiński-type formulas) are obtained for these quantities by both algebraic and combinatorial methods. By extensive use of methods of combinatorial analysis we prove the equivalence of the aforementioned problem to the enumeration of special families of graphs. This link provides a combinatorial interpretation of the numbers arising in this normal-ordering problem.
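The simplest instance of this normal-ordering problem is the classical identity (a†a)^n = Σ_k S(n,k) (a†)^k a^k, where the S(n,k) are Stirling numbers of the second kind and their sum over k gives the Bell numbers. A minimal sketch of the standard recurrence (the ordinary numbers only, not the paper's generalizations for arbitrary boson strings):

```python
def stirling2(n, k):
    """Stirling numbers of the second kind via the standard recurrence
    S(n, k) = k*S(n-1, k) + S(n-1, k-1), with S(0, 0) = 1.
    These are the coefficients in (a† a)^n = sum_k S(n, k) (a†)^k a^k."""
    if n == k == 0:
        return 1
    if n == 0 or k == 0:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

def bell(n):
    """Bell number B(n): total count of set partitions, B(n) = sum_k S(n, k)."""
    return sum(stirling2(n, k) for k in range(n + 1))
```

For example, stirling2(4, 2) returns 7 and bell(5) returns 52, matching the classical tables.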

  2. On the multi-layer multi-configurational time-dependent Hartree approach for bosons and fermions

    NASA Astrophysics Data System (ADS)

    Manthe, Uwe; Weike, Thomas

    2017-02-01

    A multi-layer multi-configurational time-dependent Hartree (MCTDH) approach using a second quantization representation (SQR) based on optimized time-dependent orbitals is introduced. The approach combines elements of the multi-layer MCTDH-SQR approach of Wang and Thoss, which employs a preselected time-independent orbital basis, and the MCTDH for bosons and multi-configuration time-dependent Hartree-Fock approaches, which do not use multi-layering but employ time-dependent orbital bases. In contrast to existing MCTDH-type approaches, the results of the present approach for a given number of configurations are not invariant with respect to unitary transformations of the time-dependent orbital basis. Thus a natural orbital representation is chosen to achieve fast convergence with respect to the number of configurations employed. Equations of motion for the present ansatz, called (multi-layer) MCTDH in optimized second quantization representation, are derived. Furthermore, a scheme for the calculation of optimized unoccupied single-particle functions is given which can be used to avoid singularities in the equations of motion.

  3. Quantitative Genetic Interaction Mapping Using the E-MAP Approach

    PubMed Central

    Collins, Sean R.; Roguev, Assen; Krogan, Nevan J.

    2010-01-01

    Genetic interactions represent the degree to which the presence of one mutation modulates the phenotype of a second mutation. In recent years, approaches for measuring genetic interactions systematically and quantitatively have proven to be effective tools for unbiased characterization of gene function and have provided valuable data for analyses of evolution. Here, we present protocols for systematic measurement of genetic interactions with respect to organismal growth rate for two yeast species. PMID:20946812
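As a concrete illustration of what such a quantitative interaction measures, a common definition takes the deviation of the observed double-mutant fitness from a multiplicative null expectation. This is a hedged sketch of that idea only, not the E-MAP scoring pipeline itself, which additionally normalizes by measurement noise:

```python
def interaction_score(w_ab, w_a, w_b):
    """Genetic interaction as deviation of observed double-mutant fitness
    from the multiplicative expectation: eps = W_ab - W_a * W_b.
    Negative eps suggests synthetic sickness/lethality; positive eps
    suggests suppression or epistasis. (Illustrative null model only.)"""
    return w_ab - w_a * w_b
```

E.g. single-mutant fitnesses of 0.8 and 0.5 predict a double-mutant fitness of 0.4 under the null; an observed fitness of 0.3 yields a negative (aggravating) score.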

  4. Experimental scattershot boson sampling

    PubMed Central

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J.; Galvão, Ernesto F.; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-01-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of the quantum computational supremacy. PMID:26601164

  5. Exact results in a slave boson saddle point approach for a strongly correlated electron model

    SciTech Connect

    Fresard, Raymond; Kopp, Thilo

    2008-08-15

    We revisit the Kotliar-Ruckenstein (KR) slave boson saddle point evaluation for a two-site correlated electron model. As the model can be solved analytically, it is possible to compare the KR saddle point results with the exact many-particle levels. The considered two-site cluster mimics an infinite-U single-impurity Anderson model with a nearest-neighbor Coulomb interaction: one site is strongly correlated with an infinite local Coulomb repulsion, which hybridizes with the second site, on which the local Coulomb repulsion vanishes. Making use of the flexibility of the representation, we introduce appropriate weight factors in the KR saddle point scheme. Ground-state and all excitation levels agree with the exact diagonalization results. Thermodynamics and correlation functions may be recovered in a suitably renormalized saddle point evaluation.

  6. Improving long time behavior of Poisson bracket mapping equation: A non-Hamiltonian approach

    SciTech Connect

    Kim, Hyun Woo; Rhee, Young Min

    2014-05-14

    Understanding nonadiabatic dynamics in complex systems is a challenging subject. A series of semiclassical approaches have been proposed to tackle the problem in various settings. The Poisson bracket mapping equation (PBME) utilizes a partial Wigner transform and a mapping representation for its formulation, and has been developed to describe nonadiabatic processes in an efficient manner. Operationally, it is expressed as a set of Hamilton's equations of motion, similar to more conventional classical molecular dynamics. However, this original Hamiltonian PBME sometimes suffers from a large deviation in accuracy especially in the long time limit. Here, we propose a non-Hamiltonian variant of PBME to improve its behavior especially in that limit. As a benchmark, we simulate spin-boson and photosynthetic model systems and find that it consistently outperforms the original PBME and its Ehrenfest style variant. We explain the source of this improvement by decomposing the components of the mapping Hamiltonian and by assessing the energy flow between the system and the bath. We discuss strengths and weaknesses of our scheme with a viewpoint of offering future prospects.

  7. Comparison of Mixed-Model Approaches for Association Mapping

    PubMed Central

    Stich, Benjamin; Möhring, Jens; Piepho, Hans-Peter; Heckenberger, Martin; Buckler, Edward S.; Melchinger, Albrecht E.

    2008-01-01

    Association-mapping methods promise to overcome the limitations of linkage-mapping methods. The main objectives of this study were to (i) evaluate various methods for association mapping in the autogamous species wheat using an empirical data set, (ii) determine a marker-based kinship matrix using a restricted maximum-likelihood (REML) estimate of the probability of two alleles at the same locus being identical in state but not identical by descent, and (iii) compare the results of association-mapping approaches based on adjusted entry means (two-step approaches) with the results of approaches in which the phenotypic data analysis and the association analysis were performed in one step (one-step approaches). On the basis of the phenotypic and genotypic data of 303 soft winter wheat (Triticum aestivum L.) inbreds, various association-mapping methods were evaluated. Spearman's rank correlation between P-values calculated on the basis of one- and two-stage association-mapping methods ranged from 0.63 to 0.93. The mixed-model association-mapping approaches using a kinship matrix estimated by REML are more appropriate for association mapping than the recently proposed QK method with respect to (i) adherence to the nominal α-level and (ii) the adjusted power for detection of quantitative trait loci. Furthermore, we showed that our data set could be analyzed using two-step approaches of the proposed association-mapping method without substantially increasing the empirical type I error rate in comparison to the corresponding one-step approaches. PMID:18245847
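To illustrate what a marker-based kinship matrix is, the sketch below computes the simple identity-in-state fraction between pairs of inbred lines. The REML-based estimator in the study further corrects identity in state for alleles that are not identical by descent; that correction is omitted here, so this is only an illustrative starting point:

```python
import numpy as np

def kinship_matrix(markers):
    """Naive marker-based kinship: fraction of loci at which two inbred
    lines carry the same allele (identity in state). `markers` is an
    (n_lines, n_loci) array of 0/1 allele codes. Entry (i, j) is 1.0 for
    identical lines and near 0.0 for lines differing at every locus."""
    m = np.asarray(markers, dtype=float)
    n_loci = m.shape[1]
    # count loci where both carry allele 1, plus loci where both carry allele 0
    same = m @ m.T + (1 - m) @ (1 - m).T
    return same / n_loci
```

In a mixed-model analysis this matrix (after the REML correction) structures the covariance of the random genetic effects.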

  8. Recent developments in MAP - MODULAR APPROACH to PHYSICS

    NASA Astrophysics Data System (ADS)

    Rae, Jennifer; Austen, Dave; Brouwer, Wytze

    2002-05-01

    We present recent developments in MAP (MODULAR APPROACH to PHYSICS): JAVA-enhanced modules to be used as aids in teaching the first three terms of university physics. The MAP project is very comprehensive and consists of a modular approach to physics that utilizes JAVA applets, FLASH animations and HTML-based tutorials. The overall instructional philosophy of MAP is constructivist, and the project emphasizes active learner participation. In this talk we provide a quick overview of the project and the results of recent pilot testing at several Canadian universities. We also discuss the VIDEO LAB component, which is integrated into MAP and permits students to capture and evaluate otherwise difficult-to-study phenomena on video.

  9. A contact map matching approach to protein structure similarity analysis.

    PubMed

    de Melo, Raquel C; Lopes, Carlos Eduardo R; Fernandes, Fernando A; da Silveira, Carlos Henrique; Santoro, Marcelo M; Carceroni, Rodrigo L; Meira, Wagner; Araújo, Arnaldo de A

    2006-06-30

    We modeled the problem of identifying how close two proteins are structurally by measuring the dissimilarity of their contact maps. These contact maps are colored images, in which the chromatic information encodes the chemical nature of the contacts. We studied two conceptually distinct image-processing algorithms to measure the dissimilarity between these contact maps; one was a content-based image retrieval method, and the other was based on image registration. In experiments with contact maps constructed from the protein data bank, our approach was able to identify, with greater than 80% precision, instances of monomers of apolipoproteins, globins, plastocyanins, retinol binding proteins and thioredoxins, among the monomers of Protein Data Bank Select. The image registration approach was only slightly more accurate than the content-based image retrieval approach.
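As an illustration of the underlying data structure, a contact map can be built from residue coordinates with a distance cutoff. The sketch below produces only a binary map, whereas the study uses colored maps that additionally encode the chemical nature of each contact; the 8 Å cutoff is an illustrative convention, not taken from the paper:

```python
import numpy as np

def contact_map(coords, cutoff=8.0):
    """Binary residue-residue contact map from (N, 3) C-alpha coordinates:
    entry (i, j) is True when residues i and j lie within `cutoff`
    angstroms of each other. The result is symmetric with a True diagonal."""
    # pairwise distance matrix via broadcasting: (N, 1, 3) - (1, N, 3)
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    return dist <= cutoff
```

Two such maps, rendered as images, are what the retrieval- and registration-based dissimilarity measures in the abstract would then compare.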

  10. Optogenetic Approaches for Mesoscopic Brain Mapping.

    PubMed

    Kyweriga, Michael; Mohajerani, Majid H

    2016-01-01

    Recent advances in identifying genetically unique neuronal proteins have revolutionized the study of brain circuitry. Researchers are now able to insert specific light-sensitive proteins (opsins) into a wide range of specific cell types via viral injections or by breeding transgenic mice. These opsins enable the activation, inhibition, or modulation of neuronal activity with millisecond control within distinct brain regions defined by genetic markers. Here we present a useful guide to implementing this technique in any lab. We first review the materials needed and practical considerations, and provide in-depth instructions for acute surgeries in mice. We conclude with all-optical mapping techniques for simultaneous recording and manipulation of the population activity of many neurons in vivo, combining arbitrary-point optogenetic stimulation and regional voltage-sensitive dye imaging. It is our intent to make these methods available to anyone wishing to use them.

  11. Tank Update System: A novel asset mapping approach for verifying and updating lakes using Google Maps

    NASA Astrophysics Data System (ADS)

    Reddy Pulsani, Bhaskar

    2016-06-01

    Mission Kakatiya is one of the prestigious programs of the Telangana state government, under which the restoration of tanks across ten districts is being implemented. As part of the program, the government plans to restore about 9,000 lakes. To compile a comprehensive list of the lakes existing in Telangana state, the Samagra Tank Survey was therefore carried out. The data collected in this survey covered about 45,000 tanks. Since the data were not collected in a standard format and were recorded in Excel, a web interface was created to fill the gaps and standardize the data. A new approach to spatially identifying the lakes through Google Maps was successfully implemented by developing a web interface. This approach is uncommon in that it applies asset mapping to the lakes of Telangana state, and it shows the advantages of using online mapping applications such as Google Maps for identifying and cross-checking already existing lakes.

  12. Spiral magnetism in the single-band Hubbard model: the Hartree-Fock and slave-boson approaches

    NASA Astrophysics Data System (ADS)

    Igoshev, P. A.; Timirgazin, M. A.; Gilmutdinov, V. F.; Arzhnikov, A. K.; Irkhin, V. Yu

    2015-11-01

    The ground-state magnetic phase diagram is investigated within the single-band Hubbard model for square and different cubic lattices. The results of employing the generalized non-correlated mean-field (Hartree-Fock) approximation and the generalized slave-boson approach of Kotliar and Ruckenstein, with correlation effects included, are compared. We take into account commensurate ferromagnetic, antiferromagnetic, and incommensurate (spiral) magnetic phases, as well as phase separation into magnetic phases of different types, which was often lacking in previous investigations. It is found that the spiral states and especially ferromagnetism are generally strongly suppressed up to unrealistically large Hubbard U by the correlation effects if nesting is absent and van Hove singularities are well away from the paramagnetic-phase Fermi level. The magnetic phase separation plays an important role in the formation of magnetic states, the corresponding phase regions being especially wide in the vicinity of half-filling. The details of non-collinear and collinear magnetic ordering for different cubic lattices are discussed.

  13. A linear programming approach for optimal contrast-tone mapping.

    PubMed

    Wu, Xiaolin

    2011-05-01

    This paper proposes a novel algorithmic approach of image enhancement via optimal contrast-tone mapping. In a fundamental departure from the current practice of histogram equalization for contrast enhancement, the proposed approach maximizes expected contrast gain subject to an upper limit on tone distortion and optionally to other constraints that suppress artifacts. The underlying contrast-tone optimization problem can be solved efficiently by linear programming. This new constrained optimization approach for image enhancement is general, and the user can add and fine tune the constraints to achieve desired visual effects. Experimental results demonstrate clearly superior performance of the new approach over histogram equalization and its variants.
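In the spirit of the formulation sketched above, contrast-tone mapping can be cast as a linear program: choose per-level tone steps that sum to the output dynamic range, maximizing expected contrast gain under a constraint that limits tone distortion. The sketch below uses scipy's LP solver; the variable names and the `min_step` lower bound (standing in for the paper's tone-distortion constraint) are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np
from scipy.optimize import linprog

def contrast_tone_lp(hist, out_levels, min_step=0.25):
    """LP sketch of contrast-tone optimization: nonnegative tone-step
    sizes s_j (one per input gray level) that sum to the output dynamic
    range `out_levels`, maximizing expected contrast gain sum_j p_j * s_j,
    with a per-step lower bound `min_step` limiting tone distortion.
    Returns the cumulative tone-mapping curve."""
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()                              # gray-level probabilities
    n = len(p)
    res = linprog(c=-p,                          # maximize p.s == minimize -p.s
                  A_eq=np.ones((1, n)), b_eq=[out_levels],
                  bounds=[(min_step, None)] * n,
                  method="highs")
    return np.cumsum(res.x)                      # monotone mapping curve
```

Frequent gray levels receive large steps (stretched contrast) while rare levels are compressed down to the distortion floor, which is the qualitative behavior histogram equalization only approximates.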

  14. Driven Boson Sampling

    NASA Astrophysics Data System (ADS)

    Barkhofen, Sonja; Bartley, Tim J.; Sansoni, Linda; Kruse, Regina; Hamilton, Craig S.; Jex, Igor; Silberhorn, Christine

    2017-01-01

    Sampling the distribution of bosons that have undergone a random unitary evolution is strongly believed to be a computationally hard problem. Key to outperforming classical simulations of this task is to increase both the number of input photons and the size of the network. We propose driven boson sampling, in which photons are input within the network itself, as a means to approach this goal. We show that the mean number of photons entering a boson sampling experiment can exceed one photon per input mode, while maintaining the required complexity, potentially leading to less stringent requirements on the input states for such experiments. When using heralded single-photon sources based on parametric down-conversion, this approach offers an ~e-fold enhancement in the input state generation rate over scattershot boson sampling, reaching the scaling limit for such sources. This approach also offers a dramatic increase in the signal-to-noise ratio with respect to higher-order photon generation from such probabilistic sources, which removes the need for photon number resolution during the heralding process as the size of the system increases.

  15. Technology Mapping: An Approach for Developing Technological Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    Technology mapping [TM] is proposed as an approach for developing technological pedagogical content knowledge (TPCK). The study discusses in detail instructional design guidelines in relation to the enactment of TM, and reports on empirical findings from a study with 72 pre-service primary teachers within the context of teaching them how to teach…

  16. Beyond mean-field dynamics of ultra-cold bosonic atoms in higher dimensions: facing the challenges with a multi-configurational approach

    NASA Astrophysics Data System (ADS)

    Bolsinger, V. J.; Krönke, S.; Schmelcher, P.

    2017-02-01

    Exploring the impact of dimensionality on the quantum dynamics of interacting bosons in traps, including particle correlations, is an interesting but challenging task. Due to the different participating length scales, the modelling of the short-range interactions in three dimensions plays a special role. We review different approaches for the latter and elaborate that, for multi-configurational computational strategies, finite-range potentials are adequate, resulting in the need for large grids to resolve the relevant length scales. This leads to computational challenges, which include the exponential scaling of complexity with the number of atoms. We show that the recently developed ab initio multi-layer multi-configurational time-dependent Hartree method for bosons (ML-MCTDHB) (2013 J. Chem. Phys. 139 134103) can meet both numerical challenges, and we present an efficient numerical implementation of ML-MCTDHB in three spatial dimensions, particularly suited to describing the quantum dynamics of elongated traps. The beneficial scaling of our approach is demonstrated by studying the tunnelling dynamics of bosonic ensembles in a double well. Comparing three-dimensional with quasi-one-dimensional simulations, we find dimensionality-induced effects in the density. Furthermore, we study the crossover from weak transversal confinement, where a mean-field description of the system is sufficient, towards tight transversal confinement, where particle correlations and beyond-mean-field effects are pronounced.

  17. An improved probability mapping approach to assess genome mosaicism

    PubMed Central

    Zhaxybayeva, Olga; Gogarten, J Peter

    2003-01-01

    Background: Maximum likelihood and posterior probability mapping are useful visualization techniques used to ascertain the mosaic nature of prokaryotic genomes. However, posterior probabilities, especially when calculated for four-taxon cases, tend to overestimate the support for tree topologies. Furthermore, because of poor taxon sampling, four-taxon analyses suffer from sensitivity to the long-branch attraction artifact. Here we extend the probability mapping approach by improving taxon sampling of the analyzed datasets, and by using bootstrap support values, a more conservative tool to assess reliability. Results: Quartets of orthologous proteins were complemented with homologs from selected reference genomes. The mapping of bootstrap support values from these extended datasets gives results similar to the original maximum likelihood and posterior probability mapping. The more conservative nature of the plotted support values allows further analyses to focus on those protein families that strongly disagree with the majority or plurality of genes present in the analyzed genomes. Conclusion: Posterior probability is a non-conservative measure of support, and posterior probability mapping provides only a quick estimate of the phylogenetic information content of four genomes. This approach can be utilized as a pre-screen to select genes that might have been horizontally transferred. Better taxon sampling combined with subtree analyses prevents the inconsistencies associated with four-taxon analyses, but retains the power of visual representation. Nevertheless, a case-by-case inspection of individual multi-taxon phylogenies remains necessary to differentiate unrecognized paralogy and shared phylogenetic reconstruction artifacts from horizontal gene transfer events. PMID:12974984

  18. An automated approach to mapping corn from Landsat imagery

    USGS Publications Warehouse

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.
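One way to automate thresholding from areal estimates, in the spirit of the approach described above, is to rank pixels by a corn-likeness index and set cutoffs so the top classes cover roughly the reported corn acreage. The index, the quantile rule, and the buffer width in this sketch are all illustrative assumptions, not the paper's actual method:

```python
import numpy as np

def corn_likelihood_map(index, corn_fraction, band=0.10):
    """Derive class thresholds from an areal estimate: label the top
    `corn_fraction` of pixels (by corn-likeness index) 'highly likely
    corn' (2), the next `band`-wide slice 'likely corn' (1), and the
    rest 'unlikely corn' (0). Both parameters are illustrative."""
    hi_cut = np.quantile(index, 1.0 - corn_fraction)
    lo_cut = np.quantile(index, 1.0 - corn_fraction - band)
    out = np.zeros(index.shape, dtype=int)
    out[index >= lo_cut] = 1   # likely corn buffer
    out[index >= hi_cut] = 2   # highly likely corn
    return out
```

Because the cutoffs come from published acreage statistics rather than ground reference data, no per-image training collection is required, which is the automation benefit the abstract emphasizes.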

  19. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    NASA Astrophysics Data System (ADS)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

    Spatial information on physical soil properties is intensely expected, in order to support environmental related and land use management decisions. One of the most widely used properties to characterize soils physically is particle size distribution (PSD), which determines soil water management and cultivability. According to their size, different particles can be categorized as clay, silt, or sand. The size intervals are defined by national or international textural classification systems. The relative percentage of sand, silt, and clay in the soil constitutes textural classes, which are also specified miscellaneously in various national and/or specialty systems. The most commonly used is the classification system of the United States Department of Agriculture (USDA). Soil texture information is essential input data in meteorological, hydrological and agricultural prediction modelling. Although Hungary has a great deal of legacy soil maps and other relevant soil information, it often occurs, that maps do not exist on a certain characteristic with the required thematic and/or spatial representation. The recent developments in digital soil mapping (DSM), however, provide wide opportunities for the elaboration of object specific soil maps (OSSM) with predefined parameters (resolution, accuracy, reliability etc.). Due to the simultaneous richness of available Hungarian legacy soil data, spatial inference methods and auxiliary environmental information, there is a high versatility of possible approaches for the compilation of a given soil map. This suggests the opportunity of optimization. For the creation of an OSSM one might intend to identify the optimum set of soil data, method and auxiliary co-variables optimized for the resources (data costs, computation requirements etc.). We started comprehensive analysis of the effects of the various DSM components on the accuracy of the output maps on pilot areas. The aim of this study is to compare and evaluate different

  20. Noise pollution mapping approach and accuracy on landscape scales.

    PubMed

    Iglesias Merchan, Carlos; Diaz-Balteiro, Luis

    2013-04-01

    Noise mapping allows the characterization of environmental variables, such as noise pollution or soundscape, depending on the task. Strategic noise mapping (as per Directive 2002/49/EC, 2002) is a tool intended for the assessment of noise pollution at the European level every five years. These maps are based on common methods and procedures intended for human exposure assessment in the European Union that could also be adapted for assessing environmental noise pollution in natural parks. However, given the size of such areas, an alternative approach to soundscape characterization may be preferable to human noise exposure procedures. It is possible to optimize the size of the mapping grid used for such work by taking into account the attributes of the area to be studied and the desired outcome. This would then optimize the mapping time and the cost. This type of optimization is important in noise assessment as well as in the study of other environmental variables. This study compares 15 models, using different grid sizes, to assess the accuracy of the noise mapping of road traffic noise at a landscape scale, with respect to noise and landscape indicators. In a study area located in the Manzanares High River Basin Regional Park in Spain, different accuracy levels (Kappa index values from 0.725 to 0.987) were obtained depending on the terrain and noise source properties. The time taken for the calculations and the noise mapping accuracy results reveal the potential for setting the map resolution in line with decision-makers' criteria and budget considerations.
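
The Kappa index cited above measures agreement between two classified maps beyond chance. This is a generic Cohen's kappa sketch on toy labels (two co-registered maps flattened to lists), not the authors' exact evaluation code.

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    classes = set(labels_a) | set(labels_b)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    expected = sum((labels_a.count(c) / n) * (labels_b.count(c) / n)
                   for c in classes)
    return (observed - expected) / (1 - expected)

# Hypothetical noise classes on six co-registered grid cells.
fine   = ["quiet", "quiet", "noisy", "noisy", "noisy", "quiet"]
coarse = ["quiet", "quiet", "noisy", "noisy", "quiet", "quiet"]
print(round(cohens_kappa(fine, coarse), 3))  # 0.667
```

Values near 1 (like the 0.987 reported above) indicate that a coarser, cheaper grid reproduces the reference map almost exactly.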

  1. Associated production of a quarkonium and a Z boson at one loop in a quark-hadron-duality approach

    NASA Astrophysics Data System (ADS)

    Lansberg, Jean-Philippe; Shao, Hua-Sheng

    2016-10-01

    In view of the large discrepancy between the ATLAS data at √s = 8 TeV and theoretical predictions for Single Parton Scattering (SPS) contributions to the associated production of a prompt J/ψ and a Z boson, we perform an evaluation of the corresponding cross section at one-loop accuracy (Next-to-Leading Order, NLO) in a quark-hadron-duality approach, also known as the Colour-Evaporation Model (CEM). This work is motivated by (i) the extremely disparate predictions based on the existing NRQCD fits conjugated with the absence of a full NLO NRQCD computation and (ii) our belief that such an evaluation provides a likely upper limit of the SPS cross section. In addition to these theory improvements, we argue that the ATLAS estimation of the Double Parton Scattering (DPS) yield may be underestimated by a factor as large as 3, which then reduces the size of the SPS yield extracted from the ATLAS data. Our NLO SPS evaluation also allows us to set an upper limit on σ_eff, which drives the size of the DPS yield. Overall, the discrepancy between theory and experiment may be smaller than expected, which calls for further analyses by ATLAS and CMS, for which we provide predictions, and for full NLO computations in other models. As an interesting side product of our analysis, we have performed the first NLO computation of dσ/dP_T for prompt single-J/ψ production in the CEM, from which we have fit the CEM non-perturbative parameter at NLO using the most recent ATLAS data.

  2. Interacting Boson Model and nucleons

    NASA Astrophysics Data System (ADS)

    Otsuka, Takaharu

    2012-10-01

    An overview of the recent development of the microscopic derivation of the Interacting Boson Model is presented, with some remarks not found elsewhere. The OAI mapping is reviewed very briefly, including the basic correspondence from nucleon pair to boson. The new fermion-boson mapping method is introduced, in which intrinsic states of nucleons and bosons for a wide variation of shapes play an important role. Nucleon intrinsic states are obtained from mean-field models, specifically the Skyrme model in the examples shown. This method generates an IBM-2 Hamiltonian which can describe and predict various situations of quadrupole collective states, including the U(5), SU(3), O(6) and E(5) limits. The method is extended so that rotational response (cranking) can be handled, which enables us to describe rotational bands of strongly deformed nuclei. Thus, we have obtained a unified framework for the microscopic derivation of the IBM covering all known situations of quadrupole collectivity at low energy.

  3. A Bayesian approach to traffic light detection and mapping

    NASA Astrophysics Data System (ADS)

    Hosseinyalamdary, Siavash; Yilmaz, Alper

    2017-03-01

    Automatic traffic light detection and mapping is an open research problem. Traffic lights vary in color, shape, geolocation, activation pattern, and installation, which complicates their automated detection. In addition, the image of a traffic light may be noisy, overexposed, underexposed, or occluded. In order to address this problem, we propose a Bayesian inference framework to detect and map traffic lights. In addition to the spatio-temporal consistency constraint, traffic light characteristics such as color, shape and height are shown to further improve the accuracy of the proposed approach. The proposed approach has been evaluated on two benchmark datasets and has been shown to outperform earlier studies. The results show that the precision and recall rates for the KITTI benchmark are 95.78% and 92.95%, respectively, and the precision and recall rates for the LARA benchmark are 98.66% and 94.65%.
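
The precision and recall rates reported above follow from counting matched detections. A minimal sketch on hypothetical detection IDs (the IDs and matching-by-identifier scheme are illustrative, not the benchmark's actual matching protocol):

```python
def precision_recall(detections, ground_truth):
    """Precision and recall from sets of matched object IDs: true positives
    are detections that correspond to an annotated traffic light."""
    tp = len(detections & ground_truth)
    return tp / len(detections), tp / len(ground_truth)

detected  = {"tl1", "tl2", "tl3", "tl5"}   # hypothetical detector output
annotated = {"tl1", "tl2", "tl3", "tl4"}   # hypothetical benchmark labels
print(precision_recall(detected, annotated))  # (0.75, 0.75)
```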

  4. Wildfire susceptibility mapping: comparing deterministic and stochastic approaches

    NASA Astrophysics Data System (ADS)

    Pereira, Mário; Leuenberger, Michael; Parente, Joana; Tonini, Marj

    2016-04-01

    Estimating the probability of wildfire occurrence in a certain area under particular environmental conditions represents a modern tool to support forest protection plans and to reduce the consequences of fires. This can be performed through wildfire susceptibility mapping, normally achieved by employing more or less sophisticated models which combine the predisposing variables (as raster datasets) within a geographic information system (GIS). The selection of the appropriate variables includes the evaluation of success and the implementation of prediction curves, as well as independent probabilistic validations for different scenarios. These methods allow us to define the spatial pattern of wildfire occurrences, characterize the susceptibility of the territory, namely for specific fire causes/types, and can also account for other factors such as human behavior and social aspects. We selected Portugal as the study region which, due to its favorable climatic, topographic and vegetation conditions, is by far the European country most affected by wildfires. In addition, Verde and Zêzere (2010) performed a first assessment and validation of wildfire susceptibility and hazard in Portugal which can be used as a benchmark. The objectives of the present study comprise: (1) assessing the structural forest fire risk in Portugal using updated datasets, namely with higher spatial resolution (from 80 m to 25 m), the most recent vegetation cover (Corine Land Cover), and a longer fire history (1975-2013); and (2) comparing linear vs non-linear approaches for wildfire susceptibility mapping. The data we used includes: (i) a DEM derived from the Shuttle Radar Topographic Mission at a resolution of 1 arc-second (DEM-SRTM 25 m) to assess elevation and slope; (ii) the Corine Land Cover inventory provided by the European Environment Agency (http://www.eea.europa.eu/pt) to produce the land use land cover map; (iii) the National Mapping Burnt Areas (NMBA) provided by the Institute for the

  5. A new approach to reduce the mapping error of landslide inventory maps

    NASA Astrophysics Data System (ADS)

    Santangelo, Michele; Marchesini, Ivan; Bucci, Francesco; Cardinali, Mauro; Rossi, Mauro; Taylor, Faith; Malamud, Bruce; Guzzetti, Fausto

    2013-04-01

    Landslide inventory maps are key in documenting the type and extent of mass movements in local to regional areas, for both geomorphological studies and landslide hazard assessment. Geomorphologists usually prepare landslide inventories by aerial photo interpretation (API) of stereoscopic images aided by field surveys. Criteria adopted for visual image analyses are derived from the heuristic interpretation of photographic and morphological features of the image, such as shape, size, color tone, texture and pattern. The established (traditional) procedure for transferring photo-interpreted information to a GIS environment involves the manual drawing of information from the aerial photograph to the topographic base map. In this stage, mapping (i.e., positioning, shape, size) errors can occur due to (i) the change in scale, from the aerial photographs to the topographic map, (ii) object deformation in the stereoscopic model, due to the vertical exaggeration and the conical projection of the aerial photographs, (iii) differences in topography in the different cartographic media (aerial photographs and base maps). We recently developed a method to reduce mapping errors which exploits the ortho-rectification of the aerial photograph and the photo-interpreted thematic layers, thus avoiding manual transfer of information to the topographic map. The technique was evaluated in a test area of about 50 km² in the neighbourhood of Taormina (Sicily, Southern Italy), where the information concerning mass movement was transferred to two inventory maps using the traditional and the ortho-rectification technique. More than 500 landslide pairs have been compared in this test region, ranging in landslide area between 10² and 10⁷ m². The mapping error associated with the mapped features has been evaluated by calculating the mismatch index for each landslide pair as: E = ((A ∪ B) - (A ∩ B)) / (A ∪ B), where A is a landslide of the inventory obtained using the manual drawing approach and B is a
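
The mismatch index (area of the symmetric difference divided by area of the union) can be sketched on rasterized landslide outlines. The grid cells below are hypothetical; real outlines would come from the GIS polygons.

```python
def mismatch_index(a, b):
    """Mismatch index E = (|A ∪ B| - |A ∩ B|) / |A ∪ B| between two mapped
    outlines represented as sets of raster cells: 0 for identical outlines,
    1 for disjoint ones."""
    union = a | b
    if not union:
        raise ValueError("both outlines are empty")
    return (len(union) - len(a & b)) / len(union)

# Two partially overlapping outlines of one landslide on a unit grid.
manual = {(x, y) for x in range(0, 10) for y in range(0, 10)}   # traditional map
ortho  = {(x, y) for x in range(5, 15) for y in range(0, 10)}   # ortho-rectified map
print(mismatch_index(manual, ortho))  # 2/3: half of each outline is unshared
```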

  6. Einstein's Gravitational Field Approach to Dark Matter and Dark Energy-Geometric Particle Decay into the Vacuum Energy Generating Higgs Boson and Heavy Quark Mass

    NASA Astrophysics Data System (ADS)

    Christensen, Walter James

    2015-08-01

    During an interview at the Niels Bohr Institute David Bohm stated, "according to Einstein, particles should eventually emerge as singularities, or very strong regions of stable pulses of (the gravitational) field" [1]. Starting from this premise, we show spacetime, indeed, manifests stable pulses (n-valued gravitons) that decay into the vacuum energy to generate all three boson masses (including Higgs), as well as heavy-quark mass; and all in precise agreement with the 2010 CODATA report on fundamental constants. Furthermore, our relativized quantum physics approach (RQP) answers the mysteries surrounding dark energy, dark matter, accelerated spacetime, and why ordinary matter dominates over antimatter.

  7. Symmetry-improved 2PI approach to the Goldstone-boson IR problem of the SM effective potential

    NASA Astrophysics Data System (ADS)

    Pilaftsis, Apostolos; Teresi, Daniele

    2016-05-01

    The effective potential of the Standard Model (SM), from three loop order and higher, suffers from infrared (IR) divergences arising from quantum effects due to massless would-be Goldstone bosons associated with the longitudinal polarizations of the W± and Z bosons. Such IR pathologies also hinder accurate evaluation of the two-loop threshold corrections to electroweak quantities, such as the vacuum expectation value of the Higgs field. However, these divergences are an artifact of perturbation theory, and therefore need to be consistently resummed in order to obtain an IR-safe effective potential. The so-called Two-Particle-Irreducible (2PI) effective action provides a rigorous framework to consistently perform such resummations, without the need to resort to ad hoc subtractions or running into the risk of over-counting contributions. By considering the recently proposed symmetry-improved 2PI formalism, we address the problem of the Goldstone-boson IR divergences of the SM effective potential in the gaugeless limit of the theory. In the same limit, we evaluate the IR-safe symmetry-improved 2PI effective potential, after taking into account quantum loops of chiral fermions, as well as the renormalization of spurious custodially breaking effects triggered by fermionic Yukawa interactions. Finally, we compare our results with those obtained with other methods presented in the literature.

  8. The Effects of Reciprocal Teaching and Direct Instruction Approaches on Knowledge Map (K-Map) Generation Skill

    ERIC Educational Resources Information Center

    Görgen, Izzet

    2014-01-01

    The primary purpose of the present study is to investigate whether the reciprocal teaching approach or the direct instruction approach is more effective in teaching the k-map generation skill. A secondary purpose of the study is to determine which of the k-map generation principles are more challenging for students to apply. The results of the study…

  10. Map of isotachs - statistical approach and meteorological information transfer

    SciTech Connect

    Menezes, A.A.; da Silva, J.I.; Coutinho, C.E.O.

    1985-09-01

    This report gives a statistical treatment of available wind data from airports in Brazil and provides a map of isotachs for extreme yearly wind velocities. A comparison between the statistical models of Frechet and Gumbel is carried out, leading to the adoption of the latter. The low density of meteorological stations used in this approach restricts the knowledge of wind activity. This fact was accounted for in the analytical method for spatial transfer of climatic data. Recommendations are given on how to enlarge the amount of available data.
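
A Gumbel fit of annual-maximum wind speeds, as adopted in the report, can be sketched with a method-of-moments estimate; the station data below are hypothetical, and the report's actual fitting procedure may differ.

```python
import math

def fit_gumbel(sample):
    """Method-of-moments Gumbel (Type I) fit: scale beta = sqrt(6)*s/pi,
    location mu = mean - gamma*beta (gamma: Euler-Mascheroni constant)."""
    n = len(sample)
    mean = sum(sample) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
    beta = math.sqrt(6) * s / math.pi
    mu = mean - 0.5772156649 * beta
    return mu, beta

def return_level(mu, beta, T):
    """Wind speed exceeded on average once every T years."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

# Hypothetical annual-maximum wind speeds (m/s) at one airport station.
annual_max = [24.1, 27.3, 22.8, 30.5, 25.9, 28.4, 23.7, 26.6, 29.1, 25.0]
mu, beta = fit_gumbel(annual_max)
print(round(return_level(mu, beta, 50), 1))  # 50-year design wind speed
```

Interpolating such per-station return levels across space is what produces a map of isotachs.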

  11. Hyperspectral image super-resolution: a hybrid color mapping approach

    NASA Astrophysics Data System (ADS)

    Zhou, Jin; Kwan, Chiman; Budavari, Bence

    2016-07-01

    NASA has been planning a hyperspectral infrared imager mission which will provide global coverage using a hyperspectral imager with 60-m resolution. In some practical applications, such as special crop monitoring or mineral mapping, 60-m resolution may still be too coarse. Many pansharpening algorithms for hyperspectral images fuse high-resolution (HR) panchromatic or multispectral images with low-resolution (LR) hyperspectral images. We propose an approach to generating HR hyperspectral images by fusing high spatial resolution color images with low spatial resolution hyperspectral images. The idea is called hybrid color mapping (HCM) and involves a mapping between a high spatial resolution color image and a low spatial resolution hyperspectral image. Several variants of the color mapping idea, including global, local, and hybrid, are proposed and investigated. It was found that the local HCM yielded the best performance. Comparison of the local HCM with more than 10 state-of-the-art algorithms using five performance metrics has been carried out using actual images from the Air Force and NASA. Although our HCM method does not require a point spread function (PSF), our results are comparable to or better than those of methods that do require a PSF. More importantly, our performance is better than most if not all methods that do not require a PSF. After applying our HCM algorithm, not only is the visual quality of the hyperspectral image significantly improved, but the target classification performance is improved as well. Another advantage of our technique is that it is very efficient and can be easily parallelized. Hence, our algorithm is very suitable for real-time applications.
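
The global color-mapping variant can be sketched as a least-squares linear map learned on the co-registered low-resolution pair and applied to the high-resolution color image. The synthetic data, band counts, and purely linear map below are illustrative assumptions; the paper's HCM also includes local and hybrid variants.

```python
import numpy as np

rng = np.random.default_rng(0)
n_lr, n_hr, n_bands = 200, 800, 30

T_true = rng.normal(size=(3, n_bands))     # unknown color-to-spectrum relation
rgb_lr = rng.uniform(size=(n_lr, 3))       # low-res color pixels
hs_lr = rgb_lr @ T_true                    # co-registered low-res hyperspectral

# Learn the mapping on the low-resolution pair (least squares).
T_hat, *_ = np.linalg.lstsq(rgb_lr, hs_lr, rcond=None)

# Apply it to high-resolution color pixels to synthesize HR hyperspectral.
rgb_hr = rng.uniform(size=(n_hr, 3))
hs_hr = rgb_hr @ T_hat
print(hs_hr.shape)  # (800, 30)
```

Because each pixel is mapped independently, the procedure parallelizes trivially, consistent with the real-time claim above.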

  12. Interacting boson model from energy density functionals: γ-softness and related topics

    SciTech Connect

    Nomura, K.

    2012-10-20

    A comprehensive way of deriving the Hamiltonian of the interacting boson model (IBM) is described. Based on the fact that the multi-nucleon induced surface deformation of a finite nucleus can be simulated by effective boson degrees of freedom, the potential energy surface calculated with the self-consistent mean-field method employing a given energy density functional (EDF) is mapped onto its IBM analog, and thereby the excitation spectra and transition rates with good symmetry quantum numbers are calculated. Recent applications of the proposed approach are reported: (i) an alternative robust interpretation of γ-soft nuclei and (ii) shape coexistence in lead isotopes.

  13. Structure Prior Effects in Bayesian Approaches of Quantitative Susceptibility Mapping

    PubMed Central

    Chen, Weiwei; Wang, Chunmei; Liu, Tian; Wang, Yi; Pan, Chu; Mu, Ketao; Zhu, Ce; Zhang, Xiang; Cheng, Jian

    2016-01-01

    Quantitative susceptibility mapping (QSM) has shown its potential for anatomical and functional MRI, as it can quantify, for in vivo tissues, magnetic biomarkers and contrast agents whose susceptibilities differ from those of the surrounding substances. For reconstructing QSM from a single orientation, various methods have been proposed to identify a unique solution for the susceptibility map. The Bayesian QSM approach is the major type, which uses various regularization terms, such as a piece-wise constant, a smooth, a sparse, or a morphological prior. Six QSM algorithms with or without a structure prior are systematically discussed to address the effects of the structure prior. The methods are evaluated using simulations, phantom experiments with known susceptibility, and human brain data. In the simulation and phantom experiments, using a structure prior increased the accuracy and image quality of QSM compared with the same regularization term without it. In vivo, the QSM methods using the structure prior produced better image quality than their counterparts without it, either by sharpening the image or by reducing streaking artifacts. The structure priors improve the performance of the various QSMs using regularized minimization, including the L1, L2, and TV norms. PMID:28097129

  14. Teaching Population Health: A Competency Map Approach to Education

    PubMed Central

    Kaprielian, Victoria S.; Silberberg, Mina; McDonald, Mary Anne; Koo, Denise; Hull, Sharon K.; Murphy, Gwen; Tran, Anh N.; Sheline, Barbara L.; Halstater, Brian; Martinez-Bianchi, Viviana; Weigle, Nancy J.; de Oliveira, Justine Strand; Sangvai, Devdutta; Copeland, Joyce; Tilson, Hugh H.; Scutchfield, F. Douglas; Michener, J. Lloyd

    2013-01-01

    A 2012 Institute of Medicine report is the latest in the growing number of calls to incorporate a population health approach in health professionals’ training. Over the last decade, Duke University, particularly its Department of Community and Family Medicine, has been heavily involved with community partners in Durham, North Carolina to improve the local community’s health. Based on these initiatives, a group of interprofessional faculty began tackling the need to fill the curriculum gap to train future health professionals in public health practice, community engagement, critical thinking, and team skills to improve population health effectively in Durham and elsewhere. The Department of Community and Family Medicine has spent years in care delivery redesign and curriculum experimentation, design, and evaluation to distinguish the skills trainees and faculty need for population health improvement and to integrate them into educational programs. These clinical and educational experiences have led to a set of competencies that form an organizational framework for curricular planning and training. This framework delineates which learning objectives are appropriate and necessary for each learning level, from novice through expert, across multiple disciplines and domains. The resulting competency map has guided Duke’s efforts to develop, implement, and assess training in population health for learners and faculty. In this article, the authors describe the competency map development process as well as examples of its application and evaluation at Duke and limitations to its use with the hope that other institutions will apply it in different settings. PMID:23524919

  15. A POD Mapping Approach to Emulate Land Surface Models

    NASA Astrophysics Data System (ADS)

    Pau, G. S. H.; Bisht, G.; Liu, Y.; Riley, W. J.; Shen, C.

    2014-12-01

    Existing land surface models (LSMs) describe physical and biological processes that occur over a wide range of spatial and temporal scales. Since simulating LSMs at a spatial scale fine enough to explicitly resolve the finest-resolution processes is computationally expensive, upscaling techniques are used in LSMs to capture the effect of subgrid heterogeneity. However, the routinely employed linear upscaling techniques that allow LSMs to be simulated at coarse spatial resolution can result in large prediction error. To efficiently predict fine-resolution solutions to LSMs, we studied the application of a reduced-order model (ROM) technique known as the "Proper Orthogonal Decomposition mapping method", which reconstructs temporally-resolved fine-resolution solutions based on coarse-resolution solutions, for two case studies. In the first case study, we applied the POD approach to surface-subsurface isothermal simulations for four study sites (10⁴ m²) in a polygonal tundra landscape near Barrow, Alaska. The results indicate that the ROM produced a significant computational speedup (>10³) with very small relative approximation error (<0.1%) for two validation years not used in training the ROM. In the second case study, we illustrate the applicability of our ROM approach to a watershed-scale (1,837 km²) model that is substantially more heterogeneous, and demonstrate a hierarchical approach to emulating models at spatial scales consistent with mechanistic physical process representation.
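
The core of any POD-based emulator is a truncated SVD basis built from training snapshots, after which a new solution is represented by a handful of modal coefficients. This is a minimal generic POD sketch on synthetic snapshots, not the specific coarse-to-fine mapping trained in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "fine-resolution" fields built from 3 hidden spatial patterns.
modes = rng.normal(size=(1000, 3))          # 3 underlying spatial patterns
train = modes @ rng.normal(size=(3, 50))    # 50 training snapshots (columns)

# POD basis: left singular vectors of the snapshot matrix.
U, s, _ = np.linalg.svd(train, full_matrices=False)
basis = U[:, :3]                            # retained POD modes

# An unseen snapshot is compressed to 3 coefficients and reconstructed.
new = modes @ rng.normal(size=3)
coeffs = basis.T @ new                      # 3 numbers instead of 1000
recon = basis @ coeffs
print(float(np.linalg.norm(new - recon) / np.linalg.norm(new)))  # ~0
```

The speedup reported above comes from predicting only the few modal coefficients (here 3) rather than the full fine-resolution state (here 1000 values).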

  16. Cortical sulcal atlas construction using a diffeomorphic mapping approach.

    PubMed

    Joshi, Shantanu H; Cabeen, Ryan P; Sun, Bo; Joshi, Anand A; Gutman, Boris; Zamanyan, Alen; Chakrapani, Shruthi; Dinov, Ivo; Woods, Roger P; Toga, Arthur W

    2010-01-01

    We present a geometric approach for constructing shape atlases of sulcal curves on the human cortex. Sulci and gyri are represented as continuous open curves in ℝ³, and their shapes are studied as elements of an infinite-dimensional sphere. This shape manifold has some nice properties: it is equipped with a Riemannian L2 metric on the tangent space and facilitates computational analyses and correspondences between sulcal shapes. Sulcal mapping is achieved by computing geodesics in the quotient space of shapes modulo rigid rotations and reparameterizations. The resulting sulcal shape atlas is shown to preserve important local geometry inherently present in the sample population. This is demonstrated in our experimental results for deep brain sulci, where we integrate the elastic shape model into a surface registration framework for a population of 69 healthy young adult subjects.

  17. Mapping the distribution of malaria: current approaches and future directions

    USGS Publications Warehouse

    Johnson, Leah R.; Lafferty, Kevin D.; McNally, Amy; Mordecai, Erin A.; Paaijmans, Krijn P.; Pawar, Samraat; Ryan, Sadie J.; Chen, Dongmei; Moulin, Bernard; Wu, Jianhong

    2015-01-01

    Mapping the distribution of malaria has received substantial attention because the disease is a major source of illness and mortality in humans, especially in developing countries. It also has a defined temporal and spatial distribution. The distribution of malaria is most influenced by its mosquito vector, which is sensitive to extrinsic environmental factors such as rainfall and temperature. Temperature also affects the development rate of the malaria parasite in the mosquito. Here, we review the range of approaches used to model the distribution of malaria, from spatially explicit to implicit, mechanistic to correlative. Although current methods have significantly improved our understanding of the factors influencing malaria transmission, significant gaps remain, particularly in incorporating nonlinear responses to temperature and temperature variability. We highlight new methods to tackle these gaps and to integrate new data with models.
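
One of the nonlinear temperature responses flagged above is often modeled with a unimodal Brière curve for mosquito and parasite traits. The parameter values below are illustrative placeholders, not fitted values from the malaria literature.

```python
def briere(T, c=3.4e-4, T0=16.0, Tm=40.0):
    """Brière thermal-response curve: zero outside the thermal limits
    (T0, Tm) and unimodal in between. c, T0, Tm are illustrative."""
    if T <= T0 or T >= Tm:
        return 0.0
    return c * T * (T - T0) * (Tm - T) ** 0.5

# Trait value rises from the lower thermal limit, peaks, then collapses at Tm.
for temp in (15, 25, 32, 39, 41):
    print(temp, round(briere(temp), 4))
```

Mechanistic malaria maps combine several such trait curves with gridded temperature data, which is why linear temperature responses can misplace transmission limits.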

  18. Omics and Exercise: Global Approaches for Mapping Exercise Biological Networks.

    PubMed

    Hoffman, Nolan J

    2017-03-27

    The application of global "-omics" technologies to exercise has introduced new opportunities to map the complexity and interconnectedness of biological networks underlying the tissue-specific responses and systemic health benefits of exercise. This review will introduce major research tracks and recent advancements in this emerging field, as well as critical gaps in understanding the orchestration of molecular exercise dynamics that will benefit from unbiased omics investigations. Furthermore, significant research hurdles that need to be overcome to effectively fill these gaps related to data collection, computation, interpretation, and integration across omics applications will be discussed. Collectively, a cross-disciplinary physiological and omics-based systems approach will lead to discovery of a wealth of novel exercise-regulated targets for future mechanistic validation. This frontier in exercise biology will aid the development of personalized therapeutic strategies to improve athletic performance and human health through precision exercise medicine.

  19. A multi-model ensemble approach to seabed mapping

    NASA Astrophysics Data System (ADS)

    Diesing, Markus; Stephens, David

    2015-06-01

    Seabed habitat mapping based on swath acoustic data and ground-truth samples is an emergent and active marine science discipline. Significant progress could be achieved by transferring techniques and approaches that have been successfully developed and employed in fields such as terrestrial land cover mapping. One such promising approach is the multiple classifier system, which aims at improving classification performance by combining the outputs of several classifiers. Here we present results of a multi-model ensemble applied to multibeam acoustic data covering more than 5,000 km² of seabed in the North Sea, with the aim of deriving accurate spatial predictions of seabed substrate. A suite of six machine learning classifiers (k-Nearest Neighbour, Support Vector Machine, Classification Tree, Random Forest, Neural Network and Naïve Bayes) was trained with ground-truth sample data classified into seabed substrate classes, and their prediction accuracy was assessed with an independent set of samples. The three and five best performing models were combined into classifier ensembles. Both ensembles led to increased prediction accuracy compared to the best performing single classifier. The improvements were, however, not statistically significant at the 5% level. Although the three-model ensemble did not perform significantly better than its individual component models, the five-model ensemble did perform significantly better than three of its five component models. A classifier ensemble might therefore be an effective strategy to improve classification performance. Another advantage is that the agreement in predicted substrate class between the individual models of the ensemble can be used as a measure of confidence. We propose a simple and spatially explicit measure of confidence that is based on model agreement and prediction accuracy.
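
The ensemble combination and agreement-based confidence described above can be sketched as a per-pixel majority vote; the classifier outputs and substrate labels below are toy assumptions standing in for the six trained models.

```python
from collections import Counter

def ensemble_predict(predictions):
    """Majority vote across classifiers plus agreement as confidence.
    `predictions` is a list of per-classifier label lists, one label per
    pixel in the same order."""
    results = []
    for labels in zip(*predictions):
        winner, votes = Counter(labels).most_common(1)[0]
        results.append((winner, votes / len(labels)))
    return results

# Five hypothetical substrate classifiers voting on three seabed pixels.
knn  = ["sand", "mud",  "rock"]
svm  = ["sand", "sand", "rock"]
tree = ["sand", "mud",  "rock"]
rf   = ["mud",  "mud",  "rock"]
nb   = ["sand", "mud",  "gravel"]
print(ensemble_predict([knn, svm, tree, rf, nb]))
```

The second element of each tuple (vote fraction) is exactly the kind of spatially explicit model-agreement map the authors propose as a confidence layer.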

  20. Decoherence of spin-deformed bosonic model

    SciTech Connect

    Dehdashti, Sh.; Mahdifar, A.; Bagheri Harouni, M.; Roknizadeh, R.

    2013-07-15

    The decoherence rate and some parameters affecting it are investigated for the generalized spin-boson model. We consider the spin-boson model in which the bosonic environment is modeled by deformed harmonic oscillators. We show that the state of the environment approaches a non-linear coherent state. Then, we obtain the decoherence rate of a two-level system which is in contact with a deformed bosonic environment which is either in thermal equilibrium or in the ground state. By using some recent realizations of f-deformed oscillators, we show that some physical parameters strongly affect the decoherence rate of a two-level system. -- Highlights: •Decoherence of the generalized spin-boson model is considered. •In this model the environment consists of f-oscillators. •Via the interaction, the state of the environment approaches non-linear coherent states. •Parameters affecting the decoherence rate are identified.

  1. Dynamics of quantum dissipation systems interacting with fermion and boson grand canonical bath ensembles: hierarchical equations of motion approach.

    PubMed

    Jin, Jinshuang; Welack, Sven; Luo, JunYan; Li, Xin-Qi; Cui, Ping; Xu, Rui-Xue; Yan, YiJing

    2007-04-07

    A hierarchical equations of motion formalism for a quantum dissipation system in a grand canonical bath ensemble surrounding is constructed on the basis of the calculus-on-path-integral algorithm, together with a parametrization of an arbitrary non-Markovian bath that satisfies the fluctuation-dissipation theorem. The influence functionals for both fermion and boson bath interactions are found to have the same path-integral expression as for the canonical bath, assuming they all satisfy Gaussian statistics. However, the equation of motion formalism differs because the distinct fluctuation-dissipation theorems are used explicitly. The implications of the present work for quantum transport through molecular wires and electron transfer in complex molecular systems are discussed.

  2. Pure P2P mediation system: A mappings discovery approach

    NASA Astrophysics Data System (ADS)

    El Yahyaoui El Idrissi, Selma; Zellou, Ahmed; Idri, Ali

    2015-02-01

    Information integration systems offer a uniform interface that provides access to a set of autonomous and distributed information sources. Their most important advantage is that they allow users to specify what they want, rather than how to obtain the answers. Work in this area has led to two major classes of integration systems: mediation systems based on the mediator/adapter paradigm, and peer-to-peer (P2P) systems. Combining the two has produced a third type: P2P mediation systems. P2P systems are large-scale, self-organized and distributed, and manage resources in a completely decentralized way. However, integrating structured, heterogeneous and distributed information sources proves to be a complex problem. The objective of this work is to propose an approach for resolving conflicts and establishing mappings between heterogeneous elements. The approach is based on clustering, which groups similar peers that share common information into the same subnetwork. To handle heterogeneity, we introduce three additional layers into our peer hierarchy: internal schema, external schema and schema directory peer. We use linguistic techniques, specifically the name correspondence technique, which proposes a correspondence based on the similarity of element names.
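    A name-correspondence step like the one the abstract describes can be sketched with a plain string-similarity measure. The element names, threshold and helper functions below are illustrative assumptions, not the paper's implementation:

```python
from difflib import SequenceMatcher

def name_similarity(a, b):
    """Normalized similarity between two schema-element names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def propose_mappings(schema_a, schema_b, threshold=0.8):
    """Pair elements of two peer schemas whose names are similar enough."""
    return [(a, b) for a in schema_a for b in schema_b
            if name_similarity(a, b) >= threshold]

# Hypothetical element names from two heterogeneous peers.
peer1 = ["client_name", "order_date", "price"]
peer2 = ["ClientName", "OrderDate", "quantity"]
print(propose_mappings(peer1, peer2))
```

    Case folding makes `client_name` and `ClientName` match despite their different conventions; a real system would combine this with structural and linguistic evidence.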

  3. Connectomics: comprehensive approaches for whole-brain mapping.

    PubMed

    Shibata, Shinsuke; Komaki, Yuji; Seki, Fumiko; Inouye, Michiko O; Nagai, Toshihiro; Okano, Hideyuki

    2015-02-01

    The aim of connectomics analysis is to understand whole-brain neural connections, which is accomplished using new biotechnologies. Here, we provide an overview of recent progress in connectomics analysis. The entire neural network of an organism was revealed for the first time in the nematode Caenorhabditis elegans (C. elegans), which has the advantage of a limited number of neurons and a transparent body, allowing the neural network to be visualized using light and electron microscopes (EMs). It is practically impossible to adopt the same approach for mammals because of the large number of neural cells and the opacity of the central nervous system. A variety of new technologies are therefore being developed for computer-assisted, high-throughput image acquisition and analysis to obtain whole-brain maps for higher species, including mammals. Diffusion tensor magnetic resonance imaging with tractography and three-dimensional EM imaging are examples of novel approaches to connectomics. These new technologies will soon be applied not only to Drosophila, C. elegans and rodent research, but also to comprehensive connectomics analysis in a wide range of species, including humans and other primates. In the near future, results from connectomics analysis will reveal the neural circuitry of the whole brain and enhance our understanding of the human mind and neuropsychiatric diseases.

  4. A Probabilistic Approach for Improved Sequence Mapping in Metatranscriptomic Studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Mapping millions of short DNA sequences to a reference genome is a necessary step in many experiments designed to investigate the expression of genes involved in disease resistance. This is a difficult task in which several challenges often arise, resulting in a suboptimal mapping. This mapping process ...

  5. Comparison of Mixed-Model Approaches for Association Mapping

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Association-mapping methods promise to overcome the limitations of linkage-mapping methods. The main objectives of this study were to (i) evaluate various methods for association mapping in the autogamous species wheat using an empirical data set, (ii) determine a marker-based kinship matrix using a...

  6. High School Biology: A Group Approach to Concept Mapping.

    ERIC Educational Resources Information Center

    Brown, David S.

    2003-01-01

    Explains concept mapping as an instructional method in cooperative learning environments, and describes a study investigating the effectiveness of concept mapping on student learning during a photosynthesis and cellular respiration unit. Reports on the positive effects of concept mapping in the experimental group. (Contains 16 references.) (YDS)

  7. A geostatistical approach to mapping site response spectral amplifications

    USGS Publications Warehouse

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Tanaka, Y.; Tanaka, H.

    2010-01-01

    If quantitative estimates of the seismic properties do not exist at a location of interest, then the site response spectral amplifications must be estimated from data collected at other locations. Currently, the most common approach employs correlations of site class with maps of surficial geology. Analogously, correlations of site class with topographic slope can be employed where the surficial geology is unknown. Our goal is to identify and validate a method to estimate site response with greater spatial resolution and accuracy for regions where additional effort is warranted. This method consists of three components: region-specific data collection, a spatial model for interpolating seismic properties, and a theoretical method for computing spectral amplifications from the interpolated seismic properties. We consider three spatial interpolation schemes: correlations with surficial geology, termed the geologic trend (GT), ordinary kriging (OK), and kriging with a trend (KT). We estimate the spectral amplifications from seismic properties using the square root of impedance method, thereby linking the frequency-dependent spectral amplifications to the depth-dependent seismic properties. Thus, the range of periods for which this method is applicable is limited by the depth of exploration. A dense survey of near-surface S-wave slowness (Ss) throughout Kobe, Japan shows that the geostatistical methods give more accurate estimates of Ss than the topographic slope and GT methods, and the OK and KT methods perform equally well. We prefer the KT model because it can be seamlessly integrated with geologic maps that cover larger regions. Empirical spectral amplifications show that the region-specific data achieve more accurate estimates of observed median short-period amplifications than the topographic slope method. © 2010 Elsevier B.V.
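    The ordinary kriging (OK) interpolation scheme mentioned above can be sketched in a few lines: solve the kriging system built from a variogram model, then weight the observed samples. The exponential variogram, its parameters and the sample values below are made-up illustrations, not the paper's Kobe data:

```python
import numpy as np

def variogram(h, sill=1.0, corr_len=2.0):
    """Exponential variogram model (illustrative parameters)."""
    return sill * (1.0 - np.exp(-h / corr_len))

def ordinary_kriging(xy, z, x0):
    """Ordinary-kriging estimate of z at location x0 from samples (xy, z)."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[-1, -1] = 0.0                          # Lagrange-multiplier block
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)[:n]            # kriging weights (sum to 1)
    return float(w @ z)

# Toy S-wave slowness samples at three survey points (made-up values).
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
z = np.array([1.0, 2.0, 3.0])
print(ordinary_kriging(xy, z, np.array([0.2, 0.2])))
```

    Because ordinary kriging is an exact interpolator, estimating at a sample location returns that sample's value; KT would add a deterministic trend (e.g. from geology) before kriging the residuals.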

  8. Higgs boson at LHC: a diffractive opportunity

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2009-03-23

    An alternative process is presented for diffractive Higgs boson production in peripheral pp collisions, where the particles interact through the Double Pomeron Exchange. The event rate is computed as a central-rapidity distribution for Tevatron and LHC energies leading to a result around 0.6 pb, higher than the predictions from previous approaches. Therefore, this result arises as an enhanced signal for the detection of the Higgs boson in hadron colliders. The predictions for the Higgs boson photoproduction are compared to the ones obtained from a similar approach proposed by the Durham group, enabling an analysis of the future developments of its application to pp and AA collisions.

  9. The Higgs Boson.

    ERIC Educational Resources Information Center

    Veltman, Martinus J. G.

    1986-01-01

    Reports recent findings related to the Higgs boson particle and examines its possible contribution to the standard model of elementary processes. Critically explores the strengths and uncertainties of the Higgs boson and the proposed Higgs field. (ML)

  10. Unconventional quantum critical points in systems of strongly interacting bosons

    NASA Astrophysics Data System (ADS)

    Zaleski, T. A.; Kopeć, T. K.

    2014-09-01

    Using the combined Bogoliubov method and the quantum rotor approach, we map the Bose-Hubbard Hamiltonian of strongly interacting bosons onto a U(1) phase action. By unraveling the consequences of the nontrivial topology of the U(1) gauge group and the associated ground-state degeneracy, we find a close kinship between the zero-temperature divergence of the compressibility and the topological susceptibility at degeneracy points, which marks a novel quantum criticality governed by topological features rather than the Landau principle of symmetry breaking. We argue that the existence of this new type of criticality may be instrumental in explaining unconventional quantum critical points observed in superconducting cuprates.

  11. Mapping Diffusion in a Living Cell via the Phasor Approach

    PubMed Central

    Ranjit, Suman; Lanzano, Luca; Gratton, Enrico

    2014-01-01

    Diffusion of a fluorescent protein within a cell has been measured using either fluctuation-based techniques (fluorescence correlation spectroscopy (FCS) or raster-scan image correlation spectroscopy) or particle tracking. However, none of these methods measures the diffusion of the fluorescent particle at each pixel of the image. Measurement using conventional single-point FCS at every individual pixel would result in continuous long exposure of the cell to the laser and eventual bleaching of the sample. To overcome this limitation, we have developed what we believe to be a new scanning method that simultaneously constructs a fluorescent image of the cell. In this modified raster-scanning approach, the laser scans each individual line multiple times before moving to the next line as the image is acquired, until the entire area is scanned. This differs from the original raster-scan image correlation spectroscopy approach, where data are acquired by scanning each frame once and then scanning the image multiple times. The total acquisition time needed for this method is much shorter than that required for traditional FCS analysis at each pixel. However, the intensity time sequence acquired at a single pixel is short, requiring nonconventional analysis of the correlation function to extract information about the diffusion. These correlation data have been analyzed using the phasor approach, a fit-free method originally developed for the analysis of FLIM images. The analysis yields an estimate of the average diffusion coefficient of the fluorescent species at each pixel, so that a detailed diffusion map of the cell can be created. PMID:25517145

  12. Mapping diffusion in a living cell via the phasor approach.

    PubMed

    Ranjit, Suman; Lanzano, Luca; Gratton, Enrico

    2014-12-16

    Diffusion of a fluorescent protein within a cell has been measured using either fluctuation-based techniques (fluorescence correlation spectroscopy (FCS) or raster-scan image correlation spectroscopy) or particle tracking. However, none of these methods measures the diffusion of the fluorescent particle at each pixel of the image. Measurement using conventional single-point FCS at every individual pixel would result in continuous long exposure of the cell to the laser and eventual bleaching of the sample. To overcome this limitation, we have developed what we believe to be a new scanning method that simultaneously constructs a fluorescent image of the cell. In this modified raster-scanning approach, the laser scans each individual line multiple times before moving to the next line as the image is acquired, until the entire area is scanned. This differs from the original raster-scan image correlation spectroscopy approach, where data are acquired by scanning each frame once and then scanning the image multiple times. The total acquisition time needed for this method is much shorter than that required for traditional FCS analysis at each pixel. However, the intensity time sequence acquired at a single pixel is short, requiring nonconventional analysis of the correlation function to extract information about the diffusion. These correlation data have been analyzed using the phasor approach, a fit-free method originally developed for the analysis of FLIM images. The analysis yields an estimate of the average diffusion coefficient of the fluorescent species at each pixel, so that a detailed diffusion map of the cell can be created.

  13. A Hands-On Approach to Understanding Topographic Maps and Their Construction.

    ERIC Educational Resources Information Center

    Bart, Henry Anthony

    1991-01-01

    Describes a topographic map exercise designed for lab session of two to three hours in an introductory geology course. Students are taught the basic principles of topographic map construction and are then required to make a map of a section of campus. Author claims the approach has improved student test performance and resulted in a deeper…

  14. Improved Omnidirectional Odometry for a View-Based Mapping Approach

    PubMed Central

    Valiente, David; Gil, Arturo; Reinoso, Óscar; Juliá, Miguel; Holloway, Mathew

    2017-01-01

    This work presents an improved visual odometry using omnidirectional images. The main purpose is to generate a reliable prior input that enhances the SLAM (Simultaneous Localization and Mapping) estimation within the framework of mobile-robot navigation, in place of the internal odometry data. Standard SLAM approaches generally use odometry as the main prior input to localize the robot, and tend to rely on sensory data acquired with GPS receivers, lasers or digital cameras to re-estimate the solution. Nonetheless, modeling the main prior is crucial, and especially challenging when it comes to non-systematic error terms, such as those associated with the internal odometer, which ultimately prove considerably harmful and can compromise the convergence of the system. The proposed omnidirectional odometry relies on adaptive feature-point matching through the propagation of the current uncertainty of the system. Ultimately, it is fused as the main prior input in an EKF (Extended Kalman Filter) view-based SLAM system, together with an adaptation of the epipolar constraint to the omnidirectional geometry. Several improvements have been added to the initial visual odometry proposal to produce better performance. We present real-data experiments to test the validity of the proposal and to demonstrate its benefits in contrast to the internal odometry. Furthermore, SLAM results are included to assess its robustness and accuracy when using the proposed prior omnidirectional odometry. PMID:28208766

  15. Current Approaches Toward Quantitative Mapping of the Interactome

    PubMed Central

    Buntru, Alexander; Trepte, Philipp; Klockmeier, Konrad; Schnoegl, Sigrid; Wanker, Erich E.

    2016-01-01

    Protein–protein interactions (PPIs) play a key role in many, if not all, cellular processes. Disease is often caused by perturbation of PPIs, as recently indicated by studies of missense mutations. To understand the associations of proteins and to unravel the global picture of PPIs in the cell, different experimental detection techniques for PPIs have been established. Genetic and biochemical methods such as the yeast two-hybrid system or affinity purification-based approaches are well suited to high-throughput, proteome-wide screening and are mainly used to obtain qualitative results. However, they have been criticized for not reflecting the cellular situation or the dynamic nature of PPIs. In this review, we provide an overview of various genetic methods that go beyond qualitative detection and allow quantitative measuring of PPIs in mammalian cells, such as dual luminescence-based co-immunoprecipitation, Förster resonance energy transfer or luminescence-based mammalian interactome mapping with bait control. We discuss the strengths and weaknesses of different techniques and their potential applications in biomedical research. PMID:27200083

  16. A Hierarchical and Distributed Approach for Mapping Large Applications to Heterogeneous Grids using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Sanyal, Soumya; Jain, Amit; Das, Sajal K.; Biswas, Rupak

    2003-01-01

    In this paper, we propose a distributed approach for mapping a single large application to a heterogeneous grid environment. To minimize the execution time of the parallel application, we distribute the mapping overhead to the available nodes of the grid. This approach not only provides a fast mapping of tasks to resources but is also scalable. We adopt a hierarchical grid model and accomplish the job of mapping tasks to this topology using a scheduler tree. Results show that our three-phase algorithm provides high quality mappings, and is fast and scalable.
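    The core objective of such a mapper, minimizing execution time over candidate task-to-node assignments, can be illustrated with a deliberately simplified, flat genetic algorithm (the paper's three-phase, scheduler-tree algorithm is distributed and hierarchical; the cost matrix and GA parameters here are hypothetical):

```python
import random

random.seed(1)

# Hypothetical task-on-node run times (rows: tasks, columns: grid nodes).
COST = [[4, 2, 8],
        [1, 5, 2],
        [3, 2, 6],
        [5, 1, 2]]
N_TASKS, N_NODES = len(COST), len(COST[0])

def makespan(mapping):
    """Completion time of the most loaded node under a task-to-node mapping."""
    load = [0] * N_NODES
    for task, node in enumerate(mapping):
        load[node] += COST[task][node]
    return max(load)

def evolve(pop_size=30, generations=60, mutation=0.2):
    """Minimal flat genetic algorithm minimizing the makespan."""
    pop = [[random.randrange(N_NODES) for _ in range(N_TASKS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)
        survivors = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_TASKS)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation:       # point mutation
                child[random.randrange(N_TASKS)] = random.randrange(N_NODES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=makespan)

best = evolve()
print(best, makespan(best))
```

    Distributing this search, as the paper does, amounts to letting each scheduler in the tree evolve mappings for its own subset of tasks and resources.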

  17. Complementarity between nonstandard Higgs boson searches and precision Higgs boson measurements in the MSSM

    DOE PAGES

    Carena, Marcela; Haber, Howard E.; Low, Ian; ...

    2015-02-03

    Precision measurements of the Higgs boson properties at the LHC provide relevant constraints on possible weak-scale extensions of the Standard Model (SM). In the context of the minimal supersymmetric Standard Model (MSSM) these constraints seem to suggest that all the additional, non-SM-like Higgs bosons should be heavy, with masses larger than about 400 GeV. This article shows that such results do not hold when the theory approaches the conditions for “alignment independent of decoupling,” where the lightest CP-even Higgs boson has SM-like tree-level couplings to fermions and gauge bosons, independently of the nonstandard Higgs boson masses. In addition, the combination of current bounds from direct Higgs boson searches at the LHC, along with the alignment conditions, have a significant impact on the allowed MSSM parameter space yielding light additional Higgs bosons. In particular, after ensuring the correct mass for the lightest CP-even Higgs boson, we find that precision measurements and direct searches are complementary and may soon be able to probe the region of non-SM-like Higgs boson with masses below the top quark pair mass threshold of 350 GeV and low to moderate values of tanβ.

  18. Complementarity between nonstandard Higgs boson searches and precision Higgs boson measurements in the MSSM

    SciTech Connect

    Carena, Marcela; Haber, Howard E.; Low, Ian; Shah, Nausheen R.; Wagner, Carlos E. M.

    2015-02-03

    Precision measurements of the Higgs boson properties at the LHC provide relevant constraints on possible weak-scale extensions of the Standard Model (SM). In the context of the minimal supersymmetric Standard Model (MSSM) these constraints seem to suggest that all the additional, non-SM-like Higgs bosons should be heavy, with masses larger than about 400 GeV. This article shows that such results do not hold when the theory approaches the conditions for “alignment independent of decoupling,” where the lightest CP-even Higgs boson has SM-like tree-level couplings to fermions and gauge bosons, independently of the nonstandard Higgs boson masses. In addition, the combination of current bounds from direct Higgs boson searches at the LHC, along with the alignment conditions, have a significant impact on the allowed MSSM parameter space yielding light additional Higgs bosons. In particular, after ensuring the correct mass for the lightest CP-even Higgs boson, we find that precision measurements and direct searches are complementary and may soon be able to probe the region of non-SM-like Higgs boson with masses below the top quark pair mass threshold of 350 GeV and low to moderate values of tanβ.

  19. Mapping dominant runoff processes: an evaluation of different approaches using similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, Manuel; Buss, Rahel; Scherrer, Simon; Margreth, Michael; Zappa, Massimiliano

    2016-07-01

    The identification of landscapes with similar hydrological behaviour is useful for runoff and flood predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP-mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexities of automatic DRP-mapping approaches affect hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison, and a deviation map were derived. The automatically derived DRP maps were used in synthetic runoff simulations with an adapted version of the PREVAH hydrological model, and simulation results compared with those from simulations using the reference maps. The DRP maps derived with the automatic approach with highest complexity and data requirement were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking. 
Furthermore, we argue not to use ...

  20. Mapping dominant runoff processes: an evaluation of different approaches using similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, M.; Buss, R.; Scherrer, S.; Margreth, M.; Zappa, M.

    2015-12-01

    The identification of landscapes with similar hydrological behaviour is useful for runoff predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexity of automatic DRP mapping approaches affects hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison and a deviation map were derived. The automatically derived DRP-maps were used in synthetic runoff simulations with an adapted version of the hydrological model PREVAH, and simulation results compared with those from simulations using the reference maps. The DRP-maps derived with the automatic approach with highest complexity and data requirement were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP-maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking. 
We therefore recommend not only using expert ...

  1. Quantitative Architectural Analysis: A New Approach to Cortical Mapping

    ERIC Educational Resources Information Center

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-01-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological…

  2. Exploiting Surroundedness for Saliency Detection: A Boolean Map Approach.

    PubMed

    Zhang, Jianming; Sclaroff, Stan

    2016-05-01

    We demonstrate the usefulness of surroundedness for eye fixation prediction by proposing a Boolean Map based Saliency model (BMS). In our formulation, an image is characterized by a set of binary images, which are generated by randomly thresholding the image's feature maps in a whitened feature space. Based on a Gestalt principle of figure-ground segregation, BMS computes a saliency map by discovering surrounded regions via topological analysis of Boolean maps. Furthermore, we draw a connection between BMS and the Minimum Barrier Distance to provide insight into why and how BMS can properly capture the surroundedness cue via Boolean maps. The strength of BMS is verified by its simplicity, efficiency and superior performance compared with 10 state-of-the-art methods on seven eye tracking benchmark datasets.
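    The surroundedness computation at the heart of BMS can be sketched as follows: threshold a feature map at random levels, then mark regions that cannot be reached by flood-filling from the image border. This toy single-channel version (with an assumed threshold count and synthetic image) only illustrates the cue, not the full whitened multi-feature model:

```python
from collections import deque
import numpy as np

def enclosed(bmap):
    """Pixels outside bmap that a border flood fill cannot reach,
    i.e. regions fully surrounded by the Boolean map."""
    h, w = bmap.shape
    reach = np.zeros((h, w), dtype=bool)
    q = deque()
    for r in range(h):
        for c in range(w):
            if (r in (0, h - 1) or c in (0, w - 1)) and not bmap[r, c]:
                reach[r, c] = True
                q.append((r, c))
    while q:                                   # 4-connected flood fill
        r, c = q.popleft()
        for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
            if 0 <= nr < h and 0 <= nc < w and not bmap[nr, nc] and not reach[nr, nc]:
                reach[nr, nc] = True
                q.append((nr, nc))
    return ~bmap & ~reach

def boolean_map_saliency(feature_map, n_thresholds=8, seed=0):
    """Average the surrounded regions of randomly thresholded Boolean maps."""
    rng = np.random.default_rng(seed)
    attention = np.zeros(feature_map.shape)
    lo, hi = feature_map.min(), feature_map.max()
    for t in rng.uniform(lo, hi, n_thresholds):
        bmap = feature_map > t
        attention += enclosed(bmap)            # holes of the map ...
        attention += enclosed(~bmap)           # ... and of its complement
    return attention / (2 * n_thresholds)

# A bright ring enclosing a darker centre: the centre is surrounded.
img = np.zeros((9, 9))
img[2:7, 2:7] = 1.0
img[3:6, 3:6] = 0.2
sal = boolean_map_saliency(img)
```

    The enclosed centre accumulates attention at every threshold, while border-connected background never does, which is exactly the figure-ground intuition the abstract describes.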

  3. A high-density, multi-parental SNP genetic map on apple validates a new mapping approach for outcrossing species

    PubMed Central

    Di Pierro, Erica A; Gianfranceschi, Luca; Di Guardo, Mario; Koehorst-van Putten, Herma JJ; Kruisselbrink, Johannes W; Longhi, Sara; Troggio, Michela; Bianco, Luca; Muranty, Hélène; Pagliarani, Giulia; Tartarini, Stefano; Letschka, Thomas; Lozano Luis, Lidia; Garkava-Gustavsson, Larisa; Micheletti, Diego; Bink, Marco CAM; Voorrips, Roeland E; Aziz, Ebrahimi; Velasco, Riccardo; Laurens, François; van de Weg, W Eric

    2016-01-01

    Quantitative trait loci (QTL) mapping approaches rely on the correct ordering of molecular markers along the chromosomes, which can be obtained from genetic linkage maps or a reference genome sequence. For apple (Malus domestica Borkh), the genome sequences v1 and v2 could not meet this need; therefore, a novel approach was devised to develop a dense genetic linkage map, providing the most reliable marker-loci order for the highest possible number of markers. The approach was based on four strategies: (i) the use of multiple full-sib families, (ii) the reduction of missing information through the use of HaploBlocks and alternative calling procedures for single-nucleotide polymorphism (SNP) markers, (iii) the construction of a single backcross-type data set including all families, and (iv) a two-step map generation procedure based on the sequential inclusion of markers. The map comprises 15 417 SNP markers, clustered in 3 K HaploBlock markers spanning 1 267 cM, with an average distance between adjacent markers of 0.37 cM and a maximum distance of 3.29 cM. Moreover, chromosome 5 was oriented according to its homoeologous chromosome 10. This map was useful to improve the apple genome sequence, design the Axiom Apple 480 K SNP array and perform multifamily-based QTL studies. Its collinearity with the genome sequences v1 and v3 is reported. To our knowledge, this is the shortest published SNP map in apple, while including the largest number of markers, families and individuals. This result validates our methodology, proving its value for the construction of integrated linkage maps for any outbreeding species. PMID:27917289

  4. Single point vs. mapping approach for spectral cytopathology (SCP).

    PubMed

    Schubert, Jennifer M; Mazur, Antonella I; Bird, Benjamin; Miljković, Milos; Diem, Max

    2010-08-01

    In this paper we describe the advantages of collecting infrared microspectral data in imaging mode as opposed to point mode. Imaging data are processed using the PapMap algorithm, which co-adds pixel spectra that have been scrutinized for R-Mie scattering effects as well as other constraints. The signal-to-noise quality of PapMap spectra is compared to that of point spectra for oral mucosa cells deposited onto low-e slides. The effects of software atmospheric correction are also discussed. Combined with the PapMap algorithm, data collection in imaging mode proves to be a superior method for spectral cytopathology.
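    The signal-to-noise benefit of co-adding pixel spectra can be demonstrated on synthetic data: averaging N independent noisy spectra of the same cell improves SNR by roughly the square root of N. The band shape, noise level and pixel count below are illustrative assumptions, not PapMap's actual screening pipeline:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic absorbance spectrum with one Gaussian band near 1650 cm-1
# (an illustrative stand-in for a real oral-mucosa IR microspectrum).
wavenumber = np.linspace(1000, 1800, 400)
clean = np.exp(-((wavenumber - 1650) ** 2) / (2 * 30.0 ** 2))

def snr(spectrum):
    """Peak signal over the noise level of a band-free region."""
    return spectrum.max() / spectrum[:100].std()

# 64 noisy pixel spectra from one cell, then PapMap-style co-addition.
pixels = clean + rng.normal(0.0, 0.05, size=(64, clean.size))
coadded = pixels.mean(axis=0)
print(snr(pixels[0]), snr(coadded))
```

    With 64 pixels the expected gain is about eightfold; PapMap additionally rejects pixels contaminated by R-Mie scattering before co-adding, which this sketch omits.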

  5. NEW APPROACHES: Using concept maps with trainee physics teachers

    NASA Astrophysics Data System (ADS)

    Adamczyk, Peter; Willson, Mike

    1996-11-01

    The technique of Concept Mapping described here is useful for identifying gaps in trainee teachers' knowledge, which may then be addressed to help those who must nowadays teach Science outside their own specialism.

  6. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    PubMed

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

    Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built, ultimately resulting in common coastal management strategies. Quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping are presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round-table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries and agriculture), in which the participants drew cognitive maps depicting their views. The analysis of the cognitive maps revealed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and for understanding complex relationships.

  7. Transboundary aquifer mapping and management in Africa: a harmonised approach

    NASA Astrophysics Data System (ADS)

    Altchenko, Yvan; Villholth, Karen G.

    2013-11-01

    Recent attention to transboundary aquifers (TBAs) in Africa reflects the growing importance of these resources for development in the continent. However, relatively little research on these aquifers and their best management strategies has been published. This report recapitulates progress on mapping and management frameworks for TBAs in Africa. The world map on transboundary aquifers presented at the 6th World Water Forum in 2012 identified 71 TBA systems in Africa. This report presents an updated African TBA map including 80 shared aquifers and aquifer systems superimposed on 63 international river basins. Furthermore, it proposes a new nomenclature for the mapping based on three sub-regions, reflecting the leading regional development communities. The map shows that TBAs represent approximately 42 % of the continental area and 30 % of the population. Finally, a brief review of current international law, specific bi- or multilateral treaties, and TBA management practice in Africa reveals few documented international conflicts over TBAs. The existing or upcoming international river and lake basin organisations offer a harmonised institutional base for TBA management, while alternative or supportive models involving the regional development communities are also required. The proposed map and geographical classification scheme for TBAs facilitates identification of options for joint institutional setups.

  8. Higgs Boson 2016

    SciTech Connect

    Lincoln, Don

    2016-11-16

    The Higgs boson burst into the public arena on July 4, 2012, when scientists working at the CERN laboratory announced the particle’s discovery. However, the initial discovery was a bit tentative, with the need to verify that the discovered particle was, indeed, the Higgs boson. In this video, Fermilab’s Dr. Don Lincoln looks at the data from the perspective of 2016 and shows that more recent analyses further support the idea that the Higgs boson is what was discovered.

  9. Eulerian Mapping Closure Approach for Probability Density Function of Concentration in Shear Flows

    NASA Technical Reports Server (NTRS)

    He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Eulerian mapping closure approach is developed for uncertainty propagation in computational fluid mechanics. The approach is used to study the Probability Density Function (PDF) for the concentration of species advected by a random shear flow. An analytical argument shows that fluctuations of the concentration field at one point in space are non-Gaussian and exhibit a stretched-exponential form. An Eulerian mapping approach provides an appropriate approximation to both the convection and diffusion terms and leads to a closed mapping equation. The results describe the evolution of the initially Gaussian field, in agreement with direct numerical simulations.
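The mapping idea can be illustrated numerically. The sketch below is a generic change-of-variables demonstration, not the paper's closure equations: a non-Gaussian scalar c is represented as a nonlinear map c = X(θ) of a Gaussian reference field θ, so its PDF follows as p_c(c) = p_θ(X⁻¹(c)) / |X′(X⁻¹(c))|; `tanh` is an arbitrary choice of mapping here.

```python
import numpy as np

# Generic illustration of the mapping idea (not the paper's closure):
# represent a non-Gaussian scalar c as a nonlinear map c = X(theta) of a
# Gaussian reference field theta, and obtain its PDF by change of variables.
rng = np.random.default_rng(0)
theta = rng.standard_normal(200_000)         # Gaussian reference samples

c = np.tanh(theta)                           # arbitrary bounded mapping X

# Predicted PDF at a few points, using X'(t) = 1 - tanh(t)^2 = 1 - c^2:
pts = np.array([0.0, 0.5, 0.8])
inv = np.arctanh(pts)
predicted = np.exp(-inv**2 / 2) / np.sqrt(2 * np.pi) / (1 - pts**2)

# Empirical PDF of the mapped samples:
hist, edges = np.histogram(c, bins=200, range=(-1, 1), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
empirical = np.interp(pts, centers, hist)
print(np.round(predicted, 3), np.round(empirical, 3))
```

The agreement between the predicted and sampled densities is the elementary fact the closure builds on: evolving the map X is enough to evolve the full one-point PDF.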

  10. Partnering with Youth to Map Their Neighborhood Environments: A Multi-Layered GIS Approach

    PubMed Central

    Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T.; Raleigh, Kevin; Miller-Francis, Jenni

    2014-01-01

    Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multi-layered approach for gaining local knowledge of neighborhood environments that engages youth as co-researchers and active knowledge producers. By integrating geographic information systems (GIS) with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youth. Youth report safety and a lack of recreational resources as inhibiting physical activity. Maps reflecting youth perceptions aid policy-makers in making place-based improvements for youth neighborhood environments. PMID:25423245

  11. Raman mapping of oral buccal mucosa: a spectral histopathology approach

    NASA Astrophysics Data System (ADS)

    Behl, Isha; Kukreja, Lekha; Deshmukh, Atul; Singh, S. P.; Mamgain, Hitesh; Hole, Arti R.; Krishna, C. Murali

    2014-12-01

    Oral cancer is one of the most common cancers worldwide. One-fifth of the world's oral cancer subjects are from India and other South Asian countries. The present Raman mapping study was carried out to understand biochemical variations in normal and malignant oral buccal mucosa. Data were acquired using a WITec alpha 300R instrument from 10 normal and 10 tumor unstained tissue sections. Raman maps of normal sections could resolve the layers of epithelium, i.e. basal, intermediate, and superficial. Inflammatory, tumor, and stromal regions are distinctly depicted on Raman maps of tumor sections. Mean and difference spectra of basal and inflammatory cells suggest an abundance of DNA and carotenoid features. Strong cytochrome bands are observed in the intermediate layers of normal sections and the stromal regions of tumor sections. Epithelial and stromal regions of normal sections are classified by principal component analysis. Classification among cellular components of normal and tumor sections is also observed. Thus, the findings of the study further support the applicability of Raman mapping for providing molecular-level insights into normal and malignant conditions.

  12. The Facebook influence model: a concept mapping approach.

    PubMed

    Moreno, Megan A; Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M

    2013-07-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts.
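The quantitative core of the sort-and-rank and analysis steps can be sketched as follows. This is a generic toy illustration of the concept-mapping pipeline (invented sorting data, not the study's 169 statements): each participant's pile sorts are aggregated into a statement-by-statement co-occurrence matrix, which hierarchical clustering turns into the map's clusters.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy concept-mapping analysis: three participants each sort six
# statements (s0..s5) into piles; pile co-membership becomes similarity.
sorts = np.array([
    [0, 0, 1, 1, 2, 2],   # participant 1: pile index per statement
    [0, 0, 0, 1, 2, 2],   # participant 2
    [0, 0, 1, 1, 1, 2],   # participant 3
])
n = sorts.shape[1]
co = np.zeros((n, n))
for s in sorts:
    co += (s[:, None] == s[None, :]).astype(float)   # 1 if sorted together
co /= len(sorts)                                     # similarity in [0, 1]

dist = 1.0 - co[np.triu_indices(n, k=1)]             # condensed distances
clusters = fcluster(linkage(dist, method="average"), t=3, criterion="maxclust")
print(clusters)
```

In the actual methodology the clustered similarities are first embedded with multidimensional scaling to place statements on the two-dimensional map; the clustering step shown here is what yields groupings like the study's 13 clusters.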

  13. The Facebook Influence Model: A Concept Mapping Approach

    PubMed Central

    Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M.

    2013-01-01

    Abstract Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts. PMID:23621717

  14. Cognitions of Expert Supervisors in Academe: A Concept Mapping Approach

    ERIC Educational Resources Information Center

    Kemer, Gülsah; Borders, L. DiAnne; Willse, John

    2014-01-01

    Eighteen expert supervisors reported their thoughts while preparing for, conducting, and evaluating their supervision sessions. Concept mapping (Kane & Trochim, 2007) yielded 195 cognitions classified into 25 cognitive categories organized into 5 supervision areas: conceptualization of supervision, supervisee assessment,…

  15. Approaches to Mapping Nitrogen Removal: Examples at a Landscape Scale

    EPA Science Inventory

    Wetlands can provide the ecosystem service of improved water quality via nitrogen removal, providing clean drinking water and reducing the eutrophication of aquatic resources. Within the ESRP, mapping nitrogen removal by wetlands is a service that incorporates the goals of the ni...

  16. A New Approach for Constructing the Concept Map

    ERIC Educational Resources Information Center

    Tseng, Shian-Shyong; Sue, Pei-Chi; Su, Jun-Ming; Weng, Jui-Feng; Tsai, Wen-Nung

    2007-01-01

    In recent years, e-learning system has become more and more popular and many adaptive learning environments have been proposed to offer learners customized courses in accordance with their aptitudes and learning results. For achieving the adaptive learning, a predefined concept map of a course is often used to provide adaptive learning guidance…

  17. Mapping Sustainability Initiatives across a Region: An Innovative Survey Approach

    ERIC Educational Resources Information Center

    Somerville, Margaret; Green, Monica

    2012-01-01

    The project of mapping sustainability initiatives across a region is part of a larger program of research about place and sustainability education for the Anthropocene, the new geological age of human-induced planetary changes (Zalasiewicz, Williams, Steffen, & Crutzen, 2010). The study investigated the location, nature and type of…

  18. A Time Sequence-Oriented Concept Map Approach to Developing Educational Computer Games for History Courses

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Yang, Kai-Hsiang; Chen, Jing-Hong

    2015-01-01

    Concept maps have been recognized as an effective tool for students to organize their knowledge; however, in history courses, it is important for students to learn and organize historical events according to the time of their occurrence. Therefore, in this study, a time sequence-oriented concept map approach is proposed for developing a game-based…

  19. Concept Map Engineering: Methods and Tools Based on the Semantic Relation Approach

    ERIC Educational Resources Information Center

    Kim, Minkyu

    2013-01-01

    The purpose of this study is to develop a better understanding of technologies that use natural language as the basis for concept map construction. In particular, this study focuses on the semantic relation (SR) approach to drawing rich and authentic concept maps that reflect students' internal representations of a problem situation. The…

  20. Comparison of Sub-pixel Classification Approaches for Crop-specific Mapping

    EPA Science Inventory

    The Moderate Resolution Imaging Spectroradiometer (MODIS) data has been increasingly used for crop mapping and other agricultural applications. Phenology-based classification approaches using the NDVI (Normalized Difference Vegetation Index) 16-day composite (250 m) data product...

  1. Atom-atom correlations in time-of-flight imaging of ultracold bosons in optical lattices

    SciTech Connect

    Zaleski, T. A.; Kopec, T. K.

    2011-11-15

    We study the spatial correlations of strongly interacting bosons in the ground state, confined in a two-dimensional square and a three-dimensional cubic lattice. Using the combined Bogoliubov method and the quantum rotor approach, we map the Hamiltonian of strongly interacting bosons onto a U(1) phase action in order to calculate the decay of atom-atom correlations along the principal axis and a diagonal of the lattice-plane direction as a function of distance. Lower tunneling rates lead to quicker decay of the correlations, whose character becomes exponential. Finally, the correlation functions allow us to calculate quantities that are directly bound to experimental outcomes, namely time-of-flight absorption images and the resulting visibility. Our results contain all the characteristic features present in experimental data (transition from a Mott insulating blob to superfluid peaks, etc.), emphasizing the usability of the proposed approach.

  2. Two-dimensional thermofield bosonization

    SciTech Connect

    Amaral, R.L.P.G.

    2005-12-15

    The main objective of this paper was to obtain an operator realization for the bosonization of fermions in 1 + 1 dimensions at finite, non-zero temperature T. This is achieved in the framework of the real-time formalism of Thermofield Dynamics. Formally, the results parallel those of the T = 0 case. The well-known two-dimensional fermion-boson correspondences at zero temperature are shown to hold also at finite temperature. To emphasize the usefulness of the operator realization for handling a large class of two-dimensional quantum field-theoretic problems, we contrast this global approach with the cumbersome calculation of the fermion-current two-point function in the imaginary-time and real-time formalisms. The calculations also illustrate the very different ways in which the transmutation from Fermi-Dirac to Bose-Einstein statistics is realized.

  3. An integrated approach for automated cover-type mapping of large inaccessible areas in Alaska

    USGS Publications Warehouse

    Fleming, Michael D.

    1988-01-01

    The lack of any detailed cover type maps in the state necessitated that a rapid and accurate approach be employed to develop maps for 329 million acres of Alaska within a seven-year period. This goal has been addressed by using an integrated approach to computer-aided analysis which combines efficient use of field data with the only consistent statewide spatial data sets available: Landsat multispectral scanner data, digital elevation data derived from 1:250 000-scale maps, and 1:60 000-scale color-infrared aerial photographs.

  4. A genetic mosaic approach for neural circuit mapping in Drosophila

    PubMed Central

    Bohm, Rudolf A.; Welch, William P.; Goodnight, Lindsey K.; Cox, Logan W.; Henry, Leah G.; Gunter, Tyler C.; Bao, Hong; Zhang, Bing

    2010-01-01

    Transgenic manipulation of subsets of brain cells is increasingly used for studying behaviors and their underlying neural circuits. In Drosophila, the GAL4–upstream activating sequence (UAS) binary system is powerful for gene manipulation, but GAL4 expression is often too broad for fine mapping of neural circuits. Here, we describe the development of unique molecular genetic tools to restrict GAL4 expression patterns. Building on the GAL4-UAS system, our method adds two components: a collection of enhancer-trap recombinase, Flippase (ET-FLP), transgenic lines that provide inheritable, reproducible, and tissue-specific FLP and an FRT-dependent GAL80 “flip-in” construct that converts FLP expression into tissue-specific repression of GAL4 by GAL80. By including a UAS-encoded fluorescent protein, circuit morphology can be simultaneously marked while the circuit function is assessed using another UAS transgene. In a proof-of-principle analysis, we applied this ET-FLP-induced intersectional GAL80/GAL4 repression (FINGR) method to map the neural circuitry underlying fly wing inflation. The FINGR system is versatile and powerful in combination with the vast collection of GAL4 lines for neural circuit mapping as well as for clonal analysis based on the infusion of the yeast-derived FRT/FLP system of mitotic recombination into Drosophila. The strategies and tactics underlying our FINGR system are also applicable to other genetically amenable organisms in which transgenes including the GAL4, UAS, GAL80, and FLP factors can be applied. PMID:20810922

  5. Supersymmetric Higgs Bosons in Weak Boson Fusion

    SciTech Connect

    Hollik, Wolfgang; Plehn, Tilman; Rauch, Michael; Rzehak, Heidi

    2009-03-06

    We compute the complete supersymmetric next-to-leading-order corrections to the production of a light Higgs boson in weak-boson fusion. The size of the electroweak corrections is of similar order as the next-to-leading-order corrections in the standard model. The supersymmetric QCD corrections turn out to be significantly smaller than expected and than their electroweak counterparts. These corrections are an important ingredient to a precision analysis of the (supersymmetric) Higgs sector at the LHC, either as a known correction factor or as a contribution to the theory error.

  6. Approaches to digital snow mapping with LANDSAT-1 data

    NASA Technical Reports Server (NTRS)

    Itten, K. I.

    1975-01-01

    Applying the same LANDSAT-1 data to three substantially different image processing systems, a snow mapping task was performed. LARSYS Ver.3, STANSORT-2, and General Electric Image-100 each performed the tasks of detecting the snowline in forested mountainous terrain and determining the snow-covered area. While the control and accuracy achieved with LARSYS is remarkable, the time and effort required for the processing favor the STANSORT and Image-100 systems. The experiences and results demonstrate the need for a fast interactive system for operational snow mapping with multispectral satellite data.

  7. Stationkeeping Approach for the Microwave Anisotropy Probe (MAP)

    NASA Technical Reports Server (NTRS)

    Rohrbaugh, Dave; Schiff, Conrad

    2002-01-01

    The Microwave Anisotropy Probe was successfully launched on June 30, 2001 and placed into a Lissajous orbit about the L2 Sun-Earth-Moon libration point. However, the L2 libration point is unstable, which necessitates occasional stationkeeping maneuvers in order to maintain the spacecraft's Lissajous orbit. Analyses were performed in order to develop a feasible L2 stationkeeping strategy for the MAP mission. The resulting strategy meets the allotted fuel budget, allowing for enough fuel to handle additional fuel taxes, while meeting the attitude requirements for the maneuvers. Results from the first two stationkeeping maneuvers are included.

  8. Concept mapping: a distinctive educational approach to foster critical thinking.

    PubMed

    Taylor, Laura A; Littleton-Kearney, Marguerite

    2011-01-01

    Advanced practice nurses must be able to link interventions to address pathophysiological processes with underlying alterations in normal physiological function to promote safe, effective patient care. Development of creative methods to assist students to make their own connections among healthcare concepts is imperative to create a positive learning environment. The authors discuss the use of concept mapping in conjunction with case-study clinical rounds to maximize critical thinking and greater learning retention among advanced practice nurses in a graduate physiology/pathophysiology course.

  9. Concept mapping and network analysis: an analytic approach to measure ties among constructs.

    PubMed

    Goldman, Alyssa W; Kane, Mary

    2014-12-01

    Group concept mapping is a mixed-methods approach that helps a group visually represent its ideas on a topic of interest through a series of related maps. The maps and additional graphics are useful for planning, evaluation and theory development. Group concept maps are typically described, interpreted and utilized through points, clusters and distances, and the implications of these features in understanding how constructs relate to one another. This paper focuses on the application of network analysis to group concept mapping to quantify the strength and directionality of relationships among clusters. The authors outline the steps of this analysis, and illustrate its practical use through an organizational strategic planning example. Additional benefits of this analysis to evaluation projects are also discussed, supporting the overall utility of this supplemental technique to the standard concept mapping methodology.
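A minimal sketch of the supplemental analysis, under the assumption that the tie strength between two clusters is the mean statement-level similarity across their members (the similarity values and cluster labels below are invented for illustration, not the paper's data):

```python
import numpy as np

# Toy cluster-tie analysis: six statements with an aggregate similarity
# matrix (symmetric, from participants' sorts) and a cluster assignment.
sim = np.array([
    [1.0, 0.9, 0.4, 0.3, 0.1, 0.1],
    [0.9, 1.0, 0.5, 0.3, 0.1, 0.1],
    [0.4, 0.5, 1.0, 0.8, 0.2, 0.2],
    [0.3, 0.3, 0.8, 1.0, 0.2, 0.2],
    [0.1, 0.1, 0.2, 0.2, 1.0, 0.7],
    [0.1, 0.1, 0.2, 0.2, 0.7, 1.0],
])
labels = np.array([0, 0, 1, 1, 2, 2])         # cluster of each statement
k = labels.max() + 1

ties = np.zeros((k, k))                        # cluster-to-cluster tie strength
for a in range(k):
    for b in range(k):
        ties[a, b] = sim[np.ix_(labels == a, labels == b)].mean()

mask = ~np.eye(k, dtype=bool)                  # ignore within-cluster cells
strongest = np.unravel_index(np.where(mask, ties, -np.inf).argmax(), ties.shape)
print(ties.round(3))
print("strongest tie between clusters:", strongest)
```

Treating `ties` as a weighted network adjacency matrix then opens the door to standard network measures (degree, centrality) on the cluster level, which is the kind of supplement the paper proposes.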

  10. A FISH approach for mapping the human genome using Bacterial Artificial Chromosomes (BACs)

    SciTech Connect

    Hubert, R.S.; Chen, X.N.; Mitchell, S.

    1994-09-01

    As the Human Genome Project progresses, large insert cloning vectors such as BACs, P1, and P1 Artificial Chromosomes (PACs) will be required to complement the YAC mapping efforts. The value of the BAC vector for physical mapping lies in the stability of the inserts, the lack of chimerism, the length of inserts (up to 300 kb), the ability to obtain large amounts of pure clone DNA and the ease of BAC manipulation. These features helped us design two approaches for generating physical mapping reagents for human genetic studies. The first approach is a whole genome strategy in which randomly selected BACs are mapped, using FISH, to specific chromosomal bands. To date, 700 BACs have been mapped to single chromosome bands at a resolution of 2-5 Mb in addition to BACs mapped to 14 different centromeres. These BACs represent more than 90 Mb of the genome and include >70% of all human chromosome bands at the 350-band level. These data revealed that >97% of the BACs were non-chimeric and have a genomic distribution covering most gaps in the existing YAC map with excellent coverage of gene-rich regions. In the second approach, we used YACs to identify BACs on chromosome 21. A 1.5 Mb contig between D21S339 and D21S220 nears completion within the Down syndrome congenital heart disease (DS-CHD) region. Seventeen BACs ranging in size from 80 kb to 240 kb were ordered using 14 STSs with FISH confirmation. We have also used 40 YACs spanning 21q to identify, on average, >1 BAC/Mb to provide molecular cytogenetic reagents and anchor points for further mapping. The contig generated on chromosome 21 will be helpful in isolating the genes for DS-CHD. The physical mapping reagents generated using the whole genome approach will provide cytogenetic markers and mapped genomic fragments that will facilitate positional cloning efforts and the identification of genes within most chromosomal bands.

  11. Equivalence between spin Hamiltonians and boson sampling

    NASA Astrophysics Data System (ADS)

    Peropadre, Borja; Aspuru-Guzik, Alán; García-Ripoll, Juan José

    2017-03-01

    Aaronson and Arkhipov showed that predicting or reproducing the measurement statistics of a general linear optics circuit with a single Fock-state input is a classically hard problem. Here we show that this problem, known as boson sampling, is as hard as simulating the short time evolution of a large but simple spin model with long-range XY interactions. The conditions for this equivalence are the same for efficient boson sampling, namely, having a small number of photons (excitations) as compared to the number of modes (spins). This mapping allows efficient implementations of boson sampling in small quantum computers and simulators and sheds light on the complexity of time evolution with critical spin models.
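For context, the boson-sampling statistics that the equivalence rests on are governed by matrix permanents. The sketch below is standard textbook material, not the paper's spin-model construction: for output pattern S, P(S) = |Perm(U_S)|² / (s₁!…s_m!), demonstrated for two photons on a 50:50 beamsplitter, where the vanishing (1,1) coincidence is the Hong-Ou-Mandel effect.

```python
import numpy as np
from math import factorial

def permanent(a):
    """Matrix permanent via Ryser's formula, O(2^n * n)."""
    n = a.shape[0]
    total = 0.0
    for subset in range(1, 1 << n):
        cols = [j for j in range(n) if subset >> j & 1]
        row_sums = a[:, cols].sum(axis=1)
        total += (-1) ** len(cols) * row_sums.prod()
    return (-1) ** n * total

# Two photons into a 50:50 beamsplitter, one per input mode.  For output
# pattern S = (s_1, ..., s_m), P(S) = |Perm(U_S)|^2 / (s_1! ... s_m!),
# where U_S repeats column j of the interferometer matrix s_j times.
U = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

def prob(U, pattern):
    cols = [j for j, s in enumerate(pattern) for _ in range(s)]
    norm = np.prod([factorial(s) for s in pattern])
    return abs(permanent(U[:, cols])) ** 2 / norm

probs = {p: prob(U, p) for p in [(2, 0), (1, 1), (0, 2)]}
print(probs)   # Hong-Ou-Mandel: the (1, 1) coincidence vanishes
```

The exponential cost of the permanent is precisely what makes sampling from these distributions classically hard, and what the spin-model mapping reproduces.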

  12. Bosonic self-energy functional theory

    NASA Astrophysics Data System (ADS)

    Hügel, Dario; Werner, Philipp; Pollet, Lode; Strand, Hugo U. R.

    2016-11-01

    We derive the self-energy functional theory for bosonic lattice systems with broken U(1) symmetry by parametrizing the bosonic Baym-Kadanoff effective action in terms of one- and two-point self-energies. The formalism goes beyond other approximate methods such as the pseudoparticle variational cluster approximation, the cluster composite boson mapping, and the Bogoliubov+U theory. It simplifies to bosonic dynamical-mean-field theory when constraining to local fields, whereas when neglecting kinetic contributions of noncondensed bosons, it reduces to the static mean-field approximation. To benchmark the theory, we study the Bose-Hubbard model on the two- and three-dimensional cubic lattice, comparing with exact results from path integral quantum Monte Carlo. We also study the frustrated square lattice with next-nearest-neighbor hopping, which is beyond the reach of Monte Carlo simulations. A reference system comprising a single bosonic state, corresponding to three variational parameters, is sufficient to quantitatively describe phase boundaries and thermodynamical observables, while qualitatively capturing the spectral functions, as well as the enhancement of kinetic fluctuations in the frustrated case. On the basis of these findings, we propose self-energy functional theory as the omnibus framework for treating bosonic lattice models, in particular, in cases where path integral quantum Monte Carlo methods suffer from severe sign problems (e.g., in the presence of nontrivial gauge fields or frustration). Self-energy functional theory enables the construction of diagrammatically sound approximations that are quantitatively precise and controlled in the number of optimization parameters but nevertheless remain computable by modest means.

  13. Bicomponent Trend Maps: A Multivariate Approach to Visualizing Geographic Time Series

    PubMed Central

    Schroeder, Jonathan P.

    2012-01-01

    The most straightforward approaches to temporal mapping cannot effectively illustrate all potentially significant aspects of spatio-temporal patterns across many regions and times. This paper introduces an alternative approach, bicomponent trend mapping, which employs a combination of principal component analysis and bivariate choropleth mapping to illustrate two distinct dimensions of long-term trend variations. The approach also employs a bicomponent trend matrix, a graphic that illustrates an array of typical trend types corresponding to different combinations of scores on two principal components. This matrix is useful not only as a legend for bicomponent trend maps but also as a general means of visualizing principal components. To demonstrate and assess the new approach, the paper focuses on the task of illustrating population trends from 1950 to 2000 in census tracts throughout major U.S. urban cores. In a single static display, bicomponent trend mapping is not able to depict as wide a variety of trend properties as some other multivariate mapping approaches, but it can make relationships among trend classes easier to interpret, and it offers some unique flexibility in classification that could be particularly useful in an interactive data exploration environment. PMID:23504193
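The two-stage computation behind bicomponent trend mapping can be sketched with synthetic data (invented trend shapes, not the paper's census tract series): PCA compresses each region's time series to scores on two components, which a bivariate (here 3x3 tercile) grid turns into map classes.

```python
import numpy as np

# Synthetic bicomponent-trend sketch: rows are regions, columns are values
# at successive decades; three invented trend shapes plus noise.
rng = np.random.default_rng(42)
decades = 6                                   # e.g. 1950-2000
trends = np.vstack([
    np.linspace(1.0, 2.0, decades),           # steady growth
    np.linspace(2.0, 1.0, decades),           # steady decline
    np.r_[np.linspace(2.0, 1.0, 3), np.linspace(1.0, 1.8, 3)],  # rebound
])
X = np.repeat(trends, 20, axis=0) + 0.05 * rng.standard_normal((60, decades))

Xc = X - X.mean(axis=0)                       # center each decade column
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                        # scores on first two components
explained = (s[:2] ** 2).sum() / (s ** 2).sum()

def terciles(v):
    return np.digitize(v, np.quantile(v, [1 / 3, 2 / 3]))   # 0, 1, or 2

classes = terciles(scores[:, 0]) * 3 + terciles(scores[:, 1])  # 9 map classes
print(f"variance captured by two components: {explained:.2f}")
```

Each of the nine class codes corresponds to one cell of the bicomponent trend matrix, so regions sharing a code get the same bivariate choropleth color.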

  14. Engineering a robotic approach to mapping exposed volcanic fissures

    NASA Astrophysics Data System (ADS)

    Parcheta, C. E.; Parness, A.; Mitchell, K. L.

    2014-12-01

    Field geology provides a framework for advanced computer models and theoretical calculations of volcanic systems. Some field terrains, though, are poorly preserved or accessible, making documentation, quantification, and investigation impossible. Over 200 volcanologists at the 2012 Kona Chapman Conference on volcanology agreed that an important step forward in the field over the next 100 years should address the realistic size and shape of volcanic conduits. The 1969 Mauna Ulu eruption of Kīlauea provides a unique opportunity to document volcanic fissure conduits; thus, we have an ideal location to begin addressing this topic and provide data on these geometries. Exposed fissures can be mapped with robotics using machine vision. In order to test the hypothesis that fissures have irregularities with depth that will influence their fluid dynamical behavior, we must first map the fissure vents and shallow conduit to deci- or centimeter scale. We have designed, constructed, and field-tested the first version of a robotic device that will image an exposed volcanic fissure in three dimensions. The design phase included three steps: 1) create the payload harness and protective shell to prevent damage to the electronics and robot, 2) construct a circuit board to have the electronics communicate with a surface-based computer, and 3) prototype wheel shapes that can handle a variety of volcanic rock textures. The robot's mechanical parts were built using 3d printing, milling, casting and laser cutting techniques, and the electronics were assembled from off the shelf components. The testing phase took place at Mauna Ulu, Kīlauea, Hawai'i, from May 5 - 9, 2014. Many valuable design lessons were learned during the week, and the first ever 3D map from inside a volcanic fissure was successfully collected. Three vents had between 25% and 95% of their internal surfaces imaged. A fourth location, a non-eruptive crack (possibly a fault line), had two transects imaging the textures

  15. A New Approach to Mapping, Visualization, and Morphological Classification of Small Bodies

    NASA Astrophysics Data System (ADS)

    Clark, Pamela E.; Clark, C. S.; Stooke, P. J.

    2008-09-01

    We present a systematic approach to interpreting asteroid shape and surface morphology using Constant Scale Natural Boundary (CSNB) map projection applied to Deimos, Phobos, Eros, and Ida. With the CSNB projection, the ridges and troughs, 'event horizons' acting as encoders of asteroid history, can be prominently featured as map edges at constant scale. By contrast, simple cylindrical and mercator maps, although familiar and instantly orientating, produce great distortions, particularly for irregular objects. CSNB projection combines the best features of 3D mosaics and conformal maps, emphasizing highly irregular faceted shape in one view, without distortion, on a flat map. CSNB maps are designed to be conformal for antipodal areas and to preserve proportions in map interiors. For consistency and orientation, we locate the blunt 'nose' in the center of all maps in the equatorial plane, because most asteroids are elongated along the equatorial axis, and the blunt nose is a recognizable feature, but less morphologically complex than the 'sharp' end. The external boundaries then become the ridges connecting 'peaks', which typically run parallel to the equator, and troughs connecting 'basins', which typically separate the promontories. Three maps, two ridge-bound and one trough-bound, exist for each object. Segmented maps show separation of the surface into geodesic 'facets', preserve resolution, and fold to a 3D facsimile of the asteroid. Connected maps are compact and preserve orientation. Morphological parameters manifested in CSNB map shape include E/W and N/S distribution of segments, roughness of boundaries associated with each segment, and aspect ratio for segmented map. Based on comparison of these parameters, Phobos has considerably greater asymmetry in E/W and N/S directions, has a higher aspect ratio, and is considerably rougher than Deimos.

  16. Mind Map Marketing: A Creative Approach in Developing Marketing Skills

    ERIC Educational Resources Information Center

    Eriksson, Lars Torsten; Hauer, Amie M.

    2004-01-01

    In this conceptual article, the authors describe an alternative course structure that joins learning key marketing concepts to creative problem solving. The authors describe an approach using a convergent-divergent-convergent (CDC) process: key concepts are first derived from case material to be organized in a marketing matrix, which is then used…

  17. Conjecture Mapping: An Approach to Systematic Educational Design Research

    ERIC Educational Resources Information Center

    Sandoval, William

    2014-01-01

    Design research is strongly associated with the learning sciences community, and in the 2 decades since its conception it has become broadly accepted. Yet within and without the learning sciences there remains confusion about how to do design research, with most scholarship on the approach describing what it is rather than how to do it. This…

  18. Mapping New Approaches in Program Evaluation: A Cross-Cultural Perspective.

    ERIC Educational Resources Information Center

    Gorostiaga, Jorge M.; Paulston, Rolland G.

    This paper examines new approaches to program evaluation and explores their possible utility in Latin American educational settings. Part 1 briefly discusses why new ideas for evaluating educational studies are needed. Part 2 examines seven new evaluative approaches as follows: (1) "Concept Mapping," a type of structural…

  19. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    ERIC Educational Resources Information Center

    Dhindsa, Harkirat S.; Makarimi-Kasim; Anderson, O. Roger

    2011-01-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample…

  20. Photoproduction of leptophobic bosons

    NASA Astrophysics Data System (ADS)

    Fanelli, Cristiano; Williams, Mike

    2017-01-01

    We propose a search for photoproduction of leptophobic bosons that couple to quarks at the GlueX experiment at Jefferson Lab. We study in detail a new gauge boson that couples to baryon number B, and estimate that γp → pB will provide the best sensitivity for B masses above 0.5 GeV. This search will also provide sensitivity to other proposed dark-sector states that couple to quarks. Finally, our results motivate a similar search for B boson electroproduction at the CLAS experiment.

  1. Higgs Boson 2016

    ScienceCinema

    Lincoln, Don

    2016-12-14

    The Higgs boson burst into the public arena on July 4, 2012, when scientists working at the CERN laboratory announced the particle’s discovery. However, the initial discovery was a bit tentative, with the need to verify that the discovered particle was, indeed, the Higgs boson. In this video, Fermilab’s Dr. Don Lincoln looks at the data from the perspective of 2016 and shows that more recent analyses further support the idea that the Higgs boson is what was discovered.

  2. Comparison of four Vulnerability Approaches to Mapping of Shallow Aquifers of Eastern Dahomey Basin of Nigeria

    NASA Astrophysics Data System (ADS)

    Oke, Saheed; Vermeulen, Danie

    2016-04-01

    This study presents the outcome of vulnerability mapping studies of the shallow aquifers of the eastern Dahomey Basin of southwestern Nigeria. The basin is a coastal transboundary aquifer extending from eastern Ghana to southwestern Nigeria. The study aimed to identify the most suitable method for mapping the basin's shallow aquifers by comparing the results of four different vulnerability approaches. This comparison matters because vulnerability methods differ in their assessment parameters, approaches and results when applied to a particular aquifer. The methodology involves vulnerability techniques that assess the intrinsic properties of the aquifer: two methods from the travel-time approach (AVI and RTt) and two from the index approach (DRASTIC and PI) were employed in mapping the basin. The results show that AVI, which has the fewest mapping parameters, classifies 75% of the basin as very high vulnerability and 25% as high vulnerability. The DRASTIC mapping shows 18% as low vulnerability, 61% as moderate vulnerability and 21% as high vulnerability. Mapping with the PI method, which has the most parameters, shows 66% of the aquifer as low vulnerability and 34% as moderate vulnerability. The RTt method shows 18% as very high vulnerability, 8% as high vulnerability, 64% as moderate vulnerability and 10% as very low vulnerability. Correlation analysis shows the highest correlation, 62%, between the RTt and DRASTIC methods. Overall, the PI method is the mildest and the AVI method the strictest of the vulnerability methods considered in this mapping. Using four different approaches to map the shallow aquifers of the eastern Dahomey Basin will guide the recommendation of the best vulnerability method for future assessments of this and other shallow aquifers.
Keywords: Aquifer vulnerability, Dahomey Basin

  3. Effective Boson Number- A New Approach for Predicting Separation Energies with the IBM1, Applied to Zr, Kr, Sr isotopes near A = 100

    NASA Astrophysics Data System (ADS)

    Paul, Nancy; van Isacker, Pieter; García Ramos, José Enrique; Aprahamian, Ani

    2011-10-01

    This work uses effective boson numbers in the Interacting Boson Model (IBM1) to predict two-neutron separation energies for neutron-rich zirconium, strontium, and krypton isotopes. We determine the functional forms of binding energy and excitation energies as a function of boson number for a choice of IBM parameters that gives a good overall description of the experimental spectra of the isotopic chain. The energy of the first excited 2+ level is then used to extract an effective boson number for a given nucleus, which is in turn used to calculate the separation energies. This method accounts for complex interactions among valence nucleons around magic and semi-magic nuclei and successfully predicts the phase-transitional signature in separation energies around A=100 for 92-108Zr, 90-104Sr, and 86-96Kr. Supported by the NSF under contract PHY0758100, the Joint Institute for Nuclear Astrophysics grant PHY0822648, University of Notre Dame Nanovic Institute, Glynn Family Honors Program, and the Center for Undergraduate Scholarly Engagement.

  4. Experimental mapping of soluble protein domains using a hierarchical approach.

    PubMed

    Pedelacq, Jean-Denis; Nguyen, Hau B; Cabantous, Stephanie; Mark, Brian L; Listwan, Pawel; Bell, Carolyn; Friedland, Natasha; Lockard, Meghan; Faille, Alexandre; Mourey, Lionel; Terwilliger, Thomas C; Waldo, Geoffrey S

    2011-10-01

    Exploring the function and 3D space of large multidomain protein targets often requires sophisticated experimentation to obtain the targets in a form suitable for structure determination. Screening methods capable of selecting well-expressed, soluble fragments from DNA libraries exist, but require the use of automation to maximize the chances of picking a few good candidates. Here, we describe the use of an insertion dihydrofolate reductase (DHFR) vector to select in-frame fragments and a split-GFP assay technology to filter out constructs that express insoluble protein fragments. With the incorporation of an IPCR step to create high-density, focused sublibraries of fragments, this cost-effective method can be performed manually with no a priori knowledge of domain boundaries while permitting single amino acid resolution boundary mapping. We used it on the well-characterized p85α subunit of the phosphoinositide-3-kinase to demonstrate the robustness and efficiency of our methodology. We then successfully tested it on the polyketide synthase PpsC from Mycobacterium tuberculosis, a potential drug target involved in the biosynthesis of complex lipids in the cell envelope. X-ray quality crystals of the acyl-transferase (AT), dehydratase (DH) and enoyl-reductase (ER) domains have been obtained.

  5. Experimental mapping of soluble protein domains using a hierarchical approach

    PubMed Central

    Pedelacq, Jean-Denis; Nguyen, Hau B.; Cabantous, Stephanie; Mark, Brian L.; Listwan, Pawel; Bell, Carolyn; Friedland, Natasha; Lockard, Meghan; Faille, Alexandre; Mourey, Lionel; Terwilliger, Thomas C.; Waldo, Geoffrey S.

    2011-01-01

    Exploring the function and 3D space of large multidomain protein targets often requires sophisticated experimentation to obtain the targets in a form suitable for structure determination. Screening methods capable of selecting well-expressed, soluble fragments from DNA libraries exist, but require the use of automation to maximize the chances of picking a few good candidates. Here, we describe the use of an insertion dihydrofolate reductase (DHFR) vector to select in-frame fragments and a split-GFP assay technology to filter out constructs that express insoluble protein fragments. With the incorporation of an IPCR step to create high-density, focused sublibraries of fragments, this cost-effective method can be performed manually with no a priori knowledge of domain boundaries while permitting single amino acid resolution boundary mapping. We used it on the well-characterized p85α subunit of the phosphoinositide-3-kinase to demonstrate the robustness and efficiency of our methodology. We then successfully tested it on the polyketide synthase PpsC from Mycobacterium tuberculosis, a potential drug target involved in the biosynthesis of complex lipids in the cell envelope. X-ray quality crystals of the acyl-transferase (AT), dehydratase (DH) and enoyl-reductase (ER) domains have been obtained. PMID:21771856

  6. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    NASA Astrophysics Data System (ADS)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

    For many years, the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made. Beyond this issue, the increasing amount of data generated every day by remote sensors raises further challenges. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
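    The map/reduce pattern such a tool builds on can be sketched in plain Python. This is only an illustration of the pattern, not the ICP code: a toy decision stump stands in for a trained WEKA model, and the `ndvi` feature name and its threshold are invented for the example.

```python
from functools import reduce

def classify(sample):
    # Hypothetical stand-in for a trained WEKA-style model shipped to each node.
    return "urban" if sample["ndvi"] < 0.3 else "vegetation"

def mapper(split):
    # Map phase: classify every sample in one data split, emit (label, 1) pairs.
    return [(classify(s), 1) for s in split]

def reducer(acc, pair):
    # Reduce phase: aggregate per-class counts across all splits.
    label, n = pair
    acc[label] = acc.get(label, 0) + n
    return acc

splits = [[{"ndvi": 0.1}, {"ndvi": 0.8}], [{"ndvi": 0.2}]]  # two HDFS-like splits
pairs = [p for split in splits for p in mapper(split)]
class_counts = reduce(reducer, pairs, {})
print(class_counts)  # {'urban': 2, 'vegetation': 1}
```

    In a real Hadoop deployment the mapper and reducer run on separate nodes and the framework handles the shuffle between them; the single-process version above only shows the data flow.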

  7. A National Approach to Quantify and Map Biodiversity ...

    EPA Pesticide Factsheets

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human welfare. The degradation of natural ecosystems and climate variation impact the environment and society by affecting ecological integrity and ecosystems’ capacity to provide critical services (i.e., the contributions of ecosystems to human well-being). These challenges will require complex management decisions that can often involve significant trade-offs between societal desires and environmental needs. Evaluating trade-offs in terms of ecosystem services and human well-being provides an intuitive and comprehensive way to assess the broad implications of our decisions and to help shape policies that enhance environmental and social sustainability. In answer to this challenge, the U.S. government has created a partnership among the U.S. Environmental Protection Agency, other Federal agencies, academic institutions, and, Non-Governmental Organizations to develop the EnviroAtlas, an online Decision Support Tool that allows users (e.g., planners, policy-makers, resource managers, NGOs, private indu

  8. Geomatics Approach for Assessment of respiratory disease Mapping

    NASA Astrophysics Data System (ADS)

    Pandey, M.; Singh, V.; Vaishya, R. C.

    2014-11-01

    Air quality is an important subject of present relevance because air is the prime resource for the sustenance of life, especially human health. Vast amounts of ambient air quality data are generated, and technological advancements let us use them to characterize the air environment and determine how good or bad the air is. This report supplies a reliable method for assessing the Air Quality Index (AQI) using fuzzy logic. The fuzzy logic model is designed to predict an AQI that reports monthly air quality. With the aid of the air quality index we can evaluate the suitability of an area's environment with respect to human health. To appraise human health status in an industrial area, information from a health survey questionnaire is used to obtain a respiratory risk map by applying IDW interpolation and Getis statistical techniques. The Getis statistic identifies spatial clustering patterns such as hot spots, high-risk areas and cold spots over the entire study area with statistical significance.
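    The IDW step used to turn scattered questionnaire points into a continuous risk surface can be sketched as follows. This is a generic inverse-distance-weighting estimator, not the specific GIS implementation used in the study.

```python
import numpy as np

def idw(points, values, query, power=2.0):
    """Inverse-distance-weighted estimate at `query` from scattered survey
    points (shape (n, 2)) with observed values (shape (n,))."""
    d = np.linalg.norm(points - query, axis=1)
    if np.any(d == 0):                     # query coincides with a survey point
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power                   # nearer points weigh more
    return float(np.sum(w * values) / np.sum(w))

# Midway between two equally distant survey points, the estimate is their mean.
pts = np.array([[0.0, 0.0], [2.0, 0.0]])
vals = np.array([0.0, 2.0])
print(idw(pts, vals, np.array([1.0, 0.0])))  # 1.0
```

    Evaluating `idw` on a regular grid of query points yields the raster that is then overlaid with the Getis hot-spot statistics.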

  9. MAPS: A Quantitative Radiomics Approach for Prostate Cancer Detection.

    PubMed

    Cameron, Andrew; Khalvati, Farzad; Haider, Masoom A; Wong, Alexander

    2016-06-01

    This paper presents a quantitative radiomics feature model for performing prostate cancer detection using multiparametric MRI (mpMRI). It incorporates a novel tumor candidate identification algorithm to efficiently and thoroughly identify the regions of concern and constructs a comprehensive radiomics feature model to detect tumorous regions. In contrast to conventional automated classification schemes, this radiomics-based feature model aims to ground its decisions in a way that can be interpreted and understood by the diagnostician. This is done by grouping features into high-level feature categories which are already used by radiologists to diagnose prostate cancer: Morphology, Asymmetry, Physiology, and Size (MAPS), using biomarkers inspired by the PI-RADS guidelines for performing structured reporting on prostate MRI. Clinical mpMRI data were collected from 13 men with histology-confirmed prostate cancer and labeled by an experienced radiologist. These annotated data were used to train classifiers using the proposed radiomics-driven feature model in order to evaluate the classification performance. The preliminary experimental results indicated that the proposed model outperformed each of its constituent feature groups as well as a comparable conventional mpMRI feature model. A further validation of the proposed algorithm will be conducted using a larger dataset as future work.

  10. Exploring teacher's perceptions of concept mapping as a teaching strategy in science: An action research approach

    NASA Astrophysics Data System (ADS)

    Marks Krpan, Catherine Anne

    In order to promote science literacy in the classroom, students need opportunities in which they can personalize their understanding of the concepts they are learning. Current literature supports the use of concept maps in enabling students to make personal connections in their learning of science. Because they involve creating explicit connections between concepts, concept maps can assist students in developing metacognitive strategies and assist educators in identifying misconceptions in students' thinking. The literature also notes that concept maps can improve student achievement and recall. Much of the current literature focuses primarily on concept mapping at the secondary and university levels, with limited focus on the elementary panel. The research rarely considers teachers' thoughts and ideas about the concept mapping process. In order to effectively explore concept mapping from the perspective of elementary teachers, I felt that an action research approach would be appropriate. Action research enabled educators to debate issues about concept mapping and test out ideas in their classrooms. It also afforded the participants opportunities to explore their own thinking, reflect on their personal journeys as educators and play an active role in their professional development. In an effort to explore concept mapping from the perspective of elementary educators, an action research group of 5 educators and myself was established and met regularly from September 1999 until June 2000. All of the educators taught in the Toronto area. These teachers were interested in exploring how concept mapping could be used as a learning tool in their science classrooms. In summary, this study explores the journey of five educators and myself as we engaged in collaborative action research. This study sets out to: (1) Explore how educators believe concept mapping can facilitate teaching and student learning in the science classroom. (2) Explore how educators implement concept

  11. Computer-based Approaches for Training Interactive Digital Map Displays

    DTIC Science & Technology

    2005-09-01

    Five computer-based training approaches for learning digital skills were examined, including exploratory learning, guided exploratory training, and guided discovery, as well as the other extreme of letting Soldiers learn a digital interface on their own. The research reported here examined these two conditions and three others.

  12. A heuristic multi-criteria classification approach incorporating data quality information for choropleth mapping

    PubMed Central

    Sun, Min; Wong, David; Kronenfeld, Barry

    2016-01-01

    Despite conceptual and technological advancements in cartography over the decades, choropleth map design and classification fail to address a fundamental issue: estimates that are statistically indistinguishable may be assigned to different classes on maps, or vice versa. Recently, the class separability concept was introduced as a map classification criterion to evaluate the likelihood that estimates in two classes are statistically different. Unfortunately, choropleth maps created according to the separability criterion usually have highly unbalanced classes. To produce reasonably separable but more balanced classes, we propose a heuristic classification approach that considers not just the class separability criterion but also other classification criteria such as evenness and intra-class variability. A geovisual-analytic package was developed to support the heuristic mapping process, to evaluate the trade-offs between relevant criteria, and to select the most preferable classification. Class break values can be adjusted to improve the performance of a classification. PMID:28286426
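    The separability idea can be illustrated with a crude check; this is only an illustration of the concept, not the authors' exact metric. Two estimates assigned to adjacent classes count as separable when their confidence intervals do not overlap.

```python
def separable(est_a, se_a, est_b, se_b, z=1.645):
    """Crude class-separability check: True when the z-level confidence
    intervals of two estimates (value +/- z * standard error) do not overlap."""
    (e1, s1), (e2, s2) = sorted([(est_a, se_a), (est_b, se_b)])
    return e1 + z * s1 < e2 - z * s2

print(separable(10.0, 1.0, 20.0, 1.0))  # True: tight intervals, far apart
print(separable(10.0, 5.0, 12.0, 5.0))  # False: wide margins of error overlap
```

    A classification scheme guided by such a check would prefer class breaks at which estimates on either side of the break pass the test.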

  13. Flood Hazard Mapping over Large Regions using Geomorphic Approaches

    NASA Astrophysics Data System (ADS)

    Samela, Caterina; Troy, Tara J.; Manfreda, Salvatore

    2016-04-01

    Historically, man has always preferred to settle and live near water. This tendency has not changed over time, and today nineteen of the twenty most populated agglomerations in the world (Demographia World Urban Areas, 2015) are located along watercourses or at the mouth of a river. On one hand, these locations are advantageous from many points of view; on the other hand, they expose significant populations and economic assets to a degree of flood hazard. Knowing the location and extent of the areas exposed to flood hazards is essential to any strategy for minimizing the risk. Unfortunately, in data-scarce regions the use of traditional floodplain mapping techniques is prevented by the lack of the extensive data required, and this scarcity is generally most pronounced in developing countries. The present work aims to overcome this limitation by defining an alternative simplified procedure for a preliminary, but efficient, floodplain delineation. To validate the method in a data-rich environment, eleven flood-related morphological descriptors derived from DEMs have been used as linear binary classifiers over the Ohio River basin and its sub-catchments, measuring their performance in identifying floodplains as the topography and the size of the calibration area change. The best-performing classifiers among those analysed have been applied and validated across the continental U.S. The results suggest that the classifier based on the index ln(hr/H), named the Geomorphic Flood Index (GFI), is the most suitable for detecting flood-prone areas in data-scarce environments and for large-scale applications, providing good accuracy with low requirements in terms of data and computational cost.
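    Once the two DEM-derived quantities are in hand, the GFI classifier named above reduces to a one-line computation plus a calibrated threshold. In this sketch, `h_r` is assumed to come from a hydraulic-geometry scaling of the contributing area of the nearest stream element, and the threshold from calibration against known flood maps; both assumptions are ours, not details given in the abstract.

```python
import numpy as np

def gfi_classify(h_r, H, threshold=0.0):
    """Geomorphic Flood Index GFI = ln(h_r / H) used as a linear binary
    classifier: cells whose GFI meets a calibrated threshold are flagged
    flood-prone. h_r: water depth inferred for the nearest stream element;
    H: elevation of the cell above that stream element."""
    gfi = np.log(np.asarray(h_r) / np.asarray(H))
    return gfi >= threshold, gfi

# A cell barely above the stream is flagged; a cell 10x higher is not.
flood_prone, gfi = gfi_classify([1.0, 1.0], [0.5, 10.0])
print(flood_prone)  # [ True False]
```

    Calibrating the threshold on a small data-rich subregion and applying it basin-wide is what keeps the data and computational requirements low.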

  14. A chemical approach to mapping nucleosomes at base pair resolution in yeast.

    PubMed

    Brogaard, Kristin R; Xi, Liqun; Wang, Ji-Ping; Widom, Jonathan

    2012-01-01

    Most eukaryotic DNA exists in DNA-protein complexes known as nucleosomes. The exact locations of nucleosomes along the genome play a critical role in chromosome functions and gene regulation. However, the current methods for nucleosome mapping do not provide the necessary accuracy to identify the precise nucleosome locations. Here we describe a new experimental approach that directly maps nucleosome center locations in vivo genome-wide at single base pair resolution.

  15. A taxonomy of behaviour change methods: an Intervention Mapping approach.

    PubMed

    Kok, Gerjo; Gottlieb, Nell H; Peters, Gjalt-Jorn Y; Mullen, Patricia Dolan; Parcel, Guy S; Ruiter, Robert A C; Fernández, María E; Markham, Christine; Bartholomew, L Kay

    2016-09-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. 
Ideally, the taxonomy should be readily usable for this goal; but alternatively, it should be

  16. A taxonomy of behaviour change methods: an Intervention Mapping approach

    PubMed Central

    Kok, Gerjo; Gottlieb, Nell H.; Peters, Gjalt-Jorn Y.; Mullen, Patricia Dolan; Parcel, Guy S.; Ruiter, Robert A.C.; Fernández, María E.; Markham, Christine; Bartholomew, L. Kay

    2016-01-01

    ABSTRACT In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. 
Ideally, the taxonomy should be readily usable for this goal; but alternatively, it

  17. Fractional-filling loophole insulator domains for ultracold bosons in optical superlattices

    SciTech Connect

    Buonsante, P.; Penna, V.; Vezzani, A.

    2004-12-01

    The zero-temperature phase diagram of a Bose-Einstein condensate confined in realistic one-dimensional l-periodic optical superlattices is investigated. The system of interacting bosons is modeled in terms of a Bose-Hubbard Hamiltonian whose site-dependent local potentials and hopping amplitudes reflect the periodicity of the lattice partition in l-site cells. Relying on the exact mapping between the hardcore limit of the boson Hamiltonian and the model of spinless noninteracting fermions, incompressible insulator domains are shown to exist for rational fillings that are predicted to be compressible in the atomic limit. The corresponding boundaries, qualitatively described in a multiple-site mean-field approach, are shown to exhibit an unusual loophole shape. A more quantitative description of the loophole domain boundaries at half filling for the special case l=2 is supplied in terms of analytic strong-coupling expansions and quantum Monte Carlo simulations.
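    The exact hardcore-boson-to-fermion mapping invoked here is, in one dimension, the Jordan-Wigner transformation (quoted in its standard form; site labels are generic):

```latex
b_j^{\dagger} = f_j^{\dagger} \exp\!\Big( i\pi \sum_{l<j} f_l^{\dagger} f_l \Big),
\qquad
n_j = b_j^{\dagger} b_j = f_j^{\dagger} f_j ,
```

    where the f operators obey canonical anticommutation relations. Because the string factors cancel in nearest-neighbour hopping terms, the hardcore limit of the Bose-Hubbard Hamiltonian becomes quadratic in the fermions, and filled fermionic bands of the l-periodic superlattice give the incompressible domains at the rational fillings described above.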

  18. [Recent progress in gene mapping through high-throughput sequencing technology and forward genetic approaches].

    PubMed

    Lu, Cairui; Zou, Changsong; Song, Guoli

    2015-08-01

    Traditional gene mapping using forward genetic approaches relies primarily on construction of a genetic linkage map, a process that is tedious and time-consuming and often results in low mapping accuracy and large mapping intervals. With the rapid development of high-throughput sequencing technology and the decreasing cost of sequencing, a variety of simple and quick methods of gene mapping through sequencing have been developed, including direct sequencing of the mutant genome, sequencing of selective mutant DNA pools, genetic map construction through sequencing of individuals in a population, and sequencing of the transcriptome or partial genome. These methods can identify mutations at the nucleotide level and have been applied in complex genetic backgrounds. Recent reports have shown that mapping by sequencing can even be done without a reference genome sequence, hybridization, or genetic linkage information, which makes it possible to perform forward genetic studies in many non-model species. In this review, we summarize these new technologies and their application in gene mapping.

  19. Streamlined approach to mapping the magnetic induction of skyrmionic materials.

    PubMed

    Chess, Jordan J; Montoya, Sergio A; Harvey, Tyler R; Ophus, Colin; Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E; McMorran, Benjamin J

    2017-02-28

    Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.
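    For the uniform-intensity case the TIE reduces to a Poisson equation, ∂I/∂z = −(λ/2π) I₀ ∇²φ, which can be inverted in Fourier space. The sketch below is a generic Fourier-space TIE solver illustrating that inversion, not the authors' single-image method; the synthetic consistency check at the end is ours.

```python
import numpy as np

def tie_phase(dIdz, I0, wavelength, dx):
    """Recover the phase phi from the longitudinal intensity derivative dI/dz
    via the uniform-intensity TIE, dI/dz = -(wavelength / 2 pi) * I0 * lap(phi),
    using a Fourier-space inverse Laplacian. The k = 0 (mean-phase) mode is
    unrecoverable and is set to zero."""
    ny, nx = dIdz.shape
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    k2 = kx[None, :] ** 2 + ky[:, None] ** 2
    k2[0, 0] = 1.0                           # avoid division by zero at k = 0
    phi_hat = (2 * np.pi / (wavelength * I0)) * np.fft.fft2(dIdz) / k2
    phi_hat[0, 0] = 0.0
    return np.real(np.fft.ifft2(phi_hat))

# Consistency check on a synthetic periodic phase (exact FFT modes).
n, dx0, lam, I0 = 64, 1.0, 2.5e-3, 1.0
L = n * dx0
yy, xx = np.mgrid[0:n, 0:n] * dx0
phi_true = np.sin(2 * np.pi * xx / L) + np.cos(4 * np.pi * yy / L)
lap = (-(2 * np.pi / L) ** 2 * np.sin(2 * np.pi * xx / L)
       - (4 * np.pi / L) ** 2 * np.cos(4 * np.pi * yy / L))
dIdz = -(lam * I0 / (2 * np.pi)) * lap
phi_rec = tie_phase(dIdz, I0, lam, dx0)
```

    In the conventional three-image workflow, dI/dz is estimated by finite differences of over- and under-focused images; the simplification reported in the paper is that for uniform thin films a single defocused image suffices.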

  20. Semi-automatic classification of glaciovolcanic landforms: An object-based mapping approach based on geomorphometry

    NASA Astrophysics Data System (ADS)

    Pedersen, G. B. M.

    2016-02-01

    A new object-oriented approach is developed to classify glaciovolcanic landforms (Procedure A) and their landform elements boundaries (Procedure B). It utilizes the principle that glaciovolcanic edifices are geomorphometrically distinct from lava shields and plains (Pedersen and Grosse, 2014), and the approach is tested on data from Reykjanes Peninsula, Iceland. The outlined procedures utilize slope and profile curvature attribute maps (20 m/pixel) and the classified results are evaluated quantitatively through error matrix maps (Procedure A) and visual inspection (Procedure B). In procedure A, the highest obtained accuracy is 94.1%, but even simple mapping procedures provide good results (> 90% accuracy). Successful classification of glaciovolcanic landform element boundaries (Procedure B) is also achieved and this technique has the potential to delineate the transition from intraglacial to subaerial volcanic activity in orthographic view. This object-oriented approach based on geomorphometry overcomes issues with vegetation cover, which has been typically problematic for classification schemes utilizing spectral data. Furthermore, it handles complex edifice outlines well and is easily incorporated into a GIS environment, where results can be edited or fused with other mapping results. The approach outlined here is designed to map glaciovolcanic edifices within the Icelandic neovolcanic zone but may also be applied to similar subaerial or submarine volcanic settings, where steep volcanic edifices are surrounded by flat plains.
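    The geomorphometric attribute layers that drive the classification can be computed directly from the DEM. A minimal sketch of the slope layer follows (central differences via `np.gradient`; the 20 m default mirrors the paper's raster resolution, everything else is generic):

```python
import numpy as np

def slope_deg(dem, cell=20.0):
    """Slope attribute map in degrees from a DEM, via central differences
    (one-sided at the edges): the kind of layer thresholded in Procedure A."""
    dzdy, dzdx = np.gradient(dem, cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# A plane rising 1 m per metre eastward has a 45-degree slope everywhere.
ramp = np.tile(np.arange(5.0), (5, 1))
slopes = slope_deg(ramp, cell=1.0)
```

    Profile curvature is obtained analogously from second derivatives; thresholding and segmenting these layers, rather than spectral bands, is what makes the approach robust to vegetation cover.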

  1. Approaches to interval mapping of QTL in a multigeneration pedigree: the example of porcine chromosome 4.

    PubMed

    Knott, S A; Nyström, P E; Andersson-Eklund, L; Stern, S; Marklund, L; Andersson, L; Haley, C S

    2002-02-01

    Quantitative trait loci (QTLs) have been mapped in many studies of F2 populations derived from crosses between diverse lines. One approach to confirming these effects and improving the mapping resolution is genetic chromosome dissection through a backcrossing programme. Analysis by interval mapping of the data generated is likely to provide additional power and resolution compared with treating data marker by marker. However, interval mapping approaches for such a programme are not well developed, especially where the founder lines were outbred. We explore alternative approaches to analysis using, as an example, data from chromosome 4 in an intercross between wild boar and Large White pigs where QTLs have been previously identified. A least squares interval mapping procedure was used to study growth rate and carcass traits in a subsequent second backcross generation (BC2). This procedure requires the probability of inheriting a wild boar allele for each BC2 animal for locations throughout the chromosome. Two methods for obtaining these probabilities were compared: stochastic or deterministic. The two methods gave similar probabilities for inheriting wild boar alleles and, hence, gave very similar results from the QTL analysis. The deterministic approach has the advantage of being much faster to run but requires specialized software. A QTL for fatness and for growth were confirmed and, in addition, a QTL for piglet growth from weaning at 5 weeks up to 7 weeks of age and another for carcass length were detected.
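    The least squares scan at the heart of such an interval mapping procedure can be sketched as a per-position regression of the trait on the probability of inheriting a wild boar allele. The function and the simulated example below are illustrative, not the authors' software.

```python
import numpy as np

def ls_scan(prob_wb, trait):
    """For each chromosome position, regress the trait on the probability of
    carrying the wild boar allele and return the F statistic of the QTL
    effect (full model vs. mean-only model)."""
    n, m = prob_wb.shape
    y = trait - trait.mean()
    rss0 = float(y @ y)                      # null model: mean only
    f = np.empty(m)
    for j in range(m):
        x = prob_wb[:, j] - prob_wb[:, j].mean()
        beta = float(x @ y) / float(x @ x)   # least squares QTL effect
        resid = y - beta * x
        f[j] = (rss0 - float(resid @ resid)) / (float(resid @ resid) / (n - 2))
    return f

# Simulated example: a QTL acts at position 3 of 10.
rng = np.random.default_rng(0)
probs = rng.uniform(0.0, 1.0, size=(200, 10))
y = 2.0 * probs[:, 3] + rng.normal(0.0, 0.5, size=200)
scan = ls_scan(probs, y)
```

    In a real backcross the columns of `prob_wb` are not independent draws but probabilities computed from flanking markers (stochastically or deterministically, as compared in the paper), so the F profile is smooth along the chromosome and its peak locates the QTL.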

  2. Mapping paths: new approaches to dissect eukaryotic signaling circuitry

    PubMed Central

    Mutlu, Nebibe; Kumar, Anuj

    2016-01-01

    Eukaryotic cells are precisely “wired” to coordinate changes in external and intracellular signals with corresponding adjustments in the output of complex and often interconnected signaling pathways. These pathways are critical in understanding cellular growth and function, and several experimental trends are emerging with applicability toward more fully describing the composition and topology of eukaryotic signaling networks. In particular, recent studies have implemented CRISPR/Cas-based screens in mouse and human cell lines for genes involved in various cell growth and disease phenotypes. Proteomic methods using mass spectrometry have enabled quantitative and dynamic profiling of protein interactions, revealing previously undiscovered complexes and allele-specific protein interactions. Methods for the single-cell study of protein localization and gene expression have been integrated with computational analyses to provide insight into cell signaling in yeast and metazoans. In this review, we present an overview of exemplary studies using the above approaches, relevant for the analysis of cell signaling and indeed, more broadly, for many modern biological applications. PMID:27540473

  3. Mapping seabed sediments: Comparison of manual, geostatistical, object-based image analysis and machine learning approaches

    NASA Astrophysics Data System (ADS)

    Diesing, Markus; Green, Sophie L.; Stephens, David; Lark, R. Murray; Stewart, Heather A.; Dove, Dayton

    2014-08-01

Marine spatial planning and conservation need underpinning with sufficiently detailed and accurate seabed substrate and habitat maps. Although multibeam echosounders enable us to map the seabed with high resolution and spatial accuracy, there is still a lack of fit-for-purpose seabed maps. This is due to the high costs involved in carrying out systematic seabed mapping programmes and the fact that the development of validated, repeatable, quantitative and objective methods of swath acoustic data interpretation is still in its infancy. We compared a wide spectrum of approaches including manual interpretation, geostatistics, object-based image analysis and machine learning to gain further insights into the accuracy and comparability of acoustic data interpretation approaches based on multibeam echosounder data (bathymetry, backscatter and derivatives) and seabed samples, with the aim of deriving seabed substrate maps. Sample data were split into a training and validation data set to allow us to carry out an accuracy assessment. Overall thematic classification accuracy ranged from 67% to 76% and Cohen's kappa varied between 0.34 and 0.52. However, these differences were not statistically significant at the 5% level. Misclassifications were mainly associated with uncommon classes, which were rarely sampled. Map outputs were between 68% and 87% identical. To improve classification accuracy in seabed mapping, we suggest that more studies on the factors affecting classification performance, as well as comparative studies testing different approaches, are needed to develop guidelines for selecting an appropriate method for a given dataset. In the meantime, classification accuracy might be improved by combining different techniques into hybrid approaches and multi-method ensembles.
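The two accuracy measures quoted above (overall thematic accuracy and Cohen's kappa) follow directly from a confusion matrix built on the validation split. A minimal sketch of that accuracy assessment (the matrix values below are invented for illustration):

```python
def accuracy_and_kappa(confusion):
    """confusion[i][j] = number of validation samples of true class i
    predicted as class j. Returns (overall accuracy, Cohen's kappa)."""
    k = len(confusion)
    n = sum(sum(row) for row in confusion)
    # observed agreement: fraction on the diagonal
    p_o = sum(confusion[i][i] for i in range(k)) / n
    # chance agreement from row and column marginals
    row_tot = [sum(row) for row in confusion]
    col_tot = [sum(confusion[i][j] for i in range(k)) for j in range(k)]
    p_e = sum(r * c for r, c in zip(row_tot, col_tot)) / (n * n)
    return p_o, (p_o - p_e) / (1 - p_e)
```

Kappa corrects the raw accuracy for agreement expected by chance, which is why it is routinely reported alongside overall accuracy in map validation studies such as this one.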

  4. Simulating spin-boson models with matrix product states

    NASA Astrophysics Data System (ADS)

    Wall, Michael; Safavi-Naini, Arghavan; Rey, Ana Maria

    2016-05-01

The global coupling of few-level quantum systems (``spins'') to a discrete set of bosonic modes is a key ingredient for many applications in quantum science, including large-scale entanglement generation, quantum simulation of the dynamics of long-range interacting spin models, and hybrid platforms for force and spin sensing. In many situations, the bosons are integrated out, leading to effective long-range interactions between the spins; however, strong spin-boson coupling invalidates this approach, and spin-boson entanglement degrades the fidelity of quantum simulation of spin models. We present a general numerical method for treating the out-of-equilibrium dynamics of spin-boson systems based on matrix product states. While most efficient for weak coupling or small numbers of boson modes, our method applies for any spatial and operator dependence of the spin-boson coupling. In addition, our approach allows straightforward computation of many quantities of interest, such as the full counting statistics of collective spin measurements and quantum simulation infidelity due to spin-boson entanglement. We apply our method to ongoing trapped ion quantum simulator experiments in analytically intractable regimes. This work is supported by JILA-NSF-PFC-1125844, NSF-PIF-1211914, ARO, AFOSR, AFOSR-MURI, and the NRC.

  5. An integrated two-stage support vector machine approach to forecast inundation maps during typhoons

    NASA Astrophysics Data System (ADS)

    Jhong, Bing-Chen; Wang, Jhih-Huang; Lin, Gwo-Fong

    2017-04-01

During typhoons, accurate forecasts of hourly inundation depths are essential for inundation warning and mitigation. Because observed inundation maps are scarce, sufficient data are not available for developing inundation forecasting models. In this paper, inundation depths simulated and validated by a physically based two-dimensional model (FLO-2D) are used as a database for inundation forecasting. A two-stage inundation forecasting approach based on the Support Vector Machine (SVM) is proposed to yield 1- to 6-h lead-time inundation maps during typhoons. In the first stage (point forecasting), the proposed approach considers not only rainfall intensity and inundation depth as model inputs but also cumulative rainfall and previously forecasted inundation depths. In the second stage (spatial expansion), the geographic information of inundation grids and the inundation forecasts of reference points are used to yield inundation maps. The results clearly indicate that the proposed approach effectively improves the forecasting performance and decreases the negative impact of increasing forecast lead time. Moreover, the proposed approach is capable of providing accurate inundation maps for 1- to 6-h lead times. In conclusion, the proposed two-stage forecasting approach is suitable and useful for improving inundation forecasting during typhoons, especially for long lead times.
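The two-stage data flow can be sketched as follows. The paper trains SVMs in both stages; to keep this illustration dependency-free, stage one is a toy linear point forecaster and stage two expands the reference-point forecasts to the grid by inverse-distance weighting. Both are stand-ins for the trained models, and all coefficients and coordinates below are invented:

```python
def stage1_point_forecast(rain_now, cum_rain, depth_now):
    """Stage 1 (point forecasting): toy linear rule standing in for the
    trained SVM, mapping rainfall intensity, cumulative rainfall and the
    current inundation depth to a forecast depth at a reference point."""
    return max(0.0, 0.4 * rain_now + 0.1 * cum_rain + 0.8 * depth_now)

def stage2_spatial_expansion(grid_xy, ref_points):
    """Stage 2 (spatial expansion): spread reference-point forecasts
    (x, y, depth) over all grid cells by inverse-distance weighting."""
    forecasts = []
    for gx, gy in grid_xy:
        num = den = 0.0
        for rx, ry, f in ref_points:
            d2 = (gx - rx) ** 2 + (gy - ry) ** 2
            if d2 == 0:                 # grid cell coincides with a reference point
                num, den = f, 1.0
                break
            w = 1.0 / d2
            num += w * f
            den += w
        forecasts.append(num / den)
    return forecasts
```

The same structure applies at every lead time: stage one is rerun with the lead-time-specific model, then stage two turns the point forecasts into a full inundation map.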

  6. A Highly Efficient Approach to Protein Interactome Mapping Based on Collaborative Filtering Framework

    NASA Astrophysics Data System (ADS)

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-01

The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Despite their high accuracy, finely tuned small-scale experiments are expensive and inefficient for identifying numerous interactions. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to extract further useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data as an interactome weight matrix, from which the feature vectors of involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among involved proteins, which drives the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach significantly outperforms several sophisticated topology-based approaches.
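The neighborhood-based scoring idea can be sketched on a small binary interactome matrix. The exact form of the paper's rescaled cosine coefficient is not given in the abstract, so plain cosine similarity stands in for it here, and the matrix is invented:

```python
import math

def cosine_sim(u, v):
    """Cosine similarity between two interaction profile vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def cf_score(W, i, j):
    """CF-style confidence that proteins i and j interact: average the
    evidence W[k][j] over other proteins k, weighted by how similar k's
    interaction profile is to i's (a neighborhood-based CF predictor)."""
    sims = [(cosine_sim(W[i], W[k]), k) for k in range(len(W)) if k != i]
    num = sum(s * W[k][j] for s, k in sims)
    den = sum(abs(s) for s, _ in sims)
    return num / den if den else 0.0
```

A high score means proteins with interaction profiles similar to protein i tend to interact with protein j, which is the CF analogue of "users like you rated this item highly".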

  7. Mapping water quality and substrate cover in optically complex coastal and reef waters: an integrated approach.

    PubMed

    Phinn, S R; Dekker, A G; Brando, V E; Roelfsema, C M

    2005-01-01

    Sustainable management of coastal and coral reef environments requires regular collection of accurate information on recognized ecosystem health indicators. Satellite image data and derived maps of water column and substrate biophysical properties provide an opportunity to develop baseline mapping and monitoring programs for coastal and coral reef ecosystem health indicators. A significant challenge for satellite image data in coastal and coral reef water bodies is the mixture of both clear and turbid waters. A new approach is presented in this paper to enable production of water quality and substrate cover type maps, linked to a field based coastal ecosystem health indicator monitoring program, for use in turbid to clear coastal and coral reef waters. An optimized optical domain method was applied to map selected water quality (Secchi depth, Kd PAR, tripton, CDOM) and substrate cover type (seagrass, algae, sand) parameters. The approach is demonstrated using commercially available Landsat 7 Enhanced Thematic Mapper image data over a coastal embayment exhibiting the range of substrate cover types and water quality conditions commonly found in sub-tropical and tropical coastal environments. Spatially extensive and quantitative maps of selected water quality and substrate cover parameters were produced for the study site. These map products were refined by interactions with management agencies to suit the information requirements of their monitoring and management programs.

  8. A multi-temporal analysis approach for land cover mapping in support of nuclear incident response

    NASA Astrophysics Data System (ADS)

    Sah, Shagan; van Aardt, Jan A. N.; McKeown, Donald M.; Messinger, David W.

    2012-06-01

Remote sensing can be used to rapidly generate land use maps for assisting emergency response personnel with resource deployment decisions and impact assessments. In this study we focus on constructing accurate land cover maps of the impacted area in the case of a nuclear material release. The proposed methodology involves integrating results from two different approaches to increase classification accuracy. The data used included RapidEye scenes over Nine Mile Point Nuclear Power Station (Oswego, NY). The first step was building a coarse-scale land cover map from freely available, high temporal resolution, MODIS data using a time-series approach. In the case of a nuclear accident, high spatial resolution commercial satellites such as RapidEye or IKONOS can acquire images of the affected area. Land use maps from the two image sources were integrated using a probability-based approach. Classification results were obtained for four land classes - forest, urban, water and vegetation - using Euclidean and Mahalanobis distances as metrics. Despite the coarse resolution of MODIS pixels, acceptable accuracies were obtained using time-series features. The overall accuracies using the fusion-based approach were in the neighborhood of 80% when compared with GIS data sets from New York State. The classifications were augmented using this fused approach, with a few supplementary advantages such as correction for cloud cover and independence from time of year. We conclude that this method can generate highly accurate land maps using coarse spatial resolution time-series satellite imagery and a single-date, high spatial resolution, multi-spectral image.
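The probability-based integration can be sketched per pixel. The abstract does not specify the exact fusion rule, so the normalized product below is an assumption (one standard way to combine independent per-class probabilities), and the class probabilities are invented:

```python
def fuse_probabilities(p_coarse, p_fine):
    """Fuse per-class probabilities from the coarse (MODIS time-series) and
    fine (RapidEye/IKONOS) classifiers by normalized product - an assumed
    stand-in for the paper's probability-based integration."""
    prod = [a * b for a, b in zip(p_coarse, p_fine)]
    s = sum(prod)
    return [p / s for p in prod] if s else list(p_fine)

def classify(p_coarse, p_fine,
             classes=("forest", "urban", "water", "vegetation")):
    """Assign the pixel to the class with the highest fused probability."""
    fused = fuse_probabilities(p_coarse, p_fine)
    return classes[max(range(len(fused)), key=fused.__getitem__)]
```

The product rule rewards classes that both sensors agree on, which is the intuition behind fusing a temporally rich but coarse map with a spatially sharp single-date one.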

  9. Mapping raised bogs with an iterative one-class classification approach

    NASA Astrophysics Data System (ADS)

    Mack, Benjamin; Roscher, Ribana; Stenzel, Stefanie; Feilhauer, Hannes; Schmidtlein, Sebastian; Waske, Björn

    2016-10-01

Land use and land cover maps are one of the most commonly used remote sensing products. In many applications the user only requires a map of one particular class of interest, e.g. a specific vegetation type or an invasive species. One-class classifiers are appealing alternatives to common supervised classifiers because they can be trained with labeled training data of the class of interest only. However, training an accurate one-class classification (OCC) model is challenging, particularly when facing a large image, a small class and few training samples. To tackle these problems we propose an iterative OCC approach. The presented approach uses a biased Support Vector Machine as core classifier. In an iterative pre-classification step a large part of the pixels not belonging to the class of interest is classified. The remaining data is classified by a final classifier with a novel model and threshold selection approach. The specific objective of our study is the classification of raised bogs in a study site in southeast Germany, using multi-seasonal RapidEye data and a small number of training samples. Results demonstrate that the iterative OCC outperforms other state-of-the-art one-class classifiers and approaches for model selection. The study highlights the potential of the proposed approach for an efficient and improved mapping of small classes such as raised bogs. Overall the proposed method constitutes a feasible and useful modification of a regular one-class classifier.
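The iterative pre-classification idea can be caricatured without a biased SVM: in rounds with a tightening threshold, discard the pixels that clearly do not belong to the class of interest, so the final classifier only sees the hard cases. The centroid-distance rule below is a toy stand-in for the biased SVM core classifier, with invented data:

```python
def iterative_occ(pixels, positives, radius_schedule):
    """Toy iterative one-class pre-classification: pixels far (in feature
    space) from the centroid of the labeled class-of-interest samples are
    discarded in rounds with a shrinking radius; the survivors are the
    candidate class pixels passed to the final classifier."""
    dim = len(positives[0])
    centroid = [sum(p[d] for p in positives) / len(positives)
                for d in range(dim)]
    candidates = list(pixels)
    for r in radius_schedule:
        candidates = [p for p in candidates
                      if sum((a - c) ** 2 for a, c in zip(p, centroid)) <= r * r]
    return candidates
```

The point of the iteration is efficiency on large images: most pixels are eliminated cheaply early on, and the expensive model/threshold selection only runs on the remaining ambiguous data.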

  10. Higgs boson photoproduction at the LHC

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2011-07-15

    We present the current development of the photoproduction approach for the Higgs boson with its application to pp and pA collisions at the LHC. We perform a different analysis for the Gap Survival Probability, where we consider a probability of 3% and also a more optimistic value of 10% based on the HERA data for dijet production. As a result, the cross section for the exclusive Higgs boson production is about 2 fb and 6 fb in pp collisions and 617 and 2056 fb for pPb collisions, considering the gap survival factor of 3% and 10%, respectively.

  11. A whole spectroscopic mapping approach for studying the spatial distribution of pigments in paintings

    NASA Astrophysics Data System (ADS)

    Mosca, S.; Alberti, R.; Frizzi, T.; Nevin, A.; Valentini, G.; Comelli, D.

    2016-09-01

We propose a non-invasive approach for the identification and mapping of pigments in paintings. The method is based on three highly complementary imaging spectroscopy techniques, visible multispectral imaging, X-ray fluorescence mapping and Raman mapping, combined with multivariate data analysis of multidimensional spectroscopic datasets for the extraction of key distribution information in a semi-automatic way. The proposed approach exploits a macro-Raman mapping device capable of detecting Raman signals from non-perfectly planar surfaces without the need for refocusing. Here, we show that the presence of spatially correlated Raman signals, detected in adjacent points of a painted surface, reinforces the level of confidence for material identification with respect to single-point analysis, even in the presence of very weak and complex Raman signals. The new whole-mapping approach not only provides the identification of inorganic and organic pigments but also gives striking information on the spatial distribution of pigments employed in complex mixtures for achieving different hues. Moreover, we demonstrate how the synergic combination of three spectroscopic methods, characterized by very different acquisition times, yields maximum information.

  12. Approximate gauge symmetry of composite vector bosons

    NASA Astrophysics Data System (ADS)

    Suzuki, Mahiko

    2010-08-01

    It can be shown in a solvable field theory model that the couplings of the composite vector bosons made of a fermion pair approach the gauge couplings in the limit of strong binding. Although this phenomenon may appear accidental and special to the vector bosons made of a fermion pair, we extend it to the case of bosons being constituents and find that the same phenomenon occurs in a more intriguing way. The functional formalism not only facilitates computation but also provides us with a better insight into the generating mechanism of approximate gauge symmetry, in particular, how the strong binding and global current conservation conspire to generate such an approximate symmetry. Remarks are made on its possible relevance or irrelevance to electroweak and higher symmetries.

  13. Deciphering the genomic architecture of the stickleback brain with a novel multilocus gene-mapping approach.

    PubMed

    Li, Zitong; Guo, Baocheng; Yang, Jing; Herczeg, Gábor; Gonda, Abigél; Balázs, Gergely; Shikano, Takahito; Calboli, Federico C F; Merilä, Juha

    2017-03-01

Quantitative traits important to organismal function and fitness, such as brain size, are presumably controlled by many small-effect loci. Deciphering the genetic architecture of such traits with traditional quantitative trait locus (QTL) mapping methods is challenging. Here, we investigated the genetic architecture of brain size (and the size of five different brain parts) in nine-spined sticklebacks (Pungitius pungitius) with the aid of novel multilocus QTL-mapping approaches based on a de-biased LASSO method. Apart from having more statistical power to detect QTL and reduced rate of false positives than conventional QTL-mapping approaches, the developed methods can handle large marker panels and provide estimates of genomic heritability. Single-locus analyses of an F2 interpopulation cross with 239 individuals and 15,198 fully informative single nucleotide polymorphisms (SNPs) uncovered 79 QTL associated with variation in stickleback brain size traits. Many of these loci were in strong linkage disequilibrium (LD) with each other, and consequently, a multilocus mapping of individual SNPs, accounting for LD structure in the data, recovered only four significant QTL. However, a multilocus mapping of SNPs grouped by linkage group (LG) identified 14 LGs (1-6 depending on the trait) that influence variation in brain traits. For instance, 17.6% of the variation in relative brain size was explainable by cumulative effects of SNPs distributed over six LGs, whereas 42% of the variation was accounted for by all 21 LGs. Hence, the results suggest that variation in stickleback brain traits is influenced by many small-effect loci. Apart from suggesting moderately heritable (h² ≈ 0.15-0.42) multifactorial genetic architecture of brain traits, the results highlight the challenges in identifying the loci contributing to variation in quantitative traits. Nevertheless, the results demonstrate that the novel QTL-mapping approach developed here has distinctive advantages

  14. Integrated environmental mapping and monitoring, a methodological approach to optimise knowledge gathering and sampling strategy.

    PubMed

    Nilssen, Ingunn; Ødegård, Øyvind; Sørensen, Asgeir J; Johnsen, Geir; Moline, Mark A; Berge, Jørgen

    2015-07-15

New technology has led to new opportunities for a holistic environmental monitoring approach adjusted to the purpose and object of interest. The proposed integrated environmental mapping and monitoring (IEMM) concept, presented in this paper, describes the different steps in such a system, from mission of survey to selection of parameters, sensors, sensor platforms, data collection, data storage, analysis and data interpretation for reliable decision making. The system is generic; it can be used by authorities, industry and academia and is useful for both the planning and operational phases. In the planning process the systematic approach is also ideal for identifying areas with knowledge gaps. The critical stages of the concept are discussed and exemplified by two case studies, one environmental mapping case and one monitoring case. As an operational system, the IEMM concept can contribute to optimised integrated environmental mapping and monitoring for knowledge generation as a basis for decision making.

  15. Concept Maps in the Classroom: A New Approach to Reveal Students' Conceptual Change

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Liefländer, Anne K.; Bogner, Franz X.

    2015-01-01

    When entering the classroom, adolescents already hold various conceptions on science topics. Concept maps may function as useful tools to reveal such conceptions although labor-intensive analysis often prevents application in typical classroom situations. The authors aimed to provide teachers with an appropriate approach to analyze students'…

  16. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    ERIC Educational Resources Information Center

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  17. Orbital stability during the mapping and approach phases of the MarcoPolo-R spacecraft

    NASA Astrophysics Data System (ADS)

    Wickhusen, K.; Hussmann, H.; Oberst, J.; Luedicke, F.

    2012-09-01

In support of the MarcoPolo-R mission we are analyzing the motion of the spacecraft in the vicinity of its primary target, the binary asteroid system 175706 (1996 FG3). We ran simulations in support of the general mapping, approach, and sampling phases.

  18. The Criterion-Related Validity of a Computer-Based Approach for Scoring Concept Maps

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Koul, Ravinder; Salehi, Roya

    2006-01-01

    This investigation seeks to confirm a computer-based approach that can be used to score concept maps (Poindexter & Clariana, 2004) and then describes the concurrent criterion-related validity of these scores. Participants enrolled in two graduate courses (n=24) were asked to read about and research online the structure and function of the heart…

  19. Does Constructivist Approach Applicable through Concept Maps to Achieve Meaningful Learning in Science?

    ERIC Educational Resources Information Center

    Jena, Ananta Kumar

    2012-01-01

    This study deals with the application of constructivist approach through individual and cooperative modes of spider and hierarchical concept maps to achieve meaningful learning on science concepts (e.g. acids, bases & salts, physical and chemical changes). The main research questions were: Q (1): is there any difference in individual and…

  20. Improving Students' Creative Thinking and Achievement through the Implementation of Multiple Intelligence Approach with Mind Mapping

    ERIC Educational Resources Information Center

    Widiana, I. Wayan; Jampel, I. Nyoman

    2016-01-01

    This classroom action research aimed to improve the students' creative thinking and achievement in learning science. It conducted through the implementation of multiple intelligences with mind mapping approach and describing the students' responses. The subjects of this research were the fifth grade students of SD 8 Tianyar Barat, Kubu, and…

  1. Determination of contact maps in proteins: A combination of structural and chemical approaches

    SciTech Connect

    Wołek, Karol; Cieplak, Marek

    2015-12-28

Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well-defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for the validity of an inter-residue contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, neither of these maps correlates well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties, but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.
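The validity criterion that rCSU adds is stated explicitly in the abstract and translates directly into code. How each atomic contact is labeled attractive or repulsive (from the residues' atom types and charges) is outside the scope of this sketch; here the labels are taken as given:

```python
def rcsu_contact(atomic_contacts):
    """rCSU validity rule for a residue-residue contact: given the labels of
    all atomic contacts between the two residues ('attractive' or
    'repulsive'), the contact is accepted only when attractive atomic
    contacts outnumber repulsive ones."""
    attractive = sum(1 for c in atomic_contacts if c == "attractive")
    repulsive = sum(1 for c in atomic_contacts if c == "repulsive")
    return attractive > repulsive
```

Running this test over every residue pair produces the rCSU map, which is then merged with the sphere-overlap (OV) contacts to form OV+rCSU.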

  2. Mapping the genomic architecture of adaptive traits with interspecific introgressive origin: a coalescent-based approach.

    PubMed

    Hejase, Hussein A; Liu, Kevin J

    2016-01-11

    Recent studies of eukaryotes including human and Neandertal, mice, and butterflies have highlighted the major role that interspecific introgression has played in adaptive trait evolution. A common question arises in each case: what is the genomic architecture of the introgressed traits? One common approach that can be used to address this question is association mapping, which looks for genotypic markers that have significant statistical association with a trait. It is well understood that sample relatedness can be a confounding factor in association mapping studies if not properly accounted for. Introgression and other evolutionary processes (e.g., incomplete lineage sorting) typically introduce variation among local genealogies, which can also differ from global sample structure measured across all genomic loci. In contrast, state-of-the-art association mapping methods assume fixed sample relatedness across the genome, which can lead to spurious inference. We therefore propose a new association mapping method called Coal-Map, which uses coalescent-based models to capture local genealogical variation alongside global sample structure. Using simulated and empirical data reflecting a range of evolutionary scenarios, we compare the performance of Coal-Map against EIGENSTRAT, a leading association mapping method in terms of its popularity, power, and type I error control. Our empirical data makes use of hundreds of mouse genomes for which adaptive interspecific introgression has recently been described. We found that Coal-Map's performance is comparable or better than EIGENSTRAT in terms of statistical power and false positive rate. Coal-Map's performance advantage was greatest on model conditions that most closely resembled empirically observed scenarios of adaptive introgression. These conditions had: (1) causal SNPs contained in one or a few introgressed genomic loci and (2) varying rates of gene flow - from high rates to very low rates where incomplete lineage

  3. The W Boson Mass Measurement

    NASA Astrophysics Data System (ADS)

    Kotwal, Ashutosh V.

    2016-10-01

The measurement of the W boson mass has been growing in importance as its precision has improved, along with the precision of other electroweak observables and the top quark mass. Over the last decade, the measurement of the W boson mass has been led at hadron colliders. Combined with the precise measurement of the top quark mass at hadron colliders, the W boson mass helped to pin down the mass of the Standard Model Higgs boson through its induced radiative correction on the W boson mass. With the discovery of the Higgs boson and the measurement of its mass, the electroweak sector of the Standard Model is over-constrained. Increasing the precision of the W boson mass probes new physics at the TeV-scale. We summarize an extensive Tevatron (1984-2011) program to measure the W boson mass at the CDF and DØ experiments. We highlight the recent Tevatron measurements and prospects for the final Tevatron measurements.

  4. A hierarchical Bayesian-MAP approach to inverse problems in imaging

    NASA Astrophysics Data System (ADS)

    Raj, Raghu G.

    2016-07-01

We present a novel approach to inverse problems in imaging based on a hierarchical Bayesian-MAP (HB-MAP) formulation. In this paper we specifically focus on the difficult and basic inverse problem of multi-sensor (tomographic) imaging, wherein the source object of interest is viewed from multiple directions by independent sensors. Given the measurements recorded by these sensors, the problem is to reconstruct the image (of the object) with a high degree of fidelity. We employ a probabilistic graphical modeling extension of the compound Gaussian distribution as a global image prior within a hierarchical Bayesian inference procedure. Since the prior employed by our HB-MAP algorithm is general enough to subsume a wide class of priors, including those typically employed in compressive sensing (CS) algorithms, the HB-MAP algorithm offers a vehicle to extend the capabilities of current CS algorithms to include truly global priors. After rigorously deriving the regression algorithm for solving our inverse problem from first principles, we demonstrate the performance of the HB-MAP algorithm on Monte Carlo trials and on real empirical data (natural scenes). In all cases we find that our algorithm outperforms previous approaches in the literature, including filtered back-projection and a variety of state-of-the-art CS algorithms. We conclude with directions of future research emanating from this work.
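The MAP step itself can be illustrated with the simplest Gaussian special case: a zero-mean Gaussian prior on the image turns MAP estimation for a linear measurement model into ridge-regularized least squares. The sketch below is a toy stand-in (the paper's hierarchical compound-Gaussian prior is much richer) that minimizes ||Ax − y||² + λ||x||² by gradient descent:

```python
def map_estimate(A, y, lam, iters=5000, lr=0.01):
    """MAP estimate of x for y = A x + Gaussian noise under a zero-mean
    Gaussian prior on x: minimize ||A x - y||^2 + lam * ||x||^2 by plain
    gradient descent (toy illustration of the MAP principle, not HB-MAP)."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = A x - y
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        # gradient of the penalized objective: 2 A^T r + 2 lam x
        g = [2 * sum(A[i][j] * r[i] for i in range(m)) + 2 * lam * x[j]
             for j in range(n)]
        x = [xj - lr * gj for xj, gj in zip(x, g)]
    return x
```

With lam = 0 this reduces to ordinary least squares; increasing lam pulls the reconstruction toward the prior mean, the basic trade-off that any MAP imaging algorithm, including HB-MAP, negotiates.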

  5. Toward real-time three-dimensional mapping of surficial aquifers using a hybrid modeling approach

    NASA Astrophysics Data System (ADS)

    Friedel, Michael J.; Esfahani, Akbar; Iwashita, Fabio

    2016-02-01

A hybrid modeling approach is proposed for near real-time three-dimensional (3D) mapping of surficial aquifers. First, airborne frequency-domain electromagnetic (FDEM) measurements are numerically inverted to obtain subsurface resistivities. Second, a machine-learning (ML) algorithm is trained using the FDEM measurements and inverted resistivity profiles, together with borehole geophysical and hydrogeologic data. Third, the trained ML algorithm is used together with independent FDEM measurements to map the spatial distribution of the aquifer system. Efficacy of the hybrid approach is demonstrated for mapping a heterogeneous surficial aquifer and confining unit in northwestern Nebraska, USA. For this case, independent performance testing reveals that aquifer mapping is unbiased, with a strong correlation (0.94) between numerically inverted and ML-estimated binary (clay-silt or sand-gravel) layer resistivities (5-20 ohm-m or 21-5,000 ohm-m), and an intermediate correlation (0.74) for heterogeneous (clay, silt, sand, gravel) layer resistivities (5-5,000 ohm-m). Reduced correlation for the heterogeneous model is attributed to over-estimating the under-sampled high-resistivity gravels (about 0.5% of the training data); when these are removed, the correlation increases (0.87). Independent analysis of the numerically inverted and ML-estimated resistivities finds that the hybrid procedure preserves both univariate and spatial statistics for each layer. Following training, the algorithms can map 3D surficial aquifers as fast as leveled FDEM measurements are presented to the ML network.
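The third step, applying the trained ML algorithm to independent FDEM measurements, can be caricatured with a nearest-neighbour classifier standing in for the trained network. The binary labels and resistivity values below are invented for illustration (the binary case in the abstract separates clay-silt from sand-gravel):

```python
def train_1nn(features, labels):
    """'Training' for the 1-NN stand-in: store (feature, label) pairs built
    from inverted resistivity profiles and borehole lithology logs."""
    return list(zip(features, labels))

def predict_lithology(model, sounding):
    """Assign a new FDEM-derived sounding the lithology label of the closest
    training sounding (squared Euclidean distance in feature space)."""
    def d2(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(model, key=lambda pair: d2(pair[0], sounding))[1]
```

In the actual hybrid workflow the stored model is replaced by a trained ML network, but the deployment pattern is the same: once trained, each new leveled FDEM measurement is mapped to a lithology estimate essentially instantly, which is what makes near real-time 3D mapping feasible.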

  6. Higgs boson hunting

    SciTech Connect

    Dawson, S.; Haber, H.E.; Rindani, S.D.

    1989-05-01

    This is the summary report of the Higgs Boson Working Group. We discuss a variety of search techniques for a Higgs boson which is lighter than the Z. The processes K → πH, η′ → ηH, Υ → Hγ, and e⁺e⁻ → ZH are examined with particular attention paid to theoretical uncertainties in the calculations. We also briefly examine new features of Higgs phenomenology in a model which contains Higgs triplets as well as the usual doublet of scalar fields. 33 refs., 6 figs., 1 tab.

  7. Single-molecule approach to bacterial genomic comparisons via optical mapping.

    SciTech Connect

    Zhou, Shiguo; Kile, A.; Bechner, M.; Kvikstad, E.; Deng, W.; Wei, J.; Severin, J.; Runnheim, R.; Churas, C.; Forrest, D.; Dimalanta, E.; Lamers, C.; Burland, V.; Blattner, F. R.; Schwartz, David C.

    2004-01-01

    Modern comparative genomics has been established, in part, by the sequencing and annotation of a broad range of microbial species. To gain further insights, new sequencing efforts are now dealing with the variety of strains or isolates that give a species its definition and range; however, their number vastly outstrips our ability to sequence them. Given the availability of a large number of microbial species, new whole-genome approaches that maximize discovery must be developed to fully leverage this information at the level of strain diversity. Here, we describe how optical mapping, a single-molecule system, was used to identify and annotate chromosomal alterations between bacterial strains represented by several species. Since whole-genome optical maps are ordered restriction maps, sequenced strains of Shigella flexneri serotype 2a (2457T and 301), Yersinia pestis (CO 92 and KIM), and Escherichia coli were aligned as maps to identify regions of homology and to further characterize them as possible insertions, deletions, inversions, or translocations. Importantly, an unsequenced Shigella flexneri strain (serotype Y strain AMC[328Y]) was optically mapped and aligned with two sequenced ones to reveal one novel locus implicated in serotype conversion and several other loci containing insertion sequence elements or phage-related gene insertions. Our results suggest that genomic rearrangements and chromosomal breakpoints are readily identified and annotated against a prototypic sequenced strain by using the tools of optical mapping.

  8. Using Concept Mapping in Community-Based Participatory Research: A Mixed Methods Approach

    PubMed Central

    Windsor, Liliane Cambraia

    2015-01-01

    Community-based participatory research (CBPR) has been identified as a useful approach to increasing community involvement in research. Developing rigorous methods in conducting CBPR is an important step in gaining more support for this approach. The current article argues that concept mapping, a structured mixed methods approach, is useful in the initial development of a rigorous CBPR program of research aiming to develop culturally tailored and community-based health interventions for vulnerable populations. A research project examining social dynamics and consequences of alcohol and substance use in Newark, New Jersey, is described to illustrate the use of concept mapping methodology in CBPR. A total of 75 individuals participated in the study. PMID:26561484

  9. A new GIS approach for reconstructing and mapping dynamic late Holocene coastal plain palaeogeography

    NASA Astrophysics Data System (ADS)

    Pierik, H. J.; Cohen, K. M.; Stouthamer, E.

    2016-10-01

    The geomorphological development of Holocene coastal plains around the world has been studied since the beginning of the twentieth century from various disciplines, resulting in large amounts of data. However, the overwhelming quantity and heterogeneous nature of these data have left the accumulated knowledge inconsistent and fragmented. To keep improving the understanding of coastal plain geomorphology and geology, cataloguing of data and integration of knowledge are essential. In this paper we present a GIS that incorporates the accumulated data of the Netherlands' coastal plain and functions as a storage and integration tool for coastal plain mapped data. The GIS stores redigitised architectural elements (beach barriers, tidal channels, intertidal flats, supratidal flats, and coastal fresh water peat) from earlier mappings in separate map layers. A coupled catalogue-style database stores the dating information of these elements, besides references to source studies and annotations regarding changed insights. Using scripts, the system automatically establishes palaeogeographical maps for any chosen moment, combining the above mapping and dating information. In our approach, we strip the information to the architectural-element level and separate mapping from dating information, serving the automatic generation of time-slice maps. It enables a workflow in which the maker can iteratively regenerate maps, which speeds up fine-tuning and thus improves the quality of palaeogeographical reconstruction. The GIS currently covers the late Holocene coastal plain development of the Netherlands. This period witnessed widespread renewed flooding along the southern North Sea coast, coinciding with large-scale reclamation and human occupation. Our GIS method is generic and can be expanded and adapted to allow faster integrated processing of growing amounts of data for many coastal areas and other large urbanising lowlands around the world. It allows maintaining actual data
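
    The heart of the scripted time-slice generation, selecting the architectural elements whose dating range covers a chosen moment, can be sketched as follows. The record fields and the example elements are illustrative, not the actual GIS schema:

```python
from dataclasses import dataclass

@dataclass
class Element:
    """One mapped architectural element with its dating range
    (years CE; negative = BCE). Fields are hypothetical."""
    name: str
    kind: str          # e.g. 'tidal channel', 'beach barrier'
    start: int         # first year the element was active
    end: int           # last year the element was active

def time_slice(elements, year):
    # Keep the elements active at the chosen moment; drawing their
    # stored geometries then yields the palaeogeographical map.
    return [e for e in elements if e.start <= year <= e.end]

catalogue = [
    Element("channel A", "tidal channel", -500, 200),
    Element("barrier B", "beach barrier", 800, 1600),
]
active = time_slice(catalogue, 100)   # only "channel A" is active in 100 CE
```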

  10. An automated approach for mapping persistent ice and snow cover over high latitude regions

    USGS Publications Warehouse

    Selkowitz, David J.; Forster, Richard R.

    2016-01-01

    We developed an automated approach for mapping persistent ice and snow cover (glaciers and perennial snowfields) from Landsat TM and ETM+ data across a variety of topography, glacier types, and climatic conditions at high latitudes (above ~65°N). Our approach exploits all available Landsat scenes acquired during the late summer (1 August–15 September) over a multi-year period and employs an automated cloud masking algorithm optimized for snow and ice covered mountainous environments. Pixels from individual Landsat scenes were classified as snow/ice covered or snow/ice free based on the Normalized Difference Snow Index (NDSI), and pixels consistently identified as snow/ice covered over a five-year period were classified as persistent ice and snow cover. The same NDSI and ratio of snow/ice-covered days to total days thresholds applied consistently across eight study regions resulted in persistent ice and snow cover maps that agreed closely in most areas with glacier area mapped for the Randolph Glacier Inventory (RGI), with a mean accuracy (agreement with the RGI) of 0.96, a mean precision (user’s accuracy of the snow/ice cover class) of 0.92, a mean recall (producer’s accuracy of the snow/ice cover class) of 0.86, and a mean F-score (a measure that considers both precision and recall) of 0.88. We also compared results from our approach to glacier area mapped from high spatial resolution imagery at four study regions and found similar results. Accuracy was lowest in regions with substantial areas of debris-covered glacier ice, suggesting that manual editing would still be required in these regions to achieve reasonable results. The similarity of our results to those from the RGI as well as glacier area mapped from high spatial resolution imagery suggests it should be possible to apply this approach across large regions to produce updated 30-m resolution maps of persistent ice and snow cover. In the short term, automated PISC maps can be used to rapidly
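
    The per-pixel classification rule described above (NDSI threshold per scene, then a snow/ice-day fraction threshold across the multi-year stack) can be sketched directly. The threshold values below are illustrative placeholders, not the study's tuned values:

```python
import numpy as np

def persistent_snow_ice(green, swir, ndsi_thresh=0.4, day_frac_thresh=0.5):
    """Classify persistent ice/snow from a stack of late-summer scenes.

    green, swir: reflectance arrays of shape (n_scenes, rows, cols).
    A pixel is snow/ice in one scene if NDSI = (G - SWIR)/(G + SWIR)
    exceeds ndsi_thresh; it is *persistent* if it is snow/ice in at
    least day_frac_thresh of the scenes.
    """
    ndsi = (green - swir) / (green + swir)
    snow = ndsi > ndsi_thresh
    return snow.mean(axis=0) >= day_frac_thresh

# Four scenes over a 1 x 2 pixel area: the left pixel is always bright
# in green and dark in SWIR (snow); the right pixel is not.
g = np.array([[[0.8, 0.3]]] * 4)
s = np.array([[[0.1, 0.3]]] * 4)
mask = persistent_snow_ice(g, s)
```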

  11. Correlation Between Local Structure and Boson Peak in Metallic Glasses

    NASA Astrophysics Data System (ADS)

    Ahmad, Azkar Saeed; Zhao, Xiangnan; Xu, Mingxiang; Zhang, Dongxian; Hu, Junwen; Fecht, Hans J.; Wang, Xiaodong; Cao, Qingping; Jiang, J. Z.

    2017-01-01

    We performed a systematic study of the boson peak for six different Zr-based metallic glasses and found a universal correlation between the average local atomic structure and the boson peak. It is found that the boson peak can be decomposed into six characteristic vibratory modes, i.e., one Debye vibratory mode and five Einstein vibratory modes. By applying the Ioffe-Regel condition across all studied Zr-based metallic glasses, we reveal that the atomic pair correlation function maps directly onto the low-temperature dynamics and the origin of the boson peak, which is the sum of vibrations of local density fluctuation domains in the glasses. In addition, it is found that the Debye-type oscillators are the major contributors to the low-temperature specific heat capacities. This study opens a new way of understanding the relationship of the physical properties with the atomic arrangements in glasses.

  12. Effects of concept map teaching on students' critical thinking and approach to learning and studying.

    PubMed

    Chen, Shiah-Lian; Liang, Tienli; Lee, Mei-Li; Liao, I-Chen

    2011-08-01

    The purpose of this study was to explore the effects of concept mapping in developing critical thinking ability and approach to learning and studying. A quasi-experimental study design with a purposive sample was drawn from a group of nursing students enrolled in a medical-surgical nursing course in central Taiwan. Students in the experimental group were taught to use concept mapping in their learning. Students in the control group were taught by means of traditional lectures. After the intervention, the experimental group had better overall critical thinking scores than did the control group, although the difference was not statistically significant. After controlling for the effects of age and the pretest score on critical thinking using analysis of covariance, the experimental group had significantly higher adjusted mean scores on inference and overall critical thinking compared with the control group. Concept mapping is an effective tool for improving students' ability to think critically.

  13. Phase map retrieval in digital holography: avoiding the undersampling effect by a lateral shear approach.

    PubMed

    Ferraro, P; Del Core, C; Miccio, L; Grilli, S; De Nicola, S; Finizio, A; Coppola, G

    2007-08-01

    In digital holography (DH) the numerical reconstruction of the whole wavefront allows one to extract the wrapped phase map (mod 2π). It can occur that the reconstructed wrapped phase map in the image plane is undersampled because of the limited pixel size in that plane. In such a case the phase distribution cannot be retrieved correctly by the usual unwrapping procedures. We show that the use of the digital lateral-shearing interferometry approach in DH provides the correct reconstruction of the phase map in the image plane, even in extreme cases where the phase profile changes very rapidly. We demonstrate the effectiveness of the method in a particular case where the profile of a highly curved silicon microelectromechanical system membrane has to be reconstructed.
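
    The key idea, that the phase *difference* between the field and a laterally sheared copy varies slowly enough to be sampled even where the phase itself wraps too fast, can be illustrated in one dimension. The quadratic test phase and one-pixel shear below are illustrative, not the paper's experimental parameters:

```python
import numpy as np

def sheared_phase_map(field, shear):
    """Wrapped phase of field(x) * conj(field(x - shear)): the
    lateral-shear phase difference, which varies much more slowly
    than the phase itself."""
    return np.angle(field[shear:] * np.conj(field[:-shear]))

# A steep quadratic phase: phi wraps many times over 200 samples,
# but its per-sample increment stays below pi, so the shear map is
# sampled adequately.
x = np.arange(200)
phi = 2e-3 * x**2
u = np.exp(1j * phi)
dphi = sheared_phase_map(u, shear=1)
# Integrating the shear map recovers the profile (up to an offset).
phi_rec = np.concatenate([[0.0], np.cumsum(dphi)])
```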

  14. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster world-wide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic changes from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation and to propose, evaluate, and select measures to reduce it. Both components of risk can be mapped individually and are affected by multiple uncertainties, as is the joint estimate of flood risk. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure to estimate flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the risk the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and corresponding flood volumes are variables of the same phenomenon, they should be directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtain flood hazard maps where the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from: a) a classical univariate approach, b) a bivariate statistical analysis, through the use of copulas. The univariate approach considers flood hydrographs generation by an indirect approach (rainfall-runoff transformation using input rainfall
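
    The bivariate generation step, drawing correlated (flood peak, flood volume) pairs via a copula, can be sketched with a Gaussian copula and Gumbel marginals. The abstract uses copulas generically; the Gaussian-copula choice, the correlation, and the marginal parameters below are illustrative assumptions:

```python
import numpy as np
from math import erf, sqrt

def gumbel_ppf(u, mu, beta):
    # Inverse CDF of the Gumbel distribution, a common flood-frequency marginal.
    return mu - beta * np.log(-np.log(u))

def sample_peak_volume(n, rho, peak_params, vol_params, seed=0):
    """Gaussian copula: correlated standard normals -> uniforms (via the
    normal CDF) -> Gumbel-distributed peaks and volumes."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    u = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))
    peaks = gumbel_ppf(u[:, 0], *peak_params)
    vols = gumbel_ppf(u[:, 1], *vol_params)
    return peaks, vols

peaks, vols = sample_peak_volume(5000, rho=0.8,
                                 peak_params=(300.0, 80.0),  # m^3/s
                                 vol_params=(20.0, 6.0))     # hm^3
```

    Each sampled pair defines one synthetic design hydrograph (peak and volume) to feed the 2D hydraulic model.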

  15. Automated mapping of glacial overdeepenings beneath contemporary ice sheets: Approaches and potential applications

    NASA Astrophysics Data System (ADS)

    Patton, Henry; Swift, Darrel A.; Clark, Chris D.; Livingstone, Stephen J.; Cook, Simon J.; Hubbard, Alun

    2015-03-01

    Awareness is growing on the significance of overdeepenings in ice sheet systems. However, a complete understanding of overdeepening formation is lacking, meaning observations of overdeepening location and morphometry are urgently required to motivate process understanding. Subject to the development of appropriate mapping approaches, high resolution subglacial topography data sets covering the whole of Antarctica and Greenland offer significant potential to acquire such observations and to relate overdeepening characteristics to ice sheet parameters. We explore a possible method for mapping overdeepenings beneath the Antarctic and Greenland ice sheets and illustrate a potential application of this approach by testing a possible relationship between overdeepening elongation ratio and ice sheet flow velocity. We find that hydrological and terrain filtering approaches are unsuited to mapping overdeepenings and develop a novel rule-based GIS methodology that delineates overdeepening perimeters by analysis of closed-contour properties. We then develop GIS procedures that provide information on overdeepening morphology and topographic context. Limitations in the accuracy and resolution of bed-topography data sets mean that application to glaciological problems requires consideration of quality-control criteria to (a) remove potentially spurious depressions and (b) reduce uncertainties that arise from the inclusion of depressions of nonglacial origin, or those in regions where empirical data are sparse. To address the problem of overdeepening elongation, potential quality control criteria are introduced; and discussion of this example serves to highlight the limitations that mapping approaches - and applications of such approaches - must confront. We predict that improvements in bed-data quality will reduce the need for quality control procedures and facilitate increasingly robust insights from empirical data.

  16. A practical and automated approach to large area forest disturbance mapping with remote sensing.

    PubMed

    Ozdogan, Mutlu

    2014-01-01

    In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: i) creating masks for water, non-forested areas, clouds, and cloud shadows; ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of mis-classification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. 
Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for forest cover
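
    Step ii of the procedure above, flagging training pixels that deviate from local window statistics of the SWIR difference image, can be sketched as follows. The window size, the deviation factor k, and the sign convention (disturbance brightens the SWIR difference) are illustrative assumptions:

```python
import numpy as np

def candidate_training_pixels(swir_diff, win=64, k=2.0):
    """Label candidate training pixels from a SWIR difference image:
    within each win x win window, flag pixels more than k standard
    deviations above the window mean as disturbance candidates (1)
    and more than k below as stable candidates (-1); 0 = unlabeled."""
    labels = np.zeros(swir_diff.shape, dtype=int)
    rows, cols = swir_diff.shape
    for r in range(0, rows, win):
        for c in range(0, cols, win):
            block = swir_diff[r:r + win, c:c + win]
            mu, sd = block.mean(), block.std()
            labels[r:r + win, c:c + win][block > mu + k * sd] = 1
            labels[r:r + win, c:c + win][block < mu - k * sd] = -1
    return labels

# One strongly brightened pixel in an otherwise unchanged window.
img = np.zeros((64, 64))
img[10, 10] = 1.0
labels = candidate_training_pixels(img)
```

    The resulting labeled pixels would then be filtered by cross-validated classifiers (step iii) before training the final supervised classifier.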

  17. High-resolution geologic mapping of the inner continental shelf: Boston Harbor and approaches, Massachusetts

    USGS Publications Warehouse

    Ackerman, Seth D.; Butman, Bradford; Barnhardt, Walter A.; Danforth, William W.; Crocker, James M.

    2006-01-01

    This report presents the surficial geologic framework data and information for the sea floor of Boston Harbor and Approaches, Massachusetts (fig. 1.1). This mapping was conducted as part of a cooperative program between the U.S. Geological Survey (USGS), the Massachusetts Office of Coastal Zone Management (CZM), and the National Oceanic and Atmospheric Administration (NOAA). The primary objective of this project was to provide sea floor geologic information and maps of Boston Harbor to aid resource management, scientific research, industry and the public. A secondary objective was to test the feasibility of using NOAA hydrographic survey data, normally collected to update navigation charts, to create maps of the sea floor suitable for geologic and habitat interpretations. Defining sea-floor geology is the first step toward managing ocean resources and assessing environmental changes due to natural or human activity. The geophysical data for these maps were collected as part of hydrographic surveys carried out by NOAA in 2000 and 2001 (fig. 1.2). Bottom photographs, video, and samples of the sediments were collected in September 2004 to help in the interpretation of the geophysical data. Included in this report are high-resolution maps of the sea floor, at a scale of 1:25,000; the data used to create these maps in Geographic Information Systems (GIS) format; a GIS project; and a gallery of photographs of the sea floor. Companion maps of the sea floor to the north of Boston Harbor and Approaches are presented by Barnhardt and others (2006) and to the east by Butman and others (2003a,b,c). See Butman and others (2004) for a map of Massachusetts Bay at a scale of 1:125,000. The sections of this report are listed in the navigation bar along the left-hand margin of this page. Section 1 (this section) introduces the report. Section 2 presents the large-format map sheets. Section 3 describes data collection, processing, and analysis.
Section 4 summarizes the geologic history of

  18. Continuous intensity map optimization (CIMO): a novel approach to leaf sequencing in step and shoot IMRT.

    PubMed

    Cao, Daliang; Earl, Matthew A; Luan, Shuang; Shepard, David M

    2006-04-01

    A new leaf-sequencing approach has been developed that is designed to reduce the number of required beam segments for step-and-shoot intensity modulated radiation therapy (IMRT). This approach to leaf sequencing is called continuous-intensity-map-optimization (CIMO). Using a simulated annealing algorithm, CIMO seeks to minimize differences between the optimized and sequenced intensity maps. Two distinguishing features of the CIMO algorithm are (1) CIMO does not require that each optimized intensity map be clustered into discrete levels and (2) CIMO is not rule-based but rather simultaneously optimizes both the aperture shapes and weights. To test the CIMO algorithm, ten IMRT patient cases were selected (four head-and-neck, two pancreas, two prostate, one brain, and one pelvis). For each case, the optimized intensity maps were extracted from the Pinnacle3 treatment planning system. The CIMO algorithm was applied, and the optimized aperture shapes and weights were loaded back into Pinnacle. A final dose calculation was performed using Pinnacle's convolution/superposition based dose calculation. On average, the CIMO algorithm provided a 54% reduction in the number of beam segments as compared with Pinnacle's leaf sequencer. The plans sequenced using the CIMO algorithm also provided improved target dose uniformity and a reduced discrepancy between the optimized and sequenced intensity maps. For ten clinical intensity maps, comparisons were performed between the CIMO algorithm and the power-of-two reduction algorithm of Xia and Verhey [Med. Phys. 25(8), 1424-1434 (1998)]. When the constraints of a Varian Millennium multileaf collimator were applied, the CIMO algorithm resulted in a 26% reduction in the number of segments. For an Elekta multileaf collimator, the CIMO algorithm resulted in a 67% reduction in the number of segments. An average leaf sequencing time of less than one minute per beam was observed.
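
    The core of the CIMO idea, simulated annealing that minimizes the difference between the optimized and sequenced intensity maps, can be sketched in a reduced form. CIMO simultaneously perturbs aperture shapes and weights; the toy below anneals only the segment weights on a single 1-D intensity-map row, with step size, temperature schedule, and apertures as illustrative choices:

```python
import math, random

def anneal_weights(target, apertures, steps=20000, t0=1.0, seed=1):
    """Fit segment weights so the delivered map sum_i w_i * aperture_i
    approaches the optimized intensity map, via simulated annealing."""
    random.seed(seed)
    n = len(apertures)
    w = [1.0] * n

    def error(w):
        return sum((t - sum(wi * a[j] for wi, a in zip(w, apertures))) ** 2
                   for j, t in enumerate(target))

    e = error(w)
    for step in range(steps):
        temp = t0 * (1.0 - step / steps) + 1e-9  # linear cooling
        i = random.randrange(n)
        old = w[i]
        w[i] = max(0.0, old + random.gauss(0.0, 0.1))  # weights stay >= 0
        e_new = error(w)
        # Accept downhill moves always, uphill moves with Metropolis probability.
        if e_new < e or random.random() < math.exp((e - e_new) / temp):
            e = e_new
        else:
            w[i] = old
    return w, e

apertures = [[1, 1, 1, 1], [0, 1, 1, 0], [0, 0, 1, 1]]  # binary MLC openings
target = [2.0, 5.0, 6.0, 3.0]  # achievable exactly with weights (2, 3, 1)
w, e = anneal_weights(target, apertures)
```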

  19. Simulating generic spin-boson models with matrix product states

    NASA Astrophysics Data System (ADS)

    Wall, Michael L.; Safavi-Naini, Arghavan; Rey, Ana Maria

    2016-11-01

    The global coupling of few-level quantum systems ("spins") to a discrete set of bosonic modes is a key ingredient for many applications in quantum science, including large-scale entanglement generation, quantum simulation of the dynamics of long-range interacting spin models, and hybrid platforms for force and spin sensing. We present a general numerical framework for treating the out-of-equilibrium dynamics of such models based on matrix product states. Our approach applies for generic spin-boson systems: it treats any spatial and operator dependence of the two-body spin-boson coupling and places no restrictions on relative energy scales. We show that the full counting statistics of collective spin measurements and infidelity of quantum simulation due to spin-boson entanglement, both of which are difficult to obtain by other techniques, are readily calculable in our approach. We benchmark our method using a recently developed exact solution for a particular spin-boson coupling relevant to trapped ion quantum simulators. Finally, we show how decoherence can be incorporated within our framework using the method of quantum trajectories, and study the dynamics of an open-system spin-boson model with spatially nonuniform spin-boson coupling relevant for trapped atomic ion crystals in the presence of molecular ion impurities.

  20. Global land cover mapping at 30 m resolution: A POK-based operational approach

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Chen, Jin; Liao, Anping; Cao, Xin; Chen, Lijun; Chen, Xuehong; He, Chaoying; Han, Gang; Peng, Shu; Lu, Miao; Zhang, Weiwei; Tong, Xiaohua; Mills, Jon

    2015-05-01

    Global Land Cover (GLC) information is fundamental for environmental change studies, land resource management, sustainable development, and many other societal benefits. Although GLC data exist at spatial resolutions of 300 m and 1000 m, a 30 m resolution mapping approach is now a feasible option for the next generation of GLC products. Since most significant human impacts on the land system can be captured at this scale, a number of researchers are focusing on such products. This paper reports the operational approach used in such a project, which aims to deliver reliable data products. Over 10,000 Landsat-like satellite images are required to cover the entire Earth at 30 m resolution. To derive a GLC map from such a large volume of data necessitates the development of effective, efficient, economic and operational approaches. Automated approaches usually provide higher efficiency and thus more economic solutions, yet existing automated classification has been deemed ineffective because of the low classification accuracy achievable (typically below 65%) at global scale at 30 m resolution. As a result, an approach based on the integration of pixel- and object-based methods with knowledge (POK-based) has been developed. To handle the classification process of 10 land cover types, a split-and-merge strategy was employed, i.e., each class is first identified in a prioritized sequence and the results are then merged. For the identification of each class, a robust integration of pixel- and object-based classification was developed. To improve the quality of the classification results, a knowledge-based interactive verification procedure was developed with the support of web service technology. The performance of the POK-based approach was tested using eight selected areas with differing landscapes from five different continents. An overall classification accuracy of over 80% was achieved.
This indicates that the developed POK-based approach is effective and feasible

  1. A novel approach to locate Phytophthora infestans resistance genes on the potato genetic map.

    PubMed

    Jacobs, Mirjam M J; Vosman, Ben; Vleeshouwers, Vivianne G A A; Visser, Richard G F; Henken, Betty; van den Berg, Ronald G

    2010-02-01

    Mapping resistance genes is usually accomplished by phenotyping a segregating population for the resistance trait and genotyping it using a large number of markers. Most resistance genes are of the NBS-LRR type, of which an increasing number is sequenced. These genes and their analogs (RGAs) are often organized in clusters. Clusters tend to be rather homogenous, viz. containing genes that show high sequence similarity with each other. From many of these clusters the map position is known. In this study we present and test a novel method to quickly identify to which cluster a new resistance gene belongs and to produce markers that can be used for introgression breeding. We used NBS profiling to identify markers in bulked DNA samples prepared from resistant and susceptible genotypes of small segregating populations. Markers co-segregating with resistance can be tested on individual plants and directly used for breeding. To identify the resistance gene cluster a gene belongs to, the fragments were sequenced and the sequences analyzed using bioinformatics tools. Putative map positions arising from this analysis were validated using markers mapped in the segregating population. The versatility of the approach is demonstrated with a number of populations derived from wild Solanum species segregating for P. infestans resistance. Newly identified P. infestans resistance genes originating from S. verrucosum, S. schenckii, and S. capsicibaccatum could be mapped to potato chromosomes 6, 4, and 11, respectively.

  2. A Random-Model Approach to QTL Mapping in Multiparent Advanced Generation Intercross (MAGIC) Populations.

    PubMed

    Wei, Julong; Xu, Shizhong

    2016-02-01

    Most standard QTL mapping procedures apply to populations derived from the cross of two parents. QTL detected from such biparental populations are rarely relevant to breeding programs because of the narrow genetic basis: only two alleles are involved per locus. To improve the generality and applicability of mapping results, QTL should be detected using populations initiated from multiple parents, such as the multiparent advanced generation intercross (MAGIC) populations. The greatest challenges of QTL mapping in MAGIC populations come from multiple founder alleles and control of the genetic background information. We developed a random-model methodology by treating the founder effects of each locus as random effects following a normal distribution with a locus-specific variance. We also fit a polygenic effect to the model to control the genetic background. To improve the statistical power for a scanned marker, we release the marker effect absorbed by the polygene back to the model. In contrast to the fixed-model approach, we estimate and test the variance of each locus and scan the entire genome one locus at a time using likelihood-ratio test statistics. Simulation studies showed that this method can increase statistical power and reduce type I error compared with composite interval mapping (CIM) and multiparent whole-genome average interval mapping (MPWGAIM). We demonstrated the method using a public Arabidopsis thaliana MAGIC population and a mouse MAGIC population.

  3. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce.

    PubMed

    Idris, Muhammad; Hussain, Shujaat; Siddiqi, Muhammad Hameed; Hassan, Waseem; Syed Muhammad Bilal, Hafiz; Lee, Sungyoung

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity, as real-time and streaming data, in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement.
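
    The multi-key idea, tagging each intermediate pair with its algorithm id so several related map/reduce algorithms share one job's map and reduce phases, can be sketched with an in-memory toy. The two example algorithms (word count, maximum line length) are illustrative, not from the paper:

```python
from collections import defaultdict

def mr_pack(records, algorithms):
    """Run several map/reduce pairs in one logical job: mappers emit
    (algo_id, key) multi-keys, the shuffle groups by that pair, and a
    single reduce phase dispatches to the right reducer."""
    intermediate = defaultdict(list)
    for rec in records:                      # single map phase
        for algo_id, (map_fn, _) in algorithms.items():
            for key, value in map_fn(rec):
                intermediate[(algo_id, key)].append(value)
    results = defaultdict(dict)              # single reduce phase
    for (algo_id, key), values in intermediate.items():
        reduce_fn = algorithms[algo_id][1]
        results[algo_id][key] = reduce_fn(key, values)
    return dict(results)

algorithms = {
    "wordcount": (lambda line: [(w, 1) for w in line.split()],
                  lambda k, vs: sum(vs)),
    "maxlen":    (lambda line: [("longest", len(line))],
                  lambda k, vs: max(vs)),
}
out = mr_pack(["big data", "big compute"], algorithms)
# out["wordcount"]["big"] == 2, out["maxlen"]["longest"] == 11
```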

  4. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    PubMed Central

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real time and streaming data in variety of formats. These characteristics give rise to challenges in its modeling, computation, and processing. Hadoop MapReduce (MR) is a well known data-intensive distributed processing framework using the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223

  5. Visualization: a really generic approach or the art of mapping data to graphical objects

    NASA Astrophysics Data System (ADS)

    Trilk, Joern; Schuetz, Frank

    1998-05-01

    Visualization is an important technology for analyzing large amounts of data. However, the process of creating meaningful visualizations is quite difficult. The success of this process depends heavily on a good mapping of objects present in the application domain to objects used in the graphical representation. Both kinds of objects possess several attributes. Whereas data objects have attributes of certain types (e.g. integers, strings), graphical objects are characterized by their appearance (shape, color, size, etc.). In our approach, the user may arbitrarily map data attributes to graphical attributes, leading to great flexibility. In our opinion, this is the only way to achieve a really generic approach. To evaluate our ideas, we developed a tool called ProViS. This tool indicates the possible attributes of data objects as well as of graphical objects. Depending on their goals, the user can then freely 'connect' attributes of data objects to attributes of their graphical counterparts. The structure behind the application objects can be worked out very easily with the help of various layout algorithms. In addition, we integrated several mechanisms (e.g. ghosting, hiding, grouping, fisheye views) to reduce complexity and to further enhance the three-dimensional visualization. In this paper, we first take a look at the basic principle of visualization: mapping data. Then we present ProViS, a visualization tool implementing our idea of mapping.

  6. Turkers in Africa: A Crowdsourcing Approach to Improving Agricultural Landcover Maps

    NASA Astrophysics Data System (ADS)

    Estes, L. D.; Caylor, K. K.; Choi, J.

    2012-12-01

    In the coming decades a substantial portion of Africa is expected to be transformed to agriculture. The scale of this conversion may match or exceed that which occurred in the Brazilian Cerrado and Argentinian Pampa in recent years. Tracking the rate and extent of this conversion will depend on having an accurate baseline of the current extent of croplands. Continent-wide baseline data do exist, but the accuracy of these relatively coarse resolution, remotely sensed assessments is suspect in many regions. To develop more accurate maps of the distribution and nature of African croplands, we develop a distributed "crowdsourcing" approach that harnesses human eyeballs and image interpretation capabilities. Our initial goal is to assess the accuracy of existing agricultural land cover maps, but ultimately we aim to generate "wall-to-wall" cropland maps that can be revisited and updated to track agricultural transformation. Our approach utilizes the freely available, high-resolution satellite imagery provided by Google Earth, combined with Amazon.com's Mechanical Turk platform, an online service that provides a large, global pool of workers (known as "Turkers") who perform "Human Intelligence Tasks" (HITs) for a fee. Using open-source R and python software, we select a random sample of 1 km² cells from a grid placed over our study area, stratified by field density classes drawn from one of the coarse-scale land cover maps, and send these in batches to Mechanical Turk for processing. Each Turker is required to complete an initial training session, on the basis of which they are assigned an accuracy score that determines whether the Turker is allowed to proceed with mapping tasks. Completed mapping tasks are automatically retrieved and processed on our server, and subject to two further quality control measures. The first of these is a measure of the spatial accuracy of Turker-mapped areas compared to "gold standard" maps from selected locations that are randomly
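
    The stratified sampling step described above (random 1 km² cells drawn per field-density class) can be sketched as follows. This is an illustrative reconstruction, not the project's actual R/python code; the function and parameter names are assumptions.

```python
import random

def stratified_sample(cells, samples_per_class, seed=0):
    """Draw a fixed-size random sample of grid cells from each
    density class. cells is a list of (cell_id, density_class)
    pairs; the seed makes batch selection reproducible."""
    rng = random.Random(seed)
    by_class = {}
    for cell_id, density in cells:
        by_class.setdefault(density, []).append(cell_id)
    sample = {}
    for density, ids in sorted(by_class.items()):
        k = min(samples_per_class, len(ids))  # class may be small
        sample[density] = sorted(rng.sample(ids, k))
    return sample
```

Each per-class batch could then be packaged as a set of HITs and submitted for mapping.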

  7. Conceptualizing Stakeholders' Perceptions of Ecosystem Services: A Participatory Systems Mapping Approach

    NASA Astrophysics Data System (ADS)

    Lopes, Rita; Videira, Nuno

    2015-12-01

    A participatory system dynamics modelling approach is advanced to support conceptualization of feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes a systems mapping workshop and follow-up tasks aimed at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

  8. A pooling-based approach to mapping genetic variants associated with DNA methylation.

    PubMed

    Kaplow, Irene M; MacIsaac, Julia L; Mah, Sarah M; McEwen, Lisa M; Kobor, Michael S; Fraser, Hunter B

    2015-06-01

    DNA methylation is an epigenetic modification that plays a key role in gene regulation. Previous studies have investigated its genetic basis by mapping genetic variants that are associated with DNA methylation at specific sites, but these have been limited to microarrays that cover <2% of the genome and cannot account for allele-specific methylation (ASM). Other studies have performed whole-genome bisulfite sequencing on a few individuals, but these lack statistical power to identify variants associated with DNA methylation. We present a novel approach in which bisulfite-treated DNA from many individuals is sequenced together in a single pool, resulting in a truly genome-wide map of DNA methylation. Compared to methods that do not account for ASM, our approach increases statistical power to detect associations while sharply reducing cost, effort, and experimental variability. As a proof of concept, we generated deep sequencing data from a pool of 60 human cell lines; we evaluated almost twice as many CpGs as the largest microarray studies and identified more than 2000 genetic variants associated with DNA methylation. We found that these variants are highly enriched for associations with chromatin accessibility and CTCF binding but are less likely to be associated with traits indirectly linked to DNA, such as gene expression and disease phenotypes. In summary, our approach allows genome-wide mapping of genetic variants associated with DNA methylation in any tissue of any species, without the need for individual-level genotype or methylation data.
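
    The core idea of detecting allele-specific methylation (ASM) from a pooled run can be illustrated with a toy tally: bisulfite reads covering both a heterozygous SNP and a CpG are split by allele, and a difference in per-allele methylation rates suggests a variant associated with methylation. This sketch is illustrative only, not the authors' pipeline, and omits the statistical test they would apply to the counts.

```python
def allele_specific_methylation(reads):
    """Tally methylation by allele from pooled bisulfite reads.
    reads: iterable of (allele, is_methylated) tuples, one per read
    covering both the SNP and the CpG. Returns the methylation rate
    per allele; a large difference hints at ASM."""
    counts = {}
    for allele, methylated in reads:
        meth, total = counts.get(allele, (0, 0))
        counts[allele] = (meth + (1 if methylated else 0), total + 1)
    return {a: meth / total for a, (meth, total) in counts.items()}
```

In practice the per-allele counts would feed a test such as Fisher's exact test before calling an association.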

  9. Putting people on the map through an approach that integrates social data in conservation planning.

    PubMed

    Stephanson, Sheri L; Mascia, Michael B

    2014-10-01

    Conservation planning is integral to strategic and effective operations of conservation organizations. Drawing upon the biological sciences, conservation planning has historically made limited use of social data. We offer an approach for integrating data on social well-being into conservation planning that captures and places into context the spatial patterns and trends in human needs and capacities. This hierarchical approach provides a nested framework for characterizing and mapping data on social well-being in 5 domains: economic well-being, health, political empowerment, education, and culture. These 5 domains each have multiple attributes; each attribute may be characterized by one or more indicators. Through existing or novel data that display spatial and temporal heterogeneity in social well-being, conservation scientists, planners, and decision makers may measure, benchmark, map, and integrate these data within conservation planning processes. Selecting indicators and integrating these data into conservation planning is an iterative, participatory process tailored to the local context and planning goals. Social well-being data complement biophysical and threat-oriented social data within conservation planning processes to inform decisions regarding where and how to conserve biodiversity, to provide a structure for exploring socioecological relationships, and to foster adaptive management. Building upon existing conservation planning methods and insights from multiple disciplines, this approach to putting people on the map can readily merge with current planning practices to facilitate more rigorous decision making.

  10. A pooling-based approach to mapping genetic variants associated with DNA methylation

    SciTech Connect

    Kaplow, Irene M.; MacIsaac, Julia L.; Mah, Sarah M.; McEwen, Lisa M.; Kobor, Michael S.; Fraser, Hunter B.

    2015-04-24

    DNA methylation is an epigenetic modification that plays a key role in gene regulation. Previous studies have investigated its genetic basis by mapping genetic variants that are associated with DNA methylation at specific sites, but these have been limited to microarrays that cover <2% of the genome and cannot account for allele-specific methylation (ASM). Other studies have performed whole-genome bisulfite sequencing on a few individuals, but these lack statistical power to identify variants associated with DNA methylation. We present a novel approach in which bisulfite-treated DNA from many individuals is sequenced together in a single pool, resulting in a truly genome-wide map of DNA methylation. Compared to methods that do not account for ASM, our approach increases statistical power to detect associations while sharply reducing cost, effort, and experimental variability. As a proof of concept, we generated deep sequencing data from a pool of 60 human cell lines; we evaluated almost twice as many CpGs as the largest microarray studies and identified more than 2000 genetic variants associated with DNA methylation. Here we found that these variants are highly enriched for associations with chromatin accessibility and CTCF binding but are less likely to be associated with traits indirectly linked to DNA, such as gene expression and disease phenotypes. In summary, our approach allows genome-wide mapping of genetic variants associated with DNA methylation in any tissue of any species, without the need for individual-level genotype or methylation data.

  11. Higgs boson production via vector-boson fusion at next-to-next-to-leading order in QCD.

    PubMed

    Bolzoni, Paolo; Maltoni, Fabio; Moch, Sven-Olaf; Zaro, Marco

    2010-07-02

    We present the total cross sections at next-to-next-to-leading order in the strong coupling for Higgs boson production via weak-boson fusion. Our results are obtained via the structure function approach, which builds upon the approximate, though very accurate, factorization of the QCD corrections between the two quark lines. The theoretical uncertainty on the total cross sections at the LHC from higher order corrections and the parton distribution uncertainties are estimated at the 2% level each for a wide range of Higgs boson masses.

  12. Correlation energy for elementary bosons: Physics of the singularity

    SciTech Connect

    Shiau, Shiue-Yuan; Combescot, Monique; Chang, Yia-Chung

    2016-04-15

    We propose a compact perturbative approach that reveals the physical origin of the singularity occurring in the density dependence of the correlation energy: like fermions, elementary bosons have a singular correlation energy which comes from the accumulation, through Feynman "bubble" diagrams, of the same non-zero momentum transfer excitations from the free particle ground state, that is, the Fermi sea for fermions and the Bose-Einstein condensate for bosons. This understanding paves the way toward deriving the correlation energy of composite bosons like atomic dimers and semiconductor excitons, by suggesting Shiva diagrams analogous to Feynman "bubble" diagrams; the previous elementary-boson approaches, which hide this physics, are inappropriate for this purpose.

  13. Mapping genetic determinants of viral traits with FST and quantitative trait locus (QTL) approaches.

    PubMed

    Doumayrou, Juliette; Thébaud, Gaël; Vuillaume, Florence; Peterschmitt, Michel; Urbino, Cica

    2015-10-01

    The genetic determinism of viral traits can generally be dissected using either forward or reverse genetics because the clonal reproduction of viruses does not require the use of approaches based on laboratory crosses. Nevertheless, we hypothesized that recombinant viruses could be analyzed as sexually reproducing organisms, using either a quantitative trait loci (QTL) approach or a locus-by-locus fixation index (FST). Locus-by-locus FST analysis, and four different regression and interval mapping algorithms for QTL analysis, were applied to a phenotypic and genotypic dataset previously obtained from 47 artificial recombinant genomes generated between two begomovirus species. Both approaches assigned the determinant of within-host accumulation, previously identified using standard virology approaches, to a region including the 5′ end of the replication-associated protein (Rep) gene and the upstream intergenic region. This study provides a proof of principle that QTL and population genetics tools can be extended to characterize the genetic determinants of viral traits.
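
    As a reminder of the statistic behind the locus-by-locus scan above, Wright's FST for a single biallelic locus can be computed from the allele frequencies in two groups of genomes. This is a textbook two-population formulation with equal weights, not the authors' code:

```python
def fst_locus(p1, p2):
    """Wright's FST for one biallelic locus from the allele
    frequencies p1 and p2 in two populations:
    FST = (H_T - H_S) / H_T, where H_S is the mean within-group
    expected heterozygosity and H_T the total heterozygosity."""
    hs = 0.5 * (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2))
    pbar = 0.5 * (p1 + p2)
    ht = 2 * pbar * (1 - pbar)
    return 0.0 if ht == 0 else (ht - hs) / ht
```

Loci where the groups are fixed for different alleles give FST = 1, flagging candidate trait determinants; identical frequencies give FST = 0.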

  14. An assessment of a collaborative mapping approach for exploring land use patterns for several European metropolises

    NASA Astrophysics Data System (ADS)

    Jokar Arsanjani, Jamal; Vaz, Eric

    2015-03-01

    Until recently, land surveys and digital interpretation of remotely sensed imagery have been used to generate land use inventories. These techniques, however, are often cumbersome and costly, incurring large technical and temporal costs. The technological advances of web 2.0 have brought a wide array of technological achievements, stimulating the participatory role in collaborative and crowdsourced mapping products. This has been fostered by GPS-enabled devices and accessible tools that enable visual interpretation of high resolution satellite images/air photos provided in collaborative mapping projects. Such technologies offer an integrative approach to geography by means of promoting public participation and allowing accurate assessment and classification of land use as well as geographical features. OpenStreetMap (OSM) has supported the evolution of such techniques, contributing to the existence of a large inventory of spatial land use information. This paper explores the introduction of this novel participatory phenomenon for land use classification in Europe's metropolitan regions. We adopt a positivistic approach to comparatively assess the accuracy of OSM contributions for land use classification in seven large European metropolitan regions. Thematic accuracy and degree of completeness of OSM data were compared to available Global Monitoring for Environment and Security Urban Atlas (GMESUA) datasets for the chosen metropolises. We further extend our findings of land use within a novel framework for geography, arguing that volunteered geographic information (VGI) sources are of great benefit for land use mapping, depending on location and degree of VGI dynamism, and offer a great alternative to traditional mapping techniques for metropolitan regions throughout Europe. Evaluation of several land use types at the local level suggests that a number of OSM classes (such as anthropogenic land use, agricultural and some natural environment

  15. MAP3D: a media processor approach for high-end 3D graphics

    NASA Astrophysics Data System (ADS)

    Darsa, Lucia; Stadnicki, Steven; Basoglu, Chris

    1999-12-01

    Equator Technologies, Inc. has used a software-first approach to produce several programmable and advanced VLIW processor architectures that have the flexibility to run both traditional systems tasks and an array of media-rich applications. For example, Equator's MAP1000A is the world's fastest single-chip programmable signal and image processor targeted for digital consumer and office automation markets. The Equator MAP3D is a proposal for the architecture of the next generation of the Equator MAP family. The MAP3D is designed to achieve high-end 3D performance and a variety of customizable special effects by combining special graphics features with a high-performance floating-point and media processor architecture. As a programmable media processor, it offers the advantages of a completely configurable 3D pipeline, allowing developers to experiment with different algorithms and to tailor their pipeline to achieve the highest performance for a particular application. With the support of Equator's advanced C compiler and toolkit, MAP3D programs can be written in a high-level language. This allows the compiler to find and exploit any parallelism in a programmer's code, thus decreasing the time to market of a given application. The ability to run an operating system makes it possible to run concurrent applications on the MAP3D chip, such as video decoding while executing the 3D pipelines; integration of applications is thus easily achieved, for instance by using real-time decoded imagery for texturing 3D objects. This novel architecture enables an affordable, integrated solution for high-performance 3D graphics.

  16. Repelling Point Bosons

    NASA Astrophysics Data System (ADS)

    McGuire, J. B.

    2011-12-01

    There is a body of conventional wisdom that holds that a solvable quantum problem, by virtue of its solvability, is pathological and thus irrelevant. It has been difficult to refute this view owing to the paucity of theoretical constructs and experimental results. Recent experiments involving equivalent ions trapped in a spatial conformation of extreme anisotropic confinement (longitudinal extension tens, hundreds or even thousands of times the transverse extension) have modified the view of relevancy, and it is now possible to consider systems previously thought pathological, in particular point bosons that repel in one dimension. It has been difficult for experimentalists to utilize existing theory, mainly due to a long-standing theoretical misunderstanding of the relevance of the permutation group, in particular the non-commutativity of translations (periodicity) and transpositions (permutation). This misunderstanding is most easily rectified in the case of repelling bosons.

  17. Progress in landslide susceptibility mapping over Europe using Tier-based approaches

    NASA Astrophysics Data System (ADS)

    Günther, Andreas; Hervás, Javier; Reichenbach, Paola; Malet, Jean-Philippe

    2010-05-01

    The European Thematic Strategy for Soil Protection aims, among other objectives, to ensure a sustainable use of soil. The legal instrument of the strategy, the proposed Framework Directive, suggests identifying priority areas of several soil threats including landslides using a coherent and compatible approach based on the use of common thematic data. In a first stage, this can be achieved through landslide susceptibility mapping using geographically nested, multi-step tiered approaches, where areas identified as of high susceptibility by a first, synoptic-scale Tier ("Tier 1") can then be further assessed and mapped at larger scale by successive Tiers. In order to identify areas prone to landslides at European scale ("Tier 1"), a number of thematic terrain and environmental data sets already available for the whole of Europe can be used as input for a continental scale susceptibility model. However, since no coherent landslide inventory data is available at the moment over the whole continent, qualitative heuristic zonation approaches are proposed. For "Tier 1" a preliminary, simplified model has been developed. It consists of an equally weighted combination of a reduced, continent-wide common dataset of landslide conditioning factors including soil parent material, slope angle and land cover, to derive a landslide susceptibility index using raster mapping units consisting of 1 x 1 km pixels. A preliminary European-wide susceptibility map has thus been produced at 1:1 Million scale, since this is compatible with that of the datasets used. The map has been validated by means of a ratio of effectiveness using samples from landslide inventories in Italy, Austria, Hungary and United Kingdom. Although not differentiated for specific geomorphological environments or specific landslide types, the experimental model performs relatively well in many European regions at a 1:1 Million scale.
An additional "Tier 1" susceptibility map at the same scale and using
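
    The equally weighted "Tier 1" combination described above amounts to a trivial raster overlay. In the sketch below the three factor grids are assumed to be pre-classified to a common ordinal susceptibility scale; this is an illustrative reconstruction, not the actual European model:

```python
def susceptibility_index(parent_material, slope, land_cover):
    """Equally weighted combination of three reclassified factor
    rasters (lists of rows holding per-pixel class scores) into a
    landslide susceptibility index on the same grid."""
    rows, cols = len(slope), len(slope[0])
    return [[(parent_material[r][c] + slope[r][c] + land_cover[r][c]) / 3.0
             for c in range(cols)]
            for r in range(rows)]
```

High index values then mark the 1 x 1 km pixels to pass on to larger-scale Tiers.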

  18. Higgs Boson Properties

    NASA Astrophysics Data System (ADS)

    David, André; Dührssen, Michael

    2016-10-01

    This chapter presents an overview of the measured properties of the Higgs boson discovered in 2012 by the ATLAS and CMS collaborations at the CERN LHC. Searches for deviations from the properties predicted by the standard theory are also summarised. The present status corresponds to the combined analysis of the full Run 1 data sets of collisions collected at centre-of-mass energies of 7 and 8 TeV.

  19. Mapping mountain torrent hazards in the Hexi Corridor using an evidential reasoning approach

    NASA Astrophysics Data System (ADS)

    Ran, Youhua; Liu, Jinpeng; Tian, Feng; Wang, Dekai

    2017-02-01

    The Hexi Corridor is an important part of the Silk Road Economic Belt and a crucial channel for westward development in China. Many important national engineering projects pass through the corridor, such as highways, railways, and the West-to-East Gas Pipeline. The frequent torrent disasters greatly impact the security of infrastructure and human safety. In this study, an evidential reasoning approach based on Dempster-Shafer theory is proposed for mapping mountain torrent hazards in the Hexi Corridor. A torrent hazard map for the Hexi Corridor was generated by integrating the driving factors of mountain torrent disasters including precipitation, terrain, flow concentration processes, and the vegetation fraction. The results show that the capability of the proposed method is satisfactory. The torrent hazard map shows that there is high potential torrent hazard in the central and southeastern Hexi Corridor. The results are useful for engineering planning support and resource protection in the Hexi Corridor. Further efforts are discussed for improving torrent hazard mapping and prediction.
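
    The evidential reasoning step rests on Dempster's rule of combination, which fuses two basic mass assignments (for example, masses derived from precipitation and from terrain evidence) and renormalizes away their conflict. A generic sketch of the rule, not the authors' implementation:

```python
def combine_ds(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments whose focal elements are frozensets of hypotheses.
    Masses on intersecting focal elements reinforce each other;
    mass assigned to disjoint pairs is conflict and is normalized out."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict; masses cannot be combined")
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}
```

Combining more than two evidence sources just applies the rule repeatedly, since it is associative.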

  20. Geologic Map of the Olympia Cavi Region of Mars (MTM 85200): A Summary of Tactical Approaches

    NASA Technical Reports Server (NTRS)

    Skinner, J. A., Jr.; Herkenhoff, K.

    2010-01-01

    The 1:500K-scale geologic map of MTM 85200 - the Olympia Cavi region of Mars - has been submitted for peer review [1]. Physiographically, the quadrangle includes portions of Olympia Rupes, a set of sinuous scarps which elevate Planum Boreum 800 meters above Olympia Planum. The region includes the high-standing, spiral troughs of Boreales Scopuli, the rugged and deep depressions of Olympia Cavi, and the vast dune fields of Olympia Undae. Geologically, the mapped units and landforms reflect the recent history of repeated accumulation and degradation. The widespread occurrence of both weakly and strongly stratified units implicates the drape-like accumulation of ice, dust, and sand through climatic variations. Similarly, the occurrence of layer truncations, particularly at unit boundaries, implicates punctuated periods of both localized and regional erosion and surface deflation whereby underlying units were exhumed and their material transported and re-deposited. Herein, we focus on the iterative mapping approaches that allowed not only the accommodation of the burgeoning variety and volume of data sets, but also facilitated the efficient presentation of map information. Unit characteristics and their geologic history are detailed in past abstracts [2-3].

  1. Concept mapping as an approach for expert-guided model building: The example of health literacy.

    PubMed

    Soellner, Renate; Lenartz, Norbert; Rudinger, Georg

    2017-02-01

    Concept mapping served as the starting point for the aim of capturing the comprehensive structure of the construct of 'health literacy.' Ideas about health literacy were generated by 99 experts and resulted in 105 statements that were subsequently organized by 27 experts in an unstructured card sorting. Multidimensional scaling was applied to the sorting data, and two- and three-dimensional solutions were computed. The three-dimensional solution was used in a subsequent cluster analysis and resulted in a concept map of nine "clusters": (1) self-regulation, (2) self-perception, (3) proactive approach to health, (4) basic literacy and numeracy skills, (5) information appraisal, (6) information search, (7) health care system knowledge and acting, (8) communication and cooperation, and (9) beneficial personality traits. Subsequently, this concept map served as a starting point for developing a "qualitative" structural model of health literacy and a questionnaire for the measurement of health literacy. On the basis of the questionnaire data, a "quantitative" structural model was created by first applying exploratory factor analyses (EFA) and then cross-validating the model with confirmatory factor analyses (CFA). Concept mapping proved to be a highly valuable tool for the process of model building up to translational research in the "real world".
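
    The analysis chain described above, from card-sort data to a dissimilarity matrix to clusters, can be sketched in miniature. The sketch skips the multidimensional scaling step and clusters the co-occurrence dissimilarities directly with single linkage, so it is a simplification of the authors' MDS-plus-cluster-analysis procedure; all names are illustrative.

```python
def cooccurrence_dissimilarity(sortings, n_items):
    """sortings: one card sort per expert, each a list of piles
    (sets of statement ids). Dissimilarity between two statements is
    1 minus the fraction of experts who placed them in the same pile."""
    n = len(sortings)
    co = [[0] * n_items for _ in range(n_items)]
    for piles in sortings:
        for pile in piles:
            for i in pile:
                for j in pile:
                    co[i][j] += 1
    return [[1.0 - co[i][j] / n for j in range(n_items)]
            for i in range(n_items)]

def single_linkage(diss, k):
    """Greedy single-linkage agglomeration down to k clusters."""
    clusters = [{i} for i in range(len(diss))]
    def d(a, b):
        return min(diss[i][j] for i in a for j in b)
    while len(clusters) > k:
        a, b = min(((a, b) for a in range(len(clusters))
                    for b in range(a + 1, len(clusters))),
                   key=lambda ab: d(clusters[ab[0]], clusters[ab[1]]))
        clusters[a] |= clusters.pop(b)
    return clusters
```

With real data, the cluster count (nine in the study) is chosen by inspecting interpretability of the resulting map.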

  2. A reciprocal space approach for locating symmetry elements in Patterson superposition maps

    SciTech Connect

    Hendrixson, T.

    1990-09-21

    A method for determining the location and possible existence of symmetry elements in Patterson superposition maps has been developed. A comparison of the original superposition map and a superposition map operated on by the symmetry element gives possible translations to the location of the symmetry element. A reciprocal space approach using structure factor-like quantities obtained from the Fourier transform of the superposition function is then used to determine the "best" location of the symmetry element. Constraints based upon the space group requirements are also used as a check on the locations. The locations of the symmetry elements are used to modify the Fourier transform coefficients of the superposition function to give an approximation of the structure factors, which are then refined using the EG relation. The analysis of several compounds using this method is presented. Reciprocal space techniques for locating multiple images in the superposition function are also presented, along with methods to remove the effect of multiple images in the Fourier transform coefficients of the superposition map. In addition, crystallographic studies of the extended chain structure of (NHC₅H₅)SbI₄ and of the twinning method of the orthorhombic form of the high-Tc superconductor YBa₂Cu₃O₇₋ₓ are presented. 54 refs.

  3. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, without specifying when or within what time intervals the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example, dams and nuclear power plants, continued to challenge the map(s).
    The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published

  4. Chiral anomaly, bosonization, and fractional charge

    SciTech Connect

    Mignaco, J.A.; Monteiro, M.A.R.

    1985-06-15

    We present a method to evaluate the Jacobian of chiral rotations, regulating determinants through the proper-time method and using Seeley's asymptotic expansion. With this method we easily compute the chiral anomaly for ν = 4, 6 dimensions, discuss bosonization of some massless two-dimensional models, and handle the problem of charge fractionization. In addition, we comment on the general validity of Fujikawa's approach to regulate the Jacobian of chiral rotations with non-Hermitian operators.

  5. A Voxel-Map Quantitative Analysis Approach for Atherosclerotic Noncalcified Plaques of the Coronary Artery Tree

    PubMed Central

    Li, Ying; Chen, Wei; Chen, Yonglin; Chu, Chun; Fang, Bingji; Tan, Liwen

    2013-01-01

    Noncalcified plaques (NCPs) are associated with the presence of lipid-core plaques that are prone to rupture. Thus, it is important to detect and monitor the development of NCPs. Contrast-enhanced coronary Computed Tomography Angiography (CTA) is a potential imaging technique to identify atherosclerotic plaques in the whole coronary tree, but it fails to provide information about vessel walls. In order to overcome the limitations of coronary CTA and provide more meaningful quantitative information for percutaneous coronary intervention (PCI), we proposed a Voxel-Map based on mathematical morphology to quantitatively analyze the noncalcified plaques on a three-dimensional coronary artery wall model (3D-CAWM). This approach is a combination of Voxel-Map analysis techniques, plaque locating, and anatomical location related labeling, which shows more detailed and comprehensive coronary tree wall visualization. PMID:24348749

  6. A sib-pair approach to interval mapping of quantitative trait loci.

    PubMed Central

    Fulker, D. W.; Cardon, L. R.

    1994-01-01

    An interval mapping procedure based on the sib-pair method of Haseman and Elston is developed, and simulation studies are carried out to explore its properties. The procedure is analogous to other interval mapping procedures used with experimental material, such as plants and animals, and yields very similar results in terms of the location and effect size of a quantitative trait locus (QTL). The procedure offers an advantage over the conventional Haseman and Elston approach, in terms of power, and provides useful information concerning the location of a QTL. Because of its simplicity, the method readily lends itself to the analysis of selected samples for increased power and the evaluation of multilocus models of complex phenotypes. PMID:8198132
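
    The underlying Haseman-Elston idea, regressing squared sib-pair trait differences on the estimated proportion of alleles shared identical by descent (IBD) at a marker, with a negative slope indicating linkage to a QTL, can be sketched as a plain least-squares fit. This is a textbook sketch, not the authors' interval-mapping code:

```python
def haseman_elston(sq_diffs, ibd_props):
    """Simple linear regression of squared sib-pair trait differences
    (sq_diffs) on the proportion of alleles shared IBD at a marker
    (ibd_props). Returns (intercept, slope); a significantly negative
    slope suggests a QTL linked to the marker."""
    n = len(sq_diffs)
    mx = sum(ibd_props) / n
    my = sum(sq_diffs) / n
    sxx = sum((x - mx) ** 2 for x in ibd_props)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ibd_props, sq_diffs))
    slope = sxy / sxx
    return my - slope * mx, slope
```

Interval mapping extends this by inferring IBD proportions at positions between markers and scanning the slope statistic along the chromosome.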

  7. A GIS based method for soil mapping in Sardinia, Italy: a geomatic approach.

    PubMed

    Vacca, A; Loddo, S; Melis, M T; Funedda, A; Puddu, R; Verona, M; Fanni, S; Fantola, F; Madrau, S; Marrone, V A; Serra, G; Tore, C; Manca, D; Pasci, S; Puddu, M R; Schirru, P

    2014-06-01

    A new project was recently initiated for the realization of the "Land Unit and Soil Capability Map of Sardinia" at a scale of 1:50,000 to support land use planning. In this study, we outline the general structure of the project and the methods used in the activities that have been thus far conducted. A GIS approach was used. We used the soil-landscape paradigm for the prediction of soil classes and their spatial distribution or the prediction of soil properties based on landscape features. The work is divided into two main phases. In the first phase, the available digital data on land cover, geology and topography were processed and classified according to their influence on weathering processes and soil properties. The methods used in the interpretation are based on consolidated and generalized knowledge about the influence of geology, topography and land cover on soil properties. The existing soil data (areal and point data) were collected, reviewed, validated and standardized according to international and national guidelines. Point data considered to be usable were input into a specific database created for the project. Using expert interpretation, all digital data were merged to produce a first draft of the Land Unit Map. During the second phase, this map will be integrated with the existing soil data and verified in the field, with new soil data collected where needed, and the final Land Unit Map will be produced. The Land Unit and Soil Capability Map will be produced by classifying the land units using a reference matching table of land capability classes created for this project.

  8. Evaluation of various modelling approaches in flood routing simulation and flood area mapping

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Loukas, Athanasios; Vasiliades, Lampros; Aronica, Giuseppe

    2016-04-01

    An essential process of flood hazard analysis and mapping is floodplain modelling. The selection of the modelling approach, especially in complex riverine topographies such as urban and suburban areas and in ungauged watersheds, may affect the accuracy of the outcomes in terms of flood depths and flood inundation area. In this study, a sensitivity analysis was implemented using several hydraulic-hydrodynamic modelling approaches (1D, 2D, 1D/2D), and the effect of the modelling approach on flood modelling and flood mapping was investigated. The digital terrain model (DTM) used in this study was generated from Terrestrial Laser Scanning (TLS) point cloud data. The modelling approaches included 1-dimensional hydraulic-hydrodynamic models (1D), 2-dimensional hydraulic-hydrodynamic models (2D) and coupled 1D/2D models. The 1D hydraulic-hydrodynamic models used were: HECRAS, MIKE11, LISFLOOD, XPSTORM. The 2D hydraulic-hydrodynamic models used were: MIKE21, MIKE21FM, HECRAS (2D), XPSTORM, LISFLOOD and FLO2d. The coupled 1D/2D models employed were: HECRAS(1D/2D), MIKE11/MIKE21 (MIKE FLOOD platform), MIKE11/MIKE21 FM (MIKE FLOOD platform), XPSTORM(1D/2D). The validation of the flood extent was achieved with the use of 2x2 contingency tables between the simulated and observed flooded areas for an extreme historical flash flood event. The Critical Success Index skill score was used in the validation process. The modelling approaches were also evaluated for simulation time and required computing power. The methodology was implemented in a suburban ungauged watershed of the Xerias river at Volos, Greece. The results of the analysis indicate the necessity of applying sensitivity analysis with different hydraulic-hydrodynamic modelling approaches, especially for areas with complex terrain.
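    The validation step described above reduces to a 2x2 contingency table over grid cells; a minimal sketch of the Critical Success Index computation on synthetic flood rasters (wet fraction and disagreement rate hypothetical):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy observed and simulated flood extents on a common grid (True = flooded);
    # the simulation disagrees with the observation on ~10% of cells.
    observed = rng.random((100, 100)) < 0.3
    simulated = observed ^ (rng.random((100, 100)) < 0.1)

    # 2x2 contingency table between simulated and observed flooded areas.
    hits = np.sum(simulated & observed)            # flooded in both
    false_alarms = np.sum(simulated & ~observed)   # flooded only in the model
    misses = np.sum(~simulated & observed)         # flooded only in the observation

    # Critical Success Index (threat score): hits / (hits + misses + false alarms).
    csi = hits / (hits + misses + false_alarms)
    print(f"CSI = {csi:.3f}")
    ```

    A CSI of 1 means perfect overlap between simulated and observed extents; correctly predicted dry cells do not enter the score.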

  9. A New Approach to Liquefaction Potential Mapping Using Remote Sensing and Machine Learning

    NASA Astrophysics Data System (ADS)

    Oommen, T.; Baise, L. G.

    2007-12-01

    learning capabilities of a human brain and make appropriate predictions that involve intuitive judgments and a high degree of nonlinearity. The accuracy of the developed liquefaction potential map was tested using independent testing data that was not used for the model development. The results show that the developed liquefaction potential map has an overall classification accuracy of 84%, indicating that the combination of remote sensing data and other relevant spatial data together with machine learning can be a promising approach for liquefaction potential mapping.

  10. Interacting boson models for N˜Z nuclei

    NASA Astrophysics Data System (ADS)

    Van Isacker, P.

    2011-05-01

    This contribution discusses the use of boson models in the description of N˜Z nuclei. A brief review is given of earlier attempts, initiated by Elliott and co-workers, to extend the interacting boson model of Arima and Iachello by the inclusion of neutron-proton s and d bosons with T = 1 (IBM-3) as well as T = 0 (IBM-4). It is argued that for the N˜Z nuclei that are currently studied experimentally, a different approach is needed which invokes aligned neutron-proton pairs with angular momentum J = 2j and isospin T = 0. This claim is supported by an analysis of shell-model wave functions in terms of pair states. Results of this alternative version of the interacting boson model are compared with shell-model calculations in the 1g9/2 shell.

  11. Interacting boson models for N~Z nuclei

    SciTech Connect

    Van Isacker, P.

    2011-05-06

    This contribution discusses the use of boson models in the description of N~Z nuclei. A brief review is given of earlier attempts, initiated by Elliott and co-workers, to extend the interacting boson model of Arima and Iachello by the inclusion of neutron-proton s and d bosons with T = 1 (IBM-3) as well as T = 0 (IBM-4). It is argued that for the N~Z nuclei that are currently studied experimentally, a different approach is needed which invokes aligned neutron-proton pairs with angular momentum J = 2j and isospin T = 0. This claim is supported by an analysis of shell-model wave functions in terms of pair states. Results of this alternative version of the interacting boson model are compared with shell-model calculations in the 1g9/2 shell.

  12. Bose-Einstein condensates of bosonic Thomson atoms

    NASA Astrophysics Data System (ADS)

    Schneider, Tobias; Blümel, Reinhold

    1999-10-01

    A system of charged particles in a harmonic trap is a realization of Thomson's raisin cake model. Therefore, we call it a Thomson atom. Bosonic, fermionic and mixed Thomson atoms exist. In this paper we focus on bosonic Thomson atoms in isotropic traps. Approximating the exact ground state by a condensate we investigate ground-state properties at temperature T = 0 using the Hartree-Fock theory for bosons. In order to assess the quality of our mean-field approach we compare the Hartree-Fock results for bosonic Thomson helium with an exact diagonalization. In contrast to the weakly interacting Bose gas (alkali vapours) mean-field calculations are reliable in the limit of large particle density. The Wigner regime (low particle density) is discussed.

  13. Functional connectivity-based parcellation of amygdala using self-organized mapping: a data driven approach.

    PubMed

    Mishra, Arabinda; Rogers, Baxter P; Chen, Li Min; Gore, John C

    2014-04-01

    The overall goal of this work is to demonstrate how resting state functional magnetic resonance imaging (fMRI) signals may be used to objectively parcellate functionally heterogeneous subregions of the human amygdala into structures characterized by similar patterns of functional connectivity. We hypothesize that similarity of functional connectivity of subregions with other parts of the brain can be a potential basis to segment and cluster voxels using data driven approaches. In this work, a self-organizing map (SOM) was implemented to cluster the connectivity maps associated with each voxel of the human amygdala, thereby defining distinct subregions. The functional separation was optimized by evaluating the overall differences in functional connectivity between the subregions at the group level. Analysis of 25 resting state fMRI data sets suggests that the SOM can successfully identify functionally independent nuclei based on differences in their inter subregional functional connectivity, evaluated statistically at various confidence levels. Although the amygdala contains several nuclei whose distinct roles are implicated in various functions, our objective approach discerns at least two functionally distinct volumes comparable to previous parcellation results obtained using probabilistic tractography and cytoarchitectonic analysis. Association of these nuclei with various known functions and a quantitative evaluation of their differences in overall functional connectivity with lateral orbital frontal cortex and temporal pole confirms the functional diversity of the amygdala. The data driven approach adopted here may be used as a powerful indicator of structure-function relationships in the amygdala and other functionally heterogeneous structures as well.
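    The clustering idea can be sketched with synthetic data, assuming each voxel is represented by its connectivity vector: a two-unit SOM (with the neighborhood function omitted, which reduces it to online competitive learning) assigns each voxel to the unit whose weight vector best matches its connectivity profile. Data, profiles, and learning schedule are all hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "connectivity maps": one feature vector per amygdala voxel, drawn from
    # two underlying connectivity profiles (the subregions to be recovered).
    profiles = np.array([[1.0, 0.0, 0.5],
                         [0.0, 1.0, 0.5]])
    labels = rng.integers(0, 2, size=200)               # ground truth, not used in training
    data = profiles[labels] + 0.1 * rng.standard_normal((200, 3))

    # Two-unit SOM, initialized from two random voxels; each presentation pulls
    # the best-matching unit (BMU) toward the sample with a decaying rate.
    weights = data[rng.choice(len(data), size=2, replace=False)].copy()
    for epoch in range(50):
        lr = 0.5 * (1 - epoch / 50)
        for xvec in data[rng.permutation(len(data))]:
            bmu = np.argmin(((weights - xvec) ** 2).sum(axis=1))
            weights[bmu] += lr * (xvec - weights[bmu])

    # Parcellation: each voxel joins the subregion of its best-matching unit.
    assign = np.argmin(((data[:, None, :] - weights) ** 2).sum(axis=2), axis=1)
    print("cluster sizes:", np.bincount(assign))
    ```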

  14. A Non-parametric Approach to Constrain the Transfer Function in Reverberation Mapping

    NASA Astrophysics Data System (ADS)

    Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming

    2016-11-01

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.
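    The parameterization can be sketched directly: a transfer function built as a sum of relatively displaced Gaussian responses, convolved with a model continuum to produce the delayed line light curve. All lags, widths, and amplitudes below are hypothetical illustrations, not values from the paper:

    ```python
    import numpy as np

    # Transfer function expressed as a sum of relatively displaced Gaussian
    # responses (lags, widths, and amplitudes are hypothetical).
    dtau = 0.1
    tau = np.arange(0.0, 40.0, dtau)                 # time-delay grid (days)
    centers = np.arange(2.0, 40.0, 2.0)              # displaced Gaussian centers
    amps = np.exp(-0.5 * ((centers - 10.0) / 5.0) ** 2)  # smooth amplitude envelope
    psi = sum(a * np.exp(-0.5 * ((tau - c) / 1.5) ** 2) for a, c in zip(amps, centers))
    psi /= psi.sum() * dtau                          # normalize: psi integrates to 1

    # The emission line is a blurred echo: the continuum light curve convolved
    # with the transfer function.
    t = np.arange(0.0, 200.0, dtau)
    continuum = 1.0 + 0.3 * np.sin(2 * np.pi * t / 50.0)
    line = np.convolve(continuum, psi)[:len(t)] * dtau

    mean_lag = (tau * psi).sum() * dtau              # centroid lag of the response
    print(f"centroid lag = {mean_lag:.1f} days")
    ```

    In the non-parametric approach, the amplitudes of the displaced Gaussians become free parameters inferred from the RM data rather than being fixed in advance.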

  15. An internal state variable mapping approach for Li-Plating diagnosis

    NASA Astrophysics Data System (ADS)

    Bai, Guangxing; Wang, Pingfeng

    2016-08-01

    Li-ion battery failure has become one of the major challenges for reliable battery applications, as it can have catastrophic consequences. Compared with capacity fading resulting from calendar effects, Li-plating-induced battery failures are more difficult to identify, as they cause sudden capacity loss, leaving limited time for failure diagnosis. This paper presents a new internal state variable (ISV) mapping approach that identifies the values of immeasurable battery ISVs, considering changes in the inherent parameters of the battery system dynamics, for Li-plating diagnosis. Employing the developed ISV mapping approach, an explicit functional relationship model between measurable battery signals and immeasurable battery ISVs can be developed. The developed model can then be used to identify ISVs from an online battery system in order to detect the occurrence of Li-plating. Employing multiphysics-based simulation of Li-plating in COMSOL, the proposed diagnosis approach is implemented under different conditions in case studies that demonstrate its efficacy in diagnosing Li-plating onset timing.

  16. A pooling-based approach to mapping genetic variants associated with DNA methylation

    DOE PAGES

    Kaplow, Irene M.; MacIsaac, Julia L.; Mah, Sarah M.; ...

    2015-04-24

    DNA methylation is an epigenetic modification that plays a key role in gene regulation. Previous studies have investigated its genetic basis by mapping genetic variants that are associated with DNA methylation at specific sites, but these have been limited to microarrays that cover <2% of the genome and cannot account for allele-specific methylation (ASM). Other studies have performed whole-genome bisulfite sequencing on a few individuals, but these lack statistical power to identify variants associated with DNA methylation. We present a novel approach in which bisulfite-treated DNA from many individuals is sequenced together in a single pool, resulting in a truly genome-wide map of DNA methylation. Compared to methods that do not account for ASM, our approach increases statistical power to detect associations while sharply reducing cost, effort, and experimental variability. As a proof of concept, we generated deep sequencing data from a pool of 60 human cell lines; we evaluated almost twice as many CpGs as the largest microarray studies and identified more than 2000 genetic variants associated with DNA methylation. Here we found that these variants are highly enriched for associations with chromatin accessibility and CTCF binding but are less likely to be associated with traits indirectly linked to DNA, such as gene expression and disease phenotypes. In summary, our approach allows genome-wide mapping of genetic variants associated with DNA methylation in any tissue of any species, without the need for individual-level genotype or methylation data.
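    The per-site association test in a pooled design can be sketched as a 2x2 table linking the allele carried by each bisulfite read to its methylation state at a nearby CpG. The counts below are hypothetical, and SciPy's Fisher exact test stands in for whatever test the pipeline actually uses:

    ```python
    from scipy.stats import fisher_exact

    # Pooled bisulfite reads covering a SNP and a nearby CpG: each read reports
    # both the allele it carries and whether the CpG was methylated.
    #                 methylated  unmethylated
    table = [[45, 15],   # reads carrying allele A
             [12, 48]]   # reads carrying allele B

    # A significant association suggests the variant influences methylation
    # in cis (allele-specific methylation).
    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")
    ```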

  17. A self organizing map approach to physiological data analysis for enhanced group performance.

    SciTech Connect

    Doser, Adele Beatrice; Merkle, Peter Benedict

    2004-10-01

    A Self Organizing Map (SOM) approach was used to analyze physiological data taken from a group of subjects participating in a cooperative video shooting game. The ultimate aim was to discover signatures of group cooperation, conflict, leadership, and performance. Such information could be fed back to participants in a meaningful way, and ultimately increase group performance in national security applications, where the consequences of a poor group decision can be devastating. Results demonstrated that a SOM can be a useful tool in revealing individual and group signatures from physiological data, and could ultimately be used to heighten group performance.

  18. Image Mining in Remote Sensing for Coastal Wetlands Mapping: from Pixel Based to Object Based Approach

    NASA Astrophysics Data System (ADS)

    Farda, N. M.; Danoedoro, P.; Hartono; Harjoko, A.

    2016-11-01

    Remote sensing image data are now abundant, and this large amount of data creates a “knowledge gap” in the extraction of selected information, especially on coastal wetlands. Coastal wetlands provide ecosystem services essential to people and the environment. The aim of this research is to extract coastal wetland information from satellite data using pixel-based and object-based image mining approaches. Landsat MSS, Landsat 5 TM, Landsat 7 ETM+, and Landsat 8 OLI images located in the Segara Anakan lagoon were selected to represent multi-temporal data. The inputs for image mining are visible and near-infrared bands, PCA bands, inverse PCA bands, mean shift segmentation bands, bare soil index, vegetation index, wetness index, elevation from SRTM and ASTER GDEM, and GLCM (Haralick) or variability texture. Three methods were applied to extract coastal wetlands using image mining: pixel-based Decision Tree C4.5, pixel-based Back Propagation Neural Network, and object-based Mean Shift segmentation with Decision Tree C4.5. The results show that remote sensing image mining can be used to map coastal wetland ecosystems. Decision Tree C4.5 produced the maps with the highest accuracy (0.75 overall kappa). The availability of remote sensing image mining for mapping coastal wetlands is very important to provide a better understanding of their spatiotemporal dynamics and distribution.
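    An overall kappa such as the one reported above is computed from the confusion matrix of mapped versus reference classes; a minimal sketch with hypothetical counts:

    ```python
    import numpy as np

    # Hypothetical confusion matrix of mapped vs. reference wetland classes
    # (rows: mapped class, columns: reference class).
    cm = np.array([[80, 10,  5],
                   [ 8, 60,  7],
                   [ 4,  6, 70]])

    n = cm.sum()
    po = np.trace(cm) / n                                 # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
    kappa = (po - pe) / (1 - pe)                          # Cohen's kappa
    print(f"overall kappa = {kappa:.3f}")
    ```

    Kappa discounts the agreement expected by chance from the class marginals, which is why it is usually lower than raw overall accuracy.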

  19. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach

    PubMed Central

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Duguleana, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems. PMID:26167533

  20. A Novel Approach on Designing Augmented Fuzzy Cognitive Maps Using Fuzzified Decision Trees

    NASA Astrophysics Data System (ADS)

    Papageorgiou, Elpiniki I.

    This paper proposes a new methodology for designing Fuzzy Cognitive Maps using crisp decision trees that have been fuzzified. A fuzzy cognitive map is a knowledge-based technique that works as an artificial cognitive network, inheriting the main aspects of cognitive maps and artificial neural networks. Decision trees, on the other hand, are well-known intelligent techniques that extract rules from both symbolic and numeric data. Fuzzy theoretical techniques are used to fuzzify crisp decision trees in order to soften the decision boundaries at the decision nodes inherent in this type of tree. Comparisons between crisp decision trees and fuzzified decision trees suggest that the latter are significantly more robust and produce more balanced decision making. The approach proposed in this paper can incorporate any type of fuzzy decision tree. Through this methodology, new linguistic weights are determined in the FCM model, producing an augmented FCM tool. The framework consists of a new fuzzy algorithm that generates, from induced fuzzy decision trees, linguistic weights describing the cause-effect relationships among the concepts of the FCM model.
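    The softening of a decision boundary can be sketched with a single fuzzified node: a sigmoid membership function replaces the crisp threshold, and the prediction becomes a membership-weighted mix of the two branches. The threshold, slope, and leaf values below are hypothetical, not taken from the paper:

    ```python
    import math

    def crisp_predict(x, theta=5.0, left=0.2, right=0.9):
        # Crisp decision node: a hard jump at the boundary x = theta.
        return left if x < theta else right

    def fuzzy_predict(x, theta=5.0, s=0.5, left=0.2, right=0.9):
        # Fuzzified node: a sigmoid membership softens the boundary, and the
        # output is the membership-weighted mix of the two branches.
        mu_left = 1.0 / (1.0 + math.exp((x - theta) / s))
        return mu_left * left + (1.0 - mu_left) * right

    for x in (3.0, 4.9, 5.1, 7.0):
        print(x, crisp_predict(x), round(fuzzy_predict(x), 3))
    ```

    Near the threshold the crisp tree flips abruptly between 0.2 and 0.9, while the fuzzified node changes smoothly, which is what makes the fuzzy tree's decisions more robust to small input perturbations.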

  1. An object-oriented approach to automated landform mapping: A case study of drumlins

    NASA Astrophysics Data System (ADS)

    Saha, Kakoli; Wells, Neil A.; Munro-Stasiuk, Mandy

    2011-09-01

    This paper details an automated object-oriented approach to mapping landforms from digital elevation models (DEMs), using the example of drumlins in the Chautauqua drumlin field in NW Pennsylvania and upstate New York. Object-oriented classification is highly desirable as it can identify specific shapes in datasets based on both the pixel values in a raster dataset and the contextual information between pixels and extracted objects. The methodology is built specifically for application to the USGS 30 m resolution DEM data, which are freely available to the public and of sufficient resolution to map medium scale landforms. Using the raw DEM data, as well as derived aspect and slope, Definiens Developer (v.7) was used to perform multiresolution segmentation, followed by rule-based classification in order to extract individual polygons that represent drumlins. Drumlins obtained by automated extraction were visually and statistically compared to those identified via manual digitization. Detailed morphometric descriptive statistics such as means, ranges, and standard deviations were inspected and compared for length, width, elongation ratio, area, and perimeter. Although the manual and automated results were not always statistically identical, a more detailed comparison of just the drumlins identified by both procedures showed that the automated methods easily matched the manual digitization. Differences in the two methods related to mapping compound drumlins, and smaller and larger drumlins. The automated method generally identified more features in these categories than the manual method and thus outperformed the manual method.
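    The morphometric descriptors compared in the study (area, perimeter, elongation) can be sketched for a single extracted polygon. Here the elongation ratio is taken from the principal axes of the vertex scatter, a simple stand-in for the paper's length/width measure, applied to a hypothetical 2:1 elliptical drumlin outline:

    ```python
    import numpy as np

    def shoelace_area(pts):
        # Polygon area via the shoelace formula (ordered vertices, not closed).
        xs, ys = pts[:, 0], pts[:, 1]
        return 0.5 * abs(np.dot(xs, np.roll(ys, -1)) - np.dot(ys, np.roll(xs, -1)))

    def morphometrics(pts):
        # Area, perimeter, and an elongation ratio from the principal axes
        # of the vertex scatter (a stand-in for the paper's length/width).
        closed = np.vstack([pts, pts[:1]])
        perimeter = np.sqrt((np.diff(closed, axis=0) ** 2).sum(axis=1)).sum()
        area = shoelace_area(pts)
        evals = np.linalg.eigvalsh(np.cov((pts - pts.mean(axis=0)).T))
        elongation = np.sqrt(evals[1] / evals[0])   # long axis / short axis
        return area, perimeter, elongation

    # Hypothetical drumlin outline: a 2:1 ellipse (400 m x 200 m semi-axes)
    # sampled at 100 vertices.
    theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
    outline = np.column_stack([400 * np.cos(theta), 200 * np.sin(theta)])
    area, perim, elong = morphometrics(outline)
    print(f"area={area:.0f} m^2, perimeter={perim:.0f} m, elongation={elong:.2f}")
    ```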

  2. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach.

    PubMed

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Duguleana, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems.

  3. An entropy-driven matrix completion (E-MC) approach to complex network mapping

    NASA Astrophysics Data System (ADS)

    Koochakzadeh, Ali; Pal, Piya

    2016-05-01

    Mapping the topology of a complex network in a resource-efficient manner is a challenging problem with applications in internet mapping, social network inference, and so forth. We propose a new entropy driven algorithm leveraging ideas from matrix completion, to map the network using monitors (or sensors) which, when placed on judiciously selected nodes, are capable of discovering their immediate neighbors. The main challenge is to maximize the portion of discovered network using only a limited number of available monitors. To this end, (i) a new measure of entropy or uncertainty is associated with each node, in terms of the currently discovered edges incident on that node, and (ii) a greedy algorithm is developed to select a candidate node for monitor placement based on its entropy. Utilizing the fact that many complex networks of interest (such as social networks), have a low-rank adjacency matrix, a matrix completion algorithm, namely 1-bit matrix completion, is combined with the greedy algorithm to further boost its performance. The low rank property of the network adjacency matrix can be used to extrapolate a portion of missing edges, and consequently update the node entropies, so as to efficiently guide the network discovery algorithm towards placing monitors on the nodes that can turn out to be more informative. Simulations performed on a variety of real world networks such as social networks and peer networks demonstrate the superior performance of the matrix-completion guided approach in discovering the network topology.
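    A minimal sketch of the greedy, entropy-guided monitor placement on a synthetic network. The binary-entropy measure below is a heuristic stand-in for the paper's node-uncertainty measure, and the 1-bit matrix-completion step that extrapolates missing edges is omitted:

    ```python
    import math
    import random

    random.seed(0)
    n = 30
    # Hypothetical ground-truth network (Erdos-Renyi-like), unknown to the mapper.
    truth = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < 0.15:
                truth[i].add(j)
                truth[j].add(i)

    discovered = {i: set() for i in range(n)}
    monitored = set()

    def entropy(i):
        # Binary-entropy heuristic on the currently discovered incident edges,
        # standing in for the paper's node-uncertainty measure.
        p = (len(discovered[i]) + 1) / n
        return -p * math.log(p) - (1 - p) * math.log(1 - p)

    budget = 8
    for _ in range(budget):
        # Greedy step: place the next monitor on the highest-entropy node.
        candidate = max((i for i in range(n) if i not in monitored), key=entropy)
        monitored.add(candidate)
        for j in truth[candidate]:          # a monitor reveals its neighbors
            discovered[candidate].add(j)
            discovered[j].add(candidate)

    found = sum(len(v) for v in discovered.values()) // 2
    total = sum(len(v) for v in truth.values()) // 2
    print(f"discovered {found}/{total} edges with {budget} monitors")
    ```

    In the full method, the low-rank structure of the adjacency matrix would be used between greedy steps to extrapolate undiscovered edges and update the node entropies.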

  4. Flood mapping using VHR satellite imagery: a comparison between different classification approaches

    NASA Astrophysics Data System (ADS)

    Franci, Francesca; Boccardo, Piero; Mandanici, Emanuele; Roveri, Elena; Bitelli, Gabriele

    2016-10-01

    Various regions in Europe have suffered from severe flooding over the last decades. Flood disasters often have a broad extent and a high frequency. They are considered the most devastating natural hazards because of the tremendous fatalities, injuries, property damages, economic and social disruption that they cause. In this context, Earth Observation techniques have become a key tool for flood risk and damage assessment. In particular, remote sensing facilitates flood surveying, providing valuable information, e.g. flood occurrence, intensity and progress of flood inundation, spurs and embankments affected/threatened. The present work aims to investigate the use of Very High Resolution satellite imagery for mapping flood-affected areas. The case study is the November 2013 flood event which occurred in the Sardinia region (Italy), affecting a total of 2,700 people and killing 18 persons. The investigated zone extends for 28 km² along the Posada river, from the Maccheronis dam to the mouth in the Tyrrhenian sea. A post-event SPOT6 image was processed by means of different classification methods, in order to produce the flood map of the analysed area. The unsupervised classification algorithm ISODATA was tested. A pixel-based supervised technique was applied using the Maximum Likelihood algorithm; moreover, the SPOT 6 image was processed by means of object-oriented approaches. The produced flood maps were compared with each other and with an independent data source, in order to evaluate the performance of each method, also in terms of time demand.

  5. Flight investigation of helicopter IFR approaches to oil rigs using airborne weather and mapping radar

    NASA Technical Reports Server (NTRS)

    Bull, J. S.; Hegarty, D. M.; Phillips, J. D.; Sturgeon, W. R.; Hunting, A. W.; Pate, D. P.

    1979-01-01

    Airborne weather and mapping radar is a near-term, economical method of providing 'self-contained' navigation information for approaches to offshore oil rigs and its use has been rapidly expanding in recent years. A joint NASA/FAA flight test investigation of helicopter IFR approaches to offshore oil rigs in the Gulf of Mexico was initiated in June 1978 and conducted under contract to Air Logistics. Approximately 120 approaches were flown in a Bell 212 helicopter by 15 operational pilots during the months of August and September 1978. The purpose of the tests was to collect data to (1) support development of advanced radar flight director concepts by NASA and (2) aid the establishment of Terminal Instrument Procedures (TERPS) criteria by the FAA. The flight test objectives were to develop airborne radar approach procedures, measure tracking errors, determine acceptable weather minimums, and determine pilot acceptability. Data obtained will contribute significantly to improved helicopter airborne radar approach capability and to the support of exploration, development, and utilization of the Nation's offshore oil supplies.

  6. An Integrated Spin-Labeling/Computational-Modeling Approach for Mapping Global Structures of Nucleic Acids.

    PubMed

    Tangprasertchai, Narin S; Zhang, Xiaojun; Ding, Yuan; Tham, Kenneth; Rohs, Remo; Haworth, Ian S; Qin, Peter Z

    2015-01-01

    The technique of site-directed spin labeling (SDSL) provides unique information on biomolecules by monitoring the behavior of a stable radical tag (i.e., spin label) using electron paramagnetic resonance (EPR) spectroscopy. In this chapter, we describe an approach in which SDSL is integrated with computational modeling to map conformations of nucleic acids. This approach builds upon a SDSL tool kit previously developed and validated, which includes three components: (i) a nucleotide-independent nitroxide probe, designated as R5, which can be efficiently attached at defined sites within arbitrary nucleic acid sequences; (ii) inter-R5 distances in the nanometer range, measured via pulsed EPR; and (iii) an efficient program, called NASNOX, that computes inter-R5 distances on given nucleic acid structures. Following a general framework of data mining, our approach uses multiple sets of measured inter-R5 distances to retrieve "correct" all-atom models from a large ensemble of models. The pool of models can be generated independently without relying on the inter-R5 distances, thus allowing a large degree of flexibility in integrating the SDSL-measured distances with a modeling approach best suited for the specific system under investigation. As such, the integrative experimental/computational approach described here represents a hybrid method for determining all-atom models based on experimentally-derived distance measurements.

  7. A novel linear programming approach to fluence map optimization for intensity modulated radiation therapy treatment planning.

    PubMed

    Romeijn, H Edwin; Ahuja, Ravindra K; Dempsey, James F; Kumar, Arvind; Li, Jonathan G

    2003-11-07

    We present a novel linear programming (LP) based approach for efficiently solving the intensity modulated radiation therapy (IMRT) fluence-map optimization (FMO) problem to global optimality. Our model overcomes the apparent limitations of a linear-programming approach by approximating any convex objective function by a piecewise linear convex function. This approach allows us to retain the flexibility offered by general convex objective functions, while allowing us to formulate the FMO problem as an LP problem. In addition, a novel type of partial-volume constraint that bounds the tail averages of the differential dose-volume histograms of structures is imposed while retaining linearity as an alternative approach to improve dose homogeneity in the target volumes, and to attempt to spare as many critical structures as possible. The goal of this work is to develop a very rapid global optimization approach that finds high quality dose distributions. Implementation of this model has demonstrated excellent results. We found globally optimal solutions for eight 7-beam head-and-neck cases in less than 3 min of computational time on a single processor personal computer without the use of partial-volume constraints. Adding such constraints increased the running times by a factor of 2-3, but improved the sparing of critical structures. All cases demonstrated excellent target coverage (> 95%), target homogeneity (< 10% overdosing and < 7% underdosing) and organ sparing using at least one of the two models.
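    The piecewise-linear trick can be sketched on a toy fluence-map problem: each convex quadratic voxel penalty is bounded below by its tangent lines, and epigraph variables turn the maximum of those tangents into linear constraints, so the whole problem becomes an LP. The geometry, prescriptions, and tangent grid below are hypothetical, and SciPy's `linprog` stands in for a production LP solver:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy FMO geometry: 2 beamlets, 3 voxels, dose d = D @ x (values hypothetical).
    D = np.array([[1.0, 0.2],    # target voxel 1
                  [0.3, 1.0],    # target voxel 2
                  [0.5, 0.5]])   # critical-structure voxel
    p = np.array([1.0, 1.0, 0.0])  # prescribed doses (zero for the critical voxel)
    n_vox, n_beam = D.shape

    # Epigraph trick: the convex penalty u^2 satisfies u^2 >= 2*g*u - g^2 for
    # every tangent point g, so minimizing t_i subject to all tangent
    # inequalities gives a piecewise-linear approximation of (d_i - p_i)^2.
    grid = np.linspace(-1.0, 1.0, 9)   # tangent points in u = d_i - p_i
    c = np.concatenate([np.zeros(n_beam), np.ones(n_vox)])  # minimize sum t_i
    A_ub, b_ub = [], []
    for i in range(n_vox):
        for g in grid:
            # 2*g*(D_i @ x) - t_i <= 2*g*p_i + g^2
            row = np.zeros(n_beam + n_vox)
            row[:n_beam] = 2 * g * D[i]
            row[n_beam + i] = -1.0
            A_ub.append(row)
            b_ub.append(2 * g * p[i] + g * g)

    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * n_beam + [(None, None)] * n_vox)
    print("beamlet weights:", np.round(res.x[:n_beam], 3))
    ```

    Refining the tangent grid tightens the piecewise-linear approximation toward the true quadratic objective while the problem remains an LP.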

  8. An Integrated Spin-Labeling/Computational-Modeling Approach for Mapping Global Structures of Nucleic Acids

    PubMed Central

    Tangprasertchai, Narin S.; Zhang, Xiaojun; Ding, Yuan; Tham, Kenneth; Rohs, Remo; Haworth, Ian S.; Qin, Peter Z.

    2015-01-01

    The technique of site-directed spin labeling (SDSL) provides unique information on biomolecules by monitoring the behavior of a stable radical tag (i.e., spin label) using electron paramagnetic resonance (EPR) spectroscopy. In this chapter, we describe an approach in which SDSL is integrated with computational modeling to map conformations of nucleic acids. This approach builds upon a SDSL tool kit previously developed and validated, which includes three components: (i) a nucleotide-independent nitroxide probe, designated as R5, which can be efficiently attached at defined sites within arbitrary nucleic acid sequences; (ii) inter-R5 distances in the nanometer range, measured via pulsed EPR; and (iii) an efficient program, called NASNOX, that computes inter-R5 distances on given nucleic acid structures. Following a general framework of data mining, our approach uses multiple sets of measured inter-R5 distances to retrieve “correct” all-atom models from a large ensemble of models. The pool of models can be generated independently without relying on the inter-R5 distances, thus allowing a large degree of flexibility in integrating the SDSL-measured distances with a modeling approach best suited for the specific system under investigation. As such, the integrative experimental/computational approach described here represents a hybrid method for determining all-atom models based on experimentally-derived distance measurements. PMID:26477260

  9. Non-invasive computation of aortic pressure maps: a phantom-based study of two approaches

    NASA Astrophysics Data System (ADS)

    Delles, Michael; Schalck, Sebastian; Chassein, Yves; Müller, Tobias; Rengier, Fabian; Speidel, Stefanie; von Tengg-Kobligk, Hendrik; Kauczor, Hans-Ulrich; Dillmann, Rüdiger; Unterhinninghofen, Roland

    2014-03-01

Patient-specific blood pressure values in the human aorta are an important parameter in the management of cardiovascular diseases. A direct measurement of these values is only possible by invasive catheterization at a limited number of measurement sites. To overcome these drawbacks, two non-invasive approaches of computing patient-specific relative aortic blood pressure maps throughout the entire aortic vessel volume are investigated by our group. The first approach uses computations from complete time-resolved, three-dimensional flow velocity fields acquired by phase-contrast magnetic resonance imaging (PC-MRI), whereas the second approach relies on computational fluid dynamics (CFD) simulations with ultrasound-based boundary conditions. A detailed evaluation of these computational methods under realistic conditions is necessary in order to investigate their overall robustness and accuracy as well as their sensitivity to certain algorithmic parameters. We present a comparative study of the two blood pressure computation methods in an experimental phantom setup, which mimics a simplified thoracic aorta. The comparative analysis includes the investigation of the impact of algorithmic parameters on the MRI-based blood pressure computation and the impact of extracting pressure maps in a voxel grid from the CFD simulations. Overall, a very good agreement between the results of the two computational approaches can be observed despite the fact that both methods used completely separate measurements as input data. Therefore, the comparative study of the presented work indicates that both non-invasive pressure computation methods show an excellent robustness and accuracy and can therefore be used for research purposes in the management of cardiovascular diseases.
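The principle behind the PC-MRI route, deriving a pressure gradient from measured velocities via the momentum equation and integrating it, can be sketched in one dimension (a hedged toy: steady, inviscid flow along a straight centerline with synthetic numbers; the real method operates on full 3D time-resolved fields):

```python
import numpy as np

# Minimal 1D sketch of velocity-based relative pressure computation:
# derive the pressure gradient from the steady inviscid momentum
# equation dp/dx = -rho * v * dv/dx and integrate along the centerline.
rho = 1060.0                      # blood density, kg/m^3
x = np.linspace(0.0, 0.1, 101)    # centerline coordinate, m
v = 0.5 + 2.0 * x                 # synthetic accelerating velocity, m/s

dvdx = np.gradient(v, x)
dpdx = -rho * v * dvdx            # pressure gradient, Pa/m
p_rel = np.concatenate(
    ([0.0], np.cumsum(0.5 * (dpdx[1:] + dpdx[:-1]) * np.diff(x))))

# Analytic check: Bernoulli gives p_rel = 0.5*rho*(v0**2 - v**2)
p_exact = 0.5 * rho * (v[0]**2 - v[0:]**2)
print(float(np.max(np.abs(p_rel - p_exact))))  # small discretisation error
```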

  10. The field line map approach for simulations of magnetically confined plasmas

    NASA Astrophysics Data System (ADS)

    Stegmeir, Andreas; Coster, David; Maj, Omar; Hallatschek, Klaus; Lackner, Karl

    2016-01-01

Prediction of plasma parameters in the edge and scrape-off layer of tokamaks is difficult since most modern tokamaks have a divertor and the associated separatrix causes the usually employed field/flux-aligned coordinates to become singular on the separatrix/X-point. The presented field line map approach avoids such problems as it is based on a cylindrical grid: standard finite-difference methods can be used for the discretisation of perpendicular (w.r.t. magnetic field) operators, and the characteristic flute mode property (k∥ ≪ k⊥) of structures is exploited computationally via a field line following discretisation of parallel operators which leads to grid sparsification in the toroidal direction. This paper is devoted to the discretisation of the parallel diffusion operator (the approach taken is very similar to the flux-coordinate independent (FCI) approach, which has already been applied to a hyperbolic problem (Ottaviani, 2011; Hariri, 2013)). Based on the support operator method, schemes are derived which maintain the self-adjointness property of the parallel diffusion operator on the discrete level. These methods have very low numerical perpendicular diffusion compared to a naive discretisation, which is a critical issue since magnetically confined plasmas exhibit a very strong anisotropy. Two different versions of the discrete parallel diffusion operator are derived: the first is based on interpolation, where the order of interpolation and therefore the numerical diffusion is adjustable; the second is based on integration and is advantageous in cases where the field line map is strongly distorted. The schemes are implemented in the new code GRILLIX, and extensive benchmarks and numerous examples are presented which show the validity of the approach in general and GRILLIX in particular.
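The self-adjointness-by-construction idea behind the support operator method can be shown in a minimal 1D periodic setting (a generic illustration, unrelated to the GRILLIX implementation): define the discrete divergence as the negative adjoint of the discrete gradient, so the resulting diffusion matrix is symmetric by construction:

```python
import numpy as np

# Support-operator sketch on a 1D periodic grid: build the discrete
# diffusion operator as D = -G^T G, i.e. divergence defined as the
# negative adjoint of the discrete gradient, so D is self-adjoint
# by construction. Grid size and spacing are illustrative.
N, h = 32, 1.0 / 32

# Discrete gradient on a periodic grid (forward differences)
G = (np.roll(np.eye(N), -1, axis=1) - np.eye(N)) / h
D = -G.T @ G                          # support-operator diffusion matrix

print(bool(np.allclose(D, D.T)))      # self-adjoint by construction
eig = np.linalg.eigvalsh(D)
print(bool(np.all(eig <= 1e-9)))      # negative semi-definite, as diffusion must be
```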

  11. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    NASA Astrophysics Data System (ADS)

    Dhindsa, Harkirat S.; Makarimi-Kasim; Roger Anderson, O.

    2011-04-01

This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample of the study consisted of six classes (140 Form 3 students aged 13-15 years) selected from a typical coeducational school in Brunei. Three classes (40 boys and 30 girls) were taught using the TTA while three other classes (41 boys and 29 girls) used the CMA, enriched with PowerPoint presentations. After the interventions (lessons on magnetism), the students in both groups were asked to describe in writing their understanding of magnetism accrued from the lessons. Their written descriptions were analyzed using flow map analyses to assess their content knowledge and its organisation in memory as evidence of cognitive structure. The extent of CLE was measured using a published CLE survey. The results showed that the cognitive structures of the CMA students were more extensive, more thematically organised, and richer in interconnectedness of thoughts than those of TTA students. Moreover, CMA students also perceived their classroom learning environment to be more constructivist than did their TTA counterparts. It is, therefore, recommended that teachers consider using the CMA teaching technique to help students enrich their understanding, especially for more complex or abstract scientific content.

  12. HAGR-D: A Novel Approach for Gesture Recognition with Depth Maps.

    PubMed

    Santos, Diego G; Fernandes, Bruno J T; Bezerra, Byron L D

    2015-11-12

The hand is an important part of the body used to express information through gestures, and its movements can be used in dynamic gesture recognition systems based on computer vision, with practical applications in areas such as medicine, gaming, and sign language. Although depth sensors have led to great progress in gesture recognition, hand gesture recognition is still an open problem because of its complexity, which is due to the large number of small articulations in a hand. This paper proposes a novel approach for hand gesture recognition with depth maps generated by the Microsoft Kinect Sensor (Microsoft, Redmond, WA, USA) using a variation of the CIPBR (convex invariant position based on RANSAC) algorithm and a hybrid classifier composed of dynamic time warping (DTW) and hidden Markov models (HMM), called the hybrid approach for gesture recognition with depth maps (HAGR-D). The experiments show that the proposed model outperforms other algorithms in the literature on hand gesture recognition tasks, achieving a classification rate of 97.49% in the MSRGesture3D dataset and 98.43% in the RPPDI dynamic gesture dataset.
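The DTW half of the hybrid classifier can be illustrated with a textbook implementation (a generic DTW distance on 1D sequences, not the authors' feature pipeline):

```python
import numpy as np

# Minimal dynamic time warping (DTW) distance between two sequences:
# fill a cumulative-cost matrix where each cell adds the local cost to
# the cheapest of the three admissible predecessor alignments.
def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

print(dtw_distance([0, 1, 2, 3], [0, 1, 1, 2, 3]))  # 0.0: sequences align
print(dtw_distance([0, 0, 0], [1, 1, 1]))           # 3.0
```

DTW tolerates differences in gesture speed because the warping path may stretch or compress either sequence, which is why it pairs naturally with an HMM stage for classification.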

  13. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

Uncertainty analysis is an important component of dietary exposure assessments in order to correctly understand the strengths and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic understanding of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties.

  14. HAGR-D: A Novel Approach for Gesture Recognition with Depth Maps

    PubMed Central

    Santos, Diego G.; Fernandes, Bruno J. T.; Bezerra, Byron L. D.

    2015-01-01

The hand is an important part of the body used to express information through gestures, and its movements can be used in dynamic gesture recognition systems based on computer vision, with practical applications in areas such as medicine, gaming, and sign language. Although depth sensors have led to great progress in gesture recognition, hand gesture recognition is still an open problem because of its complexity, which is due to the large number of small articulations in a hand. This paper proposes a novel approach for hand gesture recognition with depth maps generated by the Microsoft Kinect Sensor (Microsoft, Redmond, WA, USA) using a variation of the CIPBR (convex invariant position based on RANSAC) algorithm and a hybrid classifier composed of dynamic time warping (DTW) and hidden Markov models (HMM), called the hybrid approach for gesture recognition with depth maps (HAGR-D). The experiments show that the proposed model outperforms other algorithms in the literature on hand gesture recognition tasks, achieving a classification rate of 97.49% in the MSRGesture3D dataset and 98.43% in the RPPDI dynamic gesture dataset. PMID:26569262

  15. In Silico Design of Human IMPDH Inhibitors Using Pharmacophore Mapping and Molecular Docking Approaches

    PubMed Central

    Li, Rui-Juan; Wang, Ya-Li; Wang, Qing-He; Wang, Jian; Cheng, Mao-Sheng

    2015-01-01

Inosine 5′-monophosphate dehydrogenase (IMPDH) is one of the crucial enzymes in the de novo biosynthesis of guanosine nucleotides. It has served as an attractive target in immunosuppressive, anticancer, antiviral, and antiparasitic therapeutic strategies. In this study, pharmacophore mapping and molecular docking approaches were employed to discover novel Homo sapiens IMPDH (hIMPDH) inhibitors. The Güner-Henry (GH) scoring method was used to evaluate the quality of generated pharmacophore hypotheses. One of the generated pharmacophore hypotheses was found to possess a GH score of 0.67. Ten potential compounds were selected from the ZINC database using a pharmacophore mapping approach and docked into the IMPDH active site. We find two hits (ZINC02090792 and ZINC00048033) that match the optimal pharmacophore features used in this investigation well and form interactions with key residues of IMPDH. We propose that these two hits are lead compounds for the development of novel hIMPDH inhibitors. PMID:25784957

  16. Coriolis effects on rotating Hele-Shaw flows: a conformal-mapping approach.

    PubMed

    Miranda, José A; Gadêlha, Hermes; Dorsey, Alan T

    2010-12-01

    The zero surface tension fluid-fluid interface dynamics in a radial Hele-Shaw cell driven by both injection and rotation is studied by a conformal-mapping approach. The situation in which one of the fluids is inviscid and has negligible density is analyzed. When Coriolis force effects are ignored, exact solutions of the zero surface tension rotating Hele-Shaw problem with injection reveal suppression of cusp singularities for sufficiently high rotation rates. We study how the Coriolis force affects the time-dependent solutions of the problem, and the development of finite time singularities. By employing Richardson's harmonic moments approach we obtain conformal maps which describe the time evolution of the fluid boundary. Our results demonstrate that the inertial Coriolis contribution plays an important role in determining the time for cusp formation. Moreover, it introduces a phase drift that makes the evolving patterns rotate. The Coriolis force acts against centrifugal effects, promoting (inhibiting) cusp breakdown if the more viscous and dense fluid lies outside (inside) the interface. Despite the presence of Coriolis effects, the occurrence of finger bending events has not been detected in the exact solutions.

  17. Mapping CO2 emission in highly urbanized region using standardized microbial respiration approach

    NASA Astrophysics Data System (ADS)

    Vasenev, V. I.; Stoorvogel, J. J.; Ananyeva, N. D.

    2012-12-01

Urbanization is a major recent land-use change pathway. Land conversion to urban uses has a tremendous and still unclear effect on soil cover and functions. Urban soil can act as a carbon source, although its potential for CO2 emission is also very high. The main challenge in analyzing and mapping soil organic carbon (SOC) in the urban environment is its high spatial heterogeneity and temporal dynamics. The urban environment provides a number of specific features and processes that influence soil formation and functioning and result in a unique spatial variability of carbon stocks and fluxes over short distances. Soil sealing, functional zoning, and settlement age and size are the predominant factors distinguishing the heterogeneity of urban soil carbon. The combination of these factors creates a large number of contrasting clusters with abrupt borders, which is very difficult to account for in regional assessment and mapping of SOC stocks and soil CO2 emission. Most of the existing approaches to measuring CO2 emission in field conditions (eddy covariance, soil chambers) are very sensitive to soil moisture and temperature conditions. They require long-term sampling throughout the season in order to obtain relevant results. This makes them inapplicable for analyzing the spatial variability of CO2 emission at the regional scale. Soil respiration (SR) measurement under standardized lab conditions makes it possible to overcome this difficulty. SR is the predominant outgoing carbon flux, including autotrophic respiration of plant roots and heterotrophic respiration of soil microorganisms. Microbiota are responsible for 50-80% of total soil carbon outflow. The microbial respiration (MR) approach provides integral CO2 emission results, characterizing microbial CO2 production under optimal conditions and thus independent of initial differences in soil temperature and moisture. The current study aimed to combine digital soil mapping (DSM) techniques with the standardized microbial respiration approach in order to analyse and

  18. A stochastic approach to estimate the uncertainty of dose mapping caused by uncertainties in b-spline registration

    SciTech Connect

    Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P.

    2012-04-15

Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging is becoming clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution which is to be mapped. Its performance is demonstrated in the context of dose mapping based on b-spline registration. It is based on evaluating the sensitivity of dose mapping to variations of the b-spline coefficients, combined with evaluating the sensitivity of the registration metric with respect to those variations. It was evaluated on patient data deformed according to a breathing model, for which the ground truth of the deformation, and hence the actual dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image, where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in the context of other mapping tasks as well.
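The underlying interplay, that dose-accumulation risk is highest where a steep dose gradient meets a large registration uncertainty, can be sketched with a first-order estimate (synthetic dose and uncertainty fields; the paper's actual method works with b-spline coefficient perturbations rather than this simplified product):

```python
import numpy as np

# First-order sketch: estimated dose error ~ |dose gradient| * geometric
# registration uncertainty. Both fields below are synthetic; sigma plays
# the role of the estimated per-voxel registration uncertainty.
dose = np.zeros((50, 50))
dose[:, 25:] = 60.0                     # 60 Gy target with sharp lateral falloff
sigma = np.full(dose.shape, 2.0)        # assumed 2-voxel registration uncertainty

gy, gx = np.gradient(dose)              # Gy per voxel along each axis
grad_mag = np.hypot(gx, gy)
dose_uncertainty = grad_mag * sigma     # first-order dose-error estimate (Gy)

risky = dose_uncertainty > 5.0          # flag voxels above a 5 Gy threshold
print(int(risky.sum()), "voxels flagged, all on the dose falloff")
```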

  19. Mapping of Protein–Protein Interaction Sites by the ‘Absence of Interference’ Approach

    PubMed Central

    Dhayalan, Arunkumar; Jurkowski, Tomasz P.; Laser, Heike; Reinhardt, Richard; Jia, Da; Cheng, Xiaodong; Jeltsch, Albert

    2008-01-01

Protein–protein interactions are critical to most biological processes, and locating protein–protein interfaces on protein structures is an important task in molecular biology. We developed a new experimental strategy called the ‘absence of interference’ approach to determine surface residues involved in protein–protein interaction of established yeast two-hybrid pairs of interacting proteins. One of the proteins is subjected to high-level randomization by error-prone PCR. The resulting library is selected by yeast two-hybrid system for interacting clones that are isolated and sequenced. The interaction region can be identified by an absence or depletion of mutations. For data analysis and presentation, we developed a Web interface that analyzes the mutational spectrum and displays the mutational frequency on the surface of the structure (or a structural model) of the randomized protein. Additionally, this interface might be of use for the display of mutational distributions determined by other types of random mutagenesis experiments. We applied the approach to map the interface of the catalytic domain of the DNA methyltransferase Dnmt3a with its regulatory factor Dnmt3L. Dnmt3a was randomized with high mutational load. A total of 76 interacting clones were isolated and sequenced, and 648 mutations were identified. The mutational pattern allowed identification of a unique interaction region on the surface of Dnmt3a, which comprises about 500-600 Å². The results were confirmed by site-directed mutagenesis and structural analysis. The absence-of-interference approach will allow high-throughput mapping of protein interaction sites suitable for functional studies and protein docking. PMID:18191145
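The depletion logic can be made concrete with a toy calculation: residues that tolerate mutations in interacting clones accumulate them, while the interface stays depleted (the counts and the 30% cutoff below are invented for illustration, not the paper's data):

```python
import numpy as np

# Per-residue mutation counts from sequenced interacting clones.
# Residues 5-9 play the role of a hypothetical interface: mutations
# there abolish the interaction, so such clones are never recovered.
counts = np.array([7, 6, 8, 5, 7,    # residues 0-4: tolerated
                   0, 1, 0, 0, 1,    # residues 5-9: depleted (interface)
                   6, 7, 5, 8, 6])   # residues 10-14: tolerated

threshold = 0.3 * counts.mean()      # depletion cutoff (30% of the mean)
predicted_interface = np.flatnonzero(counts < threshold)
print(predicted_interface.tolist())  # [5, 6, 7, 8, 9]
```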

  20. Divide and Conquer Approach to Contact Map Overlap Problem Using 2D-Pattern Mining of Protein Contact Networks.

    PubMed

    Koneru, Suvarna Vani; Bhavani, Durga S

    2015-01-01

A novel approach to the Contact Map Overlap (CMO) problem is proposed using the two-dimensional clusters present in the contact maps. Each protein is represented as a set of the non-trivial clusters of contacts extracted from its contact map. The approach involves finding matching regions between the two contact maps using an approximate 2D-pattern matching algorithm and a dynamic programming technique. These matched pairs of small contact maps are submitted in parallel to a fast heuristic CMO algorithm. The approach facilitates parallelization at this level since all the pairs of contact maps can be submitted to the algorithm in parallel. Then, a merge algorithm is used in order to obtain the overall alignment. As a proof of concept, MSVNS, a heuristic CMO algorithm, is used for global as well as local alignment. The divide and conquer approach is evaluated for two benchmark data sets, those of Skolnick and Ding et al. It is interesting to note that along with saving time, better overlap is also obtained for certain protein folds.
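The first step, extracting the non-trivial 2D clusters of contacts from a contact map, can be sketched with connected-component labelling on a toy binary map (a generic stand-in for the paper's 2D-pattern mining):

```python
import numpy as np
from scipy import ndimage

# Toy binary contact map with two dense contact clusters and one
# isolated (trivial) contact.
cmap = np.zeros((10, 10), dtype=int)
cmap[1:4, 1:4] = 1          # first cluster of contacts
cmap[6:9, 5:8] = 1          # second cluster
cmap[0, 9] = 1              # isolated contact

labels, n = ndimage.label(cmap)                     # 4-connected components
clusters = [np.argwhere(labels == k) for k in range(1, n + 1)]
nontrivial = [c for c in clusters if len(c) >= 4]   # drop tiny clusters
print(n, [len(c) for c in nontrivial])              # 3 components, two kept
```

Each retained cluster would then be matched against clusters of the other protein's map before the pairs are handed to the heuristic CMO solver.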

  1. The land morphology approach to flood risk mapping: An application to Portugal.

    PubMed

    Cunha, N S; Magalhães, M R; Domingos, T; Abreu, M M; Küpfer, C

    2017-05-15

In recent decades, the increasing vulnerability of floodplains has been linked to societal changes such as population growth, land use changes, and water use patterns, among other factors. Land morphology directly influences surface water flow, sediment transport, soil genesis, local climate and vegetation distribution. Therefore, land morphology and land use and management directly influence the genesis of flood risk. However, attention is not always given to the underlying geomorphological and ecological processes that influence the dynamics of rivers and their floodplains. Floodplains are considered part of a larger system called the Wet System (WS). The WS includes permanent and temporary streams, water bodies, wetlands and valley bottoms. Valley bottom is a broad concept which encompasses not only floodplains but also flat and concave areas, contiguous to streams, in which the slope is less than 5%. This is addressed through a consistent method based on a land morphology approach that classifies landforms according to their hydrological position in the watershed. The method is based on flat areas (slopes less than 5%), surface curvature and hydrological features. The comparison between the WS and flood risk data from the Portuguese Environmental Agency for the main rivers of mainland Portugal showed that in downstream areas of watersheds, valley bottoms coincide with floodplains modelled by hydrological methods. Mapping the WS is of particular interest for analysing the position and function of river ecosystems in the landscape, from upstream to downstream areas of the watershed. This morphological approach requires less data and is less time-consuming than hydrological methods and can be used for the preliminary delimitation of floodplains and potential flood risk areas where no hydrological data are available. The results were also compared with the land use/cover map at a national level and detailed in the Trancão river basin, located in Lisbon
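The central landform rule, flagging cells with slope below 5% as candidate valley bottoms, can be sketched on a synthetic elevation profile (the DEM, cell size, and valley geometry are invented; the real method also uses curvature and hydrological position):

```python
import numpy as np

# Synthetic valley cross-section: 20% side slopes and a flat 100 m floor.
cell = 10.0                                    # assumed cell size, m
x = np.arange(0, 500, cell)
dem = np.abs(x - 250.0) * 0.2                  # V-shaped valley walls
dem[np.abs(x - 250.0) <= 50.0] = 0.0           # flat valley floor

dz = np.gradient(dem, cell)                    # rise over run per cell
slope_pct = np.abs(dz) * 100.0
valley_bottom = slope_pct < 5.0                # the < 5% slope criterion
print(int(valley_bottom.sum()), "of", len(x), "cells classified as valley bottom")
```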

  2. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    PubMed Central

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-1D satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC
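The WLC step combined with Monte Carlo weight sampling can be sketched as follows (criteria layers, weight means, and standard deviations are all invented; the point is that sampling the weights turns a single susceptibility map into a distribution whose spread measures sensitivity):

```python
import numpy as np

# Weighted linear combination (WLC) with Monte Carlo weight uncertainty:
# draw criterion weights from normal distributions, renormalize, and
# recompute the susceptibility score for every cell.
rng = np.random.default_rng(42)
n_cells, n_draws = 1000, 500

criteria = rng.random((n_cells, 3))          # standardized 0-1 criterion layers
w_mean = np.array([0.5, 0.3, 0.2])           # assumed mean weights
w_sd = np.array([0.05, 0.05, 0.05])          # assumed weight uncertainty

scores = np.empty((n_draws, n_cells))
for k in range(n_draws):
    w = rng.normal(w_mean, w_sd).clip(min=0.0)
    w /= w.sum()                             # renormalize sampled weights
    scores[k] = criteria @ w                 # weighted linear combination

susceptibility = scores.mean(axis=0)         # mean LSM score per cell
uncertainty = scores.std(axis=0)             # per-cell sensitivity to the weights
print(float(susceptibility.min()) >= 0.0, float(susceptibility.max()) <= 1.0)
```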

  3. Mapping a near surface variable geologic regime using an integrated geophysical approach

    SciTech Connect

    Rogers, N.T.; Sandberg, S.K.; Miller, P.; Powell, G.

    1997-10-01

An integrated geophysical approach involving seismic, electromagnetic, and electrical methods was employed to map fluvial, colluvial and bedrock geology, to delineate bedrock channels, and to determine fracture and joint orientations that may influence migration of petroleum hydrocarbons at the Glenrock Oil Seep. Both P (primary)-wave and S (shear)-wave seismic refraction techniques were used to map the bedrock surface topography, bedrock minima, stratigraphic boundaries, and possible structure. S-wave data were preferred because of better vertical resolution due to the combination of slower velocities and lower frequency wave train. Azimuthal resistivity/IP (induced polarization) and azimuthal electromagnetics were used to determine fracture orientations and groundwater flow directions. Terrain conductivity was used to map the fluvial sedimentary sequences (e.g., paleochannel and overbank deposits) in the North Platte River floodplain. Conductivity measurements were also used to estimate bedrock depth and to assist in the placement and recording parameters of the azimuthal soundings. The geophysical investigation indicated that groundwater flow pathways were controlled by the fluvial paleochannels and bedrock erosional features. Primary groundwater flow direction in the bedrock and colluvial sediments was determined from the azimuthal measurements and confirmed by drilling to be N20-40W along the measured strike of the bedrock and joint orientations. Joint/fracture orientations were measured at N20-40W and N10-30E from the azimuthal data and confirmed from measurements at a bedrock outcrop south of the site. The bedrock has an apparent N10E anisotropy in the seismic velocity profiles on the old refinery property that closely matches that of the measured joint/fracture orientations and may have a minor effect on groundwater flow.

  4. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping.

    PubMed

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-03-04

GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster-Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster-Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-1D satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC operation

  5. Dark light Higgs bosons.

    SciTech Connect

    Draper, P.; Liu, T.; Wagner, C. E. M.; Wang, L.-T.; Zhang, H.

    2011-03-24

We study a limit of the nearly Peccei-Quinn-symmetric next-to-minimal supersymmetric standard model possessing novel Higgs and dark matter (DM) properties. In this scenario, there naturally coexist three light singletlike particles: a scalar, a pseudoscalar, and a singlinolike DM candidate, all with masses of order 0.1-10 GeV. The decay of a standard model-like Higgs boson to pairs of the light scalars or pseudoscalars is generically suppressed, avoiding constraints from collider searches for these channels. For a certain parameter window, annihilation into the light pseudoscalar and exchange of the light scalar with nucleons allow the singlino to achieve the correct relic density and a large direct-detection cross section, simultaneously consistent with the regions preferred by the DM direct-detection experiments CoGeNT and DAMA/LIBRA. This parameter space is consistent with experimental constraints from LEP, the Tevatron, Υ, and flavor physics.

  6. Chameleon vector bosons

    SciTech Connect

    Nelson, Ann E.

    2008-05-01

We show that for a force mediated by a vector particle coupled to a conserved U(1) charge, the apparent range and strength can depend on the size and density of the source, and the proximity to other sources. This chameleon effect is due to screening from a light charged scalar. Such screening can weaken astrophysical constraints on new gauge bosons. As an example we consider the constraints on chameleonic gauged B-L. We show that although Casimir measurements greatly constrain any B-L force much stronger than gravity with range longer than 0.1 μm, there remains an experimental window for a long-range chameleonic B-L force. Such a force could be much stronger than gravity, and of long or infinite range in vacuum, but have an effective range near the surface of the earth which is less than a micron.

  7. An automated approach for tone mapping operator parameter adjustment in security applications

    NASA Astrophysics Data System (ADS)

Krasula, Lukáš; Narwaria, Manish; Le Callet, Patrick

    2014-05-01

High Dynamic Range (HDR) imaging has been gaining popularity in recent years. Unlike traditional low dynamic range (LDR) content, HDR content tends to be visually more appealing and realistic as it can represent the dynamic range of the visual stimuli present in the real world. As a result, more scene details can be faithfully reproduced and the visual quality tends to improve. HDR can also be directly exploited for new applications such as video surveillance and other security tasks. Since more scene details are available in HDR, it can help in identifying/tracking visual information which otherwise might be difficult with typical LDR content due to factors such as lack/excess of illumination, extreme contrast in the scene, etc. On the other hand, with HDR, there might be issues related to increased privacy intrusion. To display HDR content on a regular screen, tone-mapping operators (TMO) are used. In this paper, we present a universal method for tuning TMO parameters in order to maintain as much detail as possible, which is desirable in security applications. The method's performance is verified on several TMOs by comparing the outcomes from tone-mapping with default and optimized parameters. The results suggest that the proposed approach preserves more information, which could be of advantage for security surveillance but, on the other hand, raises the possibility of increased privacy intrusion.
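The spirit of such parameter tuning can be illustrated with a toy sweep: vary the exponent of a naive gamma-style tone curve and keep the value maximizing the entropy of the tone-mapped image, with entropy standing in for a detail-preservation criterion (the TMO, metric, and synthetic "HDR image" are all illustrative assumptions, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(0)
hdr = rng.lognormal(mean=0.0, sigma=2.0, size=(64, 64))   # heavy-tailed radiances

def entropy(img, bins=64):
    """Shannon entropy (bits) of the image's intensity histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

best_gamma, best_h = None, -1.0
for gamma in np.linspace(0.05, 1.0, 20):
    ldr = (hdr / hdr.max()) ** gamma                       # naive gamma "TMO"
    h = entropy(ldr)
    if h > best_h:
        best_gamma, best_h = gamma, h

print(round(best_gamma, 2), round(best_h, 2))
```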

  8. A computational approach to map nucleosome positions and alternative chromatin states with base pair resolution

    PubMed Central

    Zhou, Xu; Blocker, Alexander W; Airoldi, Edoardo M; O'Shea, Erin K

    2016-01-01

    Understanding chromatin function requires knowing the precise location of nucleosomes. MNase-seq methods have been widely applied to characterize nucleosome organization in vivo, but generally lack the accuracy to determine precise nucleosome positions. Here we develop a computational approach leveraging digestion variability to determine nucleosome positions at base-pair resolution from MNase-seq data. We generate a variability template as a simple error model for how MNase digestion affects the mapping of individual nucleosomes. Applied to both yeast and human cells, this analysis reveals that alternatively positioned nucleosomes are prevalent and create significant heterogeneity in a cell population. We show that the periodic occurrences of dinucleotide sequences relative to nucleosome dyads can be directly determined from genome-wide nucleosome positions from MNase-seq. Alternatively positioned nucleosomes near transcription start sites likely represent different states of promoter nucleosomes during transcription initiation. Our method can be applied to map nucleosome positions in diverse organisms at base-pair resolution. DOI: http://dx.doi.org/10.7554/eLife.16970.001 PMID:27623011

  9. Pattern selection in extended periodically forced systems: a continuum coupled map approach.

    PubMed

    Venkataramani, S C; Ott, E

    2001-04-01

    We propose that a useful approach to the modeling of periodically forced extended systems is through continuum coupled map (CCM) models. CCM models are discrete-time, continuous-space models, mapping a continuous spatially varying field ξ_n(x) from time n to time n+1. The efficacy of CCM models is illustrated by an application to experiments of Umbanhowar, Melo, and Swinney [Nature 382, 793 (1996)] on vertically vibrated granular layers. Using a simple CCM model incorporating temporal period doubling and spatial patterning at a preferred length scale, we obtain results that bear remarkable similarities to the experimental observations. The fact that the model does not make use of physics specific to granular layers suggests that similar phenomena may be observed in other (nongranular) periodically forced, strongly dissipative systems. We also present a framework for the analysis of pattern selection in CCM models using a truncated modal expansion. Through the analysis, we predict scaling laws of various quantities, and these laws may be verifiable experimentally.
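
    A minimal sketch of one CCM iteration in the spirit described above: a local period-doubling map (here the logistic map) followed by spatial coupling that favours a preferred wavenumber. The logistic map, the Gaussian spectral weight, and the blend coefficient are illustrative choices, not the model of the cited paper.

```python
import numpy as np

def ccm_step(xi, r=3.2, k0=8, width=2.0, eps=0.3):
    """One continuum-coupled-map step on a 1-D periodic domain:
    local discrete-time dynamics, then spatial patterning at |k| = k0."""
    # Local dynamics: the logistic map period-doubles as r increases past 3
    xi_local = r * xi * (1.0 - xi)
    # Spatial coupling: Gaussian spectral weight centred on the preferred wavenumber
    n = xi.size
    k = np.abs(np.fft.fftfreq(n, d=1.0 / n))   # integer wavenumbers on the ring
    weight = np.exp(-((k - k0) ** 2) / (2.0 * width ** 2))
    patterned = np.real(np.fft.ifft(np.fft.fft(xi_local) * weight))
    # Blend local dynamics with the patterned field and keep the state bounded
    return np.clip((1.0 - eps) * xi_local + eps * patterned, 0.0, 1.0)
```

    Iterating `ccm_step` from random initial conditions produces spatial structure near the preferred length scale 2π/k0 while the temporal dynamics inherit the local map's period doubling.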

  10. A network-based phenotype mapping approach to identify genes that modulate drug response phenotypes

    PubMed Central

    Cairns, Junmei; Ung, Choong Yong; da Rocha, Edroaldo Lummertz; Zhang, Cheng; Correia, Cristina; Weinshilboum, Richard; Wang, Liewei; Li, Hu

    2016-01-01

    To better address the problem of drug resistance during cancer chemotherapy and explore the possibility of manipulating drug response phenotypes, we developed a network-based phenotype mapping approach (P-Map) to identify gene candidates that, when perturbed, can alter sensitivity to drugs. We used basal transcriptomics data from a panel of human lymphoblastoid cell lines (LCL) to infer drug response networks (DRNs) that are responsible for conferring response phenotypes for anthracycline and taxane, two common anticancer agents used in the clinic. We further tested selected gene candidates that interact with phenotypic differentially expressed genes (PDEGs), which are up-regulated genes in LCL for a given class of drug response phenotype, in triple-negative breast cancer (TNBC) cells. Our results indicate that it is possible to manipulate a drug response phenotype, from resistant to sensitive or vice versa, by perturbing gene candidates in DRNs, and suggest plausible mechanisms regulating the directionality of drug response sensitivity. More importantly, the current work highlights a new way to formulate systems-based therapeutic design: supplementing therapeutics that aim to target disease culprits with phenotypic modulators capable of altering DRN properties, with the goal of re-sensitizing resistant phenotypes. PMID:27841317

  11. An Approach for Predicting Essential Genes Using Multiple Homology Mapping and Machine Learning Algorithms

    PubMed Central

    Hua, Hong-Li; Zhang, Fa-Zhan; Labena, Abraham Alemayehu; Dong, Chuan; Jin, Yan-Ting

    2016-01-01

    Investigation of essential genes is significant for understanding the minimal gene sets of cells and discovering potential drug targets. In this study, a novel approach based on multiple homology mapping and machine learning was introduced to predict essential genes. We focused on 25 bacteria which have characterized essential genes. The predictions yielded the highest area under the receiver operating characteristic (ROC) curve (AUC) of 0.9716 through a tenfold cross-validation test. Proper features were utilized to construct models to make predictions in distantly related bacteria. The accuracy of predictions was evaluated via the consistency of predictions with known essential genes of target species. The highest AUC of 0.9552 and an average AUC of 0.8314 were achieved when making predictions across organisms. An independent dataset from Synechococcus elongatus, which was released recently, was obtained for further assessment of the performance of our model. The AUC score of these predictions is 0.7855, which is higher than that of other methods. This research shows that features obtained by homology mapping alone can achieve comparable or even better results than integrated features. Meanwhile, the work indicates that machine learning-based methods can assign more effective weight coefficients than empirical formulas based on biological knowledge. PMID:27660763
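
    The AUC values reported above can be computed directly from prediction scores. A minimal sketch of the underlying rank statistic (the Mann-Whitney formulation of AUC; this is a generic evaluation utility, not the authors' pipeline):

```python
import numpy as np

def roc_auc(labels, scores):
    """AUC as the probability that a random positive outscores a random
    negative (Mann-Whitney U formulation; ties count half)."""
    labels = np.asarray(labels, dtype=bool)
    scores = np.asarray(scores, dtype=float)
    pos, neg = scores[labels], scores[~labels]
    # Count pairwise wins of positives over negatives; ties contribute 0.5
    wins = ((pos[:, None] > neg[None, :]).sum()
            + 0.5 * (pos[:, None] == neg[None, :]).sum())
    return wins / (len(pos) * len(neg))
```

    The pairwise form is O(n²) but makes the probabilistic meaning of AUC explicit; rank-based implementations give the same value in O(n log n).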

  12. Mapping the distribution of neuroepithelial bodies of the rat lung. A whole-mount immunohistochemical approach.

    PubMed Central

    Avadhanam, K. P.; Plopper, C. G.; Pinkerton, K. E.

    1997-01-01

    We report an immunohistochemical method for mapping the distribution of neuroepithelial bodies (NEBs) in whole-mount preparations of the intrapulmonary airways. The lungs of 8- and 50-day-old male Sprague-Dawley rats were fixed with ethanol-acetic acid by intratracheal instillation. The major axial airway path of the infracardiac lobe was exposed and isolated by microdissection. NEBs were identified by calcitonin gene-related peptide immunoreactivity and their distribution mapped by generation and branch-point number. A distinct pattern was noted with greater prevalence of NEBs in proximal airway generations compared with more distal airways. No significant difference was noted in the distribution pattern or absolute number of NEBs between neonates and adults when compared by airway generation. NEBs were found more frequently on the ridges of the bifurcation than in other regions of the bifurcating airway wall. The ease of identification of total numbers of NEBs and their specific location by airway generation in whole-mount preparations of the bronchial tree completely removes the necessity of examining multiple sections and performing extensive morphometric procedures. Whole-mount airway preparations allow for the analysis and comparison of larger sample sizes per experimental group without labor-intensive approaches. The application of this method should enhance our knowledge of the role of NEBs in lung development and in response to disease. Images Figure 1 Figure 2 Figure 3 Figure 4 Figure 5 PMID:9060823

  13. An efficient unsupervised index based approach for mapping urban vegetation from IKONOS imagery

    NASA Astrophysics Data System (ADS)

    Anchang, Julius Y.; Ananga, Erick O.; Pu, Ruiliang

    2016-08-01

    Despite the increased availability of high-resolution satellite image data, their operational use for mapping urban land cover in Sub-Saharan Africa continues to be limited by a lack of computational resources and technical expertise. As such, there is a need for simple and efficient image classification techniques. Using Bamenda in North West Cameroon as a test case, we investigated two completely unsupervised pixel-based approaches to extract tree/shrub (TS) and ground vegetation (GV) cover from an IKONOS-derived soil-adjusted vegetation index. These included: (1) a simple Jenks Natural Breaks classification and (2) a two-step technique that combined the Jenks algorithm with agglomerative hierarchical clustering. Both techniques were compared with each other and with a non-linear support vector machine (SVM) for classification performance. While overall classification accuracy was generally high for all techniques (>90%), One-Way Analysis of Variance tests revealed that the two-step technique outperformed the simple Jenks classification in predicting the GV class. It also outperformed the SVM in predicting the TS class. We conclude that the unsupervised methods are technically as good and practically superior for efficient urban vegetation mapping in budget- and technically-constrained regions such as Sub-Saharan Africa.
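
    Jenks Natural Breaks chooses class boundaries that minimize the within-class sum of squared deviations of a 1-D variable (here, vegetation index values). The brute-force sketch below illustrates the criterion on small samples; production implementations use the Fisher-Jenks dynamic programme instead of exhaustive search.

```python
import itertools
import numpy as np

def jenks_breaks(values, n_classes):
    """Exhaustive Jenks-style natural breaks for small 1-D samples:
    pick boundaries minimizing the within-class sum of squared deviations."""
    v = np.sort(np.asarray(values, dtype=float))
    n = v.size

    def sdcm(segment):
        # Sum of squared deviations from the class mean
        return ((segment - segment.mean()) ** 2).sum()

    best_cost, best_breaks = np.inf, None
    # Each cut index marks where a new class starts
    for cut in itertools.combinations(range(1, n), n_classes - 1):
        idx = (0,) + cut + (n,)
        cost = sum(sdcm(v[idx[i]:idx[i + 1]]) for i in range(n_classes))
        if cost < best_cost:
            best_cost, best_breaks = cost, [v[c] for c in cut]
    return best_breaks
```

    The two-step technique in the abstract would follow an initial split like this with agglomerative hierarchical clustering within classes.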

  14. Policy, Research and Residents’ Perspectives on Built Environments Implicated in Heart Disease: A Concept Mapping Approach

    PubMed Central

    Stankov, Ivana; Howard, Natasha J.; Daniel, Mark; Cargo, Margaret

    2017-01-01

    An underrepresentation of stakeholder perspectives within urban health research arguably limits our understanding of what is a multi-dimensional and complex relationship between the built environment and health. By engaging a wide range of stakeholders using a participatory concept mapping approach, this study aimed to achieve a more holistic and nuanced understanding of the built environments shaping disease risk, specifically cardiometabolic risk (CMR). Moreover, this study aimed to ascertain the importance and changeability of identified environments through government action. Through the concept mapping process, community members, researchers, government and non-government stakeholders collectively identified eleven clusters encompassing 102 built environmental domains related to CMR, a number of which are underrepresented within the literature. Among the identified built environments, open space, public transportation and pedestrian environments were highlighted as key targets for policy intervention. Whilst there was substantive convergence in stakeholder groups’ perspectives concerning the built environment and CMR, there were disparities in the level of importance government stakeholders and community members respectively assigned to pedestrian environments and street connectivity. These findings support the role of participatory methods in strengthening how urban health issues are understood and in affording novel insights into points of action for public health and policy intervention. PMID:28208786

  15. A new LPV modeling approach using PCA-based parameter set mapping to design a PSS.

    PubMed

    Jabali, Mohammad B Abolhasani; Kazemi, Mohammad H

    2017-01-01

    This paper presents a new methodology for the modeling and control of power systems based on an uncertain polytopic linear parameter-varying (LPV) approach using parameter set mapping with principal component analysis (PCA). An LPV representation of the power system dynamics is generated by linearization of its differential-algebraic equations about the transient operating points for some given specific faults containing the system nonlinear properties. The time response of the output signal in the transient state plays the role of the scheduling signal that is used to construct the LPV model. A set of sample points of the dynamic response is formed to generate an initial LPV model. PCA-based parameter set mapping is used to reduce the number of models and generate a reduced LPV model. This model is used to design a robust pole placement controller to assign the poles of the power system in a linear matrix inequality (LMI) region, such that the response of the power system has a proper damping ratio for all of the different oscillation modes. The proposed scheme is applied to controller synthesis of a power system stabilizer, and its performance is compared with a tuned standard conventional PSS using nonlinear simulation of a multi-machine power network. The results under various conditions show the robust performance of the proposed controller.
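
    The PCA-based parameter set mapping step amounts to projecting the sampled scheduling parameters onto their leading principal directions. A generic sketch via SVD (the power-system specifics, LPV vertex construction, and LMI synthesis are omitted):

```python
import numpy as np

def pca_reduce(samples, n_components):
    """Project parameter sample points (rows) onto their leading principal
    components, returning reduced coordinates, the basis, and the mean."""
    x = np.asarray(samples, dtype=float)
    mean = x.mean(axis=0)
    xc = x - mean
    # SVD of the centred data: rows of vt are the principal directions
    _, _, vt = np.linalg.svd(xc, full_matrices=False)
    basis = vt[:n_components]
    scores = xc @ basis.T          # coordinates in the reduced parameter space
    return scores, basis, mean

# Parameter trajectories that actually lie on a line in 3-D parameter space
t = np.arange(5, dtype=float)
samples = t[:, None] * np.array([1.0, 2.0, 3.0])
scores, basis, mean = pca_reduce(samples, 1)
```

    Scheduling over `scores` instead of the raw samples is what shrinks the polytopic model set.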

  16. Object-based approach to national land cover mapping using HJ satellite imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Li, Xiaosong; Yuan, Quanzhi; Liu, Yu

    2014-01-01

    To meet the carbon storage estimate in ecosystems for a national carbon strategy, we introduce a consistent database of China land cover. The Chinese Huan Jing (HJ) satellite is proven efficient in the cloud-free acquisition of seasonal image series in a monsoon region and in vegetation identification for mesoscale land cover mapping. Thirty-eight classes of level II land cover are generated based on the Land Cover Classification System of the United Nations Food and Agriculture Organization that follows a standard and quantitative definition. Twenty-four layers of derivative spectral, environmental, and spatial features compose the classification database. Object-based approach characterizing additional nonspectral features is conducted through mapping, and multiscale segmentations are applied on object boundary match to target real-world conditions. This method sufficiently employs spatial information, in addition to spectral characteristics, to improve classification accuracy. The algorithm of hierarchical classification is employed to follow step-by-step procedures that effectively control classification quality. This algorithm divides the dual structures of universal and local trees. Consistent universal trees suitable to most regions are performed first, followed by local trees that depend on specific features of nine climate stratifications. The independent validation indicates the overall accuracy reaches 86%.

  17. Geographical information system approaches for hazard mapping of dilute lahars on Montserrat, West Indies

    NASA Astrophysics Data System (ADS)

    Darnell, A. R.; Barclay, J.; Herd, R. A.; Phillips, J. C.; Lovett, A. A.; Cole, P.

    2012-08-01

    Many research tools for lahar hazard assessment have proved wholly unsuitable for practical application to an active volcanic system where field measurements are challenging to obtain. Two simple routing models, with minimal data demands and implemented in a geographical information system (GIS), were applied to dilute lahars originating from Soufrière Hills Volcano, Montserrat. Single-direction flow routing by path of steepest descent, commonly used for simulating normal stream-flow, was tested against LAHARZ, an established lahar model calibrated for debris flows, for the ability to replicate the main flow routes. Comparing the ways in which these models capture observed changes, and how the different modelled paths deviate, can also indicate where dilute lahars do not follow the behaviour expected from single-phase flow models. Data were collected over two field seasons and provide (1) an overview of gross morphological change after one rainy season, (2) details of dominant channels at the time of measurement, and (3) order-of-magnitude estimates of individual flow volumes. Modelling results suggested both GIS-based predictive tools had associated benefits. Dominant flow routes observed in the field were generally well predicted using the hydrological approach with a consideration of elevation error, while LAHARZ was comparatively more successful at mapping lahar dispersion and was better suited to long-term hazard assessment. This research suggests that end-member models can have utility for first-order dilute lahar hazard mapping.
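
    Single-direction routing by path of steepest descent is conventionally implemented as D8 routing on a gridded DEM: from each cell, flow moves to whichever of the eight neighbours gives the largest distance-weighted drop. A minimal sketch (grid handling and pit treatment simplified for illustration):

```python
import numpy as np

def d8_route(dem, start):
    """Trace a single-direction (D8) flow path from `start`, stepping to the
    steepest-descending of the 8 neighbours until a pit or the grid edge."""
    rows, cols = dem.shape
    path = [start]
    r, c = start
    while True:
        best_drop, best_cell = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if not (0 <= nr < rows and 0 <= nc < cols):
                    continue
                # Diagonal neighbours are sqrt(2) cells away, so weight the drop
                drop = (dem[r, c] - dem[nr, nc]) / np.hypot(dr, dc)
                if drop > best_drop:
                    best_drop, best_cell = drop, (nr, nc)
        if best_cell is None:          # pit: no strictly lower neighbour
            return path
        r, c = best_cell
        path.append(best_cell)
        if r in (0, rows - 1) or c in (0, cols - 1):   # reached the grid edge
            return path
```

    On a real DEM, elevation error would be handled by perturbing the surface and aggregating the resulting path ensemble, as the abstract's "consideration of elevation error" suggests.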

  18. Mapping the progress and impacts of public health approaches to palliative care: a scoping review protocol

    PubMed Central

    Archibald, Daryll; Patterson, Rebecca; Haraldsdottir, Erna; Hazelwood, Mark; Fife, Shirley; Murray, Scott A

    2016-01-01

    Introduction Public health palliative care is a term that can be used to encompass a variety of approaches that involve working with communities to improve people's experience of death, dying and bereavement. Recently, public health palliative care approaches have gained recognition and momentum within UK health policy and palliative care services. There is general consensus that public health palliative care approaches can complement and go beyond the scope of formal service models of palliative care. However, there is no clarity about how these approaches can be undertaken in practice or how evidence can be gathered relating to their effectiveness. Here we outline a scoping review protocol that will systematically map and categorise the variety of activities and programmes that could be classified under the umbrella term ‘public health palliative care’ and highlight the impact of these activities where measured. Methods and analysis This review will be guided by Arksey and O'Malley's scoping review methodology and incorporate insights from more recent innovations in scoping review methodology. Sensitive searches of 9 electronic databases from 1999 to 2016 will be supplemented by grey literature searches. Eligible studies will be screened independently by two reviewers using a data charting tool developed for this scoping review. Ethics and dissemination This scoping review will undertake a secondary analysis of data already collected and does not require ethical approval. The results will facilitate better understanding of the practical application of public health approaches to palliative care, the impacts these activities can have and how to build the evidence base for this work in future. The results will be disseminated through traditional academic routes such as conferences and journals and also policy and third sector seminars. PMID:27417201

  19. Mind-mapping for lung cancer: Towards a personalized therapeutics approach

    PubMed Central

    Mollberg, N; Surati, M; Demchuk, C; Fathi, R; Salama, AK; Husain, AN; Hensing, T; Salgia, R

    2011-01-01

    There will be over 220,000 people diagnosed with lung cancer and over 160,000 dying of lung cancer this year alone in the United States. In order to arrive at better control, prevention, diagnosis, and therapeutics for lung cancer, we must be able to personalize the approach towards lung cancer. Mind-mapping has existed for centuries for physicians to properly think about various “flows” of personalized medicine. We include here the epidemiology, diagnosis, histology, and treatment of lung cancer—specifically, non-small cell lung cancer. As we have new molecular signatures for lung cancer, this is further detailed. This review is not meant to be a comprehensive review, but rather its purpose is to highlight important aspects of lung cancer diagnosis, management, and personalized treatment options. PMID:21337123

  20. Contemporary approaches to neural circuit manipulation and mapping: focus on reward and addiction

    PubMed Central

    Saunders, Benjamin T.; Richard, Jocelyn M.; Janak, Patricia H.

    2015-01-01

    Tying complex psychological processes to precisely defined neural circuits is a major goal of systems and behavioural neuroscience. This is critical for understanding adaptive behaviour, and also how neural systems are altered in states of psychopathology, such as addiction. Efforts to relate psychological processes relevant to addiction to activity within defined neural circuits have been complicated by neural heterogeneity. Recent advances in technology allow for manipulation and mapping of genetically and anatomically defined neurons, which when used in concert with sophisticated behavioural models, have the potential to provide great insight into neural circuit bases of behaviour. Here we discuss contemporary approaches for understanding reward and addiction, with a focus on midbrain dopamine and cortico-striato-pallidal circuits. PMID:26240425

  1. Mind-mapping for lung cancer: towards a personalized therapeutics approach.

    PubMed

    Mollberg, N; Surati, M; Demchuk, C; Fathi, R; Salama, A K; Husain, A N; Hensing, T; Salgia, R

    2011-03-01

    There were over 220,000 people diagnosed with lung cancer and over 160,000 people dying of lung cancer during 2010 alone in the United States. In order to arrive at better control, prevention, diagnosis, and therapeutics for lung cancer, we must be able to personalize the approach towards the disease. Mind-mapping has existed for centuries for physicians to properly think about various "flows" of personalized medicine. We include here the epidemiology, diagnosis, histology, and treatment of lung cancer-in particular, non-small cell lung cancer. As we have new molecular signatures for lung cancer, this is further detailed. This review is not meant to be a comprehensive review, but rather its purpose is to highlight important aspects of lung cancer diagnosis, management, and personalized treatment options.

  2. Segmentation of angiodysplasia lesions in WCE images using a MAP approach with Markov Random Fields.

    PubMed

    Vieira, Pedro M; Goncalves, Bruno; Goncalves, Carla R; Lima, Carlos S

    2016-08-01

    This paper deals with the segmentation of angiodysplasias in wireless capsule endoscopy images. These lesions are the cause of almost 10% of all gastrointestinal bleeding episodes, and their detection using the available software suffers from low sensitivity. This work proposes automatic selection of a ROI using an image segmentation module based on the MAP approach, where an accelerated version of the EM algorithm is used to iteratively estimate the model parameters. Spatial context is modeled in the prior probability density function using Markov Random Fields. The color space used was CIELab, especially the a component, which best highlights these types of lesions. The proposed method is the first to address this specific type of lesion and, when compared to other state-of-the-art segmentation methods, almost doubles their performance.
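
    At the core of such a MAP segmentation is an EM fit of a mixture model to the relevant channel (here, lesion vs. background intensities in the a component). The sketch below is a plain two-component 1-D Gaussian EM; it deliberately omits the paper's MRF spatial prior and the accelerated EM variant.

```python
import numpy as np

def em_two_gaussians(x, n_iter=100):
    """EM for a two-component 1-D Gaussian mixture; returns means, stds, weights."""
    x = np.asarray(x, dtype=float)
    # Initialise the two components from the data quartiles
    mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])
    sigma = np.array([x.std(), x.std()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each sample
        pdf = (w / (sigma * np.sqrt(2 * np.pi)) *
               np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
        resp = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture parameters from the responsibilities
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
        w = nk / x.size
    return mu, sigma, w
```

    With an MRF prior, the E-step responsibilities would additionally be smoothed by neighbouring pixels' labels before the M-step.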

  3. A universal airborne LiDAR approach for tropical forest carbon mapping.

    PubMed

    Asner, Gregory P; Mascaro, Joseph; Muller-Landau, Helene C; Vieilledent, Ghislain; Vaudry, Romuald; Rasamoelina, Maminiaina; Hall, Jefferson S; van Breugel, Michiel

    2012-04-01

    Airborne light detection and ranging (LiDAR) is fast turning the corner from demonstration technology to a key tool for assessing carbon stocks in tropical forests. With its ability to penetrate tropical forest canopies and detect three-dimensional forest structure, LiDAR may prove to be a major component of international strategies to measure and account for carbon emissions from and uptake by tropical forests. To date, however, basic ecological information such as height-diameter allometry and stand-level wood density have not been mechanistically incorporated into methods for mapping forest carbon at regional and global scales. A better incorporation of these structural patterns in forests may reduce the considerable time needed to calibrate airborne data with ground-based forest inventory plots, which presently necessitate exhaustive measurements of tree diameters and heights, as well as tree identifications for wood density estimation. Here, we develop a new approach that can facilitate rapid LiDAR calibration with minimal field data. Throughout four tropical regions (Panama, Peru, Madagascar, and Hawaii), we were able to predict aboveground carbon density estimated in field inventory plots using a single universal LiDAR model (r² = 0.80, RMSE = 27.6 Mg C ha(-1)). This model is comparable in predictive power to locally calibrated models, but relies on limited inputs of basal area and wood density information for a given region, rather than on traditional plot inventories. With this approach, we propose to radically decrease the time required to calibrate airborne LiDAR data and thus increase the output of high-resolution carbon maps, supporting tropical forest conservation and climate mitigation policy.
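
    Models of this family are commonly fit as a power law in canopy height, basal area, and wood density, estimated by least squares in log space. The functional form and coefficients below are assumptions for illustration, not the paper's published model.

```python
import numpy as np

def fit_power_law(tch, ba, wd, acd):
    """Fit ACD = a * TCH^b1 * BA^b2 * WD^b3 by linear least squares on logs.
    tch: top-of-canopy height, ba: basal area, wd: wood density, acd: carbon density."""
    X = np.column_stack([np.ones_like(tch), np.log(tch), np.log(ba), np.log(wd)])
    coef, *_ = np.linalg.lstsq(X, np.log(acd), rcond=None)
    return np.exp(coef[0]), coef[1], coef[2], coef[3]

# Synthetic plots generated from known coefficients, to show exact recovery
rng = np.random.default_rng(3)
tch = rng.uniform(5, 40, 50)
ba = rng.uniform(10, 60, 50)
wd = rng.uniform(0.4, 0.9, 50)
acd = 2.0 * tch**1.5 * ba**0.8 * wd**1.0
a, b1, b2, b3 = fit_power_law(tch, ba, wd, acd)
```

    With noisy field data, a back-transformation bias correction is usually applied when converting the log-space fit back to carbon density.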

  4. Searching for new heavy neutral gauge bosons using vector boson fusion processes at the LHC

    NASA Astrophysics Data System (ADS)

    Flórez, Andrés; Gurrola, Alfredo; Johns, Will; Oh, Young Do; Sheldon, Paul; Teague, Dylan; Weiler, Thomas

    2017-04-01

    New massive resonances are predicted in many extensions to the Standard Model (SM) of particle physics and constitute one of the most promising searches for new physics at the LHC. We present a feasibility study to search for new heavy neutral gauge bosons using vector boson fusion (VBF) processes, which become especially important as the LHC probes higher collision energies. In particular, we consider the possibility that the discovery of a Z′ boson may have eluded searches at the LHC. The coupling of the Z′ boson to the SM quarks can be small, and thus the Z′ would not be discoverable by the searches conducted thus far. In the context of a simplified phenomenological approach, we consider the Z′ → ττ and Z′ → μμ decay modes to show that the requirement of a dilepton pair combined with two high pT forward jets with large separation in pseudorapidity and with large dijet mass is effective in reducing SM backgrounds. The expected exclusion bounds (at 95% confidence level) are m(Z′) < 1.8 TeV and m(Z′) < 2.5 TeV in the ττjfjf and μμjfjf channels, respectively, assuming 1000 fb-1 of 13 TeV data from the LHC. The use of the VBF topology to search for massive neutral gauge bosons provides a discovery reach with expected significances greater than 5σ (3σ) for Z′ masses up to 1.4 (1.6) TeV and 2.0 (2.2) TeV in the ττjfjf and μμjfjf channels.

  5. An Improved Map-Matching Technique Based on the Fréchet Distance Approach for Pedestrian Navigation Services

    PubMed Central

    Bang, Yoonsik; Kim, Jiyoung; Yu, Kiyun

    2016-01-01

    Wearable and smartphone technology innovations have propelled the growth of Pedestrian Navigation Services (PNS). PNS need a map-matching process to project a user’s locations onto maps. Many map-matching techniques have been developed for vehicle navigation services. These techniques are inappropriate for PNS because pedestrians move, stop, and turn in different ways compared to vehicles. In addition, the base map data for pedestrians are more complicated than for vehicles. This article proposes a new map-matching method for locating Global Positioning System (GPS) trajectories of pedestrians onto road network datasets. The theory underlying this approach is based on the Fréchet distance, one of the measures of geometric similarity between two curves. The Fréchet distance approach can provide reasonable matching results because two linear trajectories are parameterized with the time variable. Then we improved the method to be adaptive to the positional error of the GPS signal. We used an adaptation coefficient to adjust the search range for every input signal, based on the assumption of auto-correlation between consecutive GPS points. To reduce errors in matching, the reliability index was evaluated in real time for each match. To test the proposed map-matching method, we applied it to GPS trajectories of pedestrians and the road network data. We then assessed the performance by comparing the results with reference datasets. Our proposed method performed better with test data when compared to a conventional map-matching technique for vehicles. PMID:27782091
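
    The Fréchet distance underlying the method above is usually computed in its discrete form by dynamic programming over the two point sequences. A self-contained sketch (the paper's adaptive search range and reliability index are separate refinements not shown here):

```python
import math
from functools import lru_cache

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between polylines P and Q (lists of (x, y)
    points), via the standard coupling dynamic programme."""
    def d(i, j):
        return math.dist(P[i], Q[j])

    @lru_cache(maxsize=None)
    def c(i, j):
        # c(i, j): smallest "leash" needed to couple P[0..i] with Q[0..j]
        if i == 0 and j == 0:
            return d(0, 0)
        if i == 0:
            return max(c(0, j - 1), d(0, j))
        if j == 0:
            return max(c(i - 1, 0), d(i, 0))
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d(i, j))

    return c(len(P) - 1, len(Q) - 1)
```

    For map matching, a GPS trajectory segment would play the role of P and each candidate road polyline the role of Q, with the smallest distance selecting the matched edge.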

  6. Soil mapping in northern Thailand based on a radiometrically calibrated Maximum likelihood approach

    NASA Astrophysics Data System (ADS)

    Schuler, U.; Herrmann, L.; Rangnugpit, W.; Stahr, K.

    2009-04-01

    area with low background radiation showed different gamma-ray spectra for the respective reference soil groups, so that these points can be used as secondary training data. In conclusion, the calibration of the Maximum likelihood approach with airborne radiometric data offers a promising possibility for efficient soil mapping of larger regions in northern Thailand.

  7. Labelling plants the Chernobyl way: A new approach for mapping rhizodeposition and biopore reuse

    NASA Astrophysics Data System (ADS)

    Banfield, Callum; Kuzyakov, Yakov

    2016-04-01

    A novel approach for mapping root distribution and rhizodeposition using 137Cs and 14C was applied. By immersing cut leaves into vials containing 137CsCl solution, the 137Cs label is taken up and partly released into the rhizosphere, where it strongly binds to soil particles, thus labelling the distribution of root channels in the long term. Reuse of root channels in crop rotations can be determined by labelling the first crop with 137Cs and the following crop with 14C. Imaging of the β- radiation with strongly differing energies differentiates active roots growing in existing root channels (14C + 137Cs activity) from roots growing in bulk soil (14C activity only). The feasibility of the approach was shown in a pot experiment with ten plants of two species, Cichorium intybus L., and Medicago sativa L. The same plants were each labelled with 100 kBq of 137CsCl and after one week with 500 kBq of 14CO2. 96 h later pots were cut horizontally at 6 cm depth. After the first 137Cs + 14C imaging of the cut surface, imaging was repeated with three layers of plastic film between the cut surface and the plate for complete shielding of 14C β- radiation to the background level, producing an image of the 137Cs distribution. Subtracting the second image from the first gave the 14C image. Both species allocated 18 - 22% of the 137Cs and about 30 - 40% of 14C activity below ground. Intensities far above the detection limit suggest that this approach is applicable to map the root system by 137Cs and to obtain root size distributions through image processing. The rhizosphere boundary was defined by the point at which rhizodeposited 14C activity declined to 5% of the activity of the root centre. Medicago showed 25% smaller rhizosphere extension than Cichorium, demonstrating that plant-specific rhizodeposition patterns can be distinguished. 
Our new approach is appropriate to visualise processes and hotspots on multiple scales: Heterogeneous rhizodeposition, as well as size and counts

  8. An Improved Approach for Mapping Quantitative Trait loci in a Pseudo-Testcross: Revisiting a Poplar Mapping Study

    SciTech Connect

    Wullschleger, Stan D; Wu, Song; Wu, Rongling; Yang, Jie; Li, Yao; Yin, Tongming; Tuskan, Gerald A

    2010-01-01

    A pseudo-testcross pedigree is widely used for mapping quantitative trait loci (QTL) in outcrossing species, but the model for analyzing pseudo-testcross data, borrowed from the inbred backcross design, can detect only those QTLs that are heterozygous in one parent. In this study, an intercross model that incorporates the high heterozygosity and phase uncertainty of outcrossing species was used to reanalyze a published data set on QTL mapping in poplar trees. Several intercross QTLs that are heterozygous in both parents were detected, which are responsible not only for biomass traits, but also for their genetic correlations. This study provides a more complete identification of QTLs responsible for economically important biomass traits in poplars.

  9. An Improved Approach for Mapping Quantitative Trait Loci in a Pseudo-Testcross: Revisiting a Poplar Mapping Study

    SciTech Connect

    Tuskan, Gerald A; Yin, Tongming; Wullschleger, Stan D; Yang, Jie; Huang, Youjun; Li, Yao; Wu, Rongling

    2010-01-01

    A pseudo-testcross pedigree is widely used for mapping quantitative trait loci (QTL) in outcrossing species, but the model for analyzing pseudo-testcross data, borrowed from the inbred backcross design, can detect only those QTLs that are heterozygous in one parent. In this study, an intercross model that incorporates the high heterozygosity and phase uncertainty of outcrossing species was used to reanalyze a published data set on QTL mapping in poplar trees. Several intercross QTLs that are heterozygous in both parents were detected, which are responsible not only for biomass traits, but also for their genetic correlations. This study provides a more complete identification of QTLs responsible for economically important biomass traits in poplars.

  10. An improved approach for mapping quantitative trait Loci in a pseudo-testcross: revisiting a poplar mapping study.

    PubMed

    Wu, Song; Yang, Jie; Huang, Youjun; Li, Yao; Yin, Tongming; Wullschleger, Stan D; Tuskan, Gerald A; Wu, Rongling

    2010-02-04

A pseudo-testcross pedigree is widely used for mapping quantitative trait loci (QTL) in outcrossing species, but the model for analyzing pseudo-testcross data, borrowed from the inbred backcross design, can detect only those QTLs that are heterozygous in one parent. In this study, an intercross model that incorporates the high heterozygosity and phase uncertainty of outcrossing species was used to reanalyze a published data set on QTL mapping in poplar trees. Several intercross QTLs that are heterozygous in both parents were detected, which are responsible not only for biomass traits but also for their genetic correlations. This study provides a more complete identification of QTLs responsible for economically important biomass traits in poplars.

  11. Towards quantum supremacy with lossy scattershot boson sampling

    NASA Astrophysics Data System (ADS)

    Latmiral, Ludovico; Spagnolo, Nicolò; Sciarrino, Fabio

    2016-11-01

Boson sampling represents a promising approach to obtaining evidence of the supremacy of quantum systems as a resource for the solution of computational problems. The classical hardness of boson sampling has been related to the so-called Permanent-of-Gaussians Conjecture and has been extended to some generalizations, such as scattershot boson sampling and approximate and lossy sampling under some reasonable constraints. However, it is still unclear how demanding these techniques are for a quantum experimental sampler. Starting from a state-of-the-art analysis and taking into account the foreseeable practical limitations, we evaluate and discuss the bound for quantum supremacy for different recently proposed approaches, according to today's best-known classical simulators.
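
    The hardness argument above runs through matrix permanents (the Permanent-of-Gaussians Conjecture). As background, and not taken from this paper, a minimal sketch of Ryser's inclusion-exclusion formula, the standard exact classical algorithm, illustrates why permanents are expensive: even the best-known exact method is exponential in the matrix size.

    ```python
    from itertools import combinations

    def permanent(M):
        """Permanent of a square matrix via Ryser's formula:
        O(2^n * n^2) time, versus O(n * n!) for naive expansion."""
        n = len(M)
        total = 0.0
        # Inclusion-exclusion over all non-empty column subsets S.
        for r in range(1, n + 1):
            for cols in combinations(range(n), r):
                prod = 1.0
                for i in range(n):
                    prod *= sum(M[i][j] for j in cols)
                total += (-1) ** (n - r) * prod
        return total

    # Example: perm([[1, 2], [3, 4]]) = 1*4 + 2*3 = 10
    print(permanent([[1.0, 2.0], [3.0, 4.0]]))  # -> 10.0
    ```

    Unlike the determinant, no row-reduction shortcut exists; this exponential cost is exactly what boson sampling experiments aim to outpace.
    
    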

  12. Chiral Bosonization of Superconformal Ghosts

    NASA Technical Reports Server (NTRS)

    Shi, Deheng; Shen, Yang; Liu, Jinling; Xiong, Yongjian

    1996-01-01

    We explain the difference of the Hilbert space of the superconformal ghosts (beta,gamma) system from that of its bosonized fields phi and chi. We calculate the chiral correlation functions of phi, chi fields by inserting appropriate projectors.

  13. What is a Higgs Boson?

    SciTech Connect

    Lincoln, Don

    2011-07-07

    Fermilab scientist Don Lincoln describes the nature of the Higgs boson. Several large experimental groups are hot on the trail of this elusive subatomic particle which is thought to explain the origins of particle mass.

  14. What is a Higgs Boson?

    ScienceCinema

    Lincoln, Don

    2016-07-12

    Fermilab scientist Don Lincoln describes the nature of the Higgs boson. Several large experimental groups are hot on the trail of this elusive subatomic particle which is thought to explain the origins of particle mass.

  15. A stochastic simulation framework for the prediction of strategic noise mapping and occupational noise exposure using the random walk approach.

    PubMed

    Han, Lim Ming; Haron, Zaiton; Yahya, Khairulzan; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri

    2015-01-01

Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels, and it assesses the impact of noise on the workers and the surrounding environment. For validation, three case studies were conducted to check the accuracy of the predictions and to determine the efficiency and effectiveness of this approach. The results showed high prediction accuracy, with most absolute differences below 2 dBA, and the predicted noise doses mostly fell within the measured range. The random walk approach was therefore effective in dealing with environmental noise and can predict strategic noise mapping to facilitate noise monitoring and noise control in the workplace.
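
    The abstract does not disclose the internals of RW-eNMS. As a hedged illustration of the general idea only, a random walker sampling a noise field built from point sources and accumulating an equivalent exposure level, here is a toy sketch; the source positions, reference levels, and the 6 dB-per-doubling distance decay are all invented for the example.

    ```python
    import math
    import random

    def noise_level_dBA(pos, sources):
        """Combine point sources at a position: each source's level falls
        6 dB per doubling of distance; levels sum on an energy basis."""
        energy = 0.0
        for sx, sy, L_ref in sources:            # L_ref: dBA at 1 m
            d = max(1.0, math.hypot(pos[0] - sx, pos[1] - sy))
            energy += 10 ** ((L_ref - 20 * math.log10(d)) / 10)
        return 10 * math.log10(energy)

    def simulate_dose(sources, steps=1000, seed=0):
        """One random walk through the field; return the walker's
        equivalent continuous level (Leq) in dBA over the walk."""
        rng = random.Random(seed)
        x = y = 0.0
        acc = 0.0
        for _ in range(steps):
            x += rng.uniform(-1, 1)
            y += rng.uniform(-1, 1)
            acc += 10 ** (noise_level_dBA((x, y), sources) / 10)
        return 10 * math.log10(acc / steps)

    sources = [(5.0, 0.0, 95.0), (-10.0, 10.0, 88.0)]  # hypothetical machines
    print(round(simulate_dose(sources), 1), "dBA Leq")
    ```

    Averaging many such walks (with different seeds) would give the distribution of worker doses that a framework like the one described could map.
    
    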

  16. Interhemispheric transfalcine approach and awake cortical mapping for resection of peri-atrial gliomas associated with the central lobule.

    PubMed

    Malekpour, Mahdi; Cohen-Gadol, Aaron A

    2015-02-01

    Medial posterior frontal and parietal gliomas extending to the peri-atrial region are difficult to reach surgically because of the working angle required to expose the lateral aspect of the tumor and the proximity of the tumor to the sensorimotor lobule; retraction of the sensorimotor cortex may lead to morbidity. The interhemispheric transfalcine approach is favorable and safe for resection of medial hemispheric tumors adjacent to the falx cerebri, but the literature on this approach is scarce. Awake cortical mapping using this operative route for tumors associated with the sensorimotor cortex has not been previously reported to our knowledge. We present the first case of a right medial posterior frontoparietal oligoastrocytoma that was resected through the interhemispheric transfalcine approach using awake cortical and subcortical mapping. Through a contralateral frontoparietal craniotomy, we excised a section of the falx and exposed the contralateral medial hemisphere. Cortical stimulation allowed localization of the supplementary motor cortex, and suprathreshold stimulation mapping excluded the primary motor cortex corresponding to the leg area. Gross total tumor resection was accomplished without any intraoperative or postoperative deficits. Awake cortical mapping using the contralateral transfalcine approach allows a "cross-court" operative route to map functional cortices and resect peri-atrial low-grade gliomas. This technique can minimize the otherwise necessary retraction on the ipsilateral hemisphere through an ipsilateral craniotomy.

  17. A new approach of mapping soils in the Alps - Challenges of deriving soil information and creating soil maps for sustainable land use. An example from South Tyrol (Italy)

    NASA Astrophysics Data System (ADS)

    Baruck, Jasmin; Gruber, Fabian E.; Geitner, Clemens

    2015-04-01

Nowadays sustainable land use management is gaining importance because intensive land use leads to increasing soil degradation. Sustainable land use management is especially important in mountainous regions like the Alps, where topography limits land use, and a database containing detailed information on soil characteristics is therefore required. However, information on soil properties is far from comprehensive. The project "ReBo - Terrain classification based on airborne laser scanning data to support soil mapping in the Alps", funded by the Autonomous Province of Bolzano, aims at developing a methodical framework for obtaining soil data. The approach combines geomorphometric analysis and soil mapping to generate modern medium-scale soil maps in a time- and cost-efficient way. In this study the open source GRASS GIS extension module r.geomorphon (Jasciewicz and Stepinski, 2013) is used to derive topographically homogeneous landform units from high-resolution DTMs at the 1:5.000 scale. For terrain segmentation and classification we additionally use medium-scale data sets (geology, parent material, land use, etc.). As the Alps are characterized by great variety in topography, parent material, and moisture regimes, obtaining reliable soil data is difficult. Additionally, geomorphic activity (debris flows, landslides, etc.) leads to natural disturbances. Thus, soil properties are highly diverse and largely scale dependent. Obtaining soil information on anthropogenically influenced soils is an added challenge: owing to intensive cultivation techniques, the natural link between the soil-forming factors is often severed. South Tyrol hosts the largest pome-producing area in Europe. Normally, the annual precipitation is not enough for intensive orcharding, so irrigation strategies are in use. However, as knowledge about the small-scale, heterogeneous soil properties is mostly lacking, overwatering and modifications of the

  18. Seeing the whole picture: A comprehensive imaging approach to functional mapping of circuits in behaving zebrafish.

    PubMed

    Feierstein, C E; Portugues, R; Orger, M B

    2015-06-18

    In recent years, the zebrafish has emerged as an appealing model system to tackle questions relating to the neural circuit basis of behavior. This can be attributed not just to the growing use of genetically tractable model organisms, but also in large part to the rapid advances in optical techniques for neuroscience, which are ideally suited for application to the small, transparent brain of the larval fish. Many characteristic features of vertebrate brains, from gross anatomy down to particular circuit motifs and cell-types, as well as conserved behaviors, can be found in zebrafish even just a few days post fertilization, and, at this early stage, the physical size of the brain makes it possible to analyze neural activity in a comprehensive fashion. In a recent study, we used a systematic and unbiased imaging method to record the pattern of activity dynamics throughout the whole brain of larval zebrafish during a simple visual behavior, the optokinetic response (OKR). This approach revealed the broadly distributed network of neurons that were active during the behavior and provided insights into the fine-scale functional architecture in the brain, inter-individual variability, and the spatial distribution of behaviorally relevant signals. Combined with mapping anatomical and functional connectivity, targeted electrophysiological recordings, and genetic labeling of specific populations, this comprehensive approach in zebrafish provides an unparalleled opportunity to study complete circuits in a behaving vertebrate animal.

  19. Wide-field optical mapping of neural activity and brain haemodynamics: considerations and novel approaches

    PubMed Central

    Ma, Ying; Shaik, Mohammed A.; Kozberg, Mariel G.; Thibodeaux, David N.; Zhao, Hanzhi T.; Yu, Hang

    2016-01-01

    Although modern techniques such as two-photon microscopy can now provide cellular-level three-dimensional imaging of the intact living brain, the speed and fields of view of these techniques remain limited. Conversely, two-dimensional wide-field optical mapping (WFOM), a simpler technique that uses a camera to observe large areas of the exposed cortex under visible light, can detect changes in both neural activity and haemodynamics at very high speeds. Although WFOM may not provide single-neuron or capillary-level resolution, it is an attractive and accessible approach to imaging large areas of the brain in awake, behaving mammals at speeds fast enough to observe widespread neural firing events, as well as their dynamic coupling to haemodynamics. Although such wide-field optical imaging techniques have a long history, the advent of genetically encoded fluorophores that can report neural activity with high sensitivity, as well as modern technologies such as light emitting diodes and sensitive and high-speed digital cameras have driven renewed interest in WFOM. To facilitate the wider adoption and standardization of WFOM approaches for neuroscience and neurovascular coupling research, we provide here an overview of the basic principles of WFOM, considerations for implementation of wide-field fluorescence imaging of neural activity, spectroscopic analysis and interpretation of results. This article is part of the themed issue ‘Interpreting BOLD: a dialogue between cognitive and cellular neuroscience’. PMID:27574312

  20. The influence of mapped hazards on risk beliefs: a proximity-based modeling approach.

    PubMed

    Severtson, Dolores J; Burt, James E

    2012-02-01

    Interview findings suggest perceived proximity to mapped hazards influences risk beliefs when people view environmental hazard maps. For dot maps, four attributes of mapped hazards influenced beliefs: hazard value, proximity, prevalence, and dot patterns. In order to quantify the collective influence of these attributes for viewers' perceived or actual map locations, we present a model to estimate proximity-based hazard or risk (PBH) and share study results that indicate how modeled PBH and map attributes influenced risk beliefs. The randomized survey study among 447 university students assessed risk beliefs for 24 dot maps that systematically varied by the four attributes. Maps depicted water test results for a fictitious hazardous substance in private residential wells and included a designated "you live here" location. Of the nine variables that assessed risk beliefs, the numerical susceptibility variable was most consistently and strongly related to map attributes and PBH. Hazard value, location in or out of a clustered dot pattern, and distance had the largest effects on susceptibility. Sometimes, hazard value interacted with other attributes, for example, distance had stronger effects on susceptibility for larger than smaller hazard values. For all combined maps, PBH explained about the same amount of variance in susceptibility as did attributes. Modeled PBH may have utility for studying the influence of proximity to mapped hazards on risk beliefs, protective behavior, and other dependent variables. Further work is needed to examine these influences for more realistic maps and representative study samples.

  1. Reliable Radiation Hybrid Maps: An Efficient Scalable Clustering-based Approach

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The process of mapping markers from radiation hybrid mapping (RHM) experiments is equivalent to the traveling salesman problem and, thereby, has combinatorial complexity. As an additional problem, experiments typically result in some unreliable markers that reduce the overall quality of the map. We ...

  2. Sequencing the Pig Genome Using a Mapped BAC by BAC Approach

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We have generated a highly contiguous physical map covering >98% of the pig genome in just 176 contigs. The map is localised to the genome through integration with the UIUC RH map as well BAC end sequence alignments to the human genome. Over 265k HindIII restriction digest fingerprints totalling 1...

  3. Towards a virtual hub approach for landscape assessment and multimedia ecomuseum using multitemporal-maps

    NASA Astrophysics Data System (ADS)

    Brumana, R.; Santana Quintero, M.; Barazzetti, L.; Previtali, M.; Banfi, F.; Oreni, D.; Roels, D.; Roncoroni, F.

    2015-08-01

Landscapes are dynamic entities, stretching and transforming across space and time, and need to be safeguarded as living places for the future, with interaction of human, social and economic dimensions. A comprehensive landscape evaluation requires several open data sources, each characterized by its own protocol and service interface, which limits or impedes interoperability and integration. Indeed, nowadays the development of websites targeted at landscape assessment and touristic purposes requires many resources in terms of time, cost and IT skills, so these applications are limited to a few cases, mainly focusing on world-famous touristic sites. The capability to spread the development of web-based multimedia virtual museums based on geospatial data will rely on the possibility of discovering the needed geospatial data through a single point of access in a homogeneous way. The innovative approach proposed in this paper may facilitate access to open data in a homogeneous way by means of specific components (the brokers) performing the interoperability actions required to interconnect heterogeneous data sources. In the specific case study analysed here, an interface has been implemented to migrate a geo-swat chart based on local and regional geographic information into a user-friendly Google Earth©-based infrastructure, integrating ancient cadastres and modern cartography, accessible by professionals and tourists via the web and also via portable devices like tablets and smartphones. The general aim of this work on the case study of the Lake of Como (Tremezzina municipality) is to boost the integration of assessment methodologies with digital geo-based technologies of map correlation for the multimedia ecomuseum system accessible via the web. The developed WebGIS system integrates multi-scale and multi-temporal maps with different information (cultural, historical, landscape levels

  4. Mapping behavioral landscapes for animal movement: a finite mixture modeling approach

    USGS Publications Warehouse

    Tracey, Jeff A.; Zhu, Jun; Boydston, Erin E.; Lyren, Lisa M.; Fisher, Robert N.; Crooks, Kevin R.

    2013-01-01

Because of its role in many ecological processes, movement of animals in response to landscape features is an important subject in ecology and conservation biology. In this paper, we develop models of animal movement in relation to objects or fields in a landscape. We take a finite mixture modeling approach in which the component densities are conceptually related to different choices for movement in response to a landscape feature, and the mixing proportions are related to the probability of selecting each response as a function of one or more covariates. We combine particle swarm optimization and an Expectation-Maximization (EM) algorithm to obtain maximum likelihood estimates of the model parameters. We use this approach to analyze data for movement of three bobcats in relation to urban areas in southern California, USA. A behavioral interpretation of the models revealed similarities and differences in bobcat movement response to urbanization. All three bobcats avoided urbanization by moving either parallel to urban boundaries or toward less urban areas as the proportion of urban land cover in the surrounding area increased. However, one bobcat, a male with a dispersal-like large-scale movement pattern, avoided urbanization at lower densities and responded strictly by moving parallel to the urban edge. The other two bobcats, which were both residents and occupied similar geographic areas, avoided urban areas using a combination of movements parallel to the urban edge and movement toward areas of less urbanization. However, the resident female appeared to exhibit greater repulsion at lower levels of urbanization than the resident male, consistent with empirical observations of bobcats in southern California. Using the parameterized finite mixture models, we mapped behavioral states to geographic space, creating a representation of a behavioral landscape. This approach can provide guidance for conservation planning based on analysis of animal movement data using
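
    The estimation machinery named above, EM for a finite mixture, can be sketched in miniature. This toy fits a two-component 1D Gaussian mixture with plain EM; the paper's models are richer (movement-response component densities, covariate-dependent mixing proportions, particle swarm initialization), so everything here is illustrative only.

    ```python
    import math
    import random

    def em_two_gaussians(xs, iters=200):
        """Plain EM for a two-component 1D Gaussian mixture."""
        mu = [min(xs), max(xs)]          # crude deterministic initialization
        var = [1.0, 1.0]
        pi = [0.5, 0.5]
        for _ in range(iters):
            # E-step: responsibility of each component for each point.
            resp = []
            for x in xs:
                dens = [pi[k] / math.sqrt(2 * math.pi * var[k])
                        * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                        for k in (0, 1)]
                s = sum(dens)
                resp.append([d / s for d in dens])
            # M-step: re-estimate weights, means, variances.
            for k in (0, 1):
                nk = sum(r[k] for r in resp)
                pi[k] = nk / len(xs)
                mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
                var[k] = max(1e-6, sum(r[k] * (x - mu[k]) ** 2
                                       for r, x in zip(resp, xs)) / nk)
        return pi, mu, var

    rng = random.Random(1)
    xs = [rng.gauss(0, 1) for _ in range(300)] + [rng.gauss(6, 1) for _ in range(300)]
    pi, mu, var = em_two_gaussians(xs)
    print([round(m, 1) for m in sorted(mu)])   # component means recovered near 0 and 6
    ```

    In the paper's setting, each fitted component would correspond to a candidate movement response (e.g. moving parallel to an urban edge versus away from it), and the mixing proportions would depend on covariates such as local urban land cover.
    
    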

  5. Mapping Natural Terroir Units using a multivariate approach and legacy data

    NASA Astrophysics Data System (ADS)

    Priori, Simone; Barbetti, Roberto; L'Abate, Giovanni; Bucelli, Piero; Storchi, Paolo; Costantini, Edoardo A. C.

    2014-05-01

The vineyard area of Siena province was then subdivided into 9 NTUs, statistically differentiated by the variables used. The study demonstrated the strength of a multivariate approach for NTU mapping at the province scale (1:125,000) using viticultural legacy data. Identification and mapping of terroir diversity within the DOC and DOCG areas at the province scale suggest the adoption of viticultural subzones. The subzones, based on the NTUs, could lead to differentiated wine-production systems that enhance the peculiarities of each terroir.

  6. Rotating boson stars in five dimensions

    SciTech Connect

    Hartmann, Betti; Kleihaus, Burkhard; Kunz, Jutta; List, Meike

    2010-10-15

    We study rotating boson stars in five spacetime dimensions. The boson fields consist of a complex doublet scalar field. Considering boson stars rotating in two orthogonal planes with both angular momenta of equal magnitude, a special ansatz for the boson field and the metric allows for solutions with nontrivial dependence on the radial coordinate only. The charge of the scalar field equals the sum of the angular momenta. The rotating boson stars are globally regular and asymptotically flat. For our choice of a sextic potential, the rotating boson star solutions possess a flat spacetime limit. We study the solutions in flat and curved spacetime.

  7. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    NASA Astrophysics Data System (ADS)

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluate its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map.
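
    As a toy illustration of the deletion-mapping idea, ordering markers so that similar deletion patterns sit adjacently (the travelling-salesman analogy above), here is a greedy nearest-neighbour sketch. The authors' actual solver and data differ; the marker names and presence/absence matrix below are made up.

    ```python
    def hamming(a, b):
        """Number of deletion mutants in which two markers' retention differs."""
        return sum(x != y for x, y in zip(a, b))

    def order_markers(patterns):
        """Greedy nearest-neighbour ordering of markers by deletion-pattern
        similarity -- a cheap stand-in for a real TSP solver, which would
        search for the global minimum-breakpoint order."""
        names = sorted(patterns)
        order = [names[0]]               # arbitrary deterministic start
        remaining = set(names[1:])
        while remaining:
            last = order[-1]
            nxt = min(remaining,
                      key=lambda m: (hamming(patterns[last], patterns[m]), m))
            order.append(nxt)
            remaining.remove(nxt)
        return order

    # Toy data: rows are markers, columns are deletion mutants (1 = retained).
    patterns = {
        "A": (1, 1, 1, 0),
        "B": (1, 1, 0, 0),
        "C": (1, 0, 0, 0),
        "D": (1, 1, 1, 1),
    }
    print(order_markers(patterns))  # -> ['A', 'B', 'C', 'D']
    ```

    A true TSP formulation minimizes total pattern distance over the whole tour rather than step by step, which is what makes the problem combinatorial and a real solver necessary for genome-scale marker sets.
    
    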

  8. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    PubMed Central

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluate its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map. PMID:26742857

  9. Analytic boosted boson discrimination

    DOE PAGES

    Larkoski, Andrew J.; Moult, Ian; Neill, Duff

    2016-05-20

Observables which discriminate boosted topologies from massive QCD jets are of great importance for the success of the jet substructure program at the Large Hadron Collider. Such observables, while both widely and successfully used, have been studied almost exclusively with Monte Carlo simulations. In this paper we present the first all-orders factorization theorem for a two-prong discriminant based on a jet shape variable, D2, valid for both signal and background jets. Our factorization theorem simultaneously describes the production of both collinear and soft subjets, and we introduce a novel zero-bin procedure to correctly describe the transition region between these limits. By proving an all orders factorization theorem, we enable a systematically improvable description, and allow for precision comparisons between data, Monte Carlo, and first principles QCD calculations for jet substructure observables. Using our factorization theorem, we present numerical results for the discrimination of a boosted Z boson from massive QCD background jets. We compare our results with Monte Carlo predictions which allows for a detailed understanding of the extent to which these generators accurately describe the formation of two-prong QCD jets, and informs their usage in substructure analyses. In conclusion, our calculation also provides considerable insight into the discrimination power and calculability of jet substructure observables in general.

  10. Analytic boosted boson discrimination

    SciTech Connect

    Larkoski, Andrew J.; Moult, Ian; Neill, Duff

    2016-05-20

    Observables which discriminate boosted topologies from massive QCD jets are of great importance for the success of the jet substructure program at the Large Hadron Collider. Such observables, while both widely and successfully used, have been studied almost exclusively with Monte Carlo simulations. In this paper we present the first all-orders factorization theorem for a two-prong discriminant based on a jet shape variable, D2, valid for both signal and background jets. Our factorization theorem simultaneously describes the production of both collinear and soft subjets, and we introduce a novel zero-bin procedure to correctly describe the transition region between these limits. By proving an all orders factorization theorem, we enable a systematically improvable description, and allow for precision comparisons between data, Monte Carlo, and first principles QCD calculations for jet substructure observables. Using our factorization theorem, we present numerical results for the discrimination of a boosted Z boson from massive QCD background jets. We compare our results with Monte Carlo predictions which allows for a detailed understanding of the extent to which these generators accurately describe the formation of two-prong QCD jets, and informs their usage in substructure analyses. In conclusion, our calculation also provides considerable insight into the discrimination power and calculability of jet substructure observables in general.

  11. Non-linear dynamics of operant behavior: a new approach via the extended return map.

    PubMed

    Li, Jay-Shake; Huston, Joseph P

    2002-01-01

Previous efforts to apply non-linear dynamic tools to the analysis of operant behavior revealed some promise for this kind of approach, but also some doubts, since the complexity of animal behavior seemed to be beyond the analyzing ability of the available tools. We here outline a series of studies based on a novel approach. We modified the so-called 'return map' and developed a new method, the 'extended return map' (ERM), to extract information from the highly irregular time series data, the inter-response time (IRT) generated by Skinner-box experiments. We applied the ERM to operant lever pressing data from rats using the four fundamental reinforcement schedules: fixed interval (FI), fixed ratio (FR), variable interval (VI) and variable ratio (VR). Our results revealed interesting patterns in all experiment groups. In particular, the FI and VI groups exhibited well-organized clusters of data points. We calculated the fractal dimension out of these patterns and compared experimental data with surrogate data sets that were generated by randomly shuffling the sequential order of original IRTs. This comparison supported the finding that patterns in the ERM reflect the dynamics of the operant behaviors under study. We then built two models to simulate the functional mechanisms of the FI schedule. Both models can produce similar distributions of IRTs and the stereotypical 'scalloped' curve characteristic of FI responding. However, they differ in one important feature in their formulation: while one model uses a continuous function to describe the probability of occurrence of an operant behavior, the other one employs an abrupt switch of behavioral state. Comparison of ERMs showed that only the latter was able to produce patterns similar to the experimental results, indicative of the operation of an abrupt switch from one behavioral state to another over the course of the inter-reinforcement period. This example demonstrated the ERM to be a useful tool for the analysis of
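
    To make the return-map idea concrete: plot pairs of successive inter-response times and compare a sequential statistic against a shuffled surrogate, which by construction destroys only the ordering. The alternating toy series and the lag-1 correlation statistic below are illustrative stand-ins, not the authors' ERM construction or fractal-dimension analysis.

    ```python
    import random

    def return_map(irts, lag=1):
        """Pairs (IRT_n, IRT_{n+lag}) -- the raw material of a return map."""
        return list(zip(irts, irts[lag:]))

    def lag1_corr(irts):
        """Pearson correlation between successive IRTs; sequential structure
        shows up as a value far from 0, shuffled surrogates sit near 0."""
        n = len(irts) - 1
        xs, ys = irts[:-1], irts[1:]
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    # Toy structured series: alternating short/long IRTs, loosely mimicking
    # post-reinforcement pausing under an FI schedule.
    irts = [0.2, 2.0] * 200
    surrogate = irts[:]
    random.Random(0).shuffle(surrogate)   # destroys sequential order only
    print(round(lag1_corr(irts), 2))      # -> -1.0 (perfectly alternating)
    print(round(lag1_corr(surrogate), 2)) # near 0 for the shuffled surrogate
    ```

    The surrogate comparison is the same logic the study uses: if a pattern survives shuffling, it reflects the IRT distribution alone; if it vanishes, it reflects genuine sequential dynamics.
    
    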

  12. Crater Mapping in the Pluto-Charon System: Considerations, Approach, and Progress

    NASA Astrophysics Data System (ADS)

    Robbins, S. J.; Singer, K. N.; Bray, V. J.; Schenk, P.; Zangari, A. M.; McKinnon, W. B.; Young, L. A.; Runyon, K. D.; Beyer, R. A.; Porter, S.; Lauer, T.; Weaver, H. A., Jr.; Olkin, C.; Ennico Smith, K.; Stern, A.

    2015-12-01

NASA's New Horizons mission successfully made its closest approach to Pluto on July 14, 2015, at 11:49 a.m. UTC. The flyby nature of the mission, the distance to the system, and the multiple planetary bodies to observe with a diverse instrument set required a complex imaging campaign marked by numerous trade-offs; these led to a more complicated crater population mapping effort than a basic orbital mission would entail. The Pluto and Charon imaging campaigns were full-disk or mosaics of the full disk until ≈3.5 hrs before closest approach, when the pixel scale was 0.9 km/px. After this, several LORRI-specific imaging campaigns were conducted of the partial disk and later the full crescent, while additional strips were ride-alongs with other instruments. These should supply partial coverage at up to 70-80 m/px for Pluto and 160 m/px for Charon. The LORRI coverage at ≈0.4 km/px does not cover the entire encounter hemisphere, but the MVIC instrument provided comparable full-disk coverage (0.5 km/px) and partial-disk coverage at 0.3 km/px. The best images of the non-encounter hemispheres of Pluto and Charon are ≈21 km/px (taken midnight July 10-11). As with any single flyby mission, we are constrained by the best pixel scales and incidence angles at which images were taken during the flyby. While most high-resolution imaging by quantity has been done over areas of variable solar incidence as the spacecraft passed by Pluto and Charon, these images cover a relatively small fraction of the bodies, and most coverage has been at near-noon sun, which makes crater identification difficult. Numerous team members are independently using a variety of crater mapping tools and image products, which will be reconciled and merged to make a more robust final database. We will present our consensus crater database to date of both plutonian and charonian impact craters as well as correlations with preliminary geologic units. We will also discuss how the crater population compares with predictions and modeled Kuiper Belt

  13. Mapping irrigation potential from renewable groundwater in Africa - a quantitative hydrological approach

    NASA Astrophysics Data System (ADS)

    Altchenko, Y.; Villholth, K. G.

    2015-02-01

    Groundwater provides an important buffer to climate variability in Africa. Yet, groundwater irrigation contributes only a relatively small share of cultivated land, approximately 1% (about 2 × 106 hectares) as compared to 14% in Asia. While groundwater is over-exploited for irrigation in many parts in Asia, previous assessments indicate an underutilized potential in parts of Africa. As opposed to previous country-based estimates, this paper derives a continent-wide, distributed (0.5° spatial resolution) map of groundwater irrigation potential, indicated in terms of fractions of cropland potentially irrigable with renewable groundwater. The method builds on an annual groundwater balance approach using 41 years of hydrological data, allocating only that fraction of groundwater recharge that is in excess after satisfying other present human needs and environmental requirements, while disregarding socio-economic and physical constraints in access to the resource. Due to high uncertainty of groundwater environmental needs, three scenarios, leaving 30, 50 and 70% of recharge for the environment, were implemented. Current dominating crops and cropping rotations and associated irrigation requirements in a zonal approach were applied in order to convert recharge excess to potential irrigated cropland. Results show an inhomogeneously distributed groundwater irrigation potential across the continent, even within individual countries, mainly reflecting recharge patterns and presence or absence of cultivated cropland. Results further show that average annual renewable groundwater availability for irrigation ranges from 692 to 1644 km3 depending on scenario. The total area of cropland irrigable with renewable groundwater ranges from 44.6 to 105.3 × 106 ha, corresponding to 20.5 to 48.6% of the cropland over the continent. In particular, significant potential exists in the semi-arid Sahel and eastern African regions which could support poverty alleviation if developed
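
    Per grid cell, the water-balance bookkeeping described above reduces to simple arithmetic: reserve an environmental fraction of recharge, subtract present human uses, and convert the excess depth into the cropland area whose crop water need it can satisfy. A hedged sketch with entirely invented numbers (the recharge, human use, and crop need values are placeholders, not the paper's data):

    ```python
    def irrigable_cropland_ha(recharge_mm, env_frac, human_use_mm,
                              crop_need_mm, cropland_ha, cell_ha):
        """Cropland area (ha) in one grid cell irrigable with renewable
        groundwater: recharge depth minus the environmental reservation and
        other human uses, converted via the crop irrigation requirement and
        capped at the cell's actual cropland area."""
        excess_mm = max(0.0, recharge_mm * (1.0 - env_frac) - human_use_mm)
        volume_mm_ha = excess_mm * cell_ha        # depth * area ~ volume
        return min(cropland_ha, volume_mm_ha / crop_need_mm)

    # The abstract's three environmental scenarios: 30, 50, 70 % reserved.
    for env in (0.3, 0.5, 0.7):
        area = irrigable_cropland_ha(recharge_mm=120.0, env_frac=env,
                                     human_use_mm=10.0, crop_need_mm=500.0,
                                     cropland_ha=400.0, cell_ha=1000.0)
        print(env, round(area, 1), "ha")   # 148.0, 100.0, 52.0 ha respectively
    ```

    Summing such cell values over the 0.5° grid, with 41 years of hydrological data driving the recharge term, is what produces the continent-wide range reported in the abstract.
    
    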

  14. Using a constructivist approach with online concept maps: relationship between theory and nursing education.

    PubMed

    Conceição, Simone C O; Taylor, Linda D

    2007-01-01

    Concept maps have been used in nursing education as a method for students to organize and analyze data. This article describes an online course that used concept maps and self-reflective journals to assess students' thinking processes. The self-reflective journals of 21 students collected over two semesters were qualitatively examined. Three major themes emerged from students' use of concept maps: 1) factors influencing the map creation, 2) developmental learning process over time, and 3) validation of existing knowledge and construction of new knowledge. The use of concept maps with reflective journaling provided a learning experience that allowed students to integrate content consistent with a constructivist paradigm. This integration is a developmental process influenced by the personal preferences of students, concept map design, and content complexity. This developmental process provides early evidence that the application of concept mapping in the online environment, along with reflective journaling, allows students to make new connections, integrate previous knowledge, and validate existing knowledge.

  15. An approach for mapping large-area impervious surfaces: Synergistic use of Landsat-7 ETM+ and high spatial resolution imagery

    USGS Publications Warehouse

    Yang, L.; Huang, C.; Homer, C.G.; Wylie, B.K.; Coan, M.J.

    2003-01-01

    A wide range of urban ecosystem studies, including urban hydrology, urban climate, land use planning, and resource management, require current and accurate geospatial data on urban impervious surfaces. We developed an approach to quantify urban impervious surfaces as a continuous variable by using multisensor and multisource datasets. Subpixel percent impervious surfaces at 30-m resolution were mapped using a regression tree model. The utility, practicality, and affordability of the proposed method for large-area imperviousness mapping were tested over three spatial scales (Sioux Falls, South Dakota; Richmond, Virginia; and the Chesapeake Bay area of the United States). The average error of predicted versus actual percent impervious surface ranged from 8.8 to 11.4%, with correlation coefficients from 0.82 to 0.91. The approach is being implemented to map impervious surfaces for the entire United States as one of the major components of the circa 2000 national land cover database.
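The core estimator named above, a regression tree mapping spectral predictors to percent imperviousness, can be sketched from scratch. This is a minimal numpy-only illustration on synthetic data; the study's actual model, training samples, and tree software are not described here and everything below is an assumption.

```python
# Minimal regression-tree sketch of subpixel percent-impervious estimation:
# spectral predictors in, a continuous 0-100 percent value out. Data synthetic.
import numpy as np

def grow(X, y, depth, min_leaf=20):
    """Recursively split on the (feature, threshold) minimizing summed SSE."""
    if depth == 0 or len(y) < 2 * min_leaf:
        return float(y.mean())                  # leaf: mean percent impervious
    best = None
    for j in range(X.shape[1]):
        order = np.argsort(X[:, j])
        ys = y[order]
        for i in range(min_leaf, len(ys) - min_leaf + 1):
            sse = ys[:i].var() * i + ys[i:].var() * (len(ys) - i)
            if best is None or sse < best[0]:
                best = (sse, j, (X[order[i - 1], j] + X[order[i], j]) / 2)
    _, j, thr = best
    mask = X[:, j] <= thr
    if mask.all() or not mask.any():            # degenerate split (tied values)
        return float(y.mean())
    return (j, thr, grow(X[mask], y[mask], depth - 1, min_leaf),
            grow(X[~mask], y[~mask], depth - 1, min_leaf))

def predict(node, x):
    while not isinstance(node, float):
        j, thr, lo, hi = node
        node = lo if x[j] <= thr else hi
    return node

# Synthetic "spectral" predictors and a percent-impervious target in [0, 100].
rng = np.random.default_rng(0)
X = rng.uniform(size=(400, 4))
y = np.clip(70 * X[:, 0] + 30 * X[:, 2] + rng.normal(0, 5, 400), 0, 100)

tree = grow(X, y, depth=5)
pred = np.array([predict(tree, x) for x in X])
print("training MAE:", round(float(np.abs(pred - y).mean()), 2))
```

Because leaves predict means of observed percentages, outputs stay within the valid 0 to 100 range by construction.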

  16. Two-dimensional thermofield bosonization II: Massive fermions

    SciTech Connect

    Amaral, R.L.P.G.

    2008-11-15

    We consider the perturbative computation of the N-point function of chiral densities of massive free fermions at finite temperature within the thermofield dynamics approach. The infinite series in the mass parameter for the N-point functions are computed in the fermionic formulation and compared with the corresponding perturbative series in the interaction parameter in the bosonized thermofield formulation. Thereby we establish in thermofield dynamics the formal equivalence of the massive free fermion theory with the sine-Gordon thermofield model for a particular value of the sine-Gordon parameter. We extend the thermofield bosonization to include the massive Thirring model.
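For orientation, the zero-temperature counterpart of the equivalence established in this record is Coleman's correspondence, under which the free massive fermion maps to the sine-Gordon model at the particular coupling value beta^2 = 4*pi. The following is that standard result stated for context, not a formula from the paper itself:

```latex
% Coleman correspondence at T = 0: the massive free fermion is equivalent to
% sine-Gordon at \beta^2 = 4\pi, with the fermion mass term bosonizing as a cosine.
\mathcal{L}_{\mathrm{SG}} \;=\; \tfrac{1}{2}\,(\partial_\mu\phi)^2
  \;+\; \frac{\alpha}{\beta^2}\,\cos(\beta\phi),
\qquad
m\,\bar\psi\psi \;\longleftrightarrow\; -\frac{\alpha}{\beta^2}\,\cos(\beta\phi)
\quad \text{at } \beta^2 = 4\pi .
```

The thermofield construction extends this dictionary to finite temperature, order by order in the mass (respectively, interaction) parameter.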

  17. Target attractor tracking of relative phase in Bosonic Josephson junction

    NASA Astrophysics Data System (ADS)

    Borisenok, Sergey

    2016-06-01

    The relative phase of a Bosonic Josephson junction in the Josephson regime of the Bose-Hubbard model is tracked via the target attractor ('synergetic') feedback algorithm, with the inter-well coupling parameter presented as a control function. The efficiency of our approach is demonstrated numerically for Gaussian and harmonic types of target phases.
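The control principle named here can be illustrated on a toy model. In the sketch below, a generic scalar "phase" obeys dphi/dt = u*g(phi), with the control u standing in for the inter-well coupling; u is chosen so the tracking error decays exponentially, making the target trajectory an attractor. This is a hedged illustration of the target-attractor idea only, not the actual Bose-Hubbard two-mode equations, and all functions and parameters are invented.

```python
# Generic target-attractor ("synergetic") feedback sketch: choose the control
# u so the error e = phi - phi_target obeys de/dt = -e/T. Toy model, not BJJ.
import numpy as np

T, dt = 0.5, 0.001
g = lambda phi: 1.0 + 0.3 * np.cos(phi)    # toy, nonvanishing coupling gain
phi_target = lambda t: 0.5 * np.sin(t)     # harmonic-type target phase
dphi_target = lambda t: 0.5 * np.cos(t)

phi = 2.0                                  # start far from the target
for k in range(int(10 / dt)):
    t = k * dt
    # Feedback law: cancel the gain, feed forward the target velocity,
    # and pull the error to zero with time constant T.
    u = (dphi_target(t) - (phi - phi_target(t)) / T) / g(phi)
    phi += dt * u * g(phi)                 # Euler step of the controlled model

print("tracking error at t = 10:", abs(phi - phi_target(10.0)))
```

After a few time constants the error is dominated by the Euler discretization, so the final tracking error is small regardless of the initial offset.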

  18. Tailoring online information retrieval to user's needs based on a logical semantic approach to natural language processing and UMLS mapping.

    PubMed

    Kossman, Susan; Jones, Josette; Brennan, Patricia Flatley

    2007-10-11

    Depression can derail teenagers' lives and cause serious chronic health problems. Acquiring pertinent knowledge and skills supports care management, but retrieving appropriate information can be difficult. This poster presents a strategy to tailor online information to user attributes using a logical semantic approach to natural language processing (NLP) and mapping propositions to UMLS terms. This approach capitalizes on existing NLM resources and presents a potentially sustainable plan for meeting consumers' and providers' information needs.

  19. The Effect of Concept Mapping-Guided Discovery Integrated Teaching Approach on Chemistry Students' Achievement and Retention

    ERIC Educational Resources Information Center

    Fatokun, K. V. F.; Eniayeju, P. A.

    2014-01-01

    This study investigates the effects of Concept Mapping-Guided Discovery Integrated Teaching Approach on the achievement and retention of chemistry students. The sample comprised 162 Senior Secondary two (SS 2) students drawn from two Science Schools in Nasarawa State, Central Nigeria with equivalent mean scores of 9.68 and 9.49 in their pre-test.…

  20. Stimulating Graphical Summarization in Late Elementary Education: The Relationship between Two Instructional Mind-Map Approaches and Student Characteristics

    ERIC Educational Resources Information Center

    Merchie, Emmelien; Van Keer, Hilde

    2016-01-01

    This study examined the effectiveness of two instructional mind-mapping approaches to stimulate fifth and sixth graders' graphical summarization skills. Thirty-five fifth- and sixth-grade teachers and 644 students from 17 different elementary schools participated. A randomized quasi-experimental repeated-measures design was set up with two…

  1. Mapping Trends in Pedagogical Approaches and Learning Technologies: Perspectives from the Canadian, International, and Military Education Contexts

    ERIC Educational Resources Information Center

    Scoppio, Grazia; Covell, Leigha

    2016-01-01

    Increased technological advances, coupled with new learners' needs, have created new realities for higher education contexts. This study explored and mapped trends in pedagogical approaches and learning technologies in postsecondary education and identified how these innovations are affecting teaching and learning practices in higher education…

  2. The derivation of tropospheric column ozone using the TOR approach and mapping technique

    NASA Astrophysics Data System (ADS)

    Yang, Qing

    2007-12-01

    Tropospheric ozone columns (TCOs) derived from differences between the Dutch-Finnish Aura Ozone Monitoring Instrument (OMI) measurements of the total atmospheric ozone column and the Aura Microwave Limb Sounder (MLS) measurements of stratospheric ozone columns are discussed. Because the measurements by these two instruments are not spatially coincident, interpolation techniques, with emphasis on mapping the stratospheric columns in space and time using the relationships between lower stratospheric ozone and potential vorticity (PV) and geopotential heights (Z), are evaluated at mid-latitudes. It is shown that this PV mapping procedure produces somewhat better agreement in comparisons with ozonesonde measurements, particularly in winter, than does simple linear interpolation of the MLS stratospheric columns or the use of typical coincidence criteria at mid-latitudes. The OMI/MLS derived tropospheric columns are calculated to be 4 Dobson units (DU) smaller than the sonde measured columns at mid-latitudes. This mean difference is consistent with the MLS (version 1.5) stratospheric ozone columns being high relative to Stratospheric Aerosol and Gas Experiment (SAGE II) columns by 3 DU. Standard deviations between the derived tropospheric columns and those measured by ozonesondes are 9 DU (30%) annually but they are just 6 DU (15%) in summer. Uncertainties in the interpolated MLS stratospheric columns are likely to be the primary cause of these standard deviations. An important advantage of the PV mapping approach is that it works well when MLS data are missing (e.g., when an orbit of measurements is missing). In the comparisons against ozonesonde measurements, it provides up to twice as many comparisons compared to the other techniques. The OMI/MLS derived tropospheric ozone columns have been compared with corresponding columns based on the Tropospheric Emission Spectrometer (TES) measurements, and Regional chEmical trAnsport Model (REAM) simulations. The variability of
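The residual step at the heart of the approach, total column minus an interpolated stratospheric column, can be sketched in a few lines. Here simple linear interpolation stands in for the PV/Z mapping, and all column values are invented for illustration.

```python
# Toy residual calculation: tropospheric column ozone (TCO) as the OMI total
# column minus an MLS stratospheric column interpolated onto the OMI grid.
# Linear interpolation replaces the paper's PV mapping; values are invented.
import numpy as np

lat = np.linspace(30, 60, 7)                               # OMI grid latitudes
omi_total = np.array([300., 305, 310, 318, 324, 330, 335])  # total column, DU

mls_lat = np.array([30., 45, 60])              # sparse MLS sample latitudes
mls_strat = np.array([270., 282, 296])         # stratospheric column, DU

strat_on_grid = np.interp(lat, mls_lat, mls_strat)   # fill the gaps
tco = omi_total - strat_on_grid                      # residual = troposphere
print(np.round(tco, 1))
```

Because the stratospheric column is the larger term, small interpolation errors in it translate into relatively large percentage errors in the residual, which is why the paper focuses on improving that interpolation.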

  3. Agricultural Land Use mapping by multi-sensor approach for hydrological water quality monitoring

    NASA Astrophysics Data System (ADS)

    Brodsky, Lukas; Kodesova, Radka; Kodes, Vit

    2010-05-01

    The main objective of this study is to demonstrate the potential of operational use of high and medium resolution remote sensing data for hydrological water quality monitoring by mapping agricultural intensity and crop structures. In particular, it addresses the use of remote sensing mapping to optimize pesticide monitoring. The agricultural mapping task is tackled by means of medium spatial and high temporal resolution ESA Envisat MERIS FR images together with a single high spatial resolution IRS AWiFS image covering the whole area of interest (the Czech Republic). High resolution data (e.g. SPOT, ALOS, Landsat) are often used for agricultural land use classification, but usually only at regional or local level due to data availability and financial constraints. AWiFS data (nominal spatial resolution 56 m), due to the wide satellite swath, seem to be more suitable for use at national level. Nevertheless, one of the critical issues for such a classification is to have sufficient image acquisitions over the whole vegetation period to describe crop development in an appropriate way. ESA MERIS medium-resolution data were used in several studies for crop classification. The high temporal and spectral resolution of MERIS data has an indisputable advantage for crop classification. However, the 300 m spatial resolution results in mixed signals within a single pixel. AWiFS-MERIS data synergy brings new perspectives to agricultural land use mapping. Also, the developed methodology is fully compatible with future use of ESA (GMES) Sentinel satellite images. The applied hybrid multi-sensor approach consists of these main stages: a/ parcel segmentation and spectral pre-classification of the high resolution image (AWiFS); b/ ingestion of medium resolution (MERIS) vegetation spectro-temporal features; c/ vegetation signature unmixing; and d/ semantic object-oriented classification of vegetation classes into the final classification scheme. These crop groups were selected to be

  4. Mapping Agricultural Fields in Sub-Saharan Africa with a Computer Vision Approach

    NASA Astrophysics Data System (ADS)

    Debats, S. R.; Luo, D.; Estes, L. D.; Fuchs, T.; Caylor, K. K.

    2014-12-01

    Sub-Saharan Africa is an important focus for food security research, because it is experiencing unprecedented population growth, agricultural activities are largely dominated by smallholder production, and the region is already home to 25% of the world's undernourished. One of the greatest challenges to monitoring and improving food security in this region is obtaining an accurate accounting of the spatial distribution of agriculture. Households are the primary units of agricultural production in smallholder communities and typically rely on small fields of less than 2 hectares. Field sizes are directly related to household crop productivity, management choices, and adoption of new technologies. As population and agriculture expand, it becomes increasingly important to understand both the distribution of field sizes as well as how agricultural communities are spatially embedded in the landscape. In addition, household surveys, a common tool for tracking agricultural productivity in Sub-Saharan Africa, would greatly benefit from spatially explicit accounting of fields. Current gridded land cover data sets do not provide information on individual agricultural fields or the distribution of field sizes. Therefore, we employ cutting edge approaches from the field of computer vision to map fields across Sub-Saharan Africa, including semantic segmentation, discriminative classifiers, and automatic feature selection. Our approach aims to not only improve the binary classification accuracy of cropland, but also to isolate distinct fields, thereby capturing crucial information on size and geometry. Our research focuses on the development of descriptive features across scales to increase the accuracy and geographic range of our computer vision algorithm. Relevant data sets include high-resolution remote sensing imagery and Landsat (30-m) multi-spectral imagery. Training data for field boundaries is derived from hand-digitized data sets as well as crowdsourcing.
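A minimal version of the "isolate distinct fields" step can be shown with connected-component labeling: once pixels are classified as cropland, contiguous groups become candidate fields whose sizes and geometry can be measured. The mask below is a toy array, not classifier output from real imagery, and the full pipeline uses learned segmentation rather than a fixed threshold.

```python
# Sketch of field isolation: label connected components of a binary cropland
# mask and read off per-field pixel counts (a proxy for field size).
import numpy as np
from scipy import ndimage

mask = np.array([[1, 1, 0, 0, 0],
                 [1, 1, 0, 1, 1],
                 [0, 0, 0, 1, 1],
                 [0, 1, 0, 0, 0],
                 [0, 1, 0, 0, 0]])

labels, n_fields = ndimage.label(mask)   # 4-connectivity by default
sizes = ndimage.sum(mask, labels, index=range(1, n_fields + 1))
print(n_fields, sizes)                   # three distinct fields of 4, 4 and 2 pixels
```

With 30-m Landsat pixels, a two-hectare smallholder field spans only about 22 pixels, which is why component sizes, and not just per-pixel class labels, carry the information the study is after.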

  5. 3D mapping of airway wall thickening in asthma with MSCT: a level set approach

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Brillet, Pierre-Yves; Hartley, Ruth; Grenier, Philippe A.; Brightling, Christopher

    2014-03-01

    Assessing the airway wall thickness in multi slice computed tomography (MSCT) as an image marker for airway disease phenotyping, such as asthma and COPD, is a current trend and challenge for the scientific community working in lung imaging. This paper addresses the same problem from a different point of view: considering the expected wall thickness-to-lumen-radius ratio for a normal subject as known and constant throughout the whole airway tree, the aim is to build up a 3D map of airway wall regions of larger thickness and to define an overall score able to highlight a pathological status. In this respect, the local dimension (caliber) of the previously segmented airway lumen is obtained at each point by exploiting the granulometry morphological operator. A level set function is defined based on this caliber information and on the expected wall thickness ratio, which allows obtaining a good estimate of the airway wall throughout all segmented lumen generations. Next, the vascular (or mediastinal dense tissue) contact regions are automatically detected and excluded from analysis. For the remaining airway wall border points, the real wall thickness is estimated based on tissue density analysis in the airway radial direction; thick wall points are highlighted on a 3D representation of the airways and several quantification scores are defined. The proposed approach is fully automatic and was evaluated (proof of concept) on a patient selection coming from different databases, including mild and severe asthmatics and normal cases. This preliminary evaluation confirms the discriminative power of the proposed approach regarding different phenotypes, and the evaluation is currently being extended to larger cohorts.

  6. High Detailed Debris Flows Hazard Maps by a Cellular Automata Approach

    NASA Astrophysics Data System (ADS)

    Lupiano, V.; Lucà, F.; Robustelli, G.; Rongo, R.; D'Ambrosio, D.; Spataro, W.; Avolio, M. V.

    2012-04-01

    Identifying the areas most likely to be affected by new debris flows in regions particularly exposed to such phenomena is of fundamental relevance for mitigating possible consequences, both in terms of loss of human lives and of material properties. Here we show the adaptation of a recent methodology, already successfully applied to lava flows, for defining flexible, highly detailed and reliable hazard maps. The methodology relies on an adequate knowledge of the study area, assessed by an accurate analysis of its past behavior, together with a reliable numerical model for simulating debris flows on present topographic data (the Cellular Automata model SCIDDICA, in the present case). Furthermore, High Performance Parallel Computing is employed to increase computational efficiency, owing to the great number of simulations of hypothetical events required to characterize the susceptibility to flow invasion of the study area. The application of the presented methodology to the case of Gragnano (Italy) demonstrated the effectiveness of the proposed approach, suggesting its appropriateness for land use planning and Civil Defense applications.

  7. Multimodality approach to optical early detection and mapping of oral neoplasia

    NASA Astrophysics Data System (ADS)

    Ahn, Yeh-Chan; Chung, Jungrae; Wilder-Smith, Petra; Chen, Zhongping

    2011-07-01

    Early detection of cancer remains the best way to ensure patient survival and quality of life. Squamous cell carcinoma is usually preceded by dysplasia presenting as white, red, or mixed red and white epithelial lesions on the oral mucosa (leukoplakia, erythroplakia). Dysplastic lesions in the form of erythroplakia can carry a risk for malignant conversion of 90%. A noninvasive diagnostic modality would enable monitoring of these lesions at regular intervals and detection of treatment needs at a very early, relatively harmless stage. The specific aim of this work was to test a multimodality approach [three-dimensional optical coherence tomography (OCT) and polarimetry] to noninvasive diagnosis of oral premalignancy and malignancy using the hamster cheek pouch model (nine hamsters). The results were compared to tissue histopathology. During carcinogenesis, epithelial downgrowth, eventual loss of basement membrane integrity, and subepithelial invasion were clearly visible with OCT. Polarimetry techniques identified a four to five times increased retardance in sites with squamous cell carcinoma, and two to three times greater retardance in dysplastic sites than in normal tissues. These techniques were particularly useful for mapping areas of field cancerization with multiple lesions, as well as lesion margins.

  8. Modeling and mapping potential distribution of Crimean juniper (Juniperus excelsa Bieb.) using correlative approaches.

    PubMed

    Özkan, Kürşad; Şentürk, Özdemir; Mert, Ahmet; Negiz, Mehmet Güvenç

    2015-01-01

    Modeling and mapping the potential distribution of living organisms has become an important component of conservation planning and ecosystem management in recent years. Various correlative and mechanistic methods can be applied to build predictive distributions of living organisms in terrestrial and marine ecosystems. Correlative methods used to predict species' potential distributions have been described as either group discrimination techniques or profile techniques. We attempted to determine whether group discrimination techniques could perform as well as profile techniques for predicting species' potential distributions, using elevation (ELVN), parent material (ROCK), slope (SLOP), radiation index (RI) and topographic position index (TPI) as explanatory variables. We compared potential distribution predictions made for Crimean juniper (Juniperus excelsa Bieb.) in the Yukan Gokdere forest district of the Mediterranean region, Turkey, applying four group discrimination techniques (discriminant analysis (DA), logistic regression analysis (LR), generalized additive model (GAM) and classification tree technique (CT)) and two profile techniques (a maximum entropy approach to species distribution modeling (MAXENT) and the genetic algorithm for rule-set prediction (GARP)). Visual assessments of the potential distribution probability of the applied models for Crimean juniper were performed using geographical information systems (GIS). Receiver-operating characteristic (ROC) curves were used to objectively assess model performance. The results suggested that group discrimination techniques are better than profile techniques and that, among the group discrimination techniques, GAM showed the best performance.
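The ROC-based comparison used to rank the models can be sketched with a small AUC routine. The presence/absence labels and model scores below are toy values, not the study's data, and the function assumes no tied scores.

```python
# Hedged sketch of the model-comparison step: ROC AUC for presence/absence
# predictions, computed via the Mann-Whitney U statistic. Scores are invented.
import numpy as np

def roc_auc(labels, scores):
    """AUC = P(score of a random presence > score of a random absence)."""
    labels = np.asarray(labels, bool)
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)    # 1-based ranks, no ties
    n_pos, n_neg = labels.sum(), (~labels).sum()
    u = ranks[labels].sum() - n_pos * (n_pos + 1) / 2
    return u / (n_pos * n_neg)

y = np.array([1, 1, 1, 0, 0, 1, 0, 0])              # presence / absence
model_a = np.array([0.9, 0.8, 0.7, 0.4, 0.3, 0.6, 0.2, 0.1])  # perfect ranking
model_b = np.array([0.9, 0.4, 0.7, 0.8, 0.3, 0.6, 0.2, 0.1])  # some mistakes
print(roc_auc(y, model_a), roc_auc(y, model_b))
```

An AUC of 1.0 means every presence outscores every absence; 0.5 is chance, which makes the statistic a convenient threshold-free yardstick across the six techniques.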

  9. Densely mapping the phase diagram of cuprate superconductors using a spatial composition spread approach

    NASA Astrophysics Data System (ADS)

    Saadat, Mehran; George, A. E.; Hewitt, Kevin C.

    2010-12-01

    A novel spatial composition spread approach was used successfully to deposit a 52-member library of La2-xSrxCuO4 (0 ⩽ x ⩽ 0.18) using magnetron sputtering combined with physical masking techniques. Two homemade targets of La2CuO4 and La1.82Sr0.18CuO4 were sputtered at a power of 41 W RF and 42 W DC, respectively, in a process gas of 15 mTorr argon. The libraries were sputtered onto LaSrAlO4 (0 0 1), SrTiO3 (1 0 0) and MgO (1 0 0) substrates through a 52-slot shadow mask for which a -20 V substrate bias was applied to prevent resputtering. The resulting amorphous films were post-annealed (800 °C for 1 h then at 950 °C for 2 h) in a tube sealed with oxygen gas. Wavelength Dispersive Spectroscopy (WDS) analysis revealed the expected linear variation of Sr content from 0 to 0.18 with an approximate change of 0.003 per library member. Transport measurements revealed superconducting transitions as well as changes in the quasiparticle scattering rate. These transitions and scattering rate changes were mapped to produce the T-hole concentration phase diagram.

  10. Flux Optimization in Human Specific Map-Kinase Pathways: A Systems Biology Approach to Study Cancer

    NASA Astrophysics Data System (ADS)

    Sahu, Sombeet

    2010-10-01

    Mitogen-activated protein kinases (MAP kinases) transduce signals involved in a multitude of cellular pathways and functions in response to a variety of ligands and cell stimuli. Aberrant or inappropriate functioning of MAPKs has now been identified in diseases ranging from cancer to Alzheimer's disease to leishmaniasis; however, the pathway map is still growing and little is known about its dynamics. Here we model the MAPK pathways to find the key metabolites and reactions whose perturbation affects the transcription factors. The approach we used for modeling this pathway is Flux Balance Analysis (FBA). Further, we established that the growth factors EGF and PDGF are also responsible for determining downstream species concentrations. Tuning the parameters gave the optimum growth factor kinetics for which the downstream events were at a minimum. The Ras and Braf steady-state concentrations were also significantly affected when the growth factor kinetics were tuned. This type of study can shed light on controlling various diseases and may be helpful in identifying important drug targets.
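The FBA machinery invoked above reduces to a linear program: impose steady state S v = 0, bound each flux, and optimize an objective flux. The sketch below solves a hypothetical two-metabolite, three-reaction chain, not the paper's MAPK network; the stoichiometry and bounds are invented for illustration.

```python
# Toy flux balance analysis (FBA): steady state S v = 0, flux bounds, and a
# linear objective, solved with scipy's LP solver. Network is invented.
import numpy as np
from scipy.optimize import linprog

S = np.array([[1, -1,  0],    # metabolite A: made by R1, consumed by R2
              [0,  1, -1]])   # metabolite B: made by R2, consumed by R3
bounds = [(0, 10), (0, 8), (0, None)]   # R2 is the capacity bottleneck
c = [0, 0, -1]                          # maximize R3: linprog minimizes, so negate

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal fluxes:", res.x)         # steady state forces v1 = v2 = v3
```

Steady state chains the three fluxes together, so the optimum is set entirely by the tightest bound, which is the sense in which FBA exposes bottleneck reactions.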

  11. A Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) Determined from Phased Microphone Arrays

    NASA Technical Reports Server (NTRS)

    Brooks, Thomas F.; Humphreys, William M., Jr.

    2004-01-01

    Current processing of acoustic array data is burdened with considerable uncertainty. This study reports an original methodology that serves to demystify array results, reduce misinterpretation, and accurately quantify position and strength of acoustic sources. Traditional array results represent noise sources that are convolved with array beamform response functions, which depend on array geometry, size (with respect to source position and distributions), and frequency. The Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) method removes beamforming characteristics from output presentations. A unique linear system of equations accounts for reciprocal influence at different locations over the array survey region. It makes no assumption beyond the traditional processing assumption of statistically independent noise sources. The full rank equations are solved with a new robust iterative method. DAMAS is quantitatively validated using archival data from a variety of prior high-lift airframe component noise studies, including flap edge/cove, trailing edge, leading edge, slat, and calibration sources. Presentations are explicit and straightforward, as the noise radiated from a region of interest is determined by simply summing the mean-squared values over that region. DAMAS can fully replace existing array processing and presentations methodology in most applications. It appears to dramatically increase the value of arrays to the field of experimental acoustics.
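The linear system at the heart of DAMAS, beamform map = point-spread matrix times source map, with nonnegative source strengths, can be solved with the kind of iterative sweep the paper describes. The matrix and sources below are a small synthetic stand-in, not array data, and the single forward sweep is a simplification of the method's forward/backward iteration.

```python
# Sketch of the DAMAS deconvolution: model the beamform map Y as A @ X, where
# A holds the array point-spread responses, and recover the nonnegative source
# map X with Gauss-Seidel-style sweeps. A and Y are synthetic toys.
import numpy as np

rng = np.random.default_rng(1)
n = 20
A = 0.02 * rng.uniform(size=(n, n))        # weak off-diagonal spreading
np.fill_diagonal(A, 1.0)                   # unit self-response on the diagonal
x_true = np.zeros(n)
x_true[[4, 12]] = [2.0, 1.0]               # two point sources
Y = A @ x_true                             # synthetic "smeared" beamform map

X = np.zeros(n)
for _ in range(100):                       # iterative sweeps over grid points
    for i in range(n):
        r = Y[i] - A[i] @ X + A[i, i] * X[i]   # residual excluding X[i]
        X[i] = max(r / A[i, i], 0.0)           # enforce nonnegative strength

print(np.round(X[[4, 12]], 3))             # recovers the two source strengths
```

With the diagonally dominant toy matrix the sweep converges quickly and the two point sources are recovered exactly; real array point-spread functions are far less benign, which is why the paper's robust iteration matters.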

  12. A Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) Determined from Phased Microphone Arrays

    NASA Technical Reports Server (NTRS)

    Brooks, Thomas F.; Humphreys, William M.

    2006-01-01

    Current processing of acoustic array data is burdened with considerable uncertainty. This study reports an original methodology that serves to demystify array results, reduce misinterpretation, and accurately quantify position and strength of acoustic sources. Traditional array results represent noise sources that are convolved with array beamform response functions, which depend on array geometry, size (with respect to source position and distributions), and frequency. The Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) method removes beamforming characteristics from output presentations. A unique linear system of equations accounts for reciprocal influence at different locations over the array survey region. It makes no assumption beyond the traditional processing assumption of statistically independent noise sources. The full rank equations are solved with a new robust iterative method. DAMAS is quantitatively validated using archival data from a variety of prior high-lift airframe component noise studies, including flap edge/cove, trailing edge, leading edge, slat, and calibration sources. Presentations are explicit and straightforward, as the noise radiated from a region of interest is determined by simply summing the mean-squared values over that region. DAMAS can fully replace existing array processing and presentations methodology in most applications. It appears to dramatically increase the value of arrays to the field of experimental acoustics.

  13. Multimodality approach to optical early detection and mapping of oral neoplasia

    PubMed Central

    Ahn, Yeh-Chan; Chung, Jungrae; Wilder-Smith, Petra; Chen, Zhongping

    2011-01-01

    Early detection of cancer remains the best way to ensure patient survival and quality of life. Squamous cell carcinoma is usually preceded by dysplasia presenting as white, red, or mixed red and white epithelial lesions on the oral mucosa (leukoplakia, erythroplakia). Dysplastic lesions in the form of erythroplakia can carry a risk for malignant conversion of 90%. A noninvasive diagnostic modality would enable monitoring of these lesions at regular intervals and detection of treatment needs at a very early, relatively harmless stage. The specific aim of this work was to test a multimodality approach [three-dimensional optical coherence tomography (OCT) and polarimetry] to noninvasive diagnosis of oral premalignancy and malignancy using the hamster cheek pouch model (nine hamsters). The results were compared to tissue histopathology. During carcinogenesis, epithelial downgrowth, eventual loss of basement membrane integrity, and subepithelial invasion were clearly visible with OCT. Polarimetry techniques identified a four to five times increased retardance in sites with squamous cell carcinoma, and two to three times greater retardance in dysplastic sites than in normal tissues. These techniques were particularly useful for mapping areas of field cancerization with multiple lesions, as well as lesion margins. PMID:21806268

  14. A hybrid model for mapping simplified seismic response via a GIS-metamodel approach

    NASA Astrophysics Data System (ADS)

    Grelle, G.; Bonito, L.; Revellino, P.; Guerriero, L.; Guadagno, F. M.

    2014-07-01

    In earthquake-prone areas, the site seismic response due to the lithostratigraphic sequence plays a key role in seismic hazard assessment. A hybrid model, consisting of GIS and metamodel (model of a model) procedures, was introduced, aimed at estimating the 1-D spatial seismic site response in accordance with the spatial variability of sediment parameters. Inputs and outputs are provided and processed by means of an appropriate GIS model, named the GIS Cubic Model (GCM). This consists of a block-layered parametric structure designed to resolve a predictive metamodel by means of pixel-to-pixel vertical computing. The metamodel, suitably calibrated, is able to emulate the classic shape of the spectral acceleration response in relation to the main physical parameters that characterize the spectrum itself. Therefore, via the GCM structure and the metamodel, the hybrid model provides maps of normalized acceleration response spectra. The hybrid model was applied and tested on the built-up area of the San Giorgio del Sannio village, located in a high-risk seismic zone of southern Italy. Efficiency tests showed a good correspondence between the spectral values resulting from the proposed approach and those from 1-D physical computational models. Supported by lithological and geophysical data and their accurate interpretation in modelling, the hybrid model can be an efficient tool for assessing seismic hazard and risk in urban planning.

  15. Mapping the World - a New Approach for Volunteered Geographic Information in the Cloud

    NASA Astrophysics Data System (ADS)

    Moeller, M. S.; Furhmann, S.

    2015-05-01

    The OSM project provides a geodata basis for the entire world under the CC-SA licence agreement. However, some parts of the world are mapped more densely than others, and many less developed countries lack valid geo-information. Africa, for example, is a sparsely mapped continent. During a huge Ebola outbreak in 2014, this lack of data became apparent. Relief organizations such as the American Red Cross and the Humanitarian OpenStreetMap Team organized mapping campaigns to fill the gaps with valid OSM geodata. This paper gives a short introduction to this mapping activity.

  16. A Hybrid Wetland Map for China: A Synergistic Approach Using Census and Spatially Explicit Datasets

    PubMed Central

    Ma, Kun; You, Liangzhi; Liu, Junguo; Zhang, Mingxiang

    2012-01-01

    Wetlands play important ecological, economic, and cultural roles in societies around the world. However, wetland degradation has become a serious ecological issue, raising the global sustainability concern. An accurate wetland map is essential for wetland management. Here we used a fuzzy method to create a hybrid wetland map for China through the combination of five existing wetlands datasets, including four spatially explicit wetland distribution data and one wetland census. Our results show the total wetland area is 384,864 km2, 4.08% of China’s national surface area. The hybrid wetland map also shows spatial distribution of wetlands with a spatial resolution of 1 km. The reliability of the map is demonstrated by comparing it with spatially explicit datasets on lakes and reservoirs. The hybrid wetland map is by far the first wetland mapping that is consistent with the statistical data at the national and provincial levels in China. It provides a benchmark map for research on wetland protection and management. The method presented here is applicable for not only wetland mapping but also for other thematic mapping in China and beyond. PMID:23110105
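The fuzzy combination of multiple datasets described above can be caricatured in a few lines: score each cell by weighted agreement across the input maps, then select the best-agreeing cells so that the mapped area matches a census total. The grids, weights and areas below are invented for illustration; the study's actual fuzzy method and calibration are not reproduced here.

```python
# Illustrative fuzzy-combination sketch: per-cell wetland membership from the
# agreement of several binary maps, constrained to a census area total.
import numpy as np

rng = np.random.default_rng(2)
maps = rng.uniform(size=(4, 100)) < 0.3        # 4 binary wetland maps, 100 cells
weights = np.array([0.4, 0.3, 0.2, 0.1])       # assumed dataset reliabilities
membership = weights @ maps                    # fuzzy agreement score in [0, 1]

pixel_area = 1.0                               # km2 per 1-km grid cell
census_area = 20.0                             # km2 reported by the census
n_cells = int(census_area / pixel_area)
hybrid = np.zeros(maps.shape[1], dtype=bool)
hybrid[np.argsort(-membership)[:n_cells]] = True   # keep best-agreeing cells
print(hybrid.sum() * pixel_area)               # mapped area equals the census
```

Tying the selection to the census total is what makes the resulting map consistent with the statistical data while the membership score preserves the spatial pattern shared by the input datasets.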

  17. A hybrid wetland map for China: a synergistic approach using census and spatially explicit datasets.

    PubMed

    Ma, Kun; You, Liangzhi; Liu, Junguo; Zhang, Mingxiang

    2012-01-01

    Wetlands play important ecological, economic, and cultural roles in societies around the world. However, wetland degradation has become a serious ecological issue, raising the global sustainability concern. An accurate wetland map is essential for wetland management. Here we used a fuzzy method to create a hybrid wetland map for China through the combination of five existing wetlands datasets, including four spatially explicit wetland distribution data and one wetland census. Our results show the total wetland area is 384,864 km2, 4.08% of China's national surface area. The hybrid wetland map also shows spatial distribution of wetlands with a spatial resolution of 1 km. The reliability of the map is demonstrated by comparing it with spatially explicit datasets on lakes and reservoirs. The hybrid wetland map is by far the first wetland mapping that is consistent with the statistical data at the national and provincial levels in China. It provides a benchmark map for research on wetland protection and management. The method presented here is applicable for not only wetland mapping but also for other thematic mapping in China and beyond.

  18. Integrated Georeferencing of Stereo Image Sequences Captured with a Stereovision Mobile Mapping System - Approaches and Practical Results

    NASA Astrophysics Data System (ADS)

    Eugster, H.; Huber, F.; Nebiker, S.; Gisi, A.

    2012-07-01

    Stereovision-based mobile mapping systems enable the efficient capturing of directly georeferenced stereo pairs. With today's camera and onboard storage technologies, imagery can be captured at high data rates, resulting in dense stereo sequences. These georeferenced stereo sequences provide a highly detailed and accurate digital representation of the roadside environment, which builds the foundation for a wide range of 3d mapping applications and image-based geo web-services. Georeferenced stereo images are ideally suited for the 3d mapping of street furniture and visible infrastructure objects, pavement inspection, asset management tasks or image-based change detection. As in most mobile mapping systems, the georeferencing of the mapping sensors and observations - in our case of the imaging sensors - normally relies on direct georeferencing based on INS/GNSS navigation sensors. However, in urban canyons the achievable direct georeferencing accuracy of the dynamically captured stereo image sequences is often insufficient or at least degraded. Furthermore, many of the mentioned application scenarios require homogeneous georeferencing accuracy within a local reference frame over the entire mapping perimeter. To meet these demands, georeferencing approaches are presented and cost-efficient workflows are discussed that allow validating and updating the INS/GNSS-based trajectory with independently estimated positions during prolonged GNSS signal outages, in order to increase the georeferencing accuracy up to the project requirements.

  19. Nonequilibrium functional bosonization of quantum wire networks

    SciTech Connect

    Ngo Dinh, Stephane; Bagrets, Dmitry A.; Mirlin, Alexander D.

    2012-11-15

    We develop a general approach to nonequilibrium nanostructures formed by one-dimensional channels coupled by tunnel junctions and/or by impurity scattering. The formalism is based on a nonequilibrium version of functional bosonization. A central role in this approach is played by the Keldysh action, which has a form reminiscent of the theory of full counting statistics. To proceed with the evaluation of physical observables, we assume the weak-tunneling regime and develop a real-time instanton method. A detailed exposition of the formalism is supplemented by two important applications: (i) tunneling into a biased Luttinger liquid with an impurity, and (ii) quantum Hall Fabry-Perot interferometry. Highlights: • A nonequilibrium functional bosonization framework for quantum wire networks is developed. • For the study of observables in the weak-tunneling regime a real-time instanton method is elaborated. • We consider tunneling into a biased Luttinger liquid with an impurity. • We analyze electronic Fabry-Perot interferometers in the integer quantum Hall regime.

  20. Proposal for Microwave Boson Sampling

    NASA Astrophysics Data System (ADS)

    Peropadre, Borja; Guerreschi, Gian Giacomo; Huh, Joonsuk; Aspuru-Guzik, Alán

    2016-09-01

    Boson sampling, the task of sampling the probability distribution of photons at the output of a photonic network, is believed to be hard for any classical device. Unlike other models of quantum computation that require thousands of qubits to outperform classical computers, boson sampling requires only a handful of single photons. However, a scalable implementation of boson sampling is missing. Here, we show how superconducting circuits provide such a platform. Our proposal differs radically from traditional quantum-optical implementations: rather than injecting photons in waveguides, making them pass through optical elements like phase shifters and beam splitters, and finally detecting their output mode, we prepare the required multiphoton input state in a superconducting resonator array, control its dynamics via tunable and dispersive interactions, and measure it with nondemolition techniques.
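
    Boson sampling is believed to be classically hard because each output probability involves a matrix permanent: the probability of an output configuration is proportional to |Perm(A)|² for a submatrix A of the network's unitary. A minimal sketch of Ryser's inclusion-exclusion formula for the permanent (a standard algorithm, not code from this paper):

```python
from itertools import combinations

def permanent(A):
    """Matrix permanent via Ryser's formula, O(2^n * n^2):
    perm(A) = (-1)^n * sum over nonempty column subsets S of
    (-1)^|S| * prod_i sum_{j in S} A[i][j]."""
    n = len(A)
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in A:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total
```

Even this best-known exact algorithm is exponential in n, which is the core of the hardness argument.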

  1. Working Group Report: Higgs Boson

    SciTech Connect

    Dawson, Sally; Gritsan, Andrei; Logan, Heather; Qian, Jianming; Tully, Chris; Van Kooten, Rick

    2013-10-30

    This report summarizes the work of the Energy Frontier Higgs Boson working group of the 2013 Community Summer Study (Snowmass). We identify the key elements of a precision Higgs physics program and document the physics potential of future experimental facilities as elucidated during the Snowmass study. We study Higgs couplings to gauge boson and fermion pairs, double Higgs production for the Higgs self-coupling, its quantum numbers and $CP$-mixing in Higgs couplings, the Higgs mass and total width, and prospects for direct searches for additional Higgs bosons in extensions of the Standard Model. Our report includes projections of measurement capabilities from detailed studies of the Compact Linear Collider (CLIC), a Gamma-Gamma Collider, the International Linear Collider (ILC), the Large Hadron Collider High-Luminosity Upgrade (HL-LHC), Very Large Hadron Colliders up to 100 TeV (VLHC), a Muon Collider, and a Triple-Large Electron Positron Collider (TLEP).

  2. Mapping soil vulnerability to floods under varying land use and climate: A new approach

    NASA Astrophysics Data System (ADS)

    Alaoui, Abdallah; Spiess, Pascal; Beyeler, Marcel

    2016-04-01

    the hydrological connectivity between zones of various predisposition to excess surface runoff under different land uses. These promising results indicate that the approach is suited for mapping soil vulnerability to floods under varying land use and climate at any scale.

  3. Assessment of Social Vulnerability Identification at Local Level around Merapi Volcano - A Self Organizing Map Approach

    NASA Astrophysics Data System (ADS)

    Lee, S.; Maharani, Y. N.; Ki, S. J.

    2015-12-01

    The application of the Self-Organizing Map (SOM) to analyze social vulnerability and recognize the resilience within sites is a challenging task. The aim of this study is to propose a computational method to identify the sites according to their similarity and to determine the most relevant variables characterizing the social vulnerability in each cluster. For this purpose, SOM is considered an effective platform for analysis of high-dimensional data. By considering the cluster structure, the characteristics of social vulnerability of the identified sites can be fully understood. In this study, the social vulnerability variable set is constructed from 17 variables, i.e., 12 independent variables representing socio-economic concepts and 5 dependent variables representing the damage and losses due to the Merapi eruption in 2010. These variables collectively represent the local situation of the study area, based on fieldwork conducted in September 2013. By using both independent and dependent variables, we can identify whether the social vulnerability is reflected in the actual situation, in this case the 2010 Merapi eruption. However, social vulnerability analysis in local communities involves a number of variables representing their socio-economic condition, and some of the variables employed in this study might be more or less redundant. Therefore, SOM is used to reduce the redundant variable(s) by selecting representative variables using the component planes and the correlation coefficients between variables, in order to find an effective sample size. The selected dataset was then clustered according to similarity. Finally, this approach can produce reliable estimates of clustering, recognize the most significant variables, and could be useful for social vulnerability assessment, especially for stakeholders as decision makers. This research was supported by a grant 'Development of Advanced Volcanic Disaster Response System considering
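
    The clustering step can be sketched with a minimal SOM: a grid of prototype vectors is trained so that nearby grid cells respond to similar samples, and each site is then assigned to its best-matching cell. This is an illustrative toy implementation; the grid size, learning-rate and neighborhood schedules are arbitrary demo choices, not the study's settings.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Self-Organizing Map: prototypes on a 2D grid are pulled
    toward each sample, weighted by a Gaussian over grid distance to the
    best-matching unit (BMU)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    W = rng.normal(size=(h, w, data.shape[1]))
    coords = np.dstack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"))
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)            # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5  # shrinking neighborhood
        for x in data:
            d = np.linalg.norm(W - x, axis=2)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=2)
                       / (2 * sigma ** 2))
            W += lr * g[..., None] * (x - W)
    return W

def assign_clusters(data, W):
    """Assign each sample to the index of its best-matching grid cell."""
    flatW = W.reshape(-1, W.shape[-1])
    return [int(np.argmin(np.linalg.norm(flatW - x, axis=1))) for x in data]
```

With well-separated socio-economic profiles, samples from different groups land in different grid cells, and the component planes of `W` can then be inspected for redundant variables.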

  4. Structure and Evolution of Mediterranean Forest Research: A Science Mapping Approach.

    PubMed

    Nardi, Pierfrancesco; Di Matteo, Giovanni; Palahi, Marc; Scarascia Mugnozza, Giuseppe

    2016-01-01

    This study aims at conducting the first science mapping analysis of Mediterranean forest research in order to elucidate its research structure and evolution. We applied a science mapping approach based on co-term and citation analyses to a set of scientific publications retrieved from Elsevier's Scopus database over the period 1980-2014. The Scopus search retrieved 2,698 research papers and reviews published by 159 peer-reviewed journals. The total number of publications was around 1% (N = 17) during the period 1980-1989 and reached 3% (N = 69) in the time slice 1990-1994. Since 1995, the number of publications increased exponentially, reaching 55% (N = 1,476) during the period 2010-2014. Within the thirty-four years considered, the retrieved publications were published by 88 countries. Among them, Spain was the most productive country, publishing 44% (N = 1,178) of total publications, followed by Italy (18%, N = 482) and France (12%, N = 336). These countries also host the ten most productive scientific institutions in terms of number of publications on Mediterranean forest subjects. Forest Ecology and Management and Annals of Forest Science were the most active journals in publishing research on Mediterranean forests. During the period 1980-1994, the research topics were poorly characterized, but they became better defined during the time slice 1995-1999. Since the 2000s, the clusters have become well defined by research topics. The current status of Mediterranean forest research (2009-2014) was represented by four clusters, in which different research topics such as biodiversity and conservation, land-use and degradation, climate change effects on ecophysiological responses, and soil were identified. Basic research in Mediterranean forest ecosystems is mainly conducted through ecophysiological research. Applied research was mainly represented by the land-use and degradation, biodiversity and conservation, and fire research topics. The citation analyses revealed highly

  5. Comparative Genomics and Association Mapping Approaches for Blast Resistant Genes in Finger Millet Using SSRs

    PubMed Central

    Babu, B. Kalyana; Dinesh, Pandey; Agrawal, Pawan K.; Sood, S.; Chandrashekara, C.; Bhatt, Jagadish C.; Kumar, Anil

    2014-01-01

    The major limiting factor for the production and productivity of the finger millet crop is blast disease caused by Magnaporthe grisea. Since genome sequence information available for finger millet is scarce, comparative genomics plays a very important role in the identification of genes/QTLs linked to blast resistance using SSR markers. In the present study, a total of 58 genic SSRs were developed for use in genetic analysis of a global collection of 190 finger millet genotypes. The 58 SSRs yielded ninety-five scorable alleles, and the polymorphism information content varied from 0.186 to 0.677 with an average of 0.385. The gene diversity was in the range of 0.208 to 0.726 with an average of 0.487. Association mapping for blast resistance was done using 104 SSR markers, which identified four QTLs for finger blast and one QTL for neck blast resistance. The genomic marker RM262 and the genic marker FMBLEST32 were linked to finger blast disease at a P value of 0.007 and explained phenotypic variances (R2) of 10% and 8%, respectively. The genomic marker UGEP81 was associated with finger blast at a P value of 0.009 and explained 7.5% of R2. The QTL for neck blast was associated with the genomic SSR marker UGEP18 at a P value of 0.01, which explained 11% of R2. Three QTLs for blast resistance were found in common by using both the GLM and MLM approaches. The resistant alleles were found to be present mostly in the exotic genotypes. Among the genotypes of the NW Himalayan region of India, VHC3997, VHC3996 and VHC3930 were found to be highly resistant and may be effectively used as parents for developing blast-resistant cultivars in the NW Himalayan region of India. The markers linked to the QTLs for blast resistance in the present study can be further used for cloning of the full-length genes, fine mapping, and marker-assisted breeding programmes for introgression of blast-resistant alleles into locally adapted cultivars. PMID:24915067
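
    The marker statistics quoted in this abstract follow standard formulas computed from allele frequencies: gene diversity (expected heterozygosity) is 1 - Σp_i², and the polymorphism information content (PIC) additionally subtracts the probability that two random individuals share the same heterozygous genotype. A small sketch of these standard definitions (not code from the study):

```python
def gene_diversity(freqs):
    """Nei's gene diversity (expected heterozygosity): 1 - sum(p_i^2),
    where freqs are allele frequencies at one marker locus."""
    return 1.0 - sum(p * p for p in freqs)

def pic(freqs):
    """Polymorphism information content (Botstein et al., 1980):
    1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    s2 = sum(p * p for p in freqs)
    cross = sum(2 * (freqs[i] ** 2) * (freqs[j] ** 2)
                for i in range(len(freqs))
                for j in range(i + 1, len(freqs)))
    return 1.0 - s2 - cross
```

For a biallelic SSR with equal allele frequencies (p = q = 0.5), gene diversity is 0.5 and PIC is 0.375, which is why PIC values sit slightly below gene diversity, as in the ranges reported above.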

  6. Structure and Evolution of Mediterranean Forest Research: A Science Mapping Approach

    PubMed Central

    Nardi, Pierfrancesco; Di Matteo, Giovanni; Palahi, Marc; Scarascia Mugnozza, Giuseppe

    2016-01-01

    This study aims at conducting the first science mapping analysis of Mediterranean forest research in order to elucidate its research structure and evolution. We applied a science mapping approach based on co-term and citation analyses to a set of scientific publications retrieved from Elsevier’s Scopus database over the period 1980–2014. The Scopus search retrieved 2,698 research papers and reviews published by 159 peer-reviewed journals. The total number of publications was around 1% (N = 17) during the period 1980–1989 and reached 3% (N = 69) in the time slice 1990–1994. Since 1995, the number of publications increased exponentially, reaching 55% (N = 1,476) during the period 2010–2014. Within the thirty-four years considered, the retrieved publications were published by 88 countries. Among them, Spain was the most productive country, publishing 44% (N = 1,178) of total publications, followed by Italy (18%, N = 482) and France (12%, N = 336). These countries also host the ten most productive scientific institutions in terms of number of publications on Mediterranean forest subjects. Forest Ecology and Management and Annals of Forest Science were the most active journals in publishing research on Mediterranean forests. During the period 1980–1994, the research topics were poorly characterized, but they became better defined during the time slice 1995–1999. Since the 2000s, the clusters have become well defined by research topics. The current status of Mediterranean forest research (2009–2014) was represented by four clusters, in which different research topics such as biodiversity and conservation, land-use and degradation, climate change effects on ecophysiological responses, and soil were identified. Basic research in Mediterranean forest ecosystems is mainly conducted through ecophysiological research. Applied research was mainly represented by the land-use and degradation, biodiversity and conservation, and fire research topics. The citation analyses

  7. Educational Approach to Seismic Risk Mitigation in Indian Himalayas -Hazard Map Making Workshops at High Schools-

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Oki, S.; Kimura, M.; Chadha, R. K.; Davuluri, S.

    2014-12-01

    How can we encourage people to take preventive measures against damage risks and empower them to take the right actions in emergencies to save their lives? The conventional approach taken by scientists had been disseminating intelligible information on up-to-date seismological knowledge. However, it has been proven that knowledge alone does not have enough impact to modify people's behaviors in emergencies (Oki and Nakayachi, 2012). On the other hand, the conventional approach taken by practitioners had been to conduct emergency drills at schools or workplaces. The loss of many lives in the 2011 Tohoku earthquake has proven that these emergency drills were not enough to save people's lives, unless they were empowered to assess the given situation on their own and react flexibly. Our challenge is to bridge the gap between knowledge and practice. With reference to best practices observed in Tohoku, such as The Miracles of Kamaishi, our endeavor is to design an effective Disaster Preparedness Education Program that is applicable to other disaster-prone regions in the world, even with different geological, socio-economical and cultural backgrounds. The key concepts for this new approach are 1) empowering individuals to take preventive actions to save their lives, 2) fostering community-based understanding of disaster risks and 3) building a sense of reality and relevancy to disasters. With these in mind, we held workshops at some high schools in the Lesser Himalayan Region, combining lectures with an activity called "Hazard Map Making" where students proactively identify and assess the hazards around their living areas and learn practical strategies on how to manage risks. We observed changes in the students' awareness by conducting a preliminary questionnaire survey and interviews after each session. 
Results strongly implied that the significant change of students' attitudes towards disaster preparedness occurred not by the lectures of scientific knowledge, but

  8. Multiconfigurational Time-Dependent Hartree Methods for Bosonic Systems: Theory and Applications

    NASA Astrophysics Data System (ADS)

    Alon, Ofir E.; Streltsov, Alexej I.; Sakmann, Kaspar; Cederbaum, Lorenz S.

    2013-02-01

    We review the multiconfigurational time-dependent Hartree method for bosons, which is a formally exact many-body theory for the propagation of the time-dependent Schrödinger equation of N interacting identical bosons. In this approach, the time-dependent many-boson wavefunction is written as a sum of all permanents assembled from M orthogonal orbitals, where both the expansion coefficients and the permanents (orbitals) themselves are time-dependent and determined according to the Dirac-Frenkel time-dependent variational principle. In this way, a much larger effective subspace of the many-boson Hilbert space can be spanned in practice, in contrast to multiconfigurational expansions with time-independent configurations. We also briefly discuss the extension of this method to bosonic mixtures and resonantly coupled bosonic atoms and molecules. Two applications in one dimension are presented: (i) the numerically exact solution of the time-dependent many-boson Schrödinger equation for the population dynamics in a repulsive bosonic Josephson junction is shown to deviate significantly from the predictions of the commonly used Gross-Pitaevskii equation and Bose-Hubbard model; and (ii) the many-body dynamics of a soliton train in an attractive Bose-Einstein condensate is shown to deviate substantially from the widely accepted predictions of the Gross-Pitaevskii mean-field theory.

  9. Chemical Genetics Approach Reveals Importance of cAMP and MAP Kinase Signaling to Lipid and Carotenoid Biosynthesis in Microalgae.

    PubMed

    Choi, Yoon-E; Rhee, Jin-Kyu; Kim, Hyun-Soo; Ahn, Joon-Woo; Hwang, Hyemin; Yang, Ji-Won

    2015-05-01

    In this study, we attempted to understand signaling pathways behind lipid biosynthesis by employing a chemical genetics approach based on small molecule inhibitors. Specific signaling inhibitors of MAP kinase or modulators of cAMP signaling were selected to evaluate the functional roles of each of the key signaling pathways in three different microalgal species: Chlamydomonas reinhardtii, Chlorella vulgaris, and Haematococcus pluvialis. Our results clearly indicate that cAMP signaling pathways are indeed positively associated with microalgal lipid biosynthesis. In contrast, MAP kinase pathways in three microalgal species are all negatively implicated in both lipid and carotenoid biosynthesis.

  10. Exotic Gauge Bosons in the 331 Model

    SciTech Connect

    Romero, D.; Ravinez, O.; Diaz, H.; Reyes, J.

    2009-04-30

    We analyze the bosonic sector of the 331 model, which contains exotic leptons, quarks and bosons (E, J, U, V), in order to satisfy SU(3)_L weak gauge invariance. We develop the Feynman rules of the entire kinetic bosonic sector, which will let us compute some of the Z0' decay modes.

  11. Andreev Reflection in Bosonic Condensates

    SciTech Connect

    Zapata, I.; Sols, F.

    2009-05-08

    We study the bosonic analog of Andreev reflection at a normal-superfluid interface where the superfluid is a boson condensate. We model the normal region as a zone where nonlinear effects can be neglected. Against the background of a decaying condensate, we identify a novel contribution to the current of reflected atoms. The group velocity of this Andreev reflected component differs from that of the normally reflected one. For a three-dimensional planar or two-dimensional linear interface Andreev reflection is neither specular nor conjugate.

  12. Concept Mapping: An Approach for Evaluating a Public Alternative School Program

    ERIC Educational Resources Information Center

    Streeter, Calvin L.; Franklin, Cynthia; Kim, Johnny S.; Tripodi, Stephen J.

    2011-01-01

    This article describes how concept mapping techniques were applied to evaluate the development of a solution-focused, public alternative school program. Concept Systems software was used to create 15 cluster maps based on statements generated from students, teachers, and school staff. In addition, pattern matches were analyzed to examine the…

  13. A Different Approach to Preparing Novakian Concept Maps: The Indexing Method

    ERIC Educational Resources Information Center

    Turan Oluk, Nurcan; Ekmekci, Güler

    2016-01-01

    People who claim that applying Novakian concept maps in Turkish is problematic base their arguments largely upon the structural differences between the English and Turkish languages. This study aims to introduce the indexing method to eliminate problems encountered in Turkish applications of Novakian maps and to share the preliminary results of…

  14. Kramers-map approach for stabilization of a hydrogen atom in a monochromatic field

    SciTech Connect

    Shepelyansky, D.L.

    1994-07-01

    The phenomenon of stabilization of highly excited states of a hydrogen atom in a strong monochromatic field is discussed. An approximate description of the dynamics, obtained by introducing the Kramers map, allows one to understand the main properties of this phenomenon through analogy with the Kepler map. The analogy between the stabilization and the channeling of particles in a crystal is also discussed.
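
    The Kepler map invoked here is a simple area-preserving map: at each perihelion passage the electron's (scaled) energy receives a kick depending on the field phase, and the phase then advances by the Kepler orbital period. An illustrative sketch in scaled atomic-unit form; the kick strength `k`, frequency `omega`, and initial conditions are arbitrary demo values, not parameters from the paper.

```python
import math

def kepler_map(E, phi, k=0.001, omega=1.0, steps=50):
    """Iterate the Kepler map:
        E_{n+1}   = E_n + k * sin(phi_n)
        phi_{n+1} = phi_n + 2*pi*omega*(-2*E_{n+1})**(-3/2)
    valid while the electron stays bound (E < 0)."""
    traj = [(E, phi)]
    for _ in range(steps):
        E = E + k * math.sin(phi)
        if E >= 0:  # ionization: the orbit is no longer bound
            break
        phi = (phi + 2 * math.pi * omega * (-2 * E) ** -1.5) % (2 * math.pi)
        traj.append((E, phi))
    return traj
```

For small kicks the energy performs a bounded random-walk-like motion and the atom stays bound; for larger `k` the map becomes chaotic and trajectories can diffuse to E ≥ 0 (ionization), which is the regime where stabilization effects are discussed.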

  15. A Constructivist Approach to Designing Computer Supported Concept-Mapping Environment

    ERIC Educational Resources Information Center

    Cheung, Li Siu

    2006-01-01

    In the past two decades, there has been a proliferation of research activities on exploring the use of concept maps to support teaching and learning of various knowledge disciplines which range from science to language subjects. MindNet, which is a collaborative concept mapping environment that supports both synchronous and asynchronous modes of…

  16. Ontology Mapping Neural Network: An Approach to Learning and Inferring Correspondences among Ontologies

    ERIC Educational Resources Information Center

    Peng, Yefei

    2010-01-01

    An ontology mapping neural network (OMNN) is proposed in order to learn and infer correspondences among ontologies. It extends the Identical Elements Neural Network (IENN)'s ability to represent and map complex relationships. The learning dynamics of simultaneous (interlaced) training of similar tasks interact at the shared connections of the…

  17. ConMap: Investigating New Computer-Based Approaches to Assessing Conceptual Knowledge Structure in Physics.

    ERIC Educational Resources Information Center

    Beatty, Ian D.

    There is a growing consensus among educational researchers that traditional problem-based assessments are not effective tools for diagnosing a student's knowledge state and for guiding pedagogical intervention, and that new tools grounded in the results of cognitive science research are needed. The ConMap ("Conceptual Mapping") project, described…

  18. An integrated approach to flood hazard assessment on alluvial fans using numerical modeling, field mapping, and remote sensing

    USGS Publications Warehouse

    Pelletier, J.D.; Mayer, L.; Pearthree, P.A.; House, P.K.; Demsey, K.A.; Klawon, J.K.; Vincent, K.R.

    2005-01-01

    Millions of people in the western United States live near the dynamic, distributary channel networks of alluvial fans, where flood behavior is complex and poorly constrained. Here we test a new comprehensive approach to alluvial-fan flood hazard assessment that uses four complementary methods: two-dimensional raster-based hydraulic modeling, satellite-image change detection, field-based mapping of recent flood inundation, and surficial geologic mapping. Each of these methods provides spatial detail lacking in the standard method and each provides critical information for a comprehensive assessment. Our numerical model simultaneously solves the continuity equation and Manning's equation (Chow, 1959) using an implicit numerical method. It provides a robust numerical tool for predicting flood flows using the large, high-resolution Digital Elevation Models (DEMs) necessary to resolve the numerous small channels on the typical alluvial fan. Inundation extents and flow depths of historic floods can be reconstructed with the numerical model and validated against field- and satellite-based flood maps. A probabilistic flood hazard map can also be constructed by modeling multiple flood events with a range of specified discharges. This map can be used in conjunction with a surficial geologic map to further refine floodplain delineation on fans. To test the accuracy of the numerical model, we compared model predictions of flood inundation and flow depths against field- and satellite-based flood maps for two recent extreme events on the southern Tortolita and Harquahala piedmonts in Arizona. Model predictions match the field- and satellite-based maps closely. Probabilistic flood hazard maps based on the 10 yr, 100 yr, and maximum floods were also constructed for the study areas using stream gage records and paleoflood deposits. The resulting maps predict spatially complex flood hazards that strongly reflect small-scale topography and are consistent with surficial geology. In
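
    The hydraulic core of such models couples continuity with Manning's equation, v = (1/n) R^(2/3) S^(1/2) in SI units. A minimal sketch of that relation (illustrative only; the authors' model solves the coupled equations implicitly on a 2D raster, which is not reproduced here, and the function names are invented):

```python
def manning_velocity(n, R, S):
    """Mean flow velocity from Manning's equation (SI units):
    v = (1/n) * R**(2/3) * S**(1/2), with roughness coefficient n,
    hydraulic radius R (m), and energy slope S (dimensionless)."""
    return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

def discharge(n, depth, width, S):
    """Discharge Q = v * A for a wide rectangular channel, approximating
    the hydraulic radius by the flow depth (valid when width >> depth)."""
    v = manning_velocity(n, depth, S)
    return v * depth * width
```

In a raster model, each cell's depth is updated by continuity while Manning's relation supplies the inter-cell fluxes; the sketch above only shows the flux side of that coupling.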

  19. Mapping and monitoring cropland burning in European Russia: a multi-sensor approach

    NASA Astrophysics Data System (ADS)

    Hall, J.; Loboda, T. V.; Mccarty, G.; McConnell, L.; Woldemariam, T.

    2013-12-01

    Short-lived aerosols and pollutants transported from high northern latitudes have amplified the short-term warming in the Arctic region. Specifically, black carbon (BC) is recognized as the second most important human emission with regard to climate forcing, behind carbon dioxide, with a total climate forcing of +1.1 W m-2. Early studies have suggested that cropland burning may be a major contributor to the BC emissions directly deposited above the Arctic Circle. However, accurate monitoring of cropland burning with existing active fire and burned area products is limited: most existing algorithms are focused on mapping hotter and larger wildfire events. The timing of cropland burning differs from wildfire events, and their transient nature adds a further challenge to product development. In addition, an analysis of multi-year cloud cover over Russian croplands using Moderate Resolution Imaging Spectroradiometer (MODIS) daily surface reflectance data showed that, on average, early-afternoon observations from MODIS/Aqua provided 68 clear views per growing period (defined as 1 March 2003 - 30 November 2012), with a range from 30 to 101 clear views, whereas MODIS/Terra provided 75 clear views per growing period (defined as 1 March 2001 - 30 November 2012), with a range from 37 to 113 clear views. Here we present a new approach to burned area mapping in croplands from satellite imagery. Our algorithm is designed to detect burned area only within croplands and is not required to perform well outside them. The algorithm focuses on tracking the natural intra-annual development curve specific to crops rather than natural vegetation, and works by identifying the subtle spectral nuances between varieties of cropland field categories. 
Using a combination of the high visual accuracy from very high resolution (VHR, defined as spatial resolution < 5m) imagery and the temporal trend of MODIS data, we are able to differentiate between burned and plowed

  20. Water-sanitation-hygiene mapping: an improved approach for data collection at local level.

    PubMed

    Giné-Garriga, Ricard; de Palencia, Alejandro Jiménez-Fernández; Pérez-Foguet, Agustí

    2013-10-01

    Strategic planning and appropriate development and management of water and sanitation services are strongly supported by accurate and accessible data. If adequately exploited, these data might assist water managers with performance monitoring, benchmarking comparisons, policy progress evaluation, resources allocation, and decision making. A variety of tools and techniques are in place to collect such information. However, some methodological weaknesses arise when developing an instrument for routine data collection, particularly at the local level: i) comparability problems due to heterogeneity of indicators, ii) poor reliability of collected data, iii) inadequate combination of different information sources, and iv) statistical validity of produced estimates when disaggregated into small geographic subareas. This study proposes an improved approach for water, sanitation and hygiene (WASH) data collection at the decentralised level in low-income settings, as an attempt to overcome previous shortcomings. The ultimate aim is to provide local policymakers with strong evidence to inform their planning decisions. The survey design takes Water Point Mapping (WPM) as a starting point to record all available water sources at a particular location. This information is then linked to data produced by a household survey. Different survey instruments are implemented to collect reliable data by employing a variety of techniques, such as structured questionnaires, direct observation and water quality testing. The collected data is finally validated through simple statistical analysis, which in turn produces valuable outputs that might feed into the decision-making process. In order to demonstrate the applicability of the method, outcomes produced from three different case studies (Homa Bay District, Kenya; Kibondo District, Tanzania; and the Municipality of Manhiça, Mozambique) are presented.

  1. A Geospatial Approach to Mapping Bioenergy Potential of Perennial Crops in North American Tallgrass Prairie

    NASA Astrophysics Data System (ADS)

    Wang, S.; Fritschi, F. B.; Stacy, G.

    2009-12-01

    Biomass is the largest source of renewable energy in the United States and is expected to replace 30% of domestic petroleum consumption by 2030. Corn ethanol currently constitutes 99% of the country's biofuels. Extended annual crop planting for biofuel production, however, has raised concerns about long-term environmental, ecological and socio-economic consequences. More sustainable bioenergy resources might therefore be developed to meet energy demand, food security and climate policy goals. The DOE has identified switchgrass (Panicum virgatum L.) as a model bioenergy crop. Switchgrass, along with other warm-season grasses, is native to the pre-colonial tallgrass prairie of North America. This study maps the spatial distributions of prairie grasses and marginal croplands in the tallgrass prairie with remote sensing and GIS techniques. For 2000-2008, 8-day composite MODIS imagery was downloaded to calculate the normalized difference vegetation index (NDVI). With the pixel-level temporal trajectory of NDVI, time-series trend analysis was performed to identify native prairie grasses based on their phenological uniqueness. In a case study in southwest Missouri, this trajectory approach distinguished more than 80% of warm-season prairie grasslands from row crops and cool-season pastures (Figure 1). Warm-season grasses dominated the 19 public prairies in the study area, in a range of 45-98%. This study explores the geographic context of current and potential perennial bioenergy supplies in the tallgrass prairie. Beyond the current findings, it holds promise for further investigations to provide quantitative economic and environmental information to assist bioenergy policy decision-making. Figure 1: The distribution of grasslands in the study area. "WSG", "CSG" and "non-grass" represent warm-season prairie grasses, introduced cool-season grasses, and crops and other non-grasses.
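
    The NDVI trajectory idea can be sketched simply: NDVI = (NIR - red) / (NIR + red) per pixel, and warm-season grasses are separated from cool-season cover by when their NDVI peaks in the year. The peak-timing thresholds below are hypothetical illustrations, not calibrated values from the study.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one observation."""
    return (nir - red) / (nir + red)

def classify_phenology(ndvi_series, day_of_year, warm_peak=(170, 260)):
    """Toy phenology classifier: warm-season prairie grasses green up late
    and peak in mid/late summer; cool-season grasses and many crops peak
    earlier.  `warm_peak` day-of-year bounds are illustrative only."""
    peak_day = day_of_year[ndvi_series.index(max(ndvi_series))]
    return "WSG" if warm_peak[0] <= peak_day <= warm_peak[1] else "CSG/other"
```

A real trajectory analysis would fit the full seasonal curve per pixel across years rather than using a single peak date, but the peak timing is the phenological signature the sketch relies on.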

  2. A new approach to the mapping of the equatorial neutral wind field

    NASA Astrophysics Data System (ADS)

    Meriwether, John; Makela, Jonathan J.; Navarro, Luis; Harding, Brian; Milla, Marco

Increased information about the spatial structure of thermospheric winds may be retrieved by combining Doppler shift observations from multiple Fabry-Perot interferometer (FPI) observatories. In this paper we present examples of results obtained for a network of three FPIs located in central Peru at Jicamarca, Nazca, and Arequipa. These results are based upon the application of a second-order Taylor series expansion of the zonal and meridional wind components as a model of the thermospheric wind field over the latitudinal span of 10°S to 20°S. The Doppler shift data are analyzed with the singular value decomposition (SVD) algorithm to determine the model parameters. Results of the model fits are compared with the zonal and meridional winds observed at six common-volume locations in the thermosphere at 250 km height, and good agreement was found, indicating a successful application of the SVD analysis. One example found from inspection of the maps produced with this approach shows an area of weak winds near 1-2 LT that moves southward as an entity through the region 10°S-20°S. The cause of this 'null zone' in the thermospheric wind field is proposed to be the balancing of the eastward day-to-night pressure gradient with the westward pressure gradient of the pressure bulge (associated with the midnight temperature maximum) as it propagates through the equatorial thermosphere from southwest to northeast. Further discussion about alternative basis functions that might be used in this analysis is provided.
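
The SVD fit described above can be sketched as follows: each FPI line-of-sight Doppler measurement is a projection of the (zonal, meridional) wind onto the look direction, and each wind component is modelled by a second-order Taylor expansion in the horizontal offsets. The coefficient values, observation geometry and azimuth convention below are invented for illustration; only the fitting structure follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def basis(dx, dy):
    """Second-order Taylor basis for one wind component."""
    return np.array([1.0, dx, dy, dx * dx, dx * dy, dy * dy])

# Invented expansion coefficients for zonal (u) and meridional (v) winds
cu = np.array([80.0, -5.0, 3.0, 0.5, -0.2, 0.1])
cv = np.array([10.0, 2.0, -4.0, 0.1, 0.3, -0.2])

# Synthetic line-of-sight Doppler observations from many look directions
n = 200
dx, dy = rng.uniform(-5, 5, n), rng.uniform(-5, 5, n)
az = rng.uniform(0, 2 * np.pi, n)        # azimuth of each look direction
B = np.array([basis(x, y) for x, y in zip(dx, dy)])
los = np.sin(az) * (B @ cu) + np.cos(az) * (B @ cv)

# Design matrix stacking both components, solved via the SVD pseudoinverse
A = np.hstack([np.sin(az)[:, None] * B, np.cos(az)[:, None] * B])
coef = np.linalg.pinv(A) @ los           # SVD-based least squares
print(np.allclose(coef, np.concatenate([cu, cv]), atol=1e-6))  # True
```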

  3. Coastal system mapping: a new approach to formalising and conceptualising the connectivity of large-scale coastal systems

    NASA Astrophysics Data System (ADS)

    French, J.; Burningham, H.; Whitehouse, R.

    2010-12-01

    The concept of the coastal sediment cell has proved invaluable as a basis for estimating sediment budgets and as a framework for coastal management. However, whilst coastal sediment cells are readily identified on compartmentalised coastlines dominated by beach-grade material, the cell concept is less suited to handling broader linkages between estuarine, coastal and offshore systems, and for incorporating longer-range suspended sediment transport. We present a new approach to the conceptualisation of large-scale coastal geomorphic systems based on a hierarchical classification of component landforms and management interventions and mapping of the interactions between them. Coastal system mapping is founded on a classification that identifies high-level landform features, low-level landform elements and engineering interventions. Geomorphic features define the large-scale organisation of a system and include landforms that define gross coastal configuration (e.g. headland, bay) as well as fluvial, estuarine and offshore sub-systems that exchange sediment with and influence the open coast. Detailed system structure is mapped out with reference to a larger set of geomorphic elements (e.g. cliff, dune, beach ridge). Element-element interactions define cross-shore linkages (conceptualised as hinterland, backshore and foreshore zones) and alongshore system structure. Both structural and non-structural engineering interventions are also represented at this level. Element-level mapping is rationalised to represent alongshore variation using as few elements as possible. System linkages include both sediment transfer pathways and influences not associated with direct mass transfer (e.g. effect of a jetty at an inlet). A formal procedure for capturing and graphically representing coastal system structure has been developed around free concept mapping software, CmapTools (http://cmap.ihmc.us). Appended meta-data allow geographic coordinates, data, images and literature

  4. Dynamics of open bosonic quantum systems in coherent state representation

    SciTech Connect

    Dalvit, D. A. R.; Berman, G. P.; Vishik, M.

    2006-01-15

    We consider the problem of decoherence and relaxation of open bosonic quantum systems from a perspective alternative to the standard master equation or quantum trajectories approaches. Our method is based on the dynamics of expectation values of observables evaluated in a coherent state representation. We examine a model of a quantum nonlinear oscillator with a density-density interaction with a collection of environmental oscillators at finite temperature. We derive the exact solution for dynamics of observables and demonstrate a consistent perturbation approach.

  5. Collaborative and multilingual approach to learn database topics using concept maps.

    PubMed

    Arruarte, Ana; Calvo, Iñaki; Elorriaga, Jon A; Larrañaga, Mikel; Conde, Angel

    2014-01-01

The authors report on a study using the concept mapping technique in computer engineering education for learning theoretical introductory database topics. In addition, the experiment also pursues the learning of multilingual technical terminology by means of the collaborative drawing of a concept map. The main characteristics of a study carried out in the database subject at the University of the Basque Country during the 2011/2012 academic year are described. This study contributes to the field of concept mapping, as these cognitive tools have proved effective in supporting learning in computer engineering education. It also contributes to the field of computer engineering education by providing a technique that can be incorporated for several educational purposes within the discipline. Results reveal the potential that a collaborative concept map editor offers to fulfil the above-mentioned objectives.

  6. A National Approach for Mapping and Quantifying Habitat-based Biodiversity Metrics Across Multiple Spatial Scales

    EPA Science Inventory

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national inte...

  7. A National Approach to Quantify and Map Biodiversity Conservation Metrics within an Ecosystem Services Framework

    EPA Science Inventory

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have be...

  8. Use of linkage disequilibrium approaches to map genes for bipolar disorder in the Costa Rican population

    SciTech Connect

    Escamilla, M.A.; Reus, V.I.; Smith, L.B.; Freimer, N.B.

    1996-05-31

    Linkage disequilibrium (LD) analysis provides a powerful means for screening the genome to map the location of disease genes, such as those for bipolar disorder (BP). As described in this paper, the population of the Central Valley of Costa Rica, which is descended from a small number of founders, should be suitable for LD mapping; this assertion is supported by reconstruction of extended haplotypes shared by distantly related individuals in this population suffering low-frequency hearing loss (LFHL1), which has previously been mapped by linkage analysis. A sampling strategy is described for applying LD methods to map genes for BP, and clinical and demographic characteristics of an initially collected sample are discussed. This sample will provide a complement to a previously collected set of Costa Rican BP families which is under investigation using standard linkage analysis. 42 refs., 4 figs., 2 tabs.

  9. AFSM sequencing approach: a simple and rapid method for genome-wide SNP and methylation site discovery and genetic mapping

    PubMed Central

    Xia, Zhiqiang; Zou, Meiling; Zhang, Shengkui; Feng, Binxiao; Wang, Wenquan

    2014-01-01

We describe methods for the assessment of amplified-fragment single nucleotide polymorphism and methylation (AFSM) sites using a quick and simple molecular marker-assisted breeding strategy based on the use of two restriction enzyme pairs (EcoRI-MspI and EcoRI-HpaII) and a next-generation sequencing platform. Two sets of 85 adapter pairs were developed to concurrently identify SNPs, indels and methylation sites for the 85 lines of a cassava population in this study. In addition to SNPs and indels, the simplicity of the AFSM protocol makes it particularly suitable for high-throughput full-methylation and hemi-methylation analyses. To further demonstrate the ease of this approach, a cassava genetic linkage map was constructed. This approach should be widely applicable for genetic mapping in a variety of organisms and will improve the application of crop genomics in assisted breeding. PMID:25466435

  10. Mapping land cover from satellite images: A basic, low cost approach

    NASA Technical Reports Server (NTRS)

    Elifrits, C. D.; Barney, T. W.; Barr, D. J.; Johannsen, C. J.

    1978-01-01

    Simple, inexpensive methodologies developed for mapping general land cover and land use categories from LANDSAT images are reported. One methodology, a stepwise, interpretive, direct tracing technique was developed through working with university students from different disciplines with no previous experience in satellite image interpretation. The technique results in maps that are very accurate in relation to actual land cover and relative to the small investment in skill, time, and money needed to produce the products.

  11. Bosonization, coherent states and semiclassical quantum Hall skyrmions.

    PubMed

    Dutta, Sreedhar B; Shankar, R

    2008-07-09

We bosonize the (2+1)-dimensional fermionic theory using coherent states. The gauge-invariant subspace of the boson-Chern-Simons Hilbert space is mapped to the fermionic Hilbert space. This subspace is then equipped with a coherent state basis. These coherent states are labelled by a dynamic spinor field. The label manifold can be assigned a physical meaning in terms of density and spin density. A path-integral representation of the evolution operator in terms of these physical variables is given. The corresponding classical theory, when restricted to the lowest Landau level (LLL), is described by spin fluctuations alone and is found to be the nonlinear sigma model (NLSM) with a Hopf term. The formalism developed here is suitable for studying quantum Hall skyrmions semiclassically and/or beyond the hydrodynamic limit. The effects of Landau level mixing or the presence of slowly varying external fields can also be easily incorporated.

  12. The Translation Invariant Massive Nelson Model: III. Asymptotic Completeness Below the Two-Boson Threshold

    NASA Astrophysics Data System (ADS)

    Dybalski, Wojciech; Møller, Jacob Schach

    2015-11-01

We show asymptotic completeness of two-body scattering for a class of translation invariant models describing a single quantum particle (the electron) linearly coupled to a massive scalar field (bosons). Our proof is based on a recently established Mourre estimate for these models. In contrast to previous approaches, it requires no number cutoff, no restriction on the particle-field coupling strength, and no restriction on the magnitude of total momentum. Energy, however, is restricted by the two-boson threshold, admitting only scattering of a dressed electron and a single asymptotic boson. The class of models we consider includes the UV-cutoff Nelson and polaron models.

  13. Binding Ensemble PROfiling with (F)photoaffinity Labeling (BEProFL) Approach: Mapping the Binding Poses of HDAC8 Inhibitors

    PubMed Central

    He, Bai; Velaparthi, Subash; Pieffet, Gilles; Pennington, Chris; Mahesh, Aruna; Holzle, Denise L.; Brunsteiner, Michael; van Breemen, Richard; Blond, Sylvie Y.; Petukhov, Pavel A.

    2009-01-01

    A Binding Ensemble PROfiling with (F)photoaffinity Labeling (BEProFL) approach that utilizes photolabeling of HDAC8 with a probe containing a UV-activated aromatic azide, mapping the covalent modifications by liquid chromatography-tandem mass-spectrometry, and a computational method to characterize the multiple binding poses of the probe is described. Using the BEProFL approach two distinct binding poses of the HDAC8 probe were identified. The data also suggest that an “upside-down” pose with the surface binding group of the probe bound in an alternative pocket near the catalytic site may contribute to the binding. PMID:19886628

  14. Isotopomer mapping approach to determine N_{2}O production pathways and N_{2}O reduction

    NASA Astrophysics Data System (ADS)

    Lewicka-Szczebak, Dominika; Well, Reinhard; Cardenas, Laura; Bol, Roland

    2016-04-01

Stable isotopomer analyses of soil-emitted N2O (δ15N, δ18O and SP = 15N site preference within the linear N2O molecule) may help to distinguish N2O production pathways and to quantify N2O reduction to N2. Different N2O-forming processes are characterised by distinct isotopic characteristics. Bacterial denitrification shows significantly lower SP and δ18O values when compared to fungal denitrification and nitrification processes. But SP and δ18O values are also altered during N2O reduction to N2, when the residual N2O is enriched in 18O and centrally located 15N, resulting in increased δ18O and SP values. Hence, the interpretation of these isotope characteristics is not straightforward, because higher δ18O and SP values may be due to admixture of N2O from fungal denitrification or nitrification, or due to N2O reduction to N2. One of these processes, either admixture or reduction, can be quite well quantified if the other one is determined with independent methods. But usually both processes are unknown, and the ability to estimate both of them simultaneously would be very beneficial. Here we present an attempt to determine both the admixture and the reduction simultaneously using isotopomer mapping, i.e. the relation between δ18O and SP. The measured sample points are typically situated between two lines: a reduction line with a typical slope of about 0.35 and a mixing line with a higher slope of about 0.8. Combining the reduction and the mixing vector allows for the determination of both processes based on the location of the sample point between the lines. We tested this new approach on laboratory incubation studies, where a reference method for N2O reduction quantification was applied, i.e. the 15N gas flux method or incubations in a He atmosphere. This allowed us to check how well the calculated amounts of N2O reduction agree with the results provided by the reference method. The general trend was quite well reflected in our calculated results, however, quite
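
Geometrically, the decomposition above amounts to expressing a sample's displacement from a starting endmember in the (δ18O, SP) plane as a linear combination of a mixing vector (slope ~0.8) and a reduction vector (slope ~0.35), then solving a 2×2 linear system. The endmember values below are invented for illustration; only the two slopes come from the abstract.

```python
import numpy as np

# Illustrative bacterial-denitrification endmember (permil)
d18O_bd, sp_bd = 15.0, -5.0

# Unit steps along the mixing (slope ~0.8) and reduction (slope ~0.35) lines
v_mix = np.array([1.0, 0.8])
v_red = np.array([1.0, 0.35])

def decompose(d18O, sp):
    """Express a sample as endmember + a*mixing_step + r*reduction_step."""
    M = np.column_stack([v_mix, v_red])
    a, r = np.linalg.solve(M, np.array([d18O - d18O_bd, sp - sp_bd]))
    return a, r

# A sample displaced by 4 mixing steps and 6 reduction steps
sample = np.array([d18O_bd, sp_bd]) + 4 * v_mix + 6 * v_red
a, r = decompose(*sample)
print(round(a, 6), round(r, 6))  # 4.0 6.0
```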

  15. Visualizing the Topical Structure of the Medical Sciences: A Self-Organizing Map Approach

    PubMed Central

    Skupin, André; Biberstine, Joseph R.; Börner, Katy

    2013-01-01

    Background We implement a high-resolution visualization of the medical knowledge domain using the self-organizing map (SOM) method, based on a corpus of over two million publications. While self-organizing maps have been used for document visualization for some time, (1) little is known about how to deal with truly large document collections in conjunction with a large number of SOM neurons, (2) post-training geometric and semiotic transformations of the SOM tend to be limited, and (3) no user studies have been conducted with domain experts to validate the utility and readability of the resulting visualizations. Our study makes key contributions to all of these issues. Methodology Documents extracted from Medline and Scopus are analyzed on the basis of indexer-assigned MeSH terms. Initial dimensionality is reduced to include only the top 10% most frequent terms and the resulting document vectors are then used to train a large SOM consisting of over 75,000 neurons. The resulting two-dimensional model of the high-dimensional input space is then transformed into a large-format map by using geographic information system (GIS) techniques and cartographic design principles. This map is then annotated and evaluated by ten experts stemming from the biomedical and other domains. Conclusions Study results demonstrate that it is possible to transform a very large document corpus into a map that is visually engaging and conceptually stimulating to subject experts from both inside and outside of the particular knowledge domain. The challenges of dealing with a truly large corpus come to the fore and require embracing parallelization and use of supercomputing resources to solve otherwise intractable computational tasks. Among the envisaged future efforts are the creation of a highly interactive interface and the elaboration of the notion of this map of medicine acting as a base map, onto which other knowledge artifacts could be overlaid. PMID:23554924
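
A minimal SOM trainer conveys the core of the method: find the best-matching unit (BMU) for each input, then pull the BMU and its grid neighbours toward the input with decaying learning rate and neighbourhood radius. This toy (a 10×10 grid on synthetic 5-dimensional "document vectors") is far from the paper's 75,000-neuron map and says nothing about its parallelization; all parameters here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def bmu(weights, x):
    """Grid position of the best-matching unit for input vector x."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

def train_som(data, rows=10, cols=10, iters=2000, lr0=0.5, sigma0=3.0):
    """Minimal rectangular-grid SOM with linearly decaying parameters."""
    weights = rng.random((rows, cols, data.shape[1]))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                indexing="ij"), axis=-1).astype(float)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        b = bmu(weights, x)
        frac = t / iters
        lr = lr0 * (1 - frac)
        sigma = sigma0 * (1 - frac) + 0.5
        # Gaussian neighbourhood around the BMU on the 2-D grid
        h = np.exp(-np.sum((grid - np.array(b)) ** 2, axis=2) / (2 * sigma**2))
        weights += lr * h[:, :, None] * (x - weights)
    return weights

# Two well-separated clusters of 5-dim "document vectors"
data = np.vstack([rng.normal(0.0, 0.05, (50, 5)),
                  rng.normal(1.0, 0.05, (50, 5))])
w = train_som(data)
# The two cluster centres should map to different neurons
print(bmu(w, data[:50].mean(axis=0)) != bmu(w, data[50:].mean(axis=0)))
```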

  16. Impact of a concept map teaching approach on nursing students' critical thinking skills.

    PubMed

    Kaddoura, Mahmoud; Van-Dyke, Olga; Yang, Qing

    2016-09-01

Nurses confront complex problems and decisions that require critical thinking in order to identify patient needs and implement best practices. An active strategy for teaching students the skills to think critically is the concept map. This study explores the development of critical thinking among nursing students in a required pathophysiology and pharmacology course during the first year of a Bachelor of Science in Nursing in response to concept mapping as an interventional strategy, using the Health Education Systems, Incorporated critical thinking test. A two-group experimental study with a pretest and posttest design was used. Participants were randomly divided into a control group (n = 42) taught by traditional didactic lecturing alone, and an intervention group (n = 41) taught by traditional didactic lecturing with concept mapping. Students in the concept mapping group performed significantly better on the Health Education Systems, Incorporated test than students in the control group. It is recommended that deans, program directors, and nursing faculties evaluate their curricula to integrate concept map teaching strategies in courses in order to develop critical thinking abilities in their students.

  17. Remote sensing approach to map riparian vegetation of the Colorado River Ecosystem, Grand Canyon area, Arizona

    NASA Astrophysics Data System (ADS)

    Nguyen, U.; Glenn, E.; Nagler, P. L.; Sankey, J. B.

    2015-12-01

Riparian zones in the southwestern U.S. are usually a mosaic of vegetation types at varying states of succession in response to past floods or droughts. Human impacts also affect riparian vegetation patterns. Human-induced changes include the introduction of exotic species, diversion of water for human use, channelization of the river to protect property, and other land use changes that can lead to deterioration of the riparian ecosystem. This study explored the use of remote sensing to map an iconic stretch of the Colorado River in Grand Canyon National Park, Arizona. The pre-dam riparian zone in the Grand Canyon was affected by annual floods from spring run-off from the watersheds of the Green River, the Colorado River and the San Juan River. A pixel-based vegetation map of the riparian zone in the Grand Canyon, Arizona, was produced from high-resolution aerial imagery. The map was calibrated and validated with ground survey data. A seven-step image processing and classification procedure was developed based on a suite of vegetation indices and classification subroutines available in ENVI Image Processing and Analysis software. The result was a quantitative species-level vegetation map that could be more accurate than the qualitative, polygon-based maps presently used on the Lower Colorado River. The dominant woody species in the Grand Canyon are now saltcedar, arrowweed and mesquite, reflecting stress-tolerant forms adapted to the altered flow regimes associated with river regulation.

  18. Microzonation Mapping Of The Yanbu Industrial City, Western Saudi Arabia: A Multicriteria Decision Analysis Approach

    NASA Astrophysics Data System (ADS)

    Moustafa, Sayed, Sr.; Alarifi, Nassir S.; Lashin, Aref A.

    2016-04-01

Urban areas along the western coast of Saudi Arabia are susceptible to natural disasters and environmental damage due to lack of planning. To produce a site-specific microzonation map of the rapidly growing Yanbu industrial city, the spatial distributions of different hazard entities are assessed using the Analytic Hierarchy Process (AHP) together with a Geographical Information System (GIS). For this purpose six hazard parameter layers are considered, namely: fundamental frequency, site amplification, soil strength in terms of effective shear-wave velocity, overburden sediment thickness, seismic vulnerability index, and peak ground acceleration. The weight and rank values determined during the AHP are assigned to each layer and its corresponding classes, respectively. An integrated seismic microzonation map was derived using the GIS platform. Based on the derived map, the study area is classified into five hazard categories: very low, low, moderate, high, and very high. The western and central parts of the study area, as indicated by the derived microzonation map, are categorized as a high hazard zone compared to other surrounding places. The produced microzonation map of the current study is envisaged as a first-level assessment of the site-specific hazards in the Yanbu city area, which can be used as a platform by different stakeholders in any future land-use planning and environmental hazard management.
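
The AHP weighting step works from a pairwise comparison matrix on Saaty's 1-9 scale: the normalised principal eigenvector gives the layer weights, and the consistency ratio checks that the judgments are coherent. The 3×3 matrix below (for three hypothetical hazard layers) is invented for illustration; the study uses six layers and its own judgments.

```python
import numpy as np

# Hypothetical pairwise comparisons for three hazard layers on a 1-9 scale
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                      # AHP weights: normalised principal eigenvector

# Consistency index and ratio (random index RI = 0.58 for a 3x3 matrix)
ci = (vals[k].real - len(A)) / (len(A) - 1)
cr = ci / 0.58                    # judgments acceptable if CR < 0.1
print(np.round(w, 3), round(cr, 3))
```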

  19. Segmentation and automated measurement of chronic wound images: probability map approach

    NASA Astrophysics Data System (ADS)

    Ahmad Fauzi, Mohammad Faizal; Khansa, Ibrahim; Catignani, Karen; Gordillo, Gayle; Sen, Chandan K.; Gurcan, Metin N.

    2014-03-01

An estimated 6.5 million patients in the United States are affected by chronic wounds, with more than 25 billion US dollars and countless hours spent annually on all aspects of chronic wound care. There is a need to develop software tools to analyze wound images that characterize wound tissue composition, measure wound size, and monitor changes over time. This process, when done manually, is time-consuming and subject to intra- and inter-reader variability. In this paper, we propose a method that can characterize chronic wounds containing granulation, slough and eschar tissues. First, we generate a Red-Yellow-Black-White (RYKW) probability map, which then guides the region growing segmentation process. The red, yellow and black probability maps are designed to handle the granulation, slough and eschar tissues, respectively, while the white probability map is designed to detect the white label card used for measurement calibration. The innovative aspects of this work include: 1) definition of a wound-characteristics-specific probability map for segmentation; 2) computationally efficient region growing on a 4D map; 3) auto-calibration of measurements with the content of the image. The method was applied to 30 wound images provided by the Ohio State University Wexner Medical Center, with the ground truth independently generated by the consensus of two clinicians. While the inter-reader agreement between the readers is 85.5%, the computer achieves an accuracy of 80%.
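
The probability-map-guided segmentation can be illustrated with a stripped-down region grower: starting from a high-probability seed, 4-connected pixels are added while their probability stays above a threshold. The synthetic single-channel map and threshold below are invented; the paper's method operates on the full 4D RYKW map.

```python
import numpy as np
from collections import deque

def region_grow(prob, seed, thresh=0.5):
    """Grow a region from a seed, adding 4-connected pixels whose
    probability exceeds a threshold (simplified single-map version)."""
    h, w = prob.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    q = deque([seed])
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < h and 0 <= cc < w
                    and not mask[rr, cc] and prob[rr, cc] >= thresh):
                mask[rr, cc] = True
                q.append((rr, cc))
    return mask

# Toy "red" (granulation) probability map: a bright disc on a dark background
yy, xx = np.mgrid[0:64, 0:64]
prob = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 10.0 ** 2))
seed = np.unravel_index(np.argmax(prob), prob.shape)
mask = region_grow(prob, seed)
print(mask[32, 32], mask[0, 0])  # True False
```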

  20. Physics of W bosons at LEP

    NASA Astrophysics Data System (ADS)

    Mele, Salvatore

    2004-12-01

The high-energy and high-luminosity data-taking campaigns of the LEP e+e- collider provided the four collaborations, ALEPH, DELPHI, L3 and OPAL, with about 50000 W-boson pairs and about a thousand singly produced W bosons. This unique data sample has an unprecedented reach in probing some aspects of the Standard Model of the electroweak interactions, and this article reviews several achievements in the understanding of W-boson physics at LEP. The measurements of the cross-sections for W-boson production are discussed, together with their implication for the existence of the coupling between Z and W bosons. The precision measurements of the magnitude of triple gauge-boson couplings are presented. The observation of the longitudinal helicity component of the W-boson spin, related to the mechanism of electroweak symmetry breaking, is described together with the techniques used to probe the CP and CPT symmetries in the W-boson system. A discussion of the intricacies of the measurement of the mass of the W boson, whose knowledge is indispensable to test the internal consistency of the Standard Model and estimate the mass of the Higgs boson, concludes this review.

  1. Microscopic formulation of the interacting boson model for rotational nuclei

    SciTech Connect

    Nomura, Kosuke; Shimizu, Noritaka; Otsuka, Takaharu; Guo, Lu

    2011-04-15

We propose a novel formulation of the interacting boson model (IBM) for rotational nuclei with axially symmetric, strong deformation. The intrinsic structure represented by the potential-energy surface (PES) of a given multinucleon system has a certain similarity to that of the corresponding multiboson system. Based on this feature, one can derive an appropriate boson Hamiltonian, as already reported. This prescription, however, has a major difficulty for the rotational spectra of strongly deformed nuclei: the bosonic moment of inertia is significantly smaller than the corresponding nucleonic one. We show that this difficulty originates in the difference between the rotational response of a nucleon system and that of the corresponding boson system, and could arise even if the PESs of the two systems were identical. We further suggest that the problem can be solved by implementing the L·L term in the IBM Hamiltonian, with the coupling constant derived from the cranking approach of Skyrme mean-field models. The validity of the method is confirmed for rare-earth and actinoid nuclei, as their experimental rotational yrast bands are reproduced nicely.

  2. Thermal phase transition for some spin-boson models

    NASA Astrophysics Data System (ADS)

    Aparicio Alcalde, M.; Pimentel, B. M.

    2013-09-01

In this work we study two different spin-boson models. Such models are generalizations of the Dicke model; that is, they describe systems of N identical two-level atoms coupled to a single-mode quantized bosonic field, assuming the rotating wave approximation. In the first model, we consider the wavelength of the bosonic field to be of the order of the linear dimension of the material composed of the atoms, and therefore we take the spatial sinusoidal form of the bosonic field into account. The second model is the Thompson model, in which we consider the presence of phonons in the material composed of the atoms. We study finite-temperature properties of the models using the path integral approach and functional methods. In the thermodynamic limit, N→∞, the systems exhibit phase transitions from the normal to the superradiant phase at critical values of the temperature and coupling constant. We find the asymptotic behavior of the partition functions and the collective spectra of the systems in the normal and superradiant phases. We observe that the collective spectra have zero-energy values in the superradiant phases, corresponding to the Goldstone mode associated with the continuous symmetry breaking of the models. Our analysis and results are valid in the limit of zero temperature, β→∞, where the models exhibit quantum phase transitions.
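
For orientation, the single-mode Dicke Hamiltonian that both models generalize can be written, in the rotating wave approximation and with ħ = 1, in its textbook form (this is the standard expression, not taken from the paper itself):

```latex
H = \omega\, a^{\dagger} a + \omega_{0} J_{z}
  + \frac{\lambda}{\sqrt{N}} \left( a^{\dagger} J_{-} + a\, J_{+} \right)
```

Here a annihilates the bosonic mode of frequency ω, J± and Jz are collective pseudo-spin operators for the N two-level atoms of splitting ω0, and the RWA keeps only the co-rotating terms; for this RWA form the zero-temperature superradiant transition occurs at the critical coupling λc = √(ω ω0).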

  3. A logistic map approach to economic cycles. (I). The best adapted companies

    NASA Astrophysics Data System (ADS)

    Miśkiewicz, J.; Ausloos, M.

    2004-05-01

    A birth-death lattice gas model about the influence of an environment on the fitness and concentration evolution of economic entities is analytically examined. The model can be mapped onto a high-order logistic map. The control parameter is a (scalar) “business plan”. Conditions are searched for growth and decay processes, stable states, upper and lower bounds, bifurcations, periodic and chaotic solutions. The evolution equation of the economic population for the best fitted companies indicates “microscopic conditions” for cycling. The evolution of a dynamic exponent is shown as a function of the business plan parameters.
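
The qualitative behaviors listed above (stable states, bifurcations, chaos) are already visible in the plain logistic map x_{n+1} = r x_n (1 − x_n); the paper's map is higher-order, so this is only a generic illustration of the phenomenology, not the model itself.

```python
import numpy as np

def logistic_orbit(r, x0=0.2, burn=500, keep=64):
    """Iterate x_{n+1} = r*x*(1-x); discard a transient, return attractor points."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(x)
    return np.array(orbit)

# Fixed point, period-2 cycle, and chaos as the control parameter grows
print(len(np.unique(np.round(logistic_orbit(2.8), 6))))       # 1
print(len(np.unique(np.round(logistic_orbit(3.2), 6))))       # 2
print(len(np.unique(np.round(logistic_orbit(3.9), 6))) > 10)  # True
```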

  4. Mapping and monitoring the health and vitality of coral reefs from satellite: a biospheric approach.

    PubMed

    Dustan, P; Chakrabarti, S; Alling, A

    2000-01-01

    Biospheric studies of coral reefs require a planetary perspective that only remote sensing from space can provide. This article reviews aspects of monitoring and mapping coral reefs using Landsat and Spot satellite images. It details design considerations for developing a sensor for equatorial orbiting spacecraft, including spectral characteristics of living corals and the spatial resolution required to map coral reef communities. Possible instrumentation choices include computer techniques, filtered imagers, push-broom spectral imagery, and a newly developed hyperspectral imaging scheme using tomographic reconstruction. We compare the salient features of each technique and describe concepts for a payload to conduct planetary-scale coral reef monitoring.

  5. High-resolution EEG mapping: an equivalent charge-layer approach

    NASA Astrophysics Data System (ADS)

    Yao, Dezhong

    2003-07-01

    Brain electrical signal is one of the windows to understanding neural activities. Various high-resolution imaging techniques have been developed to reveal the electrical activities underneath the cortical surface from scalp electroencephalographic recordings, such as scalp Laplacian, cortical surface potential, equivalent charge layer (ECL) and equivalent dipole layer (EDL). In this work, we develop forward density formulae for the ECL and the EDL of neural electric sources in a 4-concentric-sphere head model, and compare ECL with EDL in theory, simulation and real evoked data tests. The results confirm that the ECL map may be of higher spatial resolution than the EDL map.

  6. Classical mapping for Hubbard operators: Application to the double-Anderson model

    SciTech Connect

    Li, Bin; Miller, William H.; Levy, Tal J.; Rabani, Eran

    2014-05-28

    A classical Cartesian mapping for Hubbard operators is developed to describe the nonequilibrium transport of an open quantum system with many electrons. The mapping of the Hubbard operators representing the many-body Hamiltonian is derived by using analogies from classical mappings of boson creation and annihilation operators vis-à-vis a coherent state representation. The approach provides qualitative results for a double quantum dot array (double Anderson impurity model) coupled to fermionic leads for a range of bias voltages, Coulomb couplings, and hopping terms. While the width and height of the conduction peaks show deviations from the master equation approach considered to be accurate in the limit of weak system-leads couplings and high temperatures, the Hubbard mapping captures all transport channels involving transition between many electron states, some of which are not captured by approximate nonequilibrium Green function closures.
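
The classical Cartesian mapping of boson operators that the derivation draws its analogy from is, in its standard Meyer-Miller-style form (shown here as the textbook prescription, not necessarily the exact variant used in the paper):

```latex
a \;\mapsto\; \frac{x + i p}{\sqrt{2}}, \qquad
a^{\dagger} \;\mapsto\; \frac{x - i p}{\sqrt{2}}, \qquad
a^{\dagger} a \;\mapsto\; \frac{x^{2} + p^{2}}{2} - \gamma ,
```

where (x, p) are classical conjugate variables and γ is a zero-point-energy parameter (γ = 1/2 in the symmetrized Weyl correspondence).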

  7. A novel approach to land-cover maps updating in complex scenarios based on multitemporal remote sensing images

    NASA Astrophysics Data System (ADS)

    Bahirat, K.; Bovolo, F.; Bruzzone, L.; Chaudhuri, S.

    2010-10-01

Nowadays an ever-increasing number of multi-temporal images is available, making it possible to obtain information about land-cover evolution on the ground with high temporal frequency. In general, the production of accurate land-cover maps requires the availability of reliable ground truth for the considered area for each image to be classified. Unfortunately, the rate of ground truth collection will never equal the remote sensing image acquisition rate, making supervised classification unfeasible for land-cover map updating. This problem has been faced with domain adaptation methods that update land-cover maps under the assumptions that: i) training data are available for one of the considered multi-temporal acquisitions but not for the others, and ii) the set of land-cover classes is the same for all considered acquisitions. In real applications, the latter assumption represents a constraint which is often not satisfied, due to possible changes on the ground associated with the appearance of new classes or the absence of old classes in the new images. In this work, we propose an approach that removes this constraint by automatically identifying whether differences exist between the classes in multi-temporal images and properly handling these differences in the updating process. Experimental results on a real multi-temporal remote sensing data set confirm the effectiveness and reliability of the proposed approach.

  8. A Robust Approach for Mapping Group Marks to Individual Marks Using Peer Assessment

    ERIC Educational Resources Information Center

    Spatar, Ciprian; Penna, Nigel; Mills, Henny; Kutija, Vedrana; Cooke, Martin

    2015-01-01

    Group work can form a substantial component of degree programme assessments. To satisfy institutional and student expectations, students must often be assigned individual marks for their contributions to the group project, typically by mapping a single holistic group mark to individual marks using peer assessment scores. Since the early 1990s,…

  9. B1 mapping with a pure phase encode approach: Quantitative density profiling

    NASA Astrophysics Data System (ADS)

    Vashaee, S.; Newling, B.; MacMillan, B.; Balcom, B. J.

    2013-07-01

    In MRI, it is frequently observed that naturally uniform samples do not have uniform image intensities. In many cases this non-uniform image intensity is due to an inhomogeneous B1 field. The 'principle of reciprocity' states that the received signal is proportional to the local magnitude of the applied B1 field per unit current. Inhomogeneity in the B1 field results in signal intensity variations that limit the ability of MRI to yield quantitative information. In this paper a novel method is described for mapping B1 inhomogeneities based on measurement of the B1 field employing centric-scan pure phase encode MRI measurements. The resultant B1 map may be employed to correct related non-uniformities in MR images. The new method is based on acquiring successive images with systematically incremented low flip angle excitation pulses. The local image intensity variation is proportional to B1², which ensures high sensitivity to B1 field variations. Pure phase encoding ensures the resultant B1 field maps are free from distortions caused by susceptibility variation, chemical shift and paramagnetic impurities. Hence, the method works well in regions of space that are not accessible to other methods, such as in the vicinity of conductive metallic structures like the RF probe itself. Quantitative density images result when the centric scan pure phase encode measurement is corrected with a relative or absolute B1 field map. The new technique is simple, reliable and robust.
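
    As a rough illustration of the low-flip-angle relation the abstract relies on (received signal ≈ ρ·B1·sin(B1·α) ≈ ρ·B1²·α, so the per-pixel slope of intensity versus nominal flip angle α scales as B1²), here is a minimal numpy sketch. The function and variable names are invented for the example and all data are synthetic; this is not the paper's implementation.

```python
import numpy as np

# Assumption: at low flip angle, S ≈ rho * B1rel * sin(B1rel * alpha),
# so the slope of S versus nominal flip angle alpha scales as B1rel**2.

def b1_map_from_slopes(images, alphas):
    """Fit S = slope * alpha per pixel; return relative B1 = sqrt(slope / mean slope)."""
    alphas = np.asarray(alphas, dtype=float)      # nominal flip angles (rad)
    stack = np.asarray(images, dtype=float)       # shape (n_angles, ny, nx)
    # Least-squares slope through the origin: slope = sum(a * S) / sum(a^2)
    slope = np.tensordot(alphas, stack, axes=(0, 0)) / np.sum(alphas**2)
    return np.sqrt(slope / slope.mean())

# Synthetic check: uniform density, relative B1 varying 0.8-1.2 across the image
ny, nx = 4, 4
b1_true = np.linspace(0.8, 1.2, nx)[None, :].repeat(ny, axis=0)
alphas = np.deg2rad([2.0, 4.0, 6.0, 8.0])
images = np.array([b1_true * np.sin(b1_true * a) for a in alphas])
b1_est = b1_map_from_slopes(images, alphas)
```

    The square root of the fitted slope recovers the relative B1 pattern up to a global normalization, which is what is needed to correct image non-uniformity.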

  10. Mapping Classroom Interactions: A Spatial Approach to Analyzing Patterns of Student Participation

    ERIC Educational Resources Information Center

    Abbot, Sophia; Cook-Sather, Alison; Hein, Carola

    2014-01-01

    This article explores how mapping patterns of student participation in classroom discussion can both illuminate and complicate the dynamic relationships among identity, physical position in the classroom, student engagement, and course content. It draws on the perspectives of an undergraduate in the role of pedagogical consultant, a faculty member…

  11. Effects of Concept Mapping Instruction Approach on Students' Achievement in Basic Science

    ERIC Educational Resources Information Center

    Ogonnaya, Ukpai Patricia; Okafor, Gabriel; Abonyi, Okechukwu S.; Ugama, J. O.

    2016-01-01

    The study investigated the effects of concept mapping on students' achievement in basic science. The study was carried out in Ebonyi State of Nigeria. The study employed a quasi-experimental design. Specifically the pretest posttest non-equivalent control group research design was used. The sample was 122 students selected from two secondary…

  12. An Innovative Approach to Scheme Learning Map Considering Tradeoff Multiple Objectives

    ERIC Educational Resources Information Center

    Lin, Yu-Shih; Chang, Yi-Chun; Chu, Chih-Ping

    2016-01-01

    An important issue in personalized learning is to provide learners with customized learning according to their learning characteristics. This paper focused attention on scheming learning map as follows. The learning goal can be achieved via different pathways based on alternative materials, which have the relationships of prerequisite, dependence,…

  13. Evaluation of data analytic approaches to generating cross-domain mappings of controlled science vocabularies

    NASA Astrophysics Data System (ADS)

    Zednik, S.

    2015-12-01

    Recent data publication practices have made increasing amounts of diverse datasets available online for the general research community to explore and integrate. Even with the abundance of data online, relevant data discovery and successful integration are still highly dependent upon the data being published with well-formed and understandable metadata. Tagging a dataset with well-known or controlled community terms is a common mechanism to indicate the intended purpose, subject matter, or other relevant facts of a dataset; however, controlled domain terminology can be difficult for cross-domain researchers to interpret and leverage. It is also a challenge for integration portals to successfully provide cross-domain search capabilities over data holdings described using many different controlled vocabularies. Mappings between controlled vocabularies can be challenging because communities frequently develop specialized terminologies and have highly specific and contextual usages of common words. Despite this specificity, it is highly desirable to produce cross-domain mappings to support data integration. In this contribution we evaluate the applicability of several data analytic techniques for the purpose of generating mappings between hierarchies of controlled science terms. We hope our efforts initiate more discussion on the topic and encourage future mapping efforts.

  14. An energy balance approach for mapping crop waterstress and yield impacts over the Czech Republic

    Technology Transfer Automated Retrieval System (TEKTRAN)

    There is a growing demand for timely, spatially distributed information regarding crop condition and water use to inform agricultural decision making and yield forecasting efforts. Remote sensing of land-surface temperature has proven valuable for mapping evapotranspiration (ET) and crop stress from...

  15. Putting vulnerability to climate change on the map: a review of approaches, benefits, and risks

    SciTech Connect

    Preston, Benjamin L

    2011-01-01

    There is growing demand among stakeholders across public and private institutions for spatially-explicit information regarding vulnerability to climate change at the local scale. However, the challenges associated with mapping the geography of climate change vulnerability are non-trivial, both conceptually and technically, suggesting the need for more critical evaluation of this practice. Here, we review climate change vulnerability mapping in the context of four key questions that are fundamental to assessment design. First, what are the goals of the assessment? A review of published assessments yields a range of objective statements that emphasize problem orientation or decision-making about adaptation actions. Second, how is the assessment of vulnerability framed? Assessments vary with respect to what values are assessed (vulnerability of what) and the underlying determinants of vulnerability that are considered (vulnerability to what). The selected frame ultimately influences perceptions of the primary driving forces of vulnerability as well as preferences regarding management alternatives. Third, what are the technical methods by which an assessment is conducted? The integration of vulnerability determinants into a common map remains an emergent and subjective practice associated with a number of methodological challenges. Fourth, who participates in the assessment and how will it be used to facilitate change? Assessments are often conducted under the auspices of benefiting stakeholders, yet many lack direct engagement with stakeholders. Each of these questions is reviewed in turn by drawing on an illustrative set of 45 vulnerability mapping studies appearing in the literature. A number of pathways for placing vulnerability

  16. Mapping a Mutation in "Caenorhabditis elegans" Using a Polymerase Chain Reaction-Based Approach

    ERIC Educational Resources Information Center

    Myers, Edith M.

    2014-01-01

    Many single nucleotide polymorphisms (SNPs) have been identified within the "Caenorhabditis elegans" genome. SNPs present in the genomes of two isogenic "C. elegans" strains have been routinely used as a tool in forward genetics to map a mutation to a particular chromosome. This article describes a laboratory exercise in which…

  17. Mapping Patterns of Perceptions: A Community-Based Approach to Cultural Competence Assessment

    ERIC Educational Resources Information Center

    Davis, Tamara S.

    2007-01-01

    Unclear definitions and limited system-level assessment measures inhibit cultural responsiveness in children's mental health. This study explores an alternative method to conceptualize and assess cultural competence in four children's mental health systems of care communities from family and professional perspectives. Concept Mapping was used to…

  18. Emerging bosons with three-body interactions from spin-1 atoms in optical lattices

    SciTech Connect

    Mazza, L.; Rizzi, M.; Cirac, J. I.; Lewenstein, M.

    2010-10-15

    We study two many-body systems of bosons interacting via an infinite three-body contact repulsion in a lattice: a quasicondensate of pairs induced by correlated hopping and the discrete version of the Pfaffian wave function. We propose to experimentally realize systems characterized by such an interaction by means of a proper spin-1 lattice Hamiltonian: spin degrees of freedom are locally mapped into occupation numbers of emerging bosons, in a fashion similar to spin-1/2 and hardcore bosons. Such a system can be realized with ultracold spin-1 atoms in a Mott insulator with a filling factor of 1. The high versatility of these setups allows us to engineer spin-hopping operators breaking the SU(2) symmetry, as needed to approximate interesting bosonic Hamiltonians with a three-body hardcore constraint. For this purpose we combine bichromatic spin-independent superlattices and Raman transitions to induce a different hopping rate for each spin orientation. Finally, we illustrate how our setup could be used to experimentally realize the first of these systems, that is, the transition to a quasicondensed phase of pairs of the emerging bosons. We also report on a route toward the realization of a discrete bosonic Pfaffian wave function and list some open problems for reaching this goal.

  19. A new approach for deriving Flood hazard maps from SAR data and global hydrodynamic models

    NASA Astrophysics Data System (ADS)

    Matgen, P.; Hostache, R.; Chini, M.; Giustarini, L.; Pappenberger, F.; Bally, P.

    2014-12-01

    With flood consequences likely to amplify because of the growing population and ongoing accumulation of assets in flood-prone areas, global flood hazard and risk maps are needed for improving flood preparedness at large scale. At the same time, with the rapidly growing archives of SAR images of floods, there is a high potential of making use of these images for global and regional flood management. In this framework, an original method that integrates global flood inundation modeling and microwave remote sensing is presented. It takes advantage of the combination of the time and space continuity of a global inundation model with the high spatial resolution of satellite observations. The availability of model simulations over a long time period offers opportunities for estimating flood non-exceedance probabilities in a robust way. These probabilities can be attributed to historical satellite observations. Time series of SAR-derived flood extent maps and associated non-exceedance probabilities can then be combined to generate flood hazard maps with a spatial resolution equal to that of the satellite images, which is most of the time higher than that of a global inundation model. In principle, this can be done for any area of interest in the world, provided that a sufficient number of relevant remote sensing images are available. As a test case we applied the method on the Severn River (UK) and the Zambezi River (Mozambique), where large archives of Envisat flood images can be exploited. The global ECMWF flood inundation model is considered for computing the statistics of extreme events. A comparison with flood hazard maps estimated with in situ measured discharge is carried out. The first results confirm the potentiality of the method. However, further developments on two aspects are required to improve the quality of the hazard map and to ensure the acceptability of the product by potential end user organizations. On the one hand, it is of paramount importance to
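
    The core statistical step, attaching an empirical non-exceedance probability from a long model record to the state observed at each satellite overpass, can be sketched as follows. The Weibull plotting position and all data here are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def non_exceedance(series, value):
    """Empirical P(X <= value) from a long model time series (Weibull plotting position)."""
    series = np.sort(np.asarray(series, dtype=float))
    rank = np.searchsorted(series, value, side="right")
    return rank / (len(series) + 1.0)

# Synthetic daily model record and three hypothetical SAR-overpass water levels
rng = np.random.default_rng(0)
model_levels = rng.gamma(shape=2.0, scale=1.0, size=10_000)
sar_levels = [0.5, 2.0, 6.0]   # invented values for three observed floods
probs = [non_exceedance(model_levels, v) for v in sar_levels]
```

    Each SAR-derived flood extent map would then carry the probability computed for its acquisition date, so stacking the maps yields a hazard map at the resolution of the imagery.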

  20. A Karnaugh map based approach towards systemic reviews and meta-analysis.

    PubMed

    Hassan, Abdul Wahab; Hassan, Ahmad Kamal

    2016-01-01

    The study of meta-analyses and systemic reviews has long helped researchers reconcile numerous parallel or conflicting studies. Existing studies are presented in tabulated forms which contain appropriate information for specific cases, yet they are difficult to visualize. In the meta-analysis of data, this can lead to absorption and subsumption errors, which in turn carry an undesirable potential for consecutive misunderstandings in social and operational methodologies. The purpose of this study is to investigate an alternate forum for meta-data presentation that relies on humans' strong pictorial perception capability. Analysis of big data is assumed to be a complex and daunting task often reserved for the computational powers of machines, yet there exist mapping tools with which such data can be analyzed by hand. Data analysis on such a scale can benefit from the use of statistical tools like Karnaugh maps, where all studies can be put together in a graph-based mapping. Such a formulation can lead to more control in observing patterns in the research community and in further analysis for uncertainty and reliability metrics. We present a methodological process of converting a well-established study in health care to its equivalent binary representation, followed by furnishing the values onto a Karnaugh map. The data used for the studies presented herein are from Burns et al (J Publ Health 34(1):138-148, 2011), consisting of retrospectively collected data sets from various studies on clinical coding data accuracy. Using a customized filtration process, a total of 25 studies were selected for review with no, partial, or complete knowledge of six independent variables, thus forming 64 independent cells on a Karnaugh map. The study concluded that this pictorial graphing, as expected, helped simplify the overview of meta-analysis and systemic reviews.
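
    The construction described above, six binary variables per study giving 64 Karnaugh-map cells, can be sketched in a few lines. The Gray-code ordering is the defining property of a Karnaugh map (neighbouring cells differ in exactly one variable); the helper names and the example studies below are invented for illustration.

```python
# Each study is reduced to a 6-bit vector (1 = knowledge of that variable,
# 0 = none). Rows and columns follow Gray-code order so neighbouring cells
# differ in exactly one variable, which is what enables visual grouping.

def kmap_counts(studies):
    order = [g ^ (g >> 1) for g in range(8)]     # Gray sequence 0,1,3,2,6,7,5,4
    pos = {v: i for i, v in enumerate(order)}    # bit value -> grid position
    grid = [[0] * 8 for _ in range(8)]
    for bits in studies:                         # bits: six 0/1 flags
        val = int("".join(map(str, bits)), 2)
        grid[pos[val >> 3]][pos[val & 7]] += 1   # first 3 bits: row, last 3: column
    return grid

studies = [
    (1, 0, 1, 0, 1, 1),   # knowledge of variables 1, 3, 5, 6
    (1, 0, 1, 0, 1, 0),   # differs from the first in one variable -> adjacent cell
    (0, 0, 0, 0, 0, 0),   # no knowledge of any variable
]
grid = kmap_counts(studies)
```

    Counting studies per cell like this turns the tabulated review data into the 8x8 pictorial grid the paper advocates.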

  1. Geospatial Predictive Modelling for Climate Mapping of Selected Severe Weather Phenomena Over Poland: A Methodological Approach

    NASA Astrophysics Data System (ADS)

    Walawender, Ewelina; Walawender, Jakub P.; Ustrnul, Zbigniew

    2017-02-01

    The main purpose of the study is to introduce methods for mapping the spatial distribution of the occurrence of selected atmospheric phenomena (thunderstorms, fog, glaze and rime) over Poland from 1966 to 2010 (45 years). Limited in situ observations as well as the discontinuous and location-dependent nature of these phenomena make traditional interpolation inappropriate. Spatially continuous maps were created with the use of geospatial predictive modelling techniques. For each given phenomenon, an algorithm identifying its favourable meteorological and environmental conditions was created on the basis of observations recorded at 61 weather stations in Poland. Annual frequency maps presenting the probability of a day with a thunderstorm, fog, glaze or rime were created with the use of a modelled, gridded dataset by implementing predefined algorithms. Relevant explanatory variables were derived from NCEP/NCAR reanalysis and downscaled with the use of a Regional Climate Model. The resulting maps of favourable meteorological conditions were found to be valuable and representative on the country scale, though with varying correlation (r) strength against in situ data (from r = 0.84 for thunderstorms to r = 0.15 for fog). A weak correlation between gridded estimates of fog occurrence and observation data indicated the very local nature of this phenomenon. For this reason, additional environmental predictors of fog occurrence were also examined. Topographic parameters derived from the SRTM elevation model and reclassified CORINE Land Cover data were used as the external explanatory variables for the multiple linear regression kriging used to obtain the final map. The regression model explained 89% of the annual frequency of fog variability in the study area. Regression residuals were interpolated via simple kriging.
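
    The regression-kriging workflow mentioned above (regress the target on environmental predictors, then interpolate the residuals) can be sketched as below. To keep the example dependency-free, inverse-distance weighting stands in for the simple-kriging step, and all station data, coordinates, and coefficients are synthetic, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
xy = rng.uniform(0, 100, size=(n, 2))              # station coordinates (km)
elev = rng.uniform(100, 800, size=n)               # topographic predictor (m)
fog = 5.0 + 0.02 * elev + rng.normal(0, 1.0, n)    # annual fog-day frequency

# 1) Multiple linear regression on the environmental predictor(s)
X = np.column_stack([np.ones(n), elev])
beta, *_ = np.linalg.lstsq(X, fog, rcond=None)
resid = fog - X @ beta

# 2) Interpolate residuals to an unsampled point (IDW stand-in for simple kriging)
def idw(xy_obs, values, xy_new, power=2.0):
    d = np.linalg.norm(xy_obs - xy_new, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return np.sum(w * values) / np.sum(w)

target = np.array([50.0, 50.0])                    # prediction location
prediction = beta[0] + beta[1] * 400.0 + idw(xy, resid, target)
```

    The final map value at each grid cell is the regression trend plus the interpolated residual, which is how the deterministic (predictor-driven) and spatially autocorrelated parts of the signal are combined.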

  2. Fermiophobic Higgs boson and supersymmetry

    NASA Astrophysics Data System (ADS)

    Gabrielli, E.; Kannike, K.; Mele, B.; Racioppi, A.; Raidal, M.

    2012-09-01

    If a light Higgs boson with mass 125 GeV is fermiophobic, or partially fermiophobic, then the minimal supersymmetric standard model is excluded. The minimal supersymmetric fermiophobic Higgs scenario can naturally be formulated in the context of the next-to-minimal supersymmetric standard model (NMSSM) that admits Z3 discrete symmetries. In the fermiophobic NMSSM, the supersymmetry naturalness criteria are relaxed by a factor N_c y_t^4/g^4 ~ 25, removing the little hierarchy problem and allowing sparticle masses to be naturally of order 2-3 TeV. This scale motivates wino or Higgsino dark matter. The SUSY flavor and CP problems as well as the constraints on sparticle and Higgs boson masses from b→sγ, Bs→μμ and direct LHC searches are relaxed in the fermiophobic NMSSM. The price to pay is that a new, yet unknown, mechanism must be introduced to generate fermion masses. We show that in the fermiophobic NMSSM the radiative Higgs boson branchings to γγ, γZ can be modified compared to the fermiophobic and ordinary standard model predictions, and fit present collider data better. Suppression of dark matter scattering off nuclei explains the absence of signal in XENON100.

  3. Measurements of trilinear gauge boson couplings

    SciTech Connect

    Abbott, B.

    1997-10-01

    Direct measurements of the trilinear gauge boson couplings by the D0 collaboration at Fermilab are reported. Limits on the anomalous couplings were obtained at a 95% CL from four diboson production processes: Wγ production with the W boson decaying to eν or μν, WW production with both W bosons decaying to eν or μν, WW/WZ production with one W boson decaying to eν and the other W or Z boson decaying to two jets, and Zγ production with the Z boson decaying to ee, μμ, or νν. Limits were also obtained from a combined fit to the Wγ, WW → dileptons, and WW/WZ → eνjj data samples.

  4. A Monte Carlo simulation based two-stage adaptive resonance theory mapping approach for offshore oil spill vulnerability index classification.

    PubMed

    Li, Pu; Chen, Bing; Li, Zelin; Zheng, Xiao; Wu, Hongjing; Jing, Liang; Lee, Kenneth

    2014-09-15

    In this paper, a Monte Carlo simulation based two-stage adaptive resonance theory mapping (MC-TSAM) model was developed to classify a given site into distinguished zones representing different levels of offshore Oil Spill Vulnerability Index (OSVI). It consisted of an adaptive resonance theory (ART) module, an ART Mapping module, and a centroid determination module. Monte Carlo simulation was integrated with the TSAM approach to address uncertainties that widely exist in site conditions. The applicability of the proposed model was validated by classifying a large coastal area, which was surrounded by potential oil spill sources, based on 12 features. Statistical analysis of the results indicated that the classification process was affected by multiple features instead of one single feature. The classification results also provided the least or desired number of zones which can sufficiently represent the levels of offshore OSVI in an area under uncertainty and complexity, saving time and budget in spill monitoring and response.

  5. Gaussian Multiple Instance Learning Approach for Mapping the Slums of the World Using Very High Resolution Imagery

    SciTech Connect

    Vatsavai, Raju

    2013-01-01

    In this paper, we present a computationally efficient algorithm based on multiple instance learning for mapping informal settlements (slums) using very high-resolution remote sensing imagery. From a remote sensing perspective, informal settlements share unique spatial characteristics that distinguish them from other urban structures like industrial, commercial, and formal residential settlements. However, regular pattern recognition and machine learning methods, which are predominantly single-instance or per-pixel classifiers, often fail to accurately map the informal settlements as they do not capture the complex spatial patterns. To overcome these limitations we employed a multiple instance based machine learning approach, where groups of contiguous pixels (image patches) are modeled as generated by a Gaussian distribution. We have conducted several experiments on very high-resolution satellite imagery, representing four unique geographic regions across the world. Our method showed consistent improvement in accurately identifying informal settlements.
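
    A minimal sketch of the multiple-instance idea described above: each image patch (a bag of pixel feature vectors) is summarized by a Gaussian, and an unlabeled patch is assigned to the class whose Gaussian gives it the higher average log-likelihood. The 2-D features, class names, and numbers below are synthetic illustrations, not the paper's data or exact model.

```python
import numpy as np

def fit_gaussian(pixels):
    """Mean and (regularized) covariance of a set of pixel feature vectors."""
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False) + 1e-6 * np.eye(pixels.shape[1])
    return mu, cov

def avg_loglik(pixels, mu, cov):
    """Average Gaussian log-likelihood of the pixels in one patch."""
    d = pixels - mu
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    quad = np.einsum("ij,jk,ik->i", d, inv, d)
    return np.mean(-0.5 * (quad + logdet + pixels.shape[1] * np.log(2 * np.pi)))

rng = np.random.default_rng(2)
informal_train = rng.normal([0, 0], 1.0, size=(500, 2))   # training pixels, class A
formal_train = rng.normal([4, 4], 1.0, size=(500, 2))     # training pixels, class B
g_inf = fit_gaussian(informal_train)
g_for = fit_gaussian(formal_train)

patch = rng.normal([0, 0], 1.0, size=(64, 2))             # unlabeled 64-pixel patch
label = "informal" if avg_loglik(patch, *g_inf) > avg_loglik(patch, *g_for) else "formal"
```

    Classifying whole patches rather than single pixels is what lets the method capture the spatial texture that per-pixel classifiers miss.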

  6. Diffractive Higgs boson photoproduction in {gamma}p process

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2008-12-01

    We explore an alternative process for diffractive Higgs boson production in peripheral pp collisions arising from double Pomeron exchange in photon-proton interaction. We introduce the impact factor formalism in order to enable the gluon ladder exchange in the photon-proton subprocess, and to permit central Higgs production. The event rate for diffractive Higgs production in central rapidity is estimated to be about 0.6 pb at Tevatron and LHC energies. This result is higher than predictions from other approaches of diffractive Higgs production, showing that the alternative production process leads to an enhanced signal for the detection of the Higgs boson at hadron colliders. Our results are compared with those obtained from a similar approach proposed by the Durham Group. In this way, we may examine future developments in its application to pp and AA collisions.

  7. A Search for Dark Higgs Bosons

    SciTech Connect

    Lees, J.P.

    2012-06-08

    Recent astrophysical and terrestrial experiments have motivated the proposal of a dark sector with GeV-scale gauge boson force carriers and new Higgs bosons. We present a search for a dark Higgs boson using 516 fb⁻¹ of data collected with the BABAR detector. We do not observe a significant signal and we set 90% confidence level upper limits on the product of the Standard Model-dark sector mixing angle and the dark sector coupling constant.

  8. Fat jets for a light higgs boson.

    PubMed

    Plehn, Tilman; Salam, Gavin P; Spannowsky, Michael

    2010-03-19

    At the LHC associated top quark and Higgs boson production with a Higgs boson decay to bottom quarks has long been a heavily disputed search channel. Recently, it has been found not to be viable. We show how it can be observed by tagging massive Higgs bosons and top jets. For this purpose we construct boosted top and Higgs taggers for standard-model processes in a complex QCD environment.

  9. A multi-temporal fusion-based approach for land cover mapping in support of nuclear incident response

    NASA Astrophysics Data System (ADS)

    Sah, Shagan

    An increasingly important application of remote sensing is to provide decision support during emergency response and disaster management efforts. Land cover maps constitute one such useful application product during disaster events; if generated rapidly after any disaster, such map products can contribute to the efficacy of the response effort. In light of recent nuclear incidents, e.g., after the earthquake/tsunami in Japan (2011), our research focuses on constructing rapid and accurate land cover maps of the impacted area in case of an accidental nuclear release. The methodology involves integration of results from two different approaches, namely coarse spatial resolution multi-temporal and fine spatial resolution imagery, to increase classification accuracy. Although advanced methods have been developed for classification using high spatial or temporal resolution imagery, only a limited amount of work has been done on fusion of these two remote sensing approaches. The presented methodology thus involves integration of classification results from two different remote sensing modalities in order to improve classification accuracy. The data used included RapidEye and MODIS scenes over the Nine Mile Point Nuclear Power Station in Oswego (New York, USA). The first step in the process was the construction of land cover maps from freely available, high temporal resolution, low spatial resolution MODIS imagery using a time-series approach. We used the variability in the temporal signatures among different land cover classes for classification. The time series-specific features were defined by various physical properties of a pixel, such as variation in vegetation cover and water content over time. The pixels were classified into four land cover classes - forest, urban, water, and vegetation - using Euclidean and Mahalanobis distance metrics. On the other hand, a high spatial resolution commercial satellite, such as RapidEye, can be tasked to capture images over the
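
    The nearest-signature classification step described in this abstract, assigning each pixel's temporal signature to the class mean closest under a Euclidean or Mahalanobis distance, can be sketched as follows. The class means, covariance, and pixel values are invented for illustration.

```python
import numpy as np

def classify(pixel, class_means, cov=None):
    """Assign a temporal signature to the nearest class mean."""
    if cov is None:                                    # Euclidean distance
        d = [np.linalg.norm(pixel - m) for m in class_means.values()]
    else:                                              # Mahalanobis distance
        inv = np.linalg.inv(cov)
        d = [np.sqrt((pixel - m) @ inv @ (pixel - m)) for m in class_means.values()]
    return list(class_means)[int(np.argmin(d))]

# Illustrative 4-date NDVI-like mean signatures for the four classes in the text
means = {
    "forest":     np.array([0.6, 0.7, 0.8, 0.7]),    # stays green year-round
    "vegetation": np.array([0.2, 0.6, 0.7, 0.3]),    # strong seasonal cycle
    "water":      np.array([0.0, 0.0, 0.1, 0.0]),
    "urban":      np.array([0.2, 0.2, 0.2, 0.2]),
}
pixel = np.array([0.25, 0.55, 0.65, 0.35])            # observed signature
cov = 0.01 * np.eye(4)                                # assumed per-date variance
euclid_label = classify(pixel, means)
mahal_label = classify(pixel, means, cov)
```

    With a non-diagonal covariance estimated from training signatures, the Mahalanobis variant down-weights dates whose values vary strongly within a class.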

  10. Mapping shallow lakes in a large South American floodplain: A frequency approach on multitemporal Landsat TM/ETM data

    NASA Astrophysics Data System (ADS)

    Borro, Marta; Morandeira, Natalia; Salvia, Mercedes; Minotti, Priscilla; Perna, Pablo; Kandus, Patricia

    2014-05-01

    We propose a methodology to identify and map shallow lakes (SL) in the Paraná River floodplain, the largest freshwater wetland ecosystem in temperate South America. The presence and number of SL offer various ecosystem services and habitats for wildlife biodiversity. Our approach involved a frequency analysis over a 1987-2010 time series of the Normalized Difference Vegetation Index (NDVI), derived from Landsat 5 and 7 TM/ETM data. Through descriptive statistics of samples of pixels and field work in different types of SL, we established an NDVI threshold of 0.34 below which we assumed the presence of water in each pixel. The standard deviation of the estimated SL area decreases with the number of images in the analysis, being less than 10% when at least 30 images are used. The mean SL area for the whole period was 112,691 ha (10.9% of the study area). The influence of the hydrological conditions on the resulting SL map was evaluated by analyzing twelve sets of images, which were selected to span the whole period and different time frames according to multiannual dry and wet periods and to relative water level within each period. The Kappa index was then calculated between pairs of resulting SL maps. We compared our maps with the available national and international cartographic documents and with other published maps that used one or a few Landsat images. Landsat images time series provide an accurate spatial and temporal resolution for SL identification in floodplains, particularly in temperate zones with a good provision of cloud free images. The method evaluated in this paper considers the dynamics of SL and reduces the uncertainties of the fuzzy boundaries. Thus, it provides a robust database of SL and its temporal behavior to establish future monitoring programs based on the recent launch of Landsat 8 satellite.
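
    The frequency analysis at the heart of this abstract can be sketched in a few lines of numpy: a pixel counts as water in a scene when NDVI falls below the paper's 0.34 threshold, and the per-pixel water frequency over the stack delineates shallow lakes. The 0.5 frequency cutoff and the tiny 2x2 stack below are illustrative assumptions, not values from the paper.

```python
import numpy as np

NDVI_THRESHOLD = 0.34   # the paper's threshold: below it, a pixel is water

def water_frequency(ndvi_stack):
    """Fraction of scenes in which each pixel is classified as water."""
    return (np.asarray(ndvi_stack) < NDVI_THRESHOLD).mean(axis=0)

ndvi_stack = np.array([
    [[0.1, 0.6], [0.2, 0.7]],   # scene 1
    [[0.2, 0.5], [0.4, 0.6]],   # scene 2
    [[0.3, 0.2], [0.3, 0.8]],   # scene 3
    [[0.1, 0.7], [0.5, 0.6]],   # scene 4
])
freq = water_frequency(ndvi_stack)
lake_mask = freq >= 0.5         # illustrative cutoff for a persistent shallow lake
```

    Because the frequency is an average over many scenes, its standard deviation shrinks as images are added, which is why the paper finds it stabilizes to within 10% once roughly 30 images are used.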

  11. Mapping Habitats and Developing Baselines in Offshore Marine Reserves with Little Prior Knowledge: A Critical Evaluation of a New Approach

    PubMed Central

    Lawrence, Emma; Hayes, Keith R.; Lucieer, Vanessa L.; Nichol, Scott L.; Dambacher, Jeffrey M.; Hill, Nicole A.; Barrett, Neville; Kool, Johnathan; Siwabessy, Justy

    2015-01-01

    The recently declared Australian Commonwealth Marine Reserve (CMR) Network covers a total of 3.1 million km2 of continental shelf, slope, and abyssal habitat. Managing and conserving the biodiversity values within this network requires knowledge of the physical and biological assets that lie within its boundaries. Unfortunately very little is known about the habitats and biological assemblages of the continental shelf within the network, where diversity is richest and anthropogenic pressures are greatest. Effective management of the CMR estate into the future requires this knowledge gap to be filled efficiently and quantitatively. The challenge is particularly great for the shelf as multibeam echosounder (MBES) mapping, a key tool for identifying and quantifying habitat distribution, is time consuming in shallow depths, so full coverage mapping of the CMR shelf assets is unrealistic in the medium-term. Here we report on the results of a study undertaken in the Flinders Commonwealth Marine Reserve (southeast Australia) designed to test the benefits of two approaches to characterising shelf habitats: (i) MBES mapping of a continuous (~30 km2) area selected on the basis of its potential to include a range of seabed habitats that are potentially representative of the wider area, versus (ii) a novel approach that uses targeted mapping of a greater number of smaller, but spatially balanced, locations using a Generalized Random Tessellation Stratified sample design. We present the first quantitative estimates of habitat type and sessile biological communities on the shelf of the Flinders reserve, the former based on three MBES analysis techniques. We contrast the quality of information that both survey approaches offer in combination with the three MBES analysis methods. The GRTS approach enables design based estimates of habitat types and sessile communities and also identifies potential biodiversity hotspots in the northwest corner of the reserve’s IUCN zone IV, and

  12. Rotating boson stars and Q-balls

    SciTech Connect

    Kleihaus, Burkhard; Kunz, Jutta; List, Meike

    2005-09-15

    We consider axially symmetric, rotating boson stars. Their flat-space limits represent spinning Q-balls. We discuss their properties and determine their domain of existence. Q-balls and boson stars are stationary solutions and exist only in a limited frequency range. The coupling to gravity gives rise to a spiral-like frequency dependence of the boson stars. We address the flat-space limit and the limit of strong gravitational coupling. For comparison we also determine the properties of spherically symmetric Q-balls and boson stars.

  13. Aura Tropospheric Ozone Columns Derived Using the TOR Approach and Mapping Techniques

    NASA Astrophysics Data System (ADS)

    Yang, Q.; Cunnold, D. M.; Wang, H.; Jing, P.

    2005-12-01

    A two-predictor (PV and geopotential height) interpolation/mapping technique has been applied to Aura MLS measurements combined with GEOS-4 meteorological fields to produce stratospheric ozone columns between the 300K isentropic surface and up to the 800K surface. Subtraction of these columns from OMI total ozone column measurements under clear sky conditions results in tropospheric ozone columns derived by the Tropospheric Ozone Residual (TOR) technique. The precisions and accuracies of the resulting TORs at mid-latitudes are assessed by comparisons against tropospheric ozonesonde data and TORs derived from SAGE measurements. It is found that the inclusion of total ozone column as a third predictor in the interpolation increases the precision of the derived TORs. The use of trajectory mapping is also in the process of being evaluated.
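
    The residual step itself is a gridpoint-wise subtraction of the mapped stratospheric column from the OMI total column. A trivial numpy sketch, with invented Dobson-unit values:

```python
import numpy as np

# Tropospheric Ozone Residual: TOR = total column - stratospheric column,
# evaluated gridpoint by gridpoint. Values (DU) are illustrative only.
omi_total = np.array([[290.0, 305.0], [310.0, 300.0]])      # OMI total column
strat_column = np.array([[255.0, 270.0], [268.0, 262.0]])   # 300 K surface upward
tor = omi_total - strat_column                               # tropospheric residual
```

    The accuracy of the result hinges entirely on how well the stratospheric column is mapped between sparse MLS profiles, which is why the choice of interpolation predictors matters.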

  14. A symbiotic approach to SETI observations: use of maps from the Westerbork Synthesis Radio Telescope.

    PubMed

    Tarter, J C; Israel, F P

    1982-01-01

    High spatial resolution continuum radio maps produced by the Westerbork Synthesis Radio Telescope (WSRT) of The Netherlands at frequencies near the 21 cm HI line have been examined for anomalous sources of emission coincident with the locations of nearby bright stars. From a total of 542 stellar positions investigated, no candidates for radio stars or ETI signals were discovered, to formal limits on the minimum detectable signal ranging from 7.7 x 10^-22 W/m^2 to 6.4 x 10^-24 W/m^2. This preliminary study has verified that data collected by radio astronomers at large synthesis arrays can profitably be analysed for SETI signals (in a non-interfering manner) provided only that the data are available in a more or less standard two-dimensional map format.

  15. A symbiotic approach to SETI observations: use of maps from the Westerbork Synthesis Radio Telescope

    NASA Technical Reports Server (NTRS)

    Tarter, J. C.; Israel, F. P.

    1982-01-01

    High spatial resolution continuum radio maps produced by the Westerbork Synthesis Radio Telescope (WSRT) of The Netherlands at frequencies near the 21 cm HI line have been examined for anomalous sources of emission coincident with the locations of nearby bright stars. From a total of 542 stellar positions investigated, no candidates for radio stars or ETI signals were discovered, to formal limits on the minimum detectable signal ranging from 7.7 x 10^-22 W/m^2 to 6.4 x 10^-24 W/m^2. This preliminary study has verified that data collected by radio astronomers at large synthesis arrays can profitably be analysed for SETI signals (in a non-interfering manner) provided only that the data are available in a more or less standard two-dimensional map format.

  16. Using a conceptual approach with concept mapping to promote critical thinking.

    PubMed

    Vacek, Jenny E

    2009-01-01

    Promoting the development of critical thinking is crucial to nursing education for two reasons. First, the National League for Nursing and the American Association of Colleges of Nurses consider critical thinking an outcome criterion for baccalaureate nursing education. Second, and significantly more important, professional nursing practice requires critical thinking skills and problem solving abilities. Too often, teaching is not directed at specifically designed activities that foster critical thinking. Various teaching strategies have been proposed that promote critical thinking, including service learning, role playing, reflective learning, the critical incidence conference, videotaped vignettes, preceptorship, and concept mapping. This article focuses on the use of assimilation theory and concept maps to facilitate critical thinking experiences in nursing education.

  17. Mass Movement Susceptibility in the Western San Juan Mountains, Colorado: A Preliminary 3-D Mapping Approach

    NASA Astrophysics Data System (ADS)

    Kelkar, K. A.; Giardino, J. R.

    2015-12-01

    Mass movement is a major process that impacts human lives and infrastructure. Human activity in steep, mountainous regions is especially at risk from this hazard. Thus, identifying and quantifying risk by mapping and determining mass movement susceptibility is fundamental to protecting lives and resources and to ensuring proper land-use regulation and planning. Specific mass-movement processes including debris flows, rock falls, snow avalanches and landslides continuously modify the landscape of the San Juan Mountains. Historically, large-magnitude slope failures have repeatedly occurred in the region. Common triggers include intense, long-duration precipitation, freeze-thaw processes, human activity and various volcanic lithologies overlying weaker sedimentary formations. Predicting mass movement is challenging because of its episodic and spatially discontinuous occurrence. Landslides in mountain terrain are widespread, highly mobile and long-lasting. We developed a 3-D model for landslide susceptibility using Geographic Information Systems Technology (GIST). The study area encompasses eight USGS quadrangles: Ridgway, Dallas, Mount Sneffels, Ouray, Telluride, Ironton, Ophir and Silverton. Fieldwork consisted of reconnaissance mapping at 1:5,000 focusing on surficial geomorphology. Field mapping was used to identify potential locations, which then received additional onsite investigation and photographic documentation of features indicative of slope failure. A GIS module was created using seven terrain spatial databases: geology, surficial geomorphology (digitized), slope aspect, slope angle, vegetation, soils and distance to infrastructure to map risk. The GIS database will help determine risk zonation for the study area. Correlations between terrain parameters leading to slope failure were determined through the GIS module. 
This 3-D model will provide a spatial perspective of the landscape to

  18. Frustration of dissipation in a spin-boson model

    NASA Astrophysics Data System (ADS)

    Ingersent, Kevin; Duru, Alper

    2009-03-01

    The spin-boson model (SBM), in which a quantum two-level system couples via one component of its effective spin to a dissipative bosonic bath, has many realizations. There has been much recent interest in the SBM with a sub-Ohmic bath characterized by a power-law spectral exponent 0 < s < 1, where at zero temperature a quantum critical point separates delocalized and localized phases. Numerical renormalization group calculations have called into question [1] the validity of the long-assumed mapping between the SBM and the classical Ising chain with interactions decaying with distance |i-j| as 1/|i-j|^(1+s). Attention has also fallen on a variant of the SBM in which two components of the impurity spin couple to different bosonic baths. For the Ohmic case (s = 1), competition between the baths has been shown to frustrate the dissipation and reduce the coupling of the impurity to the environment [2]. The present study addresses the SBM with two sub-Ohmic baths, where dissipative effects are intrinsically stronger than for s = 1. Numerical renormalization group methods are used to identify a continuous quantum phase transition in this model and to evaluate critical exponents characterizing the quantum-critical behavior in the vicinity of the transition. [1] M. Vojta et al., Phys. Rev. Lett. 94, 070604 (2005). [2] E. Novais et al., Phys. Rev. B 72, 014417 (2005). Supported by NSF Grant DMR-0710540.
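The bath in the SBM is specified by its spectral density, with the power-law exponent s distinguishing the sub-Ohmic, Ohmic, and super-Ohmic regimes. A minimal sketch under one common (but here assumed) normalization convention with a hard cutoff:

```python
import math

def spectral_density(omega, s, alpha=0.1, omega_c=1.0):
    """Bath spectral density J(w) = 2*pi*alpha * omega_c**(1-s) * w**s for
    0 < w < omega_c (hard cutoff), zero above it.

    s < 1 is sub-Ohmic, s = 1 Ohmic, s > 1 super-Ohmic. The prefactor
    convention and the hard cutoff are assumptions; conventions vary.
    """
    if omega >= omega_c:
        return 0.0
    return 2.0 * math.pi * alpha * omega_c ** (1.0 - s) * omega ** s
```

At low frequencies a sub-Ohmic bath carries more spectral weight than an Ohmic one, which is why its dissipative effects are intrinsically stronger.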

  19. An Approach to Optimize Size Parameters of Forging by Combining Hot-Processing Map and FEM

    NASA Astrophysics Data System (ADS)

    Hu, H. E.; Wang, X. Y.; Deng, L.

    2014-11-01

    The size parameters of a 6061 aluminum alloy rib-web forging were optimized by using a hot-processing map and the finite element method (FEM) based on high-temperature compression data. The results show that the stress level of the alloy can be represented by a Zener-Hollomon parameter in a hyperbolic sine-type equation with a hot deformation activation energy of 343.7 kJ/mol. Dynamic recovery and dynamic recrystallization proceeded concurrently during high-temperature deformation of the alloy. Optimal hot-processing parameters for the alloy, corresponding to the peak value of 0.42, are 753 K and 0.001 s^-1. The instability domain occurs at deformation temperatures lower than 653 K. FEM is a viable method for validating the hot-processing map in actual manufacturing, by analyzing the effect of corner radius, rib width, and web thickness on the workability of rib-web forging of the alloy. Size parameters of die forgings can be optimized conveniently by combining hot-processing maps and FEM.
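The hyperbolic sine-type constitutive description rests on the Zener-Hollomon parameter, i.e., the temperature-compensated strain rate. A minimal sketch using the activation energy reported in the abstract (the function name and evaluation point are ours; 753 K and 0.001 s^-1 are the abstract's optimal conditions):

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol K)
Q = 343.7e3    # hot deformation activation energy from the abstract, J/mol

def zener_hollomon(strain_rate, temperature_k):
    """Z = strain_rate * exp(Q / (R*T)): the temperature-compensated
    strain rate entering the hyperbolic sine-type equation."""
    return strain_rate * math.exp(Q / (R_GAS * temperature_k))

# Illustrative evaluation at the reported optimal condition (753 K, 0.001 1/s)
z_opt = zener_hollomon(0.001, 753.0)
```

Lower deformation temperatures or higher strain rates both raise Z, consistent with the instability domain appearing at the cold end of the processing window.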

  20. A Virtual Hub Brokering Approach for Integration of Historical and Modern Maps

    NASA Astrophysics Data System (ADS)

    Bruno, N.; Previtali, M.; Barazzetti, L.; Brumana, R.; Roncella, R.

    2016-06-01

    Geospatial data are today more and more widespread. Many different institutions, such as geographical institutes, public administrations, collaborative communities (e.g., OSM) and web companies, now make available a large number of maps. Besides this cartography, projects for digitizing, georeferencing and publishing historical maps on the web have spread increasingly in recent years. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required, and problems of interconnection between different data sources and their restricted interoperability limit the wide utilization of available geo-data. This paper describes actions performed to assure interoperability between data, in particular spatial and geographic data, gathered from different providers, with different features and referring to different historical periods. The article summarizes and exemplifies how, starting from projects of historical map digitization and Historical GIS implementation, for Lombardy and for the city of Parma respectively, interoperability is achieved in the framework of the ENERGIC OD project. The European project ENERGIC OD, thanks to a specific component (the virtual hub) based on a brokering framework, copes with the problems listed above and allows interoperability between different data sources.

  1. Compression map, functional groups and fossilization: A chemometric approach (Pennsylvanian neuropteroid foliage, Canada)

    USGS Publications Warehouse

    D'Angelo, J. A.; Zodrow, E.L.; Mastalerz, Maria

    2012-01-01

    Nearly all of the spectrochemical studies involving Carboniferous seed-fern foliage are based on a limited number of pinnules, mainly compressions. In contrast, in this paper we illustrate working with a larger pinnate segment, i.e., a 22-cm long neuropteroid specimen, compression-preserved with cuticle: the compression map. The objective is to study preservation variability on a larger scale, where observation of the transparency/opacity of constituent pinnules is used as a first approximation for assessing the degree of pinnule coalification/fossilization. Spectrochemical methods by Fourier transform infrared (FTIR) spectrometry furnish semi-quantitative data for principal component analysis. The compression map shows a high degree of preservation variability, which ranges from comparatively more coalified pinnules to less coalified pinnules that resemble fossilized-cuticles, noting that the pinnule midveins are preserved more like fossilized-cuticles. A general overall trend from coalified pinnules towards fossilized-cuticles, i.e., variable chemistry, is inferred from the semi-quantitative FTIR data, as higher contents of aromatic compounds occur in the visually more opaque upper location of the compression map. The latter also shows a higher condensation of the aromatic nuclei along with some variation in both ring size and degree of aromatic substitution. From principal component analysis we infer a correspondence between transparency/opacity observations and chemical information, which correlates to varying degrees with the fossilization/coalification among pinnules. © 2011 Elsevier B.V.

  2. Assessment of Ice Shape Roughness Using a Self-Organizing Map Approach

    NASA Technical Reports Server (NTRS)

    Mcclain, Stephen T.; Kreeger, Richard E.

    2013-01-01

    Self-organizing maps are neural-network techniques for representing noisy, multidimensional data aligned along a lower-dimensional and nonlinear manifold. For a large set of noisy data, each element of a finite set of codebook vectors is iteratively moved in the direction of the data closest to the winning codebook vector. Through successive iterations, the codebook vectors begin to align with the trends of the higher-dimensional data. Prior investigations of ice shapes have focused on using self-organizing maps to characterize mean ice forms. The Icing Research Branch has recently acquired a high-resolution three-dimensional scanner system capable of resolving ice-shape surface roughness. A method is presented for the evaluation of surface roughness variations using high-resolution surface scans based on a self-organizing map representation of the mean ice shape. The new method is demonstrated for 1) an 18-in. NACA 23012 airfoil at 2° AOA just after the initial ice coverage of the leading 5% of the suction surface of the airfoil, 2) a 21-in. NACA 0012 at 0° AOA following coverage of the leading 10% of the airfoil surface, and 3) a cold-soaked 21-in. NACA 0012 airfoil without ice. The SOM method resulted in descriptions of the statistical coverage limits and a quantitative representation of the early stages of ice roughness formation on the airfoils. Limitations of the SOM method are explored, and the uncertainty limits of the method are investigated using the non-iced NACA 0012 airfoil measurements.
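The codebook-vector update described above (iteratively moving the winner, and its manifold neighbors, toward each data sample) can be sketched as a toy 1-D SOM over 2-D points; the learning-rate and neighborhood schedules below are illustrative assumptions, not the authors' settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, n_codebook=10, n_iter=500, lr0=0.5):
    """Toy 1-D self-organizing map over 2-D data.

    Each iteration picks a random sample, finds the winning codebook
    vector, and moves the winner (and, with a Gaussian neighborhood
    weight, its neighbors along the 1-D manifold) toward the sample.
    """
    codebook = data[rng.choice(len(data), n_codebook, replace=False)].copy()
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        winner = int(np.argmin(np.linalg.norm(codebook - x, axis=1)))
        lr = lr0 * (1.0 - t / n_iter)                            # decaying step size
        sigma = max(1.0, (n_codebook / 2) * (1.0 - t / n_iter))  # shrinking neighborhood
        for i in range(n_codebook):
            h = np.exp(-((i - winner) ** 2) / (2.0 * sigma ** 2))
            codebook[i] += lr * h * (x - codebook[i])
    return codebook
```

Because every update is a convex move toward a data point, the trained codebook stays within the data's range while ordering itself along the dominant trend of the cloud.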

  3. Quantum dynamics of relativistic bosons through nonminimal vector square potentials

    NASA Astrophysics Data System (ADS)

    de Oliveira, Luiz P.

    2016-09-01

    The dynamics of relativistic bosons (scalar and vectorial) through nonminimal vector square (well and barrier) potentials is studied in the Duffin-Kemmer-Petiau (DKP) formalism. We show that the problem can be mapped onto effective Schrödinger equations for a component of the DKP spinor. An oscillatory transmission coefficient is found, and there is total reflection. Additionally, the energy spectrum of bound states is obtained and reveals the Schiff-Snyder-Weinberg effect: for specific conditions, the potential lodges bound states of both particles and antiparticles.

  4. Limits on Higgs bosons, scalar-quarkonia, and ηb's from radiative upsilon decays

    NASA Astrophysics Data System (ADS)

    Franzini, P.; Son, D.; Tuts, P. M.; Youssef, S.; Zhao, T.; Lee-Franzini, J.; Horstkotte, J.; Klopfenstein, C.; Kaarsberg, T.; Lovelock, D. M. J.; Schamberger, R. D.; Sontz, S. B.; Yanagisawa, C.

    1987-05-01

    We have searched for monochromatic photon signals in the reaction Υ(9460) → X + γ in a sample of 420 000 Υ decays observed in the CUSB detector. From the absence of a signal we obtain upper limits for Higgs-boson production in Υ decay which approach the minimal-standard-model expectations, including QCD radiative corrections, for low Higgs-boson masses. For two Higgs-boson models we obtain bounds on the ratio of the vacuum expectation values of the two neutral Higgs fields as a function of the Higgs-boson mass. We do not confirm the ζ(8.3) and find no evidence for a nearby bound scalar-quark-anti-scalar-quark state. We obtain a lower bound for the ηb mass using potential-model predictions.

  5. A Concept Map Approach to Developing Collaborative Mindtools for Context-Aware Ubiquitous Learning

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Shi, Yen-Ru; Chu, Hui-Chun

    2011-01-01

    Recent advances in mobile and wireless communication technologies have enabled various new learning approaches which situate students in environments that combine real-world and digital-world learning resources; moreover, students are allowed to share knowledge or experiences with others during the learning process. Although such an approach seems…

  6. A simple approach for the development of urban climatic maps based on the urban characteristics in Tainan, Taiwan.

    PubMed

    Chen, Yu-Cheng; Lin, Tzu-Ping; Lin, Chien-Ting

    2016-12-03

    Motivated by the increasing thermal load in urban environments, this work established an Urban Climatic map (UCmap) focusing on thermal-environment issues, based on urban development factors (e.g., land cover and building characteristics) that represent the thermal load on the human body and the ventilation paths through urban structures. Tainan, a highly developed city in southern Taiwan, was selected as the research area. A 50-m resolution grid was used to capture the urban development factors and climate data, based on 1 year of mobile and fixed-point measurements, from which the thermal load map and the wind environment map were constructed. The results reveal that a higher urban development level is associated with a higher thermal load, and that such areas are more likely than others to suffer from extreme thermal load and poor ventilation. Open and sparse low-rise buildings constitute the most appropriate urban characteristics for the built environment in Tainan. With this simple approach to establishing a UCmap, the microclimate condition and development intensity of a region can easily be detected and linked; for example, compact high-rise areas should be limited by floor-area ratio to prevent the formation of hot spots. Government officials, urban planners, and architects without a meteorological background can efficiently obtain climate information by mapping a given area and can make regulations to mitigate the growing problems of thermal stress and the urban heat island.

  7. A simple approach for the development of urban climatic maps based on the urban characteristics in Tainan, Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, Yu-Cheng; Lin, Tzu-Ping; Lin, Chien-Ting

    2016-12-01

    Motivated by the increasing thermal load in urban environments, this work established an Urban Climatic map (UCmap) focusing on thermal-environment issues, based on urban development factors (e.g., land cover and building characteristics) that represent the thermal load on the human body and the ventilation paths through urban structures. Tainan, a highly developed city in southern Taiwan, was selected as the research area. A 50-m resolution grid was used to capture the urban development factors and climate data, based on 1 year of mobile and fixed-point measurements, from which the thermal load map and the wind environment map were constructed. The results reveal that a higher urban development level is associated with a higher thermal load, and that such areas are more likely than others to suffer from extreme thermal load and poor ventilation. Open and sparse low-rise buildings constitute the most appropriate urban characteristics for the built environment in Tainan. With this simple approach to establishing a UCmap, the microclimate condition and development intensity of a region can easily be detected and linked; for example, compact high-rise areas should be limited by floor-area ratio to prevent the formation of hot spots. Government officials, urban planners, and architects without a meteorological background can efficiently obtain climate information by mapping a given area and can make regulations to mitigate the growing problems of thermal stress and the urban heat island.

  8. Modeling eye movements in visual agnosia with a saliency map approach: bottom-up guidance or top-down strategy?

    PubMed

    Foulsham, Tom; Barton, Jason J S; Kingstone, Alan; Dewhurst, Richard; Underwood, Geoffrey

    2011-08-01

    Two recent papers (Foulsham, Barton, Kingstone, Dewhurst, & Underwood, 2009; Mannan, Kennard, & Husain, 2009) report that neuropsychological patients with a profound object recognition problem (visual agnosic subjects) show differences from healthy observers in the way their eye movements are controlled when looking at images. The interpretation of these papers is that eye movements can be modeled as the selection of points on a saliency map, and that agnosic subjects show an increased reliance on visual saliency, i.e., on low-level stimulus features such as brightness and contrast. Here we review this approach and present new data from our own experiments with an agnosic patient that quantify the relationship between saliency and fixation location. In addition, we consider whether the perceptual difficulties of individual patients might be modeled by selectively weighting the different features involved in a saliency map. Our data indicate that saliency is not always a good predictor of fixation in agnosia: even for our agnosic subject, as for normal observers, the saliency-fixation relationship varied as a function of the task. This means that top-down processes still have a significant effect on the earliest stages of scanning in the setting of visual agnosia, indicating severe limitations for the saliency map model. Top-down, active strategies, which are the hallmark of the human visual system, play a vital role in eye movement control, whether we know what we are looking at or not.
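The saliency-fixation relationship that the authors quantify is commonly scored by comparing saliency sampled at fixated pixels against a chance baseline. A minimal sketch of that generic comparison (this is a standard evaluation style, not the papers' exact method; names are ours):

```python
import numpy as np

def fixation_saliency_score(saliency_map, fixations, n_baseline=1000, seed=0):
    """Mean saliency at fixated pixels minus mean saliency at uniformly
    random pixels. Positive scores indicate fixations fall on locations
    more salient than chance, i.e., some degree of bottom-up guidance.
    """
    rng = np.random.default_rng(seed)
    rows, cols = np.transpose(fixations)          # fixations as (row, col) pairs
    fixated = saliency_map[rows, cols].mean()
    rand_r = rng.integers(saliency_map.shape[0], size=n_baseline)
    rand_c = rng.integers(saliency_map.shape[1], size=n_baseline)
    baseline = saliency_map[rand_r, rand_c].mean()
    return float(fixated - baseline)
```

A score near zero for a given task would indicate the kind of task-dependent breakdown of saliency prediction the abstract describes.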

  9. Mapping suitability of rice production systems for mitigation: Strategic approach for prioritizing improved irrigation management across scales

    NASA Astrophysics Data System (ADS)

    Wassmann, Reiner; Sander, Bjoern Ole

    2016-04-01

    After the successful conclusion of COP21 in Paris, many developing countries are now embracing the task of reducing emissions with much more vigor than before. In many countries of South and South-East Asia, the agriculture sector constitutes a vast share of the national GHG budget, which can mainly be attributed to methane emissions from flooded rice production. Thus, rice-growing countries are now looking for tangible and easily accessible information on how to reduce emissions from rice production in an efficient manner. Given present and future food demand, mitigation options will have to comply with the aim of increasing productivity. At the same time, limited financial resources demand strategic planning of potential mitigation projects based on cost-benefit ratios. At this point, the most promising approach for mitigating methane emissions from rice is an irrigation technique called Alternate Wetting and Drying (AWD). AWD was initially developed for saving water and thus represents an adaptation strategy in its own right, coping with reduced rainfall. Moreover, AWD also reduces methane emissions by 30-70%. However, AWD is not universally suitable. It is attractive to farmers who have to pump water and may save fuel under AWD, but it offers limited incentives in situations where there is no pressing water scarcity. Thus, planning for AWD adoption at larger scales, e.g. for country-wide programs, should be based on a systematic prioritization of target environments. This presentation encompasses a new methodology for mapping the suitability of water saving in rice production, as a means for planning adaptation and mitigation programs, alongside preliminary results. The latter comprise three new GIS maps on the climate-driven suitability of AWD in major rice-growing countries (Philippines, Vietnam, Bangladesh). These maps have been derived from high-resolution data of the areal and temporal extent of rice production that are now

  10. A fast new approach to pharmacophore mapping and its application to dopaminergic and benzodiazepine agonists

    NASA Astrophysics Data System (ADS)

    Martin, Yvonne C.; Bures, Mark G.; Danaher, Elizabeth A.; DeLazzer, Jerry; Lico, Isabella; Pavlik, Patricia A.

    1993-02-01

    In the absence of a 3D structure of the target biomolecule, to propose the 3D requirements for a small molecule to exhibit a particular bioactivity, one must supply both a bioactive conformation and a superposition rule for every active compound. Our strategy identifies both simultaneously. We first generate and optimize all low-energy conformations by any suitable method. For each conformation we then use ALADDIN to calculate the location of points to be considered as part of the superposition. These points include atoms in the molecule and projections from the molecule to hydrogen-bond donors and acceptors or charged groups in the binding site. These positions and the relative energy of each conformation are the input to our new program DISCO. It uses a clique-detection method to find superpositions that contain at least one conformation of each molecule and user-specified numbers of point types and chirality. DISCO is fast; for example, it takes about 1 min of CPU time to propose pharmacophores from 21 conformations of seven molecules. We typically run DISCO several times to compare alternative pharmacophore maps. For D2 dopamine agonists, DISCO shows that the newer 2-aminothiazoles fit the traditional pharmacophore. Using site points correctly identifies the bioactive enantiomers of indoles for comparison with catechols, whereas using only ligand points leads to selecting the inactive enantiomer for the pharmacophore map. In addition, DISCO reproduces pharmacophore maps of benzodiazepines in the literature and proposes subtle improvements. Our experience suggests that clique-detection methods will find many applications in computational chemistry and computer-assisted molecular design.
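DISCO's superposition search is described as clique detection over candidate point correspondences. A generic Bron-Kerbosch enumeration (not the actual DISCO code; the toy correspondence graph is hypothetical) illustrates the idea:

```python
def bron_kerbosch(R, P, X, adj, cliques):
    """Enumerate maximal cliques of an undirected graph (adjacency dict).

    R: current clique, P: candidate vertices that extend R,
    X: already-processed vertices (prevents non-maximal repeats).
    """
    if not P and not X:
        cliques.append(set(R))
        return
    for v in list(P):
        bron_kerbosch(R | {v}, P & adj[v], X & adj[v], adj, cliques)
        P.remove(v)
        X.add(v)

# Hypothetical correspondence graph: nodes are candidate point pairings,
# edges join pairings whose inter-point distances are mutually consistent.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
cliques = []
bron_kerbosch(set(), set(adj), set(), adj, cliques)
# maximal cliques found: {0, 1, 2} and {2, 3}
```

Each maximal clique corresponds to a self-consistent superposition of points, which is why clique size maps naturally onto pharmacophore candidates shared by all molecules.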

  11. Application of digital soil mapping in traditional soil survey - an approach used for the production of the national soil map of the United Arab Emirates

    NASA Astrophysics Data System (ADS)

    Abdelfattah, M. A.; Pain, C.

    2012-04-01

    Digital soil maps are an essential part of the soil assessment framework that supports soil-related decisions and policy-making, so it is of crucial importance that they are of known quality. Digital soil mapping is perhaps the next great advancement in soil survey information. Traditional soil survey has always struggled with the collection of data. The amount of soil data and information required to justify the mapping product, how to interpolate data to similar areas, and how to incorporate older data are all challenges that need further exploration. The present study used digital soil mapping to develop a generalized national soil map of the United Arab Emirates from the available recent traditional soil surveys of Abu Dhabi Emirate (2006-2009) and the Northern Emirates (2010-2012), together with limited data from Dubai Emirate, an important part of the country. The map was developed by joining, generalizing, and correlating the information contained in the Soil Survey of Abu Dhabi Emirate, the soil map of Dubai with its limited data, and the Soil Survey of the Northern Emirates. Because the soil surveys were completed at different times and with different standards and procedures, the original map lines and soil classifications had to be modified in order to integrate the three original maps and legends into this single national-level map. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model (GDEM) version 2 was used to guide line placement of the map units. It was especially helpful for the Torripsamments units, which are separated based on local landscape relief characteristics. A generalized soil map of the United Arab Emirates is produced, which consists of fifteen map units; twelve are named for the soil great group that dominates each unit, and three are named "Rock outcrop", "Mountains", or "Miscellaneous units". Statistical details are also presented. Soil great groups are appropriate taxa to use for soil

  12. Designing a workplace return-to-work program for occupational low back pain: an intervention mapping approach

    PubMed Central

    Ammendolia, Carlo; Cassidy, David; Steensta, Ivan; Soklaridis, Sophie; Boyle, Eleanor; Eng, Stephanie; Howard, Hamer; Bhupinder, Bains; Côté, Pierre

    2009-01-01

    Background Despite over 2 decades of research, the ability to prevent work-related low back pain (LBP) and disability remains elusive. Recent research suggests that interventions that are focused at the workplace and incorporate the principles of participatory ergonomics and return-to-work (RTW) coordination can improve RTW and reduce disability following a work-related back injury. Workplace interventions or programs to improve RTW are difficult to design and implement given the various individuals and environments involved, each with their own unique circumstances. Intervention mapping provides a framework for designing and implementing complex interventions or programs. The objective of this study is to design a best-evidence RTW program for occupational LBP tailored to the Ontario setting using an intervention mapping approach. Methods We used a qualitative synthesis based on the intervention mapping methodology. Best evidence from systematic reviews, practice guidelines and key articles on the prognosis and management of LBP and improving RTW was combined with theoretical models for managing LBP and changing behaviour. This was then systematically operationalized into a RTW program using consensus among experts and stakeholders. The RTW program was further refined following feedback from nine focus groups with various stakeholders. Results A detailed five-step RTW program was developed. The key features of the program include: having trained personnel coordinate the RTW process; identifying and ranking barriers and solutions to RTW from the perspective of all important stakeholders; mediating practical solutions at the workplace; and empowering the injured worker in RTW decision-making. Conclusion Intervention mapping provided a useful framework to develop a comprehensive RTW program tailored to the Ontario setting. PMID:19508728

  13. Quantum entanglement for systems of identical bosons: I. General features

    NASA Astrophysics Data System (ADS)

    Dalton, B. J.; Goold, J.; Garraway, B. M.; Reid, M. D.

    2017-02-01

    These two accompanying papers are concerned with two-mode entanglement for systems of identical massive bosons and its relationship to spin squeezing and other quantum correlation effects. Entanglement is a key quantum feature of composite systems, in which the probabilities for joint measurements on the composite sub-systems are no longer determined from measurement probabilities on the separate sub-systems. There are many aspects of entanglement that can be studied. This two-part review focuses on the meaning of entanglement, the quantum paradoxes associated with entangled states, and the important tests that allow an experimentalist to determine whether a quantum state (in particular, one for massive bosons) is entangled. An overall outcome of the review is to distinguish criteria (and hence experiments) for entanglement that fully utilize the symmetrization principle and the super-selection rules that can be applied to bosonic massive particles. In the first paper (I), the background is given for the meaning of entanglement in the context of systems of identical particles. For such systems, the requirement is that the relevant quantum density operators must satisfy the symmetrization principle and that global and local super-selection rules prohibit states in which there are coherences between differing particle numbers. The justification for these requirements is fully discussed. In the second-quantization approach that is used, both the system and the sub-systems are modes (or sets of modes) rather than particles, particles being associated with different occupancies of the modes. The definition of entangled states is based on first defining the non-entangled states, after specifying which modes constitute the sub-systems. This work mainly focuses on two-mode entanglement for massive bosons, but is put in the context of tests of local hidden variable theories, where one may not be able to make the above restrictions. The review provides the detailed

  14. A NEW APPROACH TO CONSTRAIN BLACK HOLE SPINS IN ACTIVE GALAXIES USING OPTICAL REVERBERATION MAPPING

    SciTech Connect

    Wang, Jian-Min; Du, Pu; Li, Yan-Rong; Hu, Chen; Ho, Luis C.; Bai, Jin-Ming

    2014-09-01

    A tight relation between the size of the broad-line region (BLR) and optical luminosity has been established in about 50 active galactic nuclei studied through reverberation mapping of the broad Hβ emission line. The R_BLR-L relation arises from simple photoionization considerations. Using a general relativistic model of an optically thick, geometrically thin accretion disk, we show that the ionizing luminosity jointly depends on black hole mass, accretion rate, and spin. The non-monotonic relation between the ionizing and optical luminosity gives rise to a complicated relation between the BLR size and the optical luminosity. We show that the reverberation lag of Hβ to the varying continuum depends very sensitively on black hole spin. For retrograde spins, the disk is so cold that there is a deficit of ionizing photons in the BLR, resulting in shrinkage of the hydrogen ionization front with increasing optical luminosity, and hence shortened Hβ lags. This effect is especially striking for luminous quasars undergoing retrograde accretion, manifesting in strong deviations from the canonical R_BLR-L relation. This could lead to a method to estimate the black hole spins of quasars and to study their cosmic evolution. At the same time, the small scatter of the observed R_BLR-L relation for the current sample of reverberation-mapped active galaxies implies that the majority of these sources have rapidly spinning black holes.
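The canonical R_BLR-L relation against which spin-driven deviations would be measured is a single power law between BLR size (equivalently, the Hβ lag in light-days) and optical luminosity. A sketch with illustrative coefficients (roughly the commonly quoted Bentz-type values, not taken from this paper):

```python
import math

def blr_lag_days(l5100_erg_s, k=1.527, alpha=0.533):
    """Hbeta reverberation lag in light-days from a single power-law
    R_BLR-L relation: log10(R/lt-day) = k + alpha*log10(L5100/1e44).

    The coefficients are illustrative assumptions; the paper's point is
    that black-hole spin drives systematic deviations from this relation.
    """
    return 10.0 ** (k + alpha * math.log10(l5100_erg_s / 1e44))
```

A retrograde-spin source with a measured lag well below this prediction at its observed luminosity would be the signature the authors propose exploiting.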

  15. Structural mapping in statistical word problems: A relational reasoning approach to Bayesian inference.

    PubMed

    Johnson, Eric D; Tubau, Elisabet

    2016-09-27

    Presenting natural frequencies facilitates Bayesian inferences relative to using percentages. Nevertheless, many people, including highly educated and skilled reasoners, still fail to provide Bayesian responses to these computationally simple problems. We show that the complexity of relational reasoning (e.g., the structural mapping between the presented and requested relations) can help explain the remaining difficulties. With a non-Bayesian inference that required identical arithmetic but afforded a more direct structural mapping, performance was universally high. Furthermore, reducing the relational demands of the task through questions that directed reasoners to use the presented statistics, as compared with questions that prompted the representation of a second, similar sample, also significantly improved reasoning. Distinct error patterns were also observed between these presented- and similar-sample scenarios, which suggested differences in relational-reasoning strategies. On the other hand, while higher numeracy was associated with better Bayesian reasoning, higher-numerate reasoners were not immune to the relational complexity of the task. Together, these findings validate the relational-reasoning view of Bayesian problem solving and highlight the importance of considering not only the presented task structure, but also the complexity of the structural alignment between the presented and requested relations.
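
The facilitation effect of natural frequencies over percentages can be made concrete with a standard screening-test example; the numbers below are made up for illustration and are not from the study:

```python
# Illustrative natural-frequency version of a textbook Bayesian problem.
# Hypothetical data: of 1000 people, 10 carry a condition and 8 of them
# test positive; 95 of the 990 non-carriers also test positive.

def posterior_from_frequencies(true_pos, false_pos):
    """P(hypothesis | positive) read directly off natural frequencies."""
    return true_pos / (true_pos + false_pos)

def posterior_from_rates(prevalence, sensitivity, false_alarm_rate):
    """The same posterior via explicit Bayes' rule on normalized rates."""
    num = prevalence * sensitivity
    return num / (num + (1 - prevalence) * false_alarm_rate)

p_freq = posterior_from_frequencies(8, 95)          # 8 / (8 + 95)
p_rate = posterior_from_rates(0.01, 0.8, 95 / 990)  # identical arithmetic
```

The two computations are mathematically identical, which is the point of the relational-reasoning account: the frequency format affords a direct structural mapping from the presented numbers to the requested ratio, while the percentage format requires re-representing the problem first.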

  16. A MAP approach for joint motion estimation, segmentation, and super resolution.

    PubMed

    Shen, Huanfeng; Zhang, Liangpei; Huang, Bo; Li, Pingxiang

    2007-02-01

    Super-resolution image reconstruction allows the recovery of a high-resolution (HR) image from several low-resolution images that are noisy, blurred, and downsampled. In this paper, we present a joint formulation for a complex super-resolution problem in which the scenes contain multiple independently moving objects. This formulation is built upon the maximum a posteriori (MAP) framework, which judiciously combines motion estimation, segmentation, and super resolution together. A cyclic coordinate descent optimization procedure is used to solve the MAP formulation, in which the motion fields, segmentation fields, and HR image are each estimated in turn while the other two are held fixed. Specifically, gradient-based methods are employed to solve for the HR image and motion fields, and an iterated conditional mode optimization method is used to obtain the segmentation fields. The proposed algorithm has been tested using a synthetic image sequence, the "Mobile and Calendar" sequence, and the original "Motorcycle and Car" sequence. The experimental results and error analyses verify the efficacy of this algorithm.
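
The cyclic coordinate descent scheme described above (each unknown updated in turn while the others are held fixed) can be sketched on a toy problem; the convex quadratic below is purely illustrative and stands in for the paper's actual MAP energy:

```python
# Minimal sketch of cyclic (block) coordinate descent on a toy objective:
#   f(x) = (x0 - 3)^2 + (x1 + 1)^2 + 0.5 * x0 * x1
# This is an assumption for illustration, not the paper's MAP functional.

def grad(x):
    """Gradient of the toy objective f."""
    return [2 * (x[0] - 3) + 0.5 * x[1],
            2 * (x[1] + 1) + 0.5 * x[0]]

def cyclic_coordinate_descent(grad_f, x0, lr=0.1, sweeps=500):
    """Sweep over the coordinates, taking a gradient step on one at a time."""
    x = list(x0)
    for _ in range(sweeps):
        for i in range(len(x)):
            # update coordinate i while all other coordinates stay fixed
            x[i] -= lr * grad_f(x)[i]
    return x

x_min = cyclic_coordinate_descent(grad, [0.0, 0.0])
```

At the fixed point every partial derivative vanishes, so the sweep converges to the joint minimizer; in the paper the three "coordinates" are the motion fields, the segmentation fields, and the HR image.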

  17. A Computational Approach for Functional Mapping of Quantitative Trait Loci That Regulate Thermal Performance Curves

    PubMed Central

    Yap, John Stephen; Wang, Chenguang; Wu, Rongling

    2007-01-01

    Whether and how thermal reaction norm is under genetic control is fundamental to understanding the mechanistic basis of adaptation to novel thermal environments. However, the genetic study of thermal reaction norm is difficult because it is often expressed as a continuous function or curve. Here we derive a statistical model for dissecting thermal performance curves into individual quantitative trait loci (QTL) with the aid of a genetic linkage map. The model is constructed within the maximum likelihood context and implemented with the EM algorithm. It integrates the biological principle of responses to temperature into a framework for genetic mapping through rigorous mathematical functions established to describe the pattern and shape of thermal reaction norms. The biological advantages of the model lie in the decomposition of the genetic causes for thermal reaction norm into its biologically interpretable modes, such as hotter-colder, faster-slower and generalist-specialist, as well as the formulation of a series of hypotheses at the interface between genetic actions/interactions and temperature-dependent sensitivity. The model is also meritorious in statistics because the precision of parameter estimation and power of QTL detection can be increased by modeling the mean-covariance structure with a small set of parameters. The results from simulation studies suggest that the model displays favorable statistical properties and can be robust in practical genetic applications. The model provides a conceptual platform for testing many ecologically relevant hypotheses regarding organismic adaptation within the Eco-Devo paradigm. PMID:17579725

  18. Deriving pathway maps from automated text analysis using a grammar-based approach.

    PubMed

    Olsson, Björn; Gawronska, Barbara; Erlendsson, Björn

    2006-04-01

    We demonstrate how automated text analysis can be used to support the large-scale analysis of metabolic and regulatory pathways by deriving pathway maps from textual descriptions found in the scientific literature. The main assumption is that correct syntactic analysis combined with domain-specific heuristics provides a good basis for relation extraction. Our method uses an algorithm that searches through the syntactic trees produced by a parser based on a Referent Grammar formalism, identifies relations mentioned in the sentence, and classifies them with respect to their semantic class and epistemic status (facts, counterfactuals, hypotheses). The semantic categories used in the classification are based on the relation set used in KEGG (Kyoto Encyclopedia of Genes and Genomes), so that pathway maps using KEGG notation can be automatically generated. We present the current version of the relation extraction algorithm and an evaluation based on a corpus of abstracts obtained from PubMed. The results indicate that the method is able to combine a reasonable coverage with high accuracy. We found that 61% of all sentences were parsed, and 97% of the parse trees were judged to be correct. The extraction algorithm was tested on a sample of 300 parse trees and was found to produce correct extractions in 90.5% of the cases.

  19. Scaling Watershed Models: Modern Approaches to Science Computation with MapReduce, Parallelization, and Cloud Optimization

    EPA Science Inventory

    Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...

  20. Superalgebra and fermion-boson symmetry

    PubMed Central

    Miyazawa, Hironari

    2010-01-01

    Fermions and bosons are quite different kinds of particles, but it is possible to unify them in a supermultiplet, by introducing a new mathematical scheme called superalgebra. In this article we discuss the development of the concept of symmetry, starting from the rotational symmetry and finally arriving at this fermion-boson (FB) symmetry. PMID:20228617

  1. Goldstone bosons as fractional cosmic neutrinos.

    PubMed

    Weinberg, Steven

    2013-06-14

    It is suggested that Goldstone bosons may be masquerading as fractional cosmic neutrinos, contributing about 0.39 to what is reported as the effective number of neutrino types in the era before recombination. The broken symmetry associated with these Goldstone bosons is further speculated to be the conservation of the particles of dark matter.

  2. Higgs Boson Mass, New Physics and Inflation

    SciTech Connect

    Shafi, Qaisar

    2008-05-13

    Finding the Standard Model scalar (Higgs) boson is arguably the single most important mission of the LHC. I review predictions for the Higgs boson mass based on stability and perturbativity arguments, taking into account neutrino oscillations. Other topics that are briefly discussed include the CMSSM, extra dimensions, higher dimensional orbifold GUTs, and primordial inflation based on the Coleman-Weinberg potential.

  3. Higgs boson couplings: Measurements and theoretical interpretation

    NASA Astrophysics Data System (ADS)

    Mariotti, Chiara; Passarino, Giampiero

    2017-02-01

    This report will review the Higgs boson properties: the mass, the total width and the couplings to fermions and bosons. The measurements have been performed with the data collected in 2011 and 2012 at the LHC accelerator at CERN by the ATLAS and CMS experiments. Theoretical frameworks to search for new physics are also introduced and discussed.

  4. Mapping the Internal Anatomy of the Lateral Brainstem: Anatomical Study with Application to Far Lateral Approaches to Intrinsic Brainstem Tumors

    PubMed Central

    Granger, Andre; Vahedi, Payman; Loukas, Marios; Oskouian, Rod J; Fries, Fabian N; Lotfinia, Iraj; Mortazavi, Martin M; Oakes, W. Jerry; Tubbs, R. Shane

    2017-01-01

    Introduction Intramedullary brainstem tumors present a special challenge to the neurosurgeon. Unfortunately, there is no ideal part of the brainstem to incise for approaches to such pathology. Therefore, the present study was performed to identify what incisions on the lateral brainstem would result in the least amount of damage to eloquent tracts and nuclei. Case illustrations are also discussed. Materials and methods Eight human brainstems were evaluated. Based on dissections and the use of standard atlases of brainstem anatomy, the most important deeper brainstem structures were mapped to the surface of the lateral brainstem. Results With these data, we defined superior acute and inferior obtuse corridors for surgical entrance into the lateral brainstem that would minimize injury to deeper tracts and nuclei, the damage to which would result in significant morbidity. Conclusions To our knowledge, a superficial map of the lateral brainstem for identifying deeper lying and clinically significant nuclei and tracts has not previously been available. Such data might decrease patient morbidity following biopsy or tumor removal or aspiration of brainstem hemorrhage. Additionally, this information can be coupled with the previous literature on approaches into the fourth ventricular floor for more complex, multidimensional lesions. PMID:28357160

  5. Language mapping in children using resting-state functional connectivity: comparison with a task-based approach

    NASA Astrophysics Data System (ADS)

    Gallagher, Anne; Tremblay, Julie; Vannasing, Phetsamone

    2016-12-01

    Patients with brain tumor or refractory epilepsy may be candidates for neurosurgery. Presurgical evaluation often includes language investigation to prevent or reduce the risk of postsurgical language deficits. Current techniques involve significant limitations with pediatric populations. Recently, near-infrared spectroscopy (NIRS) has been shown to be a valuable neuroimaging technique for language localization in children. However, it typically requires the child to perform a task (task-based NIRS), which may constitute a significant limitation. Resting-state functional connectivity NIRS (fcNIRS) is an approach that can be used to identify language networks at rest. This study aims to assess the utility of fcNIRS in children by comparing fcNIRS to more conventional task-based NIRS for language mapping in 33 healthy participants: 25 children (ages 3 to 16) and 8 adults. Data were acquired at rest and during a language task. Results show very good concordance between both approaches for language localization (Dice similarity coefficient=0.81±0.13) and hemispheric language dominance (kappa=0.86, p<0.006). The fcNIRS technique may be a valuable tool for language mapping in clinical populations, including children and patients with cognitive and behavioral impairments.
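
The two agreement measures reported above are standard: the Dice similarity coefficient compares the overlap of two binarized language maps, and Cohen's kappa compares categorical dominance labels. A minimal sketch of the Dice score over sets of "active" channels (the example data are hypothetical):

```python
# Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|).
# The channel indices below are made-up illustrations, not study data.
def dice(a, b):
    """Overlap score in [0, 1] for two collections of active channels."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # two empty maps agree trivially
    return 2 * len(a & b) / (len(a) + len(b))

score = dice([1, 2, 3, 4], [3, 4, 5])  # 2*2 / (4+3) = 4/7
```

A Dice score of 0.81, as reported, means the task-based and resting-state maps share roughly four-fifths of their activated area.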

  6. Thermostatistics of bosonic and fermionic Fibonacci oscillators

    NASA Astrophysics Data System (ADS)

    Algin, Abdullah; Arik, Metin; Senay, Mustafa; Topcu, Gozde

    2017-01-01

    In this work, we first introduce some new properties concerning the Fibonacci calculus. We then discuss the thermostatistics of gas models of two-parameter deformed oscillators, called bosonic and fermionic Fibonacci oscillators, in the thermodynamical limit. In this framework, we analyze the behavior of two-parameter deformed mean occupation numbers describing the Fibonacci-type bosonic and fermionic intermediate-statistics particles. A virial expansion of the equation of state for the bosonic Fibonacci oscillators’ gas model is obtained in both two and three dimensions, and the first five virial coefficients are derived in terms of the real independent deformation parameters p and q. The effect of bosonic and fermionic p, q-deformation on the thermostatistical properties of Fibonacci-type p, q-boson and p, q-fermion gas models are also discussed. The results obtained in this work can be useful for investigating some exotic quasiparticle states encountered in condensed matter systems.

  7. Search for new heavy charged gauge bosons

    SciTech Connect

    Magass, Carsten Martin

    2007-11-02

    Additional gauge bosons are introduced in many theoretical extensions to the Standard Model. A search for a new heavy charged gauge boson W' decaying into an electron and a neutrino is presented. The data used in this analysis was taken with the D0 detector at the Fermilab proton-antiproton collider at a center-of-mass energy of 1.96 TeV and corresponds to an integrated luminosity of about 1 fb⁻¹. Since no significant excess is observed in the data, an upper limit is set on the production cross section times branching fraction σ(W') × Br(W' → eν). Using this limit, a W' boson with mass below ~1 TeV can be excluded at the 95% confidence level assuming that the new boson has the same couplings to fermions as the Standard Model W boson.

  8. The Boson peak in supercooled water

    PubMed Central

    Kumar, Pradeep; Wikfeldt, K. Thor; Schlesinger, Daniel; Pettersson, Lars G. M.; Stanley, H. Eugene

    2013-01-01

    We perform extensive molecular dynamics simulations of the TIP4P/2005 model of water to investigate the origin of the Boson peak reported in experiments on supercooled water in nanoconfined pores, and in hydration water around proteins. We find that the onset of the Boson peak in supercooled bulk water coincides with the crossover to a predominantly low-density-like liquid below the Widom line TW. The frequency and onset temperature of the Boson peak in our simulations of bulk water agree well with the results from experiments on nanoconfined water. Our results suggest that the Boson peak in water is not an exclusive effect of confinement. We further find that, similar to other glass-forming liquids, the vibrational modes corresponding to the Boson peak are spatially extended and are related to transverse phonons found in the parent crystal, here ice Ih. PMID:23771033

  9. Spherical boson stars as black hole mimickers

    SciTech Connect

    Guzman, F. S.; Rueda-Becerril, J. M.

    2009-10-15

    We present spherically symmetric boson stars as black hole mimickers based on the power spectrum of a simple accretion disk model. The free parameters of the boson star are the mass of the boson and the fourth-order self-interaction coefficient in the scalar field potential. We show that even if the mass of the boson is the only free parameter, it is possible to find a configuration that mimics the power spectrum of the disk due to a black hole of the same mass. We also show that for each value of the self-interaction a single boson star configuration can mimic a black hole at very different astrophysical scales in terms of the mass of the object and the accretion rate. In order to show that it is possible to distinguish one of our mimickers from a black hole, we also study the deflection of light.

  10. Analyzing the impact of social factors on homelessness: a Fuzzy Cognitive Map approach

    PubMed Central

    2013-01-01

    Background The forces which affect homelessness are complex and often interactive in nature. Social forces such as addictions, family breakdown, and mental illness are compounded by structural forces such as lack of available low-cost housing, poor economic conditions, and insufficient mental health services. Together these factors impact levels of homelessness through their dynamic relations. Historic models, which are static in nature, have only been marginally successful in capturing these relationships. Methods Fuzzy Logic (FL) and fuzzy cognitive maps (FCMs) are particularly suited to the modeling of complex social problems, such as homelessness, due to their inherent ability to model intricate, interactive systems often described in vague conceptual terms and then organize them into a specific, concrete form (i.e., the FCM) which can be readily understood by social scientists and others. Using FL we converted information, taken from recently published, peer reviewed articles, for a select group of factors related to homelessness and then calculated the strength of influence (weights) for pairs of factors. We then used these weighted relationships in a FCM to test the effects of increasing or decreasing individual or groups of factors. Results of these trials were explainable according to current empirical knowledge related to homelessness. Results Prior graphic maps of homelessness have been of limited use due to the dynamic nature of the concepts related to homelessness. The FCM technique captures greater degrees of dynamism and complexity than static models, allowing relevant concepts to be manipulated and interacted. This, in turn, allows for a much more realistic picture of homelessness. Through network analysis of the FCM we determined that Education exerts the greatest force in the model and hence impacts the dynamism and complexity of a social problem such as homelessness. Conclusions The FCM built to model the complex social system of homelessness
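
The FCM dynamics described in this record reduce to a simple iteration: each concept's activation is updated from the weighted influence of the others and squashed back into [0, 1]. A minimal synchronous sketch, with hypothetical concepts and weights rather than the weights derived in the study:

```python
import math

# Minimal fuzzy cognitive map sketch. Concepts and weight values are
# hypothetical placeholders (e.g. [housing cost, addiction, homelessness]),
# not the FCM weights computed in the paper.

def fcm_step(state, weights):
    """One synchronous update: s_i <- sigmoid(s_i + sum_j w[j][i] * s_j)."""
    n = len(state)
    out = []
    for i in range(n):
        act = state[i] + sum(weights[j][i] * state[j]
                             for j in range(n) if j != i)
        out.append(1.0 / (1.0 + math.exp(-act)))
    return out

def run_fcm(state, weights, steps=100):
    """Iterate until (in practice) the activations settle."""
    for _ in range(steps):
        state = fcm_step(state, weights)
    return state
```

Raising or lowering one concept's initial activation and re-running the map is exactly the kind of "what if" manipulation the authors use to probe which factors, such as Education, exert the greatest force on homelessness.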

  11. Generating synthetic magnetic field intermittency using a Minimal Multiscale Lagrangian Mapping approach

    SciTech Connect

    Subedi, P.; Chhiber, R.; Tessein, J. A.; Wan, M.; Matthaeus, W. H.

    2014-12-01

    The Minimal Multiscale Lagrangian Mapping procedure developed in the context of neutral fluid turbulence is a simple method for generating synthetic vector fields. Using a sequence of low-pass filtered fields, fluid particles are displaced at their rms speed for some scale-dependent time interval, and then interpolated back to a regular grid. Fields produced in this way are seen to possess certain properties of real turbulence. This paper extends the technique to plasmas by taking into account the coupling between the velocity and magnetic fields. We examine several possible applications to plasma systems. One use is as initial conditions for simulations, wherein these synthetic fields may efficiently produce a strongly intermittent cascade. The intermittency properties of the synthetic fields are also compared with those of the solar wind. Finally, studies of cosmic ray transport and modulation in the test particle approximation may benefit from improved realism in synthetic fields produced in this way.

  12. A robotic approach to mapping post-eruptive volcanic fissure conduits

    NASA Astrophysics Data System (ADS)

    Parcheta, Carolyn E.; Pavlov, Catherine A.; Wiltsie, Nicholas; Carpenter, Kalind C.; Nash, Jeremy; Parness, Aaron; Mitchell, Karl L.

    2016-06-01

    VolcanoBot was developed to map volcanic vents and their underlying conduit systems, which are rarely preserved and generally inaccessible to human exploration. It uses a PrimeSense Carmine 1.09 sensor for mapping and carries an IR temperature sensor, analog distance sensor, and an inertial measurement unit (IMU) inside a protective shell. The first field test succeeded in collecting valuable scientific data but revealed several needed improvements, including more rugged cable connections and mechanical couplers, increased ground clearance, and higher-torque motors for uphill mobility. The second field test significantly improved on all of these aspects but it traded electrical ruggedness for reduced data collection speed. Data collected by the VolcanoBots, while intermittent, yield the first insights into the cm-scale geometry of volcanic fissures at depths of up to 25 m. VolcanoBot was deployed at the 1969 Mauna Ulu fissure system on Kīlauea volcano in Hawai'i. It collected first-of-its-kind data from inside the fissure system. We hypothesized that 1) fissure sinuosity should decrease with depth, 2) irregularity should be persistent with depth, 3) any blockages in the conduit should occur at the narrowest points, and 4) the fissure should narrow with depth until it is too narrow for VolcanoBot to pass or is plugged with solidified lava. Our field campaigns did not span enough lateral or vertical area to test sinuosity. The preliminary data indicate that 1) there were many irregularities along fissures at depth, 2) blockages occurred, but not at obviously narrow locations, and 3) the conduit width remained a consistent 0.4-0.5 m for most of the upper 10 m that we analyzed.

  13. Mixed linear model approach for mapping quantitative trait loci underlying crop seed traits.

    PubMed

    Qi, T; Jiang, B; Zhu, Z; Wei, C; Gao, Y; Zhu, S; Xu, H; Lou, X

    2014-09-01

    The crop seed is a complex organ that may be composed of the diploid embryo, the triploid endosperm and the diploid maternal tissues. According to the genetic features of seed characters, two genetic models for mapping quantitative trait loci (QTLs) of crop seed traits are proposed, with inclusion of maternal effects, embryo or endosperm effects of QTL, environmental effects and QTL-by-environment (QE) interactions. The mapping population can be generated either from double back-cross of immortalized F2 (IF2) to the two parents, from random-cross of IF2 or from selfing of IF2 population. Candidate marker intervals potentially harboring QTLs are first selected through one-dimensional scanning across the whole genome. The selected candidate marker intervals are then included in the model as cofactors to control background genetic effects on the putative QTL(s). Finally, a QTL full model is constructed and model selection is conducted to eliminate false positive QTLs. The genetic main effects of QTLs, QE interaction effects and the corresponding P-values are computed by Markov chain Monte Carlo algorithm for Gaussian mixed linear model via Gibbs sampling. Monte Carlo simulations were performed to investigate the reliability and efficiency of the proposed method. The simulation results showed that the proposed method had higher power to accurately detect simulated QTLs and properly estimated the effects of these QTLs. To demonstrate its usefulness, the proposed method was used to identify the QTLs underlying fiber percentage in an upland cotton IF2 population. The computer software QTLNetwork-Seed was developed for QTL analysis of seed traits.

  14. Mapping permeability in low-resolution micro-CT images: A multiscale statistical approach

    NASA Astrophysics Data System (ADS)

    Botha, Pieter W. S. K.; Sheppard, Adrian P.

    2016-06-01

    We investigate the possibility of predicting permeability in low-resolution X-ray microcomputed tomography (µCT). Lower-resolution whole core images give greater sample coverage and are therefore more representative of heterogeneous systems; however, the lower resolution causes connecting pore throats to be represented by intermediate gray scale values and limits information on pore system geometry, rendering such images inadequate for direct permeability simulation. We present an imaging and computation workflow aimed at predicting absolute permeability for sample volumes that are too large to allow direct computation. The workflow involves computing permeability from high-resolution µCT images, along with a series of rock characteristics (notably open pore fraction, pore size, and formation factor) from spatially registered low-resolution images. Multiple linear regression models correlating permeability to rock characteristics provide a means of predicting and mapping permeability variations in larger scale low-resolution images. Results show excellent agreement between permeability predictions made from 16 and 64 µm/voxel images of 25 mm diameter 80 mm tall core samples of heterogeneous sandstone for which 5 µm/voxel resolution is required to compute permeability directly. The statistical model used at the lowest resolution of 64 µm/voxel (similar to typical whole core image resolutions) includes open pore fraction and formation factor as predictor characteristics. Although binarized images at this resolution do not completely capture the pore system, we infer that these characteristics implicitly contain information about the critical fluid flow pathways. Three-dimensional permeability mapping in larger-scale lower resolution images by means of statistical predictions provides input data for subsequent permeability upscaling and the computation of effective permeability at the core scale.
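
The statistical step described above, relating a hard-to-compute target to cheaply measured predictors through multiple linear regression, can be sketched with ordinary least squares via the normal equations. The solver and the synthetic data below are illustrative; the paper's measurements are not reproduced:

```python
# Sketch of fitting a target (e.g. log-permeability) against image-derived
# characteristics (e.g. open pore fraction, formation factor) with OLS.
# Data and variable roles are hypothetical.

def solve(a, b):
    """Solve a dense linear system by Gaussian elimination with pivoting."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n + 1):
                m[r][k] -= f * m[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][k] * x[k]
                              for k in range(r + 1, n))) / m[r][r]
    return x

def ols_fit(rows, y):
    """Fit y ~ intercept + predictors via the normal equations X'X b = X'y."""
    x = [[1.0] + list(r) for r in rows]
    p = len(x[0])
    xtx = [[sum(r[i] * r[j] for r in x) for j in range(p)] for i in range(p)]
    xty = [sum(x[k][i] * y[k] for k in range(len(x))) for i in range(p)]
    return solve(xtx, xty)
```

Once fitted on the spatially registered high/low-resolution pairs, the regression coefficients let the low-resolution characteristics predict permeability wherever direct simulation is infeasible.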

  15. An automatic approach for rice mapping in temperate region using time series of MODIS imagery: first results for Mediterranean environment

    NASA Astrophysics Data System (ADS)

    Boschetti, M.; Nelson, A.; Manfrom, G.; Brivio, P. A.

    2012-04-01

    Timely and accurate information on crop typology and status is required to support suitable action to better manage agricultural production and reduce food insecurity. More specifically, regional crop masking and phenological information are important inputs for spatialized crop growth models in yield forecasting systems. Digital cartographic data available at the global/regional scale, such as GLC2000, GLOBCOVER or the MODIS land cover product (MOD12), are often not adequate for this crop modeling application. For this reason, there is a need to develop and test methods that can provide such information for specific crops using automated classification techniques. In this framework we focused our analysis on the detection of rice cultivation areas due to the importance of this crop. Rice is a staple food for half of the world's population (FAO 2004). Over 90% of the world's rice is produced and consumed in Asia, and the region is home to 70% of the world's poor, most of whom depend on rice for their livelihoods and/or food security. Several initiatives are being promoted at the international level to provide maps of rice cultivated areas in South and South East Asia using different approaches available in the literature for rice mapping in tropical regions. We contribute to these efforts by proposing an automatic method to detect rice cultivated areas in temperate regions exploiting the MODIS 8-Day composite of Surface Reflectance at 500 m spatial resolution (MOD09A1 product). Temperate rice is cultivated worldwide in more than 20 countries covering around 16M ha, for a total production of about 65M tons of paddy per year. The proposed method is based on a common approach available in the literature that first identifies flood conditions that can be related to rice agronomic practice and then checks for vegetation growth. The method presents innovative aspects related both to flood detection, exploiting Short Wave Infrared spectral information, and to crop growth monitoring analyzing

  16. An approach to define potential radon emission level maps using indoor radon concentration measurements and radiogeochemical data positive proportion relationships.

    PubMed

    Drolet, Jean-Philippe; Martel, Richard; Poulin, Patrick; Dessau, Jean-Claude; Lavoie, Denis; Parent, Michel; Lévesque, Benoît

    2013-10-01

    The aim of this paper is to present the first step of a new approach to make a map of radon-prone areas showing different potential radon emission levels in the Quebec province. This map is a tool intended to assist the Quebec government in identifying populations with a higher risk of indoor radon gas exposure. This map of radon-prone areas used available radiogeochemical information for the province of Quebec: (1) Equivalent uranium (eU) concentration from airborne surface gamma-ray surveys; (2) uranium concentration measurements in sediments; and (3) bedrock and surficial geology. Positive proportion relationships (PPR) between each individual criterion and the 1417 available basement radon concentrations were demonstrated. It was also shown that those criteria were reliable indicators of radon-prone areas. The three criteria were discretized into 3, 2 and 2 statistically significant different classes respectively. For each class, statistical heterogeneity was validated by Kruskal-Wallis one way analyses of variance on ranks. Maps of radon-prone areas were drawn for each criterion. Based on this statistical study and on the maps of radon-prone areas in Quebec, 18% of the dwellings located in areas with an equivalent uranium (eU) concentration from airborne surface gamma-ray surveys under 0.75 ppm showed indoor radon concentrations above 150 Bq/m3. This percentage increases to 33% when eU concentrations are between 0.75 ppm and 1.25 ppm and exceeds 40% when eU concentrations are above 1.25 ppm. A uranium concentration in sediments above 20 ppm showed an indoor radon concentration geometric mean of 215 Bq/m3 with more than 69% of the dwellings exceeding 150 Bq/m3 or more than 50% of dwellings exceeding the Canadian radon guideline of 200 Bq/m3. It is also shown that the radon emission potential is higher where a uranium-rich bedrock unit is not covered by a low permeability (silt/clay) surficial deposit.

  17. Capturing nonlocal interaction effects in the Hubbard model: Optimal mappings and limits of applicability

    NASA Astrophysics Data System (ADS)

    van Loon, E. G. C. P.; Schüler, M.; Katsnelson, M. I.; Wehling, T. O.

    2016-10-01

    We investigate the Peierls-Feynman-Bogoliubov variational principle to map Hubbard models with nonlocal interactions to effective models with only local interactions. We study the renormalization of the local interaction induced by nearest-neighbor interaction and assess the quality of the effective Hubbard models in reproducing observables of the corresponding extended Hubbard models. We compare the renormalization of the local interactions as obtained from numerically exact determinant quantum Monte Carlo to approximate but more generally applicable calculations using dual boson, dynamical mean field theory, and the random phase approximation. These more approximate approaches are crucial for any application with real materials in mind. Furthermore, we use the dual boson method to calculate observables of the extended Hubbard models directly and benchmark these against determinant quantum Monte Carlo simulations of the effective Hubbard model.

  18. Mapping trees outside forests using high-resolution aerial imagery: a comparison of pixel- and object-based classification approaches.

    PubMed

    Meneguzzo, Dacia M; Liknes, Greg C; Nelson, Mark D

    2013-08-01

    Discrete trees and small groups of trees in nonforest settings are considered an essential resource around the world and are collectively referred to as trees outside forests (ToF). ToF provide important functions across the landscape, such as protecting soil and water resources, providing wildlife habitat, and improving farmstead energy efficiency and aesthetics. Despite the significance of ToF, forest and other natural resource inventory programs and geospatial land cover datasets that are available at a national scale do not include comprehensive information regarding ToF in the United States. Additional ground-based data collection and acquisition of specialized imagery to inventory these resources are expensive alternatives. As a potential solution, we identified two remote sensing-based approaches that use free high-resolution aerial imagery from the National Agriculture Imagery Program (NAIP) to map all tree cover in an agriculturally dominant landscape. We compared the results obtained using an unsupervised per-pixel classifier (independent component analysis [ICA]) and an object-based image analysis (OBIA) procedure in Steele County, Minnesota, USA. Three types of accuracy assessments were used to evaluate how each method performed in terms of: (1) producing a county-level estimate of total tree-covered area, (2) correctly locating tree cover on the ground, and (3) how tree cover patch metrics computed from the classified outputs compared to those delineated by a human photo interpreter. Both approaches were found to be viable for mapping tree cover over a broad spatial extent and could serve to supplement ground-based inventory data. The ICA approach produced an estimate of total tree cover more similar to the photo-interpreted result, but the output from the OBIA method was more realistic in terms of describing the actual observed spatial pattern of tree cover.

  19. LANDSCAPE ECOLOGY APPROACHES FOR DETECTING, MAPPING, AND ASSESSING THE VULNERABILITY OF DEPRESSIONAL WETLANDS

    EPA Science Inventory

    U.S. EPA is using a landscape ecology approach to assess the ecological/hydrologic functions and related human values of depressional wetlands along coastal Texas, considered to be vulnerable to human disturbance. Many of those wetlands may be at high risk because of recent court...

  20. Geospatial Approach to Regional Mapping of Research Library Holdings: Use of Arcinfo at IRANDOC

    ERIC Educational Resources Information Center

    Sedighi, Mehri-e-

    2007-01-01

    Purpose: The purpose of this paper is to provide a report on the application of a Geographic Information System (GIS), ArcInfo, in the cataloguing of geosciences documents held by IRANDOC. Design/methodology/approach: The steps involved in the application are described: gathering the data and required input including the attribute and spatial…

  1. Constructing Adverse Outcome Pathways: a Demonstration of an Ontology-based Semantics Mapping Approach

    EPA Science Inventory

    Adverse outcome pathway (AOP) provides a conceptual framework to evaluate and integrate chemical toxicity and its effects across the levels of biological organization. As such, it is essential to develop a resource-efficient and effective approach to extend molecular initiating ...

  2. An indicators' based approach to Drought and Water Scarcity Risk Mapping in Pinios River Basin, Greece.

    NASA Astrophysics Data System (ADS)

    Kossida, Maggie; Mimikou, Maria

    2013-04-01

Assessing the vulnerability and the associated risk to water scarcity and drought is a complex multi-factor problem. The underlying exposure to climatic stresses may be similar even in quite different conditions, yet the vulnerability and prevailing risk are a function of the socio-economic state, the current policy and institutional setting, the adaptive capacity of the affected area and population, and the response strategies adopted (Kossida et al., 2012). Although flood risk assessment has been elaborated under the EU Floods Directive, there is currently a lack of analytical frameworks for the definition and assessment of drought and water scarcity related risk at European level. This can partially be attributed to the inherent complexity of such phenomena, which lie at the crossroads between physical and anthropogenic drivers and pressures, operating on many scales, and with a variety of impacts on many sectors. The quantification of the various components of drought and water scarcity risk is challenging since data present limitations, relevant indicators that can represent or proxy the various components are still not clearly defined, while their relevant weights need to be determined in view of the prevailing regional conditions. The current study in Pinios River Basin, an area highly impacted by drought and water scarcity, proposes a methodology for drought and water scarcity risk assessment using blended indicators. Using the Standardized Precipitation Index (SPI) as a base drought indicator, relevant sub-indicators reflecting the magnitude, severity, duration and recurrence of drought events from 1980 to 2011 have been produced. These sub-indicators have been assigned relevant scores and have been blended into a Drought Vulnerability Index (DVI) using different weights derived from an analytical hierarchy process (AHP). The resulting map of DVI has then been blended with additional socio-economic indicators of surface and groundwater exploitation, water deficit
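    The indicator-blending step can be sketched as a weighted sum, with the weights obtained from an AHP pairwise-comparison matrix. All numbers below are illustrative assumptions, not values from the Pinios study; the AHP weight vector is the normalized principal eigenvector of the pairwise matrix.

```python
import numpy as np

# Hypothetical sub-indicator scores per grid cell (rows) for drought
# magnitude, severity, duration and recurrence (columns), each already
# rescaled to a 0-1 range.
scores = np.array([
    [0.8, 0.6, 0.7, 0.9],   # severely affected cell
    [0.2, 0.3, 0.1, 0.4],   # mildly affected cell
    [0.5, 0.5, 0.6, 0.5],   # intermediate cell
])

# AHP pairwise-comparison matrix (illustrative): entry [i, j] encodes how
# much more important indicator i is than indicator j.
pairwise = np.array([
    [1.0, 2.0, 2.0, 3.0],
    [0.5, 1.0, 1.0, 2.0],
    [0.5, 1.0, 1.0, 2.0],
    [1/3, 0.5, 0.5, 1.0],
])

# AHP weights: principal eigenvector of the pairwise matrix, normalized.
eigvals, eigvecs = np.linalg.eig(pairwise)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()

# Drought Vulnerability Index: weighted blend of the sub-indicator scores.
dvi = scores @ weights
print(np.round(dvi, 3))
```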

  3. Towards a new approach for generating probabilistic hazard maps for pyroclastic flows during lava dome eruptions.

    NASA Astrophysics Data System (ADS)

    Calder, E. S.; Pitman, B.; Wolpert, R.; Bayarri, S.; Spiller, E.; Berger, J.

    2009-05-01

It is increasingly being understood that development of mathematical models of geophysical phenomena, while a fundamental step, is only part of the process of modeling and predicting inundation limits for natural hazards. In this work we combine data from hundreds of observed pyroclastic flows at the Soufriere Hills Volcano, Montserrat, a geophysical flow model, and statistical modeling to derive a new methodology for generating probabilistic hazard maps. The initial step consists of estimating probabilities of inundation at particular discrete points of interest (e.g. airport and Plymouth). The methodology starts with a computer model of the geophysical process, in this case the TITAN2D model that has been developed for modeling geophysical mass flows. A key input to the computer model is the probability distribution for the initial volume and direction of the flows based on observed data. An important limitation is that for modeling purposes, the observations represent relatively scarce datasets, while from a volcanological perspective datasets such as those from the prolonged and relatively well-monitored eruption of the Soufriere Hills Volcano are as complete as can be realistically obtained. By combining flow event data, probability modeling and statistical methods, a probability distribution of severity and frequency of flow events is derived. Understanding and predicting the effects of volcanic hazards involves understanding the extreme event tail (the largest flow events), but this is notoriously difficult, especially with limited data, and prohibitively expensive to compute directly. Instead a statistical emulator (or surrogate of the computer model) is used: a computationally cheap response surface approximating the output of the flow simulations, which is constructed based on carefully chosen computer model runs. The speed of the emulator then allows one to 'solve the inverse problem': that is, to determine regions of input values (characteristics of the flow

  4. A Novel Approach to Mapping Intertidal Areas Using Shore-Based X-band Marine Radar

    NASA Astrophysics Data System (ADS)

    Bird, Cai; Bell, Paul

    2014-05-01

Monitoring the morphology of coastal zones in response to high energy weather events and changing patterns of erosion and deposition over time is vital in enabling effective decision-making at the coast. Common methods of mapping intertidal bathymetry currently include vessel-based sonar and airborne LiDAR surveys, which are expensive and thus not routinely collected on a continuous basis. Marine radar is a ubiquitous technology in the marine industry and many ports operate a system to guide ships into port; this work aims to utilise this existing infrastructure to determine bathymetry over large intertidal areas, currently up to 4 km from the radar. Standard X-band navigational radar has been used in the marine industry to measure hydrodynamics and derive bathymetry using empirical techniques for several decades. Methods of depth mapping thus far have relied on the electromagnetic backscattering from the wind-roughened water surface, which allows a radar to gather sea surface image data but requires the waves to be clearly defined. The work presented here does not rely on identifying and measuring these spatial wave features, which increases the robustness of the method. Image data collected by a 9.4 GHz Kelvin Hughes radar from a weather station on Hilbre Island at the mouth of the River Dee estuary, UK, were used in the development of this method. Image intensity at each pixel is a function of returned electromagnetic energy, which in turn can be related to the roughness of the sea surface. Images collected over time periods of 30 minutes show general patterns of wave breaking and mark the advance and retreat of the waterline in accordance with the tidal cycle and intertidal morphology. Each pixel value can be extracted from these mean images and analysed over the course of several days, giving a fluctuating time series of pixel intensity, the gradient of which gives a series of pulses representing transitions between wet and dry at each location. A tidal

  5. Strong correlations in bosons and fermions

    NASA Astrophysics Data System (ADS)

    Tilahun, Dagim

of the lasers. But as always, even in these designer-made "solid state" systems, practical considerations introduce complications that blur the theoretical interpretation of experimental results, such as inhomogeneities in the lattice structure. The first part of this thesis presents a quantum theory of ultracold bosonic atoms in optical lattices capable of describing the properties of the various phases and the transitions between them. Its usefulness, compared to other approaches, we believe rests in its broad applicability and in the relative ease with which it handles the complications while producing quantitatively accurate results. The second topic of the thesis deals with the behavior of electrons whose motion is restricted to a single dimension. This reduction in the dimensionality of the system brings about cataclysmic changes to their properties. This is evinced in the collective bosonic statistics of the excitations, despite the fermionic nature of the constituent particles. This is in contrast to higher dimensions, where the excitations are of single-particle type and maintain fermionic statistics even under strong interactions, so-called quasiparticles. This is perhaps quite intuitive in that a one-dimensional electron cannot move about without strongly affecting all the other electrons in its way, hence the collective nature of the excitations. Perhaps the most exotic feature of these systems, known as Luttinger liquids (LL), is the decoupling of the spin and charge degrees of freedom, a phenomenon known as spin-charge separation. Strong repulsive (attractive) interactions tend to enhance the propagating velocity of the charge (spin) mode while suppressing the other. At low density of electrons, stronger interactions open up a window of energy between the characteristic charge and spin energies. If one can tune the temperature of the system to this window, the spin sector will consist of thermalized random spins while the charge is still essentially a Luttinger

  6. An approach to improve the spatial resolution of a force mapping sensing system

    NASA Astrophysics Data System (ADS)

    Negri, Lucas Hermann; Manfron Schiefer, Elberth; Sade Paterno, Aleksander; Muller, Marcia; Luís Fabris, José

    2016-02-01

This paper proposes a smart sensor system capable of detecting sparse forces applied to different positions of a metal plate. The sensing is performed with strain transducers based on fiber Bragg gratings (FBG) distributed under the plate. Forces acting on nine square regions of the plate, resulting from up to three different loads applied simultaneously to the plate, were monitored with seven transducers. The system determines the magnitude of the force/pressure applied on each specific area, even in the absence of a dedicated transducer for that area. The set of strain transducers with coupled responses and a compressive sensing algorithm are employed to solve the underdetermined inverse problem which emerges from mapping the force. In this configuration, experimental results have shown that the system is capable of recovering the value of the load distributed on the plate with a signal-to-noise ratio better than 12 dB, when the plate is submitted to three simultaneous test loads. The proposed method is a practical illustration of compressive sensing algorithms for the reduction of the number of FBG-based transducers used in a quasi-distributed configuration.

  7. Mapping Risk of Malaria Transmission in Mainland Portugal Using a Mathematical Modelling Approach

    PubMed Central

    Capinha, César; Rocha, Jorge; Sousa, Carla

    2016-01-01

Malaria is currently one of the world's major health problems. About a half-million deaths are recorded every year. In Portugal, malaria cases were significantly high until the end of the 1950s but the disease was considered eliminated in 1973. In the past few years, endemic malaria cases have been recorded in some European countries. With the increasing human mobility from countries with endemic malaria to Portugal, there is concern about the resurgence of this disease in the country. Here, we model and map the risk of malaria transmission for mainland Portugal, considering 3 different scenarios of existing imported infections. This risk assessment resulted from entomological studies on An. atroparvus, the only known mosquito capable of transmitting malaria in the study area. We used the malariogenic potential (determined by receptivity, infectivity and vulnerability) applied over geospatial data sets to estimate spatial variation in malaria risk. The results suggest that the risk exists, and the hotspots are concentrated in the northeast region of the country and in the upper and lower Alentejo regions. PMID:27814371

  8. Critical phenomena in self-organizing feature maps: Ginzburg-Landau approach

    NASA Astrophysics Data System (ADS)

Der, R.; Herrmann, M.

    1994-06-01

    Self-organizing feature maps (SOFM's) as generated by Kohonen's algorithm are prominent examples of the cross fertilization between theoretical physics and neurobiology. SOFM's serve as high-fidelity models for the internal representation of the external world in the cortex. This is exploited for applications in the fields of data analysis, robotics, and for the data-driven coarse graining of state spaces of nonlinear dynamical systems. From the point of view of physics Kohonen's algorithm may be viewed as a stochastic dynamical equation of motion for a many particle system of high complexity which may be analyzed by methods of nonequilibrium statistical mechanics. We present analytical and numerical studies of symmetry-breaking phenomena in Kohonen's SOFM that occur due to a topological mismatch between the input space and the neuron setup. We give a microscopic derivation for the time dependent Ginzburg-Landau equations describing the behavior of the order parameter close to the critical point where a topology preserving second-order phase transition takes place. By extensive computer simulations we do not only support our theoretical findings, but also discover a first order transition leading to a topology violating metastable state. Consequently, close to the critical point we observe a phase-coexistence regime.
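    Kohonen's algorithm itself is compact enough to state in a few lines: each stimulus selects a best-matching unit, and all neurons are pulled toward the stimulus with a strength set by a shrinking neighborhood function. The sketch below is a minimal 1-D illustration with assumed annealing schedules; a well-organized (topology-preserving) map typically ends with monotonically ordered weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal 1-D Kohonen SOFM: a chain of 20 neurons learns a
# topology-preserving map of stimuli drawn uniformly from [0, 1].
n_neurons = 20
weights = rng.uniform(0.0, 1.0, size=n_neurons)   # one 1-D weight per neuron
positions = np.arange(n_neurons)                  # neuron grid coordinates

for step in range(5000):
    x = rng.uniform()                              # random stimulus
    winner = np.argmin(np.abs(weights - x))        # best-matching unit
    sigma = 3.0 * np.exp(-step / 1500)             # shrinking neighborhood width
    eta = 0.2 * np.exp(-step / 2500)               # decaying learning rate
    h = np.exp(-((positions - winner) ** 2) / (2 * sigma ** 2))
    weights += eta * h * (x - weights)             # Kohonen update rule

print(np.round(weights, 2))
```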

  9. Spectral SP: A New Approach to Mapping Reservoir Flow and Permeability

    SciTech Connect

    Thomas, Donald M.; Lienert, Barry R.; Wallin, Erin L.; Gasperikova, Erika

    2014-05-27

Our objectives for the current project were to develop an innovative inversion and analysis procedure for magnetotelluric field data and time-variable self-potentials that would enable us to map not only the subsurface resistivity structure of a geothermal prospect but also to delineate the permeability distribution within the field. Hence, the ultimate objective was to provide better targeting information for exploratory and development drilling of a geothermal prospect. Field data were collected and analyzed from the Kilauea Summit, Kilauea East Rift Zone, and the Humuula Saddle between Mauna Loa and Mauna Kea volcanoes. All of these areas were known or suspected to have geothermal activity of varying intensities. While our results provided evidence for significant long-term coordinated changes in spontaneous potential that could be associated with subsurface flows, significant interferences were encountered that arose from surface environmental changes (rainfall, temperature) and rendered it nearly impossible to unequivocally distinguish between deep fluid flow changes and environmental effects. Further, the analysis of the inferred spontaneous potential changes in the context of the depth of the signals, and hence of permeability horizons, could not be completed in the time available.

  10. A New Approach to the Internal Calibration of Reverberation-Mapping Spectra

    NASA Astrophysics Data System (ADS)

    Fausnaugh, M. M.

    2017-02-01

We present a new procedure for the internal (night-to-night) calibration of time-series spectra, with specific applications to optical AGN reverberation mapping data. The traditional calibration technique assumes that the narrow [O iii] λ5007 emission-line profile is constant in time; given a reference [O iii] λ5007 line profile, nightly spectra are aligned by fitting for a wavelength shift, a flux rescaling factor, and a change in the spectroscopic resolution. We propose the following modifications to this procedure: (1) we stipulate a constant spectral resolution for the final calibrated spectra, (2) we employ a more flexible model for changes in the spectral resolution, and (3) we use a Bayesian modeling framework to assess uncertainties in the calibration. In a test case using data for MCG+08-11-011, these modifications result in a calibration precision of ∼1 millimagnitude, which is approximately a factor of five improvement over the traditional technique. At this level, other systematic issues (e.g., the nightly sensitivity functions and Fe II contamination) limit the final precision of the observed light curves. We implement this procedure as a python package (mapspec), which we make available to the community.

  11. Physical activity, physical fitness and academic achievement in adolescents: a self-organizing maps approach.

    PubMed

    Pellicer-Chenoll, Maite; Garcia-Massó, Xavier; Morales, Jose; Serra-Añó, Pilar; Solana-Tramunt, Mònica; González, Luis-Millán; Toca-Herrera, José-Luis

    2015-06-01

The relationship among physical activity, physical fitness and academic achievement in adolescents has been widely studied; however, controversy concerning this topic persists. The methods used thus far to analyse the relationship between these variables have included mostly traditional linear analyses, according to the available literature. The aim of this study was to perform a visual analysis of this relationship with self-organizing maps and to monitor the subjects' evolution during the 4 years of secondary school. Four hundred and forty-four students participated in the study. The physical activity and physical fitness of the participants were measured, and the participants' grade point averages were obtained from the five participating institutions. Four main clusters representing two primary student profiles with few differences between boys and girls were observed. The clustering demonstrated that students with higher energy expenditure and better physical fitness exhibited lower body mass index (BMI) and higher academic performance, whereas those adolescents with lower energy expenditure exhibited worse physical fitness, higher BMI and lower academic performance. With respect to the evolution of the students during the 4 years, ∼25% of the students originally clustered in a negative profile moved to a positive profile, and there was no movement in the opposite direction.

  12. Mapping social capital: a critical contextual approach for working with low-status families.

    PubMed

    Garcia, Marisol; McDowell, Teresa

    2010-01-01

Promoting justice in therapeutic work with families demands an analysis of contextual factors such as race, ethnicity, gender, and social class in relationship to societal systems of power, privilege, and oppression. A broad understanding of these dynamics, however, is inadequate to inform our work with families whose social capital severely limits available life choices, social influence, and material resources. In this article, we describe working from a critical contextual perspective to consider how families gain and/or lose social capital through participation in multiple contexts. We introduce a technique for mapping social capital within and across multiple systems as well as suggestions for interventions aimed at increasing the social well-being of low-status families. These include considering the dynamics of boundary crossing, recognizing and optimizing resistance to oppressive dynamics, finding ways to limit constraints and optimize opportunities, and developing webs of allies to support family functioning and access to resources. We offer the example of 13-year-old Pepe as a case in point.

  13. Mapping Risk of Malaria Transmission in Mainland Portugal Using a Mathematical Modelling Approach.

    PubMed

    Gomes, Eduardo; Capinha, César; Rocha, Jorge; Sousa, Carla

    2016-01-01

Malaria is currently one of the world's major health problems. About a half-million deaths are recorded every year. In Portugal, malaria cases were significantly high until the end of the 1950s but the disease was considered eliminated in 1973. In the past few years, endemic malaria cases have been recorded in some European countries. With the increasing human mobility from countries with endemic malaria to Portugal, there is concern about the resurgence of this disease in the country. Here, we model and map the risk of malaria transmission for mainland Portugal, considering 3 different scenarios of existing imported infections. This risk assessment resulted from entomological studies on An. atroparvus, the only known mosquito capable of transmitting malaria in the study area. We used the malariogenic potential (determined by receptivity, infectivity and vulnerability) applied over geospatial data sets to estimate spatial variation in malaria risk. The results suggest that the risk exists, and the hotspots are concentrated in the northeast region of the country and in the upper and lower Alentejo regions.

  14. Statistical approaches to human brain mapping by functional magnetic resonance imaging.

    PubMed

    Lange, N

    1996-02-28

    Proper use of functional neuro-imaging through effective experimental design and modern statistical analysis provides new insights in current brain research. This tutorial has two aims: to describe aspects of this technology to applied statisticians and to provide some statistical ideas to neuroscientists unfamiliar with quantitative analytic methods that accommodate randomness. Introductory background material and ample references to current literature on the physics of magnetic resonance imaging, Fourier methods for image reconstruction and measures of image quality are included. Two of the statistical approaches mentioned here are extensions of established methods for longitudinal data analysis to the frequency domain. A recent case study provides real-world instances of approaches, problems and open questions encountered in current functional neuro-imaging research and an introduction to the analysis of spatial time series in this context.

  15. A segmentation-based approach to SAR change detection and mapping

    NASA Astrophysics Data System (ADS)

    Garzelli, Andrea; Zoppetti, Claudia

    2016-10-01

The potential of SAR sensors in change detection applications has recently been strengthened by the high spatial resolution and the short revisit time provided by the new generation of SAR-based missions, such as COSMO-SkyMed, TerraSAR-X, and RadarSat 3. Classical pixel-based change detection methods exploit first-order statistics variations in multitemporal acquisitions. Higher-order statistics may improve the reliability of the results, while plain object-based change detection is rarely applied to SAR images due to the low signal-to-noise ratio which characterizes 1-look VHR SAR image products. The paper presents a hybrid approach considering both a pixel-based selection of likely-changed pixels and a segmentation-driven step based on the assumption that structural changes correspond to some clusters in a multiscale amplitude/texture representation. Experiments on simulated and true SAR image pairs demonstrate the advantages of the proposed approach.
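    The pixel-based first step is conventionally built on the log-ratio operator, which turns multiplicative speckle into additive noise around zero; aggregating the log-ratio over segments then suppresses speckle before thresholding. The sketch below uses synthetic gamma-distributed speckle and crude non-overlapping blocks in place of a real segmentation; all sizes and thresholds are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two co-registered synthetic SAR intensity images: gamma-distributed
# speckle over constant backscatter, with a new bright structure
# (10x backscatter) appearing in pixels 16:32 x 16:32 of the second image.
before = rng.gamma(shape=4.0, scale=1.0, size=(64, 64))
after = rng.gamma(shape=4.0, scale=1.0, size=(64, 64))
after[16:32, 16:32] *= 10.0

# Log-ratio operator: multiplicative speckle becomes additive noise.
log_ratio = np.log(after) - np.log(before)

# Segment-driven step, crudely approximated by non-overlapping 8x8 blocks:
# averaging within each block suppresses speckle before thresholding.
block_means = log_ratio.reshape(8, 8, 8, 8).mean(axis=(1, 3))
changed = np.argwhere(np.abs(block_means) > 0.5)
print(changed)
```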

  16. Exploring links between juvenile offenders and social disorganization at a large map scale: a Bayesian spatial modeling approach

    NASA Astrophysics Data System (ADS)

    Law, Jane; Quick, Matthew

    2013-01-01

This paper adopts a Bayesian spatial modeling approach to investigate the distribution of young offender residences in York Region, Southern Ontario, Canada, at the census dissemination area level. Few geographic studies have analyzed offender (as opposed to offense) data at a large map scale (i.e., using a relatively small areal unit of analysis) to minimize aggregation effects. Providing context is the social disorganization theory, which hypothesizes that areas with economic deprivation, high population turnover, and high ethnic heterogeneity exhibit social disorganization and are expected to facilitate higher instances of young offenders. Non-spatial and spatial Poisson models indicate that spatial methods are superior to non-spatial models with respect to model fit and that index of ethnic heterogeneity, residential mobility (1 year moving rate), and percentage of residents receiving government transfer payments are, respectively, the most significant explanatory variables related to young offender location. These findings provide overwhelming support for social disorganization theory as it applies to offender location in York Region, Ontario. Targeting areas where prevalence of young offenders could or could not be explained by social disorganization through decomposing the estimated risk map is helpful for dealing with juvenile offenders in the region. Results prompt discussion into geographically targeted police services and young offender placement pertaining to risk of recidivism. We discuss possible reasons for differences and similarities between the previous findings (that analyzed offense data and/or were conducted at a smaller map scale) and our findings, limitations of our study, and practical outcomes of this research from a law enforcement perspective.
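    The non-spatial baseline in such comparisons is a Poisson regression of area counts on the covariates, classically fitted by iteratively reweighted least squares (IRLS); the Bayesian spatial variants add area-level random effects (e.g. a CAR prior). The sketch below fits the non-spatial model on simulated data; covariate names and coefficient values are hypothetical, chosen to mirror the three covariates named in the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated area-level data: counts per dissemination area with three
# standardized covariates (ethnic heterogeneity, residential mobility,
# government-transfer rate) plus an intercept column.
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
beta_true = np.array([1.0, 0.4, 0.3, 0.5])
y = rng.poisson(np.exp(X @ beta_true))

# Poisson regression (log link) fitted by IRLS, the classical GLM algorithm.
beta = np.zeros(X.shape[1])
for _ in range(50):
    mu = np.exp(X @ beta)
    W = mu                                   # Poisson: variance equals mean
    z = X @ beta + (y - mu) / mu             # working response
    WX = X * W[:, None]
    beta = np.linalg.solve(X.T @ WX, WX.T @ z)

print(np.round(beta, 2))
```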

  17. A molecular dynamics approach to receptor mapping: application to the 5HT3 and beta 2-adrenergic receptors.

    PubMed

    Gouldson, P R; Winn, P J; Reynolds, C A

    1995-09-29

A molecular dynamics-based approach to receptor mapping is proposed, based on the method of Rizzi (Rizzi, J. P.; et al. J. Med. Chem. 1990, 33, 2721). In Rizzi's method, the interaction energy between a series of drug molecules and probe atoms (which mimic functional groups on the receptor, such as hydrogen bond donors) was calculated. These interactions were calculated on a three-dimensional grid; dummy atoms, assigned molecular mechanics parameters, were placed at the interaction-energy minima. The distances between the dummy atom sites were monitored during molecular dynamics simulations and plotted as distance distribution functions. Important distances within the receptor became apparent, as drugs with a common mode of binding share similar peaks in the distance distribution functions. In the case of specific 5HT3 ligands, the important donor--acceptor distance within the receptor has a range of ca. 7.9--8.9 A. In the case of specific beta 2-adrenergic ligands, the important donor--acceptor distances within the receptor lie between ca. 7 and 9 A and between 8 and 10 A. These distance distribution functions were used to assess three different models of the beta 2-adrenergic G-protein-coupled receptor. The comparison of the distance distribution functions for the simulation with the actual donor--acceptor distances in the receptor models suggested that two of the three receptor models were much more consistent with the receptor-mapping studies. These receptor-mapping studies gave support for the use of rhodopsin, rather than the bacteriorhodopsin template, for modeling G-protein-coupled receptors but also sounded a warning that agreement with binding data from site-directed mutagenesis experiments does not necessarily validate a receptor model.

  18. Multi-Sensor Approach to Mapping Snow Cover Using Data From NASA's EOS Aqua and Terra Spacecraft

    NASA Astrophysics Data System (ADS)

    Armstrong, R. L.; Brodzik, M. J.

    2003-12-01

Snow cover is an important variable for climate and hydrologic models due to its effects on energy and moisture budgets. Over the past several decades both optical and passive microwave satellite data have been utilized for snow mapping at the regional to global scale. For the period 1978 to 2002, we have shown earlier that both passive microwave and visible data sets indicate a similar pattern of inter-annual variability, although the maximum snow extents derived from the microwave data are, depending on season, less than those provided by the visible satellite data, and the visible data typically show higher monthly variability. Snow mapping using optical data is based on the magnitude of the surface reflectance, while microwave data can be used to identify snow cover because the microwave energy emitted by the underlying soil is scattered by the snow grains, resulting in a sharp decrease in brightness temperature and a characteristic negative spectral gradient. Our previous work has defined the respective advantages and disadvantages of these two types of satellite data for snow cover mapping and it is clear that a blended product is optimal. We present a multi-sensor approach to snow mapping based both on historical data as well as data from current NASA EOS sensors. For the period 1978 to 2002 we combine data from the NOAA weekly snow charts with passive microwave data from the SMMR and SSM/I brightness temperature record. For the current and future time period we blend MODIS and AMSR-E data sets. An example of validation at the brightness temperature level is provided through the comparison of AMSR-E with data from the well-calibrated heritage SSM/I sensor over a large homogeneous snow-covered surface (Dome C, Antarctica). Prototype snow cover maps from AMSR-E compare well with maps derived from SSM/I. Our current blended product is being developed in the 25 km EASE-Grid while the MODIS data being used are in the Climate Modeling Grid (CMG) at approximately 5 km

  19. Approaches to strategic research and technology (R&T) analysis and road mapping

    NASA Astrophysics Data System (ADS)

    Mankins, John C.

    2002-07-01

Increasingly, the timely and successful incorporation of innovative technologies into new systems is a critical factor in their success or failure. This is true for both commercial and government space missions. In addition, continuing progress in methodologies that may enable the effective identification of long-term technology needs and opportunities—and the guidance of ongoing research and technology (R&T) programs to address them—is vital to progress in space exploration and commercial development. NASA's long-standing use of technology readiness levels (TRLs) is one such approach. These technology discipline-independent metrics provide a valuable tool in technology management at all levels in an organization. However, TRLs provide only the basic guideposts for R&T management: information on the current and desired level of maturity of a technology for a particular application. In order to succeed over the longer term, additional methodologies are needed, including those which allow the identification of anticipated uncertainty in planned R&T programs, as well as approaches that permit the identification of overall technology-derived uncertainty in future space systems developments. This paper provides a preliminary discussion of this critical subject, including an overview of the history and the current practices of the TRL approach. In addition, the paper presents a recently-formulated strategic technology management approach that attempts to address the question of uncertainty in technology development and applications: the Integrated Technology Analysis Methodology (ITAM). The paper concludes with a discussion of future directions for space technology management, and how these tools might be used to facilitate coordination and discussions in an international setting.

  20. A Tetrahedron-Based Endmember Selection Approach for Urban Impervious Surface Mapping

    PubMed Central

    Wang, Wei; Yao, Xinfeng; Zhai, Junpeng; Ji, Minhe

    2014-01-01

    The pixel purity index (PPI) and two-dimensional (2-D) scatter plots are two popular techniques for endmember extraction in remote sensing spectral mixture analysis, yet both suffer from one major drawback: the selection of a final set of endmembers has to endure a cumbersome process of iterative visual inspection and human intervention, especially when a spectrally-complex urban scene is involved. Within the conceptual framework of a V-H-L-S (vegetation-high albedo-low albedo-soil) model, which is expanded from the classic V-I-S (vegetation-impervious surface-soil) model, a tetrahedron-based endmember selection approach combined with a multi-objective optimization genetic algorithm (MOGA) was designed to identify urban endmembers from multispectral imagery. The tetrahedron defining the enclosing volume of MNF-transformed pixels in a three-dimensional (3-D) space was algorithmically sought, so that the tetrahedral vertices can ideally match the four components of the adopted model. A case study with Landsat Enhanced Thematic Mapper Plus (ETM+) satellite imagery in Shanghai, China was conducted to verify the validity of the method. The method performance was compared with those of the traditional PPI and 2-D scatter plot approaches. The results indicated that the tetrahedron-based endmember selection approach performed better in both accuracy and ease of identification for urban surface endmembers, owing to the 3-D visualization analysis and use of the MOGA. PMID:24892938
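
    As a toy illustration of the geometric core of this record, the sketch below scores candidate endmember sets by the volume of the tetrahedron they span in a 3-D (MNF-like) feature space, and enlarges that volume by greedy vertex swaps. The greedy search, the `greedy_endmembers` name, and the synthetic pixels are illustrative stand-ins: the paper itself searches with a multi-objective genetic algorithm (MOGA).

```python
import numpy as np

def tetra_volume(p0, p1, p2, p3):
    """Volume of the tetrahedron spanned by four 3-D points."""
    return abs(np.linalg.det(np.stack([p1 - p0, p2 - p0, p3 - p0]))) / 6.0

def greedy_endmembers(pixels):
    """Greedy stand-in for the paper's MOGA: pick 4 pixels whose
    tetrahedron volume is (locally) maximal in the transformed space."""
    # seed with the 4 points farthest from the centroid
    c = pixels.mean(axis=0)
    verts = list(np.argsort(-np.linalg.norm(pixels - c, axis=1))[:4])
    improved = True
    while improved:
        improved = False
        for vi in range(4):
            best = tetra_volume(*pixels[verts])
            for j in range(len(pixels)):
                trial = verts.copy()
                trial[vi] = j
                if tetra_volume(*pixels[trial]) > best + 1e-12:
                    verts, improved = trial, True
                    best = tetra_volume(*pixels[verts])
    return verts
```

On data that truly lies inside a simplex, the four extreme pixels are recovered; real MNF clouds are noisier, which is one reason the paper resorts to a population-based optimizer.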

  2. Mapping and valuing ecosystem services as an approach for conservation and natural-resource management.

    PubMed

    Tallis, Heather; Polasky, Stephen

    2009-04-01

    Current approaches to conservation and natural-resource management often focus on single objectives, resulting in many unintended consequences. These outcomes often affect society through unaccounted-for ecosystem services. A major challenge in moving to a more ecosystem-based approach to management that would avoid such societal damages is the creation of practical tools that bring a scientifically sound, production function-based approach to natural-resource decision making. A new set of computer-based models is presented, the Integrated Valuation of Ecosystem Services and Tradeoffs tool (InVEST), which has been designed to inform such decisions. Several of the key features of these models are discussed, including the ability to visualize relationships among multiple ecosystem services and biodiversity, the ability to focus on ecosystem services rather than biophysical processes, the ability to project service levels and values in space, sensitivity to manager-designed scenarios, and flexibility to deal with data and knowledge limitations. Sample outputs of InVEST are shown for two case applications: the Willamette Basin in Oregon and the Amazon Basin. Future challenges relating to the incorporation of social data, the projection of social distributional effects, and the design of effective policy mechanisms are discussed.

  3. Mapping the Similarities of Spectra: Global and Locally-biased Approaches to SDSS Galaxies

    NASA Astrophysics Data System (ADS)

    Lawlor, David; Budavári, Tamás; Mahoney, Michael W.

    2016-12-01

    We present a novel approach to studying the diversity of galaxies, based on a spectral graph technique: locally-biased semi-supervised eigenvectors. Our method introduces new coordinates that summarize an entire spectrum, similar to but going well beyond the widely used Principal Component Analysis (PCA). Unlike PCA, however, this technique does not assume that the Euclidean distance between galaxy spectra is a good global measure of similarity. Instead, we relax that condition to only the most similar spectra, and we show that doing so yields more reliable results for many astronomical questions of interest. The global variant of our approach can finely distinguish numerous astronomical phenomena of interest. The locally-biased variants of our basic approach enable us to explore subtle trends around a set of chosen objects. The power of the method is demonstrated in the Sloan Digital Sky Survey Main Galaxy Sample, by illustrating that the derived spectral coordinates carry an unprecedented amount of information.
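
    The "relax the condition to only the most similar spectra" idea can be sketched with a standard spectral-embedding recipe: build a k-nearest-neighbour similarity graph so that only close spectra contribute, then take a nontrivial eigenvector of the normalized graph Laplacian as a coordinate. This is only the generic global variant under assumed toy data; the paper's locally-biased variant additionally constrains the eigenvector toward a chosen seed set, which is not implemented here.

```python
import numpy as np

def spectral_coordinate(X, k=3, sigma=1.0):
    """Second-smallest eigenvector of the normalized Laplacian of a
    kNN Gaussian-similarity graph (a Fiedler-like coordinate)."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # keep only each point's k strongest similarities, symmetrised
    keep = np.zeros_like(W, dtype=bool)
    for i in range(n):
        keep[i, np.argsort(-W[i])[:k]] = True
    W = np.where(keep | keep.T, W, 0.0)
    d = W.sum(1)
    L = np.eye(n) - W / np.sqrt(np.outer(d, d))  # normalized Laplacian
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1]
```

For two loose clusters of "spectra", the coordinate takes opposite signs on the two groups, which is exactly the kind of structure Euclidean PCA can blur.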

  4. NLO Vector Boson Production With Light Jets

    SciTech Connect

    Bern, Z.; Diana, G.; Dixon, L.J.; Febres Cordero, F.; Forde, D.; Gleisberg, T.; Hoeche, S.; Ita, H.; Kosower, D.A.; Maitre, D.; Ozeren, K.

    2012-02-15

    In this contribution we present recent progress in the computation of next-to-leading order (NLO) QCD corrections for the production of an electroweak vector boson in association with jets at hadron colliders. We focus on results obtained using the virtual matrix element library BlackHat in conjunction with SHERPA, in particular those relevant to understanding the background to top production. The production of a vector boson in association with several jets at the Large Hadron Collider (LHC) is an important background for other Standard Model processes as well as new physics signals. In particular, the production of a W boson in association with many jets is an important background for processes involving one or more top quarks. Precise predictions for the backgrounds are crucial to measurement of top-quark processes. Vector boson production in association with multiple jets is also a very important background for many SUSY searches, as it mimics the signatures of many typical decay chains. Here we will discuss how polarization information can be used as an additional handle to differentiate top pair production from 'prompt' W-boson production. More generally, ratios of observables, for example for events containing a W boson versus those containing a Z boson, are expected to be better behaved, as many uncertainties cancel in such ratios. Precise calculation of ratios, along with measurement of one of the two processes in the ratio, can be used in data-driven techniques for estimating backgrounds.
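
    The claim that "many uncertainties cancel in such ratios" is ordinary correlated error propagation. A minimal numeric sketch, with purely illustrative numbers (not measured W/Z cross sections):

```python
import math

def ratio_uncertainty(a, da, b, db, rho):
    """Relative uncertainty of R = a/b for correlated inputs:
    (dR/R)^2 = (da/a)^2 + (db/b)^2 - 2*rho*(da/a)*(db/b)."""
    ra, rb = da / a, db / b
    return math.sqrt(ra ** 2 + rb ** 2 - 2 * rho * ra * rb)

# hypothetical cross sections, each with a 10% uncertainty
sw, sz = 10.0, 1.0
rel_uncorr = ratio_uncertainty(sw, 1.0, sz, 0.1, rho=0.0)
rel_corr = ratio_uncertainty(sw, 1.0, sz, 0.1, rho=0.9)
```

When the two uncertainties are dominated by a common source (rho near 1, e.g. scale or luminosity), the ratio's relative uncertainty shrinks well below either input's.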

  5. Has the Higgs boson been discovered?

    PubMed

    Renton, Peter

    2004-03-11

    The standard model of particle physics describes the strong and electroweak interactions of fermions (spin-1/2), gauge bosons (spin-1) and a final vital ingredient--the spin-0 Higgs boson, which gives masses to the other particles. But the Higgs boson has yet to be discovered, and its own mass is not specified by the theory. There is some evidence (although statistically not very significant) for its detection at a mass of about 115 GeV/c2, from electron-positron interactions at LEP (the Large Electron Positron collider). Indirect methods can also be used to constrain the mass of the Higgs boson, because it affects other observable quantities (for example, the mass of the W boson and some measurable properties of the Z boson). An indirect determination of the Higgs boson mass from the most recent measurements of such quantities yields a value compatible with 115 GeV/c2, but with some important caveats arising from inconsistencies in the present data.

  6. Resonant x-ray emission spectroscopy of liquid water: novel instrumentation, high resolution, and the "map" approach

    SciTech Connect

    Weinhardt, L.; Fuchs, O.; Blum, M.; Bär, M.; Weigand, M.; Denlinger, J.D.; Zubavichus, Y.; Zharnikov, M.; Grunze, M.; Heske, C.; Umbach, E.

    2008-06-17

    Techniques to study the electronic structure of liquids are rare. Most recently, resonant x-ray emission spectroscopy (XES) has been shown to be an extremely versatile spectroscopy to study both occupied and unoccupied electronic states for liquids in thermodynamic equilibrium. However, XES requires high-brilliance soft x-ray synchrotron radiation and poses significant technical challenges to maintain a liquid sample in an ultra-high vacuum environment. Our group has therefore developed and constructed a novel experimental setup for the study of liquids, with the long-term goal of investigating the electronic structure of biological systems in aqueous environments. We have developed a flow-through liquid cell in which the liquid is separated from vacuum by a thin Si3N4 or SiC window and which allows precise control of temperature. This approach has significant advantages compared to the static liquid cells used in the past. Furthermore, we have designed a dedicated high-transmission, high-resolution soft x-ray spectrometer. The high transmission makes it possible to measure complete resonant XES "maps" in less than an hour, giving unprecedented detailed insight into the electronic structure of the investigated sample. Using this new equipment we have investigated the electronic structure of liquid water. Furthermore, our XES spectra and maps give information about ultra-fast dissociation on the timescale of the O 1s core hole lifetime, which is strongly affected by the initial state hydrogen bonding configuration.

  7. Mapping the HLA ligandome landscape of acute myeloid leukemia: a targeted approach toward peptide-based immunotherapy.

    PubMed

    Berlin, C; Kowalewski, D J; Schuster, H; Mirza, N; Walz, S; Handel, M; Schmid-Horch, B; Salih, H R; Kanz, L; Rammensee, H-G; Stevanović, S; Stickel, J S

    2015-03-01

    Identification of physiologically relevant peptide vaccine targets calls for the direct analysis of the entirety of naturally presented human leukocyte antigen (HLA) ligands, termed the HLA ligandome. In this study, we implemented this direct approach using immunoprecipitation and mass spectrometry to define acute myeloid leukemia (AML)-associated peptide vaccine targets. Mapping the HLA class I ligandomes of 15 AML patients and 35 healthy controls, more than 25 000 different naturally presented HLA ligands were identified. Target prioritization based on AML exclusivity and high presentation frequency in the AML cohort identified a panel of 132 LiTAAs (ligandome-derived tumor-associated antigens) and 341 corresponding HLA ligands (LiTAPs, ligandome-derived tumor-associated peptides), represented subset-independently in >20% of AML patients. Functional characterization of LiTAPs by interferon-γ ELISPOT (Enzyme-Linked ImmunoSpot) and intracellular cytokine staining confirmed AML-specific CD8(+) T-cell recognition. Of note, our platform identified HLA ligands representing several established AML-associated antigens (e.g. NPM1, MAGED1, PRTN3, MPO, WT1), but found 80% of them to be also represented in healthy control samples. Mapping of HLA class II ligandomes provided additional CD4(+) T-cell epitopes and potentially synergistic embedded HLA ligands, allowing for complementation of a multipeptide vaccine for the immunotherapy of AML.
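
    The prioritization rule stated here (AML exclusivity plus presentation in more than 20% of AML patients) reduces to a simple filter. A minimal sketch; the peptide names and counts are hypothetical, and the real pipeline works on mass-spectrometry identifications per patient:

```python
def prioritize(presentation, n_aml, min_freq=0.20):
    """presentation maps peptide -> (aml_hits, healthy_hits).
    Keep peptides absent from all healthy samples (exclusivity) and
    presented in more than min_freq of AML patients (frequency)."""
    return sorted(p for p, (aml, healthy) in presentation.items()
                  if healthy == 0 and aml / n_aml > min_freq)
```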

  8. Real-time temperature estimation and monitoring of HIFU ablation through a combined modeling and passive acoustic mapping approach.

    PubMed

    Jensen, C R; Cleveland, R O; Coussios, C C

    2013-09-07

    Passive acoustic mapping (PAM) has been recently demonstrated as a method of monitoring focused ultrasound therapy by reconstructing the emissions created by inertially cavitating bubbles (Jensen et al 2012 Radiology 262 252-61). The published method sums energy emitted by cavitation from the focal region within the tissue and uses a threshold to determine when sufficient energy has been delivered for ablation. The present work builds on this approach to provide high-intensity focused ultrasound (HIFU) treatment-monitoring software that displays both real-time temperature maps and a prediction of the ablated tissue region. This is achieved by determining heat deposition from two sources: (i) acoustic absorption of the primary HIFU beam, which is calculated via a nonlinear model, and (ii) absorption of energy from bubble acoustic emissions, which is estimated from measurements. The two sources of heat are used as inputs to the bioheat equation that gives an estimate of the temperature of the tissue as well as estimates of tissue ablation. The method has been applied to ex vivo ox liver samples, and the estimated temperature is compared to the measured temperature and shows good agreement, capturing the effect of cavitation-enhanced heating on temperature evolution. In conclusion, it is demonstrated that by using PAM and predictions of heating it is possible to produce an evolving estimate of cell death during exposure, in order to guide and monitor ablative HIFU therapy.
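
    The two-source bioheat idea can be sketched as one explicit finite-difference step of the Pennes bioheat equation in 1-D, with one heat term for modeled beam absorption and one for the PAM-derived cavitation heating. This is a schematic under assumed values: perfusion is dropped (reasonable for ex vivo tissue), the geometry is 1-D, and the tissue constants are generic, not the paper's calibration.

```python
import numpy as np

def bioheat_step(T, q_hifu, q_cav, dt, dx, k=0.5, rho=1050.0, c=3600.0):
    """One explicit step of rho*c*dT/dt = k*d2T/dx2 + q_hifu + q_cav.
    q_hifu: modeled absorption of the primary beam (W/m^3);
    q_cav: heating estimated from passively mapped bubble emissions."""
    lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx ** 2
    lap[0] = lap[-1] = 0.0  # insulated ends (toy boundary condition)
    return T + dt * (k * lap + q_hifu + q_cav) / (rho * c)
```

Iterating this update with q_cav fed from measured emissions is, in spirit, how a measured cavitation signal becomes an evolving temperature (and thermal-dose) estimate.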

  9. Experimental Approach to Controllably Vary Protein Oxidation While Minimizing Electrode Adsorption for Boron-Doped Diamond Electrochemical Surface Mapping Applications

    SciTech Connect

    McClintock, Carlee; Hettich, Robert {Bob} L

    2013-01-01

    Oxidative protein surface mapping has become a powerful approach for measuring the solvent accessibility of folded protein structures. A variety of techniques exist for generating the key reagent hydroxyl radicals for these measurements; however, many of these approaches require use of radioactive sources or caustic oxidizing chemicals. The purpose of this research was to evaluate and optimize the use of boron-doped diamond (BDD) electrochemistry as a highly accessible tool for producing hydroxyl radicals as a means to induce a controllable level of oxidation on a range of intact proteins. These experiments utilize relatively high flow rates to reduce protein residence time inside the electrochemical flow chamber, along with a unique cell activation approach to improve control over the intact protein oxidation yield. Studies were conducted to evaluate the level of protein adsorption onto the electrode surface. This report demonstrates a robust protocol for the use of BDD electrochemistry and high performance LC-MS/MS as a high-throughput experimental pipeline for probing higher order protein structure, and illustrates how it is complementary to predictive computational modeling efforts.

  10. Reconstructing mitochondrial genomes directly from genomic next-generation sequencing reads—a baiting and iterative mapping approach

    PubMed Central

    Hahn, Christoph; Bachmann, Lutz; Chevreux, Bastien

    2013-01-01

    We present an in silico approach for the reconstruction of complete mitochondrial genomes of non-model organisms directly from next-generation sequencing (NGS) data—mitochondrial baiting and iterative mapping (MITObim). The method is straightforward even if only (i) distantly related mitochondrial genomes or (ii) mitochondrial barcode sequences are available as starting-reference sequences or seeds, respectively. We demonstrate the efficiency of the approach in case studies using real NGS data sets of the two monogenean ectoparasite species Gyrodactylus thymalli and Gyrodactylus derjavinoides, including their respective teleost hosts European grayling (Thymallus thymallus) and rainbow trout (Oncorhynchus mykiss). MITObim appeared superior to existing tools in terms of accuracy, runtime and memory requirements and fully automatically recovered mitochondrial genomes exceeding 99.5% accuracy from total genomic DNA derived NGS data sets in <24 h using a standard desktop computer. The approach overcomes the limitations of traditional strategies for obtaining mitochondrial genomes for species with little or no mitochondrial sequence information at hand and represents a fast and highly efficient in silico alternative to laborious conventional strategies relying on initial long-range PCR. We furthermore demonstrate the applicability of MITObim for metagenomic/pooled data sets using simulated data. MITObim is an easy-to-use tool even for biologists with modest bioinformatics experience. The software is made available as an open source pipeline under the MIT license at https://github.com/chrishah/MITObim. PMID:23661685
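
    The bait-and-extend loop at the heart of this approach can be sketched with k-mer matching: recruit every read sharing a k-mer with the current reference, fold the recruited reads into the reference, and repeat until no new read is found. This is a deliberately simplified model; the real MITObim pipeline re-maps and re-assembles with MIRA each round rather than unioning k-mer sets.

```python
def kmers(s, k):
    """All length-k substrings of s."""
    return {s[i:i + k] for i in range(len(s) - k + 1)}

def iterative_bait(seed, reads, k=5):
    """Toy bait-and-extend: grow the reference k-mer set from a seed
    by repeatedly recruiting reads that share a k-mer with it."""
    ref = kmers(seed, k)
    recruited = set()
    changed = True
    while changed:
        changed = False
        for i, r in enumerate(reads):
            if i not in recruited and kmers(r, k) & ref:
                recruited.add(i)
                ref |= kmers(r, k)
                changed = True
    return sorted(recruited)
```

Note how a read with no direct overlap with the seed can still be recruited in a later round once an intermediate read has bridged the gap: that chaining is what lets a distant relative or a short barcode serve as the starting reference.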

  12. Independent component analysis (ICA) and self-organizing map (SOM) approach to multidetection system for network intruders

    NASA Astrophysics Data System (ADS)

    Abdi, Abdi M.; Szu, Harold H.

    2003-04-01

    With the growing rate of interconnection among computer systems, network security is becoming a real challenge. An Intrusion Detection System (IDS) is designed to protect the availability, confidentiality and integrity of critical network information systems. Today's approach to network intrusion detection involves the use of rule-based expert systems to identify indications of known attacks or anomalies. However, these techniques are less successful in identifying today's attacks. Hackers are perpetually inventing new and previously unanticipated techniques to compromise information infrastructure. This paper proposes a dynamic way of detecting network intruders on time series data. The proposed approach consists of a two-step process. First, an efficient multi-user detection method is obtained, employing the recently introduced complexity-minimization approach as a generalization of standard ICA. Second, an unsupervised learning neural network architecture based on Kohonen's Self-Organizing Map is identified for potential functional clustering. These two steps, working together adaptively, provide a pseudo-real-time novelty detection attribute to supplement the current intrusion detection statistical methodology.
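
    The second step (Kohonen clustering) can be sketched as a minimal 1-D Self-Organizing Map. In the pipeline described here the SOM's input would be ICA-separated traffic features; that first step is omitted below, and the data, unit count and learning schedule are illustrative only.

```python
import numpy as np

def train_som(X, n_units=4, epochs=50, lr=0.5, seed=0):
    """Minimal 1-D Kohonen SOM: pull the best-matching unit (and its
    map neighbours, with a shrinking radius) toward each sample."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(n_units, X.shape[1]))
    for t in range(epochs):
        radius = max(1.0, n_units / 2 * (1 - t / epochs))
        for x in X:
            winner = np.argmin(((W - x) ** 2).sum(1))
            dist = np.abs(np.arange(n_units) - winner)
            h = np.exp(-(dist ** 2) / (2 * radius ** 2))
            W += lr * (1 - t / epochs) * h[:, None] * (x - W)
    return W

def bmu(W, x):
    """Index of the best matching unit for sample x."""
    return int(np.argmin(((W - x) ** 2).sum(1)))
```

After training, samples mapping to a unit that normal traffic never activates are the "novelty" candidates an IDS would flag.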

  13. Diffusion-Based Density-Equalizing Maps: an Interdisciplinary Approach to Visualizing Homicide Rates and Other Georeferenced Statistical Data

    NASA Astrophysics Data System (ADS)

    Mazzitello, Karina I.; Candia, Julián

    2012-12-01

    In every country, public and private agencies allocate extensive funding to collect large-scale statistical data, which in turn are studied and analyzed in order to determine local, regional, national, and international policies regarding all aspects relevant to the welfare of society. One important aspect of that process is the visualization of statistical data with embedded geographical information, which most often relies on archaic methods such as maps colored according to graded scales. In this work, we apply nonstandard visualization techniques based on physical principles. We illustrate the method with recent statistics on homicide rates in Brazil and their correlation to other publicly available data. This physics-based approach provides a novel tool that can be used by interdisciplinary teams investigating statistics and model projections in a variety of fields such as economics and gross domestic product research, public health and epidemiology, sociodemographics, political science, business and marketing, and many others.
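
    The physical principle behind density-equalizing (cartogram) maps is diffusion: let the statistic's density flow until it is uniform, and carry map boundaries along with the flow. The sketch below shows only the diffusion half, on a toy periodic grid; the boundary-displacement step (integrating region outlines along the diffusion velocity field) is omitted.

```python
import numpy as np

def diffuse(rho, steps=200, dt=0.2):
    """Iterate the discrete heat equation on a periodic 2-D grid until
    the density field is nearly uniform; total mass is conserved."""
    rho = rho.astype(float).copy()
    for _ in range(steps):
        lap = (np.roll(rho, 1, 0) + np.roll(rho, -1, 0) +
               np.roll(rho, 1, 1) + np.roll(rho, -1, 1) - 4 * rho)
        rho += dt * lap
    return rho
```

In a full cartogram, regions with above-average homicide rates would inflate as their excess density flows outward, exactly the visual effect described in this record.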

  14. A multivariate approach for mapping fire ignition risk: the example of the National Park of Cilento (southern Italy).

    PubMed

    Guglietta, Daniela; Migliozzi, Antonello; Ricotta, Carlo

    2015-07-01

    Recent advances in fire management led landscape managers to adopt an integrated fire fighting strategy in which fire suppression is supported by prevention actions and by knowledge of local fire history and ecology. In this framework, an accurate evaluation of fire ignition risk and its environmental drivers constitutes a basic step toward the optimization of fire management measures. In this paper, we propose a multivariate method for identifying and spatially portraying fire ignition risk across a complex and heterogeneous landscape such as the National Park of Cilento, Vallo di Diano, and Alburni (southern Italy). The proposed approach consists first in calculating the fire selectivity of several landscape features that are usually related to fire ignition, such as land cover or topography. Next, the fire selectivity values of single landscape features are combined with multivariate segmentation tools. The resulting fire risk map may constitute a valuable tool for optimizing fire prevention strategies and for efficiently allocating fire fighting resources.
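
    "Fire selectivity of a landscape feature" is commonly quantified as the share of burned area falling in a class divided by that class's share of the landscape (values above 1 mean fire "prefers" the class). The sketch below uses that common formulation with made-up numbers; the paper then combines such per-feature values via multivariate segmentation, which is not shown.

```python
def fire_selectivity(burned_area, class_area):
    """Selectivity index per landscape class:
    (burned share of class) / (areal share of class)."""
    tot_b = sum(burned_area.values())
    tot_a = sum(class_area.values())
    return {c: (burned_area.get(c, 0) / tot_b) / (class_area[c] / tot_a)
            for c in class_area}
```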

  15. Localization of causal locus in the genome of the brown macroalga Ectocarpus: NGS-based mapping and positional cloning approaches

    PubMed Central

    Billoud, Bernard; Jouanno, Émilie; Nehr, Zofia; Carton, Baptiste; Rolland, Élodie; Chenivesse, Sabine; Charrier, Bénédicte

    2015-01-01

    Mutagenesis is the only process by which unpredicted biological gene functions can be identified. Although several macroalgal developmental mutants have been generated, their causal mutations were never identified, because the necessary experimental tools were not available at the time. Today, progress in macroalgal genomics and judicious choices of suitable genetic models make mutated-gene identification possible. This article presents a comparative study of two methods aiming at identifying a genetic locus in the brown alga Ectocarpus siliculosus: positional cloning and Next-Generation Sequencing (NGS)-based mapping. Once the necessary preliminary experimental tools were gathered, we tested both analyses on an Ectocarpus morphogenetic mutant. We show how a narrower localization results from the combination of the two methods. Advantages and drawbacks of these two approaches, as well as their potential transfer to other macroalgae, are discussed. PMID:25745426

  16. Mapping Palm Swamp Wetland Ecosystems in the Peruvian Amazon: a Multi-Sensor Remote Sensing Approach

    NASA Astrophysics Data System (ADS)

    Podest, E.; McDonald, K. C.; Schroeder, R.; Pinto, N.; Zimmerman, R.; Horna, V.

    2012-12-01

    Wetland ecosystems are prevalent in the Amazon basin, especially in northern Peru. Of specific interest are palm swamp wetlands because they are characterized by constant surface inundation and moderate seasonal water level variation. This combination of constantly saturated soils and warm temperatures year-round can lead to considerable methane release to the atmosphere. Because of the widespread occurrence and expected sensitivity of these ecosystems to climate change, it is critical to develop methods to quantify their spatial extent and inundation state in order to assess their carbon dynamics. Spatio-temporal information on palm swamps is difficult to gather because of their remoteness and difficult accessibility. Spaceborne microwave remote sensing is an effective tool for characterizing these ecosystems since it is sensitive to surface water and vegetation structure and allows monitoring large inaccessible areas on a temporal basis regardless of atmospheric conditions or solar illumination. We developed a remote sensing methodology using multi-sensor remote sensing data from the Advanced Land Observing Satellite (ALOS) Phased Array L-Band Synthetic Aperture Radar (PALSAR), Shuttle Radar Topography Mission (SRTM) DEM, and Landsat to derive maps at 100 meter resolution of palm swamp extent and inundation based on ground data collections; and combined active and passive microwave data from AMSR-E and QuikSCAT to derive inundation extent at 25 kilometer resolution on a weekly basis. We then compared information content and accuracy of the coarse resolution products relative to the high-resolution datasets. The synergistic combination of high and low resolution datasets allowed for characterization of palm swamps and assessment of their flooding status. This work has been undertaken partly within the framework of the JAXA ALOS Kyoto & Carbon Initiative. PALSAR data have been provided by JAXA. Portions of this work were carried out at the Jet Propulsion Laboratory.

  17. High-Resolution Association Mapping of Quantitative Trait Loci: A Population-Based Approach

    PubMed Central

    Fan, Ruzong; Jung, Jeesun; Jin, Lei

    2006-01-01

    In this article, population-based regression models are proposed for high-resolution linkage disequilibrium mapping of quantitative trait loci (QTL). Two regression models, the “genotype effect model” and the “additive effect model,” are proposed to model the association between the markers and the trait locus. The marker can be either diallelic or multiallelic. If only one marker is used, the method is similar to the classical setting of Nielsen and Weir, and the additive effect model is equivalent to the haplotype trend regression (HTR) method of Zaykin et al. If two/multiple-marker data with phase ambiguity are used in the analysis, the proposed models can be used to analyze the data directly. By analytical formulas, we show that the genotype effect model can be used to model the additive and dominance effects simultaneously, whereas the additive effect model accounts for the additive effect only. On the basis of the two models, F-test statistics are proposed to test association between the QTL and markers. By a simulation study, we show that the two models have reasonable type I error rates for a data set of moderate sample size. The noncentrality parameter approximations of the F-test statistics are derived for power calculation and comparison. By a simulation study, it is found that the noncentrality parameter approximations of the F-test statistics work very well. Using the noncentrality parameter approximations, we compare the power of the two models with that of the HTR. In addition, a simulation study is performed to make a comparison on the basis of the haplotype frequencies of 10 SNPs of angiotensin-1 converting enzyme (ACE) genes. PMID:16172503
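
    The F-tests described here are standard nested-model regression F statistics: fit the marker model and the null (intercept-only) model, and compare residual sums of squares. A minimal sketch with a synthetic additive-effect data set (the design matrices below illustrate the additive effect model; the genotype effect model would add dominance columns):

```python
import numpy as np

def f_test(y, X_full, X_null):
    """Nested-model F statistic:
    F = ((RSS0 - RSS1)/df1) / (RSS1/df2)."""
    def rss(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return float(r @ r), X.shape[1]
    rss1, p1 = rss(X_full)
    rss0, p0 = rss(X_null)
    df1, df2 = p1 - p0, len(y) - p1
    return ((rss0 - rss1) / df1) / (rss1 / df2)
```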

  18. Drought Vulnerability Mapping with Geomorphological Approach in Yogyakarta Special Region (DIY) and Central Java

    NASA Astrophysics Data System (ADS)

    Sudaryatno

    2016-11-01

    This study aims to determine the level of vulnerability to geomorphologic drought in Central Java and the Yogyakarta Special Region. The parameters used were slope, drainage, Available Water Capacity (AWC), permeability, landform, and land use. Landsat 8 and SRTM data were used for the extraction of physical parameters such as slope, drainage, landform, and land use. The method used in this study is scoring and weighting: the geomorphologic drought parameters are overlaid, and the query results are used for data classification. The outcome is a map of geomorphologic drought vulnerability for Central Java and the Yogyakarta Special Region, with vulnerability divided into wet, normal and dry classes. The dry class is the most widespread. Some of the dry classes are distributed on steep to extremely steep slopes and on structural and karst landforms. This is related to AWC values, where regions with high AWC contributed to the poor drainage of the soil, such as at Kulonprogo, Purworejo, Kebumen, Blora, Wonogiri, Purbalingga, Pekalongan, Jepara and Kudus regencies. Normal classes are distributed on sloping to steep terrain with moderately to well-drained soils and low AWC, such as at Gunung Kidul and Pati regencies, Temanggung regency, and Magelang city. Wet classes are distributed on flat or almost flat and sloping regions; most of the wet classes are on volcanic hills and coastal areas. Those regions are well drained and the land is used mostly for settlement and farming, such as at Sleman, Yogyakarta city, Klaten, Bantul, and Wonosobo regency.
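
    The scoring-weighting-overlay method reduces to a weighted sum of per-parameter score grids followed by classification into wet/normal/dry. The sketch below cuts the index at terciles for illustration; the actual weights and class thresholds in the study are not reproduced here.

```python
import numpy as np

def weighted_overlay(layers, weights):
    """Score-and-weight overlay: each layer is a grid of class scores;
    the index is their weighted sum, cut into three classes at its
    terciles (0=wet, 1=normal, 2=dry in this toy convention)."""
    idx = sum(w * l for w, l in zip(weights, layers))
    lo, hi = np.quantile(idx, [1 / 3, 2 / 3])
    classes = np.digitize(idx, [lo, hi])
    return idx, classes
```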

  19. A Robust Approach for a Filter-Based Monocular Simultaneous Localization and Mapping (SLAM) System

    PubMed Central

    Munguía, Rodrigo; Castillo-Toledo, Bernardino; Grau, Antoni

    2013-01-01

    Simultaneous localization and mapping (SLAM) is an important problem to solve in robotics theory in order to build truly autonomous mobile robots. This work presents a novel method for implementing a SLAM system based on a single camera sensor. The SLAM with a single camera, or monocular SLAM, is probably one of the most complex SLAM variants. In this case, a single camera, which is freely moving through its environment, represents the sole sensor input to the system. The sensors have a large impact on the algorithm used for SLAM. Cameras are used more frequently, because they provide a lot of information and are well adapted for embedded systems: they are light, cheap and power-saving. Nevertheless, and unlike range sensors, which provide range and angular information, a camera is a projective sensor providing only angular measurements of image features. Therefore, depth information (range) cannot be obtained in a single step. In this case, special techniques for feature system-initialization are needed in order to enable the use of angular sensors (as cameras) in SLAM systems. The main contribution of this work is to present a novel and robust scheme for incorporating and measuring visual features in filtering-based monocular SLAM systems. The proposed method is based on a two-step technique, which is intended to exploit all the information available in angular measurements. Unlike previous schemes, the values of parameters used by the initialization technique are derived directly from the sensor characteristics, thus simplifying the tuning of the system. The experimental results show that the proposed method surpasses the performance of previous schemes. PMID:23823972
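
    Why a projective sensor needs special initialization can be seen from the geometry: one bearing measurement constrains a feature to a ray, so depth only becomes observable after a second viewpoint. A minimal 2-D triangulation sketch (not the paper's actual initialization scheme, which works inside the filter):

```python
import numpy as np

def triangulate(c1, b1, c2, b2):
    """Feature position from two bearing-only measurements.
    c1, c2: camera centres; b1, b2: unit bearing vectors.
    Solves c1 + t1*b1 = c2 + t2*b2 in least squares."""
    A = np.stack([b1, -b2], axis=1)
    t, *_ = np.linalg.lstsq(A, c2 - c1, rcond=None)
    p1 = c1 + t[0] * b1
    p2 = c2 + t[1] * b2
    return (p1 + p2) / 2          # midpoint of the two closest points
```

With a single camera pose there is one equation family and two unknowns per feature (direction times unknown depth); the second pose closes the system, which is the essence of delayed or two-step feature initialization.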

  20. SBH and the integration of complementary approaches in the mapping, sequencing, and understanding of complex genomes

    SciTech Connect

    Drmanac, R.; Drmanac, S.; Labat, I.; Vicentic, A.; Gemmell, A.; Stavropoulos, N.; Jarvis, J.

    1992-01-01

    A variant of sequencing by hybridization (SBH) is being developed with the potential to inexpensively determine up to 100 million base pairs per year. The method comprises (1) arraying short clones in 864-well plates; (2) growth of the M13 clones or PCR of the inserts; (3) automated spotting of DNAs by corresponding pin-arrays; (4) hybridization of dotted samples with 200-3000 ³²P- or ³³P-labeled 6- to 8-mer probes; and (5) scoring hybridization signals using storage phosphor plates. Some 200 7- to 8-mers can provide an inventory of the genes if cDNA clones are hybridized, or can define the order of 2-kb genomic clones, creating physical and structural maps with 100-bp resolution; the distribution of G+C, LINEs, SINEs, and gene families would be revealed. cDNAs that represent new genes and genomic clones in regions of interest selected by SBH can be sequenced by a gel method. Uniformly distributed clones from the previous step will be hybridized with 2000-3000 6- to 8-mers. As a result, approximately 50-60% of the genomic regions containing members of large repetitive and gene families and those families represented in GenBank would be completely sequenced. In the less redundant regions, every base pair is expected to be read with 3-4 probes, but the complete sequence cannot be reconstructed. Such partial sequences allow the inference of similarity and the recognition of coding, regulatory, and repetitive sequences, as well as study of the evolutionary processes all the way up to the species delineation.
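
    The logical core of SBH scoring is a consistency check: a candidate sequence must contain every probe that hybridized and none that failed to. A minimal sketch (with a shorter probe length for readability); with sparse probe sets many candidates can pass, which is why the abstract notes the complete sequence cannot always be reconstructed:

```python
def consistent(candidate, positive, negative, k=7):
    """SBH-style check: candidate must contain all 'positive' k-mer
    probes (those that hybridized) and no 'negative' ones."""
    present = {candidate[i:i + k] for i in range(len(candidate) - k + 1)}
    return positive <= present and not (negative & present)
```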