Sample records for tangent map analysis

  1. Symplectic Propagation of the Map, Tangent Map and Tangent Map Derivative through Quadrupole and Combined-Function Dipole Magnets without Truncation

    NASA Astrophysics Data System (ADS)

    Bruhwiler, D. L.; Cary, J. R.; Shasharina, S.

    1998-04-01

    The MAPA accelerator modeling code symplectically advances the full nonlinear map, tangent map and tangent map derivative through all accelerator elements. The tangent map and its derivative are nonlinear generalizations of Brown's first- and second-order matrices (K. Brown, SLAC-75, Rev. 4 (1982), pp. 107-118), and they are valid even near the edges of the dynamic aperture, which may be beyond the radius of convergence of a truncated Taylor series. In order to avoid truncation of the map and its derivatives, the Hamiltonian is split into pieces for which the map can be obtained analytically. Yoshida's method (H. Yoshida, Phys. Lett. A 150 (1990), pp. 262-268) is then used to obtain a symplectic approximation to the map, while the tangent map and its derivative are appropriately composed at each step to obtain them with equal accuracy. We discuss our splitting of the quadrupole and combined-function dipole Hamiltonians and show that typically only a few steps are required for a high-energy accelerator.
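
    The record above describes splitting a Hamiltonian and composing a symplectic map together with its tangent map step by step. As a minimal, hedged sketch of that idea (not MAPA's implementation; the Hamiltonian, composition coefficients and all numbers are illustrative), the following propagates a phase-space point and its 2x2 tangent map through a focusing quadrupole H = p^2/2 + k*x^2/2 using a fourth-order Yoshida composition of drift-kick-drift steps:

    ```python
    import numpy as np

    # Standard fourth-order Yoshida (1990) composition coefficients.
    W1 = 1.0 / (2.0 - 2.0 ** (1.0 / 3.0))
    W0 = 1.0 - 2.0 * W1

    def drift(x, p, M, h):
        # x' = x + h*p; Jacobian of the drift is [[1, h], [0, 1]].
        return x + h * p, p, np.array([[1.0, h], [0.0, 1.0]]) @ M

    def kick(x, p, M, k, h):
        # p' = p - h*k*x; Jacobian of the kick is [[1, 0], [-h*k, 1]].
        return x, p - h * k * x, np.array([[1.0, 0.0], [-h * k, 1.0]]) @ M

    def leapfrog(x, p, M, k, h):
        x, p, M = drift(x, p, M, 0.5 * h)
        x, p, M = kick(x, p, M, k, h)
        x, p, M = drift(x, p, M, 0.5 * h)
        return x, p, M

    def yoshida4(x, p, M, k, h):
        # Fourth-order symplectic step as a composition of three leapfrogs.
        for w in (W1, W0, W1):
            x, p, M = leapfrog(x, p, M, k, w * h)
        return x, p, M

    # Advance the map and tangent map through one quadrupole of length L.
    x, p, M = 1e-3, 0.0, np.eye(2)
    k, L, nsteps = 2.0, 0.5, 4
    for _ in range(nsteps):
        x, p, M = yoshida4(x, p, M, k, L / nsteps)
    print(x, p, np.linalg.det(M))   # det(M) stays 1: the tangent map is symplectic
    ```

    For this linear Hamiltonian the tangent map coincides with the map itself; the point of composing the per-step Jacobians is that the same bookkeeping carries over unchanged to the nonlinear splittings discussed in the record.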

  2. Multiband tangent space mapping and feature selection for classification of EEG during motor imagery.

    PubMed

    Islam, Md Rabiul; Tanaka, Toshihisa; Molla, Md Khademul Islam

    2018-05-08

    When designing a multiclass motor imagery-based brain-computer interface (MI-BCI), the so-called tangent space mapping (TSM) method, which utilizes the geometric structure of covariance matrices, is an effective technique. This paper aims to introduce a method using TSM for finding accurate operational frequency bands related to brain activities associated with MI tasks. A multichannel electroencephalogram (EEG) signal is decomposed into multiple subbands, and tangent features are then estimated on each subband. A mutual information analysis-based algorithm is implemented to select subbands containing features capable of improving motor imagery classification accuracy. The features thus obtained from the selected subbands are combined to form the feature space. A principal component analysis-based approach is employed to reduce the feature dimension, and then the classification is accomplished by a support vector machine (SVM). Offline analysis demonstrates that the proposed multiband tangent space mapping with subband selection (MTSMS) approach outperforms state-of-the-art methods. It achieves the highest average classification accuracy for all datasets (BCI competition dataset 2a, IIIa, IIIb, and dataset JK-HH1). The increased classification accuracy of MI tasks with the proposed MTSMS approach can yield effective implementation of BCI. The mutual information-based subband selection method is implemented to tune operational frequency bands to represent actual motor imagery tasks.
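
    The tangent space mapping step described above is commonly implemented by whitening each trial covariance with a reference matrix and taking a matrix logarithm. The sketch below is a minimal illustration of that step only (assumptions: SciPy is available, the reference point is taken as the arithmetic mean rather than the Riemannian mean usually used, and the customary sqrt(2) weighting of off-diagonal entries is omitted); it is not the paper's MTSMS pipeline.

    ```python
    import numpy as np
    from scipy.linalg import inv, logm, sqrtm

    def tangent_space_features(covs, C_ref):
        """Map SPD covariance matrices to tangent-space vectors at C_ref.

        S_i -> vec(logm(C_ref^{-1/2} S_i C_ref^{-1/2})), keeping the upper
        triangle of the symmetric tangent matrix as the feature vector.
        """
        C_inv_sqrt = np.real(inv(sqrtm(C_ref)))
        iu = np.triu_indices(C_ref.shape[0])
        feats = []
        for S in covs:
            T = logm(C_inv_sqrt @ S @ C_inv_sqrt)   # symmetric tangent matrix
            feats.append(np.real(T[iu]))            # vectorize the upper triangle
        return np.array(feats)

    # Toy usage: random matrices standing in for per-trial EEG covariances.
    rng = np.random.default_rng(0)
    covs = [np.cov(rng.standard_normal((8, 200))) for _ in range(10)]
    C_ref = np.mean(covs, axis=0)                   # simplified reference point
    X = tangent_space_features(covs, C_ref)
    print(X.shape)                                  # (10, 36) for 8 channels
    ```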

  3. Tangent map intermittency as an approximate analysis of intermittency in a high dimensional fully stochastic dynamical system: The Tangled Nature model.

    PubMed

    Diaz-Ruelas, Alvaro; Jeldtoft Jensen, Henrik; Piovani, Duccio; Robledo, Alberto

    2016-12-01

    It is well known that low-dimensional nonlinear deterministic maps close to a tangent bifurcation exhibit intermittency, and this circumstance has been exploited, e.g., by Procaccia and Schuster [Phys. Rev. A 28, 1210 (1983)], to develop a general theory of 1/f spectra. This suggests it is interesting to study the extent to which the behavior of a high-dimensional stochastic system can be described by such tangent maps. The Tangled Nature (TaNa) model of evolutionary ecology is an ideal candidate for such a study, a significant model as it is capable of reproducing a broad range of the phenomenology of macroevolution and ecosystems. The TaNa model exhibits strong intermittency reminiscent of punctuated equilibrium and, like the fossil record of mass extinctions, the intermittency in the model is found to be non-stationary, a feature typical of many complex systems. We derive a mean-field version for the evolution of the likelihood function controlling the reproduction of species and find a local map close to tangency. This mean-field map, by our own local approximation, is able to describe qualitatively only one episode of the intermittent dynamics of the full TaNa model. To complement this result, we construct a complete nonlinear dynamical system model consisting of successive tangent bifurcations that generates time evolution patterns resembling those of the full TaNa model on macroscopic scales. The switch from one tangent bifurcation to the next in the sequences produced in this model is stochastic in nature, based on criteria obtained from the local mean-field approximation, and capable of imitating the changing set of types of species and total population in the TaNa model. The model combines full deterministic dynamics with instantaneous parameter random jumps at stochastically drawn times. In spite of the limitations of our approach, which entails a drastic collapse of degrees of freedom, the description of a high-dimensional model system in terms of a low

  4. Finding Equations of Tangents to Conics

    ERIC Educational Resources Information Center

    Baloglou, George; Helfgott, Michel

    2004-01-01

    A calculus-free approach is offered for determining the equation of lines tangent to conics. Four types of problems are discussed: line tangent to a conic at a given point, line tangent to a conic passing through a given point outside the conic, line of a given slope tangent to a conic, and line tangent to two conics simultaneously; in each case,…

  5. CSP-TSM: Optimizing the performance of Riemannian tangent space mapping using common spatial pattern for MI-BCI.

    PubMed

    Kumar, Shiu; Mamun, Kabir; Sharma, Alok

    2017-12-01

    Classification of electroencephalography (EEG) signals for motor imagery based brain computer interface (MI-BCI) is an exigent task, and common spatial pattern (CSP) has been extensively explored for this purpose. In this work, we focused on developing a new framework for classification of EEG signals for MI-BCI. We propose a single band CSP framework for MI-BCI that utilizes the concept of tangent space mapping (TSM) in the manifold of covariance matrices. The proposed method is named CSP-TSM. Spatial filtering is performed on the bandpass filtered MI EEG signal. The Riemannian tangent space is utilized for extracting features from the spatially filtered signal. The TSM features are then fused with the CSP variance-based features, and feature selection is performed using Lasso. Linear discriminant analysis (LDA) is then applied to the selected features and finally classification is done using a support vector machine (SVM) classifier. The proposed framework gives improved performance for MI EEG signal classification in comparison with several competing methods. Experiments conducted show that the proposed framework reduces the overall classification error rate for MI-BCI by 3.16%, 5.10% and 1.70% (for BCI Competition III dataset IVa, BCI Competition IV Dataset I and BCI Competition IV Dataset IIb, respectively) compared to the conventional CSP method under the same experimental settings. The proposed CSP-TSM method produces promising results when compared with several competing methods in this paper. In addition, the computational complexity is lower than that of the TSM method. Our proposed CSP-TSM framework can potentially be used for developing improved MI-BCI systems. Copyright © 2017 Elsevier Ltd. All rights reserved.
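
    As a companion to the abstract, the sketch below illustrates the classic CSP step it builds on: spatial filters from a generalized eigendecomposition of the two class covariance matrices, followed by log-variance features. This is a generic textbook formulation applied to toy data, not the paper's CSP-TSM pipeline.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    def csp_filters(cov_a, cov_b, n_pairs=3):
        """Common spatial patterns from two class-average covariance matrices.

        Solves cov_a w = lambda (cov_a + cov_b) w; filters belonging to the
        largest and smallest eigenvalues maximize variance for one class
        relative to the other.
        """
        vals, vecs = eigh(cov_a, cov_a + cov_b)      # generalized symmetric problem
        order = np.argsort(vals)
        picks = np.r_[order[:n_pairs], order[-n_pairs:]]
        return vecs[:, picks]                        # (channels, 2*n_pairs)

    def log_variance_features(trial, W):
        """Classic CSP features: log of the normalized variance per filter."""
        Z = W.T @ trial                              # (filters, samples)
        var = Z.var(axis=1)
        return np.log(var / var.sum())

    # Toy usage with random data standing in for band-pass filtered EEG trials.
    rng = np.random.default_rng(1)
    trials_a = rng.standard_normal((20, 8, 250))
    trials_b = rng.standard_normal((20, 8, 250))
    cov_a = np.mean([np.cov(t) for t in trials_a], axis=0)
    cov_b = np.mean([np.cov(t) for t in trials_b], axis=0)
    W = csp_filters(cov_a, cov_b)
    print(log_variance_features(trials_a[0], W).shape)   # (6,)
    ```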

  6. Tangent Lines without Calculus

    ERIC Educational Resources Information Center

    Rabin, Jeffrey M.

    2008-01-01

    This article presents a problem that can help high school students develop the concept of instantaneous velocity and connect it with the slope of a tangent line to the graph of position versus time. It also gives a method for determining the tangent line to the graph of a polynomial function at any point without using calculus. (Contains 1 figure.)

  7. Semisupervised Support Vector Machines With Tangent Space Intrinsic Manifold Regularization.

    PubMed

    Sun, Shiliang; Xie, Xijiong

    2016-09-01

    Semisupervised learning has been an active research topic in machine learning and data mining. One main reason is that labeling examples is expensive and time-consuming, while there are large numbers of unlabeled examples available in many practical problems. So far, Laplacian regularization has been widely used in semisupervised learning. In this paper, we propose a new regularization method called tangent space intrinsic manifold regularization. It is intrinsic to the data manifold and favors linear functions on the manifold. Fundamental elements involved in the formulation of the regularization are local tangent space representations, which are estimated by local principal component analysis, and the connections that relate adjacent tangent spaces. Simultaneously, we explore its application to semisupervised classification and propose two new learning algorithms called tangent space intrinsic manifold regularized support vector machines (TiSVMs) and tangent space intrinsic manifold regularized twin SVMs (TiTSVMs). They effectively integrate the tangent space intrinsic manifold regularization. The optimization of TiSVMs can be solved by a standard quadratic program, while the optimization of TiTSVMs can be solved by a pair of standard quadratic programs. Experimental results on semisupervised classification problems show the effectiveness of the proposed semisupervised learning algorithms.

  8. Transformation formulas relating geodetic coordinates to a tangent to Earth, plane coordinate system

    NASA Technical Reports Server (NTRS)

    Credeur, L.

    1981-01-01

    Formulas and their approximations were developed to map geodetic position to an Earth-tangent plane with an airport-centered rectangular coordinate system. The transformations were developed for use in a terminal area air traffic model with deterministic aircraft traffic. The exact and approximate equations were used in terminal configured vehicle precision microwave landing system navigation experiments.
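
    The report's exact formulas are not reproduced in the abstract. As a hedged sketch of the same class of transformation, the code below maps geodetic coordinates to Earth-centered Earth-fixed (ECEF) coordinates and then onto a local east-north-up tangent plane centered on an airport reference point; the WGS-84 constants and all sample coordinates are assumptions for illustration only (the 1981 report would have used its own ellipsoid and approximations).

    ```python
    import numpy as np

    A = 6378137.0                 # WGS-84 semi-major axis, m (illustrative choice)
    E2 = 6.69437999014e-3         # WGS-84 first eccentricity squared

    def geodetic_to_ecef(lat, lon, h):
        """Geodetic latitude/longitude (radians) and height (m) to ECEF (m)."""
        N = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)   # prime vertical radius
        x = (N + h) * np.cos(lat) * np.cos(lon)
        y = (N + h) * np.cos(lat) * np.sin(lon)
        z = (N * (1.0 - E2) + h) * np.sin(lat)
        return np.array([x, y, z])

    def ecef_to_enu(p, lat0, lon0, h0):
        """Project an ECEF point into the east-north-up frame of the tangent
        plane centered at geodetic (lat0, lon0, h0), e.g. an airport datum."""
        d = p - geodetic_to_ecef(lat0, lon0, h0)
        sl, cl = np.sin(lat0), np.cos(lat0)
        so, co = np.sin(lon0), np.cos(lon0)
        R = np.array([[-so,      co,       0.0],
                      [-sl * co, -sl * so, cl],
                      [cl * co,  cl * so,  sl]])
        return R @ d              # east, north, up in metres

    # Aircraft position mapped onto an airport-centered tangent plane (toy numbers).
    lat0, lon0, h0 = np.radians(37.62), np.radians(-122.38), 4.0
    aircraft = geodetic_to_ecef(np.radians(37.70), np.radians(-122.30), 900.0)
    print(ecef_to_enu(aircraft, lat0, lon0, h0))
    ```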

  9. 21 CFR 886.1810 - Tangent screen (campimeter).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Tangent screen (campimeter). 886.1810 Section 886.1810 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED... a patient's visual field. This generic type of device includes projection tangent screens, target...

  10. 21 CFR 886.1810 - Tangent screen (campimeter).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...) Identification. A tangent screen (campimeter) is an AC-powered or battery-powered device that is a large square... a patient's visual field. This generic type of device includes projection tangent screens, target... (general controls). The AC-powered device and the battery-powered device are exempt from the premarket...

  11. 21 CFR 886.1810 - Tangent screen (campimeter).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...) Identification. A tangent screen (campimeter) is an AC-powered or battery-powered device that is a large square... a patient's visual field. This generic type of device includes projection tangent screens, target... (general controls). The AC-powered device and the battery-powered device are exempt from the premarket...

  12. 21 CFR 886.1810 - Tangent screen (campimeter).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...) Identification. A tangent screen (campimeter) is an AC-powered or battery-powered device that is a large square... a patient's visual field. This generic type of device includes projection tangent screens, target... (general controls). The AC-powered device and the battery-powered device are exempt from the premarket...

  13. 21 CFR 886.1810 - Tangent screen (campimeter).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...) Identification. A tangent screen (campimeter) is an AC-powered or battery-powered device that is a large square... a patient's visual field. This generic type of device includes projection tangent screens, target... (general controls). The AC-powered device and the battery-powered device are exempt from the premarket...

  14. Arctic curves in path models from the tangent method

    NASA Astrophysics Data System (ADS)

    Di Francesco, Philippe; Lapa, Matthew F.

    2018-04-01

    Recently, Colomo and Sportiello introduced a powerful method, known as the tangent method, for computing the arctic curve in statistical models which have a (non- or weakly-) intersecting lattice path formulation. We apply the tangent method to compute arctic curves in various models: the domino tiling of the Aztec diamond for which we recover the celebrated arctic circle; a model of Dyck paths equivalent to the rhombus tiling of a half-hexagon for which we find an arctic half-ellipse; another rhombus tiling model with an arctic parabola; the vertically symmetric alternating sign matrices, where we find the same arctic curve as for unconstrained alternating sign matrices. The latter case involves lattice paths that are non-intersecting but that are allowed to have osculating contact points, for which the tangent method was argued to still apply. For each problem we estimate the large size asymptotics of a certain one-point function using LU decomposition of the corresponding Gessel–Viennot matrices, and a reformulation of the result amenable to asymptotic analysis.

  15. The derivative and tangent operators of a motion in Lorentzian space

    NASA Astrophysics Data System (ADS)

    Durmaz, Olgun; Aktaş, Buşra; Gündoğan, Halit

    In this paper, by using Lorentzian matrix multiplication, the L-Tangent operator is obtained in Lorentzian space. The L-Tangent operators related to planar, spherical and spatial motion are computed via special matrix groups. L-Tangent operators are related to vectors. Some illustrative examples of applications of L-Tangent operators are also presented.

  16. Examining Students' Generalizations of the Tangent Concept: A Theoretical Perspective

    ERIC Educational Resources Information Center

    Çekmez, Erdem; Baki, Adnan

    2016-01-01

    The concept of a tangent is important in understanding many topics in mathematics and science. Earlier studies on students' understanding of the concept of a tangent have reported that they have various misunderstandings and experience difficulties in transferring their knowledge about the tangent line from Euclidean geometry into calculus. In…

  17. Tangent-ogive nose cones

    NASA Technical Reports Server (NTRS)

    Wing, L. D.

    1976-01-01

    Program calculates aerodynamic heating and shear stresses at wall for tangent-ogive noses that are slender enough to maintain an attached nose shock during portion of flight when heat transfer from boundary layer to wall is significant.

  18. Pursuit Eye-Movements in Curve Driving Differentiate between Future Path and Tangent Point Models

    PubMed Central

    Lappi, Otto; Pekkanen, Jami; Itkonen, Teemu H.

    2013-01-01

    For nearly 20 years, looking at the tangent point on the road edge has been prominent in models of visual orientation in curve driving. It is the most common interpretation of the commonly observed pattern of car drivers looking through a bend, or at the apex of the curve. Indeed, in the visual science literature, visual orientation towards the inside of a bend has become known as “tangent point orientation”. Yet, it remains to be empirically established whether it is the tangent point the drivers are looking at, or whether some other reference point on the road surface, or several reference points, are being targeted in addition to, or instead of, the tangent point. Recently discovered optokinetic pursuit eye-movements during curve driving can provide complementary evidence over and above traditional gaze-position measures. This paper presents the first detailed quantitative analysis of pursuit eye movements elicited by curvilinear optic flow in real driving. The data implicates the far zone beyond the tangent point as an important gaze target area during steady-state cornering. This is in line with the future path steering models, but difficult to reconcile with any pure tangent point steering model. We conclude that the tangent point steering models do not provide a general explanation of eye movement and steering during a curve driving sequence and cannot be considered uncritically as the default interpretation when the gaze position distribution is observed to be situated in the region of the curve apex. PMID:23894300

  19. NEMOTAM: tangent and adjoint models for the ocean modelling platform NEMO

    NASA Astrophysics Data System (ADS)

    Vidard, A.; Bouttier, P.-A.; Vigilant, F.

    2015-04-01

    Tangent linear and adjoint models (TAMs) are efficient tools to analyse and to control dynamical systems such as NEMO. They can be involved in a large range of applications such as sensitivity analysis, parameter estimation or the computation of characteristic vectors. A TAM is also required by the 4D-Var algorithm, which is one of the major methods in data assimilation. This paper describes the development and the validation of the tangent linear and adjoint model for the NEMO ocean modelling platform (NEMOTAM). The diagnostic tools that are available alongside NEMOTAM are detailed and discussed, and several applications are also presented.
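
    Tangent linear and adjoint code pairs are commonly checked with two generic diagnostics, sketched below for a toy two-variable model: the adjoint (dot-product) identity <TL(dx), dy> = <dx, AD(dy)>, and a Taylor test showing that the tangent linear model captures the first-order response of the nonlinear model. This is a general illustration with assumed toy dynamics, not NEMOTAM's interface or its actual diagnostics.

    ```python
    import numpy as np

    # Toy nonlinear model with its tangent linear (TL) and adjoint (AD) versions.
    A = np.array([[0.9, 0.2], [-0.1, 0.8]])

    def nonlinear(x):
        return A @ x + 0.05 * x ** 2            # simple quadratic nonlinearity

    def tangent(x, dx):
        return A @ dx + 0.1 * x * dx            # Jacobian at x applied to dx

    def adjoint(x, dy):
        return A.T @ dy + 0.1 * x * dy          # transpose of the TL operator

    x = np.array([1.0, -2.0])
    dx = np.array([0.3, 0.7])
    dy = np.array([-0.5, 1.1])

    # Dot-product test: the two inner products must agree to round-off.
    lhs = tangent(x, dx) @ dy
    rhs = dx @ adjoint(x, dy)
    print(lhs, rhs, abs(lhs - rhs))

    # Taylor test: ||M(x + eps*dx) - M(x) - eps*TL(dx)|| decreases as O(eps^2).
    for eps in (1e-2, 1e-4, 1e-6):
        err = np.linalg.norm(nonlinear(x + eps * dx) - nonlinear(x)
                             - eps * tangent(x, dx))
        print(eps, err)
    ```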

  20. NEMOTAM: tangent and adjoint models for the ocean modelling platform NEMO

    NASA Astrophysics Data System (ADS)

    Vidard, A.; Bouttier, P.-A.; Vigilant, F.

    2014-10-01

    The tangent linear and adjoint models (TAM) are efficient tools to analyse and to control dynamical systems such as NEMO. They can be involved in a large range of applications such as sensitivity analysis, parameter estimation or the computation of characteristic vectors. A TAM is also required by the 4D-Var algorithm, which is one of the major methods in data assimilation. This paper describes the development and the validation of the tangent linear and adjoint model for the NEMO ocean modelling platform (NEMOTAM). The diagnostic tools that are available alongside NEMOTAM are detailed and discussed, and several applications are also presented.

  1. Ionized gas clouds near the Sagittarius Arm tangent

    NASA Astrophysics Data System (ADS)

    Hou, Li-Gang; Dong, Jian; Gao, Xu-Yang; Han, Jin-Lin

    2017-04-01

    Radio recombination lines (RRLs) are the best tracers of ionized gas. Simultaneous observations of multiple RRL transitions can significantly improve survey sensitivity. We conducted pilot RRL observations near the Sagittarius Arm tangent by using the 65-m Shanghai Tian Ma Radio Telescope (TMRT) equipped with broadband feeds and a digital backend. Six hydrogen RRLs (H96α-H101α) at C band (6289 MHz-7319 MHz) were observed simultaneously toward a sky area of 2° × 1.2° by using the on-the-fly mapping mode. These transitions were then stacked together for detection of ionized gas. The star-forming complexes G48.6+0.1 and G49.5-0.3 were detected in the integrated intensity map. We found agreement between our measured centroid velocities and previous results for the 21 known HII regions in the mapped area. For more than 80 cataloged HII region candidates without previous RRL measurements, we obtained new RRL spectra at 30 targeted positions. In addition, we detected 25 new discrete RRL sources with spectral S/N > 5σ that were not listed in the catalogs of previously known HII regions. The distances for 44 out of these 55 new RRL sources were estimated.

  2. An arc tangent function demodulation method of fiber-optic Fabry-Perot high-temperature pressure sensor

    NASA Astrophysics Data System (ADS)

    Ren, Qianyu; Li, Junhong; Hong, Yingping; Jia, Pinggang; Xiong, Jijun

    2017-09-01

    A new demodulation algorithm for the fiber-optic Fabry-Perot cavity length based on the phase generated carrier (PGC) technique is proposed in this paper, which can be applied in a high-temperature pressure sensor. The new algorithm, based on the arc tangent function, outputs two orthogonal signals by utilizing an optical system and is designed based on a field-programmable gate array (FPGA) to overcome the range limit of the original PGC arc tangent function demodulation algorithm. Simulation and analysis are also carried out. According to the analysis of demodulation speed and precision, the simulation of different numbers of sampling points, and measurement results of the pressure sensor, the arc tangent function demodulation method gives good demodulation results: a processing speed of 1 MHz per data point and an error of less than 1%, showing practical feasibility for the fiber-optic Fabry-Perot cavity length demodulation of the Fabry-Perot high-temperature pressure sensor.
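
    The core of an arc tangent (quadrature) demodulation scheme is short: once two orthogonal signals I ∝ cos(phi) and Q ∝ sin(phi) are available, the phase (and hence the cavity length) follows from an arctangent plus unwrapping. The sketch below shows only that step on synthetic signals; it is a generic illustration, not the paper's FPGA implementation, and the signal model is an assumption.

    ```python
    import numpy as np

    def arctan_demodulate(I, Q):
        """Recover the interferometric phase from quadrature signals.

        arctan2 gives the phase wrapped to (-pi, pi]; unwrapping removes the
        range limit that a plain single-quadrant arctan scheme suffers from.
        """
        return np.unwrap(np.arctan2(Q, I))

    # Synthetic test: a slowly varying phase of several radians amplitude.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 5000)
    phi_true = 6.0 * np.sin(2 * np.pi * 3.0 * t)     # well beyond +/- pi/2
    I = np.cos(phi_true) + 0.01 * rng.standard_normal(t.size)
    Q = np.sin(phi_true) + 0.01 * rng.standard_normal(t.size)
    phi_est = arctan_demodulate(I, Q)
    print(np.max(np.abs(phi_est - phi_true)))        # small residual error
    ```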

  3. Designing worked examples for learning tangent lines to circles

    NASA Astrophysics Data System (ADS)

    Retnowati, E.; Marissa

    2018-03-01

    Geometry is a branch of mathematics that deals with shape and space, including the circle. A difficult topic in the circle may be the tangent line to the circle. This is considered complex material since students have to simultaneously apply several principles to solve the problems: the properties of the circle, the definition of the tangent, measurement, and the Pythagorean theorem. This paper discusses designs of worked examples for learning tangent lines to circles and how to apply these designs in an effective and efficient instructional activity. When students do not have sufficient prior knowledge, solving tangent problems might be clumsy, and as a consequence, the problem-solving activity hinders learning. According to Cognitive Load Theory, learning occurs when students can construct new knowledge based on relevant knowledge previously learned. When the relevant knowledge is unavailable, providing students with worked examples is suggested. Worked examples may reduce unproductive processes during learning that cause extraneous cognitive load. Nevertheless, worked examples must be created in such a way that they facilitate learning.

  4. Tangent Lines without Derivatives for Quadratic and Cubic Equations

    ERIC Educational Resources Information Center

    Carroll, William J.

    2009-01-01

    In the quadratic equation y = ax^2 + bx + c, the equation y = bx + c is identified as the equation of the line tangent to the parabola at its y-intercept. This is extended to give a convenient method of graphing tangent lines at any point on the graph of a quadratic or a cubic equation. (Contains 5 figures.)
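
    A one-line double-root argument (standard algebra, not taken from the article itself) shows why the claim holds: subtracting the candidate line from the parabola leaves a quantity that vanishes to second order at x = 0, which is exactly the algebraic tangency condition.

    ```latex
    % Difference between the parabola and the candidate line:
    \[
      (ax^{2} + bx + c) - (bx + c) \;=\; ax^{2} \;=\; a\,(x - 0)^{2},
    \]
    % a double root at x = 0, so y = bx + c meets the parabola only at its
    % y-intercept (0, c) and with matching slope b: it is the tangent line there.
    ```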

  5. Teachers' Conceptions of Tangent Line

    ERIC Educational Resources Information Center

    Paez Murillo, Rosa Elvira; Vivier, Laurent

    2013-01-01

    In order to study the conceptions, and their evolutions, of the tangent line to a curve an updating workshop which took place in Mexico was designed for upper secondary school teachers. This workshop was planned using the methodology of cooperative learning, scientific debate and auto reflection (ACODESA) and the conception-knowing-concept model…

  6. Measurement of corneal tangent modulus using ultrasound indentation.

    PubMed

    Wang, Li-Ke; Huang, Yan-Ping; Tian, Lei; Kee, Chea-Su; Zheng, Yong-Ping

    2016-09-01

    Biomechanical properties are potential information for the diagnosis of corneal pathologies. An ultrasound indentation probe consisting of a load cell and a miniature ultrasound transducer as indenter was developed to detect the force-indentation relationship of the cornea. The key idea was to utilize the ultrasound transducer to compress the cornea and to ultrasonically measure the corneal deformation with the overall displacement of the eyeball compensated. Twelve corneal silicone phantoms were fabricated with different stiffness for the validation of the measurement with reference to an extension test. In addition, fifteen fresh porcine eyes were measured by the developed system in vitro. The tangent moduli of the corneal phantoms calculated using the ultrasound indentation data agreed well with the results from the tensile test of the corresponding phantom strips (R^2 = 0.96). The mean tangent moduli of the porcine corneas measured by the proposed method were 0.089 ± 0.026 MPa at an intraocular pressure (IOP) of 15 mmHg and 0.220 ± 0.053 MPa at an IOP of 30 mmHg, respectively. The coefficient of variation (CV) and intraclass correlation coefficient (ICC) of the tangent modulus were 14.4% and 0.765 at 15 mmHg, and 8.6% and 0.870 at 30 mmHg, respectively. This preliminary study showed that ultrasound indentation can be applied to the measurement of corneal tangent modulus with good repeatability and improved measurement accuracy compared to the conventional surface displacement-based measurement method. Ultrasound indentation can be a potential tool for corneal biomechanical property measurement in vivo. Copyright © 2016 Elsevier B.V. All rights reserved.
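
    The abstract does not reproduce the study's formula for extracting a tangent modulus from the force-indentation data; the sketch below only illustrates the generic notion of a tangent modulus as the local slope of a measured stress-strain curve, with an assumed toy exponential curve of the kind typical of soft tissue.

    ```python
    import numpy as np

    def tangent_modulus(strain, stress, at_strain):
        """Local slope d(stress)/d(strain) of a sampled curve at a chosen strain.

        Estimates the pointwise derivative of the curve and interpolates it at
        the requested strain level (in corneal work the slope is reported at a
        stress level corresponding to a given intraocular pressure).
        """
        slope = np.gradient(stress, strain)          # pointwise derivative
        return np.interp(at_strain, strain, slope)

    # Toy exponential stress-strain curve (illustrative numbers, stress in MPa).
    strain = np.linspace(0.0, 0.1, 200)
    stress = 0.02 * (np.exp(30.0 * strain) - 1.0)
    print(tangent_modulus(strain, stress, 0.02), tangent_modulus(strain, stress, 0.05))
    ```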

  7. Internal mammary lymph node inclusion in standard tangent breast fields: effects of body habitus.

    PubMed

    Proulx, G M; Lee, R J; Stomper, P C

    2001-01-01

    The purpose of this study was to determine the variability of internal mammary node (IMN) coverage with standard breast tangent fields using surface anatomy as determined by computed tomography (CT) planning for patients treated with either breast-conserving treatment or postmastectomy, and to evaluate the influence of body habitus and shape on IMN coverage with standard tangent fields. This prospective study included consecutive women with breast cancer who underwent either local excision or mastectomy and had standard tangent fields intended to cover the breast plus a margin simulated using surface anatomy. CT planning determined the location of the IMN with respect to the tangent fields designed from surface anatomy. The internal mammary vessels were used as surrogates for the IMNs. CT measurements of the presternal fat thickness and anteroposterior (AP) and transverse skeletal diameters were made to determine their relationship to the inclusion of IMNs within the tangent fields. Only seven patients (14%) had their IMNs completely within the tangent fields. Twenty patients (40%) had partial coverage of their IMNs, and 23 (46%) had their IMNs completely outside the fields. IMN inclusion was inversely correlated with presternal fat thickness. Thoracic skeletal shape was not associated with IMN inclusion. Standard tangent fields generally do not cover the IMNs completely but may cover them at least partially in a majority of patients. The presternal fat thickness is inversely correlated with IMN inclusion in the tangent fields.

  8. Reverse-Tangent Injection in a Centrifugal Compressor

    NASA Technical Reports Server (NTRS)

    Skoch, Gary J.

    2007-01-01

    Injection of working fluid into a centrifugal compressor in the reverse tangent direction has been invented as a way of preventing flow instabilities (stall and surge) or restoring stability when stall or surge has already commenced. The invention applies, in particular, to a centrifugal compressor, the diffuser of which contains vanes that divide the flow into channels oriented partly radially and partly tangentially. In reverse-tangent injection, a stream or jet of the working fluid (the fluid that is compressed) is injected into the vaneless annular region between the blades of the impeller and the vanes of the diffuser. As used here, "reverse" signifies that the injected flow opposes (and thereby reduces) the tangential component of the velocity of the impeller discharge. At the same time, the injected jet acts to increase the radial component of the velocity of the impeller discharge.

  9. Diffusion tensor analysis with invariant gradients and rotation tangents.

    PubMed

    Kindlmann, Gordon; Ennis, Daniel B; Whitaker, Ross T; Westin, Carl-Fredrik

    2007-11-01

    Guided by empirically established connections between clinically important tissue properties and diffusion tensor parameters, we introduce a framework for decomposing variations in diffusion tensors into changes in shape and orientation. Tensor shape and orientation both have three degrees-of-freedom, spanned by invariant gradients and rotation tangents, respectively. As an initial demonstration of the framework, we create a tunable measure of tensor difference that can selectively respond to shape and orientation. Second, to analyze the spatial gradient in a tensor volume (a third-order tensor), our framework generates edge strength measures that can discriminate between different neuroanatomical boundaries, as well as creating a novel detector of white matter tracts that are adjacent yet distinctly oriented. Finally, we apply the framework to decompose the fourth-order diffusion covariance tensor into individual and aggregate measures of shape and orientation covariance, including a direct approximation for the variance of tensor invariants such as fractional anisotropy.

  10. Local electron tomography using angular variations of surface tangents: Stomo version 2

    NASA Astrophysics Data System (ADS)

    Petersen, T. C.; Ringer, S. P.

    2012-03-01

    Program summary. Program title: STOMO version 2. Catalogue identifier: AEFS_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFS_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 2854. No. of bytes in distributed program, including test data, etc.: 23 559. Distribution format: tar.gz. Programming language: C/C++. Computer: PC. Operating system: Windows XP. RAM: scales as the product of the experimental image dimensions multiplied by the number of points chosen by the user in polynomial fitting; typical runs require between 50 Mb and 100 Mb of RAM. Supplementary material: sample output files, for the test run provided, are available. Classification: 7.4, 14. Catalogue identifier of previous version: AEFS_v1_0. Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 676. Does the new version supersede the previous version?: Yes. Nature of problem: a local electron tomography algorithm for specimens for which conventional back projection may fail and/or data for which there is a limited angular range (which would otherwise cause significant 'missing-wedge' artefacts). The algorithm does not solve the tomography back projection problem but rather locally reconstructs the 3D morphology of surfaces defined by varied scattering densities. Solution method: local reconstruction is effected using image-analysis edge and ridge detection computations on experimental tilt series to measure smooth angular variations of surface tangent-line intersections, which generate point clouds decorating the embedded and/or external scattering surfaces of a specimen. Reasons for new version: the new version was coded to cater for rectangular images in experimental tilt series, ensure smoother image rotations, provide ridge detection (suitable for sensing phase-contrast Fresnel fringes and other

  11. Dielectric response of high permittivity polymer ceramic composite with low loss tangent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Subodh, G. (1. Physikalisches Institut, Universitat Stuttgart, Pfaffenwaldring 57, Stuttgart 70550); Deepu, V.

    2009-08-10

    The present communication investigates the dielectric response of high density polyethylene and epoxy resin loaded with Sr9Ce2Ti12O36 ceramics. Sr9Ce2Ti12O36 ceramic filled polyethylene and epoxy composites were prepared using hot blending and mechanical mixing, respectively. The 40 vol % ceramic-loaded polyethylene has a relative permittivity of 12.1 and a loss tangent of 0.004 at 8 GHz, whereas the corresponding composite using epoxy as the matrix has a permittivity and loss tangent of 14.1 and 0.022, respectively. Effective medium theory fits the observed permittivity of these composites relatively well.

  12. Toward more accurate loss tangent measurements in reentrant cavities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moyer, R. D.

    1980-05-01

    Karpova has described an absolute method for measurement of dielectric properties of a solid in a coaxial reentrant cavity. His cavity resonance equation yields very accurate results for dielectric constants. However, he presented only approximate expressions for the loss tangent. This report presents more exact expressions for that quantity and summarizes some experimental results.

  13. Robust video copy detection approach based on local tangent space alignment

    NASA Astrophysics Data System (ADS)

    Nie, Xiushan; Qiao, Qianping

    2012-04-01

    We propose a robust content-based video copy detection approach based on local tangent space alignment (LTSA), an efficient dimensionality reduction algorithm. The idea is motivated by the fact that video content is becoming richer and higher dimensional, and the high dimensionality leaves few natural tools for video analysis and understanding. The proposed approach reduces the dimensionality of the video content using LTSA and then generates video fingerprints in the low dimensional space for video copy detection. Furthermore, a dynamic sliding window is applied to fingerprint matching. Experimental results show that the video copy detection approach has good robustness and discrimination.
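
    Local tangent space alignment is available off the shelf as a variant of locally linear embedding in scikit-learn, which is enough to illustrate the dimensionality reduction step described above; the fingerprinting and matching stages of the paper are not shown, and the random vectors below merely stand in for per-frame video descriptors.

    ```python
    import numpy as np
    from sklearn.manifold import LocallyLinearEmbedding

    # High-dimensional per-frame descriptors (random stand-ins for video features).
    rng = np.random.default_rng(0)
    frames = rng.standard_normal((500, 64))

    # LTSA is exposed through the method="ltsa" option of LocallyLinearEmbedding.
    ltsa = LocallyLinearEmbedding(n_neighbors=12, n_components=8, method="ltsa")
    fingerprints = ltsa.fit_transform(frames)        # (500, 8) low-dimensional codes
    print(fingerprints.shape)
    ```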

  14. Doubly stratified MHD tangent hyperbolic nanofluid flow due to permeable stretched cylinder

    NASA Astrophysics Data System (ADS)

    Nagendramma, V.; Leelarathnam, A.; Raju, C. S. K.; Shehzad, S. A.; Hussain, T.

    2018-06-01

    An investigation is presented to analyze the presence of heat sources and sinks in a doubly stratified MHD incompressible tangent hyperbolic fluid due to the stretching of a cylinder embedded in porous space with nanoparticles. To develop the mathematical model of the tangent hyperbolic nanofluid, Brownian movement and thermophoresis are accounted for. The established equations of the continuity, momentum, thermal and solutal boundary layers are reassembled into sets of non-linear expressions. These assembled expressions are solved with the help of a Runge-Kutta scheme in MATLAB. The impacts of sundry parameters are illustrated graphically, and physical quantities of engineering interest such as the skin friction, Nusselt and Sherwood numbers are examined by computing numerical values. It is clear that the power-law index parameter and curvature parameter show a favorable effect on the momentum boundary layer thickness, whereas the Weissenberg number reveals an inimical influence.

  15. Catalog of Observed Tangents to the Spiral Arms in the Milky Way Galaxy

    NASA Astrophysics Data System (ADS)

    Vallée, Jacques P.

    2014-11-01

    From the Sun's location in the Galactic disk, one can use different arm tracers (CO, H I, thermal or ionized or relativistic electrons, masers, cold and hot dust, etc.) to locate a tangent to each spiral arm in the disk of the Milky Way. We present a master catalog of the astronomically observed tangents to the Galaxy's spiral arms, using different arm tracers from the literature. Some arm tracers can have slightly divergent results from several papers, so a mean value is taken—see the Appendix for CO, H II, and masers. The catalog of means currently consists of 63 mean tracer entries, spread over many arms (Carina, Crux-Centaurus, Norma, Perseus origin, near 3 kpc, Scutum, Sagittarius), stemming from 107 original arm tracer entries. Additionally, we updated and revised a previous statistical analysis of the angular offset and linear separation from the mid-arm for each different mean arm tracer. Given enough arm tracers, and summing and averaging over all four spiral arms, one could determine if arm tracers have separate and parallel lanes in the Milky Way. This statistical analysis allows a cross-cut of a Galactic spiral arm to be made, confirming a recent discovery of a linear separation between arm tracers. Here, from the mid-arm's CO to the inner edge's hot dust, the arm halfwidth is about 340 pc; doubling would yield a full arm width of 680 pc. We briefly compare these observations with the predictions of many spiral arm theories, notably the density wave theory.
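
    A small geometric reminder, not taken from the catalog itself: for a feature lying on a circle of galactocentric radius R inside the solar circle R0, the line of sight from the Sun is tangent to that circle at Galactic longitude l with sin(l) = R/R0 (first and fourth quadrants). The sketch below just evaluates this relation; the R0 value is an assumed round number.

    ```python
    import numpy as np

    def tangent_longitude_deg(R_kpc, R0_kpc=8.2):
        """First-quadrant longitude of the tangent to a circle of radius R < R0.

        Pure geometry: the line of sight is tangent where sin(l) = R / R0; the
        fourth-quadrant tangent sits at 360 degrees minus this value.
        """
        return np.degrees(np.arcsin(R_kpc / R0_kpc))

    # Example: a feature at a galactocentric radius of 5 kpc, with R0 = 8.2 kpc,
    # has its tangent direction near l ~ 38 degrees.
    print(tangent_longitude_deg(5.0))
    ```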

  16. CATALOG OF OBSERVED TANGENTS TO THE SPIRAL ARMS IN THE MILKY WAY GALAXY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vallée, Jacques P., E-mail: jacques.vallee@nrc-cnrc.gc.ca

    2014-11-01

    From the Sun's location in the Galactic disk, one can use different arm tracers (CO, H I, thermal or ionized or relativistic electrons, masers, cold and hot dust, etc.) to locate a tangent to each spiral arm in the disk of the Milky Way. We present a master catalog of the astronomically observed tangents to the Galaxy's spiral arms, using different arm tracers from the literature. Some arm tracers can have slightly divergent results from several papers, so a mean value is taken—see the Appendix for CO, H II, and masers. The catalog of means currently consists of 63 mean tracer entries, spread over many arms (Carina, Crux-Centaurus, Norma, Perseus origin, near 3 kpc, Scutum, Sagittarius), stemming from 107 original arm tracer entries. Additionally, we updated and revised a previous statistical analysis of the angular offset and linear separation from the mid-arm for each different mean arm tracer. Given enough arm tracers, and summing and averaging over all four spiral arms, one could determine if arm tracers have separate and parallel lanes in the Milky Way. This statistical analysis allows a cross-cut of a Galactic spiral arm to be made, confirming a recent discovery of a linear separation between arm tracers. Here, from the mid-arm's CO to the inner edge's hot dust, the arm halfwidth is about 340 pc; doubling would yield a full arm width of 680 pc. We briefly compare these observations with the predictions of many spiral arm theories, notably the density wave theory.

  17. A method for calculating aerodynamic heating on sounding rocket tangent ogive noses.

    NASA Technical Reports Server (NTRS)

    Wing, L. D.

    1973-01-01

    A method is presented for calculating the aerodynamic heating and shear stresses at the wall for tangent ogive noses that are slender enough to maintain an attached nose shock through that portion of flight during which heat transfer from the boundary layer to the wall is significant. The lower entropy of the attached nose shock combined with the inclusion of the streamwise pressure gradient yields a reasonable estimate of the actual flow conditions. Both laminar and turbulent boundary layers are examined and an approximation of the effects of (up to) moderate angles-of-attack is included in the analysis. The analytical method has been programmed in FORTRAN IV for an IBM 360/91 computer.

  18. A method for calculating aerodynamic heating on sounding rocket tangent ogive noses

    NASA Technical Reports Server (NTRS)

    Wing, L. D.

    1972-01-01

    A method is presented for calculating the aerodynamic heating and shear stresses at the wall for tangent ogive noses that are slender enough to maintain an attached nose shock through that portion of flight during which heat transfer from the boundary layer to the wall is significant. The lower entropy of the attached nose shock combined with the inclusion of the streamwise pressure gradient yields a reasonable estimate of the actual flow conditions. Both laminar and turbulent boundary layers are examined and an approximation of the effects of (up to) moderate angles-of-attack is included in the analysis. The analytical method has been programmed in FORTRAN 4 for an IBM 360/91 computer.

  19. Without derivatives or limits: from visual and geometrical points of view to algebraic methods for identifying tangent lines

    NASA Astrophysics Data System (ADS)

    Vivier, L.

    2013-07-01

    Usually, the tangent line is considered to be a calculus notion. However, it is also a graphical and an algebraic notion. The graphical frame, where our primary conceptions are conceived, could give rise to algebraic methods to obtain the tangent line to a curve. In this pre-calculus perspective, two methods are described and discussed according to their potential for secondary students and teacher training.
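
    The derivative-free route mentioned above can be phrased as a double-root (discriminant) condition, which lends itself to a short computer-algebra check. The sketch below is a generic illustration of that idea rather than an example from the paper; the curve y = x^2 and the point (1, 1) are arbitrary choices.

    ```python
    import sympy as sp

    x, m = sp.symbols("x m")

    # Curve and a candidate line through the point (1, 1) on it, slope m unknown.
    curve = x**2                      # y = x^2 passes through (1, 1)
    line = m * (x - 1) + 1

    # Tangency without derivatives: the difference must have a double root at
    # x = 1, i.e. its discriminant in x must vanish.
    diff = sp.expand(curve - line)
    slope = sp.solve(sp.discriminant(diff, x), m)
    print(slope)                      # [2] -> tangent line y = 2x - 1
    ```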

  20. Generic Equations for Constructing Smooth Paths Along Circles and Tangent Lines With Application to Airport Ground Paths

    NASA Technical Reports Server (NTRS)

    Barker, L. Keith

    1998-01-01

    The primary purpose of this publication is to develop a mathematical model to describe smooth paths along any combination of circles and tangent lines. Two consecutive circles in a path are either tangent (externally or internally) or they appear on the same (lateral) or opposite (transverse) sides of a connecting tangent line. A path may start or end on either a segment or circle. The approach is to use mathematics common to robotics to design the path as a multilink manipulator. This approach allows a hierarchical view of the problem and keeps the notation manageable. A user simply specifies a few parameters to configure a path. Necessary and sufficient conditions automatically ensure the consistency of the inputs for a smooth path. Two example runway exit paths are given, and an angle to go assists in knowing when to switch from one path element to the next.

  1. Boundary Layer Studies on a Spinning Tangent-Ogive-Cylinder Model

    DTIC Science & Technology

    1975-07-01

    An experimental investigation of the Magnus effect on a seven-caliber tangent-ogive-cylinder model in supersonic flow is reported. ... [Report keywords: three-dimensional boundary layer; compressible flow; body of revolution; Magnus effects; boundary layer.] ... factors have resulted in renewed interest in the study of the Magnus effect. This report describes an experimental study of the effects of spin on

  2. Application of the Double-Tangent Construction of Coexisting Phases to Any Type of Phase Equilibrium for Binary Systems Modeled with the Gamma-Phi Approach

    ERIC Educational Resources Information Center

    Jaubert, Jean-Noël; Privat, Romain

    2014-01-01

    The double-tangent construction of coexisting phases is an elegant approach to visualize all the multiphase binary systems that satisfy the equality of chemical potentials and to select the stable state. In this paper, we show how to perform the double-tangent construction of coexisting phases for binary systems modeled with the gamma-phi…

  3. Determination of loss tangent of human tear film at 9.8 GHz

    NASA Astrophysics Data System (ADS)

    Bansal, Namita; Dhaliwal, A. S.; Mann, K. S.

    2015-08-01

    Basal (non-stimulated) tears, produced by accessory lacrimal glands located in the conjunctiva of the human eye, form the tear film, which in turn keeps the eye moist and lubricated; nourishes the eye; protects the eye from dust, bacterial infection and the shear forces generated during eye movements and blinking; and provides a refractive surface on the corneal epithelium. The film is known to contain water, mucin, lipids, lysozyme, glucose, urea, sodium, etc. In the present communication, the loss tangent of the human tear film has been determined at 9.8 GHz by employing the cavity perturbation technique at a temperature of 37°C. Basal tears from a small population comprising six subjects were collected, and the average value of the loss tangent is reported. Slater's technique was used to reduce the error caused in measuring the volume of the sample. The determined values are useful for studying the biological effects of microwaves on the tear film as well as on other parts of the human eye such as the eye lens and lens epithelial cells. To the best of the authors' knowledge, no such study is available in the literature at any radio or microwave frequency; therefore the present determinations are the first of their kind.
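
    For context, the cavity perturbation technique is usually quoted with small-perturbation relations of the following form (written here from general perturbation theory for a small sample placed at an electric-field maximum; these are assumed standard expressions, not formulas copied from the paper): V_c and V_s are the cavity and sample volumes, f_0 and Q_0 the empty-cavity resonant frequency and quality factor, and f_s and Q_s the values with the sample inserted.

    ```latex
    \[
      \varepsilon_r' - 1 \;\simeq\; \frac{f_0 - f_s}{f_s}\,\frac{V_c}{2 V_s},
      \qquad
      \varepsilon_r'' \;\simeq\; \frac{V_c}{4 V_s}\left(\frac{1}{Q_s} - \frac{1}{Q_0}\right),
      \qquad
      \tan\delta \;=\; \frac{\varepsilon_r''}{\varepsilon_r'} .
    \]
    ```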

  4. Overview of TANGENT (Tandem Aerosol Nucleation and Growth ENvironment Tube) 2017 IOP Study

    NASA Astrophysics Data System (ADS)

    Tiszenkel, L.

    2017-12-01

    New particle formation consists of two steps: nucleation and growth of nucleated particles. However, most laboratory studies have been conducted under conditions where these two processes are convoluted together, thereby hampering detailed understanding of the effect of chemical species and atmospheric conditions on the two processes. The objective of the Tandem Aerosol Nucleation and Growth ENvironment Tube (TANGENT) laboratory study is to investigate aerosol nucleation and growth properties independently by separating these two processes in two different flow tubes. This research is a collaboration between the University of Alabama in Huntsville and the University of Delaware. In this poster we will present the experimental setup of TANGENT and summarize the key results from the first IOP (intense observation period) experiments undertaken during Summer 2017. Nucleation takes place in a temperature- and RH-controlled fast flow reactor (FT-1) where sulfuric acid forms from OH radicals and sulfur dioxide. Sulfuric acid and impurity base compounds are detected with chemical ionization mass spectrometers (CIMS). Particle sizes and number concentrations of newly nucleated particles are measured with a scanning mobility particle sizer (SMPS) and a particle size magnifier (PSM), providing concentrations of particles between 1 and 100 nm. The nucleated particles are transferred directly to the growth tube (FT-2) where oxidants and biogenic organic precursors are added to grow the nucleated nanoparticles. Sizes of particles after growth are analyzed with an additional SMPS, and the elemental chemical composition of particles of 50 nm and above is detected with a nano-aerosol mass spectrometer (NAMS). TANGENT provides the unique ability to conduct experiments that can monitor and control reactant concentrations, aerosol size and aerosol chemical composition during nucleation and growth. Experiments during this first IOP study have elucidated the effects of sulfur dioxide, particle size

  5. Calculation of symmetric and asymmetric vortex separation on cones and tangent ogives based on discrete vortex models

    NASA Technical Reports Server (NTRS)

    Chin, S.; Lan, C. Edward

    1988-01-01

    An inviscid discrete vortex model, with newly derived expressions for the tangential velocity imposed at the separation points, is used to investigate the symmetric and asymmetric vortex separation on cones and tangent ogives. The circumferential locations of separation are taken from experimental data. Based on slender body theory, the resulting simultaneous nonlinear algebraic equations in a cross-flow plane are solved with Broyden's modified Newton-Raphson method. Total force coefficients are obtained through the momentum principle with new expressions for nonconical flow. It is shown through the method of function deflation that multiple solutions exist at large enough angles of attack, even with symmetric separation points. These additional solutions are asymmetric in vortex separation and produce side force coefficients which agree well with data for cones and tangent ogives.

  6. Comparison of the Tangent Linear Properties of Tracer Transport Schemes Applied to Geophysical Problems.

    NASA Technical Reports Server (NTRS)

    Kent, James; Holdaway, Daniel

    2015-01-01

    A number of geophysical applications require the use of the linearized version of the full model. One such example is in numerical weather prediction, where the tangent linear and adjoint versions of the atmospheric model are required for the 4DVAR inverse problem. The part of the model that represents the resolved scale processes of the atmosphere is known as the dynamical core. Advection, or transport, is performed by the dynamical core. It is a central process in many geophysical applications and is a process that often has a quasi-linear underlying behavior. However, over the decades since the advent of numerical modelling, significant effort has gone into developing many flavors of high-order, shape preserving, nonoscillatory, positive definite advection schemes. These schemes are excellent in terms of transporting the quantities of interest in the dynamical core, but they introduce nonlinearity through the use of nonlinear limiters. The linearity of the transport schemes used in Goddard Earth Observing System version 5 (GEOS-5), as well as a number of other schemes, is analyzed using a simple 1D setup. The linearized version of GEOS-5 is then tested using a linear third order scheme in the tangent linear version.

  7. Predictive mapping of soil organic carbon in wet cultivated lands using classification-tree based models: the case study of Denmark.

    PubMed

    Bou Kheir, Rania; Greve, Mogens H; Bøcher, Peder K; Greve, Mette B; Larsen, René; McCloy, Keith

    2010-05-01

    Soil organic carbon (SOC) is one of the most important carbon stocks globally and has large potential to affect global climate. Distribution patterns of SOC in Denmark constitute a nation-wide baseline for studies on soil carbon changes (with respect to the Kyoto protocol). This paper predicts and maps the geographic distribution of SOC across Denmark using remote sensing (RS), geographic information systems (GISs) and decision-tree modeling (un-pruned and pruned classification trees). Seventeen parameters, i.e. parent material, soil type, landscape type, elevation, slope gradient, slope aspect, mean curvature, plan curvature, profile curvature, flow accumulation, specific catchment area, tangent slope, tangent curvature, steady-state wetness index, Normalized Difference Vegetation Index (NDVI), Normalized Difference Wetness Index (NDWI) and Soil Color Index (SCI), were generated to statistically explain SOC field measurements in the area of interest (Denmark). A large number of tree-based classification models (588) were developed using (i) all of the parameters, (ii) all Digital Elevation Model (DEM) parameters only, (iii) the primary DEM parameters only, (iv) the remote sensing (RS) indices only, (v) selected pairs of parameters, (vi) soil type, parent material and landscape type only, and (vii) the parameters having a high impact on SOC distribution in built pruned trees. The three best classification tree models, with the lowest misclassification error (ME) and the lowest number of nodes (N), are: (i) the tree (T1) combining all of the parameters (ME=29.5%; N=54); (ii) the tree (T2) based on parent material, soil type and landscape type (ME=31.5%; N=14); and (iii) the tree (T3) constructed using parent material, soil type, landscape type, elevation, tangent slope and SCI (ME=30%; N=39). The SOC maps produced at 1:50,000 cartographic scale using these trees match closely, with coincidence values equal to 90.5% (Map T1

  8. Quantum particles in general spacetimes: A tangent bundle formalism

    NASA Astrophysics Data System (ADS)

    Wohlfarth, Mattias N. R.

    2018-06-01

    Using tangent bundle geometry we construct an equivalent reformulation of classical field theory on flat spacetimes which simultaneously encodes the perspectives of multiple observers. Its generalization to curved spacetimes realizes a new type of nonminimal coupling of the fields and is shown to admit a canonical quantization procedure. For the resulting quantum theory we demonstrate the emergence of a particle interpretation, fully consistent with general relativistic geometry. The path dependency of parallel transport forces each observer to carry their own quantum state; we find that the communication of the corresponding quantum information may generate extra particles on curved spacetimes. A speculative link between quantum information and spacetime curvature is discussed which might lead to novel explanations for quantum decoherence and vanishing interference in double-slit or interaction-free measurement scenarios, in the mere presence of additional observers.

  9. Study on Remote Monitoring System of Crossing and Spanning Tangent Tower

    NASA Astrophysics Data System (ADS)

    Chen, Da-bing; Zhang, Nai-long; Zhang, Meng-ge; Wang, Ze-hua; Zhang, Yan

    2017-05-01

    In order to grasp the vibration state of an overhead transmission line and ensure the operational security of the transmission line, a remote monitoring system for crossing and spanning tangent towers was studied. With this system, the displacement, velocity and acceleration of the tower and the local weather data are collected automatically and displayed on a computer at the remote monitoring centre through a wireless network, realizing real-time collection and transmission of vibration signals. Application results show that the system is excellent in reliability and accuracy. The system can be used for remote monitoring of transmission towers of UHV power transmission lines and in large spanning areas.

  10. Numerical calculation of thermo-mechanical problems at large strains based on complex step derivative approximation of tangent stiffness matrices

    NASA Astrophysics Data System (ADS)

    Balzani, Daniel; Gandhi, Ashutosh; Tanaka, Masato; Schröder, Jörg

    2015-05-01

    In this paper a robust approximation scheme for the numerical calculation of tangent stiffness matrices is presented in the context of nonlinear thermo-mechanical finite element problems and its performance is analyzed. The scheme extends the approach proposed in Kim et al. (Comput Methods Appl Mech Eng 200:403-413, 2011) and Tanaka et al. (Comput Methods Appl Mech Eng 269:454-470, 2014) and is based on applying the complex-step-derivative approximation to the linearizations of the weak forms of the balance of linear momentum and the balance of energy. By incorporating consistent perturbations along the imaginary axis to the displacement as well as thermal degrees of freedom, we demonstrate that numerical tangent stiffness matrices can be obtained with accuracy up to computer precision, leading to quadratically converging schemes. The main advantage of this approach is that, contrary to the classical forward difference scheme, no round-off errors due to floating-point arithmetic exist within the calculation of the tangent stiffness. This enables arbitrarily small perturbation values and therefore leads to robust schemes even when choosing small values. An efficient algorithmic treatment is presented which enables a straightforward implementation of the method in any standard finite-element program. By means of thermo-elastic and thermo-elastoplastic boundary value problems at finite strains, the performance of the proposed approach is analyzed.
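
    The scalar prototype of the complex-step derivative approximation is f'(x) ≈ Im f(x + ih)/h; building a tangent stiffness column by column amounts to applying it with one degree of freedom perturbed at a time. The sketch below demonstrates the scalar version and its freedom from subtractive cancellation; the test function is arbitrary and the sketch is not the paper's finite element implementation.

    ```python
    import numpy as np

    def complex_step_derivative(f, x, h=1e-30):
        """f'(x) ~ Im(f(x + i*h)) / h, accurate to machine precision.

        There is no subtraction of nearly equal numbers, so h can be made
        extremely small (here 1e-30) without introducing round-off error.
        """
        return np.imag(f(x + 1j * h)) / h

    f = lambda x: np.exp(x) * np.sin(x) / (x ** 2 + 1.0)
    x0 = 0.7
    exact = (np.exp(x0) * (np.sin(x0) + np.cos(x0)) * (x0 ** 2 + 1.0)
             - np.exp(x0) * np.sin(x0) * 2.0 * x0) / (x0 ** 2 + 1.0) ** 2

    print(complex_step_derivative(f, x0) - exact)      # ~1e-16

    # Forward difference for comparison: its error grows once h is too small.
    for h in (1e-6, 1e-10, 1e-14):
        print(h, (f(x0 + h) - f(x0)) / h - exact)
    ```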

  11. The Cretaceous superchron geodynamo: Observations near the tangent cylinder

    PubMed Central

    Tarduno, John A.; Cottrell, Rory D.; Smirnov, Alexei V.

    2002-01-01

    If relationships exist between the frequency of geomagnetic reversals and the morphology, secular variation, and intensity of Earth's magnetic field, they should be best expressed during superchrons, intervals tens of millions of years long lacking reversals. Here we report paleomagnetic and paleointensity data from lavas of the Cretaceous Normal Polarity Superchron that formed at high latitudes near the tangent cylinder that surrounds the solid inner core. The time-averaged field recorded by these lavas is remarkably strong and stable. When combined with global results available from lower latitudes, these data define a time-averaged field that is overwhelmingly dominated by the axial dipole (octupole components are insignificant). These observations suggest that the basic features of the geomagnetic field are intrinsically related. Superchrons may reflect times when the nature of core–mantle boundary heat flux allows the geodynamo to operate at peak efficiency. PMID:12388778

  12. Preliminary Analysis of Chang'E-2 Microwave Brightness Temperature Maps of the Moon

    NASA Astrophysics Data System (ADS)

    Blewett, D. T.; Zheng, Y. C.; Chan, K. L.; Neish, C.; Tsang, K. T.; Zhu, Y. C.; Jozwiak, L.

    2016-12-01

    China's Chang'E-2 (CE-2) lunar orbiter carried a microwave radiometer (MRM) that conducted passive remote sensing of the Moon at 3, 7.8, 19.35 and 37 GHz during 2010-2011. Earlier, the Chang'E-1 MRM obtained lower spatial resolution microwave data from a 200-km orbit, higher than CE-2's 100-km orbit. The MRM datasets represent a unique set of measurements of a type that has not been conducted by any previous lunar mission. Thermal emission of the lunar surface was measured and calibrated to brightness temperature (TB). Spherical harmonics fits were then used to model the TB variation as functions of local time and latitude for each of the four channels. Using the spherical harmonics fits, the day- and nighttime TB maps measured at various local times were normalized to noon-time and midnight conditions. The resulting eight MRM TB maps provide key information on the surface and near-subsurface structure and thermophysical properties of the lunar regolith; this information is complementary to that derived from LRO Diviner observations in the infrared. We have observed many thermal anomalies on the Moon, i.e., hot regions in the daytime map and cold spots in the nighttime map. We find that the high-Ti maria heat up in the day and cool at night much more quickly than the other maria, which is attributable to the greater abundance of ilmenite (which has a higher dielectric loss tangent than silicate minerals) in the high-Ti basalts. We note interesting contrasts in thermal behavior among high-reflectance, rayed craters. For example, the high-reflectance rays of Tycho are cooler than the surroundings in the 3 GHz daytime and nighttime maps, while the prominent rays of some other craters like Giordano Bruno are not distinctive in the 3 GHz maps. These differences can be understood in terms of variations in composition, structure, and thermophysical properties of the ray materials.
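    A toy version of the normalization idea is sketched below: fit TB as a smooth function of local time and latitude by least squares, then shift each measurement by the difference between the model evaluated at its observation time and at local noon. The basis functions, synthetic data, and fit order are illustrative assumptions; they do not reproduce the mission's actual spherical-harmonic fit or calibration.

```python
import numpy as np

# Synthetic stand-in for the MRM measurements: TB as a function of local time
# t (hours) and latitude (degrees).
rng = np.random.default_rng(0)
t   = rng.uniform(0.0, 24.0, 2000)
lat = rng.uniform(-80.0, 80.0, 2000)
tb  = 210 + 40 * np.cos(2 * np.pi * (t - 12) / 24) - 0.004 * lat**2 + rng.normal(0, 1, t.size)

def design(t, lat, n_harm=2, n_lat=2):
    """Low-order harmonic basis in local time plus a polynomial in latitude."""
    cols = [np.ones_like(t)]
    for k in range(1, n_harm + 1):
        cols += [np.cos(2 * np.pi * k * t / 24), np.sin(2 * np.pi * k * t / 24)]
    for p in range(1, n_lat + 1):
        cols.append(np.deg2rad(lat) ** p)
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(design(t, lat), tb, rcond=None)

def normalize_to_noon(tb_obs, t_obs, lat_obs):
    """Shift a daytime measurement to noon conditions using the fitted model."""
    model = lambda tt: design(np.atleast_1d(tt), np.atleast_1d(lat_obs)) @ coef
    return tb_obs + (model(12.0) - model(t_obs))[0]

print(normalize_to_noon(235.0, 15.5, 20.0))
```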

  13. Cable Effects Study. Tangents, Rabbit Holes, Dead Ends, and Valuable Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ardelean, Emil V.; Babuška, Vít; Goodding, James C.

    Lessons learned during a study on the effects that electrical power and signal wiring harness cables introduce on the dynamic response of precision spacecraft are presented, along with the most significant results. Our study was a three year effort to discover a set of practical approaches for updating well-defined dynamic models of harness-free structures where the cable type, position, and tie-down method are known. Although cables are found on every satellite, the focus was on precision, low damping, and very flexible structures. Obstacles encountered, classified as tangents, rabbit holes, and dead ends, offer practical lessons for structural dynamics research. The paper traces the historical, experiential progression of the project, describing how the obstacles affected the project. Methods were developed to estimate cable properties. Problems were encountered because of the flexible, highly damped nature of cables. A beam was used as a test article to validate experimentally derived cable properties and to refine the assumptions regarding boundary conditions. Furthermore, a spacecraft bus-like panel with cables attached was designed, and finite element models were developed and validated through experiment. Various paths were investigated at each stage before a consistent test and analysis methodology was developed.

  14. Cable Effects Study. Tangents, Rabbit Holes, Dead Ends, and Valuable Results

    DOE PAGES

    Ardelean, Emil V.; Babuška, Vít; Goodding, James C.; ...

    2014-08-04

    Lessons learned during a study on the effects that electrical power and signal wiring harness cables introduce on the dynamic response of precision spacecraft are presented, along with the most significant results. Our study was a three year effort to discover a set of practical approaches for updating well-defined dynamic models of harness-free structures where the cable type, position, and tie-down method are known. Although cables are found on every satellite, the focus was on precision, low damping, and very flexible structures. Obstacles encountered, classified as tangents, rabbit holes, and dead ends, offer practical lessons for structural dynamics research. The paper traces the historical, experiential progression of the project, describing how the obstacles affected the project. Methods were developed to estimate cable properties. Problems were encountered because of the flexible, highly damped nature of cables. A beam was used as a test article to validate experimentally derived cable properties and to refine the assumptions regarding boundary conditions. Furthermore, a spacecraft bus-like panel with cables attached was designed, and finite element models were developed and validated through experiment. Various paths were investigated at each stage before a consistent test and analysis methodology was developed.

  15. Assessing the Tangent Linear Behaviour of Common Tracer Transport Schemes and Their Use in a Linearised Atmospheric General Circulation Model

    NASA Technical Reports Server (NTRS)

    Holdaway, Daniel; Kent, James

    2015-01-01

    The linearity of a selection of common advection schemes is tested and examined with a view to their use in the tangent linear and adjoint versions of an atmospheric general circulation model. The schemes are tested within a simple offline one-dimensional periodic domain as well as using a simplified and complete configuration of the linearised version of NASA's Goddard Earth Observing System version 5 (GEOS-5). All schemes which prevent the development of negative values and preserve the shape of the solution are confirmed to have nonlinear behaviour. The piecewise parabolic method (PPM) with certain flux limiters, including that used by default in GEOS-5, is found to support linear growth near the shocks. This property can cause the rapid development of unrealistically large perturbations within the tangent linear and adjoint models. It is shown that these schemes with flux limiters should not be used within the linearised version of a transport scheme. The results from tests using GEOS-5 show that the current default scheme (a version of PPM) is not suitable for the tangent linear and adjoint model, and that using a linear third-order scheme for the linearised model produces better behaviour. Using the third-order scheme for the linearised model improves the correlations between the linear and non-linear perturbation trajectories for cloud liquid water and cloud liquid ice in GEOS-5.
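    A simple way to see why limited schemes are problematic in a tangent linear context is the standard linearity test: apply a scheme to a base state plus a scaled perturbation and check whether the response scales linearly. The sketch below runs this test on a toy 1D periodic advection problem with a linear upwind step and a slope-limited step; the grid, limiter, and Courant number are illustrative choices, not the GEOS-5 configuration.

```python
import numpy as np

n, c = 64, 0.4                                   # grid points, Courant number
x = np.linspace(0.0, 1.0, n, endpoint=False)

def upwind(q):
    """Linear first-order upwind step (exactly linear in q)."""
    return q - c * (q - np.roll(q, 1))

def limited(q):
    """Nonlinear step: minmod-limited (MUSCL-type) upwind step."""
    dqp, dqm = np.roll(q, -1) - q, q - np.roll(q, 1)
    slope = np.where(dqp * dqm > 0.0, np.sign(dqp) * np.minimum(np.abs(dqp), np.abs(dqm)), 0.0)
    face = q + 0.5 * (1.0 - c) * slope           # reconstructed value at the right face
    return q - c * (face - np.roll(face, 1))

q  = np.exp(-200.0 * (x - 0.3) ** 2)             # smooth base state
dq = 0.01 * np.sin(2.0 * np.pi * x)              # perturbation shape

for step in (upwind, limited):
    ref = (step(q + 1e-7 * dq) - step(q)) / 1e-7  # near-linear reference response
    for a in (1e-1, 1e-2, 1e-3):
        resp = (step(q + a * dq) - step(q)) / a
        # ratio stays at 1.0 for the linear scheme, drifts for the limited one
        print(step.__name__, a, np.linalg.norm(resp) / np.linalg.norm(ref))
```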

  16. The incidence and effect of fatty atrophy, positive tangent sign, and rotator cuff tears on outcomes after total shoulder arthroplasty.

    PubMed

    Choate, W Stephen; Shanley, Ellen; Washburn, Richard; Tolan, Stefan J; Salim, Tariq I; Tadlock, Josh; Shealy, Elizabeth C; Long, Catherine D; Crawford, Ashley E; Kissenberth, Michael J; Lonergan, Keith T; Hawkins, Richard J; Tokish, John M

    2017-12-01

    Treatment choices for total shoulder arthroplasty (TSA) in the absence of full-thickness rotator cuff tears (RCTs) are not clearly defined in current literature. This study investigated the prevalence and effect of preoperative partial-thickness RCTs and muscular degenerative changes on postoperative outcomes after TSA. Medical records and magnetic resonance imaging studies were reviewed for patients who underwent TSA for primary glenohumeral osteoarthritis with minimum 2-year follow-up to determine preoperative tear classification, Goutallier grade, and supraspinatus tangent sign. Postoperative pain on the visual analog scale, range of motion, and patient outcomes scores were obtained to correlate preoperative RCT status, Goutallier grading, tangent sign, and postoperative outcomes. Patients with full-thickness RCT on preoperative magnetic resonance imaging were excluded. Forty-five patients met all inclusion criteria (average age, 65 ± 10 years; average follow-up, 43 months). Of the patients undergoing TSA, 40% had a significant (>50% thickness) partial RCT. Grade 3 to 4 Goutallier changes were noted in 22% of all patients, and 13% demonstrated grade 3 to 4 changes in the context of no tear. Positive tangent sign was present in 7% of all patients. The preoperative Goutallier grade of the infraspinatus was significantly negatively correlated with postoperative forward elevation (P = .02) and external rotation (P = .05), but rotator cuff pathology, including tear status, Goutallier grade, and the presence of a tangent sign, did not correlate with postoperative functional outcome scores. Even in the absence of a full-thickness RCT, rotator cuff atrophy, fatty infiltration, and partial thickness tearing are common findings. Although postoperative range of motion is correlated to Goutallier changes of the infraspinatus, rotator cuff pathology is not correlated to outcomes after TSA; therefore, one may proceed with TSA without concern of their effect on

  17. MAP - a mapping and analysis program for harvest planning

    Treesearch

    Robert N. Eli; Chris B. LeDoux; Penn A. Peters

    1984-01-01

    The Northeastern Forest Experiment Station and the Department of Civil Engineering at West Virginia University are cooperating in the development of a Mapping and Analysis Program, to be named MAP. The goal of this computer software package is to significantly improve the planning and harvest efficiency of small to moderately sized harvest units located in mountainous...

  18. Tangent function transformation of the Abbreviated Injury Scale improves accuracy and simplifies scoring.

    PubMed

    Wang, Muding; Qiu, Wusi; Qiu, Fang; Mo, Yinan; Fan, Wenhui

    2015-03-16

    The Injury Severity Score (ISS) and the New Injury Severity Score (NISS) are widely used for anatomic severity assessments after trauma. We present here the Tangent Injury Severity Score (TISS), which transforms the Abbreviated Injury Scale (AIS) as a predictor of mortality. The TISS is defined as the sum, over a patient's three most severe AIS injuries regardless of body region, of 18.67 x [tan(AIS/6)]^3.04. TISS values were calculated for every patient in two large independent data sets: 3,908 and 4,171 patients treated during a 6-year period at level-3 first-class comprehensive hospitals: the Affiliated Hospital of Hangzhou Normal University and Fengtian Hospital Affiliated to Shenyang Medical College, China. The power of TISS to predict mortality was compared with previously calculated NISS values for the same patients in each data set. The TISS is more predictive of survival than NISS (Hangzhou: receiver operating characteristic (ROC): NISS = 0.929, TISS = 0.949; p = 0.002; Shenyang: ROC: NISS = 0.924, TISS = 0.942; p = 0.008). Moreover, TISS provides a better fit throughout its entire range of prediction (Hosmer-Lemeshow statistic for Hangzhou: NISS = 29.71, p < 0.001; TISS = 19.59, p = 0.003; Hosmer-Lemeshow statistic for Shenyang: NISS = 33.49, p < 0.001; TISS = 21.19, p = 0.002). The TISS shows more accurate prediction of prognosis and a linear relation to mortality. The TISS might be a better injury scoring tool with simple computation.
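    Read literally, the definition above can be coded in a few lines. The sketch below is only one plausible reading of the abstract's formula: in particular, interpreting the argument of the tangent as AIS/6 in radians is an assumption made here, not a detail taken from the paper, and the example AIS codes are invented.

```python
import math

def tiss(ais_scores):
    """Tangent Injury Severity Score, reading the abstract's formula literally:
    sum over the three most severe AIS injuries of 18.67 * tan(AIS/6) ** 3.04.
    The radian interpretation of tan(AIS/6) is an assumption, not taken from
    the original paper."""
    top3 = sorted(ais_scores, reverse=True)[:3]
    return sum(18.67 * math.tan(a / 6.0) ** 3.04 for a in top3)

print(round(tiss([5, 3, 3, 2]), 1))   # hypothetical patient with four coded injuries
```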

  19. Benefits Mapping and Analysis Program (BenMAP)

    EPA Pesticide Factsheets

    This area summarizes the key features of the BenMAP-CE program and links to pages that provide more details regarding the program, the basic principles of air pollution benefits analysis and a link to download the software.

  20. Dynamic Geometry Software and Tracing Tangents in the Context of the Mean Value Theorem: Technique and Theory Production

    ERIC Educational Resources Information Center

    Martínez-Hernández, Cesar; Ulloa-Azpeitia, Ricardo

    2017-01-01

    Based on the theoretical elements of the instrumental approach to tool use known as Task-Technique-Theory (Artigue, 2002), this paper analyses and discusses the performance of graduate students enrolled in a Teacher Training program. The latter performance relates to tracing tangent lines to the curve of a quadratic function in Dynamic Geometry…

  1. StructMap: Elastic Distance Analysis of Electron Microscopy Maps for Studying Conformational Changes.

    PubMed

    Sanchez Sorzano, Carlos Oscar; Alvarez-Cabrera, Ana Lucia; Kazemi, Mohsen; Carazo, Jose María; Jonić, Slavica

    2016-04-26

    Single-particle electron microscopy (EM) has been shown to be very powerful for studying structures and associated conformational changes of macromolecular complexes. In the context of analyzing conformational changes of complexes, distinct EM density maps obtained by image analysis and three-dimensional (3D) reconstruction are usually analyzed in 3D for interpretation of structural differences. However, graphic visualization of these differences based on a quantitative analysis of elastic transformations (deformations) among density maps has not been done yet due to a lack of appropriate methods. Here, we present an approach that allows such visualization. This approach is based on statistical analysis of distances among elastically aligned pairs of EM maps (one map is deformed to fit the other map), and results in visualizing EM maps as points in a lower-dimensional distance space. The distances among points in the new space can be analyzed in terms of clusters or trajectories of points related to potential conformational changes. The results of the method are shown with synthetic and experimental EM maps at different resolutions. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
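    The final visualization step, mapping each EM density map to a point in a low-dimensional space from a matrix of pairwise elastic distances, can be sketched with classical multidimensional scaling. The distance matrix below is random and only stands in for the distances obtained after elastic alignment; the alignment step itself is not shown.

```python
import numpy as np
from sklearn.manifold import MDS

# Random symmetric "distance" matrix standing in for pairwise elastic
# distances between eight EM maps.
rng = np.random.default_rng(0)
n_maps = 8
D = rng.uniform(0.5, 2.0, (n_maps, n_maps))
D = 0.5 * (D + D.T)
np.fill_diagonal(D, 0.0)

# Embed the maps as 2-D points; clusters/trajectories in this space can then
# be inspected for potential conformational changes.
embedding = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
points = embedding.fit_transform(D)
print(points)
```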

  2. Measurement of the loss tangent of low-density polyethylene with a nanoindentation technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loubet, J. L.; Oliver, W. C.; Lucas, B. N.

    2000-05-01

    This paper describes experimental measurements of the linear viscoelastic behavior of the surface of low-density (LD) polyethylene in contact with a pyramidal Berkovich diamond indenter. The experiments were carried out at two different temperatures, 15.9 and 27.2 °C, between frequencies of 0.1 and 800 Hz. Using the shift of the loss tangent between the two temperatures at frequencies lower than 20 Hz and an Arrhenius equation, an activation energy of 105±2 kJ/mol was obtained. This value is in good agreement with the bulk value of the α relaxation of LD polyethylene reported in the literature. (c) 2000 Materials Research Society.
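    The Arrhenius step amounts to Ea = R ln(f2/f1) / (1/T1 - 1/T2), where f1 and f2 are the frequencies at which the same loss-tangent value is observed at the two temperatures. The sketch below evaluates this; the frequency pair is hypothetical, chosen only to illustrate a shift of the reported magnitude, and is not taken from the measurements.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def activation_energy(f1, T1_C, f2, T2_C):
    """Arrhenius estimate from the frequency shift of equal-loss-tangent points
    measured at two temperatures (given in Celsius)."""
    T1, T2 = T1_C + 273.15, T2_C + 273.15
    return R * np.log(f2 / f1) / (1.0 / T1 - 1.0 / T2)

# Hypothetical shift: a loss-tangent feature at 1.0 Hz (15.9 C) moves to 5.2 Hz (27.2 C)
print(activation_energy(1.0, 15.9, 5.2, 27.2) / 1e3, "kJ/mol")
```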

  3. Fractional Snow Cover Mapping by Artificial Neural Networks and Support Vector Machines

    NASA Astrophysics Data System (ADS)

    Çiftçi, B. B.; Kuter, S.; Akyürek, Z.; Weber, G.-W.

    2017-11-01

    Snow is an important land cover whose distribution over space and time plays a significant role in various environmental processes. Hence, snow cover mapping with high accuracy is necessary to have a real understanding for present and future climate, water cycle, and ecological changes. This study aims to investigate and compare the design and use of artificial neural networks (ANNs) and support vector machines (SVMs) algorithms for fractional snow cover (FSC) mapping from satellite data. ANN and SVM models with different model building settings are trained by using Moderate Resolution Imaging Spectroradiometer surface reflectance values of bands 1-7, normalized difference snow index and normalized difference vegetation index as predictor variables. Reference FSC maps are generated from higher spatial resolution Landsat ETM+ binary snow cover maps. Results on the independent test data set indicate that the developed ANN model with hyperbolic tangent transfer function in the output layer and the SVM model with radial basis function kernel produce high FSC mapping accuracies with the corresponding values of R = 0.93 and R = 0.92, respectively.
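    A minimal sketch of the two regression approaches being compared is given below, using scikit-learn. The training data are synthetic stand-ins for the MODIS reflectance / Landsat-derived FSC pairs used in the study, and scikit-learn's MLPRegressor uses a linear output layer, so it only approximates the tanh-output network described in the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

# Toy predictors: 7 band reflectances + NDSI + NDVI; toy target: FSC in [0, 1].
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, (500, 9))
y = np.clip(0.8 * X[:, 7] - 0.3 * X[:, 8] + 0.1 * rng.normal(size=500), 0.0, 1.0)

ann = MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                   max_iter=5000, random_state=0).fit(X, y)
svm = SVR(kernel="rbf", C=10.0, gamma="scale").fit(X, y)

X_test = rng.uniform(0.0, 1.0, (100, 9))
fsc_ann = np.clip(ann.predict(X_test), 0.0, 1.0)   # clamp predictions to valid FSC range
fsc_svm = np.clip(svm.predict(X_test), 0.0, 1.0)
print(np.corrcoef(fsc_ann, fsc_svm)[0, 1])
```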

  4. MAP stability, design, and analysis

    NASA Technical Reports Server (NTRS)

    Ericsson-Jackson, A. J.; Andrews, S. F.; O'Donnell, J. R., Jr.; Markley, F. L.

    1998-01-01

    The Microwave Anisotropy Probe (MAP) is a follow-on to the Differential Microwave Radiometer (DMR) instrument on the Cosmic Background Explorer (COBE) spacecraft. The design and analysis of the MAP attitude control system (ACS) have been refined since work previously reported. The full spacecraft and instrument flexible model was developed in NASTRAN, and the resulting flexible modes were plotted and reduced with the Modal Significance Analysis Package (MSAP). The reduced-order model was used to perform the linear stability analysis for each control mode, the results of which are presented in this paper. Although MAP is going to a relatively disturbance-free Lissajous orbit around the Earth-Sun L(2) Lagrange point, a detailed disturbance-torque analysis is required because there are only a small number of opportunities for momentum unloading each year. Environmental torques, including solar pressure at L(2), aerodynamic and gravity gradient during phasing-loop orbits, were calculated and simulated. Thruster plume impingement torques that could affect the performance of the thruster modes were estimated and simulated, and a simple model of fuel slosh was derived to model its effect on the motion of the spacecraft. In addition, a thruster mode linear impulse controller was developed to meet the accuracy requirements of the phasing loop burns. A dynamic attitude error limiter was added to improve the performance of the ACS during large attitude slews. The result of this analysis is a stable ACS subsystem that meets all of the mission's requirements.

  5. A Tangent Bundle Theory for Visual Curve Completion.

    PubMed

    Ben-Yosef, Guy; Ben-Shahar, Ohad

    2012-07-01

    Visual curve completion is a fundamental perceptual mechanism that completes the missing parts (e.g., due to occlusion) between observed contour fragments. Previous research into the shape of completed curves has generally followed an "axiomatic" approach, where desired perceptual/geometrical properties are first defined as axioms, followed by mathematical investigation into curves that satisfy them. However, determining psychophysically such desired properties is difficult and researchers still debate what they should be in the first place. Instead, here we exploit the observation that curve completion is an early visual process to formalize the problem in the unit tangent bundle R^2 × S^1, which abstracts the primary visual cortex (V1) and facilitates exploration of basic principles from which perceptual properties are later derived rather than imposed. Exploring here the elementary principle of least action in V1, we show how the problem becomes one of finding minimum-length admissible curves in R^2 × S^1. We formalize the problem in variational terms, we analyze it theoretically, and we formulate practical algorithms for the reconstruction of these completed curves. We then explore their induced visual properties vis-à-vis popular perceptual axioms and show how our theory predicts many perceptual properties reported in the corresponding perceptual literature. Finally, we demonstrate a variety of curve completions and report comparisons to psychophysical data and other completion models.

  6. Learning in fully recurrent neural networks by approaching tangent planes to constraint surfaces.

    PubMed

    May, P; Zhou, E; Lee, C W

    2012-10-01

    In this paper we present a new variant of the online real time recurrent learning algorithm proposed by Williams and Zipser (1989). Whilst the original algorithm utilises gradient information to guide the search towards the minimum training error, it is very slow in most applications and often gets stuck in local minima of the search space. It is also sensitive to the choice of learning rate and requires careful tuning. The new variant adjusts weights by moving to the tangent planes to constraint surfaces. It is simple to implement and requires no parameters to be set manually. Experimental results show that this new algorithm gives significantly faster convergence whilst avoiding problems like local minima. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Linear response formula for piecewise expanding unimodal maps

    NASA Astrophysics Data System (ADS)

    Baladi, Viviane; Smania, Daniel

    2008-04-01

    The average $R(t)=\int \varphi \, d\mu_t$ of a smooth function $\varphi$ with respect to the SRB measure $\mu_t$ of a smooth one-parameter family $f_t$ of piecewise expanding interval maps is not always Lipschitz (Baladi 2007 Commun. Math. Phys. 275 839-59; Mazzolena 2007 Master's Thesis Rome 2, Tor Vergata). We prove that if $f_t$ is tangent to the topological class of $f$, and if $\partial_t f_t|_{t=0} = X \circ f$, then $R(t)$ is differentiable at zero, and $R'(0)$ coincides with the resummation proposed (Baladi 2007) of the (a priori divergent) series $\sum_{n=0}^{\infty} \int X(y) \, \partial_y (\varphi \circ f^n)(y) \, d\mu_0(y)$ given by Ruelle's conjecture. In fact, we show that $t \mapsto \mu_t$ is differentiable in the space of Radon measures. Linear response is violated if and only if $f_t$ is transversal to the topological class of $f$.

  8. Boundary layer flow of MHD tangent hyperbolic nanofluid over a stretching sheet: A numerical investigation

    NASA Astrophysics Data System (ADS)

    Khan, Mair; Hussain, Arif; Malik, M. Y.; Salahuddin, T.; Khan, Farzana

    This article presents the two-dimensional flow of MHD hyperbolic tangent fluid with nanoparticles towards a stretching surface. The mathematical modelling of the flow yields a nonlinear set of partial differential equations, which are then reduced to ordinary differential equations by using suitable scaling transforms. The resulting equations are solved by using a shooting technique. The behaviour of the involved physical parameters (Weissenberg number We, Hartmann number M, Prandtl number Pr, Brownian motion parameter Nb, Lewis number Le and thermophoresis number Nt) on velocity, temperature and concentration is interpreted in detail. Additionally, the local skin friction, local Nusselt number and local Sherwood number are computed and analyzed. It is found that the Weissenberg number and Hartmann number decelerate the fluid motion. Brownian motion and thermophoresis both enhance the fluid temperature. The local Sherwood number is an increasing function, whereas the Nusselt number is a decreasing function, of the Brownian motion parameter Nb, Prandtl number Pr, thermophoresis parameter Nt and Lewis number Le. Additionally, the computed results are compared with the existing literature to validate the accuracy of the solution; the present results closely resemble the reported data.
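    The shooting technique referred to here can be illustrated on the classical Blasius boundary-layer problem, which has the same structure (a higher-order ODE with a far-field condition): guess the missing wall gradient, integrate the ODE, and iterate until the far-field boundary condition is satisfied. The sketch below is not the paper's tangent-hyperbolic nanofluid system; it only demonstrates the numerical idea.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Blasius problem: f''' + 0.5 f f'' = 0, f(0) = f'(0) = 0, f'(inf) = 1.
ETA_INF = 10.0                          # numerical stand-in for "infinity"

def rhs(eta, y):                        # y = [f, f', f'']
    return [y[1], y[2], -0.5 * y[0] * y[2]]

def residual(fpp0):
    """Mismatch in the far-field condition f'(inf) = 1 for a guessed f''(0)."""
    sol = solve_ivp(rhs, [0.0, ETA_INF], [0.0, 0.0, fpp0], rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - 1.0

fpp0 = brentq(residual, 0.1, 1.0)       # bracket and refine the wall-shear guess
print(fpp0)                             # ~0.332, the well-known Blasius value
```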

  9. Unified treatment of microscopic boundary conditions and efficient algorithms for estimating tangent operators of the homogenized behavior in the computational homogenization method

    NASA Astrophysics Data System (ADS)

    Nguyen, Van-Dung; Wu, Ling; Noels, Ludovic

    2017-03-01

    This work provides a unified treatment of arbitrary kinds of microscopic boundary conditions usually considered in the multi-scale computational homogenization method for nonlinear multi-physics problems. An efficient procedure is developed to enforce the multi-point linear constraints arising from the microscopic boundary condition either by the direct constraint elimination or by the Lagrange multiplier elimination methods. The macroscopic tangent operators are computed in an efficient way from a linear system with multiple right-hand sides, whose left-hand-side matrix is the stiffness matrix of the microscopic linearized system at the converged solution. The number of right-hand-side vectors equals the number of macroscopic kinematic variables used to formulate the microscopic boundary condition. Since the microscopic linearized system is often solved by a direct factorization procedure, the macroscopic tangent operators can then be computed using this factorized matrix at a reduced computational cost.
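    The computational saving described in the last sentences can be sketched directly: factorize the microscopic stiffness matrix once, then reuse the factorization for one right-hand side per macroscopic kinematic variable. The matrices below are random stand-ins for the assembled finite-element quantities.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

n_dof, n_macro = 200, 6                      # e.g. six macroscopic strain components
rng = np.random.default_rng(0)
K = rng.normal(size=(n_dof, n_dof))
K = K @ K.T + n_dof * np.eye(n_dof)          # symmetric positive definite, like a stiffness matrix
F = rng.normal(size=(n_dof, n_macro))        # one right-hand side per macroscopic variable

lu, piv = lu_factor(K)                       # single factorization (the dominant cost)
S = lu_solve((lu, piv), F)                   # all sensitivities in one back-substitution pass

# The homogenized tangent then follows from contractions of S with boundary data;
# its exact form depends on the chosen microscopic boundary condition.
print(S.shape)
```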

  10. Hydrogen Production Cost Analysis Map (Text Version)

    Science.gov Websites

    Below is a text version of the U.S. map that provides the results of NREL's hydrogen production cost analysis.

  11. Driver Gaze Behavior Is Different in Normal Curve Driving and when Looking at the Tangent Point

    PubMed Central

    Itkonen, Teemu; Pekkanen, Jami; Lappi, Otto

    2015-01-01

    Several steering models in the visual science literature attempt to capture the visual strategies in curve driving. Some of them are based on steering points on the future path (FP), others on tangent points (TP). It is, however, challenging to differentiate between the models’ predictions in real–world contexts. Analysis of optokinetic nystagmus (OKN) parameters is one useful measure, as the different strategies predict measurably different OKN patterns. Here, we directly test this prediction by asking drivers to either a) “drive as they normally would” or b) to “look at the TP”. The design of the experiment is similar to a previous study by Kandil et al., but uses more sophisticated methods of eye–movement analysis. We find that the eye-movement patterns in the “normal” condition are indeed markedly different from the “tp” condition, and consistent with drivers looking at waypoints on the future path. This is the case for both overall fixation distribution, as well as the more informative fixation–by–fixation analysis of OKN. We find that the horizontal gaze speed during OKN corresponds well to the quantitative prediction of the future path models. The results also definitively rule out the alternative explanation that the OKN is produced by an involuntary reflex even while the driver is “trying” to look at the TP. The results are discussed in terms of the sequential organization of curve driving. PMID:26287914

  12. A cost-benefit analysis of The National Map

    USGS Publications Warehouse

    Halsing, David L.; Theissen, Kevin; Bernknopf, Richard

    2003-01-01

    The Geography Discipline of the U.S. Geological Survey (USGS) has conducted this cost-benefit analysis (CBA) of The National Map. This analysis is an evaluation of the proposed Geography Discipline initiative to provide the Nation with a mechanism to access current and consistent digital geospatial data. This CBA is a supporting document to accompany the Exhibit 300 Capital Asset Plan and Business Case of The National Map Reengineering Program. The framework for estimating the benefits is based on expected improvements in processing information to perform any of the possible applications of spatial data. This analysis does not attempt to determine the benefits and costs of performing geospatial-data applications. Rather, it estimates the change in the differences between those benefits and costs with The National Map and the current situation without it. The estimates of total costs and benefits of The National Map were based on the projected implementation time, development and maintenance costs, rates of data inclusion and integration, expected usage levels over time, and a benefits estimation model. The National Map provides data that are current, integrated, consistent, complete, and more accessible in order to decrease the cost of implementing spatial-data applications and (or) improve the outcome of those applications. The efficiency gains in per-application improvements are greater than the cost to develop and maintain The National Map, meaning that the program would bring a positive net benefit to the Nation. The average improvement in the net benefit of performing a spatial data application was multiplied by a simulated number of application implementations across the country. The numbers of users, existing applications, and rates of application implementation increase over time as The National Map is developed and accessed by spatial data users around the country. Results from the 'most likely' estimates of model parameters and data inputs indicate that

  13. Concept Landscapes: Aggregating Concept Maps for Analysis

    ERIC Educational Resources Information Center

    Mühling, Andreas

    2017-01-01

    This article presents "concept landscapes"--a novel way of investigating the state and development of knowledge structures in groups of persons using concept maps. Instead of focusing on the assessment and evaluation of single maps, the data of many persons is aggregated and data mining approaches are used in analysis. New insights into…

  14. Numerical prediction of rail roughness growth on tangent railway tracks

    NASA Astrophysics Data System (ADS)

    Nielsen, J. C. O.

    2003-10-01

    Growth of railhead roughness (irregularities, waviness) is predicted through numerical simulation of dynamic train-track interaction on tangent track. The hypothesis is that wear is caused by longitudinal slip due to driven wheelsets, and that wear is proportional to the longitudinal frictional power in the contact patch. Emanating from an initial roughness spectrum corresponding to a new or a recent ground rail, an initial roughness profile is determined. Wheel-rail contact forces, creepages and wear for one wheelset passage are calculated in relation to location along a discretely supported track model. The calculated wear is scaled by a chosen number of wheelset passages, and is then added to the initial roughness profile. Field observations of rail corrugation on a Dutch track are used to validate the simulation model. Results from the simulations predict a large roughness growth rate for wavelengths around 30-40 mm. The large growth in this wavelength interval is explained by a low track receptance near the sleepers around the pinned-pinned resonance frequency, in combination with a large number of driven passenger wheelset passages at uniform speed. The agreement between simulations and field measurements is good with respect to dominating roughness wavelength and annual wear rate. Remedies for reducing roughness growth are discussed.

  15. Mask Analysis Program (MAP) reference manual

    NASA Technical Reports Server (NTRS)

    Mitchell, C. L.

    1976-01-01

    A document intended to serve as a User's Manual and a Programmer's Manual for the Mask Analysis Program is presented. The first portion of the document is devoted to the user. It contains all of the information required to execute MAP. The remainder of the document describes the details of MAP software logic. Although the information in this portion is not required to run the program, it is recommended that every user review it to gain an appreciation for the program functions.

  16. 76 FR 78015 - Revised Analysis and Mapping Procedures for Non-Accredited Levees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-15

    ...] Revised Analysis and Mapping Procedures for Non-Accredited Levees AGENCY: Federal Emergency Management... comments on the proposed solution for Revised Analysis and Mapping Procedures for Non-Accredited Levees. This document proposes a revised procedure for the analysis and mapping of non-accredited levees on...

  17. BrainMap VBM: An environment for structural meta-analysis.

    PubMed

    Vanasse, Thomas J; Fox, P Mickle; Barron, Daniel S; Robertson, Michaela; Eickhoff, Simon B; Lancaster, Jack L; Fox, Peter T

    2018-05-02

    The BrainMap database is a community resource that curates peer-reviewed, coordinate-based human neuroimaging literature. By pairing the results of neuroimaging studies with their relevant meta-data, BrainMap facilitates coordinate-based meta-analysis (CBMA) of the neuroimaging literature en masse or at the level of experimental paradigm, clinical disease, or anatomic location. Initially dedicated to the functional, task-activation literature, BrainMap is now expanding to include voxel-based morphometry (VBM) studies in a separate sector, titled: BrainMap VBM. VBM is a whole-brain, voxel-wise method that measures significant structural differences between or within groups which are reported as standardized, peak x-y-z coordinates. Here we describe BrainMap VBM, including the meta-data structure, current data volume, and automated reverse inference functions (region-to-disease profile) of this new community resource. CBMA offers a robust methodology for retaining true-positive and excluding false-positive findings across studies in the VBM literature. As with BrainMap's functional database, BrainMap VBM may be synthesized en masse or at the level of clinical disease or anatomic location. As a use-case scenario for BrainMap VBM, we illustrate a trans-diagnostic data-mining procedure wherein we explore the underlying network structure of 2,002 experiments representing over 53,000 subjects through independent components analysis (ICA). To reduce data-redundancy effects inherent to any database, we demonstrate two data-filtering approaches that proved helpful to ICA. Finally, we apply hierarchical clustering analysis (HCA) to measure network- and disease-specificity. This procedure distinguished psychiatric from neurological diseases. We invite the neuroscientific community to further exploit BrainMap VBM with other modeling approaches. © 2018 Wiley Periodicals, Inc.

  18. Drainage information analysis and mapping system.

    DOT National Transportation Integrated Search

    2012-10-01

    The primary objective of this research is to develop a Drainage Information Analysis and Mapping System (DIAMS), with online inspection data submission, which will comply with the necessary requirements mandated by both the Governmental Accounting...

  19. Weak field equations and generalized FRW cosmology on the tangent Lorentz bundle

    NASA Astrophysics Data System (ADS)

    Triantafyllopoulos, A.; Stavrinos, P. C.

    2018-04-01

    We study field equations for a weak anisotropic model on the tangent Lorentz bundle TM of a spacetime manifold. A geometrical extension of general relativity (GR) is considered by introducing the concept of local anisotropy, i.e. a direct dependence of geometrical quantities on the observer 4-velocity. In this approach, we consider a metric on TM as the sum of an h-Riemannian metric structure and a weak anisotropic perturbation; field equations with extra terms are obtained for this model. In addition, extended Raychaudhuri equations are studied in the framework of Finsler-like extensions. The canonical momentum and mass-shell equation are also generalized in relation to their GR counterparts. Quantization of the mass-shell equation leads to a generalization of the Klein–Gordon equation and of the dispersion relation for a scalar field. In this model the accelerated expansion of the universe can be attributed to the geometry itself. A cosmological bounce is modeled with the introduction of an anisotropic scalar field. Also, the electromagnetic field equations are directly incorporated in this framework.

  20. Methods of analysis and resources available for genetic trait mapping.

    PubMed

    Ott, J

    1999-01-01

    Methods of genetic linkage analysis are reviewed and put in context with other mapping techniques. Sources of information are outlined (books, web sites, computer programs). Special consideration is given to statistical problems in canine genetic mapping (heterozygosity, inbreeding, marker maps).

  1. Signalling maps in cancer research: construction and data analysis

    PubMed Central

    Kondratova, Maria; Sompairac, Nicolas; Barillot, Emmanuel; Zinovyev, Andrei

    2018-01-01

    Abstract Generation and usage of high-quality molecular signalling network maps can be augmented by standardizing notations, establishing curation workflows and application of computational biology methods to exploit the knowledge contained in the maps. In this manuscript, we summarize the major aims and challenges of assembling information in the form of comprehensive maps of molecular interactions. Mainly, we share our experience gained while creating the Atlas of Cancer Signalling Network. In the step-by-step procedure, we describe the map construction process and suggest solutions for map complexity management by introducing a hierarchical modular map structure. In addition, we describe the NaviCell platform, a computational technology using Google Maps API to explore comprehensive molecular maps similar to geographical maps and explain the advantages of semantic zooming principles for map navigation. We also provide the outline to prepare signalling network maps for navigation using the NaviCell platform. Finally, several examples of cancer high-throughput data analysis and visualization in the context of comprehensive signalling maps are presented. PMID:29688383

  2. Saturation of an Intra-Gene Pool Linkage Map: Towards a Unified Consensus Linkage Map for Fine Mapping and Synteny Analysis in Common Bean

    PubMed Central

    Galeano, Carlos H.; Fernandez, Andrea C.; Franco-Herrera, Natalia; Cichy, Karen A.; McClean, Phillip E.; Vanderleyden, Jos; Blair, Matthew W.

    2011-01-01

    Map-based cloning and fine mapping to find genes of interest and marker assisted selection (MAS) requires good genetic maps with reproducible markers. In this study, we saturated the linkage map of the intra-gene pool population of common bean DOR364×BAT477 (DB) by evaluating 2,706 molecular markers including SSR, SNP, and gene-based markers. On average the polymorphism rate was 7.7% due to the narrow genetic base between the parents. The DB linkage map consisted of 291 markers with a total map length of 1,788 cM. A consensus map was built using the core mapping populations derived from inter-gene pool crosses: DOR364×G19833 (DG) and BAT93×JALO EEP558 (BJ). The consensus map consisted of a total of 1,010 markers mapped, with a total map length of 2,041 cM across 11 linkage groups. On average, each linkage group on the consensus map contained 91 markers of which 83% were single copy markers. Finally, a synteny analysis was carried out using our highly saturated consensus maps compared with the soybean pseudo-chromosome assembly. A total of 772 marker sequences were compared with the soybean genome. A total of 44 syntenic blocks were identified. The linkage group Pv6 presented the most diverse pattern of synteny with seven syntenic blocks, and Pv9 showed the most consistent relations with soybean with just two syntenic blocks. Additionally, a co-linear analysis using common bean transcript map information against soybean coding sequences (CDS) revealed the relationship with 787 soybean genes. The common bean consensus map has allowed us to map a larger number of markers, to obtain a more complete coverage of the common bean genome. Our results, combined with synteny relationships provide tools to increase marker density in selected genomic regions to identify closely linked polymorphic markers for indirect selection, fine mapping or for positional cloning. PMID:22174773

  3. Mixed convection and heat generation/absorption aspects in MHD flow of tangent-hyperbolic nanoliquid with Newtonian heat/mass transfer

    NASA Astrophysics Data System (ADS)

    Qayyum, Sajid; Hayat, Tasawar; Shehzad, Sabir Ali; Alsaedi, Ahmed

    2018-03-01

    This article concentrates on the magnetohydrodynamic (MHD) stagnation point flow of a tangent hyperbolic nanofluid in the presence of buoyancy forces. The flow is caused by a stretching surface. Characteristics of heat transfer are examined under the influence of thermal radiation and heat generation/absorption. Newtonian conditions for heat and mass transfer are employed. The nanofluid model includes Brownian motion and thermophoresis. The governing nonlinear partial differential systems of the problem are transformed into systems of nonlinear ordinary differential equations through appropriate variables. The impact of the embedded parameters on the velocity, temperature and nanoparticle concentration fields is presented graphically. Numerical computations are made to obtain the values of the skin friction coefficient and the local Nusselt and Sherwood numbers. It is concluded that the velocity field is enhanced by the mixed convection parameter, while the reverse is observed for the power-law index. The effects of the Brownian motion parameter on the temperature and on the heat transfer rate are opposite. Moreover, the impact of the solutal conjugate parameter on the concentration and the local Sherwood number is similar.

  4. Improving estimates of genetic maps: a meta-analysis-based approach.

    PubMed

    Stewart, William C L

    2007-07-01

    Inaccurate genetic (or linkage) maps can reduce the power to detect linkage, increase type I error, and distort haplotype and relationship inference. To improve the accuracy of existing maps, I propose a meta-analysis-based method that combines independent map estimates into a single estimate of the linkage map. The method uses the variance of each independent map estimate to combine them efficiently, whether the map estimates use the same set of markers or not. As compared with a joint analysis of the pooled genotype data, the proposed method is attractive for three reasons: (1) it has comparable efficiency to the maximum likelihood map estimate when the pooled data are homogeneous; (2) relative to existing map estimation methods, it can have increased efficiency when the pooled data are heterogeneous; and (3) it avoids the practical difficulties of pooling human subjects data. On the basis of simulated data modeled after two real data sets, the proposed method can reduce the sampling variation of linkage maps commonly used in whole-genome linkage scans. Furthermore, when the independent map estimates are also maximum likelihood estimates, the proposed method performs as well as or better than when they are estimated by the program CRIMAP. Since variance estimates of maps may not always be available, I demonstrate the feasibility of three different variance estimators. Overall, the method should prove useful to investigators who need map positions for markers not contained in publicly available maps, and to those who wish to minimize the negative effects of inaccurate maps. Copyright 2007 Wiley-Liss, Inc.
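    One natural reading of the combination step is the standard fixed-effects inverse-variance weighting, sketched below with invented marker positions and variances; the paper's estimator may differ in detail, but the weighting-by-precision idea is the same.

```python
import numpy as np

# Independent estimates of one marker's map position (cM) and their variances.
positions = np.array([12.4, 13.1, 12.7])
variances = np.array([0.25, 0.40, 0.10])

w = 1.0 / variances                           # precision weights
combined = np.sum(w * positions) / np.sum(w)  # inverse-variance weighted estimate
combined_var = 1.0 / np.sum(w)                # variance of the combined estimate

print(combined, combined_var)
```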

  5. Processing and analysis of cardiac optical mapping data obtained with potentiometric dyes

    PubMed Central

    Laughner, Jacob I.; Ng, Fu Siong; Sulkin, Matthew S.; Arthur, R. Martin

    2012-01-01

    Optical mapping has become an increasingly important tool to study cardiac electrophysiology in the past 20 years. Multiple methods are used to process and analyze cardiac optical mapping data, and no consensus currently exists regarding the optimum methods. The specific methods chosen to process optical mapping data are important because inappropriate data processing can affect the content of the data and thus alter the conclusions of the studies. Details of the different steps in processing optical imaging data, including image segmentation, spatial filtering, temporal filtering, and baseline drift removal, are provided in this review. We also provide descriptions of the common analyses performed on data obtained from cardiac optical imaging, including activation mapping, action potential duration mapping, repolarization mapping, conduction velocity measurements, and optical action potential upstroke analysis. Optical mapping is often used to study complex arrhythmias, and we also discuss dominant frequency analysis and phase mapping techniques used for the analysis of cardiac fibrillation. PMID:22821993
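    As a concrete illustration of the processing steps listed above, the sketch below chains spatial filtering, temporal low-pass filtering, baseline-drift removal, and activation-time detection on a stand-in fluorescence movie. The filter choices (Gaussian spatial kernel, 100 Hz Butterworth low-pass, linear detrend) are illustrative; the review discusses the trade-offs among such choices rather than prescribing one.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import butter, filtfilt, detrend

fs = 1000.0                                       # sampling rate [Hz]
movie = np.random.rand(2000, 64, 64)              # stand-in for (frames, rows, cols) data

movie = gaussian_filter(movie, sigma=(0, 1.5, 1.5))      # spatial smoothing only
b, a = butter(3, 100.0 / (fs / 2), btype="low")          # 100 Hz temporal low-pass
movie = filtfilt(b, a, movie, axis=0)
movie = detrend(movie, axis=0, type="linear")            # crude baseline-drift removal

# Activation map: time of maximum dF/dt at each pixel
dfdt = np.gradient(movie, axis=0)
activation_ms = np.argmax(dfdt, axis=0) / fs * 1e3
print(activation_ms.shape)                               # (64, 64) activation-time map
```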

  6. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  7. A Graphics System for Pole-Zero Map Analysis.

    ERIC Educational Resources Information Center

    Beyer, William Fred, III

    Computer scientists have developed an interactive, graphical display system for pole-zero map analysis. They designed it for use as an educational tool in teaching introductory courses in automatic control systems. The facilities allow the user to specify a control system and an input function in the form of a pole-zero map and then examine the…

  8. Quality Analysis of Open Street Map Data

    NASA Astrophysics Data System (ADS)

    Wang, M.; Li, Q.; Hu, Q.; Zhou, M.

    2013-05-01

    Crowd-sourced geographic data are open-source geographic data contributed by large numbers of non-professionals and provided to the public. Typical crowd-sourced geographic data include GPS track data such as OpenStreetMap, collaborative map data such as Wikimapia, data from social websites such as Twitter and Facebook, POIs checked in by Jiepang users, and so on. After processing, these data can provide canonical geographic information for the public. Compared with conventional methods of geographic data collection and updating, crowd-sourced geographic data from non-professionals offer large data volume, high currency, abundant information and low cost, and have become a research hotspot in international geographic information science in recent years. Large-volume, highly current crowd-sourced geographic data provide a new solution for geospatial database updating, provided the quality problems of data obtained from non-professionals can be resolved. In this paper, a quality analysis model for OpenStreetMap (OSM) crowd-sourced geographic data is proposed. Firstly, a quality analysis framework is designed based on an analysis of the characteristics of OSM data. Secondly, a quality assessment model for OSM data based on three quality elements (completeness, thematic accuracy and positional accuracy) is presented. Finally, taking the OSM data of Wuhan as an example, the paper analyses and assesses the quality of the OSM data with the 2011 version of a navigation map as reference. The results show that the high-level roads and urban traffic network in the OSM data have high positional accuracy and completeness, so these OSM data can be used for updating urban road network databases.

  9. MAP Attitude Control System Design and Analysis

    NASA Technical Reports Server (NTRS)

    Andrews, S. F.; Campbell, C. E.; Ericsson-Jackson, A. J.; Markley, F. L.; ODonnell, J. R., Jr.

    1997-01-01

    The Microwave Anisotropy Probe (MAP) is a follow-on to the Differential Microwave Radiometer (DMR) instrument on the Cosmic Background Explorer (COBE) spacecraft. The MAP spacecraft will perform its mission in a Lissajous orbit around the Earth-Sun L(sub 2) Lagrange point to suppress potential instrument disturbances. To make a full-sky map of cosmic microwave background fluctuations, a combination fast spin and slow precession motion will be used. MAP requires a propulsion system to reach L(sub 2), to unload system momentum, and to perform stationkeeping maneuvers once at L(sub 2). A minimum hardware, power and thermal safe control mode must also be provided. Sufficient attitude knowledge must be provided to yield instrument pointing to a standard deviation of 1.8 arc-minutes. The short development time and tight budgets require a new way of designing, simulating, and analyzing the Attitude Control System (ACS). This paper presents the design and analysis of the control system to meet these requirements.

  10. Drainage identification analysis and mapping, phase 2.

    DOT National Transportation Integrated Search

    2017-01-01

    Drainage Identification, Analysis and Mapping System (DIAMS) is a computerized database that captures and stores relevant information associated with all aboveground and underground hydraulic structures belonging to the New Jersey Department of T...

  11. A tangent-ring optical TWDM-MAN enabling three-level transregional reconfigurations and shared protections by multipoint distributed control

    NASA Astrophysics Data System (ADS)

    Gou, Kaiyu; Gan, Chaoqin; Zhang, Xiaoyu; Zhang, Yuchao

    2018-03-01

    An optical time-and-wavelength-division-multiplexing metro-access network (TWDM-MAN) is proposed and demonstrated in this paper. By reusing the tangent-ring optical distribution network and designing a distributed control mechanism, ONUs that need to communicate with each other can be flexibly interconnected, making up three kinds of reconfigurable networks. Owing to the natural advantage of the ring topology for protection, three levels of comprehensive protection covering both feeder and distribution fibers are also achieved. Besides, a distributed wavelength allocation (DWA) scheme is designed to support efficient parallel upstream transmission. Analyses of capacity, congestion and transmission simulations show that this network performs well.

  12. Magnetic properties and energy-mapping analysis.

    PubMed

    Xiang, Hongjun; Lee, Changhoon; Koo, Hyun-Joo; Gong, Xingao; Whangbo, Myung-Hwan

    2013-01-28

    The magnetic energy levels of a given magnetic solid are closely packed in energy because the interactions between magnetic ions are weak. Thus, in describing its magnetic properties, one needs to generate its magnetic energy spectrum by employing an appropriate spin Hamiltonian. In this review article we discuss how to determine and specify a necessary spin Hamiltonian in terms of first principles electronic structure calculations on the basis of energy-mapping analysis and briefly survey important concepts and phenomena that one encounters in reading the current literature on magnetic solids. Our discussion is given on a qualitative level from the perspective of magnetic energy levels and electronic structures. The spin Hamiltonian appropriate for a magnetic system should be based on its spin lattice, i.e., the repeat pattern of its strong magnetic bonds (strong spin exchange paths), which requires one to evaluate its Heisenberg spin exchanges on the basis of energy-mapping analysis. Other weaker energy terms such as Dzyaloshinskii-Moriya (DM) spin exchange and magnetocrystalline anisotropy energies, which a spin Hamiltonian must include in certain cases, can also be evaluated by performing energy-mapping analysis. We show that the spin orientation of a transition-metal magnetic ion can be easily explained by considering its split d-block levels as unperturbed states with the spin-orbit coupling (SOC) as perturbation, that the DM exchange between adjacent spin sites can become comparable in strength to the Heisenberg spin exchange when the two spin sites are not chemically equivalent, and that the DM interaction between rare-earth and transition-metal cations is governed largely by the magnetic orbitals of the rare-earth cation.
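    The energy-mapping step described here reduces to linear algebra: the total energies of a few ordered spin configurations are equated to a Heisenberg Hamiltonian, giving a linear system for the exchange constants. The sketch below uses a hypothetical 4-site cluster with two exchange paths J1 and J2; the geometry and energies are invented, whereas real applications use DFT(+U) total energies.

```python
import numpy as np

# Toy cluster: four J1 bonds around a ring, two J2 bonds on the diagonals.
# E = E0 + sum_<ij> J_ij S_i.S_j for collinear configurations with spin S.
S = 1.5                                           # spin per magnetic ion

A = np.array([                                    # coefficients of (E0, J1, J2)
    [1.0,  4 * S**2,  2 * S**2],                  # all spins parallel
    [1.0, -4 * S**2,  2 * S**2],                  # up-down-up-down around the ring
    [1.0,  0.0,      -2 * S**2],                  # up-up-down-down around the ring
])
E = np.array([-100.0, -91.0, -97.0])              # hypothetical total energies (meV)

E0, J1, J2 = np.linalg.solve(A, E)
print(J1, J2)                                     # sign convention: J > 0 favours antiparallel spins
```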

  13. Earth mapping - aerial or satellite imagery comparative analysis

    NASA Astrophysics Data System (ADS)

    Fotev, Svetlin; Jordanov, Dimitar; Lukarski, Hristo

    Revising existing map products and creating new maps requires choosing a source of land-cover imagery. The trade-off between the effectiveness and cost of aerial mapping systems and the efficiency and cost of very-high-resolution satellite imagery is a topical issue [1, 2, 3, 4]. The price of any remotely sensed image depends on the product (panchromatic or multispectral), resolution, processing level, scale, urgency of the task, and on whether the needed image is available in the archive or has to be requested. The purpose of the present work is to make a comparative analysis of the two approaches to mapping the Earth with respect to two parameters, quality and cost, and to suggest an approach for selecting the source of map information: airplane-based or spacecraft-based imaging systems with very high spatial resolution. Two cases are considered: an area approximately equal to one satellite scene, and an area approximately equal to the territory of Bulgaria.

  14. Chromatic Image Analysis For Quantitative Thermal Mapping

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.
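    The underlying measurement principle is a ratio calibration: the relative brightness of the phosphor emission at two wavelengths varies monotonically with temperature, so an inverted calibration curve turns the per-pixel intensity ratio into a temperature map. The sketch below illustrates this with an invented calibration table and random stand-in images; it is not the CIAS processing chain.

```python
import numpy as np

# Invented calibration: intensity ratio I1/I2 measured at known temperatures.
cal_T     = np.array([300.0, 350.0, 400.0, 450.0, 500.0])   # [K]
cal_ratio = np.array([0.40, 0.55, 0.75, 1.00, 1.30])

def temperature_from_images(I1, I2):
    """Per-pixel temperature from the two wavelength-filtered images."""
    ratio = I1 / np.maximum(I2, 1e-12)           # avoid division by zero
    return np.interp(ratio, cal_ratio, cal_T)    # invert the calibration curve

I1 = np.random.rand(128, 128) + 0.5              # stand-ins for the two filtered images
I2 = np.random.rand(128, 128) + 0.5
T_map = temperature_from_images(I1, I2)
print(T_map.min(), T_map.max())
```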

  15. Variants for HDL-C, LDL-C and Triglycerides Identified from Admixture Mapping and Fine-Mapping Analysis in African-American Families

    PubMed Central

    Shetty, Priya B.; Tang, Hua; Feng, Tao; Tayo, Bamidele; Morrison, Alanna C.; Kardia, Sharon L.R.; Hanis, Craig L.; Arnett, Donna K.; Hunt, Steven C.; Boerwinkle, Eric; Rao, D.C.; Cooper, R.S.; Risch, Neil; Zhu, Xiaofeng

    2015-01-01

    Background Admixture mapping of lipids was followed-up by family-based association analysis to identify variants for cardiovascular disease in African-Americans. Methods and Results The present study conducted admixture mapping analysis for total cholesterol, high-density lipoprotein cholesterol (HDL-C), low-density lipoprotein cholesterol (LDL-C) and triglycerides. The analysis was performed in 1,905 unrelated African-American subjects from the National Heart, Lung and Blood Institute’s Family Blood Pressure Program. Regions showing admixture evidence were followed-up with family-based association analysis in 3,556 African-American subjects from the FBPP. The admixture mapping and family-based association analyses were adjusted for age, age2, sex, body-mass-index, and genome-wide mean ancestry to minimize the confounding due to population stratification. Regions that were suggestive of local ancestry association evidence were found on chromosomes 7 (LDL-C), 8 (HDL-C), 14 (triglycerides) and 19 (total cholesterol and triglycerides). In the fine-mapping analysis, 52,939 SNPs were tested and 11 SNPs (8 independent SNPs) showed nominal significant association with HDL-C (2 SNPs), LDL-C (4 SNPs) and triglycerides (5 SNPs). The family data was used in the fine-mapping to identify SNPs that showed novel associations with lipids and regions including genes with known associations for cardiovascular disease. Conclusions This study identified regions on chromosomes 7, 8, 14 and 19 and 11 SNPs from the fine-mapping analysis that were associated with HDL-C, LDL-C and triglycerides for further studies of cardiovascular disease in African-Americans. PMID:25552592

  16. Variants for HDL-C, LDL-C, and triglycerides identified from admixture mapping and fine-mapping analysis in African American families.

    PubMed

    Shetty, Priya B; Tang, Hua; Feng, Tao; Tayo, Bamidele; Morrison, Alanna C; Kardia, Sharon L R; Hanis, Craig L; Arnett, Donna K; Hunt, Steven C; Boerwinkle, Eric; Rao, Dabeeru C; Cooper, Richard S; Risch, Neil; Zhu, Xiaofeng

    2015-02-01

    Admixture mapping of lipids was followed-up by family-based association analysis to identify variants for cardiovascular disease in African Americans. The present study conducted admixture mapping analysis for total cholesterol, high-density lipoprotein cholesterol, low-density lipoprotein cholesterol, and triglycerides. The analysis was performed in 1905 unrelated African American subjects from the National Heart, Lung and Blood Institute's Family Blood Pressure Program (FBPP). Regions showing admixture evidence were followed-up with family-based association analysis in 3556 African American subjects from the FBPP. The admixture mapping and family-based association analyses were adjusted for age, age(2), sex, body mass index, and genome-wide mean ancestry to minimize the confounding caused by population stratification. Regions that were suggestive of local ancestry association evidence were found on chromosomes 7 (low-density lipoprotein cholesterol), 8 (high-density lipoprotein cholesterol), 14 (triglycerides), and 19 (total cholesterol and triglycerides). In the fine-mapping analysis, 52 939 single-nucleotide polymorphisms (SNPs) were tested and 11 SNPs (8 independent SNPs) showed nominal significant association with high-density lipoprotein cholesterol (2 SNPs), low-density lipoprotein cholesterol (4 SNPs), and triglycerides (5 SNPs). The family data were used in the fine-mapping to identify SNPs that showed novel associations with lipids and regions, including genes with known associations for cardiovascular disease. This study identified regions on chromosomes 7, 8, 14, and 19 and 11 SNPs from the fine-mapping analysis that were associated with high-density lipoprotein cholesterol, low-density lipoprotein cholesterol, and triglycerides for further studies of cardiovascular disease in African Americans. © 2014 American Heart Association, Inc.

  17. An analysis of logical thinking using mind mapping

    NASA Astrophysics Data System (ADS)

    Swestyani, S.; Masykuri, M.; Prayitno, B. A.; Rinanto, Y.; Widoretno, S.

    2018-05-01

    Brains can remember information in different forms, i.e. images, symbols, sounds, and senses, and the information is connected by logical gates. This information needs imagination and association to construct new, meaningful images. The purpose of this research was to describe a method of teaching based on Tony Buzan's mind mapping technique. This research showed how mind mapping can be used to measure students' logical thinking and how mind mapping can promote students' understanding in a meaningful way. A mind mapping test involving 31 students of grade XI in SMA Batik 2 Surakarta was used as the data collection method in this research. The Ohassta mind mapping rubric was then used to analyze the structure and content of the mind maps; the rubric includes four aspects, i.e. knowledge, communication, thinking, and application. Miles and Huberman's qualitative analysis was used to assess the obtained data. The results showed that the percentage for the knowledge aspect was 53.23%, the communication aspect 28.33%, the thinking aspect 28.33%, and the application aspect 41.53%. Mind mapping makes logical thinking visible so that the quality of learning that has occurred can be seen and explored. Using mind mapping in the course of teaching means that learning is no longer a complex and intractable process, measurable not only by proxy but also as an observable phenomenon.

  18. The isotropic-nematic and nematic-nematic phase transition of binary mixtures of tangent hard-sphere chain fluids: An analytical equation of state

    NASA Astrophysics Data System (ADS)

    van Westen, Thijs; Vlugt, Thijs J. H.; Gross, Joachim

    2014-01-01

    An analytical equation of state (EoS) is derived to describe the isotropic (I) and nematic (N) phases of linear and partially flexible tangent hard-sphere chain fluids and their mixtures. The EoS is based on an extension of Onsager's second virial theory that was developed in our previous work [T. van Westen, B. Oyarzún, T. J. H. Vlugt, and J. Gross, J. Chem. Phys. 139, 034505 (2013)]. Higher virial coefficients are calculated using a Vega-Lago rescaling procedure, which is hereby generalized to mixtures. The EoS is used to study (1) the effect of length bidispersity on the I-N and N-N phase behavior of binary linear tangent hard-sphere chain fluid mixtures, (2) the effect of partial molecular flexibility on the binary phase diagram, and (3) the solubility of hard-sphere solutes in I- and N-phase tangent hard-sphere chain fluids. By changing the length bidispersity, two types of phase diagrams were found. The first type is characterized by an I-N region at low pressure and a N-N demixed region at higher pressure that starts from an I-N-N triphase equilibrium. The second type does not show the I-N-N equilibrium. Instead, the N-N region starts from a lower critical point at a pressure above the I-N region. The results for the I-N region are in excellent agreement with the results from molecular simulations. It is shown that the N-N demixing is driven both by orientational and configurational/excluded volume entropy. By making the chains partially flexible, it is shown that the driving force resulting from the configurational entropy is reduced (due to a less anisotropic pair-excluded volume), resulting in a shift of the N-N demixed region to higher pressure. Compared to linear chains, no topological differences in the phase diagram were found. We show that the solubility of hard-sphere solutes decreases across the I-N phase transition. Furthermore, it is shown that by using a liquid crystal mixture as the solvent, the solubility difference can be maximized by tuning the
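
    For orientation, the Onsager-type second-virial description referenced in this abstract can be written for a single component in the standard textbook form below (a reference formula, not the authors' exact working equations): c is the number density, f(ω) the orientational distribution function, and V_excl the orientation-dependent pair-excluded volume; approximations such as the Vega-Lago rescaling mentioned above act on the last (two-body) term to account for higher virial coefficients.

      \frac{\beta F}{N} = \ln\!\left(c\,\Lambda^{3}\right) - 1
          + \int f(\omega)\,\ln\!\left[4\pi f(\omega)\right]\mathrm{d}\omega
          + \frac{c}{2}\iint f(\omega)\,f(\omega')\,V_{\mathrm{excl}}(\omega,\omega')\,\mathrm{d}\omega\,\mathrm{d}\omega'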

  19. Comparative Performance Analysis of a Hyper-Temporal Ndvi Analysis Approach and a Landscape-Ecological Mapping Approach

    NASA Astrophysics Data System (ADS)

    Ali, A.; de Bie, C. A. J. M.; Scarrott, R. G.; Ha, N. T. T.; Skidmore, A. K.

    2012-07-01

    Both agricultural area expansion and intensification are necessary to cope with the growing demand for food, and the growing threat of food insecurity which is rapidly engulfing poor and under-privileged sections of the global population. Therefore, it is of paramount importance to have the ability to accurately estimate crop area and spatial distribution. Remote sensing has become a valuable tool for estimating and mapping cropland areas, useful in food security monitoring. This work contributes to addressing this broad issue, focusing on the comparative performance analysis of two mapping approaches: (i) a hyper-temporal Normalized Difference Vegetation Index (NDVI) analysis approach and (ii) a Landscape-ecological approach. The hyper-temporal NDVI analysis approach utilized SPOT 10-day NDVI imagery from April 1998-December 2008, whilst the Landscape-ecological approach used multitemporal Landsat-7 ETM+ imagery acquired intermittently between 1992 and 2002. Pixels in the time-series NDVI dataset were clustered using an ISODATA clustering algorithm adapted to determine the optimal number of pixel clusters to successfully generalize hyper-temporal datasets. Clusters were then characterized with crop cycle information and flooding information to produce an NDVI unit map of rice classes with flood regime and NDVI profile information. A Landscape-ecological map was generated using a combination of digitized homogeneous map units in the Landsat-7 ETM+ imagery, a 2005 land use map of the Mekong delta, and supplementary datasets on the region's terrain, geomorphology and flooding depths. The output maps were validated using reported crop statistics, and regression analyses were used to ascertain the relationship between land use area estimated from maps and that reported in district crop statistics. The regression analysis showed that the hyper-temporal NDVI analysis approach explained 74% and 76% of the variability in reported crop statistics in two rice crop and three
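
    As a schematic illustration of the clustering step described above (not the authors' processing chain), the sketch below groups per-pixel NDVI time series with k-means from scikit-learn; ISODATA, the algorithm actually used in the study, is not available in scikit-learn, so k-means stands in for it, and synthetic profiles replace the SPOT composites.

      # Minimal sketch: cluster per-pixel NDVI time series into
      # spectral-temporal classes. ISODATA is not in scikit-learn, so
      # k-means is used as a stand-in; the cluster count would normally
      # be tuned as described in the abstract.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)

      # Synthetic hyper-temporal stack: 1000 pixels x 36 ten-day NDVI composites.
      t = np.linspace(0, 2 * np.pi, 36)
      rice = 0.5 + 0.3 * np.sin(2 * t)      # double-season profile
      forest = 0.7 + 0.05 * np.sin(t)       # stable, high NDVI
      ndvi = np.vstack([
          rice + 0.05 * rng.standard_normal((500, t.size)),
          forest + 0.05 * rng.standard_normal((500, t.size)),
      ])

      kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(ndvi)

      # Each cluster's mean profile is the "NDVI class" that would then be
      # characterized with crop-cycle and flooding information.
      for k in range(2):
          print(k, np.round(kmeans.cluster_centers_[k][:6], 2))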

  20. Quantitative architectural analysis: a new approach to cortical mapping.

    PubMed

    Schleicher, A; Palomero-Gallagher, N; Morosan, P; Eickhoff, S B; Kowalski, T; de Vos, K; Amunts, K; Zilles, K

    2005-12-01

    Recent progress in anatomical and functional MRI has revived the demand for a reliable, topographic map of the human cerebral cortex. To date, interpretations of specific activations found in functional imaging studies and their topographical analysis in a spatial reference system are often still based on classical architectonic maps. The most commonly used reference atlas is that of Brodmann and his successors, despite its severe inherent drawbacks. One obvious weakness in traditional, architectural mapping is the subjective nature of localising borders between cortical areas by means of a purely visual, microscopical examination of histological specimens. To overcome this limitation, more objective, quantitative mapping procedures have been established in recent years. The quantification of the neocortical, laminar pattern by defining intensity line profiles across the cortical layers has a long tradition. Over the last years, this method has been extended to enable a reliable, reproducible mapping of the cortex based on image analysis and multivariate statistics. Methodological approaches to such algorithm-based, cortical mapping were published for various architectural modalities. In our contribution, principles of algorithm-based mapping are described for cyto- and receptorarchitecture. In a cytoarchitectural parcellation of the human auditory cortex, using a sliding window procedure, the classical areal pattern of the human superior temporal gyrus was modified by replacing Brodmann's areas 41, 42, 22, and parts of area 21 with a novel, more detailed map. An extension and optimisation of the sliding window procedure to the specific requirements of receptorarchitectonic mapping is also described, using the macaque central sulcus and adjacent superior parietal lobule as a second, biologically independent example. Algorithm-based mapping procedures, however, are not limited to these two architectural modalities, but can be applied to all images in
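
    The border-detection idea can be sketched in a minimal form, assuming each laminar intensity profile has already been reduced to a feature vector; a Mahalanobis-type distance between adjacent blocks of profiles in a sliding window is one distance measure used in this literature, and the data below are synthetic, so this is an illustration rather than the published procedure.

      # Sketch of a sliding-window border-detection step, assuming each
      # cortical intensity profile is already a feature vector (e.g. moments
      # of the laminar intensity curve). Borders are suggested where the
      # distance between adjacent blocks of profiles peaks.
      import numpy as np

      def mahalanobis_block_distance(block_a, block_b):
          """Distance between mean feature vectors of two profile blocks,
          scaled by their pooled covariance."""
          diff = block_a.mean(axis=0) - block_b.mean(axis=0)
          pooled = 0.5 * (np.cov(block_a, rowvar=False) + np.cov(block_b, rowvar=False))
          return float(diff @ np.linalg.pinv(pooled) @ diff)

      rng = np.random.default_rng(1)
      # 200 profiles x 5 features, with a simulated areal border at profile 120.
      profiles = rng.normal(0.0, 1.0, size=(200, 5))
      profiles[120:] += 1.5

      window = 20
      dist = [
          mahalanobis_block_distance(profiles[i - window:i], profiles[i:i + window])
          for i in range(window, profiles.shape[0] - window)
      ]
      print("suggested border near profile", window + int(np.argmax(dist)))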

  1. Geoscience data visualization and analysis using GeoMapApp

    NASA Astrophysics Data System (ADS)

    Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha

    2013-04-01

    Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs to provide functionality and direct access to data when a network connection is not possible. Hundreds of data sets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data, images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and for quantitatively interrogating data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data and can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). Once data are loaded in GeoMapApp, a variety of options is provided to export data and/or 2D/3D

  2. Weighted analysis methods for mapped plot forest inventory data: Tables, regressions, maps and graphs

    Treesearch

    Paul C. Van Deusen; Linda S. Heath

    2010-01-01

    Weighted estimation methods for analysis of mapped plot forest inventory data are discussed. The appropriate weighting scheme can vary depending on the type of analysis and graphical display. Both statistical issues and user expectations need to be considered in these methods. A weighting scheme is proposed that balances statistical considerations and the logical...

  3. Side forces on forebodies at high angles of attack and Mach numbers from 0.1 to 0.7: two tangent ogives, paraboloid and cone

    NASA Technical Reports Server (NTRS)

    Keener, E. R.; Chapman, G. T.; Taleghani, J.; Cohen, L.

    1977-01-01

    An experimental investigation was conducted in the Ames 12-Foot Wind Tunnel to determine the subsonic aerodynamic characteristics of four forebodies at high angles of attack. The forebodies tested were a tangent ogive with fineness ratio of 5, a paraboloid with fineness ratio of 3.5, a 20 deg cone, and a tangent ogive with an elliptic cross section. The investigation included the effects of nose bluntness and boundary-layer trips. The tangent-ogive forebody was also tested in the presence of a short afterbody and with the afterbody attached. Static longitudinal and lateral/directional stability data were obtained. The investigation was conducted to examine the existence of large side forces and yawing moments at high angles of attack and zero sideslip. It was found that all of the forebodies experience steady side forces that start at angles of attack from 20 deg to 35 deg and exist to as high as 80 deg, depending on forebody shape. The side force is as large as 1.6 times the normal force and is generally repeatable with increasing and decreasing angle of attack and, also, from test to test. The side force is very sensitive to the nature of the boundary layer, as indicated by large changes with boundary-layer trips. The maximum side force varies considerably with Reynolds number and tends to decrease with increasing Mach number. The direction of the side force is sensitive to the body geometry near the nose. The angle of attack of onset of side force is not strongly influenced by Reynolds number or Mach number but varies with forebody shape. Maximum normal force often occurs at angles of attack near 60 deg. The effect of the elliptic cross section is to reduce the angle of onset by about 10 deg compared to that of an equivalent circular forebody with the same fineness ratio. The short afterbody reduces the angle of onset by about 5 deg.

  4. Concept mapping and network analysis: an analytic approach to measure ties among constructs.

    PubMed

    Goldman, Alyssa W; Kane, Mary

    2014-12-01

    Group concept mapping is a mixed-methods approach that helps a group visually represent its ideas on a topic of interest through a series of related maps. The maps and additional graphics are useful for planning, evaluation and theory development. Group concept maps are typically described, interpreted and utilized through points, clusters and distances, and the implications of these features in understanding how constructs relate to one another. This paper focuses on the application of network analysis to group concept mapping to quantify the strength and directionality of relationships among clusters. The authors outline the steps of this analysis, and illustrate its practical use through an organizational strategic planning example. Additional benefits of this analysis to evaluation projects are also discussed, supporting the overall utility of this supplemental technique to the standard concept mapping methodology. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Application of automated multispectral analysis to Delaware's coastal vegetation mapping

    NASA Technical Reports Server (NTRS)

    Klemas, V.; Daiber, F.; Bartlett, D.; Crichton, O.; Fornes, A.

    1973-01-01

    A baseline mapping project was undertaken in Delaware's coastal wetlands as a prelude to an evaluation of the relative value of different parcels of marsh and the setting of priorities for use of these marshes. A description of Delaware's wetlands is given and a mapping approach is discussed together with details concerning an automated analysis. The precision and resolution of the analysis was limited primarily by the quality of the imagery used.

  6. Mapping as a learning strategy in health professions education: a critical analysis.

    PubMed

    Pudelko, Beatrice; Young, Meredith; Vincent-Lamarre, Philippe; Charlin, Bernard

    2012-12-01

    Mapping is a means of representing knowledge in a visual network and is becoming more commonly used as a learning strategy in medical education. The assumption driving the development and use of concept mapping is that it supports and furthers meaningful learning. The goal of this paper was to examine the effectiveness of concept mapping as a learning strategy in health professions education. The authors conducted a critical analysis of recent literature on the use of concept mapping as a learning strategy in the area of health professions education. Among the 65 studies identified, 63% were classified as empirical work, the majority (76%) of which used pre-experimental designs. Only 24% of empirical studies assessed the impact of mapping on meaningful learning. Results of the analysis do not support the hypothesis that mapping per se furthers and supports meaningful learning, memorisation or factual recall. When documented improvements in learning were found, they often occurred when mapping was used in concert with other strategies, such as collaborative learning or instructor modelling, scaffolding and feedback. Current empirical research on mapping as a learning strategy presents methodological shortcomings that limit its internal and external validity. The results of our analysis indicate that mapping strategies that make use of feedback and scaffolding have beneficial effects on learning. Accordingly, we see a need to expand the process of reflection on the characteristics of representational guidance as it is provided by mapping techniques and tools based on field of knowledge, instructional objectives, and the characteristics of learners in health professions education. © Blackwell Publishing Ltd 2012.

  7. TRANSVERSE MERCATOR MAP PROJECTION OF THE SPHEROID USING TRANSFORMATION OF THE ELLIPTIC INTEGRAL

    NASA Technical Reports Server (NTRS)

    Wallis, D. E.

    1994-01-01

    This program produces the Gauss-Kruger (constant meridional scale) Transverse Mercator Projection which is used to construct the U.S. Army's Universal Transverse Mercator (UTM) Grid System. The method is capable of mapping the entire northern hemisphere of the earth (and, by symmetry of the projection, the entire earth) accurately with respect to a single principal meridian, and is therefore mathematically insensitive to proximity either to the pole or the equator, or to the departure of the meridian from the central meridian. This program could be useful to any map-making agency. The program overcomes the limitations of the "series" method (Thomas, 1952) presently used to compute the UTM Grid, specifically its complicated derivation, non-convergence near the pole, lack of rigorous error analysis, and difficulty of obtaining increased accuracy. The method is based on the principle that the parametric colatitude of a point is the amplitude of the Elliptic Integral of the 2nd Kind, and this (irreducible) integral is the desired projection. Thus, a specification of the colatitude leads, most directly (and with strongest motivation) to a formulation in terms of amplitude. The most difficult problem to be solved was setting up the method so that the Elliptic Integral of the 2nd Kind could be used elsewhere than on the principal meridian. The point to be mapped is specified in conventional geographic coordinates (geodetic latitude and longitudinal departure from the principal meridian). Using the colatitude (complement of latitude) and the longitude (departure), the initial step is to map the point to the North Polar Stereographic Projection. The closed-form, analytic function that coincides with the North Polar Stereographic Projection of the spheroid along the principal meridian is put into a Newton-Raphson iteration that solves for the tangent of one half the parametric colatitude, generalized to the complex plane. Because the parametric colatitude is the amplitude of
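
    Along the principal meridian, the identity stated in the abstract can be checked directly: the meridian arc measured from the pole equals a·E(θp | e²), with the parametric colatitude θp as the amplitude of the incomplete elliptic integral of the second kind. The sketch below evaluates this with SciPy and WGS84-like constants; it only illustrates that identity and is not the program's off-meridian, complex-plane algorithm.

      # Illustration of the abstract's key identity along the central meridian:
      # the meridian arc from the pole equals A * E(theta_p | e^2), where
      # theta_p is the parametric (reduced) colatitude. Off-meridian points,
      # handled in the paper via a complex-plane Newton-Raphson step, are not
      # covered here. Ellipsoid constants are WGS84-like assumptions.
      import numpy as np
      from scipy.special import ellipe, ellipeinc

      A = 6378137.0                  # semi-major axis [m]
      F = 1.0 / 298.257223563        # flattening
      E2 = F * (2.0 - F)             # first eccentricity squared

      def meridian_northing(lat_deg):
          """Distance from the equator along the central meridian [m]."""
          phi = np.radians(lat_deg)
          beta = np.arctan((1.0 - F) * np.tan(phi))   # parametric latitude
          colat = np.pi / 2.0 - beta                  # parametric colatitude
          # arc from pole = A * E(colat | E2); subtract from the quarter meridian
          return A * (ellipe(E2) - ellipeinc(colat, E2))

      for lat in (0.0, 45.0, 90.0):
          print(f"{lat:5.1f} deg -> {meridian_northing(lat):12.1f} m")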

  8. Multiresolution and Explicit Methods for Vector Field Analysis and Visualization

    NASA Technical Reports Server (NTRS)

    1996-01-01

    We first report on our current progress in the area of explicit methods for tangent curve computation. The basic idea of this method is to decompose the domain into a collection of triangles (or tetrahedra) and assume linear variation of the vector field over each cell. With this assumption, the equations which define a tangent curve become a system of linear, constant coefficient ODE's which can be solved explicitly. There are five different representations of the solution depending on the eigenvalues of the Jacobian. The analysis of these five cases is somewhat similar to the phase plane analysis often associated with critical point classification within the context of topological methods, but it is not exactly the same. There are some critical differences. Moving from one cell to the next as a tangent curve is tracked requires the computation of the exit point, which is an intersection of the solution of the constant coefficient ODE and the edge of a triangle. There are two possible approaches to this root computation problem. We can express the tangent curve in parametric form and substitute into an implicit form for the edge, or we can express the edge in parametric form and substitute into an implicit form of the tangent curve. Normally the solution of a system of ODE's is given in parametric form, and so the first approach is the most accessible and straightforward. The second approach requires the 'implicitization' of these parametric curves. The implicitization of parametric curves can often be rather difficult, but in this case we have been successful and have been able to develop algorithms and subsequent computer programs for both approaches. We will give these details along with some comparisons in a forthcoming research paper on this topic.
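
    A minimal sketch of the "explicit solution" idea on a single linear cell: for v(x) = A x + b with constant A and b, the tangent curve through x0 has the closed form x(t) = x* + exp(At)(x0 - x*). The matrix A, offset b and seed point below are illustrative assumptions, and the paper's five eigenvalue cases and cell-exit root finding are not reproduced.

      # Explicit tangent-curve integration for a linear vector field
      # v(x) = A x + b on a single cell, evaluated with the matrix
      # exponential; the case-by-case eigenvalue forms and the triangle-exit
      # (root-finding) step are not reproduced here.
      import numpy as np
      from scipy.linalg import expm

      A = np.array([[0.0, -1.0],
                    [1.0, -0.2]])     # assumed Jacobian of the linear field
      b = np.array([0.5, 0.0])
      x0 = np.array([1.0, 0.0])       # seed point of the tangent curve

      x_star = np.linalg.solve(A, -b)  # critical point (A assumed non-singular)

      def tangent_curve(t):
          """x(t) = x* + expm(A t) (x0 - x*) solves dx/dt = A x + b, x(0) = x0."""
          return x_star + expm(A * t) @ (x0 - x_star)

      for t in (0.0, 0.5, 1.0, 2.0):
          print(f"t={t:3.1f}  x={np.round(tangent_curve(t), 4)}")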

  9. Wavelet analysis of polarization maps of polycrystalline biological fluids networks

    NASA Astrophysics Data System (ADS)

    Ushenko, Y. A.

    2011-12-01

    An optical model of human joint synovial fluid is proposed. The statistical (statistical moments), correlation (autocorrelation function) and self-similar (log-log dependences of the power spectrum) structure of two-dimensional polarization distributions (polarization maps) of synovial fluid has been analyzed. It has been shown that differentiation of polarization maps of joint synovial fluid samples in different physiological states calls for scale-discriminative analysis. To mark out the small-scale domain structure of the synovial fluid polarization maps, wavelet analysis has been used. The set of parameters characterizing the statistical, correlation and self-similar structure of the wavelet-coefficient distributions at different scales of polarization domains has been determined for the diagnostics and differentiation of polycrystalline-network transformations connected with pathological processes.

  10. A web-based tool for groundwater mapping and drought analysis

    NASA Astrophysics Data System (ADS)

    Christensen, S.; Burns, M.; Jones, N.; Strassberg, G.

    2012-12-01

    In 2011-2012, the state of Texas saw the worst one-year drought on record. Fluctuations in gravity measured by GRACE satellites indicate that as much as 100 cubic kilometers of water was lost during this period. Much of this came from reservoirs and shallow soil moisture, but a significant amount came from aquifers. In response to this crisis, a Texas Drought Technology Steering Committee (TDTSC) consisting of academics and water managers was formed to develop new tools and strategies to assist the state in monitoring, predicting, and responding to drought events. In this presentation, we describe one of the tools that was developed as part of this effort. When analyzing the impact of drought on groundwater levels, it is fairly common to examine time series data at selected monitoring wells. However, accurately assessing impacts and trends requires both spatial and temporal analysis involving the development of detailed water level maps at various scales. Creating such maps in a flexible and rapid fashion is critical for effective drought analysis, but can be challenging due to the massive amounts of data involved and the processing required to generate such maps. Furthermore, wells are typically not sampled at the same points in time, and so developing a water table map for a particular date requires both spatial and temporal interpolation of water elevations. To address this challenge, a Cloud-based water level mapping system was developed for the state of Texas. The system is based on the Texas Water Development Board (TWDB) groundwater database, but can be adapted to use other databases as well. The system involves a set of ArcGIS workflows running on a server with a web-based front end and a Google Earth plug-in. A temporal interpolation geoprocessing tool was developed to estimate the piezometric heads for all wells in a given region at a specific date using a regression analysis. This interpolation tool is coupled with other geoprocessing tools to filter
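
    A minimal sketch of the temporal-interpolation step described above: each well's head observations are regressed against time and evaluated at a common target date before any spatial interpolation. Well IDs, dates and heads below are invented for illustration; this is not the TWDB geoprocessing tool itself.

      # Per-well temporal regression: wells are sampled on different dates,
      # so a head-versus-time fit is evaluated at a common target date
      # before building the water-table map. Data are made up.
      import numpy as np

      observations = {
          # well_id: (decimal years of samples, heads in feet)
          "W1": ([2009.3, 2010.1, 2011.6, 2012.0], [321.0, 318.5, 310.2, 305.9]),
          "W2": ([2008.7, 2010.9, 2011.8], [154.2, 151.0, 148.3]),
      }
      target_date = 2011.5

      heads_at_target = {}
      for well, (years, heads) in observations.items():
          slope, intercept = np.polyfit(years, heads, deg=1)   # linear trend
          heads_at_target[well] = slope * target_date + intercept

      print(heads_at_target)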

  11. New Mexico Play Fairway Analysis: Particle Tracking ArcGIS Map Packages

    DOE Data Explorer

    Jeff Pepin

    2015-11-15

    These are map packages used to visualize geochemical particle-tracking analysis results in ArcGIS. It includes individual map packages for several regions of New Mexico including: Acoma, Rincon, Gila, Las Cruces, Socorro and Truth or Consequences.

  12. High-density genetic map construction and comparative genome analysis in asparagus bean.

    PubMed

    Huang, Haitao; Tan, Huaqiang; Xu, Dongmei; Tang, Yi; Niu, Yisong; Lai, Yunsong; Tie, Manman; Li, Huanxiu

    2018-03-19

    Genetic maps are a prerequisite for quantitative trait locus (QTL) analysis, marker-assisted selection (MAS), fine gene mapping, and assembly of genome sequences. So far, several asparagus bean linkage maps have been established using various kinds of molecular markers. However, these maps were all constructed by gel- or array-based markers. No maps based on sequencing method have been reported. In this study, an NGS-based strategy, SLAF-seq, was applied to create a high-density genetic map for asparagus bean. Through SLAF library construction and Illumina sequencing of two parents and 100 F2 individuals, a total of 55,437 polymorphic SLAF markers were developed and mined for SNP markers. The map consisted of 5,225 SNP markers in 11 LGs, spanning a total distance of 1,850.81 cM, with an average distance between markers of 0.35 cM. Comparative genome analysis with four other legume species, soybean, common bean, mung bean and adzuki bean showed that asparagus bean is genetically more related to adzuki bean. The results will provide a foundation for future genomic research, such as QTL fine mapping, comparative mapping in pulses, and offer support for assembling asparagus bean genome sequence.

  13. Prior-knowledge-based spectral mixture analysis for impervious surface mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jinshui; He, Chunyang; Zhou, Yuyu

    2014-01-03

    In this study, we developed a prior-knowledge-based spectral mixture analysis (PKSMA) to map impervious surfaces by using endmembers derived separately for high- and low-density urban regions. First, an urban area was categorized into high- and low-density urban areas, using a multi-step classification method. Next, in high-density urban areas that were assumed to have only vegetation and impervious surfaces (ISs), the Vegetation-Impervious model (V-I) was used in a spectral mixture analysis (SMA) with three endmembers: vegetation, high albedo, and low albedo. In low-density urban areas, the Vegetation-Impervious-Soil model (V-I-S) was used in an SMA analysis with four endmembers: high albedo, low albedo, soil, and vegetation. The fraction of IS with high and low albedo in each pixel was combined to produce the final IS map. The root mean-square error (RMSE) of the IS map produced using PKSMA was about 11.0%, compared to 14.52% using four-endmember SMA. Particularly in high-density urban areas, PKSMA (RMSE = 6.47%) showed better performance than four-endmember SMA (15.91%). The results indicate that PKSMA can improve IS mapping compared to traditional SMA by using appropriately selected endmembers and is particularly strong in high-density urban areas.
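
    The unmixing step underlying both the V-I and V-I-S models can be sketched as a constrained linear inversion per pixel; the example below uses non-negative least squares with a sum-to-one renormalization and synthetic endmember spectra, and is only a schematic stand-in for the published PKSMA workflow.

      # Minimal spectral-mixture-analysis step: estimate per-pixel endmember
      # fractions subject to non-negativity, then renormalize to sum to one.
      # Endmember spectra are synthetic placeholders; the high/low-density
      # stratification is represented only by the choice of columns in E.
      import numpy as np
      from scipy.optimize import nnls

      # Columns: vegetation, high-albedo impervious, low-albedo impervious (V-I model).
      E = np.array([
          [0.05, 0.60, 0.10],
          [0.08, 0.62, 0.11],
          [0.45, 0.65, 0.12],
          [0.50, 0.68, 0.13],
      ])  # 4 synthetic bands x 3 endmembers

      pixel = 0.5 * E[:, 0] + 0.3 * E[:, 1] + 0.2 * E[:, 2]  # known mixture

      fractions, _ = nnls(E, pixel)          # non-negative least squares
      fractions /= fractions.sum()           # approximate sum-to-one constraint
      impervious_fraction = fractions[1] + fractions[2]
      print(np.round(fractions, 3), round(float(impervious_fraction), 3))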

  14. A technique for conducting point pattern analysis of cluster plot stem-maps

    Treesearch

    C.W. Woodall; J.M. Graham

    2004-01-01

    Point pattern analysis of forest inventory stem-maps may aid interpretation and inventory estimation of forest attributes. To evaluate the techniques and benefits of conducting point pattern analysis of forest inventory stem-maps, Ripley's K(t) was calculated for simulated tree spatial distributions and for over 600 USDA Forest Service Forest...
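
    For reference, a naive Ripley's K(t) estimate (no edge correction, unit-square plot, simulated tree locations) can be computed as below; under complete spatial randomness K(t) is approximately pi*t^2, so departures indicate clustering or regularity. This is a textbook illustration, not the study's cluster-plot estimator.

      # Naive Ripley's K(t) estimate (no edge correction) for a simulated
      # stem map on a unit-square plot.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 200
      area = 1.0
      points = rng.uniform(0.0, 1.0, size=(n, 2))   # simulated tree locations

      def ripley_k(points, t):
          d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
          np.fill_diagonal(d, np.inf)               # exclude self-pairs
          return area * np.sum(d < t) / (n * (n - 1))

      for t in (0.05, 0.10, 0.15):
          print(f"t={t:.2f}  K={ripley_k(points, t):.4f}  CSR~{np.pi * t**2:.4f}")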

  15. ReadXplorer—visualization and analysis of mapped sequences

    PubMed Central

    Hilker, Rolf; Stadermann, Kai Bernd; Doppmeier, Daniel; Kalinowski, Jörn; Stoye, Jens; Straube, Jasmin; Winnebald, Jörn; Goesmann, Alexander

    2014-01-01

    Motivation: Fast algorithms and well-arranged visualizations are required for the comprehensive analysis of the ever-growing size of genomic and transcriptomic next-generation sequencing data. Results: ReadXplorer is a software offering straightforward visualization and extensive analysis functions for genomic and transcriptomic DNA sequences mapped on a reference. A unique specialty of ReadXplorer is the quality classification of the read mappings. It is incorporated in all analysis functions and displayed in ReadXplorer's various synchronized data viewers for (i) the reference sequence, its base coverage as (ii) normalizable plot and (iii) histogram, (iv) read alignments and (v) read pairs. ReadXplorer's analysis capability covers RNA secondary structure prediction, single nucleotide polymorphism and deletion–insertion polymorphism detection, genomic feature and general coverage analysis. Especially for RNA-Seq data, it offers differential gene expression analysis, transcription start site and operon detection as well as RPKM value and read count calculations. Furthermore, ReadXplorer can combine or superimpose coverage of different datasets. Availability and implementation: ReadXplorer is available as open-source software at http://www.readxplorer.org along with a detailed manual. Contact: rhilker@mikrobio.med.uni-giessen.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24790157

  16. SU-F-T-380: Comparing the Effect of Respiration On Dose Distribution Between Conventional Tangent Pair and IMRT Techniques for Adjuvant Radiotherapy in Early Stage Breast Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, M; Ramaseshan, R

    2016-06-15

    Purpose: In this project, we compared the conventional tangent pair technique to the IMRT technique by analyzing the dose distribution. We also investigated the effect of respiration on planning target volume (PTV) dose coverage in both techniques. Methods: In order to implement the IMRT technique, a template-based planning protocol, dose constraints and treatment process were developed. Two open fields with optimized field weights were combined with two beamlet optimization fields in IMRT plans. We compared the dose distribution between the standard tangential pair and IMRT. The improvement in dose distribution was measured by parameters such as the conformity index, homogeneity index and coverage index. Another end point was whether the IMRT technique would reduce the planning time for staff. The effect of the patient's respiration on dose distribution was also estimated. The four-dimensional computed tomography (4DCT) for different phases of the breathing cycle was used to evaluate the effect of respiration on the IMRT-planned dose distribution. Results: We have accumulated 10 patients who underwent 4DCT and were planned with both techniques. Based on the preliminary analysis, the dose distribution in the IMRT technique was better than in the conventional tangent pair technique. Furthermore, the effect of respiration in the IMRT plan was not significant, as evident from the 95% isodose line coverage of the PTV drawn on all phases of 4DCT. Conclusion: Based on the 4DCT images, the breathing effect on dose distribution was smaller than what we expected. We suspect that there are two reasons. First, the PTV movement due to respiration was not significant. It might be because we used a tilted breast board to set up patients. Second, the open fields with optimized field weights in the IMRT technique might reduce the breathing effect on dose distribution. A further investigation is necessary.

  17. IsoMAP (Isoscape Modeling, Analysis, and Prediction)

    NASA Astrophysics Data System (ADS)

    Miller, C. C.; Bowen, G. J.; Zhang, T.; Zhao, L.; West, J. B.; Liu, Z.; Rapolu, N.

    2009-12-01

    IsoMAP is a TeraGrid-based web portal aimed at building the infrastructure that brings together distributed multi-scale and multi-format geospatial datasets to enable statistical analysis and modeling of environmental isotopes. A typical workflow enabled by the portal includes (1) data source exploration and selection, (2) statistical analysis and model development; (3) predictive simulation of isotope distributions using models developed in (1) and (2); (4) analysis and interpretation of simulated spatial isotope distributions (e.g., comparison with independent observations, pattern analysis). The gridded models and data products created by one user can be shared and reused among users within the portal, enabling collaboration and knowledge transfer. This infrastructure and the research it fosters can lead to fundamental changes in our knowledge of the water cycle and ecological and biogeochemical processes through analysis of network-based isotope data, but it will be important A) that those with whom the data and models are shared can be sure of the origin, quality, inputs, and processing history of these products, and B) the system is agile and intuitive enough to facilitate this sharing (rather than just ‘allow’ it). IsoMAP researchers are therefore building into the portal’s architecture several components meant to increase the amount of metadata about users’ products and to repurpose those metadata to make sharing and discovery more intuitive and robust to both expected, professional users as well as unforeseeable populations from other sectors.

  18. Quantification of soil mapping by digital analysis of LANDSAT data. [Clinton County, Indiana

    NASA Technical Reports Server (NTRS)

    Kirschner, F. R.; Kaminsky, S. A.; Hinzel, E. J.; Sinclair, H. R.; Weismiller, R. A.

    1977-01-01

    Soil survey mapping units are designed such that the dominant soil represents the major proportion of the unit. At times, soil mapping delineations do not adequately represent conditions as stated in the mapping unit descriptions. Digital analysis of LANDSAT multispectral scanner (MSS) data provides a means of accurately describing and quantifying soil mapping unit composition. Digital analysis of LANDSAT MSS data collected on 9 June 1973 was used to prepare a spectral soil map for a 430-hectare area in Clinton County, Indiana. Fifteen spectral classes were defined, representing 12 soil and 3 vegetation classes. The 12 soil classes were grouped into 4 moisture regimes based upon their spectral responses; the 3 vegetation classes were grouped into one all-inclusive class.

  19. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 micrometer(s) to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.

  20. High-resolution melt analysis to identify and map sequence-tagged site anchor points onto linkage maps: a white lupin (Lupinus albus) map as an exemplar.

    PubMed

    Croxford, Adam E; Rogers, Tom; Caligari, Peter D S; Wilkinson, Michael J

    2008-01-01

    * The provision of sequence-tagged site (STS) anchor points allows meaningful comparisons between mapping studies but can be a time-consuming process for nonmodel species or orphan crops. * Here, the first use of high-resolution melt analysis (HRM) to generate STS markers for use in linkage mapping is described. This strategy is rapid and low-cost, and circumvents the need for labelled primers or amplicon fractionation. * Using white lupin (Lupinus albus, x = 25) as a case study, HRM analysis was applied to identify 91 polymorphic markers from expressed sequence tag (EST)-derived and genomic libraries. Of these, 77 generated STS anchor points in the first fully resolved linkage map of the species. The map also included 230 amplified fragment length polymorphisms (AFLP) loci, spanned 1916 cM (84.2% coverage) and divided into the expected 25 linkage groups. * Quantitative trait loci (QTL) analyses performed on the population revealed genomic regions associated with several traits, including the agronomically important time to flowering (tf), alkaloid synthesis and stem height (Ph). Use of HRM-STS markers also allowed us to make direct comparisons between our map and that of the related crop, Lupinus angustifolius, based on the conversion of RFLP, microsatellite and single nucleotide polymorphism (SNP) markers into HRM markers.

  1. A Karnaugh map based approach towards systemic reviews and meta-analysis.

    PubMed

    Hassan, Abdul Wahab; Hassan, Ahmad Kamal

    2016-01-01

    Meta-analysis and systemic reviews have long helped us draw conclusions from numerous parallel or conflicting studies. Existing studies are presented in tabulated forms which contain appropriate information for specific cases, yet they are difficult to visualize. In meta-analysis of data, this can lead to absorption and subsumption errors, with the undesirable potential of consecutive misunderstandings in social and operational methodologies. The purpose of this study is to investigate an alternative form of meta-data presentation that relies on humans' strong pictorial perception capability. Analysis of big data is assumed to be a complex and daunting task often reserved for the computational power of machines, yet there exist mapping tools which can analyze such data in a hand-held manner. Data analysis on such a scale can benefit from the use of statistical tools like Karnaugh maps, where all studies can be put together in a graph-based mapping. Such a formulation can lead to more control in observing patterns in the research community and analyzing them further for uncertainty and reliability metrics. We present a methodological process of converting a well-established study in health care to its equivalent binary representation, followed by furnishing the values onto a Karnaugh map. The data used for the studies presented herein are from Burns et al (J Publ Health 34(1):138-148, 2011), consisting of retrospectively collected data sets from various studies on clinical coding data accuracy. Using a customized filtration process, a total of 25 studies were selected for review, with no, partial, or complete knowledge of six independent variables, thus forming 64 independent cells on a Karnaugh map. The study concluded that this pictorial graphing, as expected, helped simplify the overview of meta-analysis and systemic reviews.
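
    The 64-cell layout mentioned above follows from six binary variables arranged on Gray-coded row and column indices; the sketch below assigns a study, encoded as six 0/1 attributes, to its Karnaugh-map cell. The encoding and attribute order are illustrative assumptions, not the scheme of the cited study.

      # Place a study described by six binary attributes into one of the
      # 64 cells of a 6-variable Karnaugh map. Rows and columns are
      # Gray-coded so that adjacent cells differ in exactly one variable.
      def gray(n):
          return n ^ (n >> 1)

      def kmap_cell(bits):
          """bits: 6 binary values; first 3 select the row, last 3 the column."""
          assert len(bits) == 6 and all(b in (0, 1) for b in bits)
          row_value = bits[0] * 4 + bits[1] * 2 + bits[2]
          col_value = bits[3] * 4 + bits[4] * 2 + bits[5]
          # physical row/column order is Gray-coded, so look up the position
          rows = [gray(i) for i in range(8)]
          cols = [gray(i) for i in range(8)]
          return rows.index(row_value), cols.index(col_value)

      # Example study: knowledge of 6 variables encoded as present(1)/absent(0).
      study = (1, 0, 1, 0, 0, 1)
      print("K-map cell (row, col):", kmap_cell(study))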

  2. Experimental study of the convection in a rotating tangent cylinder

    NASA Astrophysics Data System (ADS)

    Aujogue, Kélig; Pothérat, Alban; Sreenivasan, Binod; Debray, François

    2018-05-01

    This paper experimentally investigates the convection in a fast rotating Tangent Cylinder (TC), for Ekman numbers down to $E=3.36\times10^{-6}$, in a configuration relevant to the liquid core of the Earth. In the apparatus, the TC results from the Proudman-Taylor constraint incurred by rotating a hemispherical fluid vessel heated in its centre by a protruding heating element of cylindrical shape. The resulting convection that develops above the heater, i.e. within the TC, is shown to set in for critical Rayleigh numbers and wavenumbers respectively scaling as $Ra_c\sim E^{4/3}$ and $a_c\sim E^{1/3}$ with the Ekman number $E$. Though exhibiting the same exponents as for plane rotating convection, these laws are indicative of much larger convective plumes at onset. The structure and dynamics of these plumes are in fact closer to those found in solid rotating cylinders heated from below, suggesting that the confinement within the TC induced by the Taylor-Proudman constraint influences convection in a similar way as solid walls would do. There is further similarity in that the critical modes in the TC all exhibit a slow retrograde precession at onset. In supercritical regimes, the precession evolves into a thermal wind with a complex structure featuring retrograde rotation at high latitude and either prograde or retrograde rotation at low latitudes (close to the heater), depending on the criticality and the Ekman number. Nevertheless the intensity of the thermal wind measured by the Rossby number scales as $Ro\sim 0.85(Ra_q^*)^{0.41}$ with the Rayleigh number based on the heat flux $Ra_q^*$. This scaling suggests that the convection in the TC is driven by quasi-geostrophic dynamics, a finding supported by the scaling for the rotation-normalised Nusselt number $Nu^{*} \sim (Ra_{q}^{*})^{5/9}$.

  3. Chaotic map clustering algorithm for EEG analysis

    NASA Astrophysics Data System (ADS)

    Bellotti, R.; De Carlo, F.; Stramaglia, S.

    2004-03-01

    The non-parametric chaotic map clustering algorithm has been applied to the analysis of electroencephalographic signals, in order to recognize the Huntington's disease, one of the most dangerous pathologies of the central nervous system. The performance of the method has been compared with those obtained through parametric algorithms, as K-means and deterministic annealing, and supervised multi-layer perceptron. While supervised neural networks need a training phase, performed by means of data tagged by the genetic test, and the parametric methods require a prior choice of the number of classes to find, the chaotic map clustering gives a natural evidence of the pathological class, without any training or supervision, thus providing a new efficient methodology for the recognition of patterns affected by the Huntington's disease.

  4. Global land cover mapping: a review and uncertainty analysis

    USGS Publications Warehouse

    Congalton, Russell G.; Gu, Jianyu; Yadav, Kamini; Thenkabail, Prasad S.; Ozdogan, Mutlu

    2014-01-01

    Given the advances in remotely sensed imagery and associated technologies, several global land cover maps have been produced in recent times including IGBP DISCover, UMD Land Cover, Global Land Cover 2000 and GlobCover 2009. However, the utility of these maps for specific applications has often been hampered due to considerable amounts of uncertainties and inconsistencies. A thorough review of these global land cover projects including evaluating the sources of error and uncertainty is prudent and enlightening. Therefore, this paper describes our work in which we compared, summarized and conducted an uncertainty analysis of the four global land cover mapping projects using an error budget approach. The results showed that the classification scheme and the validation methodology had the highest error contribution and implementation priority. A comparison of the classification schemes showed that there are many inconsistencies between the definitions of the map classes. This is especially true for the mixed type classes for which thresholds vary for the attributes/discriminators used in the classification process. Examination of these four global mapping projects provided quite a few important lessons for the future global mapping projects including the need for clear and uniform definitions of the classification scheme and an efficient, practical, and valid design of the accuracy assessment.

  5. Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis

    NASA Technical Reports Server (NTRS)

    Duffy, Daniel Q.; Schnase, John L.; Thompson, John H.; Freeman, Shawn M.; Clune, Thomas L.

    2012-01-01

    MapReduce is an approach to high-performance analytics that may be useful to data intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. We are particularly interested in the potential of MapReduce to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we are prototyping a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. Our initial focus has been on averaging operations over arbitrary spatial and temporal extents within Modern Era Retrospective- Analysis for Research and Applications (MERRA) data. Preliminary results suggest this approach can improve efficiencies within data intensive analytic workflows.
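
    The canonical averaging operation mentioned above can be sketched as a toy, in-process map/reduce: mappers emit partial (sum, count) pairs per key and a reducer combines them associatively. The keys and values below are invented and the sketch does not reflect the MERRA or cluster implementation.

      # Toy illustration of the canonical "average" operation in MapReduce
      # style: mappers emit partial (sum, count) pairs for a key (e.g. a
      # spatial box and month), and the reducer merges them associatively.
      records = [
          ("box1-2010-07", 24.1), ("box1-2010-07", 25.3),
          ("box2-2010-07", 18.7), ("box1-2010-07", 23.9),
      ]

      def mapper(record):
          key, value = record
          return key, (value, 1)                      # partial sum and count

      def combiner(a, b):
          return a[0] + b[0], a[1] + b[1]             # associative merge

      partials = {}
      for key, pair in map(mapper, records):
          partials[key] = combiner(partials[key], pair) if key in partials else pair

      averages = {k: s / c for k, (s, c) in partials.items()}
      print(averages)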

  6. Sequence analysis by iterated maps, a review.

    PubMed

    Almeida, Jonas S

    2014-05-01

    Among alignment-free methods, Iterated Maps (IMs) are on a particular extreme: they are also scale free (order free). The use of IMs for sequence analysis is also distinct from other alignment-free methodologies in being rooted in statistical mechanics instead of computational linguistics. Both of these roots go back over two decades to the use of fractal geometry in the characterization of phase-space representations. The time series analysis origin of the field is betrayed by the title of the manuscript that started this alignment-free subdomain in 1990, 'Chaos Game Representation'. The clash between the analysis of sequences as continuous series and the better established use of Markovian approaches to discrete series was almost immediate, with a defining critique published in the same journal 2 years later. The rest of that decade would go by before the scale-free nature of the IM space was uncovered. The ensuing decade saw this scalability generalized for non-genomic alphabets as well as an interest in its use for graphic representation of biological sequences. Finally, in the past couple of years, in step with the emergence of BigData and MapReduce as a new computational paradigm, there is a surprising third act in the IM story. Multiple reports have described gains in computational efficiency of multiple orders of magnitude over more conventional sequence analysis methodologies. The stage appears to be now set for a recasting of IMs with a central role in processing next-generation sequencing results.
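
    A minimal Chaos Game Representation, the iterated map named in the review, is sketched below for a short DNA string: each nucleotide is assigned a corner of the unit square and the point moves halfway toward that corner at every step. The example sequence is arbitrary.

      # Minimal Chaos Game Representation (CGR) of a DNA sequence: each point
      # moves halfway from the current position toward the corner assigned to
      # the next nucleotide, giving the scale-free iterated-map coordinates
      # discussed in the review.
      CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

      def cgr(sequence):
          x, y = 0.5, 0.5                  # start at the centre of the square
          coords = []
          for base in sequence.upper():
              cx, cy = CORNERS[base]
              x, y = (x + cx) / 2.0, (y + cy) / 2.0
              coords.append((x, y))
          return coords

      for base, (x, y) in zip("ACGTTGCA", cgr("ACGTTGCA")):
          print(base, round(x, 4), round(y, 4))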

  7. Global Modeling and Data Assimilation. Volume 11; Documentation of the Tangent Linear and Adjoint Models of the Relaxed Arakawa-Schubert Moisture Parameterization of the NASA GEOS-1 GCM; 5.2

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); Yang, Wei-Yu; Todling, Ricardo; Navon, I. Michael

    1997-01-01

    A detailed description of the development of the tangent linear model (TLM) and its adjoint model of the Relaxed Arakawa-Schubert moisture parameterization package used in the NASA GEOS-1 C-Grid GCM (Version 5.2) is presented. The notational conventions used in the TLM and its adjoint codes are described in detail.
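
    The TLM/adjoint relationship documented in the report can be illustrated on a toy nonlinear map together with the standard dot-product consistency check <M'dx, y> = <dx, M'^T y>; the sketch below is only that illustration and bears no relation to the GEOS-1 code itself.

      # Toy tangent linear model (TLM) and adjoint of a simple nonlinear map,
      # with the dot-product consistency check used when validating such code.
      import numpy as np

      def model(x):
          """Toy nonlinear 'parameterization': x -> (x0*x1, sin(x1))."""
          return np.array([x[0] * x[1], np.sin(x[1])])

      def jacobian(x):
          return np.array([[x[1], x[0]],
                           [0.0, np.cos(x[1])]])

      def tlm(x, dx):
          """Jacobian-vector product (tangent linear model) at state x."""
          return jacobian(x) @ dx

      def adjoint(x, y):
          """Transpose-Jacobian-vector product (adjoint model) at state x."""
          return jacobian(x).T @ y

      rng = np.random.default_rng(0)
      x, dx, y = rng.standard_normal((3, 2))
      lhs = np.dot(tlm(x, dx), y)
      rhs = np.dot(dx, adjoint(x, y))
      print(f"dot-product test: {lhs:.12f} vs {rhs:.12f}")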

  8. Children's Understanding of Large-Scale Mapping Tasks: An Analysis of Talk, Drawings, and Gesture

    ERIC Educational Resources Information Center

    Kotsopoulos, Donna; Cordy, Michelle; Langemeyer, Melanie

    2015-01-01

    This research examined how children represent motion in large-scale mapping tasks that we referred to as "motion maps". The underlying mathematical content was transformational geometry. In total, 19 children, 8 to 10 years old, created motion maps and digitally captured them with accompanying verbal descriptions. Analysis of…

  9. Historical shoreline mapping (II): Application of the Digital Shoreline Mapping and Analysis Systems (DSMS/DSAS) to shoreline change mapping in Puerto Rico

    USGS Publications Warehouse

    Thieler, E. Robert; Danforth, William W.

    1994-01-01

    A new, state-of-the-art method for mapping historical shorelines from maps and aerial photographs, the Digital Shoreline Mapping System (DSMS), has been developed. The DSMS is a freely available, public domain software package that meets the cartographic and photogrammetric requirements of precise coastal mapping, and provides a means to quantify and analyze different sources of error in the mapping process. The DSMS is also capable of resolving imperfections in aerial photography that commonly are assumed to be nonexistent. The DSMS utilizes commonly available computer hardware and software, and permits the entire shoreline mapping process to be executed rapidly by a single person in a small lab. The DSMS generates output shoreline position data that are compatible with a variety of Geographic Information Systems (GIS). A second suite of programs, the Digital Shoreline Analysis System (DSAS) has been developed to calculate shoreline rates-of-change from a series of shoreline data residing in a GIS. Four rate-of-change statistics are calculated simultaneously (end-point rate, average of rates, linear regression and jackknife) at a user-specified interval along the shoreline using a measurement baseline approach. An example of DSMS and DSAS application using historical maps and air photos of Punta Uvero, Puerto Rico provides a basis for assessing the errors associated with the source materials as well as the accuracy of computed shoreline positions and erosion rates. The maps and photos used here represent a common situation in shoreline mapping: marginal-quality source materials. The maps and photos are near the usable upper limit of scale and accuracy, yet the shoreline positions are still accurate ±9.25 m when all sources of error are considered. This level of accuracy yields a resolution of ±0.51 m/yr for shoreline rates-of-change in this example, and is sufficient to identify the short-term trend (36 years) of shoreline change in the study area.
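
    Two of the four rate-of-change statistics named above (end-point rate and linear regression) reduce to simple fits of baseline-referenced shoreline positions against survey dates; the sketch below uses invented positions for one transect and omits the average-of-rates and jackknife statistics.

      # End-point rate and linear-regression rate for a single transect.
      # Positions are distances (m) from a measurement baseline; data invented.
      import numpy as np

      years = np.array([1936.0, 1951.0, 1963.0, 1972.0, 1987.0])
      positions = np.array([102.0, 96.5, 93.0, 88.0, 80.5])   # metres from baseline

      end_point_rate = (positions[-1] - positions[0]) / (years[-1] - years[0])
      linear_regression_rate = np.polyfit(years, positions, deg=1)[0]

      print(f"end-point rate:         {end_point_rate:.2f} m/yr")
      print(f"linear-regression rate: {linear_regression_rate:.2f} m/yr")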

  10. Drainage identification analysis and mapping, phase 2 : technical brief.

    DOT National Transportation Integrated Search

    2017-01-01

    This research studied, tested and rectified the compatibility issue related to the recent upgrades of : NJDOT vendor inspection software, and uploaded all collected data to make Drainage Identification : Analysis and Mapping System (DIAMS) current an...

  11. Hyperspectral imaging—An advanced instrument concept for the EnMAP mission (Environmental Mapping and Analysis Programme)

    NASA Astrophysics Data System (ADS)

    Stuffler, Timo; Förster, Klaus; Hofer, Stefan; Leipold, Manfred; Sang, Bernhard; Kaufmann, Hermann; Penné, Boris; Mueller, Andreas; Chlebek, Christian

    2009-10-01

    In the upcoming generation of satellite sensors, hyperspectral instruments will play a significant role, and this payload type is being considered worldwide in future mission planning. Our team has now successfully finalized the Phase B study for the advanced hyperspectral mission EnMAP (Environmental Mapping and Analysis Programme), Germany's next optical satellite, scheduled for launch in 2012. GFZ in Potsdam has the scientific lead on EnMAP; Kayser-Threde in Munich is the industrial prime. The EnMAP instrument provides over 240 continuous spectral bands in the wavelength range between 420 and 2450 nm with a ground resolution of 30 m×30 m. Thus, the broad science and application community can draw from an extensive and highly resolved pool of information supporting the modeling and optimization of their results. The performance of the hyperspectral instrument allows for detailed monitoring, characterization and parameter extraction of rock/soil targets, vegetation, and inland and coastal waters on a global scale, supporting a wide variety of applications in agriculture, forestry, water management and geology. The operation of an airborne system (ARES) as an element in the HGF hyperspectral network, together with the ongoing evolution of data handling and extraction procedures, will support the later inclusion of EnMAP into the growing scientist and user communities.

  12. Improved disparity map analysis through the fusion of monocular image segmentations

    NASA Technical Reports Server (NTRS)

    Perlant, Frederic P.; Mckeown, David M.

    1991-01-01

    The focus is to examine how estimates of three-dimensional scene structure, as encoded in a scene disparity map, can be improved by analysis of the original monocular imagery. Surface illumination information is exploited by segmenting the monocular image into fine surface patches of nearly homogeneous intensity, which are used to remove mismatches generated during stereo matching. These patches guide a statistical analysis of the disparity map based on the assumption that such patches correspond closely with physical surfaces in the scene. Such a technique is quite independent of whether the initial disparity map was generated by automated area-based or feature-based stereo matching. Stereo analysis results are presented on a complex urban scene containing various man-made and natural features. This scene contains a variety of problems, including low building height with respect to the stereo baseline, buildings and roads in complex terrain, and highly textured buildings and terrain. Improvements due to monocular fusion with a set of different region-based image segmentations are demonstrated. The generality of this approach to stereo analysis and its utility in the development of general three-dimensional scene interpretation systems are also discussed.

  13. MetaMapR: pathway independent metabolomic network analysis incorporating unknowns.

    PubMed

    Grapov, Dmitry; Wanichthanarak, Kwanjeera; Fiehn, Oliver

    2015-08-15

    Metabolic network mapping is a widely used approach for integration of metabolomic experimental results with biological domain knowledge. However, current approaches can be limited by biochemical domain or pathway knowledge which results in sparse disconnected graphs for real world metabolomic experiments. MetaMapR integrates enzymatic transformations with metabolite structural similarity, mass spectral similarity and empirical associations to generate richly connected metabolic networks. This open source, web-based or desktop software, written in the R programming language, leverages KEGG and PubChem databases to derive associations between metabolites even in cases where biochemical domain or molecular annotations are unknown. Network calculation is enhanced through an interface to the Chemical Translation System, which allows metabolite identifier translation between >200 common biochemical databases. Analysis results are presented as interactive visualizations or can be exported as high-quality graphics and numerical tables which can be imported into common network analysis and visualization tools. Freely available at http://dgrapov.github.io/MetaMapR/. Requires R and a modern web browser. Installation instructions, tutorials and application examples are available at http://dgrapov.github.io/MetaMapR/. ofiehn@ucdavis.edu. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  14. Mapping broom snakeweed through image analysis of color-infrared photography and digital imagery.

    PubMed

    Everitt, J H; Yang, C

    2007-11-01

    A study was conducted on a south Texas rangeland area to evaluate aerial color-infrared (CIR) photography and CIR digital imagery combined with unsupervised image analysis techniques to map broom snakeweed [Gutierrezia sarothrae (Pursh.) Britt. and Rusby]. Accuracy assessments performed on computer-classified maps of photographic images from two sites had mean producer's and user's accuracies for broom snakeweed of 98.3 and 88.3%, respectively; whereas, accuracy assessments performed on classified maps from digital images of the same two sites had mean producer's and user's accuracies for broom snakeweed of 98.3 and 92.8%, respectively. These results indicate that CIR photography and CIR digital imagery combined with image analysis techniques can be used successfully to map broom snakeweed infestations on south Texas rangelands.
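
    Producer's and user's accuracies of the kind reported above are read off a classification error matrix as column-wise and row-wise ratios of the diagonal; the counts in the sketch below are illustrative, not the study's assessment data.

      # How producer's and user's accuracies follow from an error matrix.
      # Rows = classified map labels, columns = reference labels.
      import numpy as np

      classes = ["broom snakeweed", "other rangeland"]
      error_matrix = np.array([
          [53, 7],    # mapped as broom snakeweed
          [1, 139],   # mapped as other rangeland
      ])

      for i, name in enumerate(classes):
          producers = error_matrix[i, i] / error_matrix[:, i].sum()  # omission view
          users = error_matrix[i, i] / error_matrix[i, :].sum()      # commission view
          print(f"{name}: producer's {producers:.1%}, user's {users:.1%}")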

  15. Temporal mapping and analysis

    NASA Technical Reports Server (NTRS)

    O'Hara, Charles G. (Inventor); Shrestha, Bijay (Inventor); Vijayaraj, Veeraraghavan (Inventor); Mali, Preeti (Inventor)

    2011-01-01

    A compositing process for selecting spatial data collected over a period of time, creating temporal data cubes from the spatial data, and processing and/or analyzing the data using temporal mapping algebra functions. In some embodiments, the processing creates a masked cube from the data cubes and computes a composite from the masked cube using temporal mapping algebra.

  16. Rosacea assessment by erythema index and principal component analysis segmentation maps

    NASA Astrophysics Data System (ADS)

    Kuzmina, Ilona; Rubins, Uldis; Saknite, Inga; Spigulis, Janis

    2017-12-01

    RGB images of rosacea were analyzed using segmentation maps of principal component analysis (PCA) and the erythema index (EI). Areas of segmented clusters were compared to Clinician's Erythema Assessment (CEA) values given by two dermatologists. The results show that visible blood vessels are segmented more precisely on maps of the erythema index and the third principal component (PC3). In many cases, the distributions of clusters on EI and PC3 maps are very similar. Mean cluster areas on these maps show a decrease in the area of blood vessels and erythema and an increase in lighter skin area after therapy for patients with diagnosis CEA = 2 on the first visit and CEA = 1 on the second visit. This study shows that EI and PC3 maps are more useful than maps of the first (PC1) and second (PC2) principal components for indicating vascular structures and erythema on the skin of rosacea patients and for therapy monitoring.
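
    For context, a minimal Python sketch of the two ingredients named in the abstract follows: a per-pixel erythema index and per-pixel principal components of the RGB values. The EI formula used here (a log ratio of red to green) is one common definition and is an assumption, not necessarily the exact index used in the study; the PCA step uses scikit-learn.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def erythema_index(rgb):
        """Per-pixel erythema index from an RGB image with values in [0, 1].

        Assumed definition: EI = log10(R / G), which increases with redness.
        """
        r = np.clip(rgb[..., 0], 1e-6, None)
        g = np.clip(rgb[..., 1], 1e-6, None)
        return np.log10(r / g)

    def pca_component_maps(rgb, n_components=3):
        """Project every pixel's RGB vector onto its principal components."""
        h, w, _ = rgb.shape
        scores = PCA(n_components=n_components).fit_transform(rgb.reshape(-1, 3))
        return scores.reshape(h, w, n_components)   # [..., 2] is the PC3 map

    # Usage with a synthetic image standing in for a rosacea photograph.
    img = np.random.rand(128, 128, 3)
    ei_map = erythema_index(img)
    pc3_map = pca_component_maps(img)[..., 2]
    ```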

  17. The isotropic-nematic phase transition of tangent hard-sphere chain fluids—Pure components

    NASA Astrophysics Data System (ADS)

    van Westen, Thijs; Oyarzún, Bernardo; Vlugt, Thijs J. H.; Gross, Joachim

    2013-07-01

    An extension of Onsager's second virial theory is developed to describe the isotropic-nematic phase transition of tangent hard-sphere chain fluids. Flexibility is introduced by the rod-coil model. The effect of chain-flexibility on the second virial coefficient is described using an accurate, analytical approximation for the orientation-dependent pair-excluded volume. The use of this approximation allows for an analytical treatment of intramolecular flexibility by using a single pure-component parameter. Two approaches to approximate the effect of the higher virial coefficients are considered, i.e., the Vega-Lago rescaling and Scaled Particle Theory (SPT). The Onsager trial function is employed to describe the orientational distribution function. Theoretical predictions for the equation of state and orientational order parameter are tested against the results from Monte Carlo (MC) simulations. For linear chains of length 9 and longer, theoretical results are in excellent agreement with MC data. For smaller chain lengths, small errors introduced by the approximation of the higher virial coefficients become apparent, leading to a small under- and overestimation of the pressure and density difference at the phase transition, respectively. For rod-coil fluids of reasonable rigidity, a quantitative comparison between theory and MC simulations is obtained. For more flexible chains, however, both the Vega-Lago rescaling and SPT lead to a small underestimation of the location of the phase transition.
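
    For orientation, the standard Onsager second-virial free energy that such theories extend can be written as follows (textbook form and notation, not copied from the paper); the isotropic-nematic transition is located by minimizing over the orientational distribution f(Ω) and imposing phase-coexistence conditions.

    ```latex
    % Schematic Onsager-type second-virial free energy per particle,
    % with c the number density, f(\Omega) the orientational distribution
    % and V_excl the orientation-dependent pair-excluded volume:
    \frac{\beta F}{N} \;\simeq\; \ln c - 1
      + \int f(\Omega)\,\ln\!\bigl[4\pi f(\Omega)\bigr]\,\mathrm{d}\Omega
      + \frac{c}{2}\iint f(\Omega)\,f(\Omega')\,
          V_{\mathrm{excl}}(\Omega,\Omega')\,\mathrm{d}\Omega\,\mathrm{d}\Omega'
    ```

    The Vega-Lago rescaling and Scaled Particle Theory mentioned above are two ways of correcting the last (two-body) term for the neglected higher virial coefficients.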

  18. Hyperspectral analysis of cultural heritage artifacts: pigment material diversity in the Gough Map of Britain

    NASA Astrophysics Data System (ADS)

    Bai, Di; Messinger, David W.; Howell, David

    2017-08-01

    The Gough Map, one of the earliest surviving maps of Britain, was created and extensively revised over the 15th century. In 2015, the map was imaged using a hyperspectral imaging system while in the collection at the Bodleian Library, Oxford University. The goal of the collection of the hyperspectral image (HSI) of the Gough Map was to address questions such as enhancement of faded text for reading and analysis of the pigments used during its creation and revision. In particular, pigment analysis of the Gough Map will help historians understand the material diversity of its composition and potentially the timeline of, and methods used in, the creation and revision of the map. Multiple analysis methods are presented to analyze a particular pigment in the Gough Map with an emphasis on understanding the within-material diversity, i.e., the number and spatial layout of distinct red pigments. One approach for understanding the number of distinct materials in a scene (i.e., endmember selection and dimensionality estimation) is the Gram matrix approach. Here, this method is used to study the within-material differences of pigments in the map with common visual color. The application is a pigment analysis tool that extracts visually common pixels (here, the red pigments) from the Gough Map and estimates the material diversity of the pixels. Results show that the Gough Map is composed of at least five kinds of dominant red pigments with a particular spatial pattern. This research provides a useful tool for historical geographers and cartographic historians to analyze the material diversity of HSI of cultural heritage artifacts.
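
    A Gram-matrix dimensionality estimate of the kind mentioned above can be sketched as follows; this is a generic eigenvalue-spectrum version in Python with made-up data, not the specific endmember-selection algorithm applied to the Gough Map.

    ```python
    import numpy as np

    def gram_matrix_dimension(pixels, energy=0.99):
        """Rough estimate of material diversity for a set of pixel spectra.

        `pixels` is an (n_pixels, n_bands) array of HSI spectra (e.g. the
        pixels pre-selected as visually red). The number of Gram-matrix
        eigenvalues needed to capture `energy` of the total variance serves
        as a crude estimate of the number of distinct materials.
        """
        X = pixels - pixels.mean(axis=0)        # remove the mean spectrum
        G = X @ X.T                             # Gram matrix (n_pixels x n_pixels)
        eigvals = np.sort(np.linalg.eigvalsh(G))[::-1]
        cumulative = np.cumsum(eigvals) / eigvals.sum()
        return int(np.searchsorted(cumulative, energy) + 1)

    red_pixels = np.random.rand(500, 120)       # stand-in for extracted red pixels
    print(gram_matrix_dimension(red_pixels))
    ```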

  19. Textural and Mineralogical Analysis of Volcanic Rocks by µ-XRF Mapping.

    PubMed

    Germinario, Luigi; Cossio, Roberto; Maritan, Lara; Borghi, Alessandro; Mazzoli, Claudio

    2016-06-01

    In this study, µ-XRF was applied as a novel surface technique for quick acquisition of elemental X-ray maps of rocks, image analysis of which provides quantitative information on texture and rock-forming minerals. Bench-top µ-XRF is cost-effective, fast, and non-destructive, can be applied to both large (up to a few tens of cm) and fragile samples, and yields major and trace element analysis with good sensitivity. Here, X-ray mapping was performed with a resolution of 103.5 µm and spot size of 30 µm over sample areas of about 5×4 cm of Euganean trachyte, a volcanic porphyritic rock from the Euganean Hills (NE Italy) traditionally used in cultural heritage. The relative abundance of phenocrysts and groundmass, as well as the size and shape of the various mineral phases, were obtained from image analysis of the elemental maps. The quantified petrographic features allowed identification of various extraction sites, revealing an objective method for archaeometric provenance studies exploiting µ-XRF imaging.

  20. Multiresolution and Explicit Methods for Vector Field Analysis and Visualization

    NASA Technical Reports Server (NTRS)

    Nielson, Gregory M.

    1997-01-01

    This is a request for a second renewal (3rd year of funding) of a research project on the topic of multiresolution and explicit methods for vector field analysis and visualization. In this report, we describe the progress made on this research project during the second year and give a statement of the planned research for the third year. There are two aspects to this research project. The first is concerned with the development of techniques for computing tangent curves for use in visualizing flow fields. The second aspect of the research project is concerned with the development of multiresolution methods for curvilinear grids and their use as tools for visualization, analysis and archiving of flow data. We report on our work on the development of numerical methods for tangent curve computation first.
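
    Tangent-curve computation reduces to numerically integrating the ODE dx/dt = v(x) through the flow field. A minimal Python sketch using a fixed-step RK4 integrator is given below; the project's own schemes are more elaborate, so this is only a baseline illustration.

    ```python
    import numpy as np

    def tangent_curve(field, seed, step=0.05, n_steps=200):
        """Trace a tangent curve (streamline) of a 2D vector field with RK4.

        `field(p)` returns the velocity vector at point `p`; this is a generic
        sketch of tangent-curve computation, not the specific methods developed
        in the project.
        """
        p = np.asarray(seed, dtype=float)
        curve = [p.copy()]
        for _ in range(n_steps):
            k1 = field(p)
            k2 = field(p + 0.5 * step * k1)
            k3 = field(p + 0.5 * step * k2)
            k4 = field(p + step * k3)
            p = p + (step / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
            curve.append(p.copy())
        return np.array(curve)

    # Example: circular flow around the origin.
    swirl = lambda p: np.array([-p[1], p[0]])
    path = tangent_curve(swirl, seed=(1.0, 0.0))
    ```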

  1. Mapping and analysis of phosphorylation sites: a quick guide for cell biologists

    PubMed Central

    Dephoure, Noah; Gould, Kathleen L.; Gygi, Steven P.; Kellogg, Douglas R.

    2013-01-01

    A mechanistic understanding of signaling networks requires identification and analysis of phosphorylation sites. Mass spectrometry offers a rapid and highly sensitive approach to mapping phosphorylation sites. However, mass spectrometry has significant limitations that must be considered when planning to carry out phosphorylation-site mapping. Here we provide an overview of key information that should be taken into consideration before beginning phosphorylation-site analysis, as well as a step-by-step guide for carrying out successful experiments. PMID:23447708

  2. Use of multiple cluster analysis methods to explore the validity of a community outcomes concept map.

    PubMed

    Orsi, Rebecca

    2017-02-01

    Concept mapping is now a commonly-used technique for articulating and evaluating programmatic outcomes. However, research regarding validity of knowledge and outcomes produced with concept mapping is sparse. The current study describes quantitative validity analyses using a concept mapping dataset. We sought to increase the validity of concept mapping evaluation results by running multiple cluster analysis methods and then using several metrics to choose from among solutions. We present four different clustering methods based on analyses using the R statistical software package: partitioning around medoids (PAM), fuzzy analysis (FANNY), agglomerative nesting (AGNES) and divisive analysis (DIANA). We then used the Dunn and Davies-Bouldin indices to assist in choosing a valid cluster solution for a concept mapping outcomes evaluation. We conclude that the validity of the outcomes map is high, based on the analyses described. Finally, we discuss areas for further concept mapping methods research. Copyright © 2016 Elsevier Ltd. All rights reserved.
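
    The R methods named above (PAM, FANNY, AGNES, DIANA, judged with the Dunn and Davies-Bouldin indices) can be mimicked outside R for illustration. The Python sketch below uses scikit-learn clusterers as stand-ins and compares a simple Dunn index (higher is better) with the Davies-Bouldin index (lower is better) on hypothetical concept-map point coordinates.

    ```python
    import numpy as np
    from scipy.spatial.distance import cdist
    from sklearn.cluster import AgglomerativeClustering, KMeans
    from sklearn.metrics import davies_bouldin_score

    def dunn_index(X, labels):
        """Simple Dunn index: min inter-cluster distance / max cluster diameter."""
        clusters = [X[labels == k] for k in np.unique(labels)]
        diameters = [cdist(c, c).max() for c in clusters]
        separations = [cdist(a, b).min() for i, a in enumerate(clusters)
                       for b in clusters[i + 1:]]
        return min(separations) / max(diameters)

    # Hypothetical stand-in for concept-mapping coordinates (e.g. from MDS).
    X = np.random.rand(80, 2)
    models = {"kmeans": KMeans(n_clusters=6, n_init=10),
              "agglomerative": AgglomerativeClustering(n_clusters=6)}
    for name, model in models.items():
        labels = model.fit_predict(X)
        print(name, dunn_index(X, labels), davies_bouldin_score(X, labels))
    ```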

  3. Map Interpretation and Terrain Analysis Course (MITAC) for Infantrymen: Illustrated Lectures

    DTIC Science & Technology

    1982-01-01

    Factors Influencing Map Design ... Interpretation of Terrain Relief and Other Topographic Features ... The Army Research Institute (ARI) sponsored a project to design and develop a map interpretation and terrain analysis course (MITAC) to improve the ability of Army helicopter pilots to navigate accurately when flying at nap-of-the-earth (NOE) altitudes (McGrath, 1975; McGrath & Foster, 1975). MITAC was designed to

  4. Mapping the Future, Mapping Education: An Analysis of the 2011 State of the Union Address

    ERIC Educational Resources Information Center

    Collin, Ross

    2012-01-01

    This article presents a discourse analysis of President Barack Obama's 2011 State of the Union Address. Fredric Jameson's concepts of cognitive mapping, cultural revolution, and the unconscious are employed to examine the president's vision of educational and economic transformation. Ultimately, it is argued this vision evokes a world in which…

  5. Multi-channel Analysis of Passive Surface Waves (MAPS)

    NASA Astrophysics Data System (ADS)

    Xia, J.; Cheng, F. Mr; Xu, Z.; Wang, L.; Shen, C.; Liu, R.; Pan, Y.; Mi, B.; Hu, Y.

    2017-12-01

    Urbanization is an inevitable trend in the modernization of human society. At the end of 2013 the Chinese Central Government launched a national urbanization plan, "Three 100 Million People", which aggressively and steadily pushes forward urbanization. Based on the plan, by 2020 approximately 100 million people from rural areas will permanently settle in towns, the dwelling conditions of about 100 million people in towns and villages will be improved, and about 100 million people in central and western China will permanently settle in towns. China's urbanization process will run at the highest speed in the country's urbanization history. Environmentally friendly, non-destructive and non-invasive geophysical assessment methods have played an important role in the urbanization process in China. Because of human noise and the electromagnetic fields produced by industrial activity, geophysical methods already used in urban environments (gravity, magnetics, electrical, seismic) face great challenges. However, human activity also provides an effective source for passive seismic methods. Claerbout pointed out that the wavefield that would be received at one point with excitation at another point can be reconstructed by calculating the cross-correlation of noise records at the two surface points. Based on this idea (cross-correlation of two noise records) and the virtual source method, we propose Multi-channel Analysis of Passive Surface Waves (MAPS). MAPS mainly uses traffic noise recorded with a linear receiver array. Because Multi-channel Analysis of Surface Waves can produce a shear (S) wave velocity model with high resolution in the shallow part of the model, MAPS combines acquisition and processing of active-source and passive-source data in the same flow, without the need to distinguish between them. MAPS is also capable of real-time quality control of noise recording, which is important for near-surface applications in urban environments. The numerical and real-world examples demonstrated that MAPS can be
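
    The core cross-correlation step referred to above (turning two noise records into an approximate virtual-source trace) can be sketched as follows in Python; real passive surface-wave processing adds pre-processing such as spectral whitening, temporal normalization and stacking over many windows, so this is only a schematic illustration with assumed parameters.

    ```python
    import numpy as np

    def noise_cross_correlation(rec_a, rec_b, max_lag):
        """Cross-correlate two ambient-noise records via the FFT.

        Returns lags (in samples) and the correlation restricted to
        -max_lag..+max_lag, i.e. an approximate virtual-source trace.
        """
        n = len(rec_a) + len(rec_b) - 1
        nfft = 1 << (n - 1).bit_length()                 # next power of two
        spec = np.fft.rfft(rec_a, nfft) * np.conj(np.fft.rfft(rec_b, nfft))
        cc = np.fft.irfft(spec, nfft)
        cc = np.concatenate((cc[-max_lag:], cc[:max_lag + 1]))  # negative, then positive lags
        lags = np.arange(-max_lag, max_lag + 1)
        return lags, cc

    fs = 250.0                                           # assumed sampling rate in Hz
    noise1, noise2 = np.random.randn(2, 60 * int(fs))    # stand-ins for two traffic-noise records
    lags, cc = noise_cross_correlation(noise1, noise2, max_lag=int(2 * fs))
    ```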

  6. On the Magnetic Squashing Factor and the Lie Transport of Tangents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Roger B.; Pontin, David I.; Hornig, Gunnar

    The squashing factor (or squashing degree) of a vector field is a quantitative measure of the deformation of the field line mapping between two surfaces. In the context of solar magnetic fields, it is often used to identify gradients in the mapping of elementary magnetic flux tubes between various flux domains. Regions where these gradients in the mapping are large are referred to as quasi-separatrix layers (QSLs), and are a continuous extension of separators and separatrix surfaces. These QSLs are observed to be potential sites for the formation of strong electric currents, and are therefore important for the study of magnetic reconnection in three dimensions. Since the squashing factor, Q, is defined in terms of the Jacobian of the field line mapping, it is most often calculated by first determining the mapping between two surfaces (or some approximation of it) and then numerically differentiating. Tassev and Savcheva have introduced an alternative method, in which they parameterize the change in separation between adjacent field lines, and then integrate along individual field lines to get an estimate of the Jacobian without the need to numerically differentiate the mapping itself. But while their method offers certain computational advantages, it is formulated on a perturbative description of the field line trajectory, and the accuracy of this method is not entirely clear. Here we show, through an alternative derivation, that this integral formulation is, in principle, exact. We then demonstrate the result in the case of a linear, 3D magnetic null, which allows for an exact analytical description and direct comparison to numerical estimates.
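
    For reference, the squashing degree is conventionally defined from the Jacobian of the field-line mapping as follows (standard Titov-style formula and notation, shown here for orientation rather than quoted from the paper):

    ```latex
    % Field-line mapping (x, y) -> (X(x, y), Y(x, y)) between two boundary
    % surfaces; the squashing degree is built from its Jacobian elements:
    Q \;=\; \frac{a^{2} + b^{2} + c^{2} + d^{2}}{\lvert\, a\,d - b\,c \,\rvert},
    \qquad
    a = \frac{\partial X}{\partial x},\quad
    b = \frac{\partial X}{\partial y},\quad
    c = \frac{\partial Y}{\partial x},\quad
    d = \frac{\partial Y}{\partial y}
    ```

    Quasi-separatrix layers are then the regions where Q greatly exceeds its minimum value, i.e. Q >> 2.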

  7. Comprehensive Evaluation and Analysis of China's Mainstream Online Map Service Websites

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Jiang, J.; Huang, W.; Wang, Q.; Gu, X.

    2012-08-01

    With the flourishing development of China's Internet market, demand for map services from all kinds of users is rising continually and carries tremendous commercial interest. Many Internet giants have become involved in the field of online map services and have defined it as an important strategic product of their companies. The main purpose of this research is to evaluate these online map service websites comprehensively with a model and to analyse the problems according to the evaluation results. Some corresponding measures are then proposed, which provides theoretical and practical guidance for the future development of the fiercely competitive online map websites. The research consists of three stages: (a) the mainstream online map service websites in China are introduced and their present situation is analysed through visits, investigation, consultation, analysis and research; (b) a comprehensive evaluation indicator system for online map service websites is constructed, covering functions, layout, interaction design, colour, positioning and so on, combined with data indexes such as time efficiency, accuracy, objectivity and authority; (c) a comprehensive evaluation of these online map service websites is carried out based on a fuzzy mathematical evaluation model, which resolves the difficulty of measuring map websites quantitatively.
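
    The fuzzy evaluation model mentioned in stage (c) is typically a weighted combination of a criterion-weight vector with a membership matrix over quality grades. A minimal numpy sketch follows; the criteria, weights and membership values are invented for illustration and are not taken from the study.

    ```python
    import numpy as np

    # Evaluation criteria and their weights (assumed; must sum to 1).
    criteria = ["functionality", "layout", "interaction", "timeliness", "accuracy"]
    weights = np.array([0.30, 0.15, 0.15, 0.20, 0.20])

    # Membership of each criterion in the grades (excellent, good, fair, poor),
    # e.g. obtained from expert scoring of one map website (rows sum to 1).
    R = np.array([[0.5, 0.3, 0.2, 0.0],
                  [0.2, 0.5, 0.2, 0.1],
                  [0.3, 0.4, 0.2, 0.1],
                  [0.4, 0.4, 0.1, 0.1],
                  [0.6, 0.3, 0.1, 0.0]])

    B = weights @ R                        # fuzzy evaluation vector over the grades
    grades = np.array([95, 80, 65, 50])    # assumed numeric scores per grade
    print(B, float(B @ grades))            # grade memberships and an overall score
    ```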

  8. Mapping patent classifications: portfolio and statistical analysis, and the comparison of strengths and weaknesses.

    PubMed

    Leydesdorff, Loet; Kogler, Dieter Franz; Yan, Bowen

    2017-01-01

    The Cooperative Patent Classifications (CPC) recently developed cooperatively by the European and US Patent Offices provide a new basis for mapping patents and portfolio analysis. CPC replaces the International Patent Classifications (IPC) of the World Intellectual Property Organization. In this study, we update our routines previously based on IPC for CPC and use the occasion to rethink various parameter choices. The new maps are significantly different from the previous ones, although this may not always be obvious on visual inspection. We provide nested maps online and a routine for generating portfolio overlays on the maps; a new tool is provided for "difference maps" between the patent portfolios of organizations or firms. This is illustrated by comparing the portfolios of patents granted to two competing firms, Novartis and MSD, in 2016. Furthermore, the data is organized for the purpose of statistical analysis.

  9. Analysis Sharpens Mars Hydrogen Map, Hinting Equatorial Water Ice

    NASA Image and Video Library

    2017-09-28

    Re-analysis of 2002-2009 data from a hydrogen-finding instrument on NASA's Mars Odyssey orbiter increased the resolution of maps of hydrogen abundance. The reprocessed data (lower map) shows more "water-equivalent hydrogen" (darker blue) in some parts of this equatorial region of Mars. Puzzlingly, this suggests the possible presence of water ice just beneath the surface near the equator, though it would not be thermodynamically stable there. The upper map uses raw data from Odyssey's neutron spectrometer instrument, which senses the energy state of neutrons coming from Mars, providing an indication of how much hydrogen is present in the top 3 feet (1 meter) of the surface. Hydrogen detected by Odyssey at high latitudes of Mars in 2002 was confirmed to be in the form of water ice by the follow-up NASA Phoenix Mars Lander mission in 2008. A 2017 reprocessing of the older data applied image-reconstruction techniques often used to reduce blurring from medical imaging data. The results are shown here for an area straddling the equator for about one-fourth the circumference of the planet, centered at 175 degrees west longitude. The white contours outline lobes of a formation called Medusae Fossae, coinciding with some areas of higher hydrogen abundance in the enhanced-resolution analysis. The black line indicates the limit of a relatively young lava plain, coinciding with areas of lower hydrogen abundance in the enhanced-resolution analysis. The color-coding key for hydrogen abundance in both maps is indicated by the horizontal bar, in units expressed as how much water would be present in the ground if the hydrogen is all in the form of water. Units of the equivalent water weight, as a percentage of the material in the ground, are correlated with counts recorded by the spectrometer, ranging from less than 1 weight-percent water equivalent (red) to more than 30 percent (dark blue). https://photojournal.jpl.nasa.gov/catalog/PIA21848

  10. Solution of D dimensional Dirac equation for hyperbolic tangent potential using NU method and its application in material properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suparmi, A., E-mail: soeparmi@staff.uns.ac.id; Cari, C., E-mail: cari@staff.uns.ac.id; Pratiwi, B. N., E-mail: namakubetanurpratiwi@gmail.com

    2016-02-08

    The analytical solution of the D-dimensional Dirac equation for the hyperbolic tangent potential is investigated using the Nikiforov-Uvarov method. In the case of spin symmetry the D-dimensional Dirac equation reduces to the D-dimensional Schrodinger equation. The D-dimensional relativistic energy spectra are obtained from the D-dimensional relativistic energy eigenvalue equation using MATLAB software. The corresponding D-dimensional radial wave functions are formulated in the form of generalized Jacobi polynomials. The thermodynamic properties of materials are generated from the non-relativistic energy eigenvalues in the classical limit. In the non-relativistic limit, the relativistic energy equation reduces to the non-relativistic energy. The thermal quantities of the system, the partition function and the specific heat, are expressed in terms of the error function and the imaginary error function, which are numerically calculated using MATLAB software.

  11. Usability analysis of indoor map application in a shopping centre

    NASA Astrophysics Data System (ADS)

    Dewi, R. S.; Hadi, R. K.

    2018-04-01

    Although indoor navigation is still new in Indonesia, its future development is very promising. Similar to outdoor navigation, indoor navigation technology provides several important functions to support route and landmark finding. Furthermore, indoor navigation can also support public safety, especially during the evacuation process in a building during a disaster. Indoor navigation technologies are commonly built as applications that users can access from their smartphones, tablets, or personal computers. Therefore, a usability analysis is important to ensure that indoor navigation applications can be operated by users with full functionality. Among the several indoor map applications available on the market, this study chose to analyse indoor Google Maps because of its availability and popularity in Indonesia. The experiments to test indoor Google Maps were conducted in one of the biggest shopping centre buildings in Surabaya, Indonesia. Usability was measured using the System Usability Scale (SUS) questionnaire. The results showed that the SUS score of indoor Google Maps was below the average score of other mobile applications, indicating that users still had high difficulty operating and learning the features of indoor Google Maps.
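
    The SUS scoring rule referred to above is fixed by the questionnaire's design, so it can be stated compactly in code. The Python sketch below applies the conventional scoring (odd-numbered items positively worded, even-numbered items negatively worded, total scaled to 0-100); the example responses are hypothetical.

    ```python
    def sus_score(responses):
        """System Usability Scale score for one respondent.

        `responses` is a list of ten answers on a 1-5 Likert scale in the
        standard SUS item order. Odd items contribute (answer - 1), even items
        (5 - answer), and the sum is scaled by 2.5 to give a 0-100 score.
        """
        if len(responses) != 10:
            raise ValueError("SUS requires exactly 10 item responses")
        contributions = [(r - 1) if i % 2 == 0 else (5 - r)
                         for i, r in enumerate(responses)]
        return 2.5 * sum(contributions)

    # Hypothetical respondent; 68 is often quoted as the benchmark average score.
    print(sus_score([4, 2, 4, 3, 4, 2, 3, 2, 4, 3]))
    ```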

  12. White Matter Fiber-based Analysis of T1w/T2w Ratio Map.

    PubMed

    Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D; Entringer, Sonja; Buss, Claudia; Styner, Martin

    2017-02-01

    To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. We developed an open-source tool for the fiber based analysis of T1w/T2w ratio maps and tested it in a study of brain development.

  13. White matter fiber-based analysis of T1w/T2w ratio map

    NASA Astrophysics Data System (ADS)

    Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D.; Entringer, Sonja; Buss, Claudia; Styner, Martin

    2017-02-01

    Purpose: To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. Background: The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. Methods: We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. Results: We developed an open-source tool for the fiber based analysis of T1w/T2w ratio maps and tested it in a study of brain development.

  14. Clinical Impact and Implication of Real-Time Oscillation Analysis for Language Mapping.

    PubMed

    Ogawa, Hiroshi; Kamada, Kyousuke; Kapeller, Christoph; Prueckl, Robert; Takeuchi, Fumiya; Hiroshima, Satoru; Anei, Ryogo; Guger, Christoph

    2017-01-01

    We developed a functional brain analysis system that enabled us to perform real-time task-related electrocorticography (ECoG) and evaluated its potential in clinical practice. We hypothesized that high gamma activity (HGA) mapping would provide better spatial and temporal resolution with high signal-to-noise ratios. Seven awake craniotomy patients were evaluated. ECoG was recorded during language tasks using subdural grids, and HGA (60-170 Hz) maps were obtained in real time. The patients also underwent electrocortical stimulation (ECS) mapping to validate the suspected functional locations on HGA mapping. The results were compared and calculated to assess the sensitivity and specificity of HGA mapping. For reference, bedside HGA-ECS mapping was performed in 5 epilepsy patients. HGA mapping demonstrated functional brain areas in real time and was comparable with ECS mapping. Sensitivity and specificity for the language area were 90.1% ± 11.2% and 90.0% ± 4.2%, respectively. Most HGA-positive areas were consistent with ECS-positive regions in both groups, and there were no statistical between-group differences. Although this study included a small number of subjects, it showed real-time HGA mapping with the same setting and tasks under different conditions. This study demonstrates the clinical feasibility of real-time HGA mapping. Real-time HGA mapping enabled simple and rapid detection of language functional areas in awake craniotomy. The mapping results were highly accurate, although the mapping environment was noisy. Further studies of HGA mapping may provide the potential to elaborate complex brain functions and networks. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1993-01-01

    This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will be an advanced nonlinear signal analysis topographical mapping system (ATMS), a nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbopump families.

  16. Internal friction between fluid particles of MHD tangent hyperbolic fluid with heat generation: Using coefficients improved by Cash and Karp

    NASA Astrophysics Data System (ADS)

    Salahuddin, T.; Khan, Imad; Malik, M. Y.; Khan, Mair; Hussain, Arif; Awais, Muhammad

    2017-05-01

    The present work examines the internal resistance between fluid particles of tangent hyperbolic fluid flow due to a non-linear stretching sheet with heat generation. Using similarity transformations, the governing system of partial differential equations is transformed into a coupled non-linear ordinary differential system with variable coefficients. Unlike current analytical works on such flow problems in the literature, the main concern here is to obtain the solution numerically using the Runge-Kutta-Fehlberg scheme with coefficients improved by Cash and Karp (Naseer et al., Alexandria Eng. J. 53, 747 (2014)). To determine the relevant physical features of the numerous mechanisms acting on the problem under consideration, the velocity profile and temperature field, together with the drag force and heat transfer rate, are presented in the current paper.

  17. Using mind mapping techniques for rapid qualitative data analysis in public participation processes.

    PubMed

    Burgess-Allen, Jilla; Owen-Smith, Vicci

    2010-12-01

    In a health service environment where timescales for patient participation in service design are short and resources scarce, a balance needs to be achieved between research rigour and the timeliness and utility of the findings of patient participation processes. To develop a pragmatic mind mapping approach to managing the qualitative data from patient participation processes. While this article draws on experience of using mind maps in a variety of participation processes, a single example is used to illustrate the approach. In this example mind maps were created during the course of patient participation focus groups. Two group discussions were also transcribed verbatim to allow comparison of the rapid mind mapping approach with traditional thematic analysis of qualitative data. The illustrative example formed part of a local alcohol service review which included consultation with local alcohol service users, their families and staff groups. The mind mapping approach provided a pleasing graphical format for representing the key themes raised during the focus groups. It helped stimulate and galvanize discussion and keep it on track, enhanced transparency and group ownership of the data analysis process, allowed a rapid dynamic between data collection and feedback, and was considerably faster than traditional methods for the analysis of focus groups, while resulting in similar broad themes. This study suggests that the use of a mind mapping approach to managing qualitative data can provide a pragmatic resolution of the tension between limited resources and quality in patient participation processes. © 2010 The Authors. Health Expectations © 2010 Blackwell Publishing Ltd.

  18. NeatMap--non-clustering heat map alternatives in R.

    PubMed

    Rajaram, Satwik; Oono, Yoshi

    2010-01-22

    The clustered heat map is the most popular means of visualizing genomic data. It compactly displays a large amount of data in an intuitive format that facilitates the detection of hidden structures and relations in the data. However, it is hampered by its use of cluster analysis which does not always respect the intrinsic relations in the data, often requiring non-standardized reordering of rows/columns to be performed post-clustering. This sometimes leads to uninformative and/or misleading conclusions. Often it is more informative to use dimension-reduction algorithms (such as Principal Component Analysis and Multi-Dimensional Scaling) which respect the topology inherent in the data. Yet, despite their proven utility in the analysis of biological data, they are not as widely used. This is at least partially due to the lack of user-friendly visualization methods with the visceral impact of the heat map. NeatMap is an R package designed to meet this need. NeatMap offers a variety of novel plots (in 2 and 3 dimensions) to be used in conjunction with these dimension-reduction techniques. Like the heat map, but unlike traditional displays of such results, it allows the entire dataset to be displayed while visualizing relations between elements. It also allows superimposition of cluster analysis results for mutual validation. NeatMap is shown to be more informative than the traditional heat map with the help of two well-known microarray datasets. NeatMap thus preserves many of the strengths of the clustered heat map while addressing some of its deficiencies. It is hoped that NeatMap will spur the adoption of non-clustering dimension-reduction algorithms.
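
    The non-clustering idea can be illustrated outside R as well: order the rows of the heat map by a low-dimensional embedding instead of a dendrogram. The Python sketch below (using scikit-learn and matplotlib on random data) is a loose analogue of NeatMap's approach, not a port of the package.

    ```python
    import matplotlib.pyplot as plt
    import numpy as np
    from sklearn.decomposition import PCA

    def pca_ordered_heatmap(data):
        """Heat map with rows ordered by their first principal-component score.

        The ordering respects the topology of a dimension-reduction embedding
        rather than a cluster dendrogram.
        """
        order = np.argsort(PCA(n_components=1).fit_transform(data).ravel())
        plt.imshow(data[order], aspect="auto", cmap="viridis")
        plt.xlabel("samples")
        plt.ylabel("genes (PCA-ordered)")
        return order

    expression = np.random.randn(200, 30)    # stand-in for a microarray matrix
    pca_ordered_heatmap(expression)
    plt.show()
    ```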

  19. Frames of reference for helicopter electronic maps - The relevance of spatial cognition and componential analysis

    NASA Technical Reports Server (NTRS)

    Harwood, Kelly; Wickens, Christopher D.

    1991-01-01

    Computer-generated map displays for NOE and low-level helicopter flight were designed according to prior research on maps, navigational problem solving, and spatial cognition in large-scale environments. The north-up map emphasized consistency of object location, whereas the track-up map emphasized map-terrain congruency. A componential analysis indicates that different cognitive components, e.g., orienting and absolute object location, are supported to varying degrees by properties of different frames of reference.

  20. Electricity Consumption Risk Map - The use of Urban Climate Mapping for smarter analysis: Case study for Birmingham, UK.

    NASA Astrophysics Data System (ADS)

    Antunes Azevedo, Juliana; Burghardt, René; Chapman, Lee; Katzchner, Lutz; Muller, Catherine L.

    2015-04-01

    Climate is a key driving factor in energy consumption. However, income, vegetation, building mass structure and topography also affect the amount of energy consumed. In a changing climate, increased temperatures are likely to lead to increased electricity consumption, affecting demand, distribution and generation. Furthermore, as the world population becomes more urbanized, increasing numbers of people will need to deal not only with increased temperatures from climate change, but also with the unintentional modification of the urban climate in the form of urban heat islands. Hence, climate and climate change need to be taken into account in future urban planning to increase the climate and energy resilience of the community and decrease future social and economic costs. Geographical Information Systems provide a means to create urban climate maps as part of the urban planning process, and geostatistical analyses linking these maps with demographic and social data make it possible to identify high-risk groups in the community and vulnerable areas of towns and cities. Presently, the climatope classification is oriented towards thermal aspects and the ventilation quality (roughness) of urban areas, but it can also be adapted to take into account other structural "environmental factors". This study aims to use the climatope approach to predict areas of potentially high electricity consumption in Birmingham, UK. Several datasets were used to produce an average surface temperature map, vegetation map, land use map, topography map, building height map, built-up area roughness calculations, an average air temperature map and a domestic electricity consumption map. From the correlations obtained between these layers it is possible to weight the importance of each factor and create a map of domestic electricity consumption, in order to understand the influence of environmental aspects on spatial energy consumption. Based on these results city

  1. Object-based image analysis for cadastral mapping using satellite images

    NASA Astrophysics Data System (ADS)

    Kohli, D.; Crommelinck, S.; Bennett, R.; Koeva, M.; Lemmen, C.

    2017-10-01

    Cadasters, together with the land registry, form a core ingredient of any land administration system. Cadastral maps comprise the extent, ownership and value of land, which are essential for recording and updating land records. Traditional methods for cadastral surveying and mapping often prove to be labor-, cost- and time-intensive; alternative approaches are thus being researched for creating such maps. With the advent of very high resolution (VHR) imagery, satellite remote sensing offers a tremendous opportunity for the (semi-)automation of cadastral boundary detection. In this paper, we explore the potential of the object-based image analysis (OBIA) approach for this purpose by applying two segmentation methods, i.e. MRS (multi-resolution segmentation) and ESP (estimation of scale parameter), to identify visible cadastral boundaries. Results show that a balance between a high percentage of completeness and correctness is hard to achieve: a low error of commission often comes with a high error of omission. However, we conclude that the resulting segments/land use polygons can potentially be used as a base for further aggregation into tenure polygons using participatory mapping.

  2. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.

  3. A spherical electron-channelling pattern map for use in quartz petrofabric analysis

    USGS Publications Warehouse

    Lloyd, G.E.; Ferguson, C.C.

    1986-01-01

    Electron channelling patterns (ECPs) are formed in the scanning electron microscope (SEM) by the interaction between the incident electrons and the lattice of crystalline specimens. The patterns are unique for a particular crystallographic orientation and are therefore of considerable potential in petrofabric studies, provided they can be accurately indexed. Indexing requires an ECP map of the crystallographic stereogram or unit triangle covering all possible orientations and hence all possible ECPs. Due to the presence of long-range distortions in planar ECP maps, it is more convenient to construct the maps over a spherical surface, which also facilitates the indexing of individual ECPs. A spherical ECP map for quartz is presented together with an example of its use in petrofabric analysis. © 1986.

  4. A LiDAR based analysis of hydraulic hazard mapping

    NASA Astrophysics Data System (ADS)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure, as it involves technical and socio-economic aspects. On the one hand, no dangerous areas should be excluded; on the other hand, it is important not to extend, beyond what is necessary, the surface area subject to use limitations. The availability of a high-resolution topographic survey nowadays allows this task to be faced with innovative procedures, both in the planning (mapping) phase and in the map validation phase. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore it does not intend in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume, which has the consequence of unrealistically treating the vegetation as a geometric obstacle to water flow. In some cases the construction of the topographic model requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e., the potential energy gradient associated with the talweg) is often inverted. In the second step the segments are analysed

  5. An electron tomography algorithm for reconstructing 3D morphology using surface tangents of projected scattering interfaces

    NASA Astrophysics Data System (ADS)

    Petersen, T. C.; Ringer, S. P.

    2010-03-01

    Upon discerning the mere shape of an imaged object, as portrayed by projected perimeters, the full three-dimensional scattering density may not be of particular interest. In this situation considerable simplifications to the reconstruction problem are possible, allowing calculations based upon geometric principles. Here we describe and provide an algorithm which reconstructs the three-dimensional morphology of specimens from tilt series of images for application to electron tomography. Our algorithm uses a differential approach to infer the intersection of projected tangent lines with surfaces which define boundaries between regions of different scattering densities within and around the perimeters of specimens. Details of the algorithm implementation are given and explained using reconstruction calculations from simulations, which are built into the code. An experimental application of the algorithm to a nano-sized Aluminium tip is also presented to demonstrate practical analysis for a real specimen.
    Program summary
    Program title: STOMO version 1.0
    Catalogue identifier: AEFS_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFS_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 2988
    No. of bytes in distributed program, including test data, etc.: 191 605
    Distribution format: tar.gz
    Programming language: C/C++
    Computer: PC
    Operating system: Windows XP
    RAM: Depends upon the size of experimental data as input, ranging from 200 Mb to 1.5 Gb
    Supplementary material: Sample output files, for the test run provided, are available.
    Classification: 7.4, 14
    External routines: Dev-C++ (http://www.bloodshed.net/devcpp.html)
    Nature of problem: Electron tomography of specimens for which conventional back projection may fail and/or data for which there is a limited angular

  6. Map-invariant spectral analysis for the identification of DNA periodicities

    PubMed Central

    2012-01-01

    Many signal-processing-based methods for finding hidden periodicities in DNA sequences have primarily focused on assigning numerical values to the symbolic DNA sequence and then applying spectral analysis tools such as the short-time discrete Fourier transform (ST-DFT) to locate these repeats. The key results pertaining to this approach are, however, obtained using a very specific symbolic-to-numerical map, namely the so-called Voss representation. An important research problem is therefore to quantify the sensitivity of these results to the choice of the symbolic-to-numerical map. In this article, a novel algebraic approach to the periodicity detection problem is presented; it provides a natural framework for studying the role of the symbolic-to-numerical map in finding these repeats. More specifically, we derive a new matrix-based expression of the DNA spectrum that comprises most of the widely used mappings in the literature as special cases, show that the DNA spectrum is in fact invariant under all these mappings, and give a necessary and sufficient condition for the invariance of the DNA spectrum to the symbolic-to-numerical map. Furthermore, the new algebraic framework decomposes the periodicity detection problem into several fundamental building blocks that are totally independent of each other. Sophisticated digital filters and/or alternative fast data transforms such as the discrete cosine and sine transforms can therefore always be incorporated in the periodicity detection scheme, regardless of the choice of the symbolic-to-numerical map. Although the newly proposed framework is matrix based, identification of these periodicities can be achieved at a low computational cost. PMID:23067324
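
    The Voss-representation baseline that the article starts from is easy to state in code: each base gets a binary indicator sequence and the four power spectra are summed. The Python sketch below illustrates that baseline (not the article's algebraic framework) on a toy sequence with exact period-3 structure.

    ```python
    import numpy as np

    def voss_spectrum(seq):
        """Power spectrum of a DNA sequence under the Voss (indicator) mapping.

        Each base is mapped to a binary indicator sequence; the squared DFT
        magnitudes of the four indicators are summed. A peak at frequency
        k/N = 1/3 signals the familiar period-3 structure of coding regions.
        """
        seq = seq.upper()
        spectrum = np.zeros(len(seq))
        for base in "ACGT":
            indicator = np.array([1.0 if s == base else 0.0 for s in seq])
            spectrum += np.abs(np.fft.fft(indicator)) ** 2
        return spectrum

    dna = "ATG" * 100                         # toy sequence, exact period 3
    S = voss_spectrum(dna)
    print(np.argmax(S[1:len(S) // 2]) + 1)    # expect len(dna) / 3 = 100
    ```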

  7. A Mathematical Analysis of Semantic Maps, with Theoretical and Applied Implications for Blended Learning Software

    ERIC Educational Resources Information Center

    Tang, Michael; David, Hyerle; Byrne, Roxanne; Tran, John

    2012-01-01

    This paper is a mathematical (Boolean) analysis of a set of cognitive maps called Thinking Maps[R], based on Albert Upton's semantic principles developed in his seminal works, Design for Thinking (1961) and Creative Analysis (1961). Albert Upton can be seen as a brilliant thinker who was before his time or after his time depending on the future of…

  8. Forecast Vienna Mapping Functions 1 for real-time analysis of space geodetic observations

    NASA Astrophysics Data System (ADS)

    Boehm, J.; Kouba, J.; Schuh, H.

    2009-05-01

    The Vienna Mapping Functions 1 (VMF1) as provided by the Institute of Geodesy and Geophysics (IGG) at the Vienna University of Technology are the most accurate mapping functions for the troposphere delays that are available globally and for the entire history of space geodetic observations. So far, the VMF1 coefficients have been released with a time delay of almost two days; however, many scientific applications require their availability in near real-time, e.g. the Ultra Rapid solutions of the International GNSS Service (IGS) or the analysis of the Intensive sessions of the International VLBI Service (IVS). Here we present coefficients of the VMF1 as well as the hydrostatic and wet zenith delays that have been determined from forecasting data of the European Centre for Medium-Range Weather Forecasts (ECMWF) and provided on global grids. The comparison with parameters derived from ECMWF analysis data shows that the agreement is at the 1 mm level in terms of station height, and that the differences are larger for the wet mapping functions than for the hydrostatic mapping functions and the hydrostatic zenith delays. These new products (VMF1-FC and hydrostatic zenith delays from forecast data) can be used in real-time analysis of geodetic data without significant loss of accuracy.
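
    For context, VMF1 belongs to the family of continued-fraction mapping functions of the elevation angle e; the generic form, in standard geodetic notation (shown for orientation, not quoted from the paper), is:

    ```latex
    % Continued-fraction (Herring-type) form of a troposphere mapping function,
    % with e the elevation angle and a, b, c the mapping-function coefficients:
    mf(e) \;=\;
    \frac{1 + \dfrac{a}{1 + \dfrac{b}{1 + c}}}
         {\sin e + \dfrac{a}{\sin e + \dfrac{b}{\sin e + c}}}
    ```

    In VMF1, as commonly described, the hydrostatic and wet a coefficients are the quantities determined from the numerical weather model and distributed on the global grids, while b and c are taken from empirical expressions; the grids discussed above additionally carry the hydrostatic and wet zenith delays.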

  9. CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.

    PubMed

    Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng

    2017-01-01

    Standardized terminology is a prerequisite for data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is therefore necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidential model to determine how terminologies are mapped to a standard system, like ICD-10. The model uses mappings from different health care organizations and evaluates the diversity of the mappings to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping results and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.

  10. Sensitivity analysis for parametric generalized implicit quasi-variational-like inclusions involving P-η-accretive mappings

    NASA Astrophysics Data System (ADS)

    Kazmi, K. R.; Khan, F. A.

    2008-01-01

    In this paper, using the proximal-point mapping technique of P-η-accretive mappings and the property of the fixed-point set of set-valued contractive mappings, we study the behavior and sensitivity analysis of the solution set of a parametric generalized implicit quasi-variational-like inclusion involving a P-η-accretive mapping in a real uniformly smooth Banach space. Further, under suitable conditions, we discuss the Lipschitz continuity of the solution set with respect to the parameter. The technique and results presented in this paper can be viewed as an extension of the techniques and corresponding results given in [R.P. Agarwal, Y.-J. Cho, N.-J. Huang, Sensitivity analysis for strongly nonlinear quasi-variational inclusions, Appl. Math. Lett. 13 (2002) 19-24; S. Dafermos, Sensitivity analysis in variational inequalities, Math. Oper. Res. 13 (1988) 421-434; X.-P. Ding, Sensitivity analysis for generalized nonlinear implicit quasi-variational inclusions, Appl. Math. Lett. 17 (2) (2004) 225-235; X.-P. Ding, Parametric completely generalized mixed implicit quasi-variational inclusions involving h-maximal monotone mappings, J. Comput. Appl. Math. 182 (2) (2005) 252-269; X.-P. Ding, C.L. Luo, On parametric generalized quasi-variational inequalities, J. Optim. Theory Appl. 100 (1999) 195-205; Z. Liu, L. Debnath, S.M. Kang, J.S. Ume, Sensitivity analysis for parametric completely generalized nonlinear implicit quasi-variational inclusions, J. Math. Anal. Appl. 277 (1) (2003) 142-154; R.N. Mukherjee, H.L. Verma, Sensitivity analysis of generalized variational inequalities, J. Math. Anal. Appl. 167 (1992) 299-304; M.A. Noor, Sensitivity analysis framework for general quasi-variational inclusions, Comput. Math. Appl. 44 (2002) 1175-1181; M.A. Noor, Sensitivity analysis for quasivariational inclusions, J. Math. Anal. Appl. 236 (1999) 290-299; J.Y. Park, J.U. Jeong, Parametric generalized mixed variational inequalities, Appl. Math. Lett. 17 (2004) 43-48].

  11. Mapping of Brain Activity by Automated Volume Analysis of Immediate Early Genes.

    PubMed

    Renier, Nicolas; Adams, Eliza L; Kirst, Christoph; Wu, Zhuhao; Azevedo, Ricardo; Kohl, Johannes; Autry, Anita E; Kadiri, Lolahon; Umadevi Venkataraju, Kannan; Zhou, Yu; Wang, Victoria X; Tang, Cheuk Y; Olsen, Olav; Dulac, Catherine; Osten, Pavel; Tessier-Lavigne, Marc

    2016-06-16

    Understanding how neural information is processed in physiological and pathological states would benefit from precise detection, localization, and quantification of the activity of all neurons across the entire brain, which has not, to date, been achieved in the mammalian brain. We introduce a pipeline for high-speed acquisition of brain activity at cellular resolution through profiling immediate early gene expression using immunostaining and light-sheet fluorescence imaging, followed by automated mapping and analysis of activity by an open-source software program we term ClearMap. We validate the pipeline first by analysis of brain regions activated in response to haloperidol. Next, we report new cortical regions downstream of whisker-evoked sensory processing during active exploration. Last, we combine activity mapping with axon tracing to uncover new brain regions differentially activated during parenting behavior. This pipeline is widely applicable to different experimental paradigms, including animal species for which transgenic activity reporters are not readily available. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Mapping of brain activity by automated volume analysis of immediate early genes

    PubMed Central

    Renier, Nicolas; Adams, Eliza L.; Kirst, Christoph; Wu, Zhuhao; Azevedo, Ricardo; Kohl, Johannes; Autry, Anita E.; Kadiri, Lolahon; Venkataraju, Kannan Umadevi; Zhou, Yu; Wang, Victoria X.; Tang, Cheuk Y.; Olsen, Olav; Dulac, Catherine; Osten, Pavel; Tessier-Lavigne, Marc

    2016-01-01

    Summary Understanding how neural information is processed in physiological and pathological states would benefit from precise detection, localization and quantification of the activity of all neurons across the entire brain, which has not to date been achieved in the mammalian brain. We introduce a pipeline for high speed acquisition of brain activity at cellular resolution through profiling immediate early gene expression using immunostaining and light-sheet fluorescence imaging, followed by automated mapping and analysis of activity by an open-source software program we term ClearMap. We validate the pipeline first by analysis of brain regions activated in response to Haloperidol. Next, we report new cortical regions downstream of whisker-evoked sensory processing during active exploration. Lastly, we combine activity mapping with axon tracing to uncover new brain regions differentially activated during parenting behavior. This pipeline is widely applicable to different experimental paradigms, including animal species for which transgenic activity reporters are not readily available. PMID:27238021

  13. Diffeomorphic Sulcal Shape Analysis on the Cortex

    PubMed Central

    Joshi, Shantanu H.; Cabeen, Ryan P.; Joshi, Anand A.; Sun, Bo; Dinov, Ivo; Narr, Katherine L.; Toga, Arthur W.; Woods, Roger P.

    2014-01-01

    We present a diffeomorphic approach for constructing intrinsic shape atlases of sulci on the human cortex. Sulci are represented as square-root velocity functions of continuous open curves in ℝ³, and their shapes are studied as functional representations of an infinite-dimensional sphere. This spherical manifold has some advantageous properties: it is equipped with a Riemannian metric on the tangent space and facilitates computational analyses and correspondences between sulcal shapes. Sulcal shape mapping is achieved by computing geodesics in the quotient space of shapes modulo scales, translations, rigid rotations and reparameterizations. The resulting sulcal shape atlas preserves important local geometry inherently present in the sample population. The sulcal shape atlas is integrated in a cortical registration framework and exhibits better geometric matching compared to the conventional Euclidean method. We demonstrate experimental results for sulcal shape mapping, cortical surface registration, and sulcal classification for two different surface extraction protocols for separate subject populations. PMID:22328177
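
    The square-root velocity representation mentioned above has a compact standard definition, reproduced here for orientation (generic notation, assuming curves parameterized on [0, 1]):

    ```latex
    % Square-root velocity function (SRVF) of an open curve f : [0,1] -> R^3:
    q(t) \;=\; \frac{\dot f(t)}{\sqrt{\lVert \dot f(t) \rVert}},
    \qquad
    \int_0^1 \lVert q(t) \rVert^{2}\,\mathrm{d}t
      \;=\; \int_0^1 \lVert \dot f(t) \rVert\,\mathrm{d}t
      \;=\; \mathrm{length}(f)
    ```

    Curves rescaled to unit length therefore have unit-norm SRVFs and lie on a sphere in the function space, which is the infinite-dimensional sphere referred to in the abstract.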

  14. Toward standardized mapping for left atrial analysis and cardiac ablation guidance

    NASA Astrophysics Data System (ADS)

    Rettmann, M. E.; Holmes, D. R.; Linte, C. A.; Packer, D. L.; Robb, R. A.

    2014-03-01

    In catheter-based cardiac ablation, the pulmonary vein ostia are important landmarks for guiding the ablation procedure, and for this reason, have been the focus of many studies quantifying their size, structure, and variability. Analysis of pulmonary vein structure, however, has been limited by the lack of a standardized reference space for population-based studies. Standardized maps are important tools for characterizing anatomic variability across subjects with the goal of separating normal inter-subject variability from abnormal variability associated with disease. In this work, we describe a novel technique for computing flat maps of left atrial anatomy in a standardized space. A flat map of left atrial anatomy is created by casting a single ray through the volume and systematically rotating the camera viewpoint to obtain the entire field of view. The technique is validated by assessing preservation of relative surface areas and distances between the original 3D geometry and the flat map geometry. The proposed methodology is demonstrated on 10 subjects, whose maps are subsequently combined to form a probabilistic map of anatomic location for each of the pulmonary vein ostia and the boundary of the left atrial appendage. The probabilistic map demonstrates that the locations of the inferior ostia have higher variability than those of the superior ostia, and that the variability of the left atrial appendage is similar to that of the superior pulmonary veins. This technique could also have potential application in mapping electrophysiology data, radio-frequency ablation burns, or treatment planning in cardiac ablation therapy.

  15. An Ontology for State Analysis: Formalizing the Mapping to SysML

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel

    2012-01-01

    State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.

  16. Unsupervised spatiotemporal analysis of fMRI data using graph-based visualizations of self-organizing maps.

    PubMed

    Katwal, Santosh B; Gore, John C; Marois, Rene; Rogers, Baxter P

    2013-09-01

    We present novel graph-based visualizations of self-organizing maps for unsupervised functional magnetic resonance imaging (fMRI) analysis. A self-organizing map is an artificial neural network model that transforms high-dimensional data into a low-dimensional (often a 2-D) map using unsupervised learning. However, a postprocessing scheme is necessary to correctly interpret similarity between neighboring node prototypes (feature vectors) on the output map and delineate clusters and features of interest in the data. In this paper, we used graph-based visualizations to capture fMRI data features based upon 1) the distribution of data across the receptive fields of the prototypes (density-based connectivity); and 2) temporal similarities (correlations) between the prototypes (correlation-based connectivity). We applied this approach to identify task-related brain areas in an fMRI reaction time experiment involving a visuo-manual response task, and we correlated the time-to-peak of the fMRI responses in these areas with reaction time. Visualization of self-organizing maps outperformed independent component analysis and voxelwise univariate linear regression analysis in identifying and classifying relevant brain regions. We conclude that the graph-based visualizations of self-organizing maps help in advanced visualization of cluster boundaries in fMRI data enabling the separation of regions with small differences in the timings of their brain responses.
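
    The post-processing idea above (connecting SOM prototypes by the correlation of their time courses) can be sketched with a small hand-rolled SOM. The code below is a hedged illustration using synthetic "voxel" time courses and a simple correlation threshold, not the authors' implementation; the grid size, threshold and learning schedule are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)

    def train_som(data, rows=4, cols=4, iters=2000, lr0=0.5, sigma0=2.0):
        """Minimal online SOM; data is (n_samples, n_features), returns (rows*cols, n_features) prototypes."""
        n, _ = data.shape
        protos = data[rng.choice(n, rows * cols, replace=False)].astype(float)
        grid = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
        for step in range(iters):
            x = data[rng.integers(n)]
            bmu = np.argmin(np.sum((protos - x) ** 2, axis=1))            # best-matching unit
            lr = lr0 * np.exp(-step / iters)
            sigma = sigma0 * np.exp(-step / iters)
            h = np.exp(-np.sum((grid - grid[bmu]) ** 2, axis=1) / (2.0 * sigma ** 2))
            protos += lr * h[:, None] * (x - protos)                       # pull the neighbourhood toward x
        return protos

    # synthetic "fMRI": 500 voxel time courses of length 120 built from two latent responses
    t = np.arange(120)
    latent = np.stack([np.sin(2 * np.pi * t / 30.0), np.sign(np.sin(2 * np.pi * t / 40.0))])
    data = latent[rng.integers(2, size=500)] + 0.5 * rng.normal(size=(500, 120))

    protos = train_som(data)

    # correlation-based connectivity: connect prototypes whose time courses correlate strongly
    corr = np.corrcoef(protos)
    adjacency = (corr > 0.6) & ~np.eye(len(protos), dtype=bool)
    print("edges in the prototype graph:", int(adjacency.sum()) // 2)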

  17. Analysis of neoplastic lesions in magnetic resonance imaging using self-organizing maps.

    PubMed

    Mei, Paulo Afonso; de Carvalho Carneiro, Cleyton; Fraser, Stephen J; Min, Li Li; Reis, Fabiano

    2015-12-15

    To provide an improved method for the identification and analysis of brain tumors in MRI scans using a semi-automated computational approach that has the potential to provide a more objective, precise and quantitatively rigorous analysis compared to human visual analysis. The Self-Organizing Map (SOM) is an unsupervised, exploratory data analysis tool that can automatically segment an image into self-similar regions or clusters based on measures of similarity. It can be used to perform segmentation of brain tissue on MR images without prior knowledge. We used SOM to analyze T1, T2 and FLAIR acquisitions from two MRI machines in our service from 14 patients with brain tumors confirmed by biopsy: three lymphomas, six glioblastomas, one meningioma, one ganglioglioma, two oligoastrocytomas and one astrocytoma. The SOM software was used to analyze the data from the three image acquisitions from each patient and generated a self-organized map for each, containing 25 clusters. Damaged tissue was separated from normal tissue using the SOM technique. Furthermore, in some cases it allowed separation of different areas within the tumor, such as edema/peritumoral infiltration and necrosis. In lesions with less precise boundaries in FLAIR, the estimated damaged tissue area in the resulting map appears larger. Our results showed that SOM has the potential to be a powerful MR imaging analysis technique for the assessment of brain tumors. Copyright © 2015. Published by Elsevier B.V.

  18. Numerical equilibrium analysis for structured consumer resource models.

    PubMed

    de Roos, A M; Diekmann, O; Getto, P; Kirkilionis, M A

    2010-02-01

    In this paper, we present methods for a numerical equilibrium and stability analysis for models of a size-structured population competing for an unstructured resource. We concentrate on cases where two model parameters are free, and thus existence boundaries for equilibria and stability boundaries can be defined in the (two-parameter) plane. We numerically trace these implicitly defined curves using alternating tangent prediction and Newton correction. Evaluation of the maps defining the curves involves integration over individual size and individual survival probability (and their derivatives) as functions of individual age. Such ingredients are often defined as solutions of ODEs, i.e., in general only implicitly. In our case, the right-hand sides of these ODEs feature discontinuities that are caused by an abrupt change of behavior at the size where juveniles are assumed to turn adult. So, we combine the numerical solution of these ODEs with curve-tracing methods. We have implemented the algorithms for "Daphnia consuming algae" models in C code. The results obtained by way of this implementation are shown in the form of graphs.
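
    The alternating tangent-prediction/Newton-correction step can be written down compactly for a single implicitly defined curve F(u) = 0 with u = (x, p). The sketch below is a generic pseudo-arclength continuation on a toy curve, not the authors' Daphnia implementation (which evaluates F through ODE integration); it only shows the two alternating stages.

    import numpy as np

    def trace_curve(F, jac, u0, steps=60, ds=0.1, newton_iters=10, tol=1e-10):
        """Trace F(u) = 0, u = (x, p), by alternating tangent prediction and Newton correction."""
        u = np.asarray(u0, dtype=float)
        points = [u.copy()]
        tangent_prev = np.array([0.0, 1.0])
        for _ in range(steps):
            J = jac(u)                                     # 1x2 Jacobian of F at the current point
            tangent = np.array([-J[0, 1], J[0, 0]])        # spans the null space of J in 2D
            tangent /= np.linalg.norm(tangent)
            if tangent @ tangent_prev < 0.0:               # keep a consistent orientation along the branch
                tangent = -tangent
            u_new = u + ds * tangent                       # tangent predictor
            for _ in range(newton_iters):                  # Newton corrector on the bordered system
                r = np.array([F(u_new), tangent @ (u_new - u) - ds])
                A = np.vstack([jac(u_new), tangent])
                u_new = u_new + np.linalg.solve(A, -r)
                if np.linalg.norm(r) < tol:
                    break
            u, tangent_prev = u_new, tangent
            points.append(u.copy())
        return np.array(points)

    # toy "existence boundary": the unit circle x^2 + p^2 - 1 = 0 in the (x, p) plane
    F = lambda u: u[0] ** 2 + u[1] ** 2 - 1.0
    jac = lambda u: np.array([[2.0 * u[0], 2.0 * u[1]]])
    branch = trace_curve(F, jac, u0=(1.0, 0.0))
    print(branch[:5])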

  19. Evaluation of linear discriminant analysis for automated Raman histological mapping of esophageal high-grade dysplasia

    NASA Astrophysics Data System (ADS)

    Hutchings, Joanne; Kendall, Catherine; Shepherd, Neil; Barr, Hugh; Stone, Nicholas

    2010-11-01

    Rapid Raman mapping has the potential to be used for automated histopathology diagnosis, providing an adjunct technique to histology diagnosis. The aim of this work is to evaluate the feasibility of automated and objective pathology classification of Raman maps using linear discriminant analysis. Raman maps of esophageal tissue sections are acquired. Principal component (PC)-fed linear discriminant analysis (LDA) is carried out using subsets of the Raman map data (6483 spectra). An overall (validated) training classification model performance of 97.7% (sensitivity 95.0 to 100% and specificity 98.6 to 100%) is obtained. The remainder of the map spectra (131,672 spectra) are projected onto the classification model, resulting in Raman images that demonstrate good correlation with contiguous hematoxylin and eosin (HE) sections. Initial results suggest that LDA has the potential to automate pathology diagnosis of esophageal Raman images, but since the classification of test spectra is forced into existing training groups, further work is required to optimize the training model. A small pixel size is advantageous for developing the training datasets from mapping data, despite lengthy mapping times, because of the additional morphological information gained, and could facilitate differentiation of further tissue groups, such as the basal cells/lamina propria, in the future; however, larger pixel sizes (and faster mapping) may be more feasible for clinical application.
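
    The PC-fed LDA classification step translates directly into a short scikit-learn pipeline. The snippet below is a sketch on placeholder "spectra"; the study's own software stack is not specified, so scikit-learn, the class count, and the 20-component truncation are assumptions of this edit.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # placeholder "Raman spectra": 600 spectra x 900 wavenumbers drawn from three tissue classes
    n_per_class, n_wavenumbers = 200, 900
    centers = rng.normal(size=(3, n_wavenumbers))
    X = np.vstack([c + 0.8 * rng.normal(size=(n_per_class, n_wavenumbers)) for c in centers])
    y = np.repeat([0, 1, 2], n_per_class)

    # PC-fed LDA: reduce to a handful of principal components, then fit linear discriminants
    model = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
    print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())

    # "projecting the remaining map spectra onto the model" is then model.fit(X, y).predict(new_spectra)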

  20. Admixture Aberration Analysis: Application to Mapping in Admixed Population Using Pooled DNA

    NASA Astrophysics Data System (ADS)

    Bercovici, Sivan; Geiger, Dan

    Admixture mapping is a gene mapping approach used for the identification of genomic regions harboring disease susceptibility genes in the case of recently admixed populations such as African Americans. We present a novel method for admixture mapping, called admixture aberration analysis (AAA), that uses a DNA pool of affected admixed individuals. We demonstrate through simulations that AAA is a powerful and economical mapping method under a range of scenarios, capturing complex human diseases such as hypertension and end stage kidney disease. The method has a low false-positive rate and is robust to deviation from model assumptions. Finally, we apply AAA on 600 prostate cancer-affected African Americans, replicating a known risk locus. Simulation results indicate that the method can yield over 96% reduction in genotyping. Our method is implemented as a Java program called AAAmap and is freely available.

  1. Surname distribution in France: a distance analysis by a distorted geographical map.

    PubMed

    Mourrieras, B; Darlu, P; Hochez, J; Hazout, S

    1995-01-01

    The distribution of surnames in 90 distinct regions in France during two successive periods, 1889-1915 and 1916-1940, is analysed from the civil birth registers of the 36,500 administrative units in France. A new approach, called 'Mobile Site Method' (MSM), is developed to allow representation of a surname distance matrix by a distorted geographical map. A surname distance matrix between the various regions in France is first calculated, then a distorted geographical map called the 'surname similarity map' is built up from the surname distances between regions. To interpret this map we draw (a) successive map contours obtained during the step-by-step distortion process, revealing zones of high surname dissimilarity, and (b) maps in grey levels representing the displacement magnitude, and allowing the segmentation of the geographical and surname maps into 'homogeneous surname zones'. By integrating geography and surname information in the same analysis, and by comparing results obtained for the two successive periods, the MSM approach produces convenient maps showing: (a) 'regionalism' of some peripheral populations such as Pays Basque, Alsace, Corsica and Brittany; (b) the presence of preferential axes of communications (Rhodanian corridor, Garonne valley); (c) barriers such as the Central Massif, Vosges; (d) the weak modifications of the distorted maps associated with the two periods studied suggest an extension (but limited) of the tendency of surname uniformity in France. These results are interpreted, in the nineteenth- and twentieth century context, as the consequences of a slow process of local migrations occurring over a long period of time.

  2. Cross-Population Joint Analysis of eQTLs: Fine Mapping and Functional Annotation

    PubMed Central

    Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Mapping expression quantitative trait loci (eQTLs) has been shown to be a powerful tool to uncover the genetic underpinnings of many complex traits at the molecular level. In this paper, we present an integrative analysis approach that leverages eQTL data collected from multiple population groups. In particular, our approach effectively identifies multiple independent cis-eQTL signals that are consistent across populations, accounting for population heterogeneity in allele frequencies and linkage disequilibrium patterns. Furthermore, by integrating genomic annotations, our analysis framework enables high-resolution functional analysis of eQTLs. We applied our statistical approach to analyze the GEUVADIS data consisting of samples from five population groups. From this analysis, we concluded that (i) joint analysis across population groups greatly improves the power of eQTL discovery and the resolution of fine mapping of causal eQTLs; (ii) many genes harbor multiple independent eQTLs in their cis regions; and (iii) genetic variants that disrupt transcription factor binding are significantly enriched in eQTLs (p-value = 4.93 × 10⁻²²). PMID:25906321

  3. Interactive remote data processing using Pixelize Wavelet Filtration (PWF-method) and PeriodMap analysis

    NASA Astrophysics Data System (ADS)

    Sych, Robert; Nakariakov, Valery; Anfinogentov, Sergey

    Wavelet analysis is suitable for investigating waves and oscillations in the solar atmosphere, which are limited in both time and frequency. We have developed an algorithm to detect such waves using Pixelize Wavelet Filtration (the PWF method). This method provides information about the presence of propagating and non-propagating waves in the observational data (image cubes) and localizes them precisely in time as well as in space. We tested the algorithm and found that the results of coronal wave detection are consistent with those obtained by visual inspection. For fast exploration of the data cube, we additionally applied the previously developed PeriodMap analysis. This method is based on the Fast Fourier Transform and, at an initial stage, quickly locates "hot" regions with peak harmonic oscillations and determines the spatial distribution at the significant harmonics. We propose splitting the detection of coronal waves into two parts: in the first part, we apply the PeriodMap analysis (fast preparation), and in the second part, we use the information about the spatial distribution of oscillation sources to apply the PWF method (slow preparation). Two algorithms are available for working with the data, in automatic and hands-on operation modes. In the first, we use multiple PWF analysis to prepare narrowband maps in frequency subbands (in factors of two) and/or harmonic PWF analysis for separate harmonics in a spectrum. In the second, we manually select the necessary spectral subband and temporal interval and then construct narrowband maps. For practical implementation of the proposed methods, we have developed a remote data processing system at the Institute of Solar-Terrestrial Physics, Irkutsk. The system is based on the data processing server at http://pwf.iszf.irk.ru. The main aim of this resource is the calculation, via remote access through a local and/or global network (Internet), of narrowband maps of wave sources both in the whole spectral band and at significant harmonics. In addition
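
    The PeriodMap idea as described (a per-pixel FFT used to locate "hot" regions and their dominant harmonics before the slower PWF stage) can be sketched in a few lines of numpy; the function below and its toy data cube are illustrative assumptions, not the Irkutsk server code.

    import numpy as np

    def period_map(cube, dt=1.0):
        """cube: (n_time, ny, nx) image sequence; returns the dominant period and its power per pixel."""
        n_t = cube.shape[0]
        detrended = cube - cube.mean(axis=0)                      # drop the zero-frequency component
        spec = np.abs(np.fft.rfft(detrended, axis=0)) ** 2
        freqs = np.fft.rfftfreq(n_t, d=dt)
        peak = np.argmax(spec[1:], axis=0) + 1                    # skip the DC bin
        peak_power = np.take_along_axis(spec, peak[None], axis=0)[0]
        peak_period = 1.0 / freqs[peak]
        return peak_period, peak_power

    # toy data cube: a 64 s oscillation confined to one corner of a 64x64 field of view (dt = 1 s)
    t = np.arange(256)[:, None, None]
    cube = 0.1 * np.random.default_rng(1).normal(size=(256, 64, 64))
    cube[:, :20, :20] += np.sin(2 * np.pi * t / 64.0)
    period, power = period_map(cube, dt=1.0)
    print("median dominant period in the active corner:", np.median(period[:20, :20]))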

  4. A Mapping from the Human Factors Analysis and Classification System (DOD-HFACS) to the Domains of Human Systems Integration (HSI)

    DTIC Science & Technology

    2009-11-01

    The available record consists of report-documentation and table-of-contents fragments only; it describes a mapping from the Human Factors Analysis and Classification System (DOD-HFACS) to the domains of Human Systems Integration (HSI).

  5. Syntax-directed content analysis of videotext: application to a map detection recognition system

    NASA Astrophysics Data System (ADS)

    Aradhye, Hrishikesh; Herson, James A.; Myers, Gregory

    2003-01-01

    Video is an increasingly important and ever-growing source of information to the intelligence and homeland defense analyst. A capability to automatically identify the contents of video imagery would enable the analyst to index relevant foreign and domestic news videos in a convenient and meaningful way. To this end, the proposed system aims to help determine the geographic focus of a news story directly from video imagery by detecting and geographically localizing political maps from news broadcasts, using the results of videotext recognition in lieu of a computationally expensive, scale-independent shape recognizer. Our novel method for the geographic localization of a map is based on the premise that the relative placement of text superimposed on a map roughly corresponds to the geographic coordinates of the locations the text represents. Our scheme extracts and recognizes videotext, and iteratively identifies the geographic area, while allowing for OCR errors and artistic freedom. The fast and reliable recognition of such maps by our system may provide valuable context and supporting evidence for other sources, such as speech recognition transcripts. The concepts of syntax-directed content analysis of videotext presented here can be extended to other content analysis systems.
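
    The paper's central premise, that the relative placement of superimposed text roughly tracks the geographic coordinates of the places it names, can be illustrated by a least-squares affine fit from OCR'd label positions to gazetteer longitude/latitude. The city labels and pixel coordinates below are hypothetical, and the actual system adds iterative matching that tolerates OCR errors and artistic freedom.

    import numpy as np

    def fit_affine(pixel_xy, lonlat):
        """Least-squares affine map [lon, lat] = [x, y, 1] @ P from matched label positions."""
        X = np.hstack([pixel_xy, np.ones((len(pixel_xy), 1))])    # (n, 3) design matrix
        P, *_ = np.linalg.lstsq(X, lonlat, rcond=None)            # (3, 2) affine parameters
        return P

    # hypothetical OCR'd city labels on a broadcast map (pixel coordinates, y grows downward)
    pixels = np.array([[120.0, 80.0], [340.0, 95.0], [210.0, 260.0], [400.0, 300.0]])
    # gazetteer longitude/latitude for London, Berlin, Paris, Vienna (approximate)
    lonlat = np.array([[-0.1, 51.5], [13.4, 52.5], [2.3, 48.9], [16.4, 48.2]])

    P = fit_affine(pixels, lonlat)
    query = np.array([[260.0, 170.0, 1.0]])                       # an unlabeled pixel on the same map
    print("estimated lon/lat of the query pixel:", query @ P)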

  6. Computer-aided analysis of Skylab scanner data for land use mapping, forestry and water resource applications

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1975-01-01

    Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.

  7. Connectome analysis for pre-operative brain mapping in neurosurgery

    PubMed Central

    Hart, Michael G.; Price, Stephen J.; Suckling, John

    2016-01-01

    Object: Brain mapping has entered a new era focusing on complex network connectivity. Central to this is the search for the connectome, or the brain's 'wiring diagram'. Graph theory analysis of the connectome allows understanding of the importance of regions to network function, and the consequences of their impairment or excision. Our goal was to apply connectome analysis in patients with brain tumours to characterise overall network topology and individual patterns of connectivity alterations. Methods: Resting-state functional MRI data were acquired using multi-echo, echo planar imaging pre-operatively from five participants, each with a right temporal–parietal–occipital glioblastoma. Complex networks analysis was initiated by parcellating the brain into anatomical regions, amongst which connections were identified by retaining the most significant correlations between the respective wavelet-decomposed time-series. Results: Key characteristics of complex networks described in healthy controls were preserved in these patients, including ubiquitous small-world organization. An exponentially truncated power law fit to the degree distribution predicted findings of general network robustness to injury but with a core of hubs exhibiting disproportionate vulnerability. Tumours produced a consistent reduction in local and long-range connectivity with distinct patterns of connection loss depending on lesion location. Conclusions: Connectome analysis is a feasible and novel approach to brain mapping in individual patients with brain tumours. Applications to pre-surgical planning include identifying regions critical to network function that should be preserved and visualising connections at risk from tumour resection. In the future one could use such data to model functional plasticity and recovery of cognitive deficits. PMID:27447756
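
    The graph-construction and graph-theory steps described above are easy to prototype. The sketch below uses an ordinary Pearson correlation matrix on synthetic regional time series as a stand-in for the wavelet correlations, an arbitrarily chosen 10% edge-density threshold, and networkx for the degree, clustering and path-length summaries; it is illustrative only, not the study's pipeline.

    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(0)

    # stand-in for a wavelet-correlation matrix between 90 parcellated regions
    ts = rng.normal(size=(90, 200))                      # 90 regional time series, 200 time points
    ts[:30] += 0.8 * rng.normal(size=200)                # inject one correlated "module"
    corr = np.corrcoef(ts)
    np.fill_diagonal(corr, 0.0)

    # keep the strongest 10% of connections and build the graph
    threshold = np.quantile(np.abs(corr), 0.90)
    G = nx.from_numpy_array((np.abs(corr) >= threshold).astype(int))

    degrees = np.array([d for _, d in G.degree()])
    print("mean degree:", degrees.mean())
    print("clustering coefficient:", nx.average_clustering(G))
    if nx.is_connected(G):
        print("characteristic path length:", nx.average_shortest_path_length(G))
    # hub candidates: regions in the heavy tail of the degree distribution
    print("highest-degree regions:", np.argsort(degrees)[-5:])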

  8. Quality and rigor of the concept mapping methodology: a pooled study analysis.

    PubMed

    Rosas, Scott R; Kane, Mary

    2012-05-01

    The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative characteristics and estimates of quality and rigor that may guide future studies are lacking. To address this gap, we conducted a pooled analysis of 69 concept mapping studies to describe characteristics across study phases, generate specific indicators of validity and reliability, and examine the relationship between select study characteristics and quality indicators. Individual study characteristics and estimates were pooled and quantitatively summarized, describing the distribution, variation and parameters of each. In addition, variation in concept mapping data collection in relation to these characteristics and estimates was examined. Overall, results suggest concept mapping yields strong internal representational validity and very strong sorting and rating reliability estimates. Validity and reliability were consistently high despite variation in participation and task completion percentages across data collection modes. The implications of these findings as a practical reference for assessing the quality and rigor of future concept mapping studies are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Mapping Informative Clusters in a Hierarchical Framework of fMRI Multivariate Analysis

    PubMed Central

    Xu, Rui; Zhen, Zonglei; Liu, Jia

    2010-01-01

    Pattern recognition methods have become increasingly popular in fMRI data analysis; they are powerful in discriminating between multi-voxel patterns of brain activity associated with different mental states. However, when they are used in functional brain mapping, the location of discriminative voxels varies significantly, raising difficulties in interpreting the locus of the effect. Here we proposed a hierarchical framework for multivariate analysis that maps informative clusters rather than voxels to achieve reliable functional brain mapping without compromising discriminative power. In particular, we first searched for local homogeneous clusters that consisted of voxels with similar response profiles. Then, a multi-voxel classifier was built for each cluster to extract discriminative information from the multi-voxel patterns. Finally, through multivariate ranking, outputs from the classifiers served as a multi-cluster pattern to identify informative clusters by examining interactions among clusters. Results from both simulated and real fMRI data demonstrated that this hierarchical approach showed better performance in the robustness of functional brain mapping than traditional voxel-based multivariate methods. In addition, the mapped clusters were highly overlapping for two perceptually equivalent object categories, further confirming the validity of our approach. In short, the hierarchical multivariate framework is suitable for both pattern classification and brain mapping in fMRI studies. PMID:21152081

  10. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    PubMed

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.

  11. Groundwater favorability map using GIS multicriteria data analysis on crystalline terrain, São Paulo State, Brazil

    NASA Astrophysics Data System (ADS)

    Madrucci, Vanessa; Taioli, Fabio; de Araújo, Carlos César

    2008-08-01

    This paper presents groundwater favorability mapping on fractured terrain in the eastern portion of São Paulo State, Brazil. Remote sensing, airborne geophysical data, photogeologic interpretation, geologic and geomorphologic maps and geographic information system (GIS) techniques have been used. The results of cross-tabulation between these maps and well yield data allowed the definition of groundwater prospective parameters in a fractured-bedrock aquifer. These prospective parameters are the basis for the favorability analysis, whose principle is based on the knowledge-driven method. A multicriteria analysis (weighted linear combination) was carried out to give a groundwater favorability map, because the prospective parameters have different weights of importance and different classes within each parameter. The groundwater favorability map was tested by cross-tabulation with new well yield data and spring occurrence. The wells with the highest productivity, as well as all spring occurrences, are situated in the areas mapped as excellent and good favorability. This shows good coherence between the prospective parameters and the well yield, and the importance of GIS techniques for the definition of target areas for detailed study and well location.
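
    The weighted linear combination itself is a one-line raster operation once each prospective parameter has been reclassified to a common suitability scale. The factor names, weights and quartile classes in the sketch below are hypothetical placeholders, not the weights used in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    shape = (200, 200)

    # hypothetical factor rasters, each already reclassified to a 0-1 suitability score
    factors = {
        "lineament_density": rng.random(shape),
        "lithology_score":   rng.random(shape),
        "slope_score":       rng.random(shape),
        "drainage_density":  rng.random(shape),
    }
    # hypothetical knowledge-driven weights; they must sum to 1 in a weighted linear combination
    weights = {"lineament_density": 0.4, "lithology_score": 0.3,
               "slope_score": 0.2, "drainage_density": 0.1}

    favorability = sum(weights[name] * raster for name, raster in factors.items())

    # rank into four favorability classes (poor / moderate / good / excellent) by quartile
    classes = np.digitize(favorability, np.quantile(favorability, [0.25, 0.5, 0.75]))
    print("share of cells in the 'excellent' class:", (classes == 3).mean())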

  12. Parametric mapping using spectral analysis for 11C-PBR28 PET reveals neuroinflammation in mild cognitive impairment subjects.

    PubMed

    Fan, Zhen; Dani, Melanie; Femminella, Grazia D; Wood, Melanie; Calsolaro, Valeria; Veronese, Mattia; Turkheimer, Federico; Gentleman, Steve; Brooks, David J; Hinz, Rainer; Edison, Paul

    2018-07-01

    Neuroinflammation and microglial activation play an important role in amnestic mild cognitive impairment (MCI) and Alzheimer's disease. In this study, we investigated the spatial distribution of neuroinflammation in MCI subjects, using spectral analysis (SA) to generate parametric maps and quantify 11C-PBR28 PET, and compared these with compartmental and other kinetic models of quantification. Thirteen MCI and nine healthy controls (HC) were enrolled in this study. Subjects underwent 11C-PBR28 PET scans with arterial cannulation. Spectral analysis with an arterial plasma input function was used to generate 11C-PBR28 parametric maps. These maps were then compared with regional 11C-PBR28 VT (volume of distribution) using a two-tissue compartment model and Logan graphic analysis. Amyloid load was also assessed with 18F-Flutemetamol PET. With SA, three component peaks were identified in addition to blood volume. The 11C-PBR28 impulse response function (IRF) at 90 min produced the lowest coefficient of variation. Single-subject analysis using this IRF demonstrated microglial activation in five out of seven amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake revealed a group-wise significant increase in neuroinflammation in amyloid-positive MCI subjects versus HC in multiple cortical association areas, and particularly in the temporal lobe. Interestingly, compartmental analysis detected a group-wise increase in 11C-PBR28 binding in the thalamus of amyloid-positive MCI subjects, while Logan parametric maps did not perform well. This study demonstrates for the first time that spectral analysis can be used to generate parametric maps of 11C-PBR28 uptake, and is able to detect microglial activation in amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake allow voxel-wise single-subject analysis and could be used to evaluate microglial activation in individual subjects.

  13. Advanced electrophysiologic mapping systems: an evidence-based analysis.

    PubMed

    2006-01-01

    any of the advanced systems to fluoroscopy-guided ablation of tachycardia. English-language studies with sample sizes greater than or equal to 20 that were published between 2000 and 2005 were included. Observational studies on the safety of advanced mapping systems and fluoroscopy were also included. Outcomes of interest were acute success, defined as termination of arrhythmia immediately following ablation; long-term success, defined as being arrhythmia free at follow-up; total procedure time; fluoroscopy time; radiation dose; number of radiofrequency pulses; complications; cost; and the cost-effectiveness ratio. Quality of the individual studies was assessed using established criteria. Quality of the overall evidence was determined by applying the GRADE evaluation system. (3) Qualitative synthesis of the data was performed. Quantitative analysis using Revman 4.2 was performed when appropriate. Quality of the studies: Thirty-four studies met the inclusion criteria. These comprised 18 studies on CARTO (4 randomized controlled trials [RCTs] and 14 non-RCTs), 3 RCTs on EnSite NavX, 4 studies on the LocaLisa Navigational System (1 RCT and 3 non-RCTs), 2 studies on EnSite and CARTO, 1 on the Polar Constellation basket catheter, and 7 studies on radiation safety. The quality of the studies ranged from moderate to low. Most of the studies had small sample sizes with selection bias, and there was no blinding of patients or care providers in any of the studies. Duration of follow-up ranged from 6 weeks to 29 months, with most having at least 6 months of follow-up. There was heterogeneity with respect to the approach to ablation, definition of success, and drug management before and after the ablation procedure. Evidence is based on a small number of small RCTs and non-RCTs with methodological flaws. Advanced nonfluoroscopy mapping/navigation systems provided real-time 3-dimensional images with integration of anatomic and electrical potential information that enable better visualization of

  14. Dynamic Analysis of the Carotid-Kundalini Map

    NASA Astrophysics Data System (ADS)

    Wang, Xingyuan; Liang, Qingyong; Meng, Juan

    The nature of the fixed points of the Carotid-Kundalini (C-K) map was studied, and the boundary equation of the first bifurcation of the C-K map in the parameter plane is presented. Using quantitative criteria and rules for chaotic systems, the paper reveals the general features of the C-K map as it passes from regularity to chaos. The following conclusions are obtained: (i) chaotic patterns of the C-K map may emerge out of double-periodic bifurcation; (ii) chaotic crisis phenomena are found. The authors also analyzed the orbit of the critical point of the complex C-K map and put forward a definition of the Mandelbrot-Julia sets of the complex C-K map. The authors generalized Welstead and Cromer's periodic scanning technique and, using this technique, constructed a series of Mandelbrot-Julia sets of the complex C-K map. Based on the experimental-mathematics approach of combining the theory of analytic functions of one complex variable with computer-aided drawing, the authors investigated the symmetry of the Mandelbrot-Julia sets, studied the topological inflexibility of the distribution of the periodic regions in the Mandelbrot set, and found that the Mandelbrot set contains abundant information about the structure of the Julia sets by qualitatively recovering the whole portrait of the Julia sets from the Mandelbrot set.
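
    For readers who want to reproduce the route from regularity to chaos numerically, the sketch below draws a bifurcation diagram for an iteration of the Carotid-Kundalini form x_{n+1} = cos(λ · x_n · arccos(x_n)). That explicit form, the initial condition and the parameter range are assumptions of this edit (the commonly quoted C-K function), not details taken from the paper.

    import numpy as np
    import matplotlib.pyplot as plt

    def ck(x, lam):
        """One step of the assumed Carotid-Kundalini iteration x -> cos(lam * x * arccos(x))."""
        return np.cos(lam * x * np.arccos(x))

    lams = np.linspace(0.0, 5.0, 1000)
    x = np.full_like(lams, 0.5)                  # one orbit per parameter value
    for _ in range(500):                         # discard the transient
        x = ck(x, lams)

    lam_pts, x_pts = [], []
    for _ in range(200):                         # record the attractor
        x = ck(x, lams)
        lam_pts.append(lams.copy())
        x_pts.append(x.copy())

    plt.plot(np.concatenate(lam_pts), np.concatenate(x_pts), ",k", alpha=0.3)
    plt.xlabel("$\\lambda$")
    plt.ylabel("$x$")
    plt.title("Bifurcation diagram of the assumed C-K iteration")
    plt.show()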

  15. Topographic map analysis to determine Arjuno-Welirang volcanostratigraphy and implication for geothermal exploration

    NASA Astrophysics Data System (ADS)

    Apriani, Lestari; Satriana, Joshua; Aulian Chalik, Citra; Syahputra Mulyana, Reza; Hafidz, Muhammad; Suryantini

    2017-12-01

    Volcanostratigraphic study supports geothermal exploration during preliminary surveys. It is important for identifying volcanic eruption centers, which indicate potential locations of a geothermal heat source. The purpose of the volcanostratigraphy study in the research area is to distinguish the characteristics of the volcanic eruption products that construct the volcanic body. The Arjuno-Welirang volcanostratigraphy analysis is based on the topographic maps of the Malang sheet at 1:100,000 and 1:50,000 scales and a geological map. Based on the delineation of ridges and rivers, we determined five crowns, three hummocks, one brigade and one super brigade. The crowns consist of Ringgit, Welirang, Arjuno, Kawi, and Penanggungan; the hummocks comprise Kembar III, Kembar II, and Kembar I; the brigade is Arjuno-Welirang; and the super brigade is Tengger. Topographic map interpretation and geothermal prospect evaluation indicate that the Arjuno-Welirang prospect area has good geothermal resource potential.

  16. Concept Maps as Instructional Tools for Improving Learning of Phase Transitions in Object-Oriented Analysis and Design

    ERIC Educational Resources Information Center

    Shin, Shin-Shing

    2016-01-01

    Students attending object-oriented analysis and design (OOAD) courses typically encounter difficulties transitioning from requirements analysis to logical design and then to physical design. Concept maps have been widely used in studies of user learning. The study reported here, based on the relationship of concept maps to learning theory and…

  17. Application of automated multispectral analysis to Delaware's coastal vegetation mapping

    NASA Technical Reports Server (NTRS)

    Klemas, V. (Principal Investigator); Daiber, D.; Bartlett, D. S.; Crichton, O. W.; Fornes, A. O.

    1973-01-01

    There are no author-identified significant results in this report. Overlay maps of Delaware's wetlands have been prepared, showing the dominant species or group of species of vegetation present. Five such categories of vegetation were used indicating marshes dominated by: (1) salt marsh cord grass; (2) salt marsh hay and spike grass; (3) reed grass; (4) high tide bush and sea myrtle; and (5) a group of fresh water species found in impounded areas built to attract water fowl. Fifteen such maps cover Delaware's wetlands from the Pennsylvania to the Maryland borders. The mapping technique employed utilizes the General Electric multispectral data processing system. This system is a hybrid analog-digital system designed as an analysis tool to be used by an operator whose own judgment and knowledge of ground truth can be incorporated at any time into the analyzing process. The result is a high speed, cost effective method for producing enhanced photomaps showing a number of spectral classes, each enhanced spectral class being representative of a vegetative species or group of species.

  18. Task-evoked brain functional magnetic susceptibility mapping by independent component analysis (χICA).

    PubMed

    Chen, Zikuan; Calhoun, Vince D

    2016-03-01

    Conventionally, independent component analysis (ICA) is performed on an fMRI magnitude dataset to analyze brain functional mapping (AICA). By solving the inverse problem of fMRI, we can reconstruct the brain magnetic susceptibility (χ) functional states. Upon the reconstructed χ dataspace, we propose an ICA-based brain functional χ mapping method (χICA) to extract task-evoked brain functional maps. A complex division algorithm is applied to a time series of fMRI phase images to extract temporal phase changes (relative to an OFF-state snapshot). A computed inverse MRI (CIMRI) model is used to reconstruct a 4D brain χ response dataset. χICA is implemented by applying a spatial InfoMax ICA algorithm to the reconstructed 4D χ dataspace. With finger-tapping experiments on a 7T system, the χICA-extracted χ-depicted functional map is similar to the SPM-inferred functional χ map, with a spatial correlation of 0.67 ± 0.05. In comparison, the AICA-extracted magnitude-depicted map is correlated with the SPM magnitude map at 0.81 ± 0.05. Understanding why χICA is inferior to AICA for task-evoked functional mapping is an ongoing research topic. For task-evoked brain functional mapping, we compare the data-driven ICA method with the task-correlated SPM method. In particular, we compare χICA with AICA for extracting task-correlated timecourses and functional maps. χICA can extract a χ-depicted task-evoked brain functional map from a reconstructed χ dataspace without knowledge about brain hemodynamic responses. The χICA-extracted brain functional χ map reveals a bidirectional BOLD response pattern that is unavailable (or different) from AICA. Copyright © 2016 Elsevier B.V. All rights reserved.
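
    The spatial-ICA step (decomposing a reconstructed 4D dataset into spatial maps and associated time courses) can be sketched with scikit-learn's FastICA as a stand-in for the spatial InfoMax algorithm named above. The synthetic block-design data, the component count, and the task-correlation selection below are all illustrative assumptions, and the CIMRI χ-reconstruction step is not shown.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)

    # stand-in for a reconstructed 4D chi dataset: 100 volumes of 20 x 20 x 10 voxels
    n_t, nx, ny, nz = 100, 20, 20, 10
    task = (np.sin(2 * np.pi * np.arange(n_t) / 20.0) > 0).astype(float)   # block-design time course
    truth = np.zeros((nx, ny, nz))
    truth[5:10, 5:10, 3:6] = 1.0                                            # the "activated" region
    data = task[:, None] * truth.reshape(1, -1) + 0.3 * rng.normal(size=(n_t, nx * ny * nz))

    # spatial ICA: treat voxels as samples, i.e. decompose the (space x time) matrix
    ica = FastICA(n_components=10, random_state=0, max_iter=1000)
    spatial_maps = ica.fit_transform(data.T)        # (n_voxels, n_components) independent spatial maps
    time_courses = ica.mixing_                      # (n_time, n_components) associated time courses

    # for illustration only: pick the component whose time course best matches the task
    r = [abs(np.corrcoef(task, tc)[0, 1]) for tc in time_courses.T]
    best = int(np.argmax(r))
    print("component", best, "matches the task with |r| =", round(max(r), 2))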

  19. Mapping forest inventory and analysis data attributes within the framework of double sampling for stratification design

    Treesearch

    David C. Chojnacky; Randolph H. Wynne; Christine E. Blinn

    2009-01-01

    Methodology is lacking to easily map Forest Inventory and Analysis (FIA) inventory statistics for all attribute variables without having to develop separate models and methods for each variable. We developed a mapping method that can directly transfer tabular data to a map on which pixels can be added any way desired to estimate carbon (or any other variable) for a...

  20. GIS and statistical analysis for landslide susceptibility mapping in the Daunia area, Italy

    NASA Astrophysics Data System (ADS)

    Mancini, F.; Ceppi, C.; Ritrovato, G.

    2010-09-01

    This study focuses on landslide susceptibility mapping in the Daunia area (Apulian Apennines, Italy) and achieves this by using a multivariate statistical method and data processing in a Geographical Information System (GIS). The Logistic Regression (hereafter LR) method was chosen to produce a susceptibility map over an area of 130 000 ha where small settlements are historically threatened by landslide phenomena. By means of LR analysis, the tendency to landslide occurrences was, therefore, assessed by relating a landslide inventory (dependent variable) to a series of causal factors (independent variables) which were managed in the GIS, while the statistical analyses were performed by means of the SPSS (Statistical Package for the Social Sciences) software. The LR analysis produced a reliable susceptibility map of the investigated area and the probability level of landslide occurrence was ranked in four classes. The overall performance achieved by the LR analysis was assessed by local comparison between the expected susceptibility and an independent dataset extrapolated from the landslide inventory. Of the samples classified as susceptible to landslide occurrences, 85% correspond to areas where landslide phenomena have actually occurred. In addition, the consideration of the regression coefficients provided by the analysis demonstrated that a major role is played by the "land cover" and "lithology" causal factors in determining the occurrence and distribution of landslide phenomena in the Apulian Apennines.
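
    A minimal stand-in for the LR susceptibility workflow (stack causal-factor rasters, fit a logistic regression against the landslide inventory, map the predicted probability, and rank it into four classes) is sketched below with scikit-learn on synthetic rasters. The factor set, coefficients and class breaks are hypothetical, and categorical factors would normally be one-hot encoded rather than treated as ordinal.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    shape = (300, 300)

    # hypothetical causal-factor rasters (the paper used lithology, land cover, slope, and others)
    slope = rng.random(shape)
    land_cover = rng.integers(0, 4, shape)             # categorical, treated as ordinal only for brevity
    lithology = rng.integers(0, 3, shape)

    # hypothetical landslide inventory: failures more likely on steep slopes and one lithology class
    p_true = 1.0 / (1.0 + np.exp(-(4.0 * slope + 1.5 * (lithology == 2) - 3.0)))
    inventory = rng.random(shape) < p_true

    X = np.column_stack([slope.ravel(), land_cover.ravel(), lithology.ravel()])
    y = inventory.ravel().astype(int)

    lr = LogisticRegression(max_iter=1000).fit(X, y)
    susceptibility = lr.predict_proba(X)[:, 1].reshape(shape)

    # rank the predicted probability of occurrence into four susceptibility classes
    classes = np.digitize(susceptibility, [0.25, 0.5, 0.75])
    print("coefficients:", dict(zip(["slope", "land_cover", "lithology"], lr.coef_[0].round(2))))
    print("share of cells in the highest class:", (classes == 3).mean())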

  1. Classification and assessment of retrieved electron density maps in coherent X-ray diffraction imaging using multivariate analysis.

    PubMed

    Sekiguchi, Yuki; Oroguchi, Tomotaka; Nakasako, Masayoshi

    2016-01-01

    Coherent X-ray diffraction imaging (CXDI) is one of the techniques used to visualize structures of non-crystalline particles of micrometer to submicrometer size from materials and biological science. In the structural analysis of CXDI, the electron density map of a sample particle can theoretically be reconstructed from a diffraction pattern by using phase-retrieval (PR) algorithms. However, in practice, the reconstruction is difficult because diffraction patterns are affected by Poisson noise and missing data in small-angle regions due to the beam stop and the saturation of detector pixels. In contrast to X-ray protein crystallography, in which the phases of diffracted waves are experimentally estimated, phase retrieval in CXDI relies entirely on the computational procedure driven by the PR algorithms. Thus, objective criteria and methods to assess the accuracy of retrieved electron density maps are necessary in addition to conventional parameters monitoring the convergence of PR calculations. Here, a data analysis scheme, named ASURA, is proposed which selects the most probable electron density maps from a set of maps retrieved from 1000 different random seeds for a diffraction pattern. Each electron density map composed of J pixels is expressed as a point in a J-dimensional space. Principal component analysis is applied to describe characteristics in the distribution of the maps in the J-dimensional space. When the distribution is characterized by a small number of principal components, the distribution is classified using the k-means clustering method. The classified maps are evaluated by several parameters to assess the quality of the maps. Using the proposed scheme, structure analysis of a diffraction pattern from a non-crystalline particle is conducted in two stages: estimation of the overall shape and determination of the fine structure inside the support shape. In each stage, the most accurate and probable density maps are objectively selected. The validity
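
    The selection step of the scheme (each retrieved map treated as a point in J-dimensional space, described by a few principal components and then clustered) maps onto a very short scikit-learn sketch. The synthetic "retrieved maps", the two-cluster choice and the keep-the-largest-cluster rule below are simplifying assumptions rather than the published ASURA criteria.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # stand-in for 1000 retrieved electron density maps of J = 64 * 64 pixels each
    J = 64 * 64
    truth = np.zeros((64, 64))
    truth[20:44, 20:44] = 1.0
    maps = truth.ravel() + 0.4 * rng.normal(size=(1000, J))
    maps[::5] = rng.normal(size=(200, J))               # every fifth run "failed" to converge

    # describe the distribution of maps in J-dimensional space with a few principal components
    scores = PCA(n_components=5).fit_transform(maps)

    # classify the maps and keep the largest (most reproducible) cluster as the probable solution
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
    best_cluster = np.bincount(labels).argmax()
    consensus = maps[labels == best_cluster].mean(axis=0).reshape(64, 64)
    print("maps in the selected cluster:", int((labels == best_cluster).sum()))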

  2. Linkage analysis by genotyping of sibling populations: a genetic map for the potato cyst nematode constructed using a "pseudo-F2" mapping strategy.

    PubMed

    Rouppe van der Voort, J N; van Eck, H J; van Zandvoort, P M; Overmars, H; Helder, J; Bakker, J

    1999-07-01

    A mapping strategy is described for the construction of a linkage map of a non-inbred species in which individual offspring genotypes are not amenable to marker analysis. After one extra generation of random mating, the segregating progeny was propagated, and bulked populations of offspring were analyzed. Although the resulting population structure is different from that of commonly used mapping populations, we show that the maximum likelihood formula for a normal F2 is applicable for the estimation of recombination. This "pseudo-F2" mapping strategy, in combination with the development of an AFLP assay for single cysts, facilitated the construction of a linkage map for the potato cyst nematode Globodera rostochiensis. Using 12 pre-selected AFLP primer combinations, a total of 66 segregating markers were identified, 62 of which were mapped to nine linkage groups. These 62 AFLP markers are randomly distributed and cover about 65% of the genome. An estimate of the physical size of the Globodera genome was obtained from comparisons of the number of AFLP fragments obtained with the values for Caenorhabditis elegans. The methodology presented here resulted in the first genomic map for a cyst nematode. The low value of the kilobase/centimorgan (kb/cM) ratio for the Globodera genome will facilitate map-based cloning of genes that mediate the interaction between the nematode and its host plant.

  3. Morphometric analysis and neuroanatomical mapping of the zebrafish brain.

    PubMed

    Gupta, Tripti; Marquart, Gregory D; Horstick, Eric J; Tabor, Kathryn M; Pajevic, Sinisa; Burgess, Harold A

    2018-06-21

    Large-scale genomic studies have recently identified genetic variants causative for major neurodevelopmental disorders, such as intellectual disability and autism. However, determining how underlying developmental processes are affected by these mutations remains a significant challenge in the field. Zebrafish is an established model system in developmental neurogenetics that may be useful in uncovering the mechanisms of these mutations. Here we describe the use of voxel-intensity, deformation field, and volume-based morphometric techniques for the systematic and unbiased analysis of gene knock-down and environmental exposure-induced phenotypes in zebrafish. We first present a computational method for brain segmentation based on transgene expression patterns to create a comprehensive neuroanatomical map. This map allowed us to disclose statistically significant changes in brain microstructure and composition in neurodevelopmental models. We demonstrate the effectiveness of morphometric techniques in measuring changes in the relative size of neuroanatomical subdivisions in atoh7 morphant larvae and in identifying phenotypes in larvae treated with valproic acid, a chemical demonstrated to increase the risk of autism in humans. These tools enable rigorous evaluation of the effects of gene mutations and environmental exposures on neural development, providing an entry point for cellular and molecular analysis of basic developmental processes as well as neurodevelopmental and neurodegenerative disorders. Published by Elsevier Inc.

  4. An experimental study of the effects of aft blowing on a 3.0 caliber tangent ogive body at high angles of attack

    NASA Technical Reports Server (NTRS)

    Gittner, Nathan M.; Chokani, Ndaona

    1991-01-01

    An experimental study of the effects of aft blowing on the forebody vortex asymmetry over a 3.0 caliber tangent ogive body at high angles of attack was conducted. The tip of the ogive body was equipped with a single blowing nozzle whose position could be adjusted. The tests were conducted in a subsonic wind tunnel at laminar flow conditions. The effects of model roll, angle of attack, blowing coefficient, and blowing nozzle axial position were independently studied. Surface pressure measurements and flow visualization results were obtained. Aft blowing was observed to alleviate the degree of vortex asymmetry at all angles of attack. The blowing was found to be more effective at the higher angles of attack. However, proportional control of the degree of vortex asymmetry was not observed, because the initial flowfield was highly asymmetric.

  5. Stakeholder analysis and mapping as targeted communication strategy.

    PubMed

    Shirey, Maria R

    2012-09-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author highlights the importance of stakeholder theory and discusses how to apply the theory to conduct a stakeholder analysis. This article also provides an explanation of how to use related stakeholder mapping techniques with targeted communication strategies.

  6. Numerical study of a thermally stratified flow of a tangent hyperbolic fluid induced by a stretching cylindrical surface

    NASA Astrophysics Data System (ADS)

    Ur Rehman, Khali; Ali Khan, Abid; Malik, M. Y.; Hussain, Arif

    2017-09-01

    The effects of temperature stratification on a tangent hyperbolic fluid flow over a stretching cylindrical surface are studied. The fluid flow is modelled under the no-slip condition. The mathematical modelling of the physical problem yields a nonlinear set of partial differential equations. The obtained partial differential equations are converted into ordinary differential equations. A numerical investigation is carried out to identify the effects of the involved physical parameters on the dimensionless velocity and temperature profiles. In the presence of temperature stratification, it is noticed that the curvature parameter makes both the fluid velocity and the fluid temperature increase. In addition, positive variations in the thermal stratification parameter produce retardation of the fluid flow, and as a result the fluid temperature drops. The skin friction coefficient shows a decreasing trend for increasing values of both the power-law index and the Weissenberg number, whereas the local Nusselt number is an increasing function of the Prandtl number; the opposite trend is found with respect to the thermal stratification parameter. The obtained results are validated by comparison with the existing literature, which supports the presently developed model.

  7. mapDIA: Preprocessing and statistical analysis of quantitative proteomics data from data independent acquisition mass spectrometry.

    PubMed

    Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon

    2015-11-03

    Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for 14-3-3β dynamic interaction network and prostate cancer glycoproteome. The software was written in C++ language and the source code is available for free through SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
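
    The first two stages of the workflow (normalization by total intensity sums, then fragment selection against the peptide's consensus profile) are easy to sketch. The toy table, the log2 scale and the 0.5 correlation cut-off below are illustrative assumptions; mapDIA's actual selection and model-based significance analysis are more elaborate.

    import numpy as np

    rng = np.random.default_rng(0)

    # toy fragment-level table: 8 fragments of one peptide quantified across 6 samples
    intensities = rng.lognormal(mean=10.0, sigma=0.3, size=(8, 6))
    sample_totals = intensities.sum(axis=0)

    # step 1: normalization by total intensity sums (the package also offers local, retention-time sums)
    normalized = intensities / sample_totals * sample_totals.mean()

    # step 2: fragment selection - drop fragments that do not track the peptide's consensus profile
    log_norm = np.log2(normalized)
    consensus = np.median(log_norm, axis=0)
    corr = np.array([np.corrcoef(frag, consensus)[0, 1] for frag in log_norm])
    selected = normalized[corr > 0.5]
    print("fragments kept:", len(selected), "of", len(normalized))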

  8. Time Series Analysis OF SAR Image Fractal Maps: The Somma-Vesuvio Volcanic Complex Case Study

    NASA Astrophysics Data System (ADS)

    Pepe, Antonio; De Luca, Claudio; Di Martino, Gerardo; Iodice, Antonio; Manzo, Mariarosaria; Pepe, Susi; Riccio, Daniele; Ruello, Giuseppe; Sansosti, Eugenio; Zinno, Ivana

    2016-04-01

    The fractal dimension is a significant geophysical parameter describing natural surfaces; it represents the distribution of roughness over different spatial scales and, in the case of volcanic structures, has been related to the specific nature of materials and to the effects of active geodynamic processes. In this work, we present the analysis of the temporal behavior of the fractal dimension estimates generated from multi-pass SAR images relevant to the Somma-Vesuvio volcanic complex (South Italy). To this aim, we consider a Cosmo-SkyMed data-set of 42 stripmap images acquired from ascending orbits between October 2009 and December 2012. Starting from these images, we generate a three-dimensional stack composed of the corresponding fractal maps (ordered according to the acquisition dates), after a proper co-registration. The time series of the pixel-by-pixel estimated fractal dimension values show that, over invariant natural areas, the fractal dimension values do not reveal significant changes; on the contrary, over urban areas, the estimate correctly assumes values outside the fractality range of natural surfaces and shows strong fluctuations. As a final result of our analysis, we generate a fractal map that includes only the areas where the fractal dimension is considered reliable and stable (i.e., whose standard deviation computed over the time series is reasonably small). The so-obtained fractal dimension map is then used to identify areas that are homogeneous from a fractal viewpoint. Indeed, the analysis of this map reveals the presence of two distinctive landscape units corresponding to the Mt. Vesuvio and Gran Cono. The comparison with the (simplified) geological map clearly shows the presence in these two areas of volcanic products of different age. The presented fractal dimension map analysis demonstrates the ability to gauge the degree of evolution of the monitored volcanic edifice and can be profitably extended in the future to other volcanic systems with
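
    The stability-masking step described at the end (keep only pixels whose fractal dimension is stable over the time series and plausible for natural surfaces) reduces to a few array operations once the per-date fractal maps exist. The dimension values, thresholds and the synthetic "urban" block in the sketch below are illustrative assumptions, not the estimator or thresholds used in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # stand-in for a co-registered stack of per-date fractal-dimension maps (42 dates, 256 x 256 pixels)
    n_dates, ny, nx = 42, 256, 256
    stack = 2.2 + 0.05 * rng.normal(size=(n_dates, ny, nx))                      # natural surfaces: stable
    stack[:, 100:150, 100:150] = 2.8 + 0.4 * rng.normal(size=(n_dates, 50, 50))  # "urban": unstable, out of range

    mean_fd = stack.mean(axis=0)
    std_fd = stack.std(axis=0)

    # keep only pixels that are stable over time and fall inside an assumed natural-surface range
    reliable = (std_fd < 0.1) & (mean_fd > 2.0) & (mean_fd < 2.5)
    fractal_map = np.where(reliable, mean_fd, np.nan)
    print("fraction of reliable pixels:", reliable.mean())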

  9. Curvilinear component analysis: a self-organizing neural network for nonlinear mapping of data sets.

    PubMed

    Demartines, P; Herault, J

    1997-01-01

    We present a new strategy called "curvilinear component analysis" (CCA) for dimensionality reduction and representation of multidimensional data sets. The principle of CCA is a self-organized neural network performing two tasks: vector quantization (VQ) of the submanifold in the data set (input space); and nonlinear projection (P) of these quantizing vectors toward an output space, providing a revealing unfolding of the submanifold. After learning, the network has the ability to continuously map any new point from one space into another: forward mapping of new points in the input space, or backward mapping of an arbitrary position in the output space.

  10. Standardized unfold mapping: a technique to permit left atrial regional data display and analysis.

    PubMed

    Williams, Steven E; Tobon-Gomez, Catalina; Zuluaga, Maria A; Chubb, Henry; Butakoff, Constantine; Karim, Rashed; Ahmed, Elena; Camara, Oscar; Rhode, Kawal S

    2017-10-01

    Left atrial arrhythmia substrate assessment can involve multiple imaging and electrical modalities, but visual analysis of data on 3D surfaces is time-consuming and suffers from limited reproducibility. Unfold maps (e.g., the left ventricular bull's eye plot) allow 2D visualization, facilitate multimodal data representation, and provide a common reference space for inter-subject comparison. The aim of this work is to develop a method for automatic representation of multimodal information on a left atrial standardized unfold map (LA-SUM). The LA-SUM technique was developed and validated using 18 electroanatomic mapping (EAM) LA geometries before being applied to ten cardiac magnetic resonance/EAM paired geometries. The LA-SUM was defined as an unfold template of an average LA mesh, and registration of clinical data to this mesh facilitated creation of new LA-SUMs by surface parameterization. The LA-SUM represents 24 LA regions on a flattened surface. Intra-observer variability of LA-SUMs for both EAM and CMR datasets was minimal; root-mean-square differences of 0.008 ± 0.010 and 0.007 ± 0.005 ms (local activation time maps), 0.068 ± 0.063 gs (force-time integral maps), and 0.031 ± 0.026 (CMR LGE signal intensity maps). Following validation, LA-SUMs were used for automatic quantification of post-ablation scar formation using CMR imaging, demonstrating a weak but significant relationship between ablation force-time integral and scar coverage (R² = 0.18, P < 0.0001). The proposed LA-SUM displays an integrated unfold map for multimodal information. The method is applicable to any LA surface, including those derived from imaging and EAM systems. The LA-SUM would facilitate standardization of future research studies involving segmental analysis of the LA.

  11. CAD system for automatic analysis of CT perfusion maps

    NASA Astrophysics Data System (ADS)

    Hachaj, T.; Ogiela, M. R.

    2011-03-01

    In this article, the authors present novel algorithms developed for a computer-assisted diagnosis (CAD) system for the analysis of dynamic brain perfusion computed tomography (CT) maps: cerebral blood flow (CBF) and cerebral blood volume (CBV). These methods perform both quantitative analysis [detection, measurement, and description, with a brain anatomy atlas (AA), of potential asymmetries/lesions] and qualitative analysis (semantic interpretation of visualized symptoms). The semantic interpretation (deciding on the type of lesion, ischemic or hemorrhagic, and whether the brain tissue is at risk of infarction or not) of visualized symptoms is performed by so-called cognitive inference processes that allow reasoning on the character of pathological regions based on specialist image knowledge. The whole system is implemented on the .NET platform (C# programming language) and can be used on any standard PC with the .NET framework installed.

  12. cudaMap: a GPU accelerated program for gene expression connectivity mapping

    PubMed Central

    2013-01-01

    Background Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. Results cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Conclusion Emerging ‘omics’ technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap

  13. cudaMap: a GPU accelerated program for gene expression connectivity mapping.

    PubMed

    McArt, Darragh G; Bankhead, Peter; Dunne, Philip D; Salto-Tellez, Manuel; Hamilton, Peter; Zhang, Shu-Dong

    2013-10-11

    Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
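
    For orientation only, the toy sketch below computes a signed-rank connection score between a reference expression profile and a query signature, in the spirit of sscMap-style connectivity mapping; it is a CPU-bound simplification, not the cudaMap implementation or its exact scoring formula, and the gene values are invented.

```python
import numpy as np

def connection_score(reference_profile, query_signature):
    """Toy signed-rank connection score (a simplification, not cudaMap's formula).

    reference_profile: dict gene -> differential-expression value for one reference
    treatment; query_signature: dict gene -> +1 (up-regulated) or -1 (down-regulated).
    """
    genes = list(reference_profile)
    # Signed ranks: the most strongly regulated gene gets the largest magnitude.
    order = np.argsort(np.abs([reference_profile[g] for g in genes]))
    rank = np.empty(len(genes))
    rank[order] = np.arange(1, len(genes) + 1)
    signed_rank = {g: rank[i] * np.sign(reference_profile[g]) for i, g in enumerate(genes)}

    raw = sum(signed_rank.get(g, 0.0) * s for g, s in query_signature.items())
    # Normalise by the maximum attainable score for a signature of this size.
    top = sorted(rank, reverse=True)[: len(query_signature)]
    return raw / sum(top)

# Invented example values, purely illustrative.
reference = {"TP53": 2.1, "MYC": -1.7, "EGFR": 0.4, "BRCA1": -0.9}
signature = {"TP53": +1, "MYC": -1}
print(round(connection_score(reference, signature), 3))
```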

  14. Video attention deviation estimation using inter-frame visual saliency map analysis

    NASA Astrophysics Data System (ADS)

    Feng, Yunlong; Cheung, Gene; Le Callet, Patrick; Ji, Yusheng

    2012-01-01

    A viewer's visual attention during video playback is the matching of his eye gaze movement to the changing video content over time. If the gaze movement matches the video content (e.g., following a rolling soccer ball), then the viewer keeps his visual attention. If the gaze location moves from one video object to another, then the viewer shifts his visual attention. A video that causes a viewer to shift his attention often is a "busy" video. Determining which video content is busy is an important practical problem; a busy video is difficult for an encoder to deploy region-of-interest (ROI)-based bit allocation on, and hard for a content provider to insert additional overlays like advertisements into, making the video even busier. One way to determine the busyness of video content is to conduct eye gaze experiments with a sizable group of test subjects, but this is time-consuming and cost-ineffective. In this paper, we propose an alternative method to determine the busyness of video, formally called video attention deviation (VAD): analyze the spatial visual saliency maps of the video frames across time. We first derive transition probabilities of a Markov model for eye gaze using saliency maps of a number of consecutive frames. We then compute the steady state probability of the saccade state in the model, which is our estimate of VAD. We demonstrate that the computed steady state probability for saccade using saliency map analysis matches that computed using actual gaze traces for a range of videos with different degrees of busyness. Further, our analysis can also be used to segment video into shorter clips of different degrees of busyness by computing the Kullback-Leibler divergence using consecutive motion-compensated saliency maps.
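
    The VAD estimate reduces to the stationary distribution of a small Markov chain over gaze states. A minimal sketch with a hypothetical two-state (track/saccade) model follows; the transition probabilities are placeholders for the values the paper derives from consecutive saliency maps.

```python
import numpy as np

# Hypothetical 2-state eye-gaze model: state 0 = "fixate/track", state 1 = "saccade".
# P[i, j] = probability of moving from state i to state j (placeholder values).
P = np.array([[0.92, 0.08],
              [0.60, 0.40]])

# The stationary distribution pi satisfies pi @ P = pi with sum(pi) = 1;
# it is the left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()

print("steady-state saccade probability (VAD estimate):", round(float(pi[1]), 4))
```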

  15. Topological data analysis of contagion maps for examining spreading processes on networks.

    PubMed

    Taylor, Dane; Klimm, Florian; Harrington, Heather A; Kramár, Miroslav; Mischaikow, Konstantin; Porter, Mason A; Mucha, Peter J

    2015-07-21

    Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth's surface; however, in modern contagions long-range edges-for example, due to airline transportation or communication media-allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct 'contagion maps' that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.
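
    A simplified sketch of the point-cloud construction: a Watts-type threshold contagion is run from each node in turn and every node is mapped to its vector of activation times. The ring-plus-shortcut network, single-node seeding and the 0.3 threshold are simplifying assumptions (the paper uses cluster seeding on richer networks).

```python
import numpy as np

def activation_times(adj, seeds, threshold=0.3, max_steps=50):
    """Watts-type threshold contagion: a node activates once the fraction of its
    active neighbours reaches `threshold`. Returns each node's activation time."""
    n = len(adj)
    t = np.full(n, np.inf)
    t[list(seeds)] = 0
    for step in range(1, max_steps + 1):
        active = t < step
        newly = []
        for i in range(n):
            if t[i] == np.inf and adj[i]:
                if np.mean([active[j] for j in adj[i]]) >= threshold:
                    newly.append(i)
        if not newly:
            break
        t[newly] = step
    return t

# Toy ring network with one long-range edge; adjacency given as index lists.
n = 12
adj = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
adj[0].append(6); adj[6].append(0)          # long-range shortcut (an assumption)

# Contagion map: node i's coordinates are its activation times for contagions
# seeded, here, at each single node in turn.
point_cloud = np.column_stack([activation_times(adj, {j}) for j in range(n)])
print(point_cloud.shape)    # (n_nodes, n_contagions)
```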

  16. Topological data analysis of contagion maps for examining spreading processes on networks

    NASA Astrophysics Data System (ADS)

    Taylor, Dane; Klimm, Florian; Harrington, Heather A.; Kramár, Miroslav; Mischaikow, Konstantin; Porter, Mason A.; Mucha, Peter J.

    2015-07-01

    Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth's surface; however, in modern contagions long-range edges--for example, due to airline transportation or communication media--allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct `contagion maps' that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.

  17. Interindividual registration and dose mapping for voxelwise population analysis of rectal toxicity in prostate cancer radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dréan, Gaël; Acosta, Oscar, E-mail: Oscar.Acosta@univ-rennes1.fr; Simon, Antoine

    2016-06-15

    Purpose: Recent studies revealed a trend toward voxelwise population analysis in order to understand the local dose/toxicity relationships in prostate cancer radiotherapy. Such approaches require, however, an accurate interindividual mapping of the anatomies and 3D dose distributions toward a common coordinate system. This step is challenging due to the high interindividual variability. In this paper, the authors propose a method designed for interindividual nonrigid registration of the rectum and dose mapping for population analysis. Methods: The method is based on the computation of a normalized structural description of the rectum using a Laplacian-based model. This description takes advantage of the tubular structure of the rectum and its centerline to be embedded in a nonrigid registration-based scheme. The performances of the method were evaluated on 30 individuals treated for prostate cancer in a leave-one-out cross validation. Results: Performance was measured using classical metrics (Dice score and Hausdorff distance), along with new metrics devised to better assess dose mapping in relation with structural deformation (dose-organ overlap). Considering these scores, the proposed method outperforms intensity-based and distance-map-based registration methods. Conclusions: The proposed method allows for accurately mapping interindividual 3D dose distributions toward a single anatomical template, opening the way for further voxelwise statistical analysis.

  18. Development of a Coordinate Transformation method for direct georeferencing in map projection frames

    NASA Astrophysics Data System (ADS)

    Zhao, Haitao; Zhang, Bing; Wu, Changshan; Zuo, Zhengli; Chen, Zhengchao

    2013-03-01

    This paper develops a novel Coordinate Transformation method (CT-method), with which the orientation angles (roll, pitch, heading) of the local tangent frame of the GPS/INS system are transformed into those (omega, phi, kappa) of the map projection frame for direct georeferencing (DG). Specifically, the orientation angles in the map projection frame are derived from a sequence of coordinate transformations. The effectiveness of the orientation angle transformation was verified by comparison with DG results obtained from conventional methods (the Legat method and the POSPac method) using empirical data. Moreover, the CT-method was also validated with simulated data. One advantage of the proposed method is that the orientation angles can be acquired simultaneously while calculating the position elements of the exterior orientation (EO) parameters and the auxiliary point coordinates by coordinate transformation. These three methods were demonstrated and compared using empirical data. Empirical results show that the CT-method is as sound and effective as the Legat method. Compared with the POSPac method, the CT-method is more suitable for calculating EO parameters for DG in map projection frames. The DG accuracy of the CT-method and the Legat method is at the same level. The DG results of all three methods have systematic errors in height due to inconsistent length projection distortion in the vertical and horizontal components, and these errors can be significantly reduced using the EO height correction technique in Legat's approach. Similar to the results obtained with empirical data, the effectiveness of the CT-method was also proved with simulated data. POSPac method: this method is presented in an Applanix POSPac software technical note (Hutton and Savina, 1997) and is implemented in the POSEO module of the POSPac software.
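
    The core of such a transformation is composing rotation matrices and re-extracting angles in the photogrammetric omega-phi-kappa convention. The sketch below uses a placeholder (identity) rotation for the tangent-frame-to-map-frame step and a common z-y-x attitude sequence; both are assumptions for illustration, not the CT-method's actual transformation chain.

```python
import numpy as np

def rot_x(a): c, s = np.cos(a), np.sin(a); return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
def rot_y(a): c, s = np.cos(a), np.sin(a); return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
def rot_z(a): c, s = np.cos(a), np.sin(a); return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def opk_from_matrix(R):
    """Extract (omega, phi, kappa) assuming the factorization R = Rx(omega) @ Ry(phi) @ Rz(kappa)."""
    phi = np.arcsin(R[0, 2])
    omega = np.arctan2(-R[1, 2], R[2, 2])
    kappa = np.arctan2(-R[0, 1], R[0, 0])
    return omega, phi, kappa

# GPS/INS attitude in the local tangent frame (placeholder values, degrees -> radians).
roll, pitch, heading = np.radians([1.5, -2.0, 45.0])
R_body_to_tangent = rot_z(heading) @ rot_y(pitch) @ rot_x(roll)   # one common z-y-x sequence

# Rotation from the local tangent frame to the map projection frame; in the CT-method
# this comes from the chain of geodetic transformations. Here it is an identity
# placeholder purely to show the angle-extraction step.
R_tangent_to_map = np.eye(3)

omega, phi, kappa = opk_from_matrix(R_tangent_to_map @ R_body_to_tangent)
print(np.degrees([omega, phi, kappa]))
```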

  19. QTL mapping and transcriptome analysis of cowpea reveals candidate genes for root-knot nematode resistance.

    PubMed

    Santos, Jansen Rodrigo Pereira; Ndeve, Arsenio Daniel; Huynh, Bao-Lam; Matthews, William Charles; Roberts, Philip Alan

    2018-01-01

    Cowpea is one of the most important food and forage legumes in drier regions of the tropics and subtropics. However, cowpea yield worldwide is markedly below the known potential due to abiotic and biotic stresses, including parasitism by root-knot nematodes (Meloidogyne spp., RKN). Two resistance genes with dominant effect, Rk and Rk2, have been reported to provide resistance against RKN in cowpea. Despite their description and use in breeding for resistance to RKN and particularly genetic mapping of the Rk locus, the exact genes conferring resistance to RKN remain unknown. In the present work, QTL mapping using recombinant inbred line (RIL) population 524B x IT84S-2049 segregating for a newly mapped locus and analysis of the transcriptome changes in two cowpea near-isogenic lines (NIL) were used to identify candidate genes for Rk and the newly mapped locus. A major QTL, designated QRk-vu9.1, associated with resistance to Meloidogyne javanica reproduction, was detected and mapped on linkage group LG9 at position 13.37 cM using egg production data. Transcriptome analysis on resistant and susceptible NILs 3 and 9 days after inoculation revealed up-regulation of 109 and 98 genes and down-regulation of 110 and 89 genes, respectively, out of 19,922 unique genes mapped to the common bean reference genome. Among the differentially expressed genes, four and nine genes were found within the QRk-vu9.1 and QRk-vu11.1 QTL intervals, respectively. Six of these genes belong to the TIR-NBS-LRR family of resistance genes and three were upregulated at one or more time-points. Quantitative RT-PCR validated gene expression to be positively correlated with RNA-seq expression pattern for eight genes. Future functional analysis of these cowpea genes will enhance our understanding of Rk-mediated resistance and identify the specific gene responsible for the resistance.

  20. Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis

    PubMed Central

    Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.

    2006-01-01

    In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely-used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
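
    A minimal numpy sketch of the voxel-wise general linear model idea, with a second imaging modality and a scalar covariate entering as regressors at every voxel; all data below are synthetic placeholders, and no statistical inference (correlation-field or T-field thresholding) is attempted.

```python
import numpy as np

# Synthetic multimodal data: one dependent image and one regressor image per subject.
rng = np.random.default_rng(0)
n_subjects, n_voxels = 20, 5000
modality_a = rng.standard_normal((n_subjects, n_voxels))     # dependent modality
modality_b = rng.standard_normal((n_subjects, n_voxels))     # voxel-wise regressor
age = rng.uniform(20, 80, n_subjects)                        # scalar covariate

beta_b = np.empty(n_voxels)
for v in range(n_voxels):
    # Design matrix at this voxel: intercept, other modality, covariate.
    X = np.column_stack([np.ones(n_subjects), modality_b[:, v], age])
    coef, *_ = np.linalg.lstsq(X, modality_a[:, v], rcond=None)
    beta_b[v] = coef[1]          # effect of modality B at this voxel

print("mean |beta| across voxels:", float(np.abs(beta_b).mean()))
```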

  1. A Comparison of Spatial Analysis Methods for the Construction of Topographic Maps of Retinal Cell Density

    PubMed Central

    Garza-Gisholt, Eduardo; Hemmi, Jan M.; Hart, Nathan S.; Collin, Shaun P.

    2014-01-01

    Topographic maps that illustrate variations in the density of different neuronal sub-types across the retina are valuable tools for understanding the adaptive significance of retinal specialisations in different species of vertebrates. To date, such maps have been created from raw count data that have been subjected to only limited analysis (linear interpolation) and, in many cases, have been presented as iso-density contour maps with contour lines that have been smoothed ‘by eye’. With the use of a stereological approach to count neuronal distribution, a more rigorous approach to analysing the count data is warranted and potentially provides a more accurate representation of the neuron distribution pattern. Moreover, a formal spatial analysis of retinal topography permits a more robust comparison of topographic maps within and between species. In this paper, we present a new R-script for analysing the topography of retinal neurons and compare methods of interpolating and smoothing count data for the construction of topographic maps. We compare four methods for spatial analysis of cell count data: Akima interpolation, thin plate spline interpolation, thin plate spline smoothing and Gaussian kernel smoothing. The use of interpolation ‘respects’ the observed data and simply calculates the intermediate values required to create iso-density contour maps. Interpolation preserves more of the data but, consequently, includes outliers, sampling errors and/or other experimental artefacts. In contrast, smoothing the data reduces the ‘noise’ caused by artefacts and permits a clearer representation of the dominant, ‘real’ distribution. This is particularly useful where cell density gradients are shallow and small variations in local density may dramatically influence the perceived spatial pattern of neuronal topography. The thin plate spline and the Gaussian kernel methods both produce similar retinal topography maps but the smoothing parameters used may affect

  2. A comparison of spatial analysis methods for the construction of topographic maps of retinal cell density.

    PubMed

    Garza-Gisholt, Eduardo; Hemmi, Jan M; Hart, Nathan S; Collin, Shaun P

    2014-01-01

    Topographic maps that illustrate variations in the density of different neuronal sub-types across the retina are valuable tools for understanding the adaptive significance of retinal specialisations in different species of vertebrates. To date, such maps have been created from raw count data that have been subjected to only limited analysis (linear interpolation) and, in many cases, have been presented as iso-density contour maps with contour lines that have been smoothed 'by eye'. With the use of a stereological approach to count neuronal distribution, a more rigorous approach to analysing the count data is warranted and potentially provides a more accurate representation of the neuron distribution pattern. Moreover, a formal spatial analysis of retinal topography permits a more robust comparison of topographic maps within and between species. In this paper, we present a new R-script for analysing the topography of retinal neurons and compare methods of interpolating and smoothing count data for the construction of topographic maps. We compare four methods for spatial analysis of cell count data: Akima interpolation, thin plate spline interpolation, thin plate spline smoothing and Gaussian kernel smoothing. The use of interpolation 'respects' the observed data and simply calculates the intermediate values required to create iso-density contour maps. Interpolation preserves more of the data but, consequently, includes outliers, sampling errors and/or other experimental artefacts. In contrast, smoothing the data reduces the 'noise' caused by artefacts and permits a clearer representation of the dominant, 'real' distribution. This is particularly useful where cell density gradients are shallow and small variations in local density may dramatically influence the perceived spatial pattern of neuronal topography. The thin plate spline and the Gaussian kernel methods both produce similar retinal topography maps but the smoothing parameters used may affect the outcome.
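
    A small scipy sketch contrasting the two families of methods on synthetic count data: thin plate spline interpolation passes through the observations, while Gaussian kernel smoothing suppresses noise. The grid sizes, noise level and smoothing sigma are arbitrary illustrative choices, not those of the R-script described above.

```python
import numpy as np
from scipy.interpolate import Rbf
from scipy.ndimage import gaussian_filter

# Synthetic cell-count samples on a coarse retinal grid (placeholder data).
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(0, 1, 15), np.linspace(0, 1, 15))
counts = 5000 * np.exp(-((x - 0.6) ** 2 + (y - 0.4) ** 2) / 0.05)
counts += rng.normal(0, 300, counts.shape)            # sampling "noise"

# Thin plate spline interpolation: passes through (respects) the observed counts.
tps = Rbf(x.ravel(), y.ravel(), counts.ravel(), function="thin_plate")
xf, yf = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
density_interp = tps(xf, yf)

# Gaussian kernel smoothing: suppresses noise at the cost of flattening peaks.
density_smooth = gaussian_filter(counts, sigma=1.5)

print(density_interp.shape, density_smooth.shape)
```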

  3. Analyzing thematic maps and mapping for accuracy

    USGS Publications Warehouse

    Rosenfield, G.H.

    1982-01-01

    Two problems which exist while attempting to test the accuracy of thematic maps and mapping are: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both of these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given classification category. The results of an accuracy test are presented in a contingency table, sometimes called a classification error matrix. Usually the rows represent the interpretation, and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors of commission, and the remaining elements of the columns represent the errors of omission. For tests of hypotheses that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification. The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by
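
    A short numpy example of the accuracy measures described above, computed from a hypothetical classification error matrix (rows = interpretation, columns = verification); the counts are invented.

```python
import numpy as np

# Hypothetical classification error matrix: diagonal = correct classifications.
error_matrix = np.array([[48,  3,  1],
                         [ 5, 40,  4],
                         [ 2,  6, 41]])

total = error_matrix.sum()
correct = np.trace(error_matrix)
overall_accuracy = correct / total

# Errors of commission (row-wise) and omission (column-wise), per category.
commission = 1 - np.diag(error_matrix) / error_matrix.sum(axis=1)
omission = 1 - np.diag(error_matrix) / error_matrix.sum(axis=0)

print(f"overall accuracy: {overall_accuracy:.3f}")
print("commission error per class:", np.round(commission, 3))
print("omission error per class:", np.round(omission, 3))
```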

  4. Mapping analysis and planning system for the John F. Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Hall, C. R.; Barkaszi, M. J.; Provancha, M. J.; Reddick, N. A.; Hinkle, C. R.; Engel, B. A.; Summerfield, B. R.

    1994-01-01

    Environmental management, impact assessment, research and monitoring are multidisciplinary activities which are ideally suited to incorporate a multi-media approach to environmental problem solving. Geographic information systems (GIS), simulation models, neural networks and expert-system software are some of the advancing technologies being used for data management, query, analysis and display. At the 140,000 acre John F. Kennedy Space Center, the Advanced Software Technology group has been supporting development and implementation of a program that integrates these and other rapidly evolving hardware and software capabilities into a comprehensive Mapping, Analysis and Planning System (MAPS) based in a workstation/local are network environment. An expert-system shell is being developed to link the various databases to guide users through the numerous stages of a facility siting and environmental assessment. The expert-system shell approach is appealing for its ease of data access by management-level decision makers while maintaining the involvement of the data specialists. This, as well as increased efficiency and accuracy in data analysis and report preparation, can benefit any organization involved in natural resources management.

  5. Mapping ash properties using principal components analysis

    NASA Astrophysics Data System (ADS)

    Pereira, Paulo; Brevik, Eric; Cerda, Artemi; Ubeda, Xavier; Novara, Agata; Francos, Marcos; Rodrigo-Comino, Jesus; Bogunovic, Igor; Khaledian, Yones

    2017-04-01

    In post-fire environments, ash has important benefits for soils, such as protection and acting as a source of nutrients, which are crucial for vegetation recuperation (Jordan et al., 2016; Pereira et al., 2015a; 2016a,b). The thickness and distribution of ash are fundamental aspects for soil protection (Cerdà and Doerr, 2008; Pereira et al., 2015b), and the severity at which it was produced is important for the type and amount of elements released into the soil solution (Bodi et al., 2014). Ash is a very mobile material, and where it is deposited matters: until the first rainfalls it is highly mobile; afterwards, it binds to the soil surface and is harder to erode. Mapping ash properties in the immediate post-fire period is complex, since the ash is constantly moving (Pereira et al., 2015b). However, it is an important task, since from the amount and type of ash produced we can identify the degree of soil protection and the nutrients that will be dissolved. The objective of this work is to map ash properties (CaCO3, pH, and selected extractable elements) using a principal component analysis (PCA) in the immediate period after the fire. Four days after the fire we established a grid in a 9x27 m area and took ash samples every 3 meters, for a total of 40 sampling points (Pereira et al., 2017). The PCA identified 5 different factors. Factor 1 had high positive loadings in electrical conductivity, calcium, and magnesium and negative loadings in aluminum and iron, while Factor 2 had high positive loadings in total phosphorus and silica. Factor 3 showed high positive loadings in sodium and potassium, Factor 4 high negative loadings in CaCO3 and pH, and Factor 5 high loadings in sodium and potassium. The experimental variograms of the extracted factors showed that the Gaussian model was the most precise for modelling Factor 1, the linear model for Factor 2, and the wave (hole effect) model for Factors 3, 4 and 5. The maps produced confirm the patterns observed in the experimental variograms. Factor 1 and 2
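
    A compact numpy sketch of the PCA step on a synthetic ash-property table; the correlation-matrix eigendecomposition and the invented values are illustrative assumptions, and the subsequent variogram modelling and map interpolation are not reproduced here.

```python
import numpy as np

# Synthetic ash-property table: 40 sampling points x 6 measured variables
# (placeholders standing in for pH, CaCO3, EC and extractable elements).
rng = np.random.default_rng(0)
samples = rng.normal(size=(40, 6))
samples[:, 1] = 0.8 * samples[:, 0] + 0.2 * rng.normal(size=40)   # a correlated pair

# PCA via eigendecomposition of the correlation matrix of standardized data.
z = (samples - samples.mean(axis=0)) / samples.std(axis=0)
corr = np.corrcoef(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]                  # sort components by explained variance
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])
scores = z @ eigvecs[:, order]                     # factor scores, one column per component

# The per-point 'scores' are what would then be interpolated/kriged into factor maps.
print("explained variance ratio:", np.round(eigvals[order] / eigvals.sum(), 2))
```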

  6. Genetic linkage map and comparative genome analysis for the estuarine Atlantic killifish (Fundulus heteroclitus)

    EPA Pesticide Factsheets

    Genetic linkage maps are valuable tools in evolutionary biology; however, their availability for wild populations is extremely limited. Fundulus heteroclitus (Atlantic killifish) is a non-migratory estuarine fish that exhibits high allelic and phenotypic diversity partitioned among subpopulations that reside in disparate environmental conditions. Although an ideal candidate model organism for studying gene-environment interactions, F. heteroclitus has a limited molecular toolbox. We identified hundreds of novel microsatellites which, when combined with existing microsatellites and single nucleotide polymorphisms (SNPs), were used to construct the first genetic linkage map for this species. By integrating independent linkage maps from three genetic crosses, we developed a consensus map containing 24 linkage groups, consistent with the number of chromosomes reported for this species. These linkage groups span 2300 centimorgans (cM) of recombinant genomic space, intermediate in size relative to the current linkage maps for the teleosts medaka and zebrafish. Comparisons between fish genomes support a high degree of synteny between the consensus F. heteroclitus linkage map and the medaka and (to a lesser extent) zebrafish physical genome assemblies. This dataset is associated with the following publication: Waits, E., J. Martinson, B. Rinner, S. Morris, D. Proestou, D. Champlin, and D. Nacci. Genetic linkage map and comparative genome analysis for the estuarine Atlanti

  7. Preliminary northeast Asia geodynamics map

    USGS Publications Warehouse

    Parfenov, Leonid M.; Khanchuk, Alexander I.; Badarch, Gombosuren; Miller, Robert J.; Naumova, Vera V.; Nokleberg, Warren J.; Ogasawara, Masatsugu; Prokopiev, Andrei V.; Yan, Hongquan

    2003-01-01

    This map portrays the geodynamics of Northeast Asia at a scale of 1:5,000,000 using the concepts of plate tectonics and the analysis of terranes and overlap assemblages. The map is the result of a detailed compilation and synthesis at the 5-million scale and is part of a major international collaborative study of the Mineral Resources, Metallogenesis, and Tectonics of Northeast Asia conducted from 1997 through 2002 by geologists from earth science agencies and universities in Russia, Mongolia, Northeastern China, South Korea, Japan, and the USA. This map is the result of extensive geologic mapping and associated tectonic studies in Northeast Asia in the last few decades and is the first collaborative compilation of the geology of the region at a scale of 1:5,000,000 by geologists from Russia, Mongolia, Northeastern China, South Korea, Japan, and the USA. The map was compiled by a large group of international geologists, using the concepts and definitions below, during collaborative workshops over a six-year period. The map is a major new compilation and re-interpretation of pre-existing geologic maps of the region. The map is designed to be used for several purposes, including regional tectonic analyses, mineral resource and metallogenic analysis, petroleum resource analysis, neotectonic analysis, and analysis of seismic and volcanic hazards. The map consists of two sheets. Sheet 1 displays the map at a scale of 1:5,000,000 and the explanation. Sheet 2 displays the introduction, list of map units, and source references. Detailed descriptions of map units and stratigraphic columns are being published separately. This map is one of a series of publications on the mineral resources, metallogenesis, and geodynamics of Northeast Asia. Companion studies, other articles and maps, and various detailed reports are: (1) a compilation of major mineral deposit models (Rodionov and Nokleberg, 2000; Rodionov and others, 2000; Obolenskiy and others, in press a); (2) a series of

  8. Semi-automatic mapping of geological Structures using UAV-based photogrammetric data: An image analysis approach

    NASA Astrophysics Data System (ADS)

    Vasuki, Yathunanthan; Holden, Eun-Jung; Kovesi, Peter; Micklethwaite, Steven

    2014-08-01

    Recent advances in data acquisition technologies, such as Unmanned Aerial Vehicles (UAVs), have led to a growing interest in capturing high-resolution rock surface images. However, due to the large volumes of data that can be captured in a short flight, efficient analysis of this data brings new challenges, especially the time it takes to digitise maps and extract orientation data. We outline a semi-automated method that allows efficient mapping of geological faults using photogrammetric data of rock surfaces, which was generated from aerial photographs collected by a UAV. Our method harnesses advanced automated image analysis techniques and human data interaction to rapidly map structures and then calculate their dip and dip directions. Geological structures (faults, joints and fractures) are first detected from the primary photographic dataset and the equivalent three dimensional (3D) structures are then identified within a 3D surface model generated by structure from motion (SfM). From this information the location, dip and dip direction of the geological structures are calculated. A structure map generated by our semi-automated method obtained a recall rate of 79.8% when compared against a fault map produced using expert manual digitising and interpretation methods. The semi-automated structure map was produced in 10 min whereas the manual method took approximately 7 h. In addition, the dip and dip direction calculation, using our automated method, shows a mean±standard error of 1.9°±2.2° and 4.4°±2.6° respectively with field measurements. This shows the potential of using our semi-automated method for accurate and efficient mapping of geological structures, particularly from remote, inaccessible or hazardous sites.
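
    The dip and dip-direction calculation can be sketched as a plane fit to the 3D points of a detected structure, followed by conversion of the plane normal to a dip angle and an azimuth; the east-north-up axis convention and the synthetic fault points below are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def dip_and_dip_direction(points):
    """Fit a plane to 3-D points (x = east, y = north, z = up assumed) and
    return dip and dip direction in degrees."""
    centred = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    normal = vt[-1]                      # direction of least variance = plane normal
    if normal[2] < 0:                    # make the normal point upwards
        normal = -normal
    dip = np.degrees(np.arccos(normal[2] / np.linalg.norm(normal)))
    dip_direction = np.degrees(np.arctan2(normal[0], normal[1])) % 360.0   # azimuth from north
    return dip, dip_direction

# Toy fault surface: a plane dipping 30 degrees towards the east, plus noise.
rng = np.random.default_rng(0)
east, north = rng.uniform(-10, 10, (2, 200))
up = -np.tan(np.radians(30)) * east + rng.normal(0, 0.05, 200)
print(dip_and_dip_direction(np.column_stack([east, north, up])))
```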

  9. Nasa's Planetary Geologic Mapping Program: Overview

    NASA Astrophysics Data System (ADS)

    Williams, D. A.

    2016-06-01

    NASA's Planetary Science Division supports the geologic mapping of planetary surfaces through a distinct organizational structure and a series of research and analysis (R&A) funding programs. Cartography and geologic mapping issues for NASA's planetary science programs are overseen by the Mapping and Planetary Spatial Infrastructure Team (MAPSIT), which is an assessment group for cartography similar to the Mars Exploration Program Assessment Group (MEPAG) for Mars exploration. MAPSIT's Steering Committee includes specialists in geological mapping, who make up the Geologic Mapping Subcommittee (GEMS). I am the GEMS Chair, and with a group of 3-4 community mappers we advise the U.S. Geological Survey Planetary Geologic Mapping Coordinator (Dr. James Skinner) and develop policy and procedures to aid the planetary geologic mapping community. GEMS meets twice a year, at the Annual Lunar and Planetary Science Conference in March, and at the Annual Planetary Mappers' Meeting in June (attendance is required by all NASA-funded geologic mappers). Funding programs under NASA's current R&A structure to propose geological mapping projects include Mars Data Analysis (Mars), Lunar Data Analysis (Moon), Discovery Data Analysis (Mercury, Vesta, Ceres), Cassini Data Analysis (Saturn moons), Solar System Workings (Venus or Jupiter moons), and the Planetary Data Archiving, Restoration, and Tools (PDART) program. Current NASA policy requires all funded geologic mapping projects to be done digitally using Geographic Information Systems (GIS) software. In this presentation we will discuss details on how geologic mapping is done consistent with current NASA policy and USGS guidelines.

  10. RadMAP: The Radiological Multi-sensor Analysis Platform

    NASA Astrophysics Data System (ADS)

    Bandstra, Mark S.; Aucott, Timothy J.; Brubaker, Erik; Chivers, Daniel H.; Cooper, Reynold J.; Curtis, Joseph C.; Davis, John R.; Joshi, Tenzing H.; Kua, John; Meyer, Ross; Negut, Victor; Quinlan, Michael; Quiter, Brian J.; Srinivasan, Shreyas; Zakhor, Avideh; Zhang, Richard; Vetter, Kai

    2016-12-01

    The variability of gamma-ray and neutron background during the operation of a mobile detector system greatly limits the ability of the system to detect weak radiological and nuclear threats. The natural radiation background measured by a mobile detector system is the result of many factors, including the radioactivity of nearby materials, the geometric configuration of those materials and the system, the presence of absorbing materials, and atmospheric conditions. Background variations tend to be highly non-Poissonian, making it difficult to set robust detection thresholds using knowledge of the mean background rate alone. The Radiological Multi-sensor Analysis Platform (RadMAP) system is designed to allow the systematic study of natural radiological background variations and to serve as a development platform for emerging concepts in mobile radiation detection and imaging. To do this, RadMAP has been used to acquire extensive, systematic background measurements and correlated contextual data that can be used to test algorithms and detector modalities at low false alarm rates. By combining gamma-ray and neutron detector systems with data from contextual sensors, the system enables the fusion of data from multiple sensors into novel data products. The data are curated in a common format that allows for rapid querying across all sensors, creating detailed multi-sensor datasets that are used to study correlations between radiological and contextual data, and develop and test novel techniques in mobile detection and imaging. In this paper we will describe the instruments that comprise the RadMAP system, the effort to curate and provide access to multi-sensor data, and some initial results on the fusion of contextual and radiological data.

  11. Visual Analysis Based on the Data of Chinese Surveying and Mapping Journals

    NASA Astrophysics Data System (ADS)

    Li, Jing; Liu, Haiyan; Guo, Wenyue; Yu, Anzhu

    2016-06-01

    Taking four influential Chinese surveying and mapping journals as the data source, 5863 papers published during the period 2003-2013 were obtained. Using bibliometric methods and visual analysis, the surveying and mapping papers of the past ten years (2003-2013) were summarized and their research themes, authors, and geographical distribution were analyzed. In the study, papers on geodesy, cartography and GIS account for 59.9% of the total, more than half of all the papers. We also determined that the core author group contains 131 authors, most of whom are from big cities; 90% of the top ten cities by number of published papers are capital cities or municipalities directly under the central government. In conclusion, we found that the research focus differed from year to year, the research content was rich, the topics of geodesy, cartography and GIS were widely researched, and the development of surveying and mapping in China is imbalanced.

  12. Assessing map accuracy in a remotely sensed, ecoregion-scale cover map

    USGS Publications Warehouse

    Edwards, T.C.; Moisen, Gretchen G.; Cutler, D.R.

    1998-01-01

    Landscape- and ecoregion-based conservation efforts increasingly use a spatial component to organize data for analysis and interpretation. A challenge particular to remotely sensed cover maps generated from these efforts is how best to assess the accuracy of the cover maps, especially when they can exceed 1000s of km2 in size. Here we develop and describe a methodological approach for assessing the accuracy of large-area cover maps, using as a test case the 21.9 million ha cover map developed for Utah Gap Analysis. As part of our design process, we first reviewed the effect of intracluster correlation and a simple cost function on the relative efficiency of cluster sample designs compared to simple random designs. Our design ultimately combined clustered and subsampled field data stratified by ecological modeling unit and accessibility (hereafter, a mixed design). We next outline estimation formulas for simple map accuracy measures under our mixed design and report results for eight major cover types and the three ecoregions mapped as part of the Utah Gap Analysis. Overall accuracy of the map was 83.2% (SE = 1.4). Within ecoregions, accuracy ranged from 78.9% to 85.0%. Accuracy by cover type varied, ranging from a low of 50.4% for barren to a high of 90.6% for man-modified. In addition, we examined gains in efficiency of our mixed design compared with a simple random sample approach. In regard to precision, our mixed design was more precise than a simple random design, given fixed sample costs. We close with a discussion of the logistical constraints facing attempts to assess the accuracy of large-area, remotely sensed cover maps.

  13. Impact of different NWM-derived mapping functions on VLBI and GPS analysis

    NASA Astrophysics Data System (ADS)

    Nikolaidou, Thalia; Balidakis, Kyriakos; Nievinski, Felipe; Santos, Marcelo; Schuh, Harald

    2018-06-01

    In recent years, numerical weather models have shown the potential to provide a good representation of the electrically neutral atmosphere. This fact has been exploited for the modeling of space geodetic observations. The Vienna Mapping Functions 1 (VMF1) are the NWM-based model recommended by the latest IERS Conventions. The VMF1 are being produced 6 hourly based on the European Centre for Medium-Range Weather Forecasts operational model. UNB-VMF1 provide meteorological parameters aiding neutral atmosphere modeling for VLBI and GNSS, based on the same concept but utilizing the Canadian Meteorological Centre model. This study presents comparisons between the VMF1 and the UNB-VMF1 in both delay and position domains, using global networks of VLBI and GPS stations. It is shown that the zenith delays agree better than 3.5 mm (hydrostatic) and 20 mm (wet) which implies an equivalent predicted height error of less than 2 mm. In the position domain and VLBI analysis, comparison of the weighted root-mean-square error (wrms) of the height component showed a maximum difference of 1.7 mm. For 48% of the stations, the use of VMF1 reduced the height wrms of the stations by 2.6% on average compared to a respective reduction of 1.7% for 41% of the stations employing the UNB-VMF1. For the subset of VLBI stations participating in a large number of sessions, neither mapping function outranked the other. GPS analysis using Precise Point Positioning had a sub-mm respective difference, while the wrms of the individual solutions had a maximum value of 12 mm for the 1-year-long analysis. A clear advantage of one NWM over the other was not shown, and the statistics proved that the two mapping functions yield equal results in geodetic analysis.
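
    For context, VMF-type mapping functions use a Marini-style continued fraction to scale zenith delays to slant delays. The sketch below shows that generic form with placeholder coefficients; the values are orders of magnitude only and are not VMF1 or UNB-VMF1 coefficients.

```python
import numpy as np

def continued_fraction_mf(elev_rad, a, b, c):
    """Generic three-term continued-fraction mapping function of the form used by
    VMF-type models: mf(e) = (1 + a/(1 + b/(1 + c))) / (sin e + a/(sin e + b/(sin e + c)))."""
    s = np.sin(elev_rad)
    top = 1 + a / (1 + b / (1 + c))
    bottom = s + a / (s + b / (s + c))
    return top / bottom

# Placeholder hydrostatic coefficients and zenith delay (illustrative only).
a_h, b_h, c_h = 1.2e-3, 2.9e-3, 62.6e-3
elevation = np.radians(10.0)
zhd = 2.3                                   # zenith hydrostatic delay in metres (placeholder)
slant_hydrostatic_delay = zhd * continued_fraction_mf(elevation, a_h, b_h, c_h)
print(round(float(slant_hydrostatic_delay), 3), "m")
```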

  14. Cartographic mapping study

    NASA Technical Reports Server (NTRS)

    Wilson, C.; Dye, R.; Reed, L.

    1982-01-01

    The errors associated with planimetric mapping of the United States using satellite remote sensing techniques are analyzed. Assumptions concerning the state of the art achievable for satellite mapping systems and platforms in the 1995 time frame are made. An analysis of these performance parameters is made using an interactive cartographic satellite computer model, after first validating the model using LANDSAT 1 through 3 performance parameters. An investigation of current large scale (1:24,000) US National mapping techniques is made. Using the results of this investigation, and current national mapping accuracy standards, the 1995 satellite mapping system is evaluated for its ability to meet US mapping standards for planimetric and topographic mapping at scales of 1:24,000 and smaller.

  15. A temperature and vegetation adjusted NTL urban index for urban area mapping and analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Xiya; Li, Peijun

    2018-01-01

    Accurate and timely information regarding the extent and spatial distribution of urban areas on regional and global scales is crucially important for both scientific and policy-making communities. Stable nighttime light (NTL) data from the Defense Meteorological Satellite Program (DMSP) Operational Linescan System (OLS) provides a unique proxy of human settlement and activity, which has been used in the mapping and analysis of urban areas and urbanization dynamics. However, blooming and saturation effects of DMSP/OLS NTL data are two unresolved problems in regional urban area mapping and analysis. This study proposed a new urban index termed the Temperature and Vegetation Adjusted NTL Urban Index (TVANUI). It is intended to reduce blooming and saturation effects and to enhance urban features by combining DMSP/OLS NTL data with Normalized Difference Vegetation Index (NDVI) and land surface temperature (LST) data from the Moderate Resolution Imaging Spectroradiometer onboard the Terra satellite. The proposed index was evaluated in two study areas by comparison with established urban indices. The results demonstrated the proposed TVANUI was effective in enhancing the variation of DMSP/OLS light in urban areas and in reducing blooming and saturation effects, showing better performance than three established urban indices. The TVANUI also significantly outperformed the established urban indices in urban area mapping using both the global-fixed threshold and the local-optimal threshold methods. Thus, the proposed TVANUI provides a useful variable for urban area mapping and analysis on regional scale, as well as for urbanization dynamics using time-series DMSP/OLS and related satellite data.

  16. Sentinel lymph node mapping in endometrial cancer: a systematic review and meta-analysis.

    PubMed

    Lin, Hefeng; Ding, Zheyuan; Kota, Vishnu Goutham; Zhang, Xiaoming; Zhou, Jianwei

    2017-07-11

    Endometrial cancer is the most frequent tumor of the female reproductive system, while the diagnostic efficacy of sentinel lymph node (SLN) mapping for endometrial cancer is still controversial. This meta-analysis was conducted to evaluate the diagnostic value of SLN mapping in the assessment of lymph nodal involvement in endometrial cancer. Forty-four studies including 2,236 cases were identified. The pooled overall detection rate was 83% (95% CI: 80-86%). The pooled sensitivity was 91% (95% CI: 87-95%). The bilateral pelvic node detection rate was 56% (95% CI: 48-64%). Use of indocyanine green (ICG) increased the overall detection rate to 93% (95% CI: 89-96%), and robotic-assisted surgery also increased the overall detection rate to 86% (95% CI: 79-93%). In summary, our meta-analysis provides strong evidence that sentinel node mapping is an accurate and feasible method that performs well diagnostically for the assessment of lymph nodal involvement in endometrial cancer. Cervical injection, robot-assisted surgery, and the use of ICG optimized the sensitivity and detection rate of the technique. Sentinel lymph node mapping may potentially lead to greater utilization by gynecologic surgeons in the future.

  17. Chemical data visualization and analysis with incremental generative topographic mapping: big data challenge.

    PubMed

    Gaspar, Héléna A; Baskin, Igor I; Marcou, Gilles; Horvath, Dragos; Varnek, Alexandre

    2015-01-26

    This paper is devoted to the analysis and visualization in 2-dimensional space of large data sets of millions of compounds using the incremental version of generative topographic mapping (iGTM). The iGTM algorithm implemented in the in-house ISIDA-GTM program was applied to a database of more than 2 million compounds combining data sets of 36 chemicals suppliers and the NCI collection, encoded either by MOE descriptors or by MACCS keys. Taking advantage of the probabilistic nature of GTM, several approaches to data analysis were proposed. The chemical space coverage was evaluated using the normalized Shannon entropy. Different views of the data (property landscapes) were obtained by mapping various physical and chemical properties (molecular weight, aqueous solubility, LogP, etc.) onto the iGTM map. The superposition of these views helped to identify the regions in the chemical space populated by compounds with desirable physicochemical profiles and the suppliers providing them. The data sets similarity in the latent space was assessed by applying several metrics (Euclidean distance, Tanimoto and Bhattacharyya coefficients) to data probability distributions based on cumulated responsibility vectors. As a complementary approach, data sets were compared by considering them as individual objects on a meta-GTM map, built on cumulated responsibility vectors or property landscapes produced with iGTM. We believe that the iGTM methodology described in this article represents a fast and reliable way to analyze and visualize large chemical databases.
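
    A small numpy sketch of the data-set comparison step: cumulated responsibility vectors over the GTM grid are compared with Euclidean, Tanimoto and Bhattacharyya measures, and chemical-space coverage is summarized with a normalized Shannon entropy; the vectors themselves are synthetic placeholders, not outputs of ISIDA-GTM.

```python
import numpy as np

def compare_responsibility_vectors(r1, r2):
    """Similarity of two cumulated responsibility vectors defined over the GTM grid."""
    euclid = np.linalg.norm(r1 - r2)
    tanimoto = (r1 @ r2) / (r1 @ r1 + r2 @ r2 - r1 @ r2)
    p, q = r1 / r1.sum(), r2 / r2.sum()              # treat as probability distributions
    bhattacharyya = np.sum(np.sqrt(p * q))
    return euclid, tanimoto, bhattacharyya

def normalized_shannon_entropy(r):
    """Chemical-space coverage of one data set: 1 means spread over the whole map."""
    p = r / r.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum() / np.log(r.size))

# Two hypothetical compound libraries projected onto a 10x10 GTM (100 nodes).
rng = np.random.default_rng(0)
resp_a = rng.random(100)
resp_b = 0.7 * resp_a + 0.3 * rng.random(100)        # a partially overlapping library
print(compare_responsibility_vectors(resp_a, resp_b))
print(normalized_shannon_entropy(resp_a))
```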

  18. Analysis of multiplex gene expression maps obtained by voxelation.

    PubMed

    An, Li; Xie, Hongbo; Chin, Mark H; Obradovic, Zoran; Smith, Desmond J; Megalooikonomou, Vasileios

    2009-04-29

    Gene expression signatures in the mammalian brain hold the key to understanding neural development and neurological disease. Researchers have previously used voxelation in combination with microarrays for acquisition of genome-wide atlases of expression patterns in the mouse brain. On the other hand, some work has been performed on studying gene functions without taking into account the location information of a gene's expression in the mouse brain. In this paper, we present an approach for identifying the relation between gene expression maps obtained by voxelation and gene functions. To analyze the dataset, we chose typical genes as queries and aimed at discovering similar gene groups. Gene similarity was determined by using the wavelet features extracted from the left- and right-hemisphere-averaged gene expression maps, and by the Euclidean distance between each pair of feature vectors. We also performed a multiple clustering approach on the gene expression maps, combined with hierarchical clustering. Among each group of similar genes and clusters, the gene function similarity was measured by calculating the average gene function distances in the gene ontology structure. By applying our methodology to find genes similar to certain target genes, we were able to improve our understanding of gene expression patterns and gene functions. By applying the clustering analysis method, we obtained significant clusters, which have both very similar gene expression maps and very similar gene functions with respect to their corresponding gene ontologies. The cellular component ontology resulted in prominent clusters expressed in cortex and corpus callosum. The molecular function ontology gave prominent clusters in cortex, corpus callosum and hypothalamus. The biological process ontology resulted in clusters in cortex, hypothalamus and choroid plexus. Clusters from all three ontologies combined were most prominently expressed in cortex and corpus callosum. The experimental
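
    A brief sketch of the feature step, assuming the PyWavelets package is available: a 2D wavelet decomposition of each (synthetic) expression map supplies a compact feature vector, and gene similarity is the Euclidean distance between such vectors. Keeping only the coarse approximation band is a simplifying choice, not necessarily the authors' exact feature set.

```python
import numpy as np
import pywt   # PyWavelets, assumed available

def wavelet_features(expression_map, wavelet="db1", level=2):
    """Flattened approximation coefficients of a 2-D wavelet decomposition,
    used here as a compact feature vector for one gene expression map."""
    coeffs = pywt.wavedec2(expression_map, wavelet=wavelet, level=level)
    return coeffs[0].ravel()        # keep only the coarse approximation band

# Two synthetic hemisphere-averaged expression maps (placeholders).
rng = np.random.default_rng(0)
map_a = rng.random((64, 64))
map_b = map_a + 0.1 * rng.random((64, 64))       # a similar expression pattern

dist = np.linalg.norm(wavelet_features(map_a) - wavelet_features(map_b))
print("Euclidean distance between wavelet feature vectors:", round(float(dist), 4))
```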

  19. Local Subspace Classifier with Transform-Invariance for Image Classification

    NASA Astrophysics Data System (ADS)

    Hotta, Seiji

    A family of linear subspace classifiers called the local subspace classifier (LSC) outperforms the k-nearest neighbor rule (kNN) and conventional subspace classifiers in handwritten digit classification. However, LSC suffers from very high sensitivity to image transformations because it uses projection and the Euclidean distance for classification. In this paper, I present a combination of a local subspace classifier (LSC) and a tangent distance (TD) for improving the accuracy of handwritten digit recognition. With this classification rule, we can deal with transform-invariance easily because we are able to use tangent vectors to approximate transformations. However, we cannot use tangent vectors for other types of images, such as color images. Hence, kernel LSC (KLSC) is proposed for incorporating transform-invariance into LSC via kernel mapping. The performance of the proposed methods is verified with experiments on handwritten digit and color image classification.
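
    A minimal numpy sketch of a one-sided tangent distance: the distance from one pattern to the linear manifold spanned by the other pattern's tangent vectors, found by least squares. The finite-difference shift tangent and the toy images are assumptions for illustration, not the paper's classifier.

```python
import numpy as np

def one_sided_tangent_distance(x, y, tangents):
    """Minimal Euclidean distance between y and the linear manifold {x + T a}
    spanned by the tangent vectors of x (columns of T)."""
    a, *_ = np.linalg.lstsq(tangents, y - x, rcond=None)   # best transformation parameters
    return np.linalg.norm(x + tangents @ a - y)

# Toy example: a flattened 8x8 "digit" and a slightly shifted copy.
rng = np.random.default_rng(0)
img = rng.random((8, 8))
shifted = np.roll(img, 1, axis=1)

# Horizontal-shift tangent vector approximated by a central finite difference (an assumption).
t_shift = (np.roll(img, 1, axis=1) - np.roll(img, -1, axis=1)).ravel() / 2.0
T = t_shift[:, None]

x, y = img.ravel(), shifted.ravel()
print("Euclidean:", round(float(np.linalg.norm(x - y)), 3),
      "tangent distance:", round(float(one_sided_tangent_distance(x, y, T)), 3))
```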

  20. Digital floodplain mapping and an analysis of errors involved

    USGS Publications Warehouse

    Hamblen, C.S.; Soong, D.T.; Cai, X.

    2007-01-01

    Mapping floodplain boundaries using a geographical information system (GIS) and digital elevation models (DEMs) was completed in a recent study. However convenient this method may appear at first, the resulting maps can potentially contain unaccounted-for errors. Mapping the floodplain using GIS is faster than mapping manually, and digital mapping is expected to become more common in the future. When mapping is done manually, the experience and judgment of the engineer or geographer completing the mapping, and the contour resolution of the surface topography, are critical in determining the floodplain and floodway boundaries between cross sections. When mapping is done digitally, discrepancies can result from the use of the computing algorithm and digital topographic datasets. Understanding the possible sources of error and how the error accumulates through these processes is necessary for the validation of automated digital mapping. This study evaluates the procedure of floodplain mapping using GIS and a 3 m by 3 m resolution DEM, with a focus on the accumulated errors involved in the process. Within the GIS environment of this mapping method, the procedural steps of most interest initially include: (1) the accurate spatial representation of the stream centerline and cross sections, (2) properly using a triangulated irregular network (TIN) model for the flood elevations of the studied cross sections, the interpolated elevations between them, and the extrapolated flood elevations beyond the cross sections, and (3) the comparison of the flood elevation TIN with the ground elevation DEM, from which the appropriate inundation boundaries are delineated. The study area involved is of relatively low topographic relief, thereby making it representative of common suburban development and a prime setting for the need for accurately mapped floodplains. This paper emphasizes the impacts of integrating supplemental digital terrain data between cross sections on floodplain delineation
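
    Step (3), the comparison of the flood-elevation surface with the ground DEM, reduces to a cell-by-cell test once both surfaces are on the same grid. A toy numpy sketch follows; the synthetic DEM, the flat water surface and the grid size are placeholders, not the study's data.

```python
import numpy as np

# Placeholder grids: 'dem' stands in for the 3 m ground-elevation model and 'wse'
# for the flood water-surface elevation interpolated from the TIN of cross-section
# flood elevations onto the same grid.
rng = np.random.default_rng(0)
dem = 100 + np.linspace(0.0, 5.0, 200)[:, None] + rng.normal(0, 0.1, (200, 200))
wse = np.full_like(dem, 102.5)          # flat water surface, purely for illustration

# A cell lies inside the floodplain where the flood elevation exceeds the ground.
inundated = wse > dem
depth = np.where(inundated, wse - dem, 0.0)

print("inundated cells:", int(inundated.sum()), "max depth (m):", round(float(depth.max()), 2))
```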

  1. Atlas of Cancer Signalling Network: a systems biology resource for integrative analysis of cancer data with Google Maps

    PubMed Central

    Kuperstein, I; Bonnet, E; Nguyen, H-A; Cohen, D; Viara, E; Grieco, L; Fourquet, S; Calzone, L; Russo, C; Kondratova, M; Dutreix, M; Barillot, E; Zinovyev, A

    2015-01-01

    Cancerogenesis is driven by mutations leading to aberrant functioning of a complex network of molecular interactions and simultaneously affecting multiple cellular functions. Therefore, the successful application of bioinformatics and systems biology methods for analysis of high-throughput data in cancer research heavily depends on availability of global and detailed reconstructions of signalling networks amenable for computational analysis. We present here the Atlas of Cancer Signalling Network (ACSN), an interactive and comprehensive map of molecular mechanisms implicated in cancer. The resource includes tools for map navigation, visualization and analysis of molecular data in the context of signalling network maps. Constructing and updating ACSN involves careful manual curation of molecular biology literature and participation of experts in the corresponding fields. The cancer-oriented content of ACSN is completely original and covers major mechanisms involved in cancer progression, including DNA repair, cell survival, apoptosis, cell cycle, EMT and cell motility. Cell signalling mechanisms are depicted in detail, together creating a seamless ‘geographic-like' map of molecular interactions frequently deregulated in cancer. The map is browsable using NaviCell web interface using the Google Maps engine and semantic zooming principle. The associated web-blog provides a forum for commenting and curating the ACSN content. ACSN allows uploading heterogeneous omics data from users on top of the maps for visualization and performing functional analyses. We suggest several scenarios for ACSN application in cancer research, particularly for visualizing high-throughput data, starting from small interfering RNA-based screening results or mutation frequencies to innovative ways of exploring transcriptomes and phosphoproteomes. Integration and analysis of these data in the context of ACSN may help interpret their biological significance and formulate mechanistic hypotheses

  2. Resolution Measurement from a Single Reconstructed Cryo-EM Density Map with Multiscale Spectral Analysis.

    PubMed

    Yang, Yu-Jiao; Wang, Shuai; Zhang, Biao; Shen, Hong-Bin

    2018-06-25

    As a relatively new technology to solve the three-dimensional (3D) structure of a protein or protein complex, single-particle reconstruction (SPR) of cryogenic electron microscopy (cryo-EM) images shows much superiority and is in a rapidly developing stage. Resolution measurement in SPR, which evaluates the quality of a reconstructed 3D density map, plays a critical role in promoting methodology development of SPR and structural biology. Because there is no benchmark map in the generation of a new structure, how to realize the resolution estimation of a new map is still an open problem. Existing approaches try to generate a hypothetical benchmark map by reconstructing two 3D models from two halves of the original 2D images for cross-reference, which may result in a premature estimation with a half-data model. In this paper, we report a new self-reference-based resolution estimation protocol, called SRes, that requires only a single reconstructed 3D map. The core idea of SRes is to perform a multiscale spectral analysis (MSSA) on the map through multiple size-variable masks segmenting the map. The MSSA-derived multiscale spectral signal-to-noise ratios (mSSNRs) reveal that their corresponding estimated resolutions will show a cliff jump phenomenon, indicating a significant change in the SSNR properties. The critical point on the cliff borderline is demonstrated to be the right estimator for the resolution of the map.

  3. Laboratory Studies of Temperature and Relative Humidity Dependence of Aerosol Nucleation during the TANGENT 2017 IOP Study

    NASA Astrophysics Data System (ADS)

    Ouyang, Q.; Tiszenkel, L.; Stangl, C. M.; Krasnomowitz, J.; Johnston, M. V.; Lee, S.

    2017-12-01

    In this poster, we will present recent measurements of the temperature and relative humidity dependence of aerosol nucleation of sulfuric acid under conditions representative of the ground level to the free troposphere. Aerosol nucleation is critically dependent on temperature, but current global aerosol models use nucleation algorithms that are independent of temperature and relative humidity due to the lack of experimental data. Thus, these models fail to simulate nucleation over a wide range of altitude and latitude conditions. We are currently conducting the Tandem Aerosol Nucleation and Growth Environment Tube (TANGENT) intense observation period (IOP) experiments to investigate aerosol nucleation and growth properties independently, during nucleation and during growth. Nucleation takes place from sulfuric acid, water and some base compounds in a fast flow nucleation tube (FT-1). Nucleation precursors are detected with two chemical ionization mass spectrometers (CIMS), and newly nucleated particles are measured with a particle size magnifier (PSM) and a scanning mobility particle sizer (SMPS). These particles then grow further in the second flow tube (FT-2) in the presence of oxidants of biogenic organic compounds. Chemical compositions of the grown particles are further analyzed with a nano-aerosol mass spectrometer (NAMS). Our experimental results will provide a robust algorithm for aerosol nucleation and growth rates as a function of temperature and relative humidity.

  4. Measuring novices' field mapping abilities using an in-class exercise based on expert task analysis

    NASA Astrophysics Data System (ADS)

    Caulkins, J. L.

    2010-12-01

    We are interested in developing a model of expert-like behavior for improving the teaching of undergraduate field geology. Our aim is to assist students in mastering the process of field mapping more efficiently and effectively and to improve their ability to think creatively in the field. To examine expert mapping behavior, a cognitive task analysis was conducted with expert geologic mappers in an attempt to define the process of geologic mapping (i.e. to understand how experts carry out geological mapping). The task analysis indicates that expert mappers have a wealth of geologic scenarios at their disposal that they compare against examples seen in the field, experiences that most undergraduate mappers will not have had. While presenting students with many geological examples in class may increase their understanding of geologic processes, novices still struggle when presented with a novel field situation. Based on the task analysis, a short (45-minute) paper-map-based exercise was designed and tested with 14 pairs of 3rd year geology students. The exercise asks students to generate probable geologic models based on a series of four (4) data sets. Each data set represents a day’s worth of data; after the first “day,” new sheets simply include current and previously collected data (e.g. the “Day 2” data set includes data from “Day 1” plus the new “Day 2” data). As the geologic complexity increases, students must adapt, reject or generate new geologic models in order to fit the growing data set. Preliminary results of the exercise indicate that students who produced more probable geologic models, and produced higher ratios of probable to improbable models, tended to go on to do better on the mapping exercises at the 3rd year field school. These results suggest that those students with more cognitively available geologic models may be more able to use these models in field settings than those who are unable to draw on these models for whatever reason.

  5. QTL mapping and transcriptome analysis of cowpea reveals candidate genes for root-knot nematode resistance

    PubMed Central

    Ndeve, Arsenio Daniel; Huynh, Bao-Lam; Matthews, William Charles; Roberts, Philip Alan

    2018-01-01

    Cowpea is one of the most important food and forage legumes in drier regions of the tropics and subtropics. However, cowpea yield worldwide is markedly below the known potential due to abiotic and biotic stresses, including parasitism by root-knot nematodes (Meloidogyne spp., RKN). Two resistance genes with dominant effect, Rk and Rk2, have been reported to provide resistance against RKN in cowpea. Despite their description and use in breeding for resistance to RKN and particularly genetic mapping of the Rk locus, the exact genes conferring resistance to RKN remain unknown. In the present work, QTL mapping using recombinant inbred line (RIL) population 524B x IT84S-2049 segregating for a newly mapped locus and analysis of the transcriptome changes in two cowpea near-isogenic lines (NIL) were used to identify candidate genes for Rk and the newly mapped locus. A major QTL, designated QRk-vu9.1, associated with resistance to Meloidogyne javanica reproduction, was detected and mapped on linkage group LG9 at position 13.37 cM using egg production data. Transcriptome analysis on resistant and susceptible NILs 3 and 9 days after inoculation revealed up-regulation of 109 and 98 genes and down-regulation of 110 and 89 genes, respectively, out of 19,922 unique genes mapped to the common bean reference genome. Among the differentially expressed genes, four and nine genes were found within the QRk-vu9.1 and QRk-vu11.1 QTL intervals, respectively. Six of these genes belong to the TIR-NBS-LRR family of resistance genes and three were upregulated at one or more time-points. Quantitative RT-PCR validated gene expression to be positively correlated with RNA-seq expression pattern for eight genes. Future functional analysis of these cowpea genes will enhance our understanding of Rk-mediated resistance and identify the specific gene responsible for the resistance. PMID:29300744

  6. Comprehensive analysis of structure and temperature, frequency and concentration-dependent dielectric properties of lithium-substituted cobalt ferrites (Li x Co1- x Fe2O4)

    NASA Astrophysics Data System (ADS)

    Anjum, Safia; Nisa, Mehru; Sabah, Aneeqa; Rafique, M. S.; Zia, Rehana

    2017-08-01

    This paper is dedicated to the synthesis and characterization of a series of lithium-substituted cobalt ferrites Li x Co1- x Fe2O4 ( x = 0, 0.2, 0.4, 0.6, 0.8, 1). The samples have been prepared with a simple ball-milling machine through the powder metallurgy route. The structural analysis is carried out using an X-ray diffractometer, and their 3D visualization is simulated using the Diamond software. The frequency- and temperature-dependent dielectric properties of the prepared samples have been measured using an inductor-capacitor-resistor (LCR) meter. The structural analysis confirms that all the prepared samples have an inverse cubic spinel structure. It is also revealed that the crystallite size and lattice parameter decrease with increasing concentration of lithium (Li+1) ions, owing to the smaller ionic radius of lithium. The comprehensive analysis of the frequency-, concentration- and temperature-dependent dielectric properties of the prepared samples is described in this paper. It is observed that the dielectric constant and tangent loss decrease while conductivity increases as the frequency increases. It is also revealed that the dielectric constant, tangent loss and AC conductivity increase as the concentration of lithium increases, due to its lower electronegativity. Temperature plays a vital role in enhancing the dielectric constant, tangent loss and AC conductivity because the mobility of ions increases as the temperature increases.
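
    The dielectric quantities reported above follow from standard parallel-plate relations once the LCR meter's capacitance and dissipation factor are known. A minimal sketch, assuming a hypothetical pellet geometry and made-up readings, is:

      import numpy as np

      EPS0 = 8.854e-12  # vacuum permittivity, F/m

      def dielectric_from_lcr(freq_hz, capacitance_f, dissipation, thickness_m, area_m2):
          """Parallel-plate estimates of dielectric constant, loss tangent and AC conductivity."""
          eps_real = capacitance_f * thickness_m / (EPS0 * area_m2)  # relative permittivity
          tan_delta = dissipation                                    # loss tangent = D reading
          eps_imag = eps_real * tan_delta
          sigma_ac = 2 * np.pi * freq_hz * EPS0 * eps_imag           # AC conductivity, S/m
          return eps_real, tan_delta, sigma_ac

      # Example with a made-up pellet geometry and a single LCR reading at 1 kHz
      print(dielectric_from_lcr(1e3, 50e-12, 0.02, 1.2e-3, np.pi * (6.5e-3) ** 2))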

  7. Genome-wide SNP identification for the construction of a high-resolution genetic map of Japanese flounder (Paralichthys olivaceus): applications to QTL mapping of Vibrio anguillarum disease resistance and comparative genomic analysis

    PubMed Central

    Shao, Changwei; Niu, Yongchao; Rastas, Pasi; Liu, Yang; Xie, Zhiyuan; Li, Hengde; Wang, Lei; Jiang, Yong; Tai, Shuaishuai; Tian, Yongsheng; Sakamoto, Takashi; Chen, Songlin

    2015-01-01

    High-resolution genetic maps are essential for fine mapping of complex traits, genome assembly, and comparative genomic analysis. Single-nucleotide polymorphisms (SNPs) are the primary molecular markers used for genetic map construction. In this study, we identified 13,362 SNPs evenly distributed across the Japanese flounder (Paralichthys olivaceus) genome. Of these SNPs, 12,712 high-confidence SNPs were subjected to high-throughput genotyping and assigned to 24 consensus linkage groups (LGs). The total length of the genetic linkage map was 3,497.29 cM with an average distance of 0.47 cM between loci, thereby representing the densest genetic map currently reported for Japanese flounder. Nine positive quantitative trait loci (QTLs) forming two main clusters for Vibrio anguillarum disease resistance were detected. All QTLs could explain 5.1–8.38% of the total phenotypic variation. Synteny analysis of the QTL regions on the genome assembly revealed 12 immune-related genes, among them 4 genes strongly associated with V. anguillarum disease resistance. In addition, 246 genome assembly scaffolds with an average size of 21.79 Mb were anchored onto the LGs; these scaffolds, comprising 522.99 Mb, represented 95.78% of assembled genomic sequences. The mapped assembly scaffolds in Japanese flounder were used for genome synteny analyses against zebrafish (Danio rerio) and medaka (Oryzias latipes). Flounder and medaka were found to possess almost one-to-one synteny, whereas flounder and zebrafish exhibited a multi-syntenic correspondence. The newly developed high-resolution genetic map, which will facilitate QTL mapping, scaffold assembly, and genome synteny analysis of Japanese flounder, marks a milestone in the ongoing genome project for this species. PMID:25762582

  8. Contralateral Breast Dose After Whole-Breast Irradiation: An Analysis by Treatment Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Terence M.; Moran, Jean M., E-mail: jmmoran@med.umich.edu; Hsu, Shu-Hui

    2012-04-01

    Purpose: To investigate the contralateral breast dose (CBD) across a continuum of breast-conservation therapy techniques. Methods and Materials: An anthropomorphic phantom was CT-simulated, and six treatment plans were generated: open tangents, tangents with an external wedge on the lateral beam, tangents with lateral and medial external wedges, a simple segment plan (three segments per tangent), a complex segmental intensity-modulated radiotherapy (IMRT) plan (five segments per tangent), and a beamlet IMRT plan (>100 segments). For all techniques, the breast on the phantom was irradiated to 5000 cGy. Contralateral breast dose was measured at a uniform depth at the center and each quadrant using thermoluminescent detectors. Results: Contralateral breast dose varied with position and was 50 ± 7.3 cGy in the inner half, 24 ± 4.1 cGy at the center, and 16 ± 2.2 cGy in the outer half for the open tangential plan. Compared with an average dose of 31 cGy across all points for the open field, the average doses were simple segment 32 cGy (range, 99-105% compared with open technique), complex segment 34 cGy (range, 103-117% compared with open technique), beamlet IMRT 34 cGy (range, 103-124% compared with open technique), lateral wedge only 46 cGy (range, 133-175% compared with open technique), and medial and lateral wedge 96 cGy (range, 282-370% compared with open technique). Conclusions: Single or dual wedge techniques resulted in the highest CBD increases compared with open tangents. To obtain the desired homogeneity to the treated breast while minimizing CBD, segmental and IMRT techniques should be encouraged over external physical compensators.

  9. Internal Physical Features of a Land Surface Model Employing a Tangent Linear Model

    NASA Technical Reports Server (NTRS)

    Yang, Runhua; Cohn, Stephen E.; daSilva, Arlindo; Joiner, Joanna; Houser, Paul R.

    1997-01-01

    The Earth's land surface, including its biomass, is an integral part of the Earth's weather and climate system. Land surface heterogeneity, such as the type and amount of vegetative covering, has a profound effect on local weather variability and therefore on regional variations of the global climate. Surface conditions affect local weather and climate through a number of mechanisms. First, they determine the redistribution of the net radiative energy received at the surface, through the atmosphere, from the sun. A certain fraction of this energy increases the surface ground temperature, another warms the near-surface atmosphere, and the rest evaporates surface water, which in turn creates clouds and causes precipitation. Second, they determine how much rainfall and snowmelt can be stored in the soil and how much instead runs off into waterways. Finally, surface conditions influence the near-surface concentration and distribution of greenhouse gases such as carbon dioxide. The processes through which these mechanisms interact with the atmosphere can be modeled mathematically, to within some degree of uncertainty, on the basis of underlying physical principles. Such a land surface model provides predictive capability for surface variables including ground temperature, surface humidity, and soil moisture and temperature. This information is important for agriculture and industry, as well as for addressing fundamental scientific questions concerning global and local climate change. In this study we apply a methodology known as tangent linear modeling to help us understand more deeply the behavior of the Mosaic land surface model, a model that has been developed over the past several years at NASA/GSFC. This methodology allows us to examine, directly and quantitatively, the dependence of prediction errors in land surface variables upon different vegetation conditions. The work also highlights the importance of accurate soil moisture information. Although surface
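
    Tangent linear modeling linearizes a nonlinear prediction model about a reference trajectory, so that the growth of small perturbations (for example, in soil moisture) can be examined directly through the model Jacobian. The sketch below uses a made-up two-variable surface scheme and a finite-difference Jacobian; it is only an illustration of the technique, not the Mosaic land surface model.

      import numpy as np

      def toy_surface_model(state, params):
          """Made-up two-variable surface scheme: state = [soil_moisture, ground_temp]."""
          w, t = state
          evap = params["evap_coeff"] * w * max(t - 273.0, 0.0)
          dw = params["precip"] - evap - params["runoff_coeff"] * w
          dt = params["radiation"] / params["heat_capacity"] - 0.1 * evap
          return state + np.array([dw, dt]) * params["dt"]

      def tangent_linear(model, state, params, eps=1e-6):
          """Finite-difference Jacobian of one model step about a reference state."""
          jac = np.zeros((state.size, state.size))
          base = model(state, params)
          for j in range(state.size):
              perturbed = state.copy()
              perturbed[j] += eps
              jac[:, j] = (model(perturbed, params) - base) / eps
          return jac

      params = {"evap_coeff": 1e-3, "runoff_coeff": 0.05, "precip": 0.02,
                "radiation": 300.0, "heat_capacity": 1e3, "dt": 1.0}
      x0 = np.array([0.3, 290.0])              # reference state
      J = tangent_linear(toy_surface_model, x0, params)
      dx0 = np.array([0.01, 0.0])              # small soil-moisture perturbation
      print("linearized perturbation after one step:", J @ dx0)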

  10. BM-Map: Bayesian Mapping of Multireads for Next-Generation Sequencing Data

    PubMed Central

    Ji, Yuan; Xu, Yanxun; Zhang, Qiong; Tsui, Kam-Wah; Yuan, Yuan; Norris, Clift; Liang, Shoudan; Liang, Han

    2011-01-01

    Next-generation sequencing (NGS) technology generates millions of short reads, which provide valuable information for various aspects of cellular activities and biological functions. A key step in NGS applications (e.g., RNA-Seq) is to map short reads to correct genomic locations within the source genome. While most reads are mapped to a unique location, a significant proportion of reads align to multiple genomic locations with equal or similar numbers of mismatches; these are called multireads. The ambiguity in mapping the multireads may lead to bias in downstream analyses. Currently, most practitioners discard the multireads in their analysis, resulting in a loss of valuable information, especially for genes with similar sequences. To refine the read mapping, we develop a Bayesian model that computes the posterior probability of mapping a multiread to each competing location. The probabilities are used for downstream analyses, such as the quantification of gene expression. We show through simulation studies and RNA-Seq analysis of real-life data that the Bayesian method yields better mapping than the current leading methods. We provide a C++ program for download that is being packaged into user-friendly software. PMID:21517792
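
    The central computation — a posterior probability for each candidate location of a multiread — can be sketched with a much-simplified mismatch-based likelihood and a coverage-based prior. This is a schematic stand-in for the BM-Map model, not the authors' implementation, and the error rate, read length and smoothing choices are assumptions.

      import numpy as np

      def multiread_posteriors(mismatches, unique_coverage, read_len=75, err=0.01):
          """Posterior over candidate locations of one multiread.

          mismatches:      number of mismatches at each candidate location
          unique_coverage: uniquely-mapped read counts near each location (used as prior)
          """
          mismatches = np.asarray(mismatches, dtype=float)
          prior = np.asarray(unique_coverage, dtype=float) + 1.0        # add-one smoothing
          prior /= prior.sum()
          # Simple independent-error likelihood of the observed mismatch count
          loglik = (mismatches * np.log(err)
                    + (read_len - mismatches) * np.log(1.0 - err))
          logpost = np.log(prior) + loglik
          logpost -= logpost.max()                                      # numerical stability
          post = np.exp(logpost)
          return post / post.sum()

      # A multiread aligning to three locations with 0, 1 and 1 mismatches
      print(multiread_posteriors([0, 1, 1], unique_coverage=[120, 30, 300]))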

  11. Trajectory analysis of land use and land cover maps to improve spatial-temporal patterns, and impact assessment on groundwater recharge

    NASA Astrophysics Data System (ADS)

    Zomlot, Z.; Verbeiren, B.; Huysmans, M.; Batelaan, O.

    2017-11-01

    Land use/land cover (LULC) change is a consequence of human-induced global environmental change. It is also considered one of the major factors affecting groundwater recharge. Uncertainties and inconsistencies in LULC maps are among the difficulties that LULC timeseries analysis faces, and they have a significant effect on hydrological impact analysis. Therefore, an accuracy assessment approach for LULC timeseries is needed for more reliable hydrological analysis and prediction. The objective of this paper is to assess the impact of land use uncertainty and to improve the accuracy of a timeseries of CORINE (coordination of information on the environment) land cover maps by using a new approach that identifies spatial-temporal LULC change trajectories as a pre-processing tool. This ensures consistency of model input when dealing with land-use dynamics and as such improves the accuracy of land use maps and consequently of groundwater recharge estimation. As a case study, the impact of consistent land use changes from 1990 until 2013 on groundwater recharge for the Flanders-Brussels region is assessed. The change trajectory analysis successfully assigned a rational trajectory to 99% of all pixels. The methodology is shown to be powerful in correcting interpretation inconsistencies and overestimation errors in CORINE land cover maps. The overall kappa (cell-by-cell map comparison) improved from 0.6 to 0.8 and from 0.2 to 0.7 for the forest and pasture land use classes, respectively. The study shows that the inconsistencies in the land use maps introduce uncertainty in groundwater recharge estimation in a range of 10-30%. The analysis showed that during the period 1990-2013 the LULC changes were mainly driven by urban expansion. The results show that the resolution at which the spatial analysis is performed is important; the recharge differences using original and corrected CORINE land cover maps increase considerably with increasing spatial resolution. This study indicates
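
    The overall kappa quoted above comes from a cell-by-cell comparison of two categorical rasters; a minimal sketch of that computation, assuming two equally sized label arrays with made-up values, is:

      import numpy as np

      def overall_kappa(map_a, map_b):
          """Cohen's kappa from a cell-by-cell comparison of two label rasters."""
          a, b = np.ravel(map_a), np.ravel(map_b)
          classes = np.union1d(a, b)
          n = a.size
          # Confusion matrix between the two maps
          conf = np.array([[np.logical_and(a == ca, b == cb).sum() for cb in classes]
                           for ca in classes], dtype=float)
          po = np.trace(conf) / n                                      # observed agreement
          pe = (conf.sum(axis=1) * conf.sum(axis=0)).sum() / n ** 2    # chance agreement
          return (po - pe) / (1.0 - pe)

      # Two tiny 3x3 land-cover rasters (1 = forest, 2 = pasture, 3 = urban)
      m1990 = np.array([[1, 1, 2], [1, 2, 2], [3, 3, 2]])
      m2013 = np.array([[1, 1, 2], [1, 1, 2], [3, 3, 3]])
      print(round(overall_kappa(m1990, m2013), 3))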

  12. Mapping the total electron content over Malaysia using Spherical Cap Harmonic Analysis

    NASA Astrophysics Data System (ADS)

    Bahari, S.; Abdullah, M.; Bouya, Z.; Musa, T. A.

    2017-12-01

    The ionosphere over Malaysia is unique because of the country's location close to the geomagnetic equator, within the equatorial region. In this region the magnetic field is horizontal and oriented from south to north, and the E x B drift in the meridional plane gives rise to equatorial ionospheric anomalies such as plasma bubbles and the fountain effect. To date, no ionospheric model has been developed specifically for Malaysia. The main objective of this paper is therefore to develop a new technique for mapping the total electron content (TEC) from GPS measurements. Data from the myRTKnet network of GPS receivers over Malaysia were used in this study. A new methodology, based on a modified spherical cap harmonic analysis (SCHA), was developed to estimate diurnal vertical TEC over the region using GPS observations. The SCHA model is based on a longitudinal expansion in Fourier series and fractional Legendre co-latitudinal functions over a spherical cap-like region. TEC maps with a spatial resolution of 0.15° x 0.15° in latitude and longitude and a time resolution of 30 seconds are derived. TEC maps from the SCHA model were compared with the global ionospheric map and other regional models. Results show that during low solar activity the SCHA model gave a better mapping, with an accuracy of less than 1 TECU, compared with other regional models.
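
    The SCHA expansion itself uses fractional-degree Legendre functions, which are beyond a short snippet; the sketch below substitutes integer-degree Legendre polynomials in a rescaled cap colatitude combined with a Fourier series in longitude, fitted by least squares, as a simplified stand-in. The cap centre, cap radius and truncation orders are illustrative assumptions only.

      import numpy as np
      from numpy.polynomial import legendre as L

      def cap_basis(lat_deg, lon_deg, cap_center, cap_radius_deg, kmax=3, mmax=2):
          """Simplified cap-harmonic-like basis: Legendre polynomials in rescaled cap
          colatitude times a Fourier series in longitude."""
          lat0, lon0 = cap_center
          colat = np.sqrt((lat_deg - lat0) ** 2 + (lon_deg - lon0) ** 2)  # crude angular distance
          x = np.cos(np.pi * np.clip(colat / cap_radius_deg, 0, 1))       # rescale cap onto [-1, 1]
          lam = np.radians(lon_deg)
          cols = []
          for k in range(kmax + 1):
              pk = L.legval(x, [0] * k + [1])          # Legendre polynomial P_k
              cols.append(pk)
              for m in range(1, mmax + 1):
                  cols.append(pk * np.cos(m * lam))
                  cols.append(pk * np.sin(m * lam))
          return np.column_stack(cols)

      def fit_tec_map(lat, lon, vtec, cap_center=(4.0, 109.0), cap_radius_deg=12.0):
          """Least-squares fit of VTEC observations; returns a callable TEC map."""
          A = cap_basis(lat, lon, cap_center, cap_radius_deg)
          coeffs, *_ = np.linalg.lstsq(A, vtec, rcond=None)
          def tec(lat_q, lon_q):
              return cap_basis(np.atleast_1d(lat_q), np.atleast_1d(lon_q),
                               cap_center, cap_radius_deg) @ coeffs
          return tec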

  13. Hybrid Semantic Analysis for Mapping Adverse Drug Reaction Mentions in Tweets to Medical Terminology.

    PubMed

    Emadzadeh, Ehsan; Sarker, Abeed; Nikfarjam, Azadeh; Gonzalez, Graciela

    2017-01-01

    Social networks, such as Twitter, have become important sources for active monitoring of user-reported adverse drug reactions (ADRs). Automatic extraction of ADR information can be crucial for healthcare providers, drug manufacturers, and consumers. However, because of the non-standard nature of social media language, automatically extracted ADR mentions need to be mapped to standard forms before they can be used by operational pharmacovigilance systems. We propose a modular natural language processing pipeline for mapping (normalizing) colloquial mentions of ADRs to their corresponding standardized identifiers. We seek to accomplish this task and enable customization of the pipeline so that distinct unlabeled free text resources can be incorporated to use the system for other normalization tasks. Our approach, which we call Hybrid Semantic Analysis (HSA), sequentially employs rule-based and semantic matching algorithms for mapping user-generated mentions to concept IDs in the Unified Medical Language System vocabulary. The semantic matching component of HSA is adaptive in nature and uses a regression model to combine various measures of semantic relatedness and resources to optimize normalization performance on the selected data source. On a publicly available corpus, our normalization method achieves 0.502 recall and 0.823 precision (F-measure: 0.624). Our proposed method outperforms a baseline based on latent semantic analysis and another that uses MetaMap.
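
    The hybrid idea — exact and rule-based lookup first, similarity-based matching as a fallback — can be sketched as follows. Here the semantic step is approximated with character n-gram TF-IDF cosine similarity (scikit-learn), which only stands in for the combination of relatedness measures used by HSA, and the mini-lexicon, rules and concept identifiers are made up.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      # Hypothetical mini-lexicon of standard ADR terms and placeholder concept IDs
      LEXICON = {"headache": "ID_001", "nausea": "ID_002",
                 "somnolence": "ID_003", "insomnia": "ID_004"}
      RULES = {"cant sleep": "insomnia", "feeling sleepy": "somnolence"}  # hand-written rules

      def normalize(mention, threshold=0.2):
          """Map a colloquial ADR mention to a concept ID: rules first, similarity second."""
          text = mention.lower().strip()
          if text in LEXICON:                       # exact lexicon hit
              return LEXICON[text]
          if text in RULES:                         # rule-based rewrite
              return LEXICON[RULES[text]]
          # Fallback: character n-gram TF-IDF similarity against the lexicon terms
          terms = list(LEXICON)
          vec = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
          matrix = vec.fit_transform(terms + [text])
          sims = cosine_similarity(matrix[len(terms)], matrix[:len(terms)]).ravel()
          best = sims.argmax()
          return LEXICON[terms[best]] if sims[best] >= threshold else None

      print(normalize("cant sleep"), normalize("really bad head ache"))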

  14. Geologic map of Mars

    USGS Publications Warehouse

    Tanaka, Kenneth L.; Skinner, James A.; Dohm, James M.; Irwin, Rossman P.; Kolb, Eric J.; Fortezzo, Corey M.; Platz, Thomas; Michael, Gregory G.; Hare, Trent M.

    2014-01-01

    This global geologic map of Mars, which records the distribution of geologic units and landforms on the planet's surface through time, is based on unprecedented variety, quality, and quantity of remotely sensed data acquired since the Viking Orbiters. These data have provided morphologic, topographic, spectral, thermophysical, radar sounding, and other observations for integration, analysis, and interpretation in support of geologic mapping. In particular, the precise topographic mapping now available has enabled consistent morphologic portrayal of the surface for global mapping (whereas previously used visual-range image bases were less effective, because they combined morphologic and albedo information and, locally, atmospheric haze). Also, thermal infrared image bases used for this map tended to be less affected by atmospheric haze and thus are reliable for analysis of surface morphology and texture at even higher resolution than the topographic products.

  15. Mapping asphalt pavement aging and condition using multiple endmember spectral mixture analysis in Beijing, China

    NASA Astrophysics Data System (ADS)

    Pan, Yifan; Zhang, Xianfeng; Tian, Jie; Jin, Xu; Luo, Lun; Yang, Ke

    2017-01-01

    Asphalt road reflectance spectra change as pavement ages. This makes it possible to use remote sensing to monitor changes in asphalt pavement condition. However, the relatively narrow geometry of roads and the relatively coarse spatial resolution of remotely sensed imagery result in mixtures between pavement and adjacent land covers (e.g., vegetation, buildings, and soil), increasing uncertainties in spectral analysis. To overcome this problem, multiple endmember spectral mixture analysis (MESMA) was used in this study to map the asphalt pavement condition using Worldview-2 satellite imagery. Based on extensive field investigation and in situ measurements, aged asphalt pavements were categorized into four stages: preliminarily aged, moderately aged, heavily aged, and distressed. The spectral characteristics of the first three stages were further analyzed, and a MESMA unmixing analysis was conducted to map these three pavement conditions from the Worldview-2 image. The results showed that the road pavement conditions could be detected well and mapped with an overall accuracy of 81.71% and a Kappa coefficient of 0.77. Finally, a quantitative assessment of the pavement conditions for each road segment in the study area was conducted to inform road maintenance management.
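
    MESMA extends simple linear unmixing by testing many candidate endmember combinations per pixel and keeping the model with the lowest reconstruction error. A compact sketch with made-up spectra follows; the constraints used in practice (non-negativity, sum-to-one, shade endmembers, fraction thresholds) are omitted.

      import itertools
      import numpy as np

      def unmix(pixel, endmembers):
          """Unconstrained linear unmixing; returns fractions and reconstruction RMSE."""
          E = np.column_stack(endmembers)              # bands x endmembers
          fractions, *_ = np.linalg.lstsq(E, pixel, rcond=None)
          rmse = np.sqrt(np.mean((pixel - E @ fractions) ** 2))
          return fractions, rmse

      def mesma(pixel, library, n_endmembers=2):
          """Try every combination of `n_endmembers` library spectra, keep the best model."""
          best = None
          for names in itertools.combinations(library, n_endmembers):
              fractions, rmse = unmix(pixel, [library[n] for n in names])
              if best is None or rmse < best[2]:
                  best = (names, fractions, rmse)
          return best

      # Made-up 8-band reflectance spectra for illustration only
      rng = np.random.default_rng(0)
      library = {name: rng.uniform(0.05, 0.6, 8)
                 for name in ["new_asphalt", "aged_asphalt", "vegetation", "soil"]}
      pixel = 0.7 * library["aged_asphalt"] + 0.3 * library["vegetation"]
      print(mesma(pixel, library))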

  16. Urban ventilation corridors mapping using surface morphology data based GIS analysis

    NASA Astrophysics Data System (ADS)

    Wicht, Marzena; Wicht, Andreas; Osińska-Skotak, Katarzyna

    2017-04-01

    This paper describes deriving the most appropriate method for mapping urban ventilation corridors, which, if properly designed, reduce heat stress and air pollution and increase air quality as well as horizontal wind speed. In terms of surface texture, urban areas are recognized as one of the roughest surface types, which results in wind obstruction and decreased ventilation of densely built-up areas. Urban heat islands and emissions from private households, traffic and large-scale industry occur frequently in many cities, in both temperate and tropical regions, and a proper ventilation system has been suggested as an appropriate mitigation measure [1]. Two concepts of morphometric analysis of the urban environment are used on the example of Warsaw, representing a dense urban environment located in the temperate zone. The methods used include, firstly, a roughness mapping calculation [2], which analyses the zero-plane displacement height (zd) and roughness length (z0) and their distribution for the eight (inter-)cardinal wind directions, and, secondly, a grid-based frontal area index mapping approach [3], which uses least-cost path analysis. Utilizing the advantages and minimizing the disadvantages of those two concepts, we propose a hybrid approach. All concepts are based on a 3D building database obtained from satellite imagery, aided by a cadastral building database. The derived ventilation corridors should be considered by the local authorities as worth preserving, if not expanding, in order to improve air quality in the city. The results also designate problematic areas that greatly obstruct ventilation and might be investigated for reshaping or rebuilding to introduce air flow in particularly dense areas such as city centers. Keywords: roughness mapping; GIS; ventilation corridors; frontal area index. [1] Rizwan, A. M., Dennis, L. Y., & Chunho, L. I. U. (2008). A review on the generation, determination and mitigation of Urban Heat Island.
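
    The frontal area index used in the grid-based approach is the total building frontal area facing the wind divided by the plan area of the grid cell. A minimal sketch with a hypothetical building table follows; the projection of rectangular footprints is a simplification.

      import math

      def frontal_area_index(buildings, cell_area_m2, wind_dir_deg):
          """lambda_f = sum of building frontal areas facing the wind / cell plan area.

          Each building is (width_m, depth_m, height_m, orientation_deg), where
          orientation is the azimuth of the facade of length width_m.
          """
          wind = math.radians(wind_dir_deg)
          frontal = 0.0
          for width, depth, height, orient in buildings:
              theta = math.radians(orient)
              # Horizontal extent of the footprint projected normal to the wind direction
              projected = (abs(width * math.sin(wind - theta))
                           + abs(depth * math.cos(wind - theta)))
              frontal += projected * height
          return frontal / cell_area_m2

      # Hypothetical 100 m x 100 m cell with two buildings, wind from the west (270 deg)
      buildings = [(30.0, 15.0, 20.0, 0.0), (20.0, 20.0, 35.0, 45.0)]
      print(round(frontal_area_index(buildings, 100.0 * 100.0, 270.0), 3))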

  17. Why Map Issues? On Controversy Analysis as a Digital Method

    PubMed Central

    2015-01-01

    This article takes stock of recent efforts to implement controversy analysis as a digital method in the study of science, technology, and society (STS) and beyond and outlines a distinctive approach to address the problem of digital bias. Digital media technologies exert significant influence on the enactment of controversy in online settings, and this risks undermining the substantive focus of controversy analysis conducted by digital means. To address this problem, I propose a shift in thematic focus from controversy analysis to issue mapping. The article begins by distinguishing between three broad frameworks that currently guide the development of controversy analysis as a digital method, namely, demarcationist, discursive, and empiricist. Each has been adopted in STS, but only the last one offers a digital “move beyond impartiality.” I demonstrate this approach by analyzing issues of Internet governance with the aid of the social media platform Twitter. PMID:26336325

  18. Cubic map algebra functions for spatio-temporal analysis

    USGS Publications Warehouse

    Mennis, J.; Viger, R.; Tomlin, C.D.

    2005-01-01

    We propose an extension of map algebra to three dimensions for spatio-temporal data handling. This approach yields a new class of map algebra functions that we call "cube functions." Whereas conventional map algebra functions operate on data layers representing two-dimensional space, cube functions operate on data cubes representing two-dimensional space over a third-dimensional period of time. We describe the prototype implementation of a spatio-temporal data structure and selected cube function versions of conventional local, focal, and zonal map algebra functions. The utility of cube functions is demonstrated through a case study analyzing the spatio-temporal variability of remotely sensed, southeastern U.S. vegetation character over various land covers and during different El Niño/Southern Oscillation (ENSO) phases. Like conventional map algebra, the application of cube functions may demand significant data preprocessing when integrating diverse data sets, and it is subject to limitations related to data storage and algorithm performance. Solutions to these issues include extending data compression and computing strategies for calculations on very large data volumes to spatio-temporal data handling.
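
    A cube analogue of a focal map algebra function is a moving-window operation over a three-dimensional array whose axes are (time, row, column), and a zonal-style cube function reduces the cube within zone labels. The sketch below, with made-up data and illustrative function names, uses scipy's uniform filter as the focal mean; it is an interpretation of the idea, not the authors' prototype.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def focal_mean_cube(cube, time_window=3, space_window=3):
          """Focal mean over a (time, row, col) data cube: each cell is replaced by the
          mean of its spatio-temporal neighbourhood."""
          return uniform_filter(cube.astype(float),
                                size=(time_window, space_window, space_window),
                                mode="nearest")

      def zonal_temporal_mean(cube, zones):
          """Zonal-style cube function: mean time series of the cube within each zone label."""
          return {int(z): cube[:, zones == z].mean(axis=1) for z in np.unique(zones)}

      # Toy NDVI-like cube: 12 time steps over a 5x5 grid, plus a 2-zone land-cover layer
      rng = np.random.default_rng(1)
      cube = rng.uniform(0.1, 0.9, size=(12, 5, 5))
      zones = np.zeros((5, 5), dtype=int)
      zones[:, 3:] = 1
      smoothed = focal_mean_cube(cube)
      print(smoothed.shape, {k: v.round(2) for k, v in zonal_temporal_mean(cube, zones).items()})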

  19. Genome-wide SNP identification for the construction of a high-resolution genetic map of Japanese flounder (Paralichthys olivaceus): applications to QTL mapping of Vibrio anguillarum disease resistance and comparative genomic analysis.

    PubMed

    Shao, Changwei; Niu, Yongchao; Rastas, Pasi; Liu, Yang; Xie, Zhiyuan; Li, Hengde; Wang, Lei; Jiang, Yong; Tai, Shuaishuai; Tian, Yongsheng; Sakamoto, Takashi; Chen, Songlin

    2015-04-01

    High-resolution genetic maps are essential for fine mapping of complex traits, genome assembly, and comparative genomic analysis. Single-nucleotide polymorphisms (SNPs) are the primary molecular markers used for genetic map construction. In this study, we identified 13,362 SNPs evenly distributed across the Japanese flounder (Paralichthys olivaceus) genome. Of these SNPs, 12,712 high-confidence SNPs were subjected to high-throughput genotyping and assigned to 24 consensus linkage groups (LGs). The total length of the genetic linkage map was 3,497.29 cM with an average distance of 0.47 cM between loci, thereby representing the densest genetic map currently reported for Japanese flounder. Nine positive quantitative trait loci (QTLs) forming two main clusters for Vibrio anguillarum disease resistance were detected. All QTLs could explain 5.1-8.38% of the total phenotypic variation. Synteny analysis of the QTL regions on the genome assembly revealed 12 immune-related genes, among them 4 genes strongly associated with V. anguillarum disease resistance. In addition, 246 genome assembly scaffolds with an average size of 21.79 Mb were anchored onto the LGs; these scaffolds, comprising 522.99 Mb, represented 95.78% of assembled genomic sequences. The mapped assembly scaffolds in Japanese flounder were used for genome synteny analyses against zebrafish (Danio rerio) and medaka (Oryzias latipes). Flounder and medaka were found to possess almost one-to-one synteny, whereas flounder and zebrafish exhibited a multi-syntenic correspondence. The newly developed high-resolution genetic map, which will facilitate QTL mapping, scaffold assembly, and genome synteny analysis of Japanese flounder, marks a milestone in the ongoing genome project for this species. © The Author 2015. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.

  20. Allelic Analysis of Sheath Blight Resistance with Association Mapping in Rice

    PubMed Central

    Jia, Limeng; Yan, Wengui; Zhu, Chengsong; Agrama, Hesham A.; Jackson, Aaron; Yeater, Kathleen; Li, Xiaobai; Huang, Bihu; Hu, Biaolin; McClung, Anna; Wu, Dianxing

    2012-01-01

    Sheath blight (ShB) caused by the soil-borne pathogen Rhizoctonia solani is one of the most devastating diseases in rice worldwide. Global attention has focused on examining individual mapping populations for quantitative trait loci (QTLs) for ShB resistance, but to date no study has taken advantage of association mapping to examine hundreds of lines for potentially novel QTLs. Our objective was to identify ShB QTLs via association mapping in rice using 217 sub-core entries from the USDA rice core collection, which were phenotyped with a micro-chamber screening method and genotyped with 155 genome-wide markers. Structure analysis divided the mapping panel into five groups, and model comparison revealed that PCA5 with genomic control was the best model for association mapping of ShB. Ten marker loci on seven chromosomes were significantly associated with response to the ShB pathogen. Among the multiple alleles at each identified locus, the allele contributing the greatest effect to ShB resistance was named the putative resistant allele. Among the 217 entries, entry GSOR 310389 contained the most putative resistant alleles, eight out of ten. The number of putative resistant alleles present in an entry was highly and significantly correlated with the decrease of ShB rating (r = −0.535), i.e. the increase of ShB resistance. The majority of the resistant entries that contained a large number of the putative resistant alleles belonged to indica, which is consistent with the general observation that most ShB-resistant accessions are of indica origin. These findings demonstrate the potential to improve breeding efficiency by using marker-assisted selection to pyramid putative resistant alleles from various loci in a cultivar for enhanced ShB resistance in rice. PMID:22427867

  1. School Mapping and Geospatial Analysis of the Schools in Jasra Development Block of India

    NASA Astrophysics Data System (ADS)

    Agrawal, S.; Gupta, R. D.

    2016-06-01

    GIS is a collection of tools and techniques that operate on geospatial data and are used in analysis and decision making. Education is an inherent part of any civil society. Proper educational facilities generate high-quality human resources for any nation. Therefore, government needs an efficient system that can help in analysing the current state of education and its progress. Government also needs a system that can support decision making and policy framing. GIS can serve these requirements not only for government but also for the general public. In order to meet the standards of human development, it is necessary for the government and decision makers to keep a close watch on the existing education policy and the state of its implementation. School mapping plays an important role in this respect. School mapping consists of building a geospatial database of schools that supports infrastructure development, policy analysis and decision making. The present research work is an attempt to support the Right to Education (RTE) and Sarv Sikha Abhiyaan (SSA) programmes run by the Government of India through the use of GIS. School mapping of the study area is performed, followed by geospatial analysis. This research work will help in assessing the present status of educational infrastructure in the Jasra block of Allahabad district, India.

  2. Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1997-01-01

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis has been developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal which is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, which requires immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to the fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.

  3. Using Web Maps to Analyze the Construction of Global Scale Cognitive Maps

    ERIC Educational Resources Information Center

    Pingel, Thomas J.

    2018-01-01

    Game-based Web sites and applications are changing the ways in which students learn the world map. In this study, a Web map-based digital learning tool was used as a study aid for a university-level geography course in order to examine the way in which global scale cognitive maps are constructed. A network analysis revealed that clicks were…

  4. GenomeLandscaper: Landscape analysis of genome-fingerprints maps assessing chromosome architecture.

    PubMed

    Ai, Hannan; Ai, Yuncan; Meng, Fanmei

    2018-01-18

    Assessing correctness of an assembled chromosome architecture is a central challenge. We create a geometric analysis method (called GenomeLandscaper) to conduct landscape analysis of genome-fingerprints maps (GFM), trace large-scale repetitive regions, and assess their impacts on the global architectures of assembled chromosomes. We develop an alignment-free method for phylogenetics analysis. The human Y chromosomes (GRCh.chrY, HuRef.chrY and YH.chrY) are analysed as a proof-of-concept study. We construct a galaxy of genome-fingerprints maps (GGFM) for them, and a landscape compatibility among relatives is observed. But a long sharp straight line on the GGFM breaks such a landscape compatibility, distinguishing GRCh38p1.chrY (and throughout GRCh38p7.chrY) from GRCh37p13.chrY, HuRef.chrY and YH.chrY. We delete a 1.30-Mbp target segment to rescue the landscape compatibility, matching the antecedent GRCh37p13.chrY. We re-locate it into the modelled centromeric and pericentromeric region of GRCh38p10.chrY, matching a gap placeholder of GRCh37p13.chrY. We decompose it into sub-constituents (such as BACs, interspersed repeats, and tandem repeats) and trace their homologues by phylogenetics analysis. We elucidate that most examined tandem repeats are of reasonable quality, but the BAC-sized repeats, 173U1020C (176.46 Kbp) and 5U41068C (205.34 Kbp), are likely over-repeated. These results offer unique insights into the centromeric and pericentromeric regions of the human Y chromosomes.

  5. The effectiveness of concept mapping on development of critical thinking in nursing education: A systematic review and meta-analysis.

    PubMed

    Yue, Meng; Zhang, Meng; Zhang, Chunmei; Jin, Changde

    2017-05-01

    As an essential skill in daily clinical nursing practice, critical thinking ability has been an important objective in nursing education. Concept mapping enables nursing students to connect new information to existing knowledge and to integrate interdisciplinary knowledge. However, there is a lack of evidence on the relationship between concept mapping and critical thinking ability in nursing education. The purpose of this systematic review and meta-analysis was to assess the effect of concept mapping on developing critical thinking in nursing education. This systematic review was reported in line with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). A search was conducted in PubMed, Web of Science, Embase, Cochrane Central Register of Controlled Trials (CENTRAL), Cumulative Index to Nursing and Allied Health (CINAHL) and China National Knowledge Infrastructure (CNKI). Randomized controlled trials (RCTs) comparing concept mapping and traditional teaching methods were retrieved. Data were collected by two reviewers according to the data extraction tables. The methodological quality of included studies was assessed by another two reviewers. The results of the meta-analysis were presented using mean difference (MD). Thirteen trials were summarized in the systematic review and eleven trials were included in the meta-analysis. The pooled effect size showed that, compared with traditional methods, concept mapping could improve subjects' critical thinking ability measured by the California Critical Thinking Disposition Inventory (CCTDI), California Critical Thinking Skill Test (CCTST) and Critical Thinking Scale (CTS). The subgroup analyses showed that concept mapping improved the scores of all subscales. The results of this review indicate that concept mapping can affect critical thinking affective dispositions and critical thinking cognitive skills. Further high-quality research using uniform evaluation is required. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Lunar terrain mapping and relative-roughness analysis

    NASA Technical Reports Server (NTRS)

    Rowan, L. C.; Mccauley, J. F.; Holm, E. A.

    1971-01-01

    Terrain maps of the equatorial zone were prepared at scales of 1:2,000,000 and 1:1,000,000 to classify lunar terrain with respect to roughness and to provide a basis for selecting sites for Surveyor and Apollo landings, as well as for Ranger and Lunar Orbiter photographs. Lunar terrain was described by qualitative and quantitative methods and divided into four fundamental classes: maria, terrae, craters, and linear features. Some 35 subdivisions were defined and mapped throughout the equatorial zone, and, in addition, most of the map units were illustrated by photographs. The terrain types were analyzed quantitatively to characterize and order their relative roughness characteristics. For some morphologically homogeneous mare areas, relative roughness can be extrapolated to the large scales from measurements at small scales.

  7. Mapping the dengue scientific landscape worldwide: a bibliometric and network analysis.

    PubMed

    Mota, Fabio Batista; Fonseca, Bruna de Paula Fonseca E; Galina, Andréia Cristina; Silva, Roseli Monteiro da

    2017-05-01

    Despite the current global trend of reduction in the morbidity and mortality of neglected diseases, dengue's incidence has increased and its occurrence areas have expanded. Dengue also persists as a scientific and technological challenge since there is no effective treatment, vaccine, vector control or public health intervention. Combining bibliometrics and social network analysis methods can support the mapping of dengue research and development (R&D) activities worldwide. The aim of this paper is to map the scientific scenario related to dengue research worldwide. We use scientific publication data from the Web of Science Core Collection - articles indexed in the Science Citation Index Expanded (SCI-EXPANDED) - and combine bibliometrics and social network analysis techniques to identify the most relevant journals, scientific references, research areas, countries and research organisations in the dengue scientific landscape. Our results show a significant increase of dengue publications over time; tropical medicine and virology as the most frequent research areas and biochemistry and molecular biology as the most central area in the network; the USA and Brazil as the most productive countries; and Mahidol University and Fundação Oswaldo Cruz as the main research organisations, with the Centres for Disease Control and Prevention as the most central organisation in the collaboration network. Our findings can be used to strengthen a global knowledge platform guiding policy, planning and funding decisions, as well as to provide directions to researchers and institutions. By offering the scientific community, policy makers and public health practitioners a mapping of the dengue scientific landscape, this paper aims to contribute to upcoming debates, decision-making and planning on dengue R&D and public health strategies worldwide.

  8. Venus Quadrangle Geological Mapping: Use of Geoscience Data Visualization Systems in Mapping and Training

    NASA Technical Reports Server (NTRS)

    Head, James W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Kumar, P. Senthil

    2008-01-01

    We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping [1,2]. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping [3]. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping. We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment," or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks [e.g., 4].

  9. RIMS: An Integrated Mapping and Analysis System with Applications to Earth Sciences and Hydrology

    NASA Astrophysics Data System (ADS)

    Proussevitch, A. A.; Glidden, S.; Shiklomanov, A. I.; Lammers, R. B.

    2011-12-01

    A web-based information and computational system for analysis of spatially distributed Earth system, climate, and hydrologic data has been developed. The system allows visualization, data exploration, querying, manipulation and arbitrary calculations with any loaded gridded or vector polygon dataset. The system's acronym, RIMS, stands for its core functionality as a Rapid Integrated Mapping System. The system can be deployed for global-scale projects as well as for regional hydrology and climatology studies. In particular, the Water Systems Analysis Group of the University of New Hampshire developed global and regional (Northern Eurasia, pan-Arctic) versions of the system with different map projections and specific data. The system has demonstrated its potential for applications in other fields of Earth sciences and education. The key Web server/client components of the framework include (a) a visualization engine built on Open Source libraries (GDAL, PROJ.4, etc.) that are utilized in a MapServer; (b) multi-level data querying tools built on XML server-client communication protocols that allow downloading map data on-the-fly to a client web browser; and (c) data manipulation and grid-cell-level calculation tools that mimic desktop GIS software functionality via a web interface. Server-side data management of the system is designed around a simple database of dataset metadata, facilitating mounting of new data to the system and maintaining existing data in an easy manner. RIMS contains built-in river network data that allows for query of upstream areas on demand, which can be used for spatial data aggregation and analysis of sub-basin areas. RIMS is an ongoing effort and is currently being used to serve a number of websites hosting a suite of hydrologic, environmental and other GIS data.

  10. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping.

    PubMed

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-03-04

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster-Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster-Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC operation
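
    The weighted linear combination step, with Monte Carlo-perturbed criterion weights, can be sketched as follows; the criterion rasters, weight means and standard deviations are made up, and the AHP pairwise comparisons and OWA ordering used in the study are omitted.

      import numpy as np

      def wlc(criteria, weights):
          """Weighted linear combination of standardized criterion rasters."""
          weights = np.asarray(weights, dtype=float)
          weights = weights / weights.sum()
          return np.tensordot(weights, criteria, axes=1)   # susceptibility raster

      def monte_carlo_wlc(criteria, weight_means, weight_sds, n_runs=500, seed=0):
          """Propagate weight uncertainty: mean and std of susceptibility per cell."""
          rng = np.random.default_rng(seed)
          runs = []
          for _ in range(n_runs):
              w = np.clip(rng.normal(weight_means, weight_sds), 1e-6, None)
              runs.append(wlc(criteria, w))
          runs = np.stack(runs)
          return runs.mean(axis=0), runs.std(axis=0)

      # Three standardized criteria (e.g. slope, land use, distance to faults) on a 4x4 grid
      rng = np.random.default_rng(42)
      criteria = rng.uniform(0, 1, size=(3, 4, 4))
      mean_map, sd_map = monte_carlo_wlc(criteria, [0.5, 0.3, 0.2], [0.05, 0.05, 0.05])
      print(mean_map.round(2), sd_map.round(3), sep="\n")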

  11. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    PubMed Central

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC

  12. The Necessity for Routine Pre-operative Ultrasound Mapping Before Arteriovenous Fistula Creation: A Meta-analysis.

    PubMed

    Georgiadis, G S; Charalampidis, D G; Argyriou, C; Georgakarakos, E I; Lazarides, M K

    2015-05-01

    Existing guidelines suggest routine use of pre-operative color Doppler ultrasound (DUS) vessel mapping before the creation of arteriovenous fistulae (AVF); however, there is controversy about its benefit over traditional clinical examination or selective ultrasound use. This was a systematic review and meta-analysis of randomized controlled trials (RCTs) comparing routine DUS mapping before the creation of AVF with patients for whom the decision for AVF placement was based on clinical examination and selective ultrasound use. A search of MEDLINE/PubMed, SCOPUS, and the Cochrane Library was carried out in June 2014. The analyzed outcome measures were the immediate failure rate and the early/midterm adequacy of the fistula for hemodialysis. Additionally, assessment of the methodological quality of the included studies was carried out. Five studies (574 patients) were analyzed. A random effects model was used to pool the data. The pooled odds ratio (OR) for the immediate failure rate was 0.32 (95% confidence interval [CI] 0.17-0.60; p < .01), which was significantly in favor of the DUS mapping group. The pooled OR for the early/midterm adequacy for hemodialysis was 0.66 (95% CI 0.42-1.03; p = .06), with a trend in favor of the DUS mapping group; however, subgroup analysis revealed that routine DUS mapping was more beneficial than selective DUS (p < .05). The available evidence, based mainly on moderate quality RCTs, suggests that the pre-operative clinical examination should always be supplemented with routine DUS mapping before AVF creation. This policy avoids negative surgical explorations and significantly reduces the immediate AVF failure rate. Copyright © 2015 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
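
    The pooled odds ratios above come from a random-effects meta-analysis; a minimal DerSimonian-Laird sketch over hypothetical 2x2 tables (the actual trial counts are not reproduced here) is:

      import numpy as np

      def pooled_or_random_effects(tables):
          """DerSimonian-Laird pooled odds ratio from (a, b, c, d) 2x2 tables, where
          a/b are events/non-events in one arm and c/d in the comparison arm."""
          tables = np.asarray(tables, dtype=float) + 0.5        # continuity correction
          a, b, c, d = tables.T
          y = np.log((a * d) / (b * c))                         # per-study log odds ratio
          v = 1 / a + 1 / b + 1 / c + 1 / d                     # its variance
          w = 1 / v                                             # fixed-effect weights
          y_fixed = (w * y).sum() / w.sum()
          q = (w * (y - y_fixed) ** 2).sum()                    # heterogeneity statistic
          k = len(y)
          tau2 = max(0.0, (q - (k - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))
          w_re = 1 / (v + tau2)                                 # random-effects weights
          y_re = (w_re * y).sum() / w_re.sum()
          se = np.sqrt(1 / w_re.sum())
          ci = np.exp([y_re - 1.96 * se, y_re + 1.96 * se])
          return np.exp(y_re), ci

      # Hypothetical (failures, successes) per arm for three trials
      tables = [(3, 47, 9, 41), (2, 58, 7, 53), (4, 96, 10, 90)]
      print(pooled_or_random_effects(tables))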

  13. Deriving pathway maps from automated text analysis using a grammar-based approach.

    PubMed

    Olsson, Björn; Gawronska, Barbara; Erlendsson, Björn

    2006-04-01

    We demonstrate how automated text analysis can be used to support the large-scale analysis of metabolic and regulatory pathways by deriving pathway maps from textual descriptions found in the scientific literature. The main assumption is that correct syntactic analysis combined with domain-specific heuristics provides a good basis for relation extraction. Our method uses an algorithm that searches through the syntactic trees produced by a parser based on a Referent Grammar formalism, identifies relations mentioned in the sentence, and classifies them with respect to their semantic class and epistemic status (facts, counterfactuals, hypotheses). The semantic categories used in the classification are based on the relation set used in KEGG (Kyoto Encyclopedia of Genes and Genomes), so that pathway maps using KEGG notation can be automatically generated. We present the current version of the relation extraction algorithm and an evaluation based on a corpus of abstracts obtained from PubMed. The results indicate that the method is able to combine a reasonable coverage with high accuracy. We found that 61% of all sentences were parsed, and 97% of the parse trees were judged to be correct. The extraction algorithm was tested on a sample of 300 parse trees and was found to produce correct extractions in 90.5% of the cases.

  14. An overview of animal science research 1945-2011 through science mapping analysis.

    PubMed

    Rodriguez-Ledesma, A; Cobo, M J; Lopez-Pujalte, C; Herrera-Viedma, E

    2015-12-01

    The conceptual structure of the field of Animal Science (AS) research is examined by means of a longitudinal science mapping analysis. The whole of the AS research field is analysed, revealing its conceptual evolution. To this end, an automatic approach to detecting and visualizing hidden themes or topics and their evolution across a consecutive span of years was applied to AS publications of the JCR category 'Agriculture, Dairy & Animal Science' during the period 1945-2011. This automatic approach was based on a coword analysis and combines performance analysis and science mapping. To observe the conceptual evolution of AS, six consecutive periods were defined: 1945-1969, 1970-1979, 1980-1989, 1990-1999, 2000-2005 and 2006-2011. Research in AS was identified as having focused on ten main thematic areas: ANIMAL-FEEDING, SMALL-RUMINANTS, ANIMAL-REPRODUCTION, DAIRY-PRODUCTION, MEAT-QUALITY, SWINE-PRODUCTION, GENETICS-AND-ANIMAL-BREEDING, POULTRY, ANIMAL-WELFARE and GROWTH-FACTORS-AND-FATTY-ACIDS. The results show how genomic studies gain in weight and integrate with other thematic areas. The whole of AS research has become oriented towards an overall framework in which animal welfare, sustainable management and human health play a major role. All this would affect the future structure and management of livestock farming. © 2014 Blackwell Verlag GmbH.

  15. Analysis of microarray leukemia data using an efficient MapReduce-based K-nearest-neighbor classifier.

    PubMed

    Kumar, Mukesh; Rath, Nitish Kumar; Rath, Santanu Kumar

    2016-04-01

    Microarray-based gene expression profiling has emerged as an efficient technique for classification, prognosis, diagnosis, and treatment of cancer. Frequent changes in the behavior of this disease generate an enormous volume of data. Microarray data satisfies both the veracity and velocity properties of big data, as it keeps changing with time. Therefore, the analysis of microarray datasets in a small amount of time is essential. These datasets contain a large amount of expression data, but only a fraction of the genes are significantly expressed. The precise identification of genes of interest that are responsible for causing cancer is imperative in microarray data analysis. Most existing schemes employ a two-phase process such as feature selection/extraction followed by classification. In this paper, various statistical methods (tests) based on MapReduce are proposed for selecting relevant features. After feature selection, a MapReduce-based K-nearest neighbor (mrKNN) classifier is also employed to classify microarray data. These algorithms are successfully implemented in a Hadoop framework. A comparative analysis is done on these MapReduce-based models using microarray datasets of various dimensions. From the obtained results, it is observed that these models consume much less execution time than conventional models in processing big data. Copyright © 2016 Elsevier Inc. All rights reserved.
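
    The distributed idea behind an mrKNN-style classifier can be illustrated without Hadoop: each mapper returns the k nearest neighbours found in its own data chunk, and a reducer merges the partial lists and votes. The sketch below uses hypothetical data shapes and plain Python functions in place of actual MapReduce jobs.

```python
import numpy as np
from collections import Counter

def map_phase(chunk_X, chunk_y, query, k):
    """Mapper: a data chunk emits its k locally nearest (distance, label) pairs."""
    d = np.linalg.norm(chunk_X - query, axis=1)
    idx = np.argsort(d)[:k]
    return list(zip(d[idx], chunk_y[idx]))

def reduce_phase(partial_lists, k):
    """Reducer: merge partial neighbour lists and vote on the k globally nearest."""
    merged = sorted((p for lst in partial_lists for p in lst), key=lambda t: t[0])[:k]
    return Counter(label for _, label in merged).most_common(1)[0][0]

# toy example: 3 "nodes", 2 classes, 5 gene-expression features per sample
rng = np.random.default_rng(0)
chunks = [(rng.normal(i, 1, (20, 5)), np.full(20, i % 2)) for i in range(3)]
query = rng.normal(0, 1, 5)

partials = [map_phase(X, y, query, k=5) for X, y in chunks]
print("predicted class:", reduce_phase(partials, k=5))
```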

  16. Integrating recent land cover mapping efforts to update the National Gap Analysis Program's species habitat map

    USGS Publications Warehouse

    McKerrow, Alexa; Davidson, A.; Earnhardt, Todd; Benson, Abigail L.; Toth, Charles; Holm, Thomas; Jutz, Boris

    2014-01-01

    Over the past decade, great progress has been made to develop national extent land cover mapping products to address natural resource issues. One of the core products of the GAP Program is range-wide species distribution models for nearly 2000 terrestrial vertebrate species in the U.S. We rely on deductive modeling of habitat affinities using these products to create models of habitat availability. That approach requires that we have a thematically rich and ecologically meaningful map legend to support the modeling effort. In this work, we tested the integration of the Multi-Resolution Landscape Characterization Consortium's National Land Cover Database 2011 and LANDFIRE's Disturbance Products to update the 2001 National GAP Vegetation Dataset to reflect 2011 conditions. The revised product can then be used to update the species models. We tested the update approach in three geographic areas (Northeast, Southeast, and Interior Northwest). We used the NLCD product to identify areas where the cover type mapped in 2011 was different from what was in the 2001 land cover map. We used Google Earth and ArcGIS base maps as reference imagery in order to label areas identified as "changed" to the appropriate class from our map legend. Areas mapped as urban or water in the 2011 NLCD map that were mapped differently in the 2001 GAP map were accepted without further validation and recoded to the corresponding GAP class. We used LANDFIRE's Disturbance products to identify changes that are the result of recent disturbance and to inform the reassignment of areas to their updated thematic label. We ran species habitat models for three species: Lewis's Woodpecker (Melanerpes lewis), White-tailed Jackrabbit (Lepus townsendii), and Brown-headed Nuthatch (Sitta pusilla). For each of the three vertebrate species we found important differences in the amount and location of suitable habitat between the 2001 and 2011 habitat maps. Specifically, Brown-headed Nuthatch habitat in

  17. Machinery running state identification based on discriminant semi-supervised local tangent space alignment for feature fusion and extraction

    NASA Astrophysics Data System (ADS)

    Su, Zuqiang; Xiao, Hong; Zhang, Yi; Tang, Baoping; Jiang, Yonghua

    2017-04-01

    Extraction of sensitive features is a challenging but key task in data-driven machinery running state identification. To address this problem, a method for machinery running state identification that applies discriminant semi-supervised local tangent space alignment (DSS-LTSA) for feature fusion and extraction is proposed. Firstly, in order to extract more distinct features, the vibration signals are decomposed by wavelet packet decomposition (WPD), and a mixed-domain feature set consisting of statistical features, autoregressive (AR) model coefficients, instantaneous amplitude Shannon entropy and the WPD energy spectrum is extracted to comprehensively characterize the properties of the machinery running states. Then, the mixed-domain feature set is input into DSS-LTSA for feature fusion and extraction to eliminate redundant information and interference noise. The proposed DSS-LTSA can extract intrinsic structure information of both labeled and unlabeled state samples, and as a result the over-fitting problem of supervised manifold learning and the blindness problem of unsupervised manifold learning are overcome. Simultaneously, class discrimination information is integrated within the dimension reduction process in a semi-supervised manner to improve the sensitivity of the extracted fusion features. Lastly, the extracted fusion features are input into a pattern recognition algorithm to achieve the running state identification. The effectiveness of the proposed method is verified by a running state identification case in a gearbox, and the results confirm the improved accuracy of the running state identification.
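
    Plain, unsupervised local tangent space alignment, the method that DSS-LTSA extends, is available in scikit-learn. A minimal sketch of fusing a mixed-domain feature matrix before classification might look like the following; the array shapes and labels are hypothetical, and no semi-supervised or discriminant terms are included.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.svm import SVC

# hypothetical mixed-domain feature matrix: 200 samples x 40 features
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 40))
y = rng.integers(0, 3, 200)          # 3 running states (toy labels)

# LTSA variant of locally linear embedding used for feature fusion/reduction
ltsa = LocallyLinearEmbedding(method="ltsa", n_neighbors=12, n_components=5)
X_low = ltsa.fit_transform(X)

# downstream pattern recognition on the fused low-dimensional features
clf = SVC(kernel="rbf").fit(X_low[:150], y[:150])
print("toy hold-out accuracy:", clf.score(X_low[150:], y[150:]))
```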

  18. DistMap: a toolkit for distributed short read mapping on a Hadoop cluster.

    PubMed

    Pandey, Ram Vinay; Schlötterer, Christian

    2013-01-01

    With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/

  19. Analysis issues due to mapped conditions changing over time

    Treesearch

    Paul. Van Deusen

    2015-01-01

    Plot mapping is one of the innovations that were implemented when FIA moved to the annual forest inventory system. Mapped plots can improve the precision of estimates if the mapped conditions are carefully chosen and used judiciously. However, after plots are remeasured multiple times, it can be difficult to properly track changes in conditions and incorporate this...

  20. FMAP: Functional Mapping and Analysis Pipeline for metagenomics and metatranscriptomics studies.

    PubMed

    Kim, Jiwoong; Kim, Min Soo; Koh, Andrew Y; Xie, Yang; Zhan, Xiaowei

    2016-10-10

    Given the lack of a complete and comprehensive library of microbial reference genomes, determining the functional profile of diverse microbial communities is challenging. The available functional analysis pipelines lack several key features: (i) an integrated alignment tool, (ii) operon-level analysis, and (iii) the ability to process large datasets. Here we introduce our open-sourced, stand-alone functional analysis pipeline for analyzing whole metagenomic and metatranscriptomic sequencing data, FMAP (Functional Mapping and Analysis Pipeline). FMAP performs alignment, gene family abundance calculations, and statistical analysis (three levels of analyses are provided: differentially-abundant genes, operons and pathways). The resulting output can be easily visualized with heatmaps and functional pathway diagrams. FMAP functional predictions are consistent with currently available functional analysis pipelines. FMAP is a comprehensive tool for providing functional analysis of metagenomic/metatranscriptomic sequencing data. With the added features of integrated alignment, operon-level analysis, and the ability to process large datasets, FMAP will be a valuable addition to the currently available functional analysis toolbox. We believe that this software will be of great value to the wider biology and bioinformatics communities.

  1. Microsatellite-centromere mapping in Japanese scallop (Patinopecten yessoensis) through half-tetrad analysis in gynogenetic diploid families

    NASA Astrophysics Data System (ADS)

    Li, Qi; Qi, Mingjun; Nie, Hongtao; Kong, Lingfeng; Yu, Hong

    2016-06-01

    Gene-centromere mapping is an essential prerequisite for understanding the composition and structure of genomes. Half-tetrad analysis is a powerful tool for mapping genes and understanding chromosomal behavior during meiosis. The Japanese scallop (Patinopecten yessoensis), a cold-tolerant species inhabiting the northwestern Pacific coast, is a commercially important marine bivalve in Asian countries. In this study, inheritance of 32 informative microsatellite loci was examined in 70-h D-shaped larvae of three induced meiogynogenetic diploid families of P. yessoensis for centromere mapping using half-tetrad analysis. The proportion of gynogenetic diploids was 100%, 100% and 96% in the three families, respectively. Inheritance analysis in the control crosses showed that 51 of the 53 genotypic ratios observed were in accordance with Mendelian expectations at the 5% level after Bonferroni correction. Seven of the 32 microsatellite loci showed the existence of null alleles in control crosses. The second division segregation frequency (y) of the microsatellite loci ranged from 0.07 to 0.85 with a mean of 0.38, suggesting the existence of positive interference after a single chiasma formation in some chromosomes in the scallop. Microsatellite-centromere distances ranged from 4 cM to 42 cM under the assumption of complete interference. Information on the positions of centromeres in relation to the microsatellite loci will contribute towards the assembly of genetic maps for this commercially important scallop species.
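
    Under the complete-interference assumption quoted above, the conversion from second-division segregation frequency to gene-centromere distance follows the standard half-tetrad relationship, stated here for reference:

```latex
% Gene-centromere distance under complete interference: half of the
% second-division segregation frequency y, expressed in centimorgans.
\[
  d_{\mathrm{gc}} = \frac{y}{2} \times 100\ \mathrm{cM}, \qquad 0 \le y \le 1 .
\]
% The extremes reported above (y = 0.07 and y = 0.85) give roughly 4 cM
% and 42 cM, consistent with the quoted distance range.
```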

  2. Visualized analysis of mixed numeric and categorical data via extended self-organizing map.

    PubMed

    Hsu, Chung-Chian; Lin, Shu-Han

    2012-01-01

    Many real-world datasets are of mixed types, having numeric and categorical attributes. Although difficult, analyzing mixed-type datasets is important. In this paper, we propose an extended self-organizing map (SOM), called MixSOM, which utilizes a distance hierarchy data structure to facilitate the handling of numeric and categorical values in a direct, unified manner. Moreover, the extended model regularizes the prototype distance between neighboring neurons in proportion to their map distance so that structures of the clusters can be portrayed better on the map. Extensive experiments on several synthetic and real-world datasets are conducted to demonstrate the capability of the model and to compare MixSOM with several existing models including Kohonen's SOM, the generalized SOM and visualization-induced SOM. The results show that MixSOM is superior to the other models in reflecting the structure of the mixed-type data and facilitates further analysis of the data such as exploration at various levels of granularity.
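
    For orientation, the simple Gower-style mixed distance that a distance hierarchy generalizes can be sketched as follows; the records and attribute ranges are toy values, and this is not the hierarchy structure used by MixSOM.

```python
def mixed_distance(a, b, numeric_idx, ranges):
    """Gower-style distance for mixed records: range-scaled absolute difference
    for numeric attributes, simple 0/1 mismatch for categorical ones."""
    total = 0.0
    for i, (x, y) in enumerate(zip(a, b)):
        if i in numeric_idx:
            total += abs(x - y) / ranges[i]
        else:
            total += 0.0 if x == y else 1.0
    return total / len(a)

# toy mixed records: (age, income, colour, brand)
records = [(25, 40_000, "red", "A"), (31, 52_000, "red", "B"), (58, 41_000, "blue", "A")]
numeric_idx = {0, 1}
ranges = {0: 58 - 25, 1: 52_000 - 40_000}     # observed attribute ranges

print(mixed_distance(records[0], records[1], numeric_idx, ranges))
print(mixed_distance(records[0], records[2], numeric_idx, ranges))
```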

  3. A Framework for the Validation of Probabilistic Seismic Hazard Analysis Maps Using Strong Ground Motion Data

    NASA Astrophysics Data System (ADS)

    Bydlon, S. A.; Beroza, G. C.

    2015-12-01

    Recent debate on the efficacy of Probabilistic Seismic Hazard Analysis (PSHA), and the utility of hazard maps (i.e. Stein et al., 2011; Hanks et al., 2012), has prompted a need for validation of such maps using recorded strong ground motion data. Unfortunately, strong motion records are limited spatially and temporally relative to the area and time windows hazard maps encompass. We develop a framework to test the predictive powers of PSHA maps that is flexible with respect to a map's specified probability of exceedance and time window, and the strong motion receiver coverage. Using a combination of recorded and interpolated strong motion records produced through the ShakeMap environment, we compile a record of ground motion intensity measures for California from 2002-present. We use this information to perform an area-based test of California PSHA maps inspired by the work of Ward (1995). Though this framework is flexible in that it can be applied to seismically active areas where ShakeMap-like ground shaking interpolations have or can be produced, this testing procedure is limited by the relatively short lifetime of strong motion recordings and by the desire to only test with data collected after the development of the PSHA map under scrutiny. To account for this, we use the assumption that PSHA maps are time independent to adapt the testing procedure for periods of recorded data shorter than the lifetime of a map. We note that accuracy of this testing procedure will only improve as more data is collected, or as the time-horizon of interest is reduced, as has been proposed for maps of areas experiencing induced seismicity. We believe that this procedure can be used to determine whether PSHA maps are accurately portraying seismic hazard and whether discrepancies are localized or systemic.

  4. The Mapping X-ray Fluorescence Spectrometer (MapX)

    NASA Astrophysics Data System (ADS)

    Sarrazin, P.; Blake, D. F.; Marchis, F.; Bristow, T.; Thompson, K.

    2017-12-01

    Many planetary surface processes leave traces of their actions as features in the size range 10s to 100s of microns. The Mapping X-ray Fluorescence Spectrometer (MapX) will provide elemental imaging at 100 micron spatial resolution, yielding elemental chemistry at a scale where many relict physical, chemical, or biological features can be imaged and interpreted in ancient rocks on planetary bodies and planetesimals. MapX is an arm-based instrument positioned on a rock or regolith with touch sensors. During an analysis, an X-ray source (tube or radioisotope) bombards the sample with X-rays or alpha-particles / gamma-rays, resulting in sample X-ray Fluorescence (XRF). X-rays emitted in the direction of an X-ray sensitive CCD imager pass through a 1:1 focusing lens (X-ray micro-pore Optic (MPO)) that projects a spatially resolved image of the X-rays onto the CCD. The CCD is operated in single photon counting mode so that the energies and positions of individual X-ray photons are recorded. In a single analysis, several thousand frames are both stored and processed in real-time. Higher level data products include single-element maps with a lateral spatial resolution of 100 microns and quantitative XRF spectra from ground- or instrument- selected Regions of Interest (ROI). XRF spectra from ROI are compared with known rock and mineral compositions to extrapolate the data to rock types and putative mineralogies. When applied to airless bodies and implemented with an appropriate radioisotope source for alpha-particle excitation, MapX will be able to analyze biogenic elements C, N, O, P, S, in addition to the cations of the rock-forming elements >Na, accessible with either X-ray or gamma-ray excitation. The MapX concept has been demonstrated with a series of lab-based prototypes and is currently under refinement and TRL maturation.

  5. HiC-spector: a matrix library for spectral and reproducibility analysis of Hi-C contact maps.

    PubMed

    Yan, Koon-Kiu; Yardimci, Galip Gürkan; Yan, Chengfei; Noble, William S; Gerstein, Mark

    2017-07-15

    Genome-wide proximity ligation based assays like Hi-C have opened a window to the 3D organization of the genome. In so doing, they present data structures that are different from conventional 1D signal tracks. To exploit the 2D nature of Hi-C contact maps, matrix techniques like spectral analysis are particularly useful. Here, we present HiC-spector, a collection of matrix-related functions for analyzing Hi-C contact maps. In particular, we introduce a novel reproducibility metric for quantifying the similarity between contact maps based on spectral decomposition. The metric successfully separates contact maps derived from Hi-C data coming from biological replicates, pseudo-replicates and different cell types. Source code in Julia and Python, and detailed documentation, are available at https://github.com/gersteinlab/HiC-spector . koonkiu.yan@gmail.com or mark@gersteinlab.org. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
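
    The spectral idea behind such a reproducibility metric can be illustrated by comparing the leading eigenvectors of the normalized Laplacians of two contact matrices. The sketch below is a simplified illustration of that idea, not the published HiC-spector metric, and uses random symmetric matrices in place of real Hi-C maps.

```python
import numpy as np

def leading_laplacian_eigvecs(contact_map, r=10):
    """Leading eigenvectors of the symmetric normalized Laplacian of a contact map."""
    A = np.asarray(contact_map, float)
    d = A.sum(axis=1)
    d[d == 0] = 1.0                                   # guard against empty bins
    D = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(A)) - D @ A @ D                    # normalized Laplacian
    _, vecs = np.linalg.eigh(L)                       # eigenvalues in ascending order
    return vecs[:, :r]

def spectral_similarity(map1, map2, r=10):
    """Toy score in [0, 1]: closeness of corresponding leading eigenvectors,
    with the arbitrary eigenvector signs aligned before comparison."""
    V1, V2 = leading_laplacian_eigvecs(map1, r), leading_laplacian_eigvecs(map2, r)
    dists = [min(np.linalg.norm(V1[:, i] - V2[:, i]),
                 np.linalg.norm(V1[:, i] + V2[:, i])) for i in range(r)]
    return 1.0 - np.mean(dists) / np.sqrt(2)

rng = np.random.default_rng(2)
m = rng.random((50, 50)); base  = m + m.T                   # symmetric toy "contact map"
n = rng.random((50, 50)); noisy = base + 0.05 * (n + n.T)   # pseudo-replicate
u = rng.random((50, 50)); other = u + u.T                   # unrelated map
print(f"pseudo-replicate similarity: {spectral_similarity(base, noisy):.3f}")
print(f"unrelated-map similarity  : {spectral_similarity(base, other):.3f}")
```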

  6. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    PubMed Central

    2011-01-01

    Background Several tools have been developed to perform global gene expression profile data analysis, to search for specific chromosomal regions whose features meet defined criteria as well as to study neighbouring gene expression. However, most of these tools are tailored for a specific use in a particular context (e.g. they are species-specific, or limited to a particular data format) and they typically accept only gene lists as input. Results TRAM (Transcriptome Mapper) is a new general tool that allows the simple generation and analysis of quantitative transcriptome maps, starting from any source listing gene expression values for a given gene set (e.g. expression microarrays), implemented as a relational database. It includes a parser able to assign univocal and updated gene symbols to gene identifiers from different data sources. Moreover, TRAM is able to perform intra-sample and inter-sample data normalization, including an original variant of quantile normalization (scaled quantile), useful to normalize data from platforms with highly different numbers of investigated genes. When in 'Map' mode, the software generates a quantitative representation of the transcriptome of a sample (or of a pool of samples) and identifies if segments of defined lengths are over/under-expressed compared to the desired threshold. When in 'Cluster' mode, the software searches for a set of over/under-expressed consecutive genes. Statistical significance for all results is calculated with respect to genes localized on the same chromosome or to all genome genes. Transcriptome maps, showing differential expression between two sample groups, relative to two different biological conditions, may be easily generated. We present the results of a biological model test, based on a meta-analysis comparison between a sample pool of human CD34+ hematopoietic progenitor cells and a sample pool of megakaryocytic cells. Biologically relevant chromosomal segments and gene clusters with

  7. Computer-composite mapping for geologists

    USGS Publications Warehouse

    van Driel, J.N.

    1980-01-01

    A computer program for overlaying maps has been tested and evaluated as a means for producing geologic derivative maps. Four maps of the Sugar House Quadrangle, Utah, were combined, using the Multi-Scale Data Analysis and Mapping Program, in a single composite map that shows the relative stability of the land surface during earthquakes. Computer-composite mapping can provide geologists with a powerful analytical tool and a flexible graphic display technique. Digitized map units can be shown singly, grouped with different units from the same map, or combined with units from other source maps to produce composite maps. The mapping program permits the user to assign various values to the map units and to specify symbology for the final map. Because of its flexible storage, easy manipulation, and capabilities of graphic output, the composite-mapping technique can readily be applied to mapping projects in sedimentary and crystalline terranes, as well as to maps showing mineral resource potential. © 1980 Springer-Verlag New York Inc.

  8. The Salient Map Analysis for Research and Teaching (SMART) method: Powerful potential as a formative assessment in the biomedical sciences

    NASA Astrophysics Data System (ADS)

    Cathcart, Laura Anne

    This dissertation consists of two studies: 1) development and characterization of the Salient Map Analysis for Research and Teaching (SMART) method as a formative assessment tool and 2) a case study exploring how a paramedic instructor's beliefs about learners affect her utilization of the SMART method and vice versa. The first study explored: How can a novel concept map analysis method be designed as an effective formative assessment tool? The SMART method improves upon existing concept map analysis methods because it does not require hierarchically structured concept maps and it preserves the rich content of the maps instead of reducing each map down to a numerical score. The SMART method is performed by comparing a set of students' maps to each other and to an instructor's map. The resulting composite map depicts, in percentages and highlighted colors, the similarities and differences between all of the maps. Some advantages of the SMART method as a formative assessment tool include its ability to highlight changes across time, problematic or alternative conceptions, and patterns of student responses at a glance. Study two explored: How do a paramedic instructor's beliefs about students and learning affect---and become affected by---her use of the SMART method as a formative assessment tool? This case study of Angel, an expert paramedic instructor, begins to address a gap in the emergency medical services (EMS) education literature, which contains almost no research on teachers or pedagogy. Angel and I worked together as participant co-researchers (Heron & Reason, 1997) exploring the affordances of the SMART method. This study, based on those interactions with Angel, involved using open coding to identify themes (Strauss & Corbin, 1998) from Angel's views of students and use of the SMART method. Angel views learning as a sense-making process. She has a multi-faceted view of her students as novices and invests substantial time trying to understand their concept

  9. A Lithology Based Map Unit Schema For Onegeology Regional Geologic Map Integration

    NASA Astrophysics Data System (ADS)

    Moosdorf, N.; Richard, S. M.

    2012-12-01

    A system of lithogenetic categories for a global lithological map (GLiM, http://www.ifbm.zmaw.de/index.php?id=6460&L=3) has been compiled based on analysis of lithology/genesis categories for regional geologic maps for the entire globe. The scheme is presented for discussion and comment. Analysis of units on a variety of regional geologic maps indicates that units are defined based on assemblages of rock types, as well as their genetic type. In this compilation of continental geology, outcropping surface materials are dominantly sediment/sedimentary rock; major subdivisions of the sedimentary category include clastic sediment, carbonate sedimentary rocks, clastic sedimentary rocks, mixed carbonate and clastic sedimentary rock, colluvium and residuum. Significant areas of mixed igneous and metamorphic rock are also present. A system of global categories to characterize the lithology of regional geologic units is important for Earth System models of matter fluxes to soils, ecosystems, rivers and oceans, and for regional analysis of Earth surface processes at global scale. Because different applications of the classification scheme will focus on different lithologic constituents in mixed units, an ontology-type representation of the scheme that assigns properties to the units in an analyzable manner will be pursued. The OneGeology project is promoting deployment of geologic map services at million scale for all nations. Although initial efforts are commonly simple scanned map WMS services, the intention is to move towards data-based map services that categorize map units with standard vocabularies to allow use of a common map legend for better visual integration of the maps (e.g. see OneGeology Europe, http://onegeology-europe.brgm.fr/geoportal/viewer.jsp). Current categorization of regional units with a single lithology from the CGI SimpleLithology (http://resource.geosciml.org/201202/Vocab2012html/SimpleLithology201012.html) vocabulary poorly captures the

  10. Planetary Geologic Mapping Python Toolbox: A Suite of Tools to Support Mapping Workflows

    NASA Astrophysics Data System (ADS)

    Hunter, M. A.; Skinner, J. A.; Hare, T. M.; Fortezzo, C. M.

    2017-06-01

    The collective focus of the Planetary Geologic Mapping Python Toolbox is to provide researchers with additional means to migrate legacy GIS data, assess the quality of data and analysis results, and simplify common mapping tasks.

  11. Molecular mapping and candidate gene analysis for resistance to powdery mildew in Cucumis sativus stem.

    PubMed

    Liu, P N; Miao, H; Lu, H W; Cui, J Y; Tian, G L; Wehner, T C; Gu, X F; Zhang, S P

    2017-08-31

    Powdery mildew (PM) of cucumber (Cucumis sativus), caused by Podosphaera xanthii, is a major foliar disease worldwide and resistance is one of the main objectives in cucumber breeding programs. The resistance to PM in the cucumber stem is important to the resistance of the whole plant. In this study, genetic analysis and gene mapping were implemented with cucumber inbred lines NCG-122 (with resistance to PM in the stem) and NCG-121 (with susceptibility in the stem). Genetic analysis showed that resistance to PM in the stem of NCG-122 was qualitative and controlled by a single recessive nuclear gene (pm-s). Susceptibility was dominant to resistance. In the initial genetic mapping of the pm-s gene, 10 SSR markers were found to be linked to pm-s, which was mapped to chromosome 5 (Chr.5) of cucumber. The pm-s gene's closest flanking markers were SSR20486 and SSR06184/SSR13237, with genetic distances of 0.9 and 1.8 cM, respectively. One hundred and fifty-seven pairs of new SSR primers were developed using the sequence information in the initial mapping region of pm-s. The analysis of the F2 mapping population using the new molecular markers showed that 17 SSR markers were confirmed to be linked to the pm-s gene. The two closest flanking markers, pmSSR27 and pmSSR17, were 0.1 and 0.7 cM from pm-s, respectively, confirming the location of this gene on Chr.5. The physical length of the genomic region containing pm-s was 135.7 kb, harboring 21 predicted genes. Among these genes, the gene Csa5G623470, annotated as encoding an Mlo-related protein, was defined as the most probable candidate gene for pm-s. The results of this study will provide a basis for marker-assisted selection and facilitate the cloning of the resistance gene.

  12. Urban Land Cover Mapping Accuracy Assessment - A Cost-benefit Analysis Approach

    NASA Astrophysics Data System (ADS)

    Xiao, T.

    2012-12-01

    One of the most important components in urban land cover mapping is mapping accuracy assessment. Many statistical models have been developed to help design sampling schemes based on both accuracy and confidence levels. It is intuitive that an increased number of samples increases the accuracy as well as the cost of an assessment. Understanding cost and sample size is therefore crucial to implementing efficient and effective field data collection. Few studies have included a cost calculation component as part of the assessment. In this study, a cost-benefit sampling analysis model was created by combining sample size design and sampling cost calculation. The sampling cost included transportation cost, field data collection cost, and laboratory data analysis cost. Simple Random Sampling (SRS) and Modified Systematic Sampling (MSS) methods were used to design sample locations and to extract land cover data in ArcGIS. High resolution land cover data layers of Denver, CO and Sacramento, CA, street networks, and parcel GIS data layers were used in this study to test and verify the model. The relationship between cost and accuracy was used to determine the effectiveness of each sampling method. The results of this study can be applied to other environmental studies that require spatial sampling.
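
    The accuracy-versus-cost trade-off described here can be sketched with the standard binomial sample-size formula combined with a linear cost model; the unit costs below are hypothetical placeholders rather than values from the study.

```python
import math

def sample_size(p_expected=0.85, half_width=0.05, z=1.96):
    """Binomial sample size for estimating overall accuracy with a given CI half-width."""
    return math.ceil(z**2 * p_expected * (1 - p_expected) / half_width**2)

def total_cost(n, cost_transport=12.0, cost_field=8.0, cost_lab=5.0):
    """Linear sampling-cost model: per-sample transport + field collection + lab analysis."""
    return n * (cost_transport + cost_field + cost_lab)

for d in (0.10, 0.05, 0.025):                 # progressively tighter confidence intervals
    n = sample_size(half_width=d)
    print(f"half-width {d:.3f}: n = {n:4d}, cost = ${total_cost(n):,.0f}")
# The loop shows how accuracy gains come at rapidly increasing sampling cost.
```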

  13. DistMap: A Toolkit for Distributed Short Read Mapping on a Hadoop Cluster

    PubMed Central

    Pandey, Ram Vinay; Schlötterer, Christian

    2013-01-01

    With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/ PMID:24009693

  14. Cartographic standards to improve maps produced by the Forest Inventory and Analysis program

    Treesearch

    Charles H. (Hobie) Perry; Mark D. Nelson

    2009-01-01

    The Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis (FIA) program is incorporating an increasing number of cartographic products in reports, publications, and presentations. To create greater quality and consistency within the national FIA program, a Geospatial Standards team developed cartographic design standards for FIA map...

  15. Genome contact map explorer: a platform for the comparison, interactive visualization and analysis of genome contact maps

    PubMed Central

    Kumar, Rajendra; Sobhy, Haitham

    2017-01-01

    Abstract Hi-C experiments generate data in the form of large genome contact maps (Hi-C maps). These show that chromosomes are arranged in a hierarchy of three-dimensional compartments. But to understand how these compartments form and by how much they affect genetic processes such as gene regulation, biologists and bioinformaticians need efficient tools to visualize and analyze Hi-C data. However, this is technically challenging because these maps are big. In this paper, we remedy this problem, partly by implementing an efficient file format, and develop the genome contact map explorer platform. Apart from tools to process Hi-C data, such as normalization methods and a programmable interface, we made a graphical interface that lets users browse, scroll and zoom Hi-C maps to visually search for patterns in the Hi-C data. In the software, it is also possible to browse several maps simultaneously and plot related genomic data. The software is openly accessible to the scientific community. PMID:28973466

  16. International Maps | Geospatial Data Science | NREL

    Science.gov Websites

    International Maps International Maps This map collection provides examples of how geographic information system modeling is used in international resource analysis. The images below are samples of

  17. Performance analysis of different database in new internet mapping system

    NASA Astrophysics Data System (ADS)

    Yao, Xing; Su, Wei; Gao, Shuai

    2017-03-01

    In the Mapping System of the New Internet, massive numbers of mapping entries between AIDs and RIDs need to be stored, added, updated, and deleted. To better handle large volumes of mapping entry update and query requests, the Mapping System of the New Internet must use a high-performance database. In this paper, we focus on the performance of three typical databases, Redis, SQLite, and MySQL, and the results show that Mapping Systems based on different databases can meet different needs depending on the deployment scenario.
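
    One of the three databases can be benchmarked with a small timing harness such as the sketch below, which uses Python's built-in sqlite3 module and hypothetical AID-to-RID entries; the Redis and MySQL clients would be exercised through an equivalent insert/lookup loop.

```python
import sqlite3, time, random, string

def random_id(n=16):
    return "".join(random.choices(string.hexdigits.lower(), k=n))

# hypothetical AID -> RID mapping entries
entries = [(random_id(), random_id()) for _ in range(50_000)]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE mapping (aid TEXT PRIMARY KEY, rid TEXT)")

t0 = time.perf_counter()
con.executemany("INSERT INTO mapping VALUES (?, ?)", entries)
con.commit()
t1 = time.perf_counter()

lookups = random.sample(entries, 5_000)
for aid, _ in lookups:
    con.execute("SELECT rid FROM mapping WHERE aid = ?", (aid,)).fetchone()
t2 = time.perf_counter()

print(f"insert: {len(entries) / (t1 - t0):,.0f} entries/s")
print(f"query : {len(lookups) / (t2 - t1):,.0f} lookups/s")
```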

  18. A triangular thin shell finite element: Nonlinear analysis. [structural analysis

    NASA Technical Reports Server (NTRS)

    Thomas, G. R.; Gallagher, R. H.

    1975-01-01

    Aspects of the formulation of a triangular thin shell finite element which pertain to geometrically nonlinear (small strain, finite displacement) behavior are described. The procedure for solution of the resulting nonlinear algebraic equations combines a one-step incremental (tangent stiffness) approach with one iteration in the Newton-Raphson mode. A method is presented which permits a rational estimation of step size in this procedure. Limit points are calculated by means of a superposition scheme coupled to the incremental side of the solution procedure while bifurcation points are calculated through a process of interpolation of the determinants of the tangent-stiffness matrix. Numerical results are obtained for a flat plate and two curved shell problems and are compared with alternative solutions.
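
    The incremental tangent-stiffness procedure with a single Newton-Raphson correction per load step can be illustrated on a one-degree-of-freedom nonlinear spring. The softening material law below is a hypothetical stand-in, not the shell element formulation of the paper.

```python
# Load-incremental solution of r(u) = f for a 1-DOF nonlinear "spring"
# with one Newton-Raphson correction per increment, mirroring the
# tangent-stiffness strategy described above (toy material law assumed).

def internal_force(u):          # softening spring: r(u) = k0*u - c*u**3
    return 100.0 * u - 40.0 * u**3

def tangent_stiffness(u):       # dr/du, the 1-DOF "tangent stiffness matrix"
    return 100.0 - 120.0 * u**2

u, f = 0.0, 0.0
for step in range(10):
    df = 5.0                                   # load increment
    f += df
    u += df / tangent_stiffness(u)             # incremental (tangent) predictor
    residual = f - internal_force(u)
    u += residual / tangent_stiffness(u)       # one Newton-Raphson correction
    print(f"step {step + 1:2d}: load={f:5.1f}  u={u:.4f}  "
          f"K_t={tangent_stiffness(u):7.2f}")
# A vanishing or negative tangent stiffness would signal an approaching limit
# or bifurcation point, which the paper detects via determinant interpolation
# and a superposition scheme.
```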

  19. Use of density equalizing map projections (DEMP) in the analysis of childhood cancer in four California counties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merrill, D.W.; Selvin, S.; Close, E.R.

    In studying geographic disease distributions, one normally compares rates of arbitrarily defined geographic subareas (e.g. census tracts), thereby sacrificing the geographic detail of the original data. The sparser the data, the larger the subareas must be in order to calculate stable rates. This dilemma is avoided with the technique of Density Equalizing Map Projections (DEMP). Boundaries of geographic subregions are adjusted to equalize population density over the entire study area. Case locations plotted on the transformed map should have a uniform distribution if the underlying disease-rates are constant. On the transformed map, the statistical analysis of the observed distribution is greatly simplified. Even for sparse distributions, the statistical significance of a supposed disease cluster can be reliably calculated. The present report describes the first successful application of the DEMP technique to a sizeable "real-world" data set of epidemiologic interest. An improved DEMP algorithm [GUSE93, CLOS94] was applied to a data set previously analyzed with conventional techniques [SATA90, REYN91]. The results from the DEMP analysis and a conventional analysis are compared.

  20. The Mathematical Event: Mapping the Axiomatic and the Problematic in School Mathematics

    ERIC Educational Resources Information Center

    de Freitas, Elizabeth

    2013-01-01

    Traditional philosophy of mathematics has been concerned with the nature of mathematical objects rather than events. This traditional focus on reified objects is reflected in dominant theories of learning mathematics whereby the learner is meant to acquire familiarity with ideal mathematical objects, such as number, polygon, or tangent. I argue…

  1. Venus mapping

    NASA Technical Reports Server (NTRS)

    Batson, R. M.; Morgan, H. F.; Sucharski, Robert

    1991-01-01

    Semicontrolled image mosaics of Venus, based on Magellan data, are being compiled at 1:50,000,000, 1:10,000,000, 1:5,000,000, and 1:1,000,000 scales to support the Magellan Radar Investigator (RADIG) team. The mosaics are semicontrolled in the sense that data gaps were not filled and significant cosmetic inconsistencies exist. Contours are based on preliminary radar altimetry data that is subjected to revision and improvement. Final maps to support geologic mapping and other scientific investigations, to be compiled as the dataset becomes complete, will be sponsored by the Planetary Geology and Geophysics Program and/or the Venus Data Analysis Program. All maps, both semicontrolled and final, will be published as I-maps by the United States Geological Survey. All of the mapping is based on existing knowledge of the spacecraft orbit; photogrammetric triangulation, a traditional basis for geodetic control on planets where framing cameras were used, is not feasible with the radar images of Venus, although an eventual shift of coordinate system to a revised spin-axis location is anticipated. This is expected to be small enough that it will affect only large-scale maps.

  2. Statistical Significance of Optical Map Alignments

    PubMed Central

    Sarkar, Deepayan; Goldstein, Steve; Schwartz, David C.

    2012-01-01

    Abstract The Optical Mapping System constructs ordered restriction maps spanning entire genomes through the assembly and analysis of large datasets comprising individually analyzed genomic DNA molecules. Such restriction maps uniquely reveal mammalian genome structure and variation, but also raise computational and statistical questions beyond those that have been solved in the analysis of smaller, microbial genomes. We address the problem of how to filter maps that align poorly to a reference genome. We obtain map-specific thresholds that control errors and improve iterative assembly. We also show how an optimal self-alignment score provides an accurate approximation to the probability of alignment, which is useful in applications seeking to identify structural genomic abnormalities. PMID:22506568

  3. Back analysis of Swiss flood danger map to define local flood hazards

    NASA Astrophysics Data System (ADS)

    Choffet, Marc; Derron, Marc-Henri; Jaboyedoff, Michel; Leroi, Eric; Mayis, Arnaud

    2010-05-01

    Flood hazard maps for all of Switzerland will be available at the end of 2011. Furthermore, the Swiss territory has been covered by aerial laser scanning (ALS), providing a high resolution digital elevation model (HR-DEM). This paper describes the development of a method for analyzing the local flood hazard based on Swiss hazard maps and the HR-DEM. In their original state, Swiss hazard maps are constructed on the basis of an aggregation of information in a matrix of intensity and frequency. The degree of danger represented by the yellow, blue and red zones gives no information on the water level at each point of the territory. The developed method is based on a superposition of the danger map with the HR-DEM to determine the water level in a hazard area. To perform this method, (1) a triangulation is built on the intersection of the hazard map with the HR-DEM, using the limits of the areas where information is constrained. The hazard map perimeter and the boundaries of hazard areas give information on the widest possible overflow in case of flooding. It is also possible to associate it with a return period. (2) Based on these areas and the difference from the DEM, it is possible to calibrate the highest flood level and extract water levels for the entire area. This analysis of existing documents opens up interesting perspectives for understanding how infrastructures are threatened by flood hazard, by predicting water levels and potential damages to buildings while proposing remedial measures. Indeed, this method allows estimating the water level at each point of a building in case of flooding. It is designed to provide spatial information on water height levels; this offers a different approach to buildings in danger zones. Indeed, it is possible to discern several elements, such as areas of water accumulation involving longer flood duration, possible structural damages to buildings due to high hydrostatic pressure, determination of a local hazard, or the display of water

  4. CAFÉ-Map: Context Aware Feature Mapping for mining high dimensional biomedical data.

    PubMed

    Minhas, Fayyaz Ul Amir Afsar; Asif, Amina; Arif, Muhammad

    2016-12-01

    Feature selection and ranking is of great importance in the analysis of biomedical data. In addition to reducing the number of features used in classification or other machine learning tasks, it allows us to extract meaningful biological and medical information from a machine learning model. Most existing approaches in this domain do not directly model the fact that the relative importance of features can be different in different regions of the feature space. In this work, we present a context aware feature ranking algorithm called CAFÉ-Map. CAFÉ-Map is a locally linear feature ranking framework that allows recognition of important features in any given region of the feature space or for any individual example. This allows for simultaneous classification and feature ranking in an interpretable manner. We have benchmarked CAFÉ-Map on a number of toy and real world biomedical data sets. Our comparative study with a number of published methods shows that CAFÉ-Map achieves better accuracies on these data sets. The top ranking features obtained through CAFÉ-Map in a gene profiling study correlate very well with the importance of different genes reported in the literature. Furthermore, CAFÉ-Map provides a more in-depth analysis of feature ranking at the level of individual examples. CAFÉ-Map Python code is available at: http://faculty.pieas.edu.pk/fayyaz/software.html#cafemap . The CAFÉ-Map package supports parallelization and sparse data and provides example scripts for classification. This code can be used to reconstruct the results given in this paper. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Mapping Infected Area after a Flash-Flooding Storm Using Multi Criteria Analysis and Spectral Indices

    NASA Astrophysics Data System (ADS)

    Al-Akad, S.; Akensous, Y.; Hakdaoui, M.

    2017-11-01

    This research article summarizes the application of remote sensing and GIS to the study of urban flood risk in Al Mukalla. Satellite acquisition of a flood event in October 2015 in Al Mukalla (Yemen), analyzed with flood risk mapping techniques, illustrates the potential risk present in this city. Satellite images (the Landsat and DEM data were atmospherically corrected, radiometrically corrected, and rectified for geometric and topographic distortions) are used for flood risk mapping to produce a hazard (vulnerability) map. This map is obtained by applying image-processing techniques within a geographic information system (GIS) environment, together with the NDVI and NDWI indices and a method to estimate the flood-hazard areas. The following factors were considered in order to estimate the spatial distribution of the hazardous areas: flow accumulation, slope, land use, geology and elevation. The multi-criteria analysis makes it possible to address vulnerability to flooding and to map areas at risk of flooding in the city of Al Mukalla. The main objective of this research is to provide a simple and rapid method to reduce and manage flood risk in Yemen, taking the city of Al Mukalla as an example.
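
    The two spectral indices named above have standard band-ratio definitions; a minimal numpy sketch, with hypothetical reflectance arrays standing in for the corrected Landsat bands, is:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + 1e-9)

def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters): (Green - NIR) / (Green + NIR)."""
    return (green - nir) / (green + nir + 1e-9)

# hypothetical reflectance tiles standing in for atmospherically corrected bands
rng = np.random.default_rng(3)
red, nir, green = (rng.uniform(0.02, 0.6, (100, 100)) for _ in range(3))

water_mask = ndwi(green, nir) > 0.0          # crude open-water / flooded-area mask
sparse_veg = ndvi(nir, red) < 0.2            # low vegetation cover, more runoff-prone
print("flagged pixels:", int(np.sum(water_mask | sparse_veg)))
```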

  6. The Circumpolar Arctic Vegetation Map: A tool for analysis of change in permafrost regions

    NASA Astrophysics Data System (ADS)

    Walker, D. A.; Raynolds, M. K.; Maier, H. A.

    2003-12-01

    Arctic vegetation occurs beyond the northern limit of trees, in areas that have an Arctic climate and Arctic flora. Here we present an overview of the recently published Circumpolar Arctic Vegetation Map (CAVM), an area analysis of the vegetation map, and a discussion of its potential for analysis of change in the Arctic. Six countries have Arctic tundra vegetation: Canada, Greenland, Iceland, Russia, Norway (Svalbard), and the US (total Arctic area = 7.1 million km2). Some treeless areas, such as most of Iceland and the Aleutian Islands, are excluded from the map because they lack an Arctic climate. The CAVM divides the Arctic into five bioclimate subzones, A through E (Subzone A is the coldest and Subzone E is the warmest), based on a combination of summer temperature and vegetation. Fifteen vegetation types are mapped based on the dominant plant growth forms. More detailed, plant-community-level information is contained in the database used to construct the map. The reverse side of the vegetation map has a false-color infrared image constructed from Advanced Very High Resolution Radiometer (AVHRR) satellite-derived raster data, and maps of bioclimate subzones, elevation, landscape types, lake cover, substrate chemistry, floristic provinces, the maximum normalized difference vegetation index (NDVI), and aboveground phytomass. The vegetation map was analyzed by vegetation type and biomass for each country, bioclimate subzone, and floristic province. Biomass distribution was analyzed by means of a correlation between aboveground phytomass and the normalized difference vegetation index (NDVI), a remote-sensing index of surface greenness. Biomass on zonal surfaces roughly doubles within each successively warmer subzone, from about 50 g m-2 in Subzone A to 800 g m-2 in Subzone E. But the pattern of vegetation increase is highly variable, and depends on a number of other factors. The most important appears to be the glacial history of the landscape. Areas that were glaciated during

  7. A complete mass spectrometric map for the analysis of the yeast proteome and its application to quantitative trait analysis

    PubMed Central

    Picotti, Paola; Clement-Ziza, Mathieu; Lam, Henry; Campbell, David S.; Schmidt, Alexander; Deutsch, Eric W.; Röst, Hannes; Sun, Zhi; Rinner, Oliver; Reiter, Lukas; Shen, Qin; Michaelson, Jacob J.; Frei, Andreas; Alberti, Simon; Kusebauch, Ulrike; Wollscheid, Bernd; Moritz, Robert; Beyer, Andreas; Aebersold, Ruedi

    2013-01-01

    Complete reference maps or datasets, like the genomic map of an organism, are highly beneficial tools for biological and biomedical research. Attempts to generate such reference datasets for a proteome so far failed to reach complete proteome coverage, with saturation apparent at approximately two thirds of the proteomes tested, even for the most thoroughly characterized proteomes. Here, we used a strategy based on high-throughput peptide synthesis and mass spectrometry to generate a close to complete reference map (97% of the genome-predicted proteins) of the S. cerevisiae proteome. We generated two versions of this mass spectrometric map, one supporting discovery-driven (shotgun) and the other hypothesis-driven (targeted) proteomic measurements. The two versions of the map, therefore, constitute a complete set of proteomic assays to support most studies performed with contemporary proteomic technologies. The reference libraries can be browsed via a web-based repository and associated navigation tools. To demonstrate the utility of the reference libraries we applied them to a protein quantitative trait locus (pQTL) analysis, which requires measurement of the same peptides over a large number of samples with high precision. Protein measurements over a set of 78 S. cerevisiae strains revealed a complex relationship between independent genetic loci, impacting the levels of related proteins. Our results suggest that selective pressure favors the acquisition of sets of polymorphisms that maintain the stoichiometry of protein complexes and pathways. PMID:23334424

  8. Mapping agroecological zones and time lag in vegetation growth by means of Fourier analysis of time series of NDVI images

    NASA Technical Reports Server (NTRS)

    Menenti, M.; Azzali, S.; Verhoef, W.; Van Swol, R.

    1993-01-01

    Examples are presented of applications of a fast Fourier transform algorithm to analyze time series of images of Normalized Difference Vegetation Index values. The results obtained for a case study on Zambia indicated that differences in vegetation development among map units of an existing agroclimatic map were not significant, while reliable differences were observed among the map units obtained using the Fourier analysis.
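
    The Fourier approach amounts to summarizing each pixel's NDVI time series by the mean, amplitude and phase of a few harmonics; a minimal numpy sketch, assuming a hypothetical single-pixel series of 36 ten-day composites, is:

```python
import numpy as np

# hypothetical NDVI time series: 36 ten-day composites over one year for one pixel
rng = np.random.default_rng(4)
t = np.arange(36)
ndvi_series = (0.35 + 0.25 * np.cos(2 * np.pi * (t - 12) / 36)
               + 0.02 * rng.normal(size=36))

spectrum = np.fft.rfft(ndvi_series)
mean_ndvi = spectrum[0].real / len(t)            # zero-frequency term = annual mean
annual_amp = 2 * np.abs(spectrum[1]) / len(t)    # amplitude of the annual harmonic
annual_phase = np.angle(spectrum[1])             # phase ~ timing of peak greenness

print(f"mean NDVI   : {mean_ndvi:.3f}")
print(f"annual amp. : {annual_amp:.3f}")
print(f"annual phase: {annual_phase:.2f} rad")
# Per-pixel maps of these amplitudes and phases are what distinguish
# agroecological zones and reveal lags in vegetation development.
```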

  9. Geomorphometric analysis of cave ceiling channels mapped with 3-D terrestrial laser scanning

    NASA Astrophysics Data System (ADS)

    Gallay, Michal; Hochmuth, Zdenko; Kaňuk, Ján; Hofierka, Jaroslav

    2016-05-01

    The change of hydrological conditions during the evolution of caves in carbonate rocks often results in a complex subterranean geomorphology, which comprises specific landforms such as ceiling channels, anastomosing half tubes, or speleothems organized vertically in different levels. Studying such complex environments traditionally requires tedious mapping; however, this is being replaced with terrestrial laser scanning technology. Laser scanning overcomes the problem of reaching high ceilings, providing new options to map underground landscapes with unprecedented level of detail and accuracy. The acquired point cloud can be handled conveniently with dedicated software, but applying traditional geomorphometry to analyse the cave surface is limited. This is because geomorphometry has been focused on parameterization and analysis of surficial terrain. The theoretical and methodological concept has been based on two-dimensional (2-D) scalar fields, which are sufficient for most cases of the surficial terrain. The terrain surface is modelled with a bivariate function of altitude (elevation) and represented by a raster digital elevation model. However, the cave is a 3-D entity; therefore, a different approach is required for geomorphometric analysis. In this paper, we demonstrate the benefits of high-resolution cave mapping and 3-D modelling to better understand the palaeohydrography of the Domica cave in Slovakia. This methodological approach adopted traditional geomorphometric methods in a unique manner and also new methods used in 3-D computer graphics, which can be applied to study other 3-D geomorphological forms.

  10. Mapping landslide source and transport areas in VHR images with Object-Based Analysis and Support Vector Machines

    NASA Astrophysics Data System (ADS)

    Heleno, Sandra; Matias, Magda; Pina, Pedro

    2015-04-01

    Visual interpretation of satellite imagery remains extremely demanding in terms of resources and time, especially when dealing with numerous multi-scale landslides affecting wide areas, such as is the case of rainfall-induced shallow landslides. Applying automated methods can contribute to more efficient landslide mapping and updating of existing inventories, and in recent years the number and variety of approaches are rapidly increasing. Very High Resolution (VHR) images, acquired by space-borne sensors with sub-metric precision, such as Ikonos, Quickbird, Geoeye and Worldview, are increasingly being considered as the best option for landslide mapping, but these new levels of spatial detail also present new challenges to state of the art image analysis tools, calling for automated methods specifically suited to map landslide events on VHR optical images. In this work we develop and test a methodology for semi-automatic landslide recognition and mapping of landslide source and transport areas. The method combines object-based image analysis and a Support Vector Machine supervised learning algorithm, and was tested using a GeoEye-1 multispectral image, sensed 3 days after a damaging landslide event in Madeira Island, together with a pre-event LiDAR DEM. Our approach has proved successful in the recognition of landslides on a 15 km2-wide study area, with 81 out of 85 landslides detected in its validation regions. The classifier also showed reasonable performance (true positive rate above 60% and false positive rate below 36% in both validation regions) in the internal mapping of landslide source and transport areas, in particular in the sunnier east-facing slopes. In the less illuminated areas the classifier is still able to accurately map the source areas, but performs poorly in the mapping of landslide transport areas.

  11. Toward a national fuels mapping strategy: Lessons from selected mapping programs

    USGS Publications Warehouse

    Loveland, Thomas R.

    2001-01-01

    The establishment of a robust national fuels mapping program must be based on pertinent lessons from relevant national mapping programs. Many large-area mapping programs are under way in numerous Federal agencies. Each of these programs follows unique strategies to achieve mapping goals and objectives. Implementation approaches range from highly centralized programs that use tightly integrated standards and dedicated staff, to dispersed programs that permit considerable flexibility. One model facilitates national consistency, while the other allows accommodation of locally relevant conditions and issues. An examination of the programmatic strategies of four national vegetation and land cover mapping initiatives can identify the unique approaches, accomplishments, and lessons of each that should be considered in the design of a national fuel mapping program. The first three programs are the U.S. Geological Survey Gap Analysis Program, the U.S. Geological Survey National Land Cover Characterization Program, and the U.S. Fish and Wildlife Survey National Wetlands Inventory. A fourth program, the interagency Multiresolution Land Characterization Program, offers insights in the use of partnerships to accomplish mapping goals. Collectively, the programs provide lessons, guiding principles, and other basic concepts that can be used to design a successful national fuels mapping initiative.

  12. Geopan AT@S: a Brokering Based Gateway to Georeferenced Historical Maps for Risk Analysis

    NASA Astrophysics Data System (ADS)

    Previtali, M.

    2017-08-01

    The importance of ancient and historical maps is nowadays recognized in many applications (e.g., urban planning, landscape valorisation and preservation, land change identification). In recent years a great effort has been made by different institutions, such as geographical institutes, public administrations, and collaborative communities, to digitize and publish online collections of historical maps. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required. In addition, problems of interconnection between different data sources and their restricted interoperability may arise. This paper describes a new brokering-based gateway developed to ensure interoperability between data, in particular georeferenced historical maps and geographic data, gathered from different data providers, with various features and referring to different historical periods. The approach is exemplified by a new application named GeoPAN Atl@s, which is aimed at linking land changes in the Northern Italy area with risk analysis (local seismic amplification and flooding risk) by using multi-temporal data sources and historical maps.

  13. A rat genetic map constructed by representational difference analysis markers with suitability for large-scale typing.

    PubMed Central

    Toyota, M; Canzian, F; Ushijima, T; Hosoya, Y; Kuramoto, T; Serikawa, T; Imai, K; Sugimura, T; Nagao, M

    1996-01-01

    Representational difference analysis (RDA) was applied to isolate chromosomal markers in the rat. Four series of RDA [restriction enzymes, BamHI and HindIII; subtraction of ACI/N (ACI) amplicon from BUF/Nac (BUF) amplicon and vice versa] yielded 131 polymorphic markers; 125 of these markers were mapped to all chromosomes except for chromosome X. This was done by using a mapping panel of 105 ACI x BUF F2 rats. To complement the relative paucity of chromosomal markers in the rat, genetically directed RDA, which allows isolation of polymorphic markers in the specific chromosomal region, was performed. By changing the F2 driver-DNA allele frequency around the region, four markers were isolated from the D1Ncc1 locus. Twenty-five of 27 RDA markers were informative regarding the dot blot analysis of amplicons, hybridizing only with tester amplicons. Dot blot analysis at a high density per unit of area made it possible to process a large number of samples. Quantitative trait loci can now be mapped in the rat genome by processing a large number of samples with RDA markers and then by isolating markers close to the loci of interest by genetically directed RDA. PMID:8632989

  14. Coordinated Optimization of Visual Cortical Maps (I) Symmetry-based Analysis

    PubMed Central

    Reichl, Lars; Heide, Dominik; Löwel, Siegrid; Crowley, Justin C.; Kaschube, Matthias; Wolf, Fred

    2012-01-01

    In the primary visual cortex of primates and carnivores, functional architecture can be characterized by maps of various stimulus features such as orientation preference (OP), ocular dominance (OD), and spatial frequency. It is a long-standing question in theoretical neuroscience whether the observed maps should be interpreted as optima of a specific energy functional that summarizes the design principles of cortical functional architecture. A rigorous evaluation of this optimization hypothesis is particularly demanded by recent evidence that the functional architecture of orientation columns precisely follows species invariant quantitative laws. Because it would be desirable to infer the form of such an optimization principle from the biological data, the optimization approach to explain cortical functional architecture raises the following questions: i) What are the genuine ground states of candidate energy functionals and how can they be calculated with precision and rigor? ii) How do differences in candidate optimization principles impact on the predicted map structure and conversely what can be learned about a hypothetical underlying optimization principle from observations on map structure? iii) Is there a way to analyze the coordinated organization of cortical maps predicted by optimization principles in general? To answer these questions we developed a general dynamical systems approach to the combined optimization of visual cortical maps of OP and another scalar feature such as OD or spatial frequency preference. From basic symmetry assumptions we obtain a comprehensive phenomenological classification of possible inter-map coupling energies and examine representative examples. We show that each individual coupling energy leads to a different class of OP solutions with different correlations among the maps such that inferences about the optimization principle from map layout appear viable. We systematically assess whether quantitative laws resembling

  15. Effect of Map-vaccination in ewes on body condition score, weight and Map-shedding.

    PubMed

    Hüttner, Klim; Krämer, Ulla; Kleist, Petra

    2012-01-01

    Vaccination against Mycobacterium avium subspecies paratuberculosis (Map) in sheep receives growing attention worldwide, particularly in countries with national Map control strategies. A field study was conducted, investigating the effect of GUDAIR on body condition, weight and Map-shedding in a professionally managed but largely Map-affected Suffolk flock prior to and after vaccination. For this, 80 ewes out of 1000 animals were randomly sampled. In the univariate analysis, body condition scores of ewes twelve months after vaccination improved significantly compared to those sampled prior to vaccination. At the same time the rate of ewes shedding Map was reduced by 37%.

  16. Mapping the geogenic radon potential: methodology and spatial analysis for central Hungary.

    PubMed

    Szabó, Katalin Zsuzsanna; Jordan, Gyozo; Horváth, Ákos; Szabó, Csaba

    2014-03-01

    A detailed geogenic radon potential (GRP) mapping based on field soil gas radon and soil gas permeability measurements was carried out in this study. A conventional continuous variable approach was used for GRP determination and to test its applicability to the selected area of Hungary. The spatial patterns of soil gas radon concentration, soil permeability and GRP, and the relationship between geological formations and these parameters, were studied by performing detailed spatial analysis. Exploratory data analysis revealed that the mountains and hills are characterized by higher soil gas radon activity concentration and GRP than the plains. The highest values were found in the proluvial-deluvial sediments, rock debris on the downhill slopes eroded from hills. Among the Quaternary sediments, which characterize the study area, the fluvial sediment has the highest values, which are also located in the hilly areas. The lowest values were found in the plain areas covered by drift sand, fluvioeolic sand, fluvial sand and loess. In conclusion, radon is related to the sediment cycle in the study area. A geogenic radon risk map was created, which assists human health risk assessment and risk reduction since it indicates the potential of the source of indoor radon. The map shows that low and medium geogenic radon potential characterizes the study area in central Hungary. High risk occurs only locally. The results reveal that Quaternary sediments are inhomogeneous from a radon point of view: fluvial sediment has medium GRP, whereas the other formations found in the study area, such as drift sand, fluvioeolic sand, fluvial sand and loess, have low GRP.
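
    The record does not spell out the continuous-variable GRP calculation, but a commonly used formulation of that approach (the Neznal geogenic radon potential, assumed here rather than taken from the paper) combines the soil-gas radon concentration C (kBq/m3) with the gas permeability k (m2) as GRP = C / (-log10(k) - 10). A minimal sketch with invented sample values and the customary low/medium/high class breaks:

      import math

      def geogenic_radon_potential(c_kbq_m3, k_m2):
          """Neznal-style GRP from soil-gas radon concentration and gas permeability."""
          return c_kbq_m3 / (-math.log10(k_m2) - 10.0)

      def grp_class(grp):
          """Customary class breaks: <10 low, 10-35 medium, >35 high."""
          return "low" if grp < 10 else "medium" if grp <= 35 else "high"

      # Invented sample points spanning the three classes.
      for c, k in [(18.0, 5e-13), (45.0, 2e-12), (90.0, 8e-12)]:
          grp = geogenic_radon_potential(c, k)
          print(f"C={c:5.1f} kBq/m3, k={k:.0e} m2 -> GRP={grp:5.1f} ({grp_class(grp)})")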

  17. Mapping urban revitalization: using GIS spatial analysis to evaluate a new housing policy.

    PubMed

    Perkins, Douglas D; Larsen, Courtney; Brown, Barbara B

    2009-01-01

    This longitudinal, multimethod study uses geographical information system (GIS) software to evaluate the community-wide impact of a neighborhood revitalization project. Unsystematic visual examination and analysis of GIS maps are offered as a complementary tool to quantitative analysis, and one that is much more compelling, meaningful, and effective in presentation to community and nonscientific professional audiences. The centerpiece of the intervention was the development of a new, middle-class housing subdivision in an area that was declining physically and economically. This represents three major urban/housing policy directions: (1) the emphasis on home ownership for working-class families, (2) the deconcentration of poverty through development of mixed-income neighborhoods, and (3) the cleanup and redevelopment of contaminated, former industrial brownfields. Resident survey responses, objective environmental assessment observations, and building permit data were collected, geocoded at the address level, and aggregated to the block level on 60 street blocks in the older neighborhoods surrounding the new housing in two waves: during site clearing and housing construction (Time 1: 1993-95) and three years post-completion (Time 2: 1998-99). Variables mapped include (a) Time 1-2 change in self-reported home repairs and improvements, (b) change in the assessed physical condition of yards and exteriors of 925 individual residential properties, (c) change in residents' home pride, and (d) a city archive of building permits at Time 2. Physical conditions improved overall in the neighborhood, but spatial analysis of the maps suggests that the spillover effects, if any, of the new housing were geographically limited and included unintended negative psychological consequences. Results argue for greater use of GIS and the street block level in community research and of psychological and behavioral variables in planning research and decisions.
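
    The address-to-block aggregation step described above is a routine GIS operation. The sketch below shows one way it might be done with geopandas; the file names, column names and attributes (geocoded_survey_points.shp, BLOCK_ID, home_repair_change, home_pride_change) are illustrative assumptions, not the study's actual data.

      import geopandas as gpd

      # One point per geocoded respondent address, with Time 1 -> Time 2 change scores attached.
      surveys = gpd.read_file("geocoded_survey_points.shp")            # assumed file name
      blocks = gpd.read_file("street_blocks.shp")[["BLOCK_ID", "geometry"]]

      # Attach each survey point to the street block that contains it, then average the
      # change variables per block so they can be mapped at the block level.
      joined = gpd.sjoin(surveys.to_crs(blocks.crs), blocks, predicate="within")
      block_change = (joined.groupby("BLOCK_ID")[["home_repair_change", "home_pride_change"]]
                      .mean().reset_index())

      blocks.merge(block_change, on="BLOCK_ID").to_file("block_level_change.shp")
      print(block_change.head())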

  18. Physical mapping of a pollen modifier locus controlling self-incompatibility in apricot and synteny analysis within the Rosaceae.

    PubMed

    Zuriaga, Elena; Molina, Laura; Badenes, María Luisa; Romero, Carlos

    2012-06-01

    S-locus products (S-RNase and F-box proteins) are essential for the gametophytic self-incompatibility (GSI) specific recognition in Prunus. However, accumulated genetic evidence suggests that other S-locus unlinked factors are also required for GSI. For instance, GSI breakdown was associated with a pollen-part mutation unlinked to the S-locus in the apricot (Prunus armeniaca L.) cv. 'Canino'. Fine-mapping of this mutated modifier gene (M-locus) and synteny analysis of the M-locus within the Rosaceae are reported here. A segregation distortion loci mapping strategy, based on a selectively genotyped population, was used to map the M-locus. In addition, a bacterial artificial chromosome (BAC) contig was constructed for this region using overlapping oligonucleotide probes, and BAC-end sequences (BES) were blasted against Rosaceae genomes to perform micro-synteny analysis. The M-locus was mapped to the distal part of chr.3 flanked by two SSR markers within an interval of 1.8 cM corresponding to ~364 Kb in the peach (Prunus persica L. Batsch) genome. In the integrated genetic-physical map of this region, BES were mapped against the peach scaffold_3 and BACs were anchored to the apricot map. Micro-syntenic blocks were detected in apple (Malus × domestica Borkh.) LG17/9 and strawberry (Fragaria vesca L.) FG6 chromosomes. The M-locus fine-scale mapping provides a solid basis for self-compatibility marker-assisted selection and for positional cloning of the underlying gene, a necessary goal to elucidate the pollen rejection mechanism in Prunus. In a wider context, the syntenic regions identified in peach, apple and strawberry might be useful to interpret GSI evolution in Rosaceae.

  19. Local linear approximation of the Jacobian matrix better captures phase resetting of neural limit cycle oscillators.

    PubMed

    Oprisan, Sorinel Adrian

    2014-01-01

    One effect of any external perturbations, such as presynaptic inputs, received by limit cycle oscillators when they are part of larger neural networks is a transient change in their firing rate, or phase resetting. A brief external perturbation moves the figurative point outside the limit cycle, a geometric perturbation that we mapped into a transient change in the firing rate, or a temporal phase resetting. In order to gain a better qualitative understanding of the link between the geometry of the limit cycle and the phase resetting curve (PRC), we used a moving reference frame with one axis tangent and the others normal to the limit cycle. We found that the stability coefficients associated with the unperturbed limit cycle provided good quantitative predictions of both the tangent and the normal geometric displacements induced by external perturbations. A geometric-to-temporal mapping allowed us to correctly predict the PRC while preserving the intuitive nature of this geometric approach.
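
    The direct, numerical way of obtaining a phase resetting curve that such geometric analyses are usually compared against can be sketched in a few lines: settle a planar oscillator onto its limit cycle, kick the state at different phases, and measure the shift in the timing of a marker event. The Van der Pol oscillator, the kick size and all numerical settings below are illustrative assumptions, not the neural model used in the study.

      import numpy as np
      from scipy.integrate import solve_ivp

      def vdp(t, z, mu=1.0):
          """Van der Pol oscillator -- an illustrative planar limit-cycle system."""
          x, y = z
          return [y, mu * (1.0 - x**2) * y - x]

      def crossing_times(t, x, level=0.0):
          """Times of upward crossings of x through `level`, by linear interpolation."""
          idx = np.where((x[:-1] < level) & (x[1:] >= level))[0]
          return t[idx] + (level - x[idx]) * (t[idx + 1] - t[idx]) / (x[idx + 1] - x[idx])

      # Settle onto the limit cycle and estimate the period from the spacing of marker events.
      settle = solve_ivp(vdp, (0, 200), [2.0, 0.0], max_step=0.01, dense_output=True,
                         rtol=1e-8, atol=1e-10)
      tc = crossing_times(settle.t, settle.y[0])
      T = np.mean(np.diff(tc[-10:]))            # unperturbed period
      t0 = tc[-2]                               # a reference marker event on the cycle

      def prc_point(phase_frac, eps=0.2, n_periods=8):
          """Phase shift (fraction of a period) caused by an instantaneous kick of size eps in x."""
          t_kick = t0 + phase_frac * T
          z_kick = settle.sol(t_kick) + np.array([eps, 0.0])
          pert = solve_ivp(vdp, (t_kick, t_kick + n_periods * T), z_kick,
                           max_step=0.01, rtol=1e-8, atol=1e-10)
          tp = crossing_times(pert.t, pert.y[0])
          late = tp[-1]                         # a marker event well after the transient has decayed
          expected = t0 + np.round((late - t0) / T) * T
          return (expected - late) / T          # positive value = phase advance

      for phase in np.linspace(0.05, 0.95, 10):
          print(f"phase {phase:.2f} -> phase shift {prc_point(phase):+.3f} periods")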

  20. A tale of two fractals: The Hofstadter butterfly and the integral Apollonian gaskets

    NASA Astrophysics Data System (ADS)

    Satija, Indubala I.

    2016-11-01

    This paper unveils a mapping between a quantum fractal that describes a physical phenomenon and an abstract geometrical fractal. The quantum fractal is the Hofstadter butterfly discovered in 1976 in an iconic condensed matter problem of electrons moving in a two-dimensional lattice in a transverse magnetic field. The geometric fractal is the integer Apollonian gasket characterized in terms of a problem of mutually tangent circles dating to about 300 BC. Both of these fractals are made up of integers. In the Hofstadter butterfly, these integers encode the topological quantum numbers of quantum Hall conductivity. In the Apollonian gaskets an infinite number of mutually tangent circles are nested inside each other, where each circle has integer curvature. The mapping between these two fractals reveals a hidden D3 symmetry embedded in the kaleidoscopic images that describe the asymptotic scaling properties of the butterfly. This paper also serves as a mini review of these fractals, emphasizing their hierarchical aspects in terms of Farey fractions.
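
    The curvature arithmetic behind integral Apollonian gaskets is compact enough to sketch. Assuming the standard Descartes circle theorem (background knowledge, not a result of this paper), four mutually tangent circles with curvatures k1..k4 satisfy (k1+k2+k3+k4)^2 = 2(k1^2+k2^2+k3^2+k4^2), so k4 = k1+k2+k3 ± 2*sqrt(k1*k2+k2*k3+k3*k1), and the Vieta jump k4' = 2(k1+k2+k3) - k4 keeps generating integer curvatures from an integer root quadruple:

      import math

      def fourth_curvatures(k1, k2, k3):
          """Curvatures of the two circles tangent to three mutually tangent circles."""
          s = k1 + k2 + k3
          # For triples taken from an integral Descartes quadruple the discriminant is a
          # perfect square, so integer arithmetic suffices here.
          r = 2 * math.isqrt(k1 * k2 + k2 * k3 + k3 * k1)
          return s + r, s - r

      def reflect(k1, k2, k3, k4):
          """Vieta jump: swap k4 for the other circle tangent to the first three."""
          return 2 * (k1 + k2 + k3) - k4

      # Root quadruple (-1, 2, 2, 3): the outer circle has curvature -1 because it encloses the others.
      print(fourth_curvatures(-1, 2, 3))   # (6, 2): the two circles tangent to that triple
      print(reflect(2, 2, 3, -1))          # 15: replacing the bounding circle gives curvature 15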

  1. Fine mapping on chromosome 13q32-34 and brain expression analysis implicates MYO16 in schizophrenia.

    PubMed

    Rodriguez-Murillo, Laura; Xu, Bin; Roos, J Louw; Abecasis, Gonçalo R; Gogos, Joseph A; Karayiorgou, Maria

    2014-03-01

    We previously reported linkage of schizophrenia and schizoaffective disorder to 13q32-34 in the European descent Afrikaner population from South Africa. The nature of genetic variation underlying linkage peaks in psychiatric disorders remains largely unknown and both rare and common variants may be contributing. Here, we examine the contribution of common variants located under the 13q32-34 linkage region. We used densely spaced SNPs to fine map the linkage peak region using both a discovery sample of 415 families and a meta-analysis incorporating two additional replication family samples. In a second phase of the study, we use one family-based data set with 237 families and independent case-control data sets for fine mapping of the common variant association signal using HapMap SNPs. We report a significant association with a genetic variant (rs9583277) within the gene encoding for the myosin heavy-chain Myr 8 (MYO16), which has been implicated in neuronal phosphoinositide 3-kinase signaling. Follow-up analysis of HapMap variation within MYO16 in a second set of Afrikaner families and additional case-control data sets of European descent highlighted a region across introns 2-6 as the most likely region to harbor common MYO16 risk variants. Expression analysis revealed a significant increase in the level of MYO16 expression in the brains of schizophrenia patients. Our results suggest that common variation within MYO16 may contribute to the genetic liability to schizophrenia.

  2. Evidence of Allopolyploidy in Urochloa humidicola Based on Cytological Analysis and Genetic Linkage Mapping

    PubMed Central

    Vigna, Bianca B. Z.; Santos, Jean C. S.; Jungmann, Leticia; do Valle, Cacilda B.; Mollinari, Marcelo; Pastina, Maria M.; Garcia, Antonio A. F.

    2016-01-01

    The African species Urochloa humidicola (Rendle) Morrone & Zuloaga (syn. Brachiaria humidicola (Rendle) Schweick.) is an important perennial forage grass found throughout the tropics. This species is polyploid, ranging from tetra to nonaploid, and apomictic, which makes genetic studies challenging; therefore, the number of currently available genetic resources is limited. The genomic architecture and evolution of U. humidicola and the molecular markers linked to apomixis were investigated in a full-sib F1 population obtained by crossing the sexual accession H031 and the apomictic cultivar U. humidicola cv. BRS Tupi, both of which are hexaploid. A simple sequence repeat (SSR)-based linkage map was constructed for the species from 102 polymorphic and specific SSR markers based on simplex and double-simplex markers. The map consisted of 49 linkage groups (LGs) and had a total length of 1702.82 cM, with 89 microsatellite loci and an average map density of 10.6 cM. Eight homology groups (HGs) were formed, comprising 22 LGs, and the other LGs remained ungrouped. The locus that controls apospory (apo-locus) was mapped in LG02 and was located 19.4 cM from the locus Bh027.c.D2. In the cytological analyses of some hybrids, bi- to hexavalents at diakinesis were observed, as well as two nucleoli in some meiocytes, smaller chromosomes with preferential allocation within the first metaphase plate and asynchronous chromosome migration to the poles during anaphase. The linkage map and the meiocyte analyses confirm previous reports of hybridization and suggest an allopolyploid origin of the hexaploid U. humidicola. This is the first linkage map of an Urochloa species, and it will be useful for future quantitative trait locus (QTL) analysis after saturation of the map and for genome assembly and evolutionary studies in Urochloa spp. Moreover, the results of the apomixis mapping are consistent with previous reports and confirm the need for additional studies to search for a co

  3. Structure-function analysis of the extracellular domain of the pneumococcal cell division site positioning protein MapZ

    NASA Astrophysics Data System (ADS)

    Manuse, Sylvie; Jean, Nicolas L.; Guinot, Mégane; Lavergne, Jean-Pierre; Laguri, Cédric; Bougault, Catherine M.; Vannieuwenhze, Michael S.; Grangeasse, Christophe; Simorre, Jean-Pierre

    2016-06-01

    Accurate placement of the bacterial division site is a prerequisite for the generation of two viable and identical daughter cells. In Streptococcus pneumoniae, the positive regulatory mechanism involving the membrane protein MapZ positions precisely the conserved cell division protein FtsZ at the cell centre. Here we characterize the structure of the extracellular domain of MapZ and show that it displays a bi-modular structure composed of two subdomains separated by a flexible serine-rich linker. We further demonstrate in vivo that the N-terminal subdomain serves as a pedestal for the C-terminal subdomain, which determines the ability of MapZ to mark the division site. The C-terminal subdomain displays a patch of conserved amino acids and we show that this patch defines a structural motif crucial for MapZ function. Altogether, this structure-function analysis of MapZ provides the first molecular characterization of a positive regulatory process of bacterial cell division.

  4. Mapping of wildlife habitat in Farmington Bay, Utah

    NASA Technical Reports Server (NTRS)

    Jaynes, R. A.; Willie, R. D. (Principal Investigator)

    1982-01-01

    Mapping was accomplished through the interpretation of high-altitude color infrared photography. The feasibility of utilizing LANDSAT digital data to augment the analysis was explored; complex patterns of wildlife habitat and confusion of spectral classes resulted in the decision to make limited use of LANDSAT data in the analysis. The final product is a map which delineates wildlife habitat at a scale of 1:24,000. The map is registered to and printed on a screened U.S.G.S. quadrangle base map. Screened delineations of shoreline contours, mapped from a previous study, are also shown on the map. Intensive field checking of the map was accomplished for the Farmington Bay Waterfowl Management Area in August 1981; other areas on the map received only spot field checking.

  5. Mind mapping in qualitative research.

    PubMed

    Tattersall, Christopher; Powell, Julia; Stroud, James; Pringle, Jan

    We tested a theory that mind mapping could be used as a tool in qualitative research to transcribe and analyse an interview. We compared results derived from mind mapping with those from interpretive phenomenological analysis by examining patients' and carers' perceptions of a new nurse-led service. Mind mapping could be used to rapidly analyse simple qualitative audio-recorded interviews. More research is needed to establish the extent to which mind mapping can assist qualitative researchers.

  6. High-resolution tree canopy mapping for New York City using LIDAR and object-based image analysis

    NASA Astrophysics Data System (ADS)

    MacFaden, Sean W.; O'Neil-Dunne, Jarlath P. M.; Royar, Anna R.; Lu, Jacqueline W. T.; Rundle, Andrew G.

    2012-01-01

    Urban tree canopy is widely believed to have myriad environmental, social, and human-health benefits, but a lack of precise canopy estimates has hindered quantification of these benefits in many municipalities. This problem was addressed for New York City using object-based image analysis (OBIA) to develop a comprehensive land-cover map, including tree canopy to the scale of individual trees. Mapping was performed using a rule-based expert system that relied primarily on high-resolution LIDAR, specifically its capacity for evaluating the height and texture of aboveground features. Multispectral imagery was also used, but shadowing and varying temporal conditions limited its utility. Contextual analysis was a key part of classification, distinguishing trees according to their physical and spectral properties as well as their relationships to adjacent, nonvegetated features. The automated product was extensively reviewed and edited via manual interpretation, and overall per-pixel accuracy of the final map was 96%. Although manual editing had only a marginal effect on accuracy despite requiring a majority of project effort, it maximized aesthetic quality and ensured the capture of small, isolated trees. Converting high-resolution LIDAR and imagery into usable information is a nontrivial exercise, requiring significant processing time and labor, but an expert system-based combination of OBIA and manual review was an effective method for fine-scale canopy mapping in a complex urban environment.

  7. New mapping of Raditladi basin and detailed analysis of its inner plains

    NASA Astrophysics Data System (ADS)

    Minelli, Francesco; Giorgetti, Carolina; Mondini, Alessandro; Pauselli, Cristina; Mancinelli, Paolo

    2013-04-01

    Introduction: The Raditladi basin is a large peak-ring impact crater discovered during the MESSENGER (MErcury Surface, Space ENvironment, GEochemistry, and Ranging) first flyby of Mercury in January 2008 [1]. The Raditladi basin is relatively young [2], and the study of its internal structures gives an indication of the processes that acted recently in Mercury's geological history. Geological mapping: We first present the geological mapping of Raditladi crater. In the map we defined different sub-units on the basis of previous studies [4][5] and of surface morphology and reflectance. Through GIS software we associated a polygonal layer with each sub-unit, which allowed us to distinguish nine different layers. Because of the similarities with the Rachmaninoff basin, to define the sub-units mapped on Raditladi we adopted the unit definitions for Rachmaninoff crater made by Marchi et al. (2011) [4]. Structure analysis: We also mapped secondary structures consisting of concentric troughs arranged in a circular pattern. We defined two different kinds of troughs: (i) structures characterized by a distinct flat floor and interpretable as grabens, and (ii) structures with linear and curvilinear segments [5]. Inner plain deposit: The analysis of the topography made possible the estimation of the deposit's thickness. The measurement of the thickness is possible thanks to the presence of two small craters, crater A and crater B, located in Raditladi's inner plain. Observing the morphology of the two small craters' rim and hummocky central floor, we distinguished two different units: the shallower consists in

  8. NoiseMap and AEDT Gap Analysis

    DOT National Transportation Integrated Search

    2017-09-30

    NoiseMap and the Aviation Environmental Design Tool (AEDT) both use an integrated modeling approach to calculate aircraft noise in and around an airfield. Both models also employ the same general overall approach by using airfield operational data, s...

  9. Review, mapping and analysis of the agricultural plastic waste generation and consolidation in Europe.

    PubMed

    Briassoulis, Demetres; Babou, Epifania; Hiskakis, Miltiadis; Scarascia, Giacomo; Picuno, Pietro; Guarde, Dorleta; Dejean, Cyril

    2013-12-01

    A review of agricultural plastic waste generation and consolidation in Europe is presented. A detailed geographical mapping of the agricultural plastic use and waste generation in Europe was conducted, focusing on areas of high concentration of agricultural plastics. Quantitative data and analysis are presented for agricultural plastic waste generation by category, geographical distribution, compositional range, physical characteristics per use, and the temporal distribution of waste generation. Data were collected and cross-checked from a variety of sources, including European, national and regional services and organizations, local agronomists, retailers and farmers, importers and converters. Missing data were estimated indirectly based on the recorded cultivated areas and the characteristics of the agricultural plastics commonly used in the particular regions. The temporal distribution, the composition and physical characteristics of the agricultural plastic waste streams were mapped by category and by application. This study represents the first systematic effort to map and analyse agricultural plastic waste generation and consolidation in Europe.

  10. Genetic analysis and fine mapping of a rice brown planthopper (Nilaparvata lugens Stål) resistance gene bph19(t).

    PubMed

    Chen, J W; Wang, L; Pang, X F; Pan, Q H

    2006-04-01

    Genetic analysis and fine mapping of a resistance gene against brown planthopper (BPH) biotype 2 in rice was performed using two F(2) populations derived from two crosses between a resistant indica cultivar (cv.), AS20-1, and two susceptible japonica cvs., Aichi Asahi and Lijiangxintuanheigu. Insect resistance was evaluated using F(1) plants and the two F(2) populations. The results showed that a single recessive gene, tentatively designated as bph19(t), conditioned the resistance in AS20-1. A linkage analysis, mainly employing microsatellite markers, was carried out in the two F(2) populations through bulked segregant analysis and recessive class analysis (RCA), in combination with bioinformatics analysis (BIA). The resistance gene locus bph19(t) was finely mapped to a region of about 1.0 cM on the short arm of chromosome 3, flanked by markers RM6308 and RM3134, where one known marker RM1022, and four new markers, b1, b2, b3 and b4, developed in the present study were co-segregating with the locus. To physically map this locus, the bph19(t)-linked markers were landed on bacterial artificial chromosome or P1 artificial chromosome clones of the reference cv., Nipponbare, released by the International Rice Genome Sequencing Project. Sequence information of these clones was used to construct a physical map of the bph19(t) locus, in silico, by BIA. The bph19(t) locus was physically defined to an interval of about 60 kb. The detailed genetic and physical maps of the bph19(t) locus will facilitate marker-assisted gene pyramiding and cloning.

  11. Oregon Cascades Play Fairway Analysis: Maps

    DOE Data Explorer

    Trimble, John

    2015-12-15

    The maps in this submission include: heat flow, alkalinity, Cl, Mg, SiO2, Quaternary volcanic rocks, faults, and land ownership. All maps cover the Oregon Cascade region. The work was done by John Trimble in 2015 at Oregon State University.

  12. Identification and Validation of Loci Governing Seed Coat Color by Combining Association Mapping and Bulk Segregation Analysis in Soybean

    PubMed Central

    Ma, Yansong; Tian, Long; Li, Xinxiu; Li, Ying-Hui; Guan, Rongxia; Guo, Yong; Qiu, Li-Juan

    2016-01-01

    Soybean seed coat exists in a range of colors from yellow, green, brown, black, to bicolor. Classical genetic analysis suggested that soybean seed color was a moderately complex trait controlled by multiple loci. However, only a couple of loci could be detected using a single biparental segregating population. In this study, a combination of association mapping and bulk segregation analysis was employed to identify genes/loci governing this trait in soybean. A total of 14 loci, including nine novel and five previously reported ones, were identified using 176,065 coding SNPs selected from the entire SNP dataset among 56 soybean accessions. Four of these loci were confirmed and further mapped using a biparental population developed from the cross between ZP95-5383 (yellow seed color) and NY279 (brown seed color), in which different seed coat colors were further dissected into simple trait pairs (green/yellow, green/black, green/brown, yellow/black, yellow/brown, and black/brown) by continuously developing residual heterozygous lines. By genotyping the entire F2 population using flanking markers located in fine-mapping regions, the genetic basis of seed coat color was fully dissected and these four loci could explain all variations of seed colors in this population. These findings will be useful for map-based cloning of genes as well as marker-assisted breeding in soybean. This work also provides an alternative strategy for systematically isolating genes controlling relatively complex traits by association analysis followed by biparental mapping. PMID:27404272

  13. The Europa Global Geologic Map

    NASA Astrophysics Data System (ADS)

    Leonard, E. J.; Patthoff, D. A.; Senske, D. A.; Collins, G. C.

    2018-06-01

    The Europa Global Geologic Map reveals three periods in Europa's surface history as well as an interesting distribution of microchaos. We will discuss the mapping and the interesting implications of our analysis of Europa's surface.

  14. An Analysis of Prospective Teachers' Knowledge for Constructing Concept Maps

    ERIC Educational Resources Information Center

    Subramaniam, Karthigeyan; Esprívalo Harrell, Pamela

    2015-01-01

    Background: Literature contends that a teacher's knowledge of concept map-based tasks influence how their students perceive the task and execute the creation of acceptable concept maps. Teachers who are skilled concept mappers are able to (1) understand and apply the operational terms to construct a hierarchical/non-hierarchical concept map; (2)…

  15. Mapping sleeping bees within their nest: spatial and temporal analysis of worker honey bee sleep.

    PubMed

    Klein, Barrett Anthony; Stiegler, Martin; Klein, Arno; Tautz, Jürgen

    2014-01-01

    Patterns of behavior within societies have long been visualized and interpreted using maps. Mapping the occurrence of sleep across individuals within a society could offer clues as to functional aspects of sleep. In spite of this, a detailed spatial analysis of sleep has never been conducted on an invertebrate society. We introduce the concept of mapping sleep across an insect society, and provide an empirical example, mapping sleep patterns within colonies of European honey bees (Apis mellifera L.). Honey bees face variables such as temperature and position of resources within their colony's nest that may impact their sleep. We mapped sleep behavior and temperature of worker bees and produced maps of their nest's comb contents as the colony grew and contents changed. By following marked bees, we discovered that individuals slept in many locations, but bees of different worker castes slept in different areas of the nest relative to position of the brood and surrounding temperature. Older worker bees generally slept outside cells, closer to the perimeter of the nest, in colder regions, and away from uncapped brood. Younger worker bees generally slept inside cells and closer to the center of the nest, and spent more time asleep than awake when surrounded by uncapped brood. The average surface temperature of sleeping foragers was lower than the surface temperature of their surroundings, offering a possible indicator of sleep for this caste. We propose mechanisms that could generate caste-dependent sleep patterns and discuss functional significance of these patterns.

  16. Porphyry copper deposit tract definition - A global analysis comparing geologic map scales

    USGS Publications Warehouse

    Raines, G.L.; Connors, K.A.; Chorlton, L.B.

    2007-01-01

    Geologic maps are a fundamental data source used to define mineral-resource potential tracts for the first step of a mineral resource assessment. Further, it is generally believed that the scale of the geologic map is a critical consideration. Previously published research has demonstrated that the U.S. Geological Survey porphyry tracts identified for the United States, which are based on 1:500,000-scale geology and larger scale data and published at 1:1,000,000 scale, can be approximated using a more generalized 1:2,500,000-scale geologic map. Comparison of the USGS porphyry tracts for the United States with weights-of-evidence models made using a 1:10,000,000-scale geologic map, which was made for petroleum applications, and a 1:35,000,000-scale geologic map, which was created as context for the distribution of porphyry deposits, demonstrates that, again, the USGS US porphyry tracts are similar to tracts defined on features from these small-scale maps. In fact, the results using the 1:35,000,000-scale map show a slightly higher correlation with the USGS US tract definition, probably because the conceptual context for this small-scale map is more appropriate for porphyry tract definition than either of the other maps. This finding demonstrates that geologic maps are conceptual maps. The map information shown in each map is selected and generalized to display the concepts deemed important for the map maker's purpose. Some geologic maps of small scale prove to be useful for regional mineral-resource tract definition, despite the decrease in spatial accuracy with decreasing scale. The utility of a particular geologic map for a particular application is critically dependent on the alignment of the intention of the map maker with the application.
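
    For a single binary evidence layer, the weights-of-evidence modelling mentioned above reduces to two log-ratios computed from unit-cell counts. A minimal sketch of the standard calculation; the counts are made-up illustrative numbers, not values from the USGS assessments:

      import math

      def weights_of_evidence(n_total, n_deposits, n_pattern, n_pattern_with_deposit):
          """W+, W- and contrast C for one binary evidence pattern B versus deposit points D."""
          p_b_given_d = n_pattern_with_deposit / n_deposits
          p_b_given_not_d = (n_pattern - n_pattern_with_deposit) / (n_total - n_deposits)
          w_plus = math.log(p_b_given_d / p_b_given_not_d)
          w_minus = math.log((1 - p_b_given_d) / (1 - p_b_given_not_d))
          return w_plus, w_minus, w_plus - w_minus      # contrast C = W+ - W-

      # 10,000 unit cells, 40 deposits, a favourable map unit covering 1,500 cells with 25 deposits on it.
      print(weights_of_evidence(10_000, 40, 1_500, 25))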

  17. Image processing for optical mapping.

    PubMed

    Ravindran, Prabu; Gupta, Aditya

    2015-01-01

    Optical Mapping is an established single-molecule, whole-genome analysis system, which has been used to gain a comprehensive understanding of genomic structure and to study structural variation of complex genomes. A critical component of Optical Mapping system is the image processing module, which extracts single molecule restriction maps from image datasets of immobilized, restriction digested and fluorescently stained large DNA molecules. In this review, we describe robust and efficient image processing techniques to process these massive datasets and extract accurate restriction maps in the presence of noise, ambiguity and confounding artifacts. We also highlight a few applications of the Optical Mapping system.

  18. Mapping of non-numerical domains on space: a systematic review and meta-analysis.

    PubMed

    Macnamara, Anne; Keage, Hannah A D; Loetscher, Tobias

    2018-02-01

    The spatial numerical association of response code (SNARC) effect is characterized by low numbers mapped to the left side of space and high numbers mapped to the right side of space. In addition to numbers, SNARC-like effects have been found in non-numerical magnitude domains such as time, size, letters, luminance, and more, whereby the smaller/earlier and larger/later magnitudes are typically mapped to the left and right of space, respectively. The purpose of this systematic and meta-analytic review was to identify and summarise all empirical papers that have investigated horizontal (left-right) SNARC-like mappings using non-numerical stimuli. A systematic search was conducted using EMBASE, Medline, and PsycINFO, where 2216 publications were identified, with 57 papers meeting the inclusion criteria (representing 112 experiments). Ninety-five of these experiments were included in a meta-analysis, resulting in an overall effect size of d = .488 for a SNARC-like effect. Additional analyses revealed a significant effect size advantage for explicit instruction tasks compared with implicit instructions, yet yielded no difference for the role of expertise on SNARC-like effects. There was clear evidence for a publication bias in the field, but the impact of this bias is likely to be modest, and it is unlikely that the SNARC-like effect is a pure artefact of this bias. The similarities in the response properties for the spatial mappings of numerical and non-numerical domains support the concept of a general higher order magnitude system. Yet, further research will need to be conducted to identify all the factors modulating the strength of the spatial associations.

  19. Single strand conformation polymorphism based SNP and Indel markers for genetic mapping and synteny analysis of common bean (Phaseolus vulgaris L.)

    PubMed Central

    2009-01-01

    Background Expressed sequence tags (ESTs) are an important source of gene-based markers such as those based on insertion-deletions (Indels) or single-nucleotide polymorphisms (SNPs). Several gel based methods have been reported for the detection of sequence variants; however, they have not been widely exploited in common bean, an important legume crop of the developing world. The objectives of this project were to develop and map EST based markers using analysis of single strand conformation polymorphisms (SSCPs), to create a transcript map for common bean and to compare synteny of the common bean map with sequenced chromosomes of other legumes. Results A set of 418 EST based amplicons was evaluated for parental polymorphisms using the SSCP technique and 26% of these presented a clear conformational or size polymorphism between Andean and Mesoamerican genotypes. The amplicon based markers were then used for genetic mapping with segregation analysis performed in the DOR364 × G19833 recombinant inbred line (RIL) population. A total of 118 new marker loci were placed into an integrated molecular map for common bean consisting of 288 markers. Of these, 218 were used for synteny analysis and 186 presented homology with segments of the soybean genome with an e-value lower than 7 × 10^-12. The synteny analysis with soybean showed a mosaic pattern of syntenic blocks with most segments of any one common bean linkage group associated with two soybean chromosomes. The analysis with Medicago truncatula and Lotus japonicus presented fewer syntenic regions consistent with the more distant phylogenetic relationship between the galegoid and phaseoloid legumes. Conclusion The SSCP technique is a useful and inexpensive alternative to other SNP or Indel detection techniques for saturating the common bean genetic map with functional markers that may be useful in marker assisted selection. In addition, the genetic markers based on ESTs allowed the construction of a transcript map and

  20. Using known map category marginal frequencies to improve estimates of thematic map accuracy

    NASA Technical Reports Server (NTRS)

    Card, D. H.

    1982-01-01

    By means of two simple sampling plans suggested in the accuracy-assessment literature, it is shown how one can use knowledge of map-category relative sizes to improve estimates of various probabilities. The fact that maximum likelihood estimates of cell probabilities for the simple random sampling and map category-stratified sampling were identical has permitted a unified treatment of the contingency-table analysis. A rigorous analysis of the effect of sampling independently within map categories is made possible by results for the stratified case. It is noted that such matters as optimal sample size selection for the achievement of a desired level of precision in various estimators are irrelevant, since the estimators derived are valid irrespective of how sample sizes are chosen.
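
    The gain from knowing the map-category marginal proportions can be made concrete with a small numerical sketch in the spirit of the estimators discussed above. Under map-category-stratified sampling, cell probabilities are estimated as p_ij = pi_j * n_ij / n_+j, where pi_j is the known area proportion of map class j, and overall, user's and producer's accuracies follow directly. The confusion-matrix counts and area proportions below are invented for illustration:

      import numpy as np

      def accuracy_with_known_marginals(n, pi):
          """n[i, j]: counts with reference class i and map class j; pi[j]: known map-class area proportion."""
          n = np.asarray(n, dtype=float)
          pi = np.asarray(pi, dtype=float)
          col_tot = n.sum(axis=0)                     # sample size drawn in each map class
          p = pi * n / col_tot                        # estimated cell probabilities p_ij
          overall = np.trace(p)                       # overall map accuracy
          users = np.diag(n) / col_tot                # per-map-class (user's) accuracy
          producers = np.diag(p) / p.sum(axis=1)      # per-reference-class (producer's) accuracy
          return overall, users, producers

      n = [[45,  4,  1],
           [ 5, 38,  7],
           [ 0,  8, 42]]
      pi = [0.55, 0.30, 0.15]                         # map-category area proportions taken from the map
      print(accuracy_with_known_marginals(n, pi))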

  1. Porcine MAP3K5 analysis: molecular cloning, characterization, tissue expression pattern, and copy number variations associated with residual feed intake.

    PubMed

    Pu, L; Zhang, L C; Zhang, J S; Song, X; Wang, L G; Liang, J; Zhang, Y B; Liu, X; Yan, H; Zhang, T; Yue, J W; Li, N; Wu, Q Q; Wang, L X

    2016-08-12

    Mitogen-activated protein kinase kinase kinase 5 (MAP3K5) is essential for apoptosis, proliferation, differentiation, and immune responses, and is a candidate marker for residual feed intake (RFI) in pig. We cloned the full-length cDNA sequence of porcine MAP3K5 by rapid-amplification of cDNA ends. The 5451-bp gene contains a 5'-untranslated region (UTR) (718 bp), a coding region (3738 bp), and a 3'-UTR (995 bp), and encodes a peptide of 1245 amino acids, which shares 97, 99, 97, 93, 91, and 84% sequence identity with cattle, sheep, human, mouse, chicken, and zebrafish MAP3K5, respectively. The deduced MAP3K5 protein sequence contains two conserved domains: a DUF4071 domain and a protein kinase domain. Phylogenetic analysis showed that porcine MAP3K5 forms a separate branch to vicugna and camel MAP3K5. Tissue expression analysis using real-time quantitative polymerase chain reaction (qRT-PCR) revealed that MAP3K5 was expressed in the heart, liver, spleen, lung, kidney, muscle, fat, pancreas, ileum, and stomach tissues. Copy number variation was detected for porcine MAP3K5 and validated by qRT-PCR. Furthermore, a significant increase in average copy number was detected in the low RFI group when compared to the high RFI group in a Duroc pig population. These results provide useful information regarding the influence of MAP3K5 on RFI in pigs.

  2. Human factors analysis for a 2D enroute moving map application

    NASA Astrophysics Data System (ADS)

    Pschierer, Christian; Wipplinger, Patrick; Schiefele, Jens; Cromer, Scot; Laurin, John; Haffner, Skip

    2005-05-01

    The paper describes flight trials performed in Centennial, CO with a Piper Cheyenne from Marinvent. Six pilots flew the Cheyenne in twelve enroute segments between Denver Centennial and Colorado Springs. Two different settings (paper chart, enroute moving map) were evaluated in randomized order. The flight trial goal was to evaluate the objective performance of pilots compared among the different settings. As dependent variables, positional accuracy and situational awareness probe (SAP) responses were measured. Analysis was conducted by an ANOVA test. In parallel, all pilots answered subjective Cooper-Harper, NASA TLX, situation awareness rating technique (SART), Display Readability Rating and debriefing questionnaires. The tested enroute moving map application has Jeppesen chart compliant symbologies for high-enroute and low-enroute. It has a briefing mode where all information found on today's enroute paper chart, together with a loaded flight plan, is displayed in a north-up orientation. The execution mode displays a loaded flight plan routing together with only the pertinent flight-route-relevant information in either a track-up or north-up orientation. Depiction of an own-ship symbol is possible in both modes. All text and symbols are deconflicted. Additional information can be obtained by clicking on symbols. Terrain and obstacle data can be displayed for enhanced situation awareness. The results show that pilots flying the 2D enroute moving map display perform no worse than pilots with conventional systems. Flight technical error and workload are equivalent or lower, and situational awareness is higher than with conventional paper charts.

  3. QTL mapping for downy mildew resistance in cucumber via bulked segregant analysis using next-generation sequencing and conventional methods.

    PubMed

    Win, Khin Thanda; Vegas, Juan; Zhang, Chunying; Song, Kihwan; Lee, Sanghyeob

    2017-01-01

    QTL mapping using NGS-assisted BSA was successfully applied to an F2 population for downy mildew resistance in cucumber. QTLs detected by NGS-assisted BSA were confirmed by conventional QTL analysis. Downy mildew (DM), caused by Pseudoperonospora cubensis, is one of the most destructive foliar diseases in cucumber. QTL mapping is a fundamental approach for understanding the genetic inheritance of DM resistance in cucumber. Recently, many studies have reported that a combination of bulked segregant analysis (BSA) and next-generation sequencing (NGS) can be a rapid and cost-effective way of mapping QTLs. In this study, we applied NGS-assisted BSA to QTL mapping of DM resistance in cucumber and confirmed the results by conventional QTL analysis. By sequencing two DNA pools, each consisting of ten individuals showing high resistance or high susceptibility to DM from an F2 population, we identified single nucleotide polymorphisms (SNPs) between the two pools. We employed a statistical method for QTL mapping based on these SNPs. Five QTLs, dm2.2, dm4.1, dm5.1, dm5.2, and dm6.1, were detected, and dm2.2 showed the largest effect on DM resistance. Conventional QTL analysis using the F2 confirmed dm2.2 (R^2 = 10.8-24 %) and dm5.2 (R^2 = 14-27.2 %) as major QTLs and dm4.1 (R^2 = 8 %) as a minor QTL, but could not detect dm5.1 and dm6.1. A new QTL on chromosome 2, dm2.1 (R^2 = 28.2 %), was detected by the conventional QTL method using an F3 population. This study demonstrated the effectiveness of NGS-assisted BSA for mapping QTLs conferring DM resistance in cucumber and revealed the unique genetic inheritance of DM resistance in this population through two distinct major QTLs on chromosome 2 that mainly harbor DM resistance.
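
    The record only states that a SNP-based statistical method was applied to the two bulks; a common choice in NGS-assisted BSA is the SNP-index / delta-SNP-index statistic of QTL-seq-style analyses, sketched below as an illustrative stand-in with invented read depths rather than the study's data.

      import numpy as np

      def snp_index(alt_depth, total_depth):
          """Fraction of reads carrying the alternate (e.g. resistant-parent) allele at each SNP."""
          return np.asarray(alt_depth) / np.asarray(total_depth)

      # Read depths at four hypothetical SNP positions in the resistant and susceptible bulks.
      alt_res, tot_res = [18, 22, 30, 9], [25, 28, 33, 30]
      alt_sus, tot_sus = [6, 5, 4, 16], [24, 27, 30, 31]

      delta = snp_index(alt_res, tot_res) - snp_index(alt_sus, tot_sus)
      print(np.round(delta, 2))   # values far from 0 over a sliding window point to a linked QTL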

  4. Mapping seabed sediments: Comparison of manual, geostatistical, object-based image analysis and machine learning approaches

    NASA Astrophysics Data System (ADS)

    Diesing, Markus; Green, Sophie L.; Stephens, David; Lark, R. Murray; Stewart, Heather A.; Dove, Dayton

    2014-08-01

    Marine spatial planning and conservation need underpinning with sufficiently detailed and accurate seabed substrate and habitat maps. Although multibeam echosounders enable us to map the seabed with high resolution and spatial accuracy, there is still a lack of fit-for-purpose seabed maps. This is due to the high costs involved in carrying out systematic seabed mapping programmes and the fact that the development of validated, repeatable, quantitative and objective methods of swath acoustic data interpretation is still in its infancy. We compared a wide spectrum of approaches, including manual interpretation, geostatistics, object-based image analysis and machine learning, to gain further insights into the accuracy and comparability of acoustic data interpretation approaches based on multibeam echosounder data (bathymetry, backscatter and derivatives) and seabed samples, with the aim of deriving seabed substrate maps. Sample data were split into a training and validation data set to allow us to carry out an accuracy assessment. Overall thematic classification accuracy ranged from 67% to 76% and Cohen's kappa varied between 0.34 and 0.52. However, these differences were not statistically significant at the 5% level. Misclassifications were mainly associated with uncommon classes, which were rarely sampled. Map outputs were between 68% and 87% identical. To improve classification accuracy in seabed mapping, we suggest that more studies on the factors affecting classification performance, as well as comparative studies testing the performance of different approaches, need to be carried out with a view to developing guidelines for selecting an appropriate method for a given dataset. In the meantime, classification accuracy might be improved by combining different techniques into hybrid approaches and multi-method ensembles.
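
    The two accuracy figures quoted above (overall thematic accuracy and Cohen's kappa) both come straight from a confusion matrix of validation samples. A small sketch of that calculation, with an invented matrix rather than the study's data:

      import numpy as np

      def overall_and_kappa(cm):
          """Overall accuracy and Cohen's kappa from a confusion matrix (rows: reference, cols: predicted)."""
          cm = np.asarray(cm, dtype=float)
          total = cm.sum()
          po = np.trace(cm) / total                                  # observed agreement
          pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total**2    # agreement expected by chance
          return po, (po - pe) / (1 - pe)

      cm = [[60, 10,  5],
            [ 8, 45, 12],
            [ 4,  9, 47]]
      print(overall_and_kappa(cm))                                   # roughly (0.76, 0.64) for this matrix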

  5. Korean coastal water depth/sediment and land cover mapping (1:25,000) by computer analysis of LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Park, K. Y.; Miller, L. D.

    1978-01-01

    Computer analysis was applied to single-date LANDSAT MSS imagery of a sample coastal area near Seoul, Korea, equivalent in extent to a 1:50,000 topographic map. Supervised image processing yielded a test classification map from this sample image containing 12 classes: 5 water depth/sediment classes, 2 shoreline/tidal classes, and 5 coastal land cover classes at a scale of 1:25,000 and with a training set accuracy of 76%. Unsupervised image classification was applied to a subportion of the site and produced classification maps with spatially comparable results. The results of this test indicated that it is feasible to produce such quantitative maps for detailed study of dynamic coastal processes, given a LANDSAT image data base at sufficiently frequent time intervals.

  6. Preclinical medical students' understandings of academic and medical professionalism: visual analysis of mind maps.

    PubMed

    Janczukowicz, Janusz; Rees, Charlotte E

    2017-08-18

    Several studies have begun to explore medical students' understandings of professionalism generally and medical professionalism specifically. Despite espoused relationships between academic (AP) and medical professionalism (MP), previous research has not yet investigated students' conceptualisations of AP and MP and the relationships between the two. The current study, based on innovative visual analysis of mind maps, therefore aims to contribute to the developing literature on how professionalism is understood. We performed a multilayered analysis of 98 mind maps from 262 first-year medical students, including analysing textual and graphical elements of AP, MP and the relationships between AP and MP. The most common textual attributes of AP were learning, lifestyle and personality, while attributes of MP were knowledge, ethics and patient-doctor relations. Images of books, academic caps and teachers were used most often to represent AP, while images of the stethoscope, doctor and red cross were used to symbolise MP. While AP-MP relations were sometimes indicated through co-occurring text, visual connections and higher-order visual metaphors, many students struggled to articulate the relationships between AP and MP. While the mind maps' textual attributes shared similarities with those found in previous research, suggesting the universality of some professionalism attributes, our study provides new insights into students' conceptualisations of AP, MP and AP-MP relationships. We encourage medical educators to help students develop their understandings of AP, MP and AP-MP relationships, plus consider the feasibility and value of mind maps as a source of visual data for medical education research.

  7. Thermo-mechanical Modelling of Pebble Beds in Fusion Blankets and its Implementation by a Return-Mapping Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gan, Yixiang; Kamlah, Marc

    In this investigation, a thermo-mechanical model of pebble beds is adopted and developed based on experiments by Dr. Reimann at Forschungszentrum Karlsruhe (FZK). The framework of the present material model is composed of a non-linear elastic law, the Drucker-Prager-Cap theory, and a modified creep law. Furthermore, the volumetric inelastic strain dependent thermal conductivity of beryllium pebble beds is taken into account and full thermo-mechanical coupling is considered. Investigation showed that the Drucker-Prager-Cap model implemented in ABAQUS cannot fulfill the requirements of both the prediction of large creep strains and the hardening behaviour caused by creep, which are of importance with respect to the application of pebble beds in fusion blankets. Therefore, UMAT (user defined material's mechanical behaviour) and UMATHT (user defined material's thermal behaviour) routines are used to re-implement the present thermo-mechanical model in ABAQUS. An elastic predictor radial return mapping algorithm is used to solve the non-associated plasticity iteratively, and a proper tangent stiffness matrix is obtained for cost-efficiency in the calculation. An explicit creep mechanism is adopted for the prediction of time-dependent behaviour in order to represent large creep strains at high temperature. Finally, the thermo-mechanical interactions are implemented in a UMATHT routine for the coupled analysis. The oedometric compression tests and creep tests of pebble beds at different temperatures are simulated with the help of the present UMAT and UMATHT routines, and a comparison between the simulation and the experiments is made. (authors)
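
    The elastic predictor / radial return structure referred to above can be sketched compactly for a much simpler material than the Drucker-Prager-Cap-plus-creep model used in the paper: small-strain J2 (von Mises) plasticity with linear isotropic hardening. The material constants and the strain state below are made-up illustrative values; the point is only the predictor-corrector pattern.

      import numpy as np

      E, nu, sigma_y, H = 200e9, 0.3, 250e6, 10e9        # Young's modulus, Poisson ratio, yield stress, hardening
      G = E / (2 * (1 + nu))                             # shear modulus
      K = E / (3 * (1 - 2 * nu))                         # bulk modulus

      def radial_return(eps, eps_p_old, alpha_old):
          """One strain-driven stress update for 3x3 small-strain tensors."""
          eps_dev = eps - np.trace(eps) / 3.0 * np.eye(3)
          p = K * np.trace(eps)                          # volumetric response stays elastic
          # 1. Elastic predictor: freeze the plastic variables and compute a trial stress.
          s_trial = 2.0 * G * (eps_dev - eps_p_old)
          q_trial = np.sqrt(1.5 * np.tensordot(s_trial, s_trial))
          f_trial = q_trial - (sigma_y + H * alpha_old)
          if f_trial <= 0.0:                             # trial state admissible: purely elastic step
              return s_trial + p * np.eye(3), eps_p_old, alpha_old
          # 2. Plastic corrector: return radially to the updated yield surface.
          dgamma = f_trial / (3.0 * G + H)               # closed form for linear hardening
          n_dir = 1.5 * s_trial / q_trial                # associated flow direction
          eps_p_new = eps_p_old + dgamma * n_dir
          s_new = s_trial - 2.0 * G * dgamma * n_dir
          return s_new + p * np.eye(3), eps_p_new, alpha_old + dgamma

      # A uniaxial-type strain state pushed past yield exercises the corrector branch.
      eps = np.diag([2.0e-3, -0.6e-3, -0.6e-3])
      stress, eps_p, alpha = radial_return(eps, np.zeros((3, 3)), 0.0)
      print(np.round(stress / 1e6, 1), round(alpha, 6))  # stress in MPa, accumulated plastic strain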

  8. Environmental science applications with Rapid Integrated Mapping and analysis System (RIMS)

    NASA Astrophysics Data System (ADS)

    Shiklomanov, A.; Prusevich, A.; Gordov, E.; Okladnikov, I.; Titov, A.

    2016-11-01

    The Rapid Integrated Mapping and analysis System (RIMS) has been developed at the University of New Hampshire as an online instrument for multidisciplinary data visualization, analysis and manipulation with a focus on hydrological applications. Recently it was enriched with data and tools to allow more sophisticated analysis of interdisciplinary data. Three different examples of specific scientific applications with RIMS are demonstrated and discussed. Analysis of historical changes in major components of the Eurasian pan-Arctic water budget is based on historical discharge data, gridded observational meteorological fields, and remote sensing data for sea ice area. Express analysis of the extremely hot and dry summer of 2010 across European Russia is performed using a combination of near-real time and historical data to evaluate the intensity and spatial distribution of this event and its socioeconomic impacts. Integrative analysis of hydrological, water management, and population data for Central Asia over the last 30 years provides an assessment of regional water security due to changes in climate, water use and demography. The presented case studies demonstrate the capabilities of RIMS as a powerful instrument for hydrological and coupled human-natural systems research.

  9. Quantitative Architectural Analysis: A New Approach to Cortical Mapping

    ERIC Educational Resources Information Center

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-01-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological…

  10. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends

    PubMed Central

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so called “big data” challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions are multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by
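
    The Map and Reduce tasks described above can be made concrete with the conventional word-count toy example, written in plain Python rather than Hadoop; the record contents are invented, and the explicit shuffle step stands in for what the framework does between the two phases.

      from collections import defaultdict
      from itertools import chain

      def map_task(record):
          """Map phase: emit one (word, 1) pair per token of a (made-up) clinical note."""
          return [(word.lower(), 1) for word in record.split()]

      def shuffle(pairs):
          """Group intermediate values by key; in Hadoop this happens between map and reduce."""
          grouped = defaultdict(list)
          for key, value in pairs:
              grouped[key].append(value)
          return grouped

      def reduce_task(key, values):
          """Reduce phase: combine all values for one key, here by summing the counts."""
          return key, sum(values)

      records = ["patient reports chest pain", "chest x ray ordered", "pain resolved"]
      grouped = shuffle(chain.from_iterable(map_task(r) for r in records))
      print(dict(reduce_task(k, v) for k, v in grouped.items()))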

  11. Applied cartographic communication: map symbolization for atlases.

    USGS Publications Warehouse

    Morrison, J.L.

    1984-01-01

    A detailed investigation of the symbolization used on general-purpose atlas reference maps. It indicates how theories of cartographic communication can be put into practice. Two major points emerge. First, that a logical scheme can be constructed from existing cartographic research and applied to an analysis of the choice of symbolization on a map. Second, the same structure appears to allow the cartographer to specify symbolization as a part of map design. An introductory review of cartographic communication is followed by an analysis of selected maps' usage of point, area and line symbols, boundaries, text and colour usage.-after Author

  12. Fine mapping QTL for drought resistance traits in rice (Oryza sativa L.) using bulk segregant analysis.

    PubMed

    Salunkhe, Arvindkumar Shivaji; Poornima, R; Prince, K Silvas Jebakumar; Kanagaraj, P; Sheeba, J Annie; Amudha, K; Suji, K K; Senthil, A; Babu, R Chandra

    2011-09-01

    Drought stress is a major limitation to rice (Oryza sativa L.) yields and its stability, especially in rainfed conditions. Developing rice cultivars with inherent capacity to withstand drought stress would improve rainfed rice production. Mapping quantitative trait loci (QTLs) linked to drought resistance traits will help to develop rice cultivars suitable for water-limited environments through molecular marker-assisted selection (MAS) strategy. However, QTL mapping is usually carried out by genotyping large number of progenies, which is labour-intensive, time-consuming and cost-ineffective. Bulk segregant analysis (BSA) serves as an affordable strategy for mapping large effect QTLs by genotyping only the extreme phenotypes instead of the entire mapping population. We have previously mapped a QTL linked to leaf rolling and leaf drying in recombinant inbred (RI) lines derived from two locally adapted indica rice ecotypes viz., IR20/Nootripathu using BSA. Fine mapping the QTL will facilitate its application in MAS. BSA was done by bulking DNA of 10 drought-resistant and 12 drought-sensitive RI lines. Out of 343 rice microsatellites markers genotyped, RM8085 co-segregated among the RI lines constituting the respective bulks. RM8085 was mapped in the middle of the QTL region on chromosome 1 previously identified in these RI lines thus reducing the QTL interval from 7.9 to 3.8 cM. Further, the study showed that the region, RM212-RM302-RM8085-RM3825 on chromosome 1, harbours large effect QTLs for drought-resistance traits across several genetic backgrounds in rice. Thus, the QTL may be useful for drought resistance improvement in rice through MAS and map-based cloning.

  13. EnGeoMAP - geological applications within the EnMAP hyperspectral satellite science program

    NASA Astrophysics Data System (ADS)

    Boesche, N. K.; Mielke, C.; Rogass, C.; Guanter, L.

    2016-12-01

    Hyperspectral investigations from near field to space substantially contribute to geological exploration and to the mining monitoring of raw material and mineral deposits. Due to their spectral characteristics, large mineral occurrences and minefields can be identified from space and the spatial distribution of distinct proxy minerals can be mapped. Within the framework of the EnMAP hyperspectral satellite science program, a mineral and elemental mapping tool was developed - the EnGeoMAP. It contains a basic mineral mapping and a rare earth element mapping approach. This study shows the performance of EnGeoMAP based on simulated EnMAP data of the rare earth element bearing Mountain Pass Carbonatite Complex, USA, and the Rodalquilar and Lomilla Calderas, Spain, which host economically relevant gold-silver, lead-zinc-silver-gold and alunite deposits. The Mountain Pass image data were simulated on the basis of AVIRIS Next Generation images, while the Rodalquilar data are based on HyMap images. The EnGeoMAP - Base approach was applied to both images, while the Mountain Pass image data were additionally analysed using the EnGeoMAP - REE software tool. The results are mineral and elemental maps that serve as proxies for the regional lithology and deposit types. The validation of the maps is based on chemical analyses of field samples. Current airborne sensors meet the spatial and spectral requirements for detailed mineral mapping, and future hyperspectral spaceborne missions will additionally provide large coverage. For those hyperspectral missions, EnGeoMAP is a rapid data analysis tool that is provided to spectral geologists working in mineral exploration.

  14. The importance of source area mapping for rockfall hazard analysis

    NASA Astrophysics Data System (ADS)

    Valagussa, Andrea; Frattini, Paolo; Crosta, Giovanni B.

    2013-04-01

    A problem in the characterization of the area affected by rockfall is the correct definition of the source areas. Different positions or sizes of the source areas along a cliff result in different possibilities of propagation and diverse interactions with the passive countermeasures present in the area. Through the use of Hy-Stone (Crosta et al., 2004), a code able to perform 3D numerical modeling of rockfall processes, different types of source areas were tested on a case study slope along the western flank of the Mt. de La Saxe (Courmayeur, AO), extending between 1200 and 2055 m a.s.l. The first set of source areas consists of unstable rock masses identified on the basis of field survey and Terrestrial Laser Scanning (IMAGEO, 2011). A second set of source areas has been identified by using different thresholds of slope gradient. We tested slope thresholds between 50° and 75° at 5° intervals. The third source area dataset has been generated by performing a kinematic stability analysis. For this analysis, we mapped the joint sets along the rocky cliff by means of the software COLTOP 3D (Jaboyedoff, 2004), and then we identified the portions of the rocky cliff where planar/wedge and toppling failures are possible, assuming an average friction angle of 35°. From the outputs of the Hy-Stone models we extracted and analyzed the kinetic energy, flight height and velocity of the blocks falling along the rocky cliff in order to compare the controls exerted by the different source areas. We observed strong variations of kinetic energy and flight height among the different models, especially when using unstable masses identified through Terrestrial Laser Scanning. This is mainly related to the size of the blocks identified as susceptible to failure. On the contrary, the slope gradient thresholds do not have a strong impact on rockfall propagation. This contribution highlights the importance of a careful and appropriate mapping of rockfall source areas for rockfall hazard analysis and the
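
    The second source-area dataset described above amounts to thresholding a slope raster at 50-75° in 5° steps. A minimal sketch of that step is given below, assuming a gridded DEM and cell size; the toy DEM, the finite-difference slope estimate and all values are illustrative, not the Hy-Stone or COLTOP 3D workflow.

```python
import numpy as np

def slope_degrees(dem, cell_size):
    """Approximate slope (degrees) of a gridded DEM from finite differences."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

def source_area_masks(dem, cell_size, thresholds=range(50, 80, 5)):
    """One boolean source-area mask per slope-gradient threshold (50-75 deg)."""
    slope = slope_degrees(dem, cell_size)
    return {t: slope >= t for t in thresholds}

# Toy sloping DEM (5 m cells) standing in for the real cliff topography.
rng = np.random.default_rng(0)
rows, cols = np.mgrid[0:100, 0:100]
dem = 1200.0 + 8.0 * rows + rng.normal(0.0, 3.0, size=(100, 100))

masks = source_area_masks(dem, cell_size=5.0)
print({t: int(m.sum()) for t, m in masks.items()})  # cells above each threshold
```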

  15. Mapping Sleeping Bees within Their Nest: Spatial and Temporal Analysis of Worker Honey Bee Sleep

    PubMed Central

    Klein, Barrett Anthony; Stiegler, Martin; Klein, Arno; Tautz, Jürgen

    2014-01-01

    Patterns of behavior within societies have long been visualized and interpreted using maps. Mapping the occurrence of sleep across individuals within a society could offer clues as to functional aspects of sleep. In spite of this, a detailed spatial analysis of sleep has never been conducted on an invertebrate society. We introduce the concept of mapping sleep across an insect society, and provide an empirical example, mapping sleep patterns within colonies of European honey bees (Apis mellifera L.). Honey bees face variables such as temperature and position of resources within their colony's nest that may impact their sleep. We mapped sleep behavior and temperature of worker bees and produced maps of their nest's comb contents as the colony grew and contents changed. By following marked bees, we discovered that individuals slept in many locations, but bees of different worker castes slept in different areas of the nest relative to position of the brood and surrounding temperature. Older worker bees generally slept outside cells, closer to the perimeter of the nest, in colder regions, and away from uncapped brood. Younger worker bees generally slept inside cells and closer to the center of the nest, and spent more time asleep than awake when surrounded by uncapped brood. The average surface temperature of sleeping foragers was lower than the surface temperature of their surroundings, offering a possible indicator of sleep for this caste. We propose mechanisms that could generate caste-dependent sleep patterns and discuss functional significance of these patterns. PMID:25029445

  16. Hydrology Analysis and Modelling for Klang River Basin Flood Hazard Map

    NASA Astrophysics Data System (ADS)

    Sidek, L. M.; Rostam, N. E.; Hidayah, B.; Roseli, ZA; Majid, W. H. A. W. A.; Zahari, N. Z.; Salleh, S. H. M.; Ahmad, R. D. R.; Ahmad, M. N.

    2016-03-01

    Flooding, a common environmental hazard worldwide, has increased in recent times as a result of climate change and urbanization, with the effects felt most strongly in developing countries. Consequently, the exposure of Tenaga Nasional Berhad (TNB) substations to flooding has increased rapidly, because many existing substations are located in flood-prone areas. To understand the impact of flooding on its substations, TNB has adopted non-structural mitigation by integrating flood hazard maps with its substation planning. Hydrological analysis is the important part of this work, providing the runoff that serves as input for the hydraulic analysis.

  17. US Topo—Topographic maps for the Nation

    USGS Publications Warehouse

    Fishburn, Kristin A.; Carswell, William J.

    2017-06-23

    Building on the success of 125 years of mapping, the U.S. Geological Survey created US Topo, a georeferenced digital map produced from The National Map data. US Topo maps are designed to be used like the traditional 7.5-minute quadrangle paper topographic maps for which the U.S. Geological Survey is so well known. However, in contrast to paper-based maps, US Topo maps provide modern technological advantages that support faster, wider public distribution and basic, onscreen geospatial analysis, including the georeferencing capability to display the ground coordinate location as the user moves the cursor around the map.

  18. Analysis of rocket beacon transmissions for computerized reconstruction of ionospheric densities

    NASA Technical Reports Server (NTRS)

    Bernhardt, P. A.; Huba, J. D.; Chaturvedi, P. K.; Fulford, J. A.; Forsyth, P. A.; Anderson, D. N.; Zalesak, S. T.

    1993-01-01

    Three methods are described to obtain ionospheric electron densities from transionospheric, rocket-beacon TEC data. First, when the line-of-sight from a ground receiver to the rocket beacon is tangent to the flight trajectory, the electron concentration can be obtained by differentiating the TEC with respect to the distance to the rocket. A similar method may be used to obtain the electron-density profile if the layer is horizontally stratified. Second, TEC data obtained during chemical release experiments may be interpreted with the aid of physical models of the disturbed ionosphere to yield spatial maps of the modified regions. Third, computerized tomography (CT) can be used to analyze TEC data obtained along a chain of ground-based receivers aligned along the plane of the rocket trajectory. CT analysis of TEC data is used to reconstruct a 2D image of a simulated equatorial plume. TEC data is computed for a linear chain of nine receivers with adjacent spacings of either 100 or 200 km. The simulation data are analyzed to provide an F region reconstruction on a grid with 15 x 15 km pixels. Ionospheric rocket tomography may also be applied to rocket-assisted measurements of amplitude and phase scintillations and airglow intensities.
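
    The first method, differentiating TEC with respect to the receiver-rocket distance to recover electron concentration, can be sketched in a few lines. The profile below is synthetic and the geometry is idealized (line of sight assumed tangent to the trajectory); it illustrates the derivative step only.

```python
import numpy as np

# Hypothetical samples: receiver-to-rocket distance (km) and slant TEC along
# each line of sight (electrons per m^2); both profiles are synthetic.
s_km = np.linspace(100.0, 400.0, 61)
tec = 1e16 * (1.0 - np.exp(-(s_km - 100.0) / 120.0))

# Method 1 above: electron concentration from the derivative of TEC with
# respect to the distance to the rocket (line of sight tangent to trajectory).
n_e = np.gradient(tec, s_km * 1e3)  # electrons per m^3
print(n_e[:3])
```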

  19. Metrics for comparison of crystallographic maps

    DOE PAGES

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Lunin, Vladimir Y.; ...

    2014-10-01

    Numerical comparison of crystallographic contour maps is used extensively in structure solution and model refinement, analysis and validation. However, traditional metrics such as the map correlation coefficient (map CC, real-space CC or RSCC) sometimes contradict the results of visual assessment of the corresponding maps. This article explains such apparent contradictions and suggests new metrics and tools to compare crystallographic contour maps. The key to the new methods is rank scaling of the Fourier syntheses. The new metrics are complementary to the usual map CC and can be more helpful in map comparison, in particular when only some of their aspects, such as regions of high density, are of interest.
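
    A simplified illustration of the rank-scaling idea is to rank-transform two maps before correlating them. The sketch below does exactly that; it is a stand-in for the general approach, not the authors' exact metrics.

```python
import numpy as np
from scipy.stats import rankdata, pearsonr

def rank_scaled_cc(map_a, map_b):
    """Correlation of two density maps after rank-scaling each to (0, 1].

    A simplified stand-in for the rank-based comparison described above,
    not the authors' exact formulation.
    """
    ra = rankdata(map_a.ravel()) / map_a.size
    rb = rankdata(map_b.ravel()) / map_b.size
    return pearsonr(ra, rb)[0]

rng = np.random.default_rng(1)
rho = rng.normal(size=(32, 32, 32))                # toy map on a 32^3 grid
noisy = rho + 0.1 * rng.normal(size=rho.shape)     # slightly perturbed copy
print(rank_scaled_cc(rho, noisy))
```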

  20. QuickMap: a public tool for large-scale gene therapy vector insertion site mapping and analysis.

    PubMed

    Appelt, J-U; Giordano, F A; Ecker, M; Roeder, I; Grund, N; Hotz-Wagenblatt, A; Opelz, G; Zeller, W J; Allgayer, H; Fruehauf, S; Laufs, S

    2009-07-01

    Several events of insertional mutagenesis in pre-clinical and clinical gene therapy studies have created intense interest in assessing the genomic insertion profiles of gene therapy vectors. For the construction of such profiles, vector-flanking sequences detected by inverse PCR, linear amplification-mediated-PCR or ligation-mediated-PCR need to be mapped to the host cell's genome and compared to a reference set. Although remarkable progress has been achieved in mapping gene therapy vector insertion sites, public reference sets are lacking, as are the possibilities to quickly detect non-random patterns in experimental data. We developed a tool termed QuickMap, which uniformly maps and analyzes human and murine vector-flanking sequences within seconds (available at www.gtsg.org). Besides information about hits in chromosomes and fragile sites, QuickMap automatically determines insertion frequencies in +/- 250 kb adjacency to genes, cancer genes, pseudogenes, transcription factor and (post-transcriptional) miRNA binding sites, CpG islands and repetitive elements (short interspersed nuclear elements (SINE), long interspersed nuclear elements (LINE), Type II elements and LTR elements). Additionally, all experimental frequencies are compared with the data obtained from a reference set, containing 1 000 000 random integrations ('random set'). Thus, for the first time a tool allowing high-throughput profiling of gene therapy vector insertion sites is available. It provides a basis for large-scale insertion site analyses, which is now urgently needed to discover novel gene therapy vectors with 'safe' insertion profiles.

  1. Coastal habitat mapping in the Aegean Sea using high resolution orthophoto maps

    NASA Astrophysics Data System (ADS)

    Topouzelis, Konstantinos; Papakonstantinou, Apostolos; Doukari, Michaela; Stamatis, Panagiotis; Makri, Despina; Katsanevakis, Stelios

    2017-09-01

    The significance of coastal habitat mapping lies in the need to protect coastal habitats from anthropogenic interventions and other pressures. Until 2015, Landsat-8 (30 m) imagery was used as medium spatial resolution satellite imagery. Sentinel-2 satellite imagery is now very useful for more detailed regional-scale mapping. However, the use of high-resolution orthophoto maps derived from UAV data is expected to improve mapping accuracy further, owing to the fine spatial resolution of the orthophoto maps (30 cm). This paper outlines the integration of UAS for data acquisition and a Structure from Motion (SfM) pipeline for the visualization of selected coastal areas in the Aegean Sea. Additionally, the produced orthophoto maps were analyzed through object-based image analysis (OBIA) and nearest-neighbor classification for mapping the coastal habitats. Classification classes included the main general habitat types, i.e. seagrass, soft bottom, and hard bottom. The developed methodology was applied at Koumbara beach (Ios Island, Greece). Results showed that UAS data revealed the sub-bottom complexity in large shallow areas, since they provide this information at a spatial resolution that permits the mapping of seagrass meadows in extreme detail. The produced habitat vectors are ideal as reference data for studies using satellite data of lower spatial resolution.

  2. CLIP-seq analysis of multi-mapped reads discovers novel functional RNA regulatory sites in the human transcriptome.

    PubMed

    Zhang, Zijun; Xing, Yi

    2017-09-19

    Crosslinking or RNA immunoprecipitation followed by sequencing (CLIP-seq or RIP-seq) allows transcriptome-wide discovery of RNA regulatory sites. As CLIP-seq/RIP-seq reads are short, existing computational tools focus on uniquely mapped reads, while reads mapped to multiple loci are discarded. We present CLAM (CLIP-seq Analysis of Multi-mapped reads). CLAM uses an expectation-maximization algorithm to assign multi-mapped reads and calls peaks combining uniquely and multi-mapped reads. To demonstrate the utility of CLAM, we applied it to a wide range of public CLIP-seq/RIP-seq datasets involving numerous splicing factors, microRNAs and m6A RNA methylation. CLAM recovered a large number of novel RNA regulatory sites inaccessible by uniquely mapped reads. The functional significance of these sites was demonstrated by consensus motif patterns and association with alternative splicing (splicing factors), transcript abundance (AGO2) and mRNA half-life (m6A). CLAM provides a useful tool to discover novel protein-RNA interactions and RNA modification sites from CLIP-seq and RIP-seq data, and reveals the significant contribution of repetitive elements to the RNA regulatory landscape of the human transcriptome. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
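
    The core idea of assigning multi-mapped reads by expectation-maximization can be sketched generically: reads are split across their candidate loci in proportion to current locus abundances, which are then re-estimated. The toy loop below illustrates that idea only and is not the CLAM implementation.

```python
import numpy as np

def em_assign(read_loci, n_loci, n_iter=50):
    """Fractionally assign multi-mapped reads to loci with a simple EM loop.

    read_loci: one list of candidate-locus indices per read.
    Returns estimated read counts per locus. Generic sketch, not CLAM itself.
    """
    counts = np.ones(n_loci)                       # uniform initial abundances
    for _ in range(n_iter):
        new_counts = np.zeros(n_loci)
        for loci in read_loci:                     # E-step: split each read
            w = counts[loci] / counts[loci].sum()  # by current locus weights
            new_counts[loci] += w
        counts = new_counts                        # M-step: update abundances
    return counts

# Three reads: one unique to locus 0, two ambiguous between loci 0 and 1.
print(em_assign([[0], [0, 1], [0, 1]], n_loci=2))
```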

  3. Heat demand mapping and district heating grid expansion analysis: Case study of Velika Gorica

    NASA Astrophysics Data System (ADS)

    Dorotić, Hrvoje; Novosel, Tomislav; Duić, Neven; Pukšec, Tomislav

    2017-10-01

    Highly efficient cogeneration and district heating systems have significant potential for primary energy savings and the reduction of greenhouse gas emissions through the utilization of waste heat and renewable energy sources. These potentials are still highly underutilized in most European countries. They also play a key role in the planning of future energy systems because they facilitate greater integration of intermittent renewable energy sources, for example wind and solar, in combination with power-to-heat technologies. In order to ensure optimal levels of district heating penetration into an energy system, a comprehensive analysis is necessary to determine the actual demands and the potential energy supply. An economic analysis of grid expansion using GIS-based mapping methods has not been demonstrated so far. This paper presents a heat demand mapping methodology and the use of its output for district heating network expansion analysis. The results show that more than 59% of the heat demand in the city of Velika Gorica could be covered by district heating, twice the present share. The most important reason for district heating's unfulfilled potential is the already existing natural gas infrastructure.

  4. Diffusion maps for high-dimensional single-cell analysis of differentiation data.

    PubMed

    Haghverdi, Laleh; Buettner, Florian; Theis, Fabian J

    2015-09-15

    Single-cell technologies have recently gained popularity in cellular differentiation studies owing to their ability to resolve potential heterogeneities in cell populations. Analyzing such high-dimensional single-cell data has its own statistical and computational challenges. Popular multivariate approaches are based on data normalization, followed by dimension reduction and clustering to identify subgroups. However, in the case of cellular differentiation, we would not expect clear clusters to be present but instead expect the cells to follow continuous branching lineages. Here, we propose the use of diffusion maps to deal with the problem of defining differentiation trajectories. We adapt this method to single-cell data by adequate choice of kernel width and inclusion of uncertainties or missing measurement values, which enables the establishment of a pseudotemporal ordering of single cells in a high-dimensional gene expression space. We expect this output to reflect cell differentiation trajectories, where the data originates from intrinsic diffusion-like dynamics. Starting from a pluripotent stage, cells move smoothly within the transcriptional landscape towards more differentiated states with some stochasticity along their path. We demonstrate the robustness of our method with respect to extrinsic noise (e.g. measurement noise) and sampling density heterogeneities on simulated toy data as well as two single-cell quantitative polymerase chain reaction datasets (i.e. mouse haematopoietic stem cells and mouse embryonic stem cells) and an RNA-Seq dataset of human pre-implantation embryos. We show that diffusion maps perform considerably better than Principal Component Analysis and are advantageous over other techniques for non-linear dimension reduction such as t-distributed Stochastic Neighbour Embedding for preserving the global structures and pseudotemporal ordering of cells. The Matlab implementation of diffusion maps for single-cell data is available at https://www.helmholtz-muenchen.de/icb/single-cell-diffusion-map
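
    A bare-bones diffusion-map embedding, stripped of the kernel-width selection and uncertainty handling described above, can be written in a few lines: build a Gaussian kernel on pairwise distances, row-normalize it into a Markov matrix, and keep the leading non-trivial eigenvectors. The sketch below uses toy data and fixed parameters.

```python
import numpy as np

def diffusion_map(X, sigma=1.0, n_components=2):
    """Minimal diffusion-map embedding: Gaussian kernel, row-normalised
    transition matrix, then the leading non-trivial eigenvectors."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    K = np.exp(-d2 / (2.0 * sigma ** 2))                 # Gaussian kernel
    P = K / K.sum(axis=1, keepdims=True)                 # Markov transition matrix
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-evals.real)
    return evecs.real[:, order[1:n_components + 1]]      # skip trivial eigenvector

X = np.random.default_rng(2).normal(size=(200, 10))      # toy "cells x genes" matrix
print(diffusion_map(X).shape)                            # (200, 2)
```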

  5. Analyzing the Scientific Evolution of Social Work Using Science Mapping

    ERIC Educational Resources Information Center

    Martínez, Ma Angeles; Cobo, Manuel Jesús; Herrera, Manuel; Herrera-Viedma, Enrique

    2015-01-01

    Objectives: This article reports the first science mapping analysis of the social work field, which shows its conceptual structure and scientific evolution. Methods: Science Mapping Analysis Software Tool, a bibliometric science mapping tool based on co-word analysis and h-index, is applied using a sample of 18,794 research articles published from…

  6. Geologic Map and Map Database of Eastern Sonoma and Western Napa Counties, California

    USGS Publications Warehouse

    Graymer, R.W.; Brabb, E.E.; Jones, D.L.; Barnes, J.; Nicholson, R.S.; Stamski, R.E.

    2007-01-01

    Introduction: This report contains a new 1:100,000-scale geologic map, derived from a set of geologic map databases (Arc-Info coverages) containing information at 1:62,500-scale resolution, and a new description of the geologic map units and structural relations in the map area. Prepared as part of the San Francisco Bay Region Mapping Project, the study area includes the north-central part of the San Francisco Bay region, and forms the final piece of the effort to generate new, digital geologic maps and map databases for an area which includes Alameda, Contra Costa, Marin, Napa, San Francisco, San Mateo, Santa Clara, Santa Cruz, Solano, and Sonoma Counties. Geologic mapping in Lake County in the north-central part of the map extent was not within the scope of the Project. The map and map database integrate both previously published reports and new geologic mapping and field checking by the authors (see Sources of Data index map on the map sheet or the Arc-Info coverage eswn-so and the textfile eswn-so.txt). This report contains new ideas about the geologic structures in the map area, including the active San Andreas Fault system, as well as the geologic units and their relations. Together, the map (or map database) and the unit descriptions in this report describe the composition, distribution, and orientation of geologic materials and structures within the study area at regional scale. Regional geologic information is important for analysis of earthquake shaking, liquefaction susceptibility, landslide susceptibility, engineering materials properties, mineral resources and hazards, as well as groundwater resources and hazards. These data also assist in answering questions about the geologic history and development of the California Coast Ranges.

  7. Automated Glacier Mapping using Object Based Image Analysis. Case Studies from Nepal, the European Alps and Norway

    NASA Astrophysics Data System (ADS)

    Vatle, S. S.

    2015-12-01

    Frequent and up-to-date glacier outlines are needed for many applications of glaciology, not only glacier area change analysis, but also for masks in volume or velocity analysis, for the estimation of water resources and as model input data. Remote sensing offers a good option for creating glacier outlines over large areas, but manual correction is frequently necessary, especially in areas containing supraglacial debris. We show three different workflows for mapping clean ice and debris-covered ice within Object Based Image Analysis (OBIA). By working at the object level as opposed to the pixel level, OBIA facilitates using contextual, spatial and hierarchical information when assigning classes, and additionally permits the handling of multiple data sources. Our first example shows mapping debris-covered ice in the Manaslu Himalaya, Nepal. SAR Coherence data is used in combination with optical and topographic data to classify debris-covered ice, obtaining an accuracy of 91%. Our second example shows using a high-resolution LiDAR derived DEM over the Hohe Tauern National Park in Austria. Breaks in surface morphology are used in creating image objects; debris-covered ice is then classified using a combination of spectral, thermal and topographic properties. Lastly, we show a completely automated workflow for mapping glacier ice in Norway. The NDSI and NIR/SWIR band ratio are used to map clean ice over the entire country but the thresholds are calculated automatically based on a histogram of each image subset. This means that in theory any Landsat scene can be inputted and the clean ice can be automatically extracted. Debris-covered ice can be included semi-automatically using contextual and morphological information.
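
    The automated clean-ice step described above pairs a NIR/SWIR band ratio with a threshold derived from the image histogram. The sketch below uses Otsu's method as one plausible histogram-based threshold; the choice of Otsu and the synthetic reflectance arrays are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Histogram-based (Otsu) threshold: maximise between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                       # weight of the lower class
    w1 = 1.0 - w0                           # weight of the upper class
    cum_mean = np.cumsum(p * centers)
    mu0 = cum_mean / np.clip(w0, 1e-12, None)
    mu1 = (cum_mean[-1] - cum_mean) / np.clip(w1, 1e-12, None)
    return centers[np.argmax(w0 * w1 * (mu0 - mu1) ** 2)]

rng = np.random.default_rng(3)
nir = rng.uniform(0.05, 0.6, size=(200, 200))     # toy reflectance bands
swir = rng.uniform(0.01, 0.4, size=(200, 200))
ratio = nir / swir                                # NIR/SWIR band ratio
ice_mask = ratio > otsu_threshold(ratio.ravel())  # automatic clean-ice threshold
print(ice_mask.mean())
```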

  8. Analysis of Fundus Shape in Highly Myopic Eyes by Using Curvature Maps Constructed from Optical Coherence Tomography

    PubMed Central

    Miyake, Masahiro; Yamashiro, Kenji; Akagi-Kurashige, Yumiko; Oishi, Akio; Tsujikawa, Akitaka; Hangai, Masanori; Yoshimura, Nagahisa

    2014-01-01

    Purpose To evaluate fundus shape in highly myopic eyes using color maps created through optical coherence tomography (OCT) image analysis. Methods We retrospectively evaluated 182 highly myopic eyes from 113 patients. After obtaining 12 lines of 9-mm radial OCT scans with the fovea at the center, the Bruch’s membrane line was plotted and its curvature was measured at 1-µm intervals in each image, which was reflected as a color topography map. For the quantitative analysis of the eye shape, mean absolute curvature and variance of curvature were calculated. Results The color maps allowed staphyloma visualization as a ring of green color at the edge and as that of orange-red color at the bottom. Analyses of mean and variance of curvature revealed that eyes with myopic choroidal neovascularization tended to have relatively flat posterior poles with smooth surfaces, while eyes with chorioretinal atrophy exhibited a steep, curved shape with an undulated surface (P<0.001). Furthermore, eyes with staphylomas and those without clearly differed in terms of mean curvature and the variance of curvature: 98.4% of eyes with staphylomas had mean curvature ≥7.8×10−5 [1/µm] and variance of curvature ≥0.26×10−8 [1/µm]. Conclusions We established a novel method to analyze posterior pole shape by using OCT images to construct curvature maps. Our quantitative analysis revealed that fundus shape is associated with myopic complications. These values were also effective in distinguishing eyes with staphylomas from those without. This tool for the quantitative evaluation of eye shape should facilitate future research of myopic complications. PMID:25259853
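
    The curvature measurement can be illustrated with the standard formula κ = y'' / (1 + y'²)^(3/2) applied to a sampled boundary profile, followed by the two summary statistics used above. The profile, sampling interval and values below are toy assumptions.

```python
import numpy as np

def curvature_stats(y, dx):
    """Signed curvature kappa = y'' / (1 + y'^2)^(3/2) along a sampled boundary
    line, plus the two summary statistics used above."""
    dy = np.gradient(y, dx)
    d2y = np.gradient(dy, dx)
    kappa = d2y / (1.0 + dy ** 2) ** 1.5
    return kappa, np.mean(np.abs(kappa)), np.var(kappa)

x = np.linspace(-4500.0, 4500.0, 901)   # microns across a toy 9-mm scan line
y = 2.0e-5 * x ** 2                     # toy parabolic fundus profile
kappa, mean_abs, var = curvature_stats(y, dx=x[1] - x[0])
print(mean_abs, var)
```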

  9. Web-based network analysis and visualization using CellMaps

    PubMed Central

    Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín

    2016-01-01

    Summary: CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed in high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using an available JavaScript API. Availability and Implementation: The application is available at: http://cellmaps.babelomics.org/ and the code can be found in: https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27296979

  10. Web-based network analysis and visualization using CellMaps.

    PubMed

    Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín

    2016-10-01

    CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed in high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using an available JavaScript API. The application is available at: http://cellmaps.babelomics.org/ and the code can be found in: https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  11. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1993-01-01

    The SSME has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) Develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system. (2) Develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert tremendous amounts of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow a minimal storage requirement, while providing fast signature retrieval, pattern comparison, and identification capabilities. (3) Integrate the nonlinear correlation techniques into the CSTDB data base with a compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for quick signature comparison. This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate

  12. Molecular mapping and breeding with microsatellite markers.

    PubMed

    Lightfoot, David A; Iqbal, Muhammad J

    2013-01-01

    In genetics databases for crop plant species across the world, there are thousands of mapped loci that underlie quantitative traits, oligogenic traits, and simple traits recognized by association mapping in populations. The number of loci will increase as new phenotypes are measured in more diverse genotypes and genetic maps based on saturating numbers of markers are developed. A period of locus reevaluation will decrease the number of important loci as those underlying mega-environmental effects are recognized. A second wave of reevaluation of loci will follow from developmental series analysis, especially for harvest traits like seed yield and composition. Breeding methods to properly use the accurate maps of QTL are being developed. New methods to map, fine map, and isolate the genes underlying the loci will be critical to future advances in crop biotechnology. Microsatellite markers are the most useful tool for breeders. They are codominant, abundant in all genomes, highly polymorphic so useful in many populations, and both economical and technically easy to use. The selective genotyping approaches, including genotype ranking (indexing) based on partial phenotype data combined with favorable allele data and bulked segregation event (segregant) analysis (BSA), will be increasingly important uses for microsatellites. Examples of the methods for developing and using microsatellites derived from genomic sequences are presented for monogenic, oligogenic, and polygenic traits. Examples of successful mapping, fine mapping, and gene isolation are given. When combined with high-throughput methods for genotyping and a genome sequence, the use of association mapping with microsatellite markers will provide critical advances in the analysis of crop traits.

  13. Analysis of spatial distribution of land cover maps accuracy

    NASA Astrophysics Data System (ADS)

    Khatami, R.; Mountrakis, G.; Stehman, S. V.

    2017-12-01

    Land cover maps have become one of the most important products of remote sensing science. However, classification errors will exist in any classified map and affect the reliability of subsequent map usage. Moreover, classification accuracy often varies over different regions of a classified map. These variations of accuracy will affect the reliability of subsequent analyses of different regions based on the classified maps. The traditional approach of map accuracy assessment based on an error matrix does not capture the spatial variation in classification accuracy. Here, per-pixel accuracy prediction methods are proposed based on interpolating accuracy values from a test sample to produce wall-to-wall accuracy maps. Different accuracy prediction methods were developed based on four factors: predictive domain (spatial versus spectral), interpolation function (constant, linear, Gaussian, and logistic), incorporation of class information (interpolating each class separately versus grouping them together), and sample size. The spectral domain was used as an explanatory feature space for classification accuracy interpolation for the first time in this research. Performance of the prediction methods was evaluated using 26 test blocks, with 10 km × 10 km dimensions, dispersed throughout the United States, and was quantified using the area under the curve (AUC) of the receiver operating characteristic. Relative to existing accuracy prediction methods, our proposed methods resulted in improvements of AUC of 0.15 or greater. Evaluation of the four factors comprising the accuracy prediction methods demonstrated that: i) interpolations should be done separately for each class instead of grouping all classes together; ii) if an all-classes approach is used, the spectral domain will result in substantially greater AUC than the spatial domain; iii) for the smaller sample size and per-class predictions, the spectral and spatial domain
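
    As a simplified, spatial-domain-only illustration of the approach, the sketch below interpolates 0/1 reference-sample correctness to unsampled locations with inverse-distance weighting and scores the prediction with a rank-based AUC. The interpolation function, sample layout and labels are toy assumptions, not the methods compared in the study.

```python
import numpy as np

def idw_accuracy(sample_xy, sample_correct, grid_xy, power=2.0):
    """Inverse-distance interpolation of 0/1 sample correctness to new locations."""
    d = np.linalg.norm(grid_xy[:, None, :] - sample_xy[None, :, :], axis=-1)
    w = 1.0 / np.clip(d, 1e-9, None) ** power
    return (w * sample_correct).sum(axis=1) / w.sum(axis=1)

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum identity (ties ignored)."""
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(4)
samples = rng.uniform(0, 10_000, size=(300, 2))      # sample locations (m)
correct = (rng.uniform(size=300) < 0.8).astype(int)  # 0/1 agreement labels
grid = rng.uniform(0, 10_000, size=(500, 2))         # held-out pixel locations
truth = (rng.uniform(size=500) < 0.8).astype(int)    # held-out 0/1 labels
print(auc(idw_accuracy(samples, correct, grid), truth))
```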

  14. A Multivariate Methodological Workflow for the Analysis of FTIR Chemical Mapping Applied on Historic Paint Stratigraphies

    PubMed Central

    Sciutto, Giorgia; Oliveri, Paolo; Catelli, Emilio; Bonacini, Irene

    2017-01-01

    In the field of applied research in heritage science, the use of multivariate approaches is still quite limited and the chemometric results obtained are often underinterpreted. Within this scenario, the present paper aims to disseminate the use of suitable multivariate methodologies and proposes a procedural workflow, applied to a representative group of case studies of considerable importance for conservation purposes, as a guideline on the processing and interpretation of FTIR data. Initially, principal component analysis (PCA) is performed and the score values are converted into chemical maps. Subsequently, the brushing approach is applied, demonstrating its usefulness for a deep understanding of the relationships between the multivariate map and the PC score space, as well as for the identification of the spectral bands mainly involved in the definition of each area localised within the score maps. PMID:29333162
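
    The first step of the workflow, converting PCA scores into chemical maps, can be sketched by unfolding the spectral cube into a pixels-by-wavenumbers matrix, computing the principal components, and folding the scores back into images. The sketch below uses random data and a plain SVD; array shapes are assumptions.

```python
import numpy as np

def pca_score_maps(cube, n_components=3):
    """Unfold an (ny, nx, n_wavenumbers) cube, compute PCA scores via SVD, and
    fold the leading scores back into images ('chemical maps')."""
    ny, nx, nw = cube.shape
    X = cube.reshape(ny * nx, nw)
    Xc = X - X.mean(axis=0)                       # mean-centre the pixel spectra
    U, S, _ = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    return scores.reshape(ny, nx, n_components)

cube = np.random.default_rng(5).normal(size=(40, 40, 300))  # toy FTIR mapping cube
print(pca_score_maps(cube).shape)                           # (40, 40, 3)
```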

  15. Participatory Development and Analysis of a Fuzzy Cognitive Map of the Establishment of a Bio-Based Economy in the Humber Region

    PubMed Central

    Penn, Alexandra S.; Knight, Christopher J. K.; Lloyd, David J. B.; Avitabile, Daniele; Kok, Kasper; Schiller, Frank; Woodward, Amy; Druckman, Angela; Basson, Lauren

    2013-01-01

    Fuzzy Cognitive Mapping (FCM) is a widely used participatory modelling methodology in which stakeholders collaboratively develop a ‘cognitive map’ (a weighted, directed graph), representing the perceived causal structure of their system. This can be directly transformed by a workshop facilitator into simple mathematical models to be interrogated by participants by the end of the session. Such simple models provide thinking tools which can be used for discussion and exploration of complex issues, as well as sense checking the implications of suggested causal links. They increase stakeholder motivation and understanding of whole systems approaches, but cannot be separated from an intersubjective participatory context. Standard FCM methodologies make simplifying assumptions, which may strongly influence results, presenting particular challenges and opportunities. We report on a participatory process, involving local companies and organisations, focussing on the development of a bio-based economy in the Humber region. The initial cognitive map generated consisted of factors considered key for the development of the regional bio-based economy and their directional, weighted, causal interconnections. A verification and scenario generation procedure, to check the structure of the map and suggest modifications, was carried out in a second session. Participants agreed on updates to the original map and described two alternate potential causal structures. In a novel analysis, all map structures were tested using two standard methodologies usually used independently: linear and sigmoidal FCMs, demonstrating some significantly different results alongside some broad similarities. We suggest a development of FCM methodology involving a sensitivity analysis with different mappings and discuss the use of this technique in the context of our case study. Using the results and analysis of our process, we discuss the limitations and benefits of the FCM methodology in this case
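
    The linear and sigmoidal FCM iterations compared above can be sketched as repeated application of the weight matrix followed by a squashing function. The weights, initial state and update convention below are toy assumptions (FCM update conventions vary), intended only to show the two mappings side by side.

```python
import numpy as np

def iterate_fcm(W, x0, squash="sigmoid", n_iter=50, lam=1.0):
    """Iterate x_{t+1} = f(W @ x_t) with a sigmoidal or (clipped) linear f.
    W[i, j] is the weight of the causal link from concept j to concept i."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        a = W @ x
        if squash == "sigmoid":
            x = 1.0 / (1.0 + np.exp(-lam * a))
        else:                                   # "linear": clip to [0, 1]
            x = np.clip(a, 0.0, 1.0)
    return x

# Toy three-concept map; weights and initial activations are illustrative only.
W = np.array([[0.0, 0.6, -0.3],
              [0.4, 0.0,  0.5],
              [0.0, 0.7,  0.0]])
x0 = [0.5, 0.5, 0.5]
print(iterate_fcm(W, x0, "sigmoid"))
print(iterate_fcm(W, x0, "linear"))
```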

  16. Dissociable effects of reward and expectancy during evaluative feedback processing revealed by topographic ERP mapping analysis.

    PubMed

    Gheza, Davide; Paul, Katharina; Pourtois, Gilles

    2017-11-24

    Evaluative feedback provided during performance monitoring (PM) elicits either a positive or negative deflection ~250-300ms after its onset in the event-related potential (ERP) depending on whether the outcome is reward-related or not, as well as expected or not. However, it remains currently unclear whether these two deflections reflect a unitary process, or rather dissociable effects arising from non-overlapping brain networks. To address this question, we recorded 64-channel EEG in healthy adult participants performing a standard gambling task where valence and expectancy were manipulated in a factorial design. We analyzed the feedback-locked ERP data using a conventional ERP analysis, as well as an advanced topographic ERP mapping analysis supplemented with distributed source localization. Results reveal two main topographies showing opposing valence effects, and being differently modulated by expectancy. The first one was short-lived and sensitive to no-reward irrespective of expectancy. Source-estimation associated with this topographic map comprised mainly regions of the dorsal anterior cingulate cortex. The second one was primarily driven by reward, had a prolonged time-course and was monotonically influenced by expectancy. Moreover, this reward-related topographical map was best accounted for by intracranial generators estimated in the posterior cingulate cortex. These new findings suggest the existence of dissociable brain systems depending on feedback valence and expectancy. More generally, they inform about the added value of using topographic ERP mapping methods, besides conventional ERP measurements, to characterize qualitative changes occurring in the spatio-temporal dynamic of reward processing during PM. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Thermal Analysis and Microhardness Mapping in Hybrid Laser Welds in a Structural Steel

    DTIC Science & Technology

    2003-01-01

    …conditions. Via the keyhole the laser beam brings about easier ignition of the arc, stabilization of the arc welding process, and penetration of the… with respect to the conventional GMAW or GTAW processes without the need for very close fit-up. This paper will compare an autogenous laser weld to a…

  18. Description and Sensitivity Analysis of the SOLSE/LORE-2 and SAGE III Limb Scattering Ozone Retrieval Algorithms

    NASA Technical Reports Server (NTRS)

    Loughman, R.; Flittner, D.; Herman, B.; Bhartia, P.; Hilsenrath, E.; McPeters, R.; Rault, D.

    2002-01-01

    The SOLSE (Shuttle Ozone Limb Sounding Experiment) and LORE (Limb Ozone Retrieval Experiment) instruments are scheduled for reflight on Space Shuttle flight STS-107 in July 2002. In addition, the SAGE III (Stratospheric Aerosol and Gas Experiment) instrument will begin to make limb scattering measurements during Spring 2002. The optimal estimation technique is used to analyze visible and ultraviolet limb scattered radiances and produce a retrieved ozone profile. The algorithm used to analyze data from the initial flight of the SOLSE/LORE instruments (on Space Shuttle flight STS-87 in November 1997) forms the basis of the current algorithms, with expansion to take advantage of the increased multispectral information provided by SOLSE/LORE-2 and SAGE III. We also present detailed sensitivity analysis for these ozone retrieval algorithms. The primary source of ozone retrieval error is tangent height misregistration (i.e., instrument pointing error), which is relevant throughout the altitude range of interest, and can produce retrieval errors on the order of 10-20 percent due to a tangent height registration error of 0.5 km at the tangent point. Other significant sources of error are sensitivity to stratospheric aerosol and sensitivity to error in the a priori ozone estimate (given assumed instrument signal-to-noise = 200). These can produce errors up to 10 percent for the ozone retrieval at altitudes less than 20 km, but produce little error above that level.

  19. Two-trait-locus linkage analysis: A powerful strategy for mapping complex genetic traits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schork, N.J.; Boehnke, M.; Terwilliger, J.D.

    1993-11-01

    Nearly all diseases mapped to date follow clear Mendelian, single-locus segregation patterns. In contrast, many common familial diseases such as diabetes, psoriasis, several forms of cancer, and schizophrenia are familial and appear to have a genetic component but do not exhibit simple Mendelian transmission. More complex models are required to explain the genetics of these important diseases. In this paper, the authors explore two-trait-locus, two-marker-locus linkage analysis in which two trait loci are mapped simultaneously to separate genetic markers. The authors compare the utility of this approach to standard one-trait-locus, one-marker-locus linkage analysis with and without allowance for heterogeneity. The authors also compare the utility of the two-trait-locus, two-marker-locus analysis to two-trait-locus, one-marker-locus linkage analysis. For common diseases, pedigrees are often bilineal, with disease genes entering via two or more unrelated pedigree members. Since such pedigrees often are avoided in linkage studies, the authors also investigate the relative information content of unilineal and bilineal pedigrees. For the dominant-or-recessive and threshold models that the authors consider, the authors find that two-trait-locus, two-marker-locus linkage analysis can provide substantially more linkage information, as measured by expected maximum lod score, than standard one-trait-locus, one-marker-locus methods, even allowing for heterogeneity, while, for a dominant-or-dominant generating model, one-locus models that allow for heterogeneity extract essentially as much information as the two-trait-locus methods. For these three models, the authors also find that bilineal pedigrees provide sufficient linkage information to warrant their inclusion in such studies. The authors discuss strategies for assessing the significance of the two linkages assumed in two-trait-locus, two-marker-locus models. 37 refs., 1 fig., 4 tabs.

  20. Oregon Cascades Play Fairway Analysis: Faults and Heat Flow maps

    DOE Data Explorer

    Adam Brandt

    2015-11-15

    This submission includes a fault map of the Oregon Cascades and backarc, a probability map of heat flow, and a fault density probability layer. More extensive metadata can be found within each zip file.

  1. Lunar terrain mapping and relative-roughness analysis

    USGS Publications Warehouse

    Rowan, Lawrence C.; McCauley, John F.; Holm, Esther A.

    1971-01-01

    Terrain maps of the equatorial zone (long 70° E.-70° W. and lat 10° N-10° S.) were prepared at scales of 1:2,000,000 and 1:1,000,000 to classify lunar terrain with respect to roughness and to provide a basis for selecting sites for Surveyor and Apollo landings as well as for Ranger and Lunar Orbiter photographs. The techniques that were developed as a result of this effort can be applied to future planetary exploration. By using the best available earth-based observational data and photographs, 1:1,000,000-scale U.S. Geological Survey lunar geologic maps, and U.S. Air Force Aeronautical Chart and Information Center LAC charts, lunar terrain was described by qualitative and quantitative methods and divided into four fundamental classes: maria, terrae, craters, and linear features. Some 35 subdivisions were defined and mapped throughout the equatorial zone, and, in addition, most of the map units were illustrated by photographs. The terrain types were analyzed quantitatively to characterize and order their relative-roughness characteristics. Approximately 150,000 east-west slope measurements made by a photometric technique (photoclinometry) in 51 sample areas indicate that algebraic slope-frequency distributions are Gaussian, and so arithmetic means and standard deviations accurately describe the distribution functions. The algebraic slope-component frequency distributions are particularly useful for rapidly determining relative roughness of terrain. The statistical parameters that best describe relative roughness are the absolute arithmetic mean, the algebraic standard deviation, and the percentage of slope reversal. Statistically derived relative-relief parameters are desirable supplementary measures of relative roughness in the terrae. Extrapolation of relative roughness for the maria was demonstrated using Ranger VII slope-component data and regional maria slope data, as well as the data reported here. It appears that, for some morphologically homogeneous
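
    The three roughness statistics named above can be computed directly from a series of signed slope components, as in the sketch below; the slope values are simulated and the sign-change definition of slope reversal is an assumption about the intended measure.

```python
import numpy as np

def roughness_stats(slopes_deg):
    """Absolute arithmetic mean, algebraic standard deviation, and percentage of
    slope reversal (sign changes between consecutive signed slope components)."""
    s = np.asarray(slopes_deg, dtype=float)
    signs = np.sign(s)
    signs = signs[signs != 0]                      # ignore exactly-zero slopes
    reversals = np.diff(signs) != 0
    pct_reversal = 100.0 * reversals.mean() if reversals.size else 0.0
    return np.mean(np.abs(s)), np.std(s), pct_reversal

slopes = np.random.default_rng(6).normal(0.0, 4.0, size=1000)  # toy east-west slopes
print(roughness_stats(slopes))
```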

  2. Image Analysis for Facility Siting: a Comparison of Lowand High-altitude Image Interpretability for Land Use/land Cover Mapping

    NASA Technical Reports Server (NTRS)

    Borella, H. M.; Estes, J. E.; Ezra, C. E.; Scepan, J.; Tinney, L. R.

    1982-01-01

    For two test sites in Pennsylvania the interpretability of commercially acquired low-altitude and existing high-altitude aerial photography is documented in terms of time, costs, and accuracy for Anderson Level II land use/land cover mapping. Information extracted from the imagery is to be used in the evaluation process for siting energy facilities. Land use/land cover maps were drawn at 1:24,000 scale using commercially flown color infrared photography obtained from the United States Geological Survey's EROS Data Center. Detailed accuracy assessment of the maps generated by manual image analysis was accomplished employing a stratified unaligned sampling design providing adequate class representation. Both 'area-weighted' and 'by-class' accuracies were documented and field-verified. A discrepancy map was also drawn to illustrate differences in classifications between the two map scales. Results show that the 1:24,000 scale map set was more accurate (99% to 94% area-weighted) than the 1:62,500 scale set, especially when sampled by class (96% to 66%). The 1:24,000 scale maps were also more time-consuming and costly to produce, due mainly to higher image acquisition costs.

  3. Fine Mapping on Chromosome 13q32–34 and Brain Expression Analysis Implicates MYO16 in Schizophrenia

    PubMed Central

    Rodriguez-Murillo, Laura; Xu, Bin; Roos, J Louw; Abecasis, Gonçalo R; Gogos, Joseph A; Karayiorgou, Maria

    2014-01-01

    We previously reported linkage of schizophrenia and schizoaffective disorder to 13q32–34 in the European descent Afrikaner population from South Africa. The nature of genetic variation underlying linkage peaks in psychiatric disorders remains largely unknown and both rare and common variants may be contributing. Here, we examine the contribution of common variants located under the 13q32–34 linkage region. We used densely spaced SNPs to fine map the linkage peak region using both a discovery sample of 415 families and a meta-analysis incorporating two additional replication family samples. In a second phase of the study, we use one family-based data set with 237 families and independent case–control data sets for fine mapping of the common variant association signal using HapMap SNPs. We report a significant association with a genetic variant (rs9583277) within the gene encoding for the myosin heavy-chain Myr 8 (MYO16), which has been implicated in neuronal phosphoinositide 3-kinase signaling. Follow-up analysis of HapMap variation within MYO16 in a second set of Afrikaner families and additional case–control data sets of European descent highlighted a region across introns 2–6 as the most likely region to harbor common MYO16 risk variants. Expression analysis revealed a significant increase in the level of MYO16 expression in the brains of schizophrenia patients. Our results suggest that common variation within MYO16 may contribute to the genetic liability to schizophrenia. PMID:24141571

  4. Conformal mapping for multiple terminals

    PubMed Central

    Wang, Weimin; Ma, Wenying; Wang, Qiang; Ren, Hao

    2016-01-01

    Conformal mapping is an important mathematical tool that can be used to solve various physical and engineering problems in many fields, including electrostatics, fluid mechanics, classical mechanics, and transformation optics. It is an accurate and convenient way to solve problems involving two terminals. However, when faced with problems involving three or more terminals, which are more common in practical applications, existing conformal mapping methods apply assumptions or approximations. A general exact method does not exist for a structure with an arbitrary number of terminals. This study presents a conformal mapping method for multiple terminals. Through an accurate analysis of boundary conditions, additional terminals or boundaries are folded into the inner part of a mapped region. The method is applied to several typical situations, and the calculation process is described for two examples of an electrostatic actuator with three electrodes and of a light beam splitter with three ports. Compared with previously reported results, the solutions for the two examples based on our method are more precise and general. The proposed method is helpful in promoting the application of conformal mapping in analysis of practical problems. PMID:27830746

  5. Guided asteroid deflection by kinetic impact: Mapping keyholes to an asteroid's surface

    NASA Astrophysics Data System (ADS)

    Chesley, S.; Farnocchia, D.

    2014-07-01

    The kinetic impactor deflection approach is likely to be the optimal deflection strategy in most real-world cases, given the likelihood of decades of warning time provided by asteroid search programs and the probable small size of the next confirmed asteroid impact that would require deflection. However, despite its straightforward implementation, the kinetic impactor approach can have its effectiveness limited by the astrodynamics that govern the impactor spacecraft trajectory. First, the deflection from an impact is maximized when the asteroid is at perihelion, while an impact near perihelion can in some cases be energetically difficult to implement. Additionally, the asteroid's change in velocity ΔV should be aligned with the target's heliocentric velocity vector in order to maximize the deflection at a potential impact some years in the future. Thus the relative velocity should be aligned with or against the heliocentric velocity, which implies that the impactor and asteroid orbits should be tangent at the point of impact. However, for natural bodies such as meteorites colliding with the Earth, the relative velocity vectors tend to cluster near the sunward or anti-sunward directions, far from the desired direction. This is because there is generally a significant crossing angle between the orbits of the impactor and target, and an impact at tangency is unusual. The point is that hitting the asteroid is not enough; rather, we desire to hit the asteroid at a point where the asteroid and spacecraft orbits are nearly tangent and when the asteroid is near perihelion. However, complicating the analysis is the fact that the impact of a spacecraft on an asteroid would create an ejecta plume that is roughly normal to the surface at the point of impact. This escaping ejecta provides additional momentum transfer that generally adds to the effectiveness of a kinetic deflection. The ratio β between the ejecta momentum and the total momentum (ejecta plus spacecraft) can
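
    Using the abstract's definition of β (ejecta momentum divided by total transferred momentum), the along-track velocity change can be sketched as below. The assumption that the ejecta recoil acts along the spacecraft's relative-velocity direction, and all numbers, are illustrative only.

```python
import numpy as np

def along_track_dv(m_sc, v_rel, m_ast, beta, v_helio_hat):
    """Along-track velocity change of the asteroid from a kinetic impact.

    beta follows the abstract's definition (ejecta momentum / total transferred
    momentum), so total momentum = spacecraft momentum / (1 - beta). The ejecta
    recoil is assumed, for this sketch only, to act along the spacecraft's
    relative-velocity direction."""
    p_sc = m_sc * np.asarray(v_rel, dtype=float)   # spacecraft momentum, kg m/s
    p_total = p_sc / (1.0 - beta)                  # spacecraft + ejecta momentum
    dv = p_total / m_ast                           # asteroid velocity change, m/s
    return float(np.dot(dv, v_helio_hat))          # useful (along-track) component

m_sc = 500.0                   # spacecraft mass, kg (illustrative)
m_ast = 5.0e9                  # asteroid mass, kg (~150 m body, illustrative)
v_rel = [6000.0, 2000.0, 0.0]  # impact velocity relative to the asteroid, m/s
v_hat = [1.0, 0.0, 0.0]        # asteroid heliocentric velocity direction
print(along_track_dv(m_sc, v_rel, m_ast, beta=0.5, v_helio_hat=v_hat))
```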

  6. Landscape patterns from mathematical morphology on maps with contagion

    Treesearch

    Kurt Riitters; Peter Vogt; Pierre Soille; Christine Estreguil

    2009-01-01

    The perceived realism of simulated maps with contagion (spatial autocorrelation) has led to their use for comparing landscape pattern metrics and as habitat maps for modeling organism movement across landscapes. The objective of this study was to conduct a neutral model analysis of pattern metrics defined by morphological spatial pattern analysis (MSPA) on maps with...

  7. Using geologic maps and seismic refraction in pavement-deflection analysis

    DOT National Transportation Integrated Search

    1999-10-01

    The researchers examined the relationship between three data types -- geologic maps, pavement deflection, and seismic refraction data -- from diverse geologic settings to determine whether geologic maps and seismic data might be used to interpret def...

  8. Preclinical medical students’ understandings of academic and medical professionalism: visual analysis of mind maps

    PubMed Central

    Rees, Charlotte E

    2017-01-01

    Introduction Several studies have begun to explore medical students’ understandings of professionalism generally and medical professionalism specifically. Despite espoused relationships between academic (AP) and medical professionalism (MP), previous research has not yet investigated students’ conceptualisations of AP and MP and the relationships between the two. Objectives The current study, based on innovative visual analysis of mind maps, therefore aims to contribute to the developing literature on how professionalism is understood. Methods We performed a multilayered analysis of 98 mind maps from 262 first-year medical students, including analysing textual and graphical elements of AP, MP and the relationships between AP and MP. Results The most common textual attributes of AP were learning, lifestyle and personality, while attributes of MP were knowledge, ethics and patient-doctor relations. Images of books, academic caps and teachers were used most often to represent AP, while images of the stethoscope, doctor and red cross were used to symbolise MP. While AP-MP relations were sometimes indicated through co-occurring text, visual connections and higher-order visual metaphors, many students struggled to articulate the relationships between AP and MP. Conclusions While the mind maps’ textual attributes shared similarities with those found in previous research, suggesting the universality of some professionalism attributes, our study provides new insights into students’ conceptualisations of AP, MP and AP-MP relationships. We encourage medical educators to help students develop their understandings of AP, MP and AP-MP relationships, plus consider the feasibility and value of mind maps as a source of visual data for medical education research. PMID:28821520

  9. Terrain Correction on the moving equal area cylindrical map projection of the surface of a reference ellipsoid

    NASA Astrophysics Data System (ADS)

    Ardalan, A.; Safari, A.; Grafarend, E.

    2003-04-01

    An operational algorithm for computing the ellipsoidal terrain correction, based on the application of the closed-form solution of the Newton integral in terms of Cartesian coordinates on the cylindrical equal-area map projection surface of a reference ellipsoid, has been developed. As a first step, the mapping of points on the surface of a reference ellipsoid onto the cylindrical equal-area map projection of a cylinder tangent to a point on the surface of the reference ellipsoid is closely studied and the map projection formulas are derived. Ellipsoidal mass elements of various sizes on the surface of the reference ellipsoid are considered, and the gravitational potential and the vector of gravitational intensity of these mass elements are computed via the solution of the Newton integral in terms of ellipsoidal coordinates. The geographical cross-section areas of the selected ellipsoidal mass elements are transferred into the cylindrical equal-area map projection and, based on the transformed area elements, Cartesian mass elements with the same height as the ellipsoidal mass elements are constructed. Using the closed-form solution of the Newton integral in terms of Cartesian coordinates, the potential of the Cartesian mass elements is computed and compared with the corresponding results based on the application of the ellipsoidal Newton integral over the ellipsoidal mass elements. The numerical computations show that the difference between the computed gravitational potential of the ellipsoidal mass elements and of the Cartesian mass elements in the cylindrical equal-area map projection is of the order of 1.6 × 10^-8 m^2/s^2 for a mass element with a cross-section size of 10 km × 10 km and a height of 1000 m. For a 1 km × 1 km mass element with the same height, this difference is less than 1.5 × 10^-4 m^2/s^2. The results of the numerical computations indicate that a new method for computing the terrain correction based on the closed-form solution of the Newton integral in

  10. US Topo: Topographic Maps for the Nation

    USGS Publications Warehouse

    Hytes, Patricia L.

    2009-01-01

    US Topo is the next generation of topographic maps from the U.S. Geological Survey (USGS). Arranged in the familiar 7.5-minute quadrangle format, digital US Topo maps are designed to look and feel (and perform) like the traditional paper topographic maps for which the USGS is so well known. In contrast to paper-based maps, US Topo maps provide modern technical advantages that support faster, wider public distribution and enable basic, on-screen geographic analysis for all users. US Topo maps are available free on the Web. Each map quadrangle is constructed in GeoPDF format from key layers of geographic data (orthoimagery, roads, geographic names, topographic contours, and hydrographic features) found in The National Map. US Topo quadrangles can be printed from personal computers or plotters as complete, full-sized maps or in customized sections, in a user-specified format. Paper copies of the maps can also be purchased from the USGS Store. Download links and a user's guide are featured on the US Topo Web site. US Topo users can turn geographic data layers on and off as needed; they can zoom in and out to highlight specific features or see a broader area. File size for each digital 7.5-minute quadrangle, about 15-20 megabytes, is suitable for most users. Associated electronic tools for geographic analysis are available free for download.

  11. Rapid Determination of Mineral Abundance by X-ray Microfluorescence Mapping and Multispectral Image Analysis

    NASA Astrophysics Data System (ADS)

    Moscati, R. J.; Marshall, B. D.

    2005-12-01

    X-ray microfluorescence (XRMF) spectrometry is a rapid, accurate technique to map element abundances of rock surfaces (such as thin-section billets, the block remaining when a thin section is prepared). Scanning a specimen with a collimated primary X-ray beam (100 μm diameter) generates characteristic secondary X-rays that yield the relative chemical abundances for the major rock-/mineral-forming analytes (such as Si, Al, K, Ca, and Fe). When Cu-rich epoxy is used to impregnate billets, XRMF also can determine porosity from the Cu abundance. Common billet scan size is 30 x 15 mm and the typical mapping time rarely exceeds 2.5 hrs (much faster than traditional point-counting). No polishing or coating is required for the billets, although removing coarse striations or gross irregularities on billet surfaces should improve the spatial accuracy of the maps. Background counts, spectral artifacts, and diffraction peaks typically are inconsequential for maps of major elements. An operational check is performed after every 10 analyses on a standard that contains precisely measured areas of Mn and Mo. Reproducibility of the calculated area ratio of Mn:Mo is consistently within 5% of the known value. For each billet, the single element maps (TIFF files) generated by XRMF are imported into MultiSpec© (a program developed at Purdue University for analysis of multispectral image data, available from http://dynamo.ecn.purdue.edu/~biehl/MultiSpec/) where mineral phases can be spectrally identified and their relative abundances quantified. The element maps for each billet are layered to produce a multi-element file for mineral classification and statistical processing, including modal estimates of mineral abundance. Although mineral identification is possible even if the mineralogy is unknown, prior petrographic examination of the corresponding thin section yields more accurate maps because the software can be set to identify all similar pixels. Caution is needed when using

  12. Detecting chaos in particle accelerators through the frequency map analysis method.

    PubMed

    Papaphilippou, Yannis

    2014-06-01

    The motion of beams in particle accelerators is dominated by a plethora of non-linear effects, which can enhance chaotic motion and limit their performance. The application of advanced non-linear dynamics methods for detecting and correcting these effects and thereby increasing the region of beam stability plays an essential role both during the accelerator design phase and during operation. After describing the nature of non-linear effects and their impact on performance parameters of different particle accelerator categories, the theory of non-linear particle motion is outlined. Recent developments in the methods employed for the analysis of chaotic beam motion are detailed. In particular, the ability of the frequency map analysis method to detect chaotic motion and guide the correction of non-linear effects is demonstrated with particle tracking simulations as well as with experimental data.
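
    As a rough illustration of the frequency map analysis idea, the sketch below estimates the tune of a turn-by-turn tracking signal from each of its two halves and uses the tune change as a diffusion (chaos) indicator. It is a toy version under simplifying assumptions: a plain FFT peak stands in for the refined NAFF frequency estimate used in real frequency map analysis, and the signal and tune value are synthetic.

      import numpy as np

      def tune(x):
          # Dominant frequency (in 1/turn) of a turn-by-turn signal via an FFT peak;
          # a refined NAFF estimate would be used in actual frequency map analysis.
          x = np.asarray(x, float) - np.mean(x)
          spec = np.abs(np.fft.rfft(np.hanning(len(x)) * x))
          k = int(np.argmax(spec[1:])) + 1          # skip the DC bin
          return k / len(x)

      def tune_diffusion(x):
          # Tune change between the two halves of the tracking data: large values
          # flag diffusing (chaotic) motion, small values regular motion.
          n = len(x) // 2
          return abs(tune(x[:n]) - tune(x[n:]))

      turns = np.arange(4096)
      x = np.cos(2.0 * np.pi * 0.31 * turns)        # synthetic regular motion, tune 0.31
      print(tune_diffusion(x))                      # ~0 for regular motion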

  13. Integrating Geospatial Technologies in Fifth-Grade Curriculum: Impact on Spatial Ability and Map-Analysis Skills

    ERIC Educational Resources Information Center

    Jadallah, May; Hund, Alycia M.; Thayn, Jonathan; Studebaker, Joel Garth; Roman, Zachary J.; Kirby, Elizabeth

    2017-01-01

    This study explores the effects of geographic information systems (GIS) curriculum on fifth-grade students' spatial ability and map-analysis skills. A total of 174 students from an urban public school district and their teachers participated in a quasi-experimental design study. Four teachers implemented a GIS curriculum in experimental classes…

  14. Genetic Architecture of Aluminum Tolerance in Rice (Oryza sativa) Determined through Genome-Wide Association Analysis and QTL Mapping

    PubMed Central

    Famoso, Adam N.; Zhao, Keyan; Clark, Randy T.; Tung, Chih-Wei; Wright, Mark H.; Bustamante, Carlos; Kochian, Leon V.; McCouch, Susan R.

    2011-01-01

    Aluminum (Al) toxicity is a primary limitation to crop productivity on acid soils, and rice has been demonstrated to be significantly more Al tolerant than other cereal crops. However, the mechanisms of rice Al tolerance are largely unknown, and no genes underlying natural variation have been reported. We screened 383 diverse rice accessions, conducted a genome-wide association (GWA) study, and conducted QTL mapping in two bi-parental populations using three estimates of Al tolerance based on root growth. Subpopulation structure explained 57% of the phenotypic variation, and the mean Al tolerance in Japonica was twice that of Indica. Forty-eight regions associated with Al tolerance were identified by GWA analysis, most of which were subpopulation-specific. Four of these regions co-localized with a priori candidate genes, and two highly significant regions co-localized with previously identified QTLs. Three regions corresponding to induced Al-sensitive rice mutants (ART1, STAR2, Nrat1) were identified through bi-parental QTL mapping or GWA to be involved in natural variation for Al tolerance. Haplotype analysis around the Nrat1 gene identified susceptible and tolerant haplotypes explaining 40% of the Al tolerance variation within the aus subpopulation, and sequence analysis of Nrat1 identified a trio of non-synonymous mutations predictive of Al sensitivity in our diversity panel. GWA analysis discovered more phenotype–genotype associations and provided higher resolution, but QTL mapping identified critical rare and/or subpopulation-specific alleles not detected by GWA analysis. Mapping using Indica/Japonica populations identified QTLs associated with transgressive variation where alleles from a susceptible aus or indica parent enhanced Al tolerance in a tolerant Japonica background. This work supports the hypothesis that selectively introgressing alleles across subpopulations is an efficient approach for trait enhancement in plant breeding programs and

  15. An intra-specific consensus genetic map of pigeonpea [Cajanus cajan (L.) Millspaugh] derived from six mapping populations.

    PubMed

    Bohra, Abhishek; Saxena, Rachit K; Gnanesh, B N; Saxena, Kulbhushan; Byregowda, M; Rathore, Abhishek; Kavikishor, P B; Cook, Douglas R; Varshney, Rajeev K

    2012-10-01

    Pigeonpea (Cajanus cajan L.) is an important food legume crop of rainfed agriculture. Owing to the exposure of the crop to a number of biotic and abiotic stresses, crop productivity has remained stagnant at ca. 750 kg/ha for almost the last five decades. The availability of a cytoplasmic male sterility (CMS) system has facilitated the development and release of hybrids which are expected to enhance the productivity of pigeonpea. Recent advances in genomics and molecular breeding such as marker-assisted selection (MAS) offer the possibility to accelerate hybrid breeding. Molecular markers and genetic maps are pre-requisites for deploying MAS in breeding. However, in the case of pigeonpea, only one inter- and two intra-specific genetic maps are available so far. Here, four new intra-specific genetic maps comprising 59-140 simple sequence repeat (SSR) loci with map lengths ranging from 586.9 to 881.6 cM have been constructed. Using these four genetic maps together with two recently published intra-specific genetic maps, a consensus map was constructed, comprising 339 SSR loci spanning a distance of 1,059 cM. Furthermore, quantitative trait loci (QTL) analysis for fertility restoration (Rf) conducted in three mapping populations identified four major QTLs explaining up to 24% of the phenotypic variance. To the best of our knowledge, this is the first report on the construction of a consensus genetic map in pigeonpea and on the identification of QTLs for fertility restoration. The developed consensus genetic map should serve as a reference for developing new genetic maps as well as for correlation with the physical map of pigeonpea to be developed in the near future. The availability of more informative markers in the bins harbouring QTLs for sterility mosaic disease (SMD) and Rf will facilitate the selection of the most suitable markers for genetic analysis and molecular breeding applications in pigeonpea.

  16. Recurrence quantification analysis applied to spatiotemporal pattern analysis in high-density mapping of human atrial fibrillation.

    PubMed

    Zeemering, Stef; Bonizzi, Pietro; Maesen, Bart; Peeters, Ralf; Schotten, Ulrich

    2015-01-01

    Spatiotemporal complexity of atrial fibrillation (AF) patterns is often quantified by annotated intracardiac contact mapping. We introduce a new approach that applies recurrence plot (RP) construction followed by recurrence quantification analysis (RQA) to epicardial atrial electrograms, recorded with a high-density grid of electrodes. In 32 patients with no history of AF (aAF, n=11), paroxysmal AF (PAF, n=12) and persistent AF (persAF, n=9), RPs were constructed using a phase space electrogram embedding dimension equal to the estimated AF cycle length. Spatial information was incorporated by 1) averaging the recurrence over all electrodes, and 2) by applying principal component analysis (PCA) to the matrix of embedded electrograms and selecting the first principal component as a representation of spatial diversity. Standard RQA parameters were computed on the constructed RPs and correlated to the number of fibrillation waves per AF cycle (NW). Averaged RP RQA parameters showed no correlation with NW. Correlations improved when applying PCA, with maximum correlation achieved between RP threshold and NW (RR1%, r=0.68, p < 0.001) and RP determinism (DET, r=-0.64, p < 0.001). All studied RQA parameters based on the PCA RP were able to discriminate between persAF and aAF/PAF (DET persAF 0.40 ± 0.11 vs. 0.59 ± 0.14/0.62 ± 0.16, p < 0.01). RP construction and RQA combined with PCA provide a quick and reliable tool to visualize dynamical behaviour and to assess the complexity of contact mapping patterns in AF.
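
    A minimal sketch of recurrence-plot construction and the two RQA measures quoted above (recurrence rate RR and determinism DET) is given below. It operates on a single synthetic signal with plain NumPy; the embedding parameters and threshold are illustrative choices, the main diagonal is kept for simplicity (standard RQA usually excludes it), and the PCA step and electrogram data of the study are not reproduced.

      import numpy as np

      def embed(x, dim, tau=1):
          # Time-delay embedding of a 1-D signal into a dim-dimensional phase space
          n = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

      def recurrence_plot(emb, eps):
          # Binary recurrence matrix: 1 where two embedded states are closer than eps
          d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
          return (d < eps).astype(int)

      def rqa(rp, lmin=2):
          # Recurrence rate (RR) and determinism (DET, fraction of recurrence points
          # lying on diagonal lines of length >= lmin)
          n = rp.shape[0]
          rr = rp.sum() / n ** 2
          lengths = []
          for k in range(-(n - 1), n):
              line = np.diag(rp, k)
              runs = np.diff(np.flatnonzero(np.diff(np.r_[0, line, 0])))[::2]
              lengths.extend(runs.tolist())
          lengths = np.array(lengths)
          det = lengths[lengths >= lmin].sum() / max(lengths.sum(), 1)
          return rr, det

      x = np.sin(0.2 * np.arange(400)) + 0.1 * np.random.randn(400)   # toy signal
      rp = recurrence_plot(embed(x, dim=5), eps=0.5)
      print(rqa(rp))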

  17. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends.

    PubMed

    Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by
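
    The Map and Reduce tasks mentioned above can be illustrated with a toy, single-process Python sketch that counts hypothetical diagnosis codes; in Hadoop the grouping (shuffle) step and the distribution of work across nodes are handled by the framework, so only the logic of the two tasks is shown here.

      from collections import defaultdict
      from functools import reduce

      def map_phase(records):
          # Map task: emit (key, 1) pairs, here one pair per diagnosis code in a record
          for record in records:
              for code in record.split():
                  yield code, 1

      def shuffle(pairs):
          # Grouping by key; in Hadoop this shuffle step is done by the framework
          grouped = defaultdict(list)
          for key, value in pairs:
              grouped[key].append(value)
          return grouped

      def reduce_phase(grouped):
          # Reduce task: aggregate the grouped values for each key
          return {key: reduce(lambda a, b: a + b, values) for key, values in grouped.items()}

      records = ["I10 E11", "E11 J45", "I10 I10"]            # hypothetical coded records
      print(reduce_phase(shuffle(map_phase(records))))       # {'I10': 3, 'E11': 2, 'J45': 1}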

  18. Interactive map of refugee movement in Europe

    NASA Astrophysics Data System (ADS)

    Calka, Beata; Cahan, Bruce

    2016-12-01

    Considering the recent mass movement of people fleeing war and oppression, an analysis of changes in migration, in particular an analysis of the final destination refugees choose, seems to be of utmost importance. Many international organisations like UNHCR (the United Nations High Commissioner for Refugees) or EuroStat gather and provide information on the number of refugees and the routes they follow. What is also needed to study the state of affairs closely is a visual form presenting the rapidly changing situation. An analysis of the problem together with up-to-date statistical data presented in the visual form of a map is essential. This article describes methods of preparing such interactive maps displaying movement of refugees in European Union countries. Those maps would show changes taking place throughout recent years but also the dynamics of the development of the refugee crisis in Europe. The ArcGIS software was applied to make the map accessible on the Internet. Additionally, online sources and newspaper articles were used to present the movement of migrants. The interactive map makes it possible to watch spatial data with an opportunity to navigate within the map window. Because of that it is a clear and convenient tool to visualise such processes as refugee migration in Europe.

  19. On the map: Nature and Science editorials.

    PubMed

    Waaijer, Cathelijn J F; van Bochove, Cornelis A; van Eck, Nees Jan

    2011-01-01

    Bibliometric mapping of scientific articles based on keywords and technical terms in abstracts is now frequently used to chart scientific fields. In contrast, no significant mapping has been applied to the full texts of non-specialist documents. Editorials in Nature and Science are such non-specialist documents, reflecting the views of the two most read scientific journals on science, technology and policy issues. We use the VOSviewer mapping software to chart the topics of these editorials. A term map and a document map are constructed and clusters are distinguished in both of them. The validity of the document clustering is verified by a manual analysis of a sample of the editorials. This analysis confirms the homogeneity of the clusters obtained by mapping and augments the latter with further detail. As a result, the analysis provides reliable information on the distribution of the editorials over topics, and on differences between the journals. The most striking difference is that Nature devotes more attention to internal science policy issues and Science more to the political influence of scientists. Electronic supplementary material: The online version of this article (doi:10.1007/s11192-010-0205-9) contains supplementary material, which is available to authorized users.

  20. Surface mapping of spike potential fields: experienced EEGers vs. computerized analysis.

    PubMed

    Koszer, S; Moshé, S L; Legatt, A D; Shinnar, S; Goldensohn, E S

    1996-03-01

    An EEG epileptiform spike focus recorded with scalp electrodes is clinically localized by visual estimation of the point of maximal voltage and the distribution of its surrounding voltages. We compared such estimated voltage maps, drawn by experienced electroencephalographers (EEGers), with a computerized spline interpolation technique employed in the commercially available software package FOCUS. Twenty-two spikes were recorded from 15 patients during long-term continuous EEG monitoring. Maps of voltage distribution from the 28 electrodes surrounding the points of maximum change in slope (the spike maximum) were constructed by the EEGer. The same points of maximum spike and voltage distributions at the 29 electrodes were mapped by computerized spline interpolation and a comparison between the two methods was made. The findings indicate that the computerized spline mapping techniques employed in FOCUS construct voltage maps with similar maxima and distributions as the maps created by experienced EEGers. The dynamics of spike activity, including correlations, are better visualized using the computerized technique than by manual interpretation alone. Its use as a technique for spike localization is accurate and adds information of potential clinical value.
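
    A simplified stand-in for the computerized voltage-mapping step is sketched below: scattered electrode voltages at the spike maximum are interpolated onto a dense 2-D grid with SciPy. The electrode coordinates and voltages are hypothetical, and cubic scattered-data interpolation is used only as a generic substitute for the proprietary spline interpolation implemented in FOCUS.

      import numpy as np
      from scipy.interpolate import griddata

      # Hypothetical projected 2-D electrode positions and spike-peak voltages (microvolts)
      electrode_xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
      voltages = np.array([-12.0, -35.0, -18.0, -60.0, -80.0])

      # Dense grid over the mapped region and a smooth interpolated voltage map;
      # cubic scattered-data interpolation stands in for the FOCUS spline method
      gx, gy = np.mgrid[0.0:1.0:100j, 0.0:1.0:100j]
      voltage_map = griddata(electrode_xy, voltages, (gx, gy), method="cubic")

      # Location of the interpolated extremum, i.e. the mapped spike maximum
      i, j = np.unravel_index(np.nanargmax(np.abs(voltage_map)), voltage_map.shape)
      print("spike maximum near", gx[i, j], gy[i, j])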

  1. Cyberspace Classification and Cognition: Information and Communications Cyberspaces

    ERIC Educational Resources Information Center

    Kellerman, Aharon

    2007-01-01

    The notions cognitive space and cognitive/mental maps were proposed in the late 1940s, and have been extensively studied since the 1970s within behavioral geography, as well as within tangent disciplines, notably environmental psychology and architecture. Viewing these notions from the perspective of the 2000s, one can state that the hidden…

  2. Comparative Analysis of EO-1 ALI and Hyperion, and Landsat ETM+ Data for Mapping Forest Crown Closure and Leaf Area Index

    PubMed Central

    Pu, Ruiliang; Gong, Peng; Yu, Qian

    2008-01-01

    In this study, a comparative analysis of capabilities of three sensors for mapping forest crown closure (CC) and leaf area index (LAI) was conducted. The three sensors are Hyperspectral Imager (Hyperion) and Advanced Land Imager (ALI) onboard the EO-1 satellite and Landsat-7 Enhanced Thematic Mapper Plus (ETM+). A total of 38 mixed coniferous forest CC and 38 LAI measurements were collected at Blodgett Forest Research Station, University of California at Berkeley, USA. The analysis method consists of (1) extracting spectral vegetation indices (VIs), spectral texture information and maximum noise fractions (MNFs), (2) establishing multivariate prediction models, (3) predicting and mapping pixel-based CC and LAI values, and (4) validating the mapped CC and LAI results with field-validated photo-interpreted CC and LAI values. The experimental results indicate that the Hyperion data are the most effective for mapping forest CC and LAI (CC mapped accuracy (MA) = 76.0%, LAI MA = 74.7%), followed by ALI data (CC MA = 74.5%, LAI MA = 70.7%), with ETM+ data results being least effective (CC MA = 71.1%, LAI MA = 63.4%). This analysis demonstrates that the Hyperion sensor outperforms the other two sensors, ALI and ETM+. This is because of its high spectral resolution with rich subtle spectral information, its short-wave infrared data for constructing optimal VIs that are only slightly affected by the atmosphere, and the larger number of MNFs available for selection when establishing prediction models. Compared to ETM+ data, ALI data are better for mapping forest CC and LAI because ALI data have more bands and higher signal-to-noise ratios than ETM+ data. PMID:27879906

  3. Comparative Analysis of EO-1 ALI and Hyperion, and Landsat ETM+ Data for Mapping Forest Crown Closure and Leaf Area Index.

    PubMed

    Pu, Ruiliang; Gong, Peng; Yu, Qian

    2008-06-06

    In this study, a comparative analysis of capabilities of three sensors for mapping forest crown closure (CC) and leaf area index (LAI) was conducted. The three sensors are Hyperspectral Imager (Hyperion) and Advanced Land Imager (ALI) onboard the EO-1 satellite and Landsat-7 Enhanced Thematic Mapper Plus (ETM+). A total of 38 mixed coniferous forest CC and 38 LAI measurements were collected at Blodgett Forest Research Station, University of California at Berkeley, USA. The analysis method consists of (1) extracting spectral vegetation indices (VIs), spectral texture information and maximum noise fractions (MNFs), (2) establishing multivariate prediction models, (3) predicting and mapping pixel-based CC and LAI values, and (4) validating the mapped CC and LAI results with field-validated photo-interpreted CC and LAI values. The experimental results indicate that the Hyperion data are the most effective for mapping forest CC and LAI (CC mapped accuracy (MA) = 76.0%, LAI MA = 74.7%), followed by ALI data (CC MA = 74.5%, LAI MA = 70.7%), with ETM+ data results being least effective (CC MA = 71.1%, LAI MA = 63.4%). This analysis demonstrates that the Hyperion sensor outperforms the other two sensors, ALI and ETM+. This is because of its high spectral resolution with rich subtle spectral information, its short-wave infrared data for constructing optimal VIs that are only slightly affected by the atmosphere, and the larger number of MNFs available for selection when establishing prediction models. Compared to ETM+ data, ALI data are better for mapping forest CC and LAI because ALI data have more bands and higher signal-to-noise ratios than ETM+ data.

  4. Mars Global Geologic Mapping: Amazonian Results

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Dohm, J. M.; Irwin, R.; Kolb, E. J.; Skinner, J. A., Jr.; Hare, T. M.

    2008-01-01

    We are in the second year of a five-year effort to map the geology of Mars using mainly Mars Global Surveyor, Mars Express, and Mars Odyssey imaging and altimetry datasets. Previously, we have reported on details of project management, mapping datasets (local and regional), initial and anticipated mapping approaches, and tactics of map unit delineation and description [1-2]. For example, we have seen how the multiple types and huge quantity of image data as well as more accurate and detailed altimetry data now available allow for broader and deeper geologic perspectives, based largely on improved landform perception, characterization, and analysis. Here, we describe early mapping results, which include updating of previous northern plains mapping [3], including delineation of mainly Amazonian units and regional fault mapping, as well as other advances.

  5. Agricultural cropland mapping using black-and-white aerial photography, Object-Based Image Analysis and Random Forests

    NASA Astrophysics Data System (ADS)

    Vogels, M. F. A.; de Jong, S. M.; Sterk, G.; Addink, E. A.

    2017-02-01

    Land-use and land-cover (LULC) conversions have an important impact on land degradation, erosion and water availability. Information on historical land cover (change) is crucial for studying and modelling land- and ecosystem degradation. During the past decades major LULC conversions occurred in Africa, Southeast Asia and South America as a consequence of a growing population and economy. Most distinct is the conversion of natural vegetation into cropland. Historical LULC information can be derived from satellite imagery, but these data only date back to approximately 1972. Before the emergence of satellite imagery, landscapes were monitored by black-and-white (B&W) aerial photography. This photography is often visually interpreted, which is a very time-consuming approach. This study presents an innovative, semi-automated method to map cropland acreage from B&W photography. Cropland acreage was mapped for two study sites, in Ethiopia and in The Netherlands. For this purpose we used Geographic Object-Based Image Analysis (GEOBIA) and a Random Forest classification on a set of variables comprising texture, shape, slope, neighbour and spectral information. Overall mapping accuracies attained are 90% and 96% for the two study areas, respectively. This mapping method extends the period over which historical cropland expansion can be mapped purely from brightness information in B&W photography back to the 1930s, which is beneficial for regions where historical land-use statistics are mostly absent.
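
    The object-based classification step can be sketched with scikit-learn as follows: a Random Forest is trained on a per-object feature table (texture, shape, slope, neighbour and spectral variables in the study) and evaluated for overall accuracy. The feature values, labels and sizes below are random placeholders, not the study's data.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import accuracy_score
      from sklearn.model_selection import train_test_split

      # Placeholder per-object feature table (e.g. brightness, texture, shape, slope,
      # neighbour statistics) and cropland/other labels for the segmented objects
      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 5))
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = RandomForestClassifier(n_estimators=500, random_state=0)
      clf.fit(X_train, y_train)

      print("overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
      print("feature importances:", clf.feature_importances_)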

  6. SPSS and SAS programs for determining the number of components using parallel analysis and velicer's MAP test.

    PubMed

    O'Connor, B P

    2000-08-01

    Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
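
    For readers working outside SPSS or SAS, the logic of Horn's parallel analysis can be sketched in a few lines of Python: components are retained only while the observed correlation-matrix eigenvalues exceed a chosen percentile of eigenvalues obtained from random data of the same dimensions. This is a generic sketch of the procedure, not the authors' programs, and the example data are synthetic.

      import numpy as np

      def parallel_analysis(data, n_iter=100, percentile=95, seed=0):
          # Retain components whose observed correlation-matrix eigenvalues exceed the
          # chosen percentile of eigenvalues from random data of the same size
          n, p = data.shape
          rng = np.random.default_rng(seed)
          obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
          rand = np.empty((n_iter, p))
          for i in range(n_iter):
              r = rng.normal(size=(n, p))
              rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
          threshold = np.percentile(rand, percentile, axis=0)
          return int(np.sum(obs > threshold))

      # Synthetic example: 300 cases, 10 items driven by 2 latent factors
      rng = np.random.default_rng(1)
      scores = rng.normal(size=(300, 2))
      loadings = rng.normal(size=(2, 10))
      items = scores @ loadings + rng.normal(size=(300, 10))
      print("components to retain:", parallel_analysis(items))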

  7. Mapping and proteomic analysis of albumin and globulin proteins in hexaploid wheat kernels (Triticum aestivum L.).

    PubMed

    Merlino, Marielle; Leroy, Philippe; Chambon, Christophe; Branlard, Gérard

    2009-05-01

    Albumins and globulins of wheat endosperm represent 20% of total kernel protein. They are soluble proteins, mainly enzymes and proteins involved in cell functions. Two-dimensional gel immobiline electrophoresis (2DE) (pH 4-7) x SDS-PAGE revealed around 2,250 spots. Ninety percent of the spots were common between the very distantly related cultivars 'Opata 85' and 'Synthetic W7984', the two parents of the International Triticeae Mapping Initiative (ITMI) progeny. 'Opata' had 130 specific spots while 'Synthetic' had 96. 2DE and image analysis of the soluble proteins present in 112 recombinant inbred lines of the F9-mapped ITMI progeny enabled 120 unbiased segregating spots to be mapped on 21 wheat (Triticum aestivum L. em. Thell) chromosomes. After tryptic digestion, mapped spots were subjected to MALDI-TOF or tandem mass spectrometry for protein identification by database mining. Among the 'Opata' and 'Synthetic' spots identified, many enzymes have already been mapped in the barley and rice genomes. Multigene families of heat shock proteins, beta-amylases, UDP-glucose pyrophosphorylases, peroxidases and thioredoxins were successfully identified. Although other proteins remain to be identified, some differences were found in the number of segregating proteins involved in the response to stress: 11 proteins found in the modern selected cultivar 'Opata 85' as compared to 4 in the new hexaploid 'Synthetic W7984'. In addition, 'Opata' and 'Synthetic' differed in the number of proteins involved in protein folding (2 and 10, respectively). The usefulness of the mapped enzymes for future research on seed composition and characteristics is discussed.

  8. Map design and production issues for the Utah Gap Analysis Project

    USGS Publications Warehouse

    Hutchinson, John A.; Wittmann, J.H.

    1997-01-01

    The cartographic preparation and printing of four maps for the Utah GAP Project presented a wide range of challenges in cartographic design and production. In meeting these challenges, the map designers had to balance the purpose of the maps together with their legibility and utility against both the researchers' desire to show as much detail as possible and the technical limitations inherent in the printing process. This article describes seven design and production issues in order to illustrate the challenges of making maps from a merger of satellite data and GIS databases, and to point toward future investigation and development.

  9. Mapping Vegetation Community Types in a Highly-Disturbed Landscape: Integrating Hierarchical Object-Based Image Analysis with Digital Surface Models

    NASA Astrophysics Data System (ADS)

    Snavely, Rachel A.

    Focusing on the semi-arid and highly disturbed landscape of San Clemente Island, California, this research tests the effectiveness of incorporating a hierarchical object-based image analysis (OBIA) approach with high-spatial-resolution imagery and light detection and ranging (LiDAR) derived canopy height surfaces for mapping vegetation communities. The study is part of a large-scale research effort conducted by researchers at San Diego State University's (SDSU) Center for Earth Systems Analysis Research (CESAR) and Soil Ecology and Restoration Group (SERG) to develop an updated vegetation community map which will support both conservation and management decisions on Naval Auxiliary Landing Field (NALF) San Clemente Island. Trimble's eCognition Developer software was used to develop and generate vegetation community maps for two study sites, with and without vegetation height data as input. Overall and class-specific accuracies were calculated and compared across the two classifications. The highest overall accuracy (approximately 80%) was observed for the classification integrating airborne visible and near-infrared imagery of very high spatial resolution with a LiDAR-derived canopy height model. Accuracies for individual vegetation classes differed between the two classification methods, but were highest when incorporating the LiDAR digital surface data. The addition of a canopy height model, however, yielded little difference in classification accuracies for areas of very dense shrub cover. Overall, the results show the utility of the OBIA approach for mapping vegetation with high spatial resolution imagery, and emphasize the advantage of both multi-scale analysis and digital surface data for accurately characterizing highly disturbed landscapes. The integrated imagery and digital canopy height model approach presented both advantages and limitations, which have to be considered prior to its operational use in mapping vegetation communities.

  10. Mapping Findspots of Roman Military Brickstamps in Mogontiacum (Mainz) and Archaeometrical Analysis

    NASA Astrophysics Data System (ADS)

    Dolata, Jens; Mucha, Hans-Joachim; Bartel, Hans-Georg

    Mainz was a Roman settlement that was established as an important military outpost in 13 BC. Almost 100 years later Mainz, the ancient Mogontiacum, became the seat of the administrative centre of the Roman Province of Germania Superior. About 3,500 brickstamps dating to the period up to the fall of the Roman Empire in the fifth century AD have been found in archaeological excavations. These documents have to be investigated using several methods for a better understanding of this history. Here the focus is on an application of spatial statistical analysis in archaeology. Concretely, about 250 sites have to be investigated. We therefore compare maps of different periods graphically by nonparametric density estimation, taking into account different weights of the sites according to the radius of the finding area. Moreover, we can test whether an archaeological segmentation is statistically significant or not. By combining smooth mapping, testing, and the search for dated brickstamps, there is a good chance of obtaining new sources for the Roman history of Mainz.
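
    The weighted nonparametric density estimation used for such findspot maps can be sketched with SciPy's Gaussian kernel density estimator (assuming a SciPy version that accepts a weights argument). The coordinates, weights and grid below are hypothetical placeholders standing in for the roughly 250 Mainz sites, not the actual data.

      import numpy as np
      from scipy.stats import gaussian_kde

      # Hypothetical findspot coordinates (local grid, metres) and weights derived from
      # the radius of each finding area: smaller, better-located areas weigh more
      rng = np.random.default_rng(2)
      xy = rng.normal(loc=0.0, scale=300.0, size=(250, 2))
      radius = rng.uniform(10.0, 200.0, size=250)
      weights = 1.0 / radius

      # Weighted Gaussian kernel density estimate evaluated on a regular grid to give
      # a smooth findspot density map for one period
      kde = gaussian_kde(xy.T, weights=weights)
      gx, gy = np.mgrid[-1000.0:1000.0:200j, -1000.0:1000.0:200j]
      density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

      print("density peak at grid cell:", np.unravel_index(density.argmax(), density.shape))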

  11. A preliminary analysis of the Mariner 10 color ratio map of Mercury

    NASA Technical Reports Server (NTRS)

    Rava, Barry; Hapke, Bruce

    1987-01-01

    A preliminary geological analysis of the Mariner 10 orange/UV color ratio map of Mercury is given, assuming a basaltic crust. Certain errors in the map are pointed out. The relationships between color and terrain are distinctly non-lunar. Rays and ejecta are bluer than average on Mercury, whereas they are redder on the Moon. This fact, along with the lack of the ferrous band in Mercury's spectral reflectance and smaller albedo contrasts, implies that the crust is low in Fe and Ti. There is no correlation between color boundaries and the smooth plains on Mercury, in contrast with the strong correlation between color and maria-highlands contacts on the Moon. The smooth plains are not Mercurian analogs of lunar maria, and a lunar-type second wave of melting did not occur. Ambiguous correlations between color and topography indicate that older, redder materials underlie younger, bluer rocks in many places on the planet, implying that the last stages of volcanism involved low-Fe lavas covering higher-Fe rocks. There is some evidence of late Fe-rich pyroclastic activity.

  12. Data analysis and mapping of the mountain permafrost distribution

    NASA Astrophysics Data System (ADS)

    Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail

    2017-04-01

    the permafrost occurrence where it is unknown, the mentioned supervised learning techniques inferred a classification function from labelled training data (pixels of permafrost absence and presence). Particular attention was given to the pre-processing of the dataset, with a study of its complexity and of the relation between the permafrost data and the environmental variables employed. The application of feature selection techniques completed this analysis and identified redundant or uninformative predictors. Classification performances were assessed with the AUROC on independent validation sets (0.81 for LR, 0.85 for SVM and 0.88 for RF). At the micro scale, the obtained permafrost maps are consistent with field reality thanks to the high resolution of the dataset (10 meters). Moreover, compared to classical models, the permafrost prediction is computed without resorting to altitude thresholds (above which permafrost may be found). Finally, as machine learning is a non-deterministic approach, mountain permafrost distribution maps are presented and discussed together with the corresponding uncertainty maps, which provide information on the quality of the results.

  13. Mapping vegetation in Yellowstone National Park using spectral feature analysis of AVIRIS data

    USGS Publications Warehouse

    Kokaly, Raymond F.; Despain, Don G.; Clark, Roger N.; Livo, K. Eric

    2003-01-01

    Knowledge of the distribution of vegetation on the landscape can be used to investigate ecosystem functioning. The sizes and movements of animal populations can be linked to resources provided by different plant species. This paper demonstrates the application of imaging spectroscopy to the study of vegetation in Yellowstone National Park (Yellowstone) using spectral feature analysis of data from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). AVIRIS data, acquired on August 7, 1996, were calibrated to surface reflectance using a radiative transfer model and field reflectance measurements of a ground calibration site. A spectral library of canopy reflectance signatures was created by averaging pixels of the calibrated AVIRIS data over areas of known forest and nonforest vegetation cover types in Yellowstone. Using continuum removal and least squares fitting algorithms in the US Geological Survey's Tetracorder expert system, the distributions of these vegetation types were determined by comparing the absorption features of vegetation in the spectral library with the spectra from the AVIRIS data. The 0.68 μm chlorophyll absorption feature and leaf water absorption features, centered near 0.98 and 1.20 μm, were analyzed. Nonforest cover types of sagebrush, grasslands, willows, sedges, and other wetland vegetation were mapped in the Lamar Valley of Yellowstone. Conifer cover types of lodgepole pine, whitebark pine, Douglas fir, and mixed Engelmann spruce/subalpine fir forests were spectrally discriminated and their distributions mapped in the AVIRIS images. In the Mount Washburn area of Yellowstone, a comparison of the AVIRIS map of forest cover types to a map derived from air photos resulted in an overall agreement of 74.1% (kappa statistic=0.62).

  14. Continuation of SAGE and MLS High-Resolution Ozone Profiles with the Suomi NPP OMPS Limb Profiler

    NASA Astrophysics Data System (ADS)

    Kramarova, N. A.; Bhartia, P. K.; Moy, L.; Chen, Z.; Frith, S. M.

    2015-12-01

    The Ozone Mapper and Profiler Suite (OMPS) Limb Profiler (LP) onboard the Suomi NPP satellite is designed to measure ozone profiles with a high vertical resolution (~2 km) and dense spatial sampling (~1° latitude). The LP sensor represents a new generation of US ozone profile instruments, with a follow-up limb instrument planned onboard the Joint Polar Satellite System 2 (JPSS-2) in 2021. In this study we will examine the suitability of using LP profiles to continue the EOS climate ozone profile record from the SAGE and MLS datasets. First, we evaluate the accuracy in determining the LP tangent height by analyzing measured and calculated radiances. Accurate estimation of the tangent height is critical for limb observations. Several methods were explored to estimate the uncertainties in the LP tangent height registration, and the results will be briefly summarized in this presentation. Version 2 of the LP data, released in May 2014, includes a static adjustment of ~1.5 km and a dynamic tangent height adjustment within each orbit. A recent analysis of Version 2 Level 1 radiances revealed a 100 m step in the tangent height that occurred on 26 April 2013, due to a switch to two star trackers in determining spacecraft position. In addition, a ~200 m shift in the tangent height along each orbit was detected. These uncertainties in tangent height registration can affect the stability of the LP ozone record. Therefore, the second step in our study includes a validation of LP ozone profiles against correlative satellite ozone measurements (Aura MLS, ACE-FTS, OSIRIS, and SBUV) with a focus on time-dependent changes. We estimate relative drifts between OMPS LP and correlative ozone records to evaluate the stability of the LP measurements. We also test the tangent height corrections found in the internal analysis of Version 2 measurements to determine their effect on the long-term stability of the LP ozone record.

  15. Using MountainsMap (Digital Surf) surface analysis software as an analysis tool for x-ray mirror optical metrology data

    NASA Astrophysics Data System (ADS)

    Duffy, Alan; Yates, Brian; Takacs, Peter

    2012-09-01

    The Optical Metrology Facility at the Canadian Light Source (CLS) has recently purchased MountainsMap surface analysis software from Digital Surf, and we report here our experiences with this package and its usefulness as a tool for examining metrology data of synchrotron x-ray mirrors. The package has a number of operators that are useful for determining surface roughness and slope error, including compliance with ISO standards (viz. ISO 4287 and ISO 25178). The software is extensible with MATLAB scripts, either by loading an m-file or by a user-written script. This makes it possible to apply a custom operator to measurement data sets. Using this feature we have applied the simple six-line MATLAB code for the direct least-squares fitting of ellipses developed by Fitzgibbon et al. to investigate the residual slope error of elliptical mirrors upon the removal of the best-fit ellipse. The software includes support for many instruments (e.g. Zygo, MicroMap) and can import ASCII data (e.g. LTP data). The stitching module allows the user to assemble overlapping images, and we report on our experiences with this feature applied to MicroMap surface roughness data. The power spectral density function was determined for the stitched and unstitched data and compared.
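
    For reference, a NumPy transcription of the direct least-squares ellipse fit of Fitzgibbon, Pilu and Fisher is sketched below; it solves the generalized eigenproblem arising from the constraint 4ac - b^2 = 1. This is an illustrative sketch on synthetic points, not the MountainsMap/MATLAB implementation used in the study; subtracting the resulting best-fit ellipse from a measured mirror profile would give the residual error discussed above.

      import numpy as np

      def fit_ellipse_direct(x, y):
          # Direct least-squares ellipse fit (after Fitzgibbon, Pilu & Fisher): returns
          # conic coefficients (a, b, c, d, e, f) of a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0
          # subject to the ellipse constraint 4ac - b^2 = 1.
          D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
          S = D.T @ D
          C = np.zeros((6, 6))
          C[0, 2] = C[2, 0] = 2.0
          C[1, 1] = -1.0
          # The ellipse is the eigenvector belonging to the single positive eigenvalue
          # of inv(S) @ C (generalized eigenproblem S a = lambda C a).
          eigval, eigvec = np.linalg.eig(np.linalg.solve(S, C))
          k = int(np.argmax(eigval.real))
          return eigvec[:, k].real

      # Noisy points sampled from a synthetic ellipse (semi-axes 5 and 2)
      rng = np.random.default_rng(3)
      t = np.linspace(0.0, 2.0 * np.pi, 200)
      x = 5.0 * np.cos(t) + 0.01 * rng.normal(size=t.size)
      y = 2.0 * np.sin(t) + 0.01 * rng.normal(size=t.size)
      print(fit_ellipse_direct(x, y))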

  16. Inlining 3d Reconstruction, Multi-Source Texture Mapping and Semantic Analysis Using Oblique Aerial Imagery

    NASA Astrophysics Data System (ADS)

    Frommholz, D.; Linkiewicz, M.; Poznanska, A. M.

    2016-06-01

    This paper proposes an in-line method for the simplified reconstruction of city buildings from nadir and oblique aerial images that are at the same time being used for multi-source texture mapping with minimal resampling. Further, the resulting unrectified texture atlases are analyzed for façade elements like windows to be reintegrated into the original 3D models. Tests on real-world data of Heligoland, Germany, comprising more than 800 buildings exposed a median positional deviation of 0.31 m at the façades compared to the cadastral map, a correctness of 67% for the detected windows and good visual quality when rendered with GPU-based perspective correction. As part of the process, building reconstruction takes the oriented input images and transforms them into dense point clouds by semi-global matching (SGM). The point sets undergo local RANSAC-based regression and topology analysis to detect adjacent planar surfaces and determine their semantics. Based on this information the roof, wall and ground surfaces found are intersected and limited in their extent to form a closed 3D building hull. For texture mapping the hull polygons are projected into each possible input bitmap to find suitable color sources regarding coverage and resolution. Occlusions are detected by ray-casting a full-scale digital surface model (DSM) of the scene and stored in pixel-precise visibility maps. These maps are used to derive overlap statistics and radiometric adjustment coefficients to be applied when the visible image parts for each building polygon are copied into a compact texture atlas without resampling whenever possible. The atlas bitmap is passed to a commercial object-based image analysis (OBIA) tool running a custom rule set to identify windows on the contained façade patches. Following multi-resolution segmentation and classification based on brightness and contrast differences, potential window objects are evaluated against geometric constraints and

  17. Comparing physiographic maps with different categorisations

    NASA Astrophysics Data System (ADS)

    Zawadzka, J.; Mayr, T.; Bellamy, P.; Corstanje, R.

    2015-02-01

    This paper addresses the need for a robust map comparison method suitable for finding similarities between thematic maps with different forms of categorisation. In our case, the requirement was to establish the information content of newly derived physiographic maps with regard to a set of reference maps for a study area in England and Wales. Physiographic maps were derived from the 90 m resolution SRTM DEM, using a suite of existing and new digital landform mapping methods with the overarching purpose of enhancing the physiographic unit component of the Soil and Terrain database (SOTER). Reference maps were seven soil and landscape datasets mapped at scales ranging from 1:200,000 to 1:5,000,000. A review of commonly used statistical methods for categorical comparisons was performed and, of these, the Cramer's V statistic was identified as the most appropriate for comparison of maps with different legends. Interpretation of the multiple Cramer's V values resulting from one-by-one comparisons of the physiographic and baseline maps was facilitated by multi-dimensional scaling and calculation of average distances between the maps. The method allowed similarities and dissimilarities amongst physiographic maps and baseline maps to be found and informed the recommendation of the most suitable methodology for terrain analysis in the context of soil mapping.
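
    The Cramer's V comparison of two co-registered categorical maps can be sketched as below: a contingency table of co-occurring categories is built, the chi-squared statistic is computed, and V is obtained by normalising with the smaller number of categories. The two toy rasters are synthetic and merely stand in for the physiographic and reference maps of the study.

      import numpy as np

      def cramers_v(map_a, map_b):
          # Cramer's V between two co-registered categorical rasters: build the
          # contingency table of co-occurring categories, compute chi-squared and
          # normalise by n*(k-1), where k is the smaller number of categories.
          a = np.asarray(map_a).ravel()
          b = np.asarray(map_b).ravel()
          cats_a, ia = np.unique(a, return_inverse=True)
          cats_b, ib = np.unique(b, return_inverse=True)
          table = np.zeros((cats_a.size, cats_b.size))
          np.add.at(table, (ia, ib), 1.0)
          n = table.sum()
          expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
          chi2 = ((table - expected) ** 2 / expected).sum()
          k = min(cats_a.size, cats_b.size)
          return np.sqrt(chi2 / (n * (k - 1)))

      # Two synthetic 100 x 100 maps with different legends (4 vs 5 classes)
      rng = np.random.default_rng(4)
      physio = rng.integers(0, 4, size=(100, 100))
      soils = (physio + rng.integers(0, 2, size=(100, 100))) % 5
      print("Cramer's V:", cramers_v(physio, soils))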

  18. A Tangible Approach to Concept Mapping

    NASA Astrophysics Data System (ADS)

    Tanenbaum, Karen; Antle, Alissa N.

    2009-05-01

    The Tangible Concept Mapping project investigates using a tangible user interface to engage learners in concept map creation. This paper describes a prototype implementation of the system, presents some preliminary analysis of its ease of use and effectiveness, and discusses how elements of tangible interaction support concept mapping by helping users organize and structure their knowledge about a domain. The role of physical engagement and embodiment in supporting the mental activity of creating the concept map is explored as one of the benefits of a tangible approach to learning.

  19. Early warning smartphone diagnostics for water security and analysis using real-time pH mapping

    NASA Astrophysics Data System (ADS)

    Hossain, Md. Arafat; Canning, John; Ast, Sandra; Rutledge, Peter J.; Jamalipour, Abbas

    2015-12-01

    Early detection of environmental disruption, unintentional or otherwise, is increasingly desired to ensure hazard minimization in many settings. Here, using a field-portable, smartphone fluorimeter to assess water quality based on the pH response of a designer probe, a map of pH of public tap water sites has been obtained. A custom designed Android application digitally processed and mapped the results utilizing the global positioning system (GPS) service of the smartphone. The map generated indicates no disruption in pH for all sites measured, and all the data are assessed to fall inside the upper limit of local government regulations, consistent with authority reported measurements. This implementation demonstrates a new security concept: network environmental forensics utilizing the potential of novel smartgrid analysis with wireless sensors for the detection of potential disruption to water quality at any point in the city. This concept is applicable across all smartgrid strategies within the next generation of the Internet of Things and can be extended on national and global scales to address a range of target analytes, both chemical and biological.

  20. Rotavirus - Global research density equalizing mapping and gender analysis.

    PubMed

    Köster, Corinna; Klingelhöfer, Doris; Groneberg, David A; Schwarzer, Mario

    2016-01-02

    Rotaviruses are the leading cause of dehydration and severe diarrheal disease in infants and young children worldwide. An increasing number of related publications makes it a crucial challenge to determine the relevant scientific output. Therefore, scientometric analyses are helpful to evaluate the quantity as well as the quality of the worldwide research activities on Rotavirus. Up to now, no in-depth global scientometric analysis relating to Rotavirus publications has been carried out. This study used scientometric tools and the method of density equalizing mapping to visualize the differences in the worldwide research effort referring to Rotavirus. The aim of the study was to compare scientific output geographically and over time by using an in-depth data analysis and New quality and quantity indices in science (NewQIS) tools. Furthermore, a gender analysis was part of the data interpretation. We retrieved all Rotavirus-related articles published during the time period from 1900 to 2013 from the Web of Science by a defined search term. These items were analyzed regarding quantitative and qualitative aspects, and visualized with the help of bibliometric methods and the technique of density equalizing mapping to show the differences in the worldwide research efforts. This work aimed to extend the current NewQIS platform. The 5906 Rotavirus-associated articles were published in 138 countries from 1900 to 2013. The USA authored 2037 articles, equal to 34.5% of all published items, followed by Japan with 576 articles and the United Kingdom - as the most productive representative of the European countries - with 495 articles. Furthermore, the USA established the most cooperations with other countries and was found to be at the center of an international collaborative network. We performed a gender analysis of authors per country (threshold was set at a publishing output of more than 100 articles by more than 50 authors whose names could be

  1. Mapping analysis of scaffold/matrix attachment regions (s/MARs) from two different mammalian cell lines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilus, Nur Shazwani Mohd; Ahmad, Azrin; Yusof, Nurul Yuziana Mohd

    Scaffold/matrix attachment regions (S/MARs) are potential elements that can be integrated into expression vectors to increase expression of recombinant proteins. Many studies on S/MAR have been done but none has revealed the distribution of S/MAR in a genome. In this study, we have isolated S/MAR sequences from HEK293 and Chinese hamster ovary cell lines (CHO DG44) using two different methods utilizing 2 M NaCl and lithium-3,5-diiodosalicylate (LIS). The isolated S/MARs were sequenced using a Next Generation Sequencing (NGS) platform. Based on reference mapping analysis against the human genome database, a total of 8,994,856 and 8,412,672 contigs of S/MAR sequences were retrieved from the 2 M NaCl and LIS extractions of HEK293, respectively. On the other hand, reference mapping analysis of S/MAR derived from CHO DG44 against our own CHO DG44 database has generated a total of 7,204,348 and 4,672,913 contigs from the 2 M NaCl and LIS extraction methods, respectively.

  2. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments in order to understand correctly the strength and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Applying parallel factor analysis and Tucker-3 methods on sensory and instrumental data to establish preference maps: case study on sweet corn varieties.

    PubMed

    Gere, Attila; Losó, Viktor; Györey, Annamária; Kovács, Sándor; Huzsvai, László; Nábrádi, András; Kókai, Zoltán; Sipos, László

    2014-12-01

    Traditional internal and external preference mapping methods are based on principal component analysis (PCA). However, parallel factor analysis (PARAFAC) and Tucker-3 methods could be a better choice. To evaluate the methods, preference maps of sweet corn varieties will be introduced. A preference map of eight sweet corn varieties was established using PARAFAC and Tucker-3 methods. Instrumental data were also integrated into the maps. The triplot created by the PARAFAC model explains better how odour is separated from texture or appearance, and how some varieties are separated from others. Internal and external preference maps were created using parallel factor analysis (PARAFAC) and Tucker-3 models employing both sensory (trained panel and consumers) and instrumental parameters simultaneously. Triplots of the applied three-way models have a competitive advantage compared to the traditional biplots of the PCA-based external preference maps. The solution of PARAFAC and Tucker-3 is very similar regarding the interpretation of the first and third factors. The main difference is due to the second factor as it differentiated the attributes better. Consumers who prefer 'super sweet' varieties (they place great emphasis especially on taste) are much younger and have significantly higher incomes, and buy sweet corn products rarely (once a month). Consumers who consume sweet corn products mainly because of their texture and appearance are significantly older and include a higher ratio of men. © 2014 Society of Chemical Industry.

  4. Knowledge Mapping: A Multipurpose Task Analysis Tool.

    ERIC Educational Resources Information Center

    Esque, Timm J.

    1988-01-01

    Describes knowledge mapping, a tool developed to increase the objectivity and accuracy of task difficulty ratings for job design. Application in a semiconductor manufacturing environment is discussed, including identifying prerequisite knowledge for a given task; establishing training development priorities; defining knowledge levels; identifying…

  5. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Space Shuttle Main Engine (SSME) has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system; (2) develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). The ATMS system will convert the tremendous amount of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow minimal storage requirements, while providing fast signature retrieval, pattern comparison, and identification capabilities; and (3) integrate the nonlinear correlation techniques into the CSTDB database with a compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for quick signature comparison. This study will provide timely assessment of SSME component operational status, identify probable causes of

  6. Phenotypic mapping of metabolic profiles using self-organizing maps of high-dimensional mass spectrometry data.

    PubMed

    Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A

    2014-07-01

    A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed through the use of comparative statistics or multivariate statistical analysis-based approaches; however, each falls short in representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows for the unsupervised clustering, and subsequent prioritization of, correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate the application of this method to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat sera profiles.
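
    A minimal sketch of the self-organizing map step, assuming the third-party minisom package; the random feature table (features x samples) and the grid size stand in for a real metabolite intensity matrix.

    import numpy as np
    from minisom import MiniSom

    rng = np.random.default_rng(0)
    features = rng.random((200, 16))   # 200 hypothetical metabolite features x 16 samples

    som = MiniSom(10, 10, input_len=16, sigma=1.0, learning_rate=0.5, random_seed=0)
    som.train_random(features, num_iteration=5000)

    # Map each feature to its best-matching unit; co-located features vary similarly
    # across samples and can be prioritized together, as in a metabolic heat map.
    units = np.array([som.winner(f) for f in features])
    print(units[:5])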

  7. VESsel GENeration Analysis (VESGEN): Innovative Vascular Mappings for Astronaut Exploration Health Risks and Human Terrestrial Medicine

    NASA Technical Reports Server (NTRS)

    Parsons-Wingerter, Patricia; Kao, David; Valizadegan, Hamed; Martin, Rodney; Murray, Matthew C.; Ramesh, Sneha; Sekaran, Srinivaas

    2017-01-01

    Currently, astronauts face significant health risks in future long-duration exploration missions such as colonizing the Moon and traveling to Mars. These risks include greatly increased radiation exposure beyond the low Earth orbit (LEO) of the ISS, and visual and ocular impairments in response to microgravity environments. The cardiovascular system is a key mediator in human physiological responses to radiation and microgravity. Moreover, blood vessels are necessarily involved in the progression and treatment of vascular-dependent terrestrial diseases such as cancer, coronary vessel disease, wound healing, reproductive disorders, and diabetes. NASA developed innovative, globally requested beta-level software, VESsel GENeration Analysis (VESGEN), to map and quantify vascular remodeling for application to astronaut and terrestrial health challenges. VESGEN mappings of branching vascular trees and networks are based on a weighted multi-parametric analysis derived from vascular physiological branching rules. Complex vascular branching patterns are determined by biological signaling mechanisms together with the fluid mechanics of multi-phase laminar blood flow.

  8. Measuring Agreement in Participatory Mapping

    ERIC Educational Resources Information Center

    Caspersen, Janna R.; Van Holt, Tracy; Johnson, Jeffrey C.

    2017-01-01

    This article offers a way to measure agreement in participatory mapping. We asked subject matter experts (SMEs) to draw where Sudanese ethnic groups were located on a map. We then used an eigenanalysis approach to determine whether SMEs agreed on the location of ethnic groups. We used minimum residual factor analysis to assess the extent of…

  9. Quasipolynomial generalization of Lotka-Volterra mappings

    NASA Astrophysics Data System (ADS)

    Hernández-Bermejo, Benito; Brenig, Léon

    2002-07-01

    In recent years, it has been shown that Lotka-Volterra mappings constitute a valuable tool from both the theoretical and the applied points of view, with developments in very diverse fields such as physics, population dynamics, chemistry and economics. The purpose of this work is to demonstrate that many of the most important ideas and algebraic methods that constitute the basis of the quasipolynomial formalism (originally conceived for the analysis of ordinary differential equations) can be extended into the mapping domain. The extension of the formalism into the discrete-time context is remarkable, since the quasipolynomial methodology had never been shown to be applicable beyond the differential case. It is demonstrated that Lotka-Volterra mappings play a central role in the quasipolynomial formalism for the discrete-time case. Moreover, the extension of the formalism into the discrete-time domain allows a significant generalization of Lotka-Volterra mappings as well as a wholesale transfer of algebraic methods into the discrete-time context. The result is a novel and more general conceptual framework for the understanding of Lotka-Volterra mappings, and a new range of possibilities opens up not only for the theoretical analysis of Lotka-Volterra mappings and their generalizations, but also for the development of new applications.

  10. Short-Arc Analysis of Intersatellite Tracking Data in a Gravity Mapping Mission

    NASA Technical Reports Server (NTRS)

    Rowlands, David D.; Ray, Richard D.; Chinn, Douglas S.; Lemoine, Frank G.; Smith, David E. (Technical Monitor)

    2001-01-01

    A technique for the analysis of low-low intersatellite range-rate data in a gravity mapping mission is explored. The technique is based on standard tracking data analysis for orbit determination but uses a spherical coordinate representation of the 12 epoch state parameters describing the baseline between the two satellites. This representation of the state parameters is exploited to allow the intersatellite range-rate analysis to benefit from information provided by other tracking data types without large simultaneous multiple data type solutions. The technique appears especially valuable for estimating gravity from short arcs (e.g., less than 15 minutes) of data. Gravity recovery simulations which use short arcs are compared with those using arcs a day in length. For a high-inclination orbit, the short-arc analysis recovers low-order gravity coefficients remarkably well, although higher order terms, especially sectorial terms, are less accurate. Simulations suggest that either long or short arcs of GRACE data are likely to improve parts of the geopotential spectrum by orders of magnitude.

  11. An investigation of the effects of aft blowing on a 3.0 caliber tangent ogive body at high angles of attack. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Gittner, Nathan M.

    1992-01-01

    An experimental investigation of the effects of aft blowing on the asymmetric vortex flow of a slender, axisymmetric body at high angles of attack was conducted. A 3.0 caliber tangent ogive body fitted with a cylindrical afterbody was tested in a wind tunnel under subsonic, laminar flow test conditions. Asymmetric blowing from both a single nozzle and a double nozzle configuration, positioned near the body apex, was investigated. Aft blowing was observed to alter the vortex asymmetry by moving the blowing-side vortex closer to the body surface while moving the non-blowing-side vortex further away from the body. The effect of increasing the blowing coefficient was to move the blowing-side vortex closer to the body surface at a more upstream location. The data also showed that blowing was more effective in altering the initial vortex asymmetry at the higher angles of attack than at the lower. The effects of changing the nozzle exit geometry were investigated and it was observed that blowing from a nozzle with a low, broad exit geometry was more effective in reducing the vortex asymmetry than blowing from a high, narrow exit geometry.

  12. Assessment of histological differentiation in gastric cancers using whole-volume histogram analysis of apparent diffusion coefficient maps.

    PubMed

    Zhang, Yujuan; Chen, Jun; Liu, Song; Shi, Hua; Guan, Wenxian; Ji, Changfeng; Guo, Tingting; Zheng, Huanhuan; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng; Liu, Tian

    2017-02-01

    To investigate the efficacy of histogram analysis of the entire tumor volume in apparent diffusion coefficient (ADC) maps for differentiating between histological grades in gastric cancer. Seventy-eight patients with gastric cancer were enrolled in a retrospective 3.0T magnetic resonance imaging (MRI) study. ADC maps were obtained at two different b values (0 and 1000 s/mm²) for each patient. Tumors were delineated on each slice of the ADC maps, and a histogram for the entire tumor volume was subsequently generated. A series of histogram parameters (eg, skew and kurtosis) were calculated and correlated with the histological grade of the surgical specimen. The diagnostic performance of each parameter for distinguishing poorly from moderately well-differentiated gastric cancers was assessed by using the area under the receiver operating characteristic curve (AUC). There were significant differences in the 5th, 10th, 25th, and 50th percentiles, skew, and kurtosis between poorly and well-differentiated gastric cancers (P < 0.05). There were correlations between the degrees of differentiation and histogram parameters, including the 10th percentile, skew, kurtosis, and max frequency; the correlation coefficients were 0.273, -0.361, -0.339, and -0.370, respectively. Among all the histogram parameters, the max frequency had the largest AUC value, which was 0.675. Histogram analysis of the ADC maps on the basis of the entire tumor volume can be useful in differentiating between histological grades for gastric cancer. 4 J. Magn. Reson. Imaging 2017;45:440-449. © 2016 International Society for Magnetic Resonance in Medicine.
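
    A minimal sketch of the whole-volume histogram parameters reported above (percentiles, skew, kurtosis) and of summarizing one parameter's diagnostic performance as an ROC AUC; the voxel values and the small cohort are synthetic, and scipy/scikit-learn are assumed.

    import numpy as np
    from scipy.stats import skew, kurtosis
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    def histogram_features(adc_voxels):
        """Whole-volume summary statistics over all tumour voxels in an ADC map."""
        p5, p10, p25, p50 = np.percentile(adc_voxels, [5, 10, 25, 50])
        return {"p5": p5, "p10": p10, "p25": p25, "p50": p50,
                "skew": skew(adc_voxels), "kurtosis": kurtosis(adc_voxels)}

    # Two hypothetical tumours (poorly vs. well differentiated).
    poor = rng.normal(1.0, 0.25, 5000)
    well = rng.normal(1.3, 0.20, 5000)
    print(histogram_features(poor), histogram_features(well))

    # Diagnostic performance of one parameter across a small synthetic cohort,
    # summarized as the area under the ROC curve (AUC).
    labels = np.array([0] * 30 + [1] * 30)
    p10_values = np.concatenate([rng.normal(0.9, 0.1, 30), rng.normal(1.2, 0.1, 30)])
    print("AUC:", round(roc_auc_score(labels, p10_values), 3))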

  13. Mapping agroecosystem zone using remote sensing for food security analysis in Bantul district Daerah Istimewa Yogyakarta

    NASA Astrophysics Data System (ADS)

    Murti, Sigit Heru

    2017-10-01

    Food security is one of the most important issues for Indonesia. The huge population and high population growth rate have made food security a critical issue. This paper describes the application of remote sensing data to (1) map agroecosystem zones in Bantul District, Special Region of Yogyakarta, Indonesia in 2012 and (2) analyze food security in the study area based on the resulting agro-ecosystem map. Bantul District was selected as the pilot area because it is among the highest food crop production areas in the province. An ALOS AVNIR-2 image acquired on 15 June 2010 was integrated with the Indonesian topographic map (RBI map), a soil type map, and a slope steepness map. Population statistics were also used to calculate food needs. A field survey was conducted to obtain crop field productivity information for each agro-ecosystem zone and to assess the accuracy of the model. This research indicates that (1) Bantul District can be divided into three agroecosystem zones, each with a unique topographic configuration and soil type composition, and (2) Bantul District is categorized as a food-secure area, since rice production in 2012 covered the food needs of the population with a surplus of 33,208.6 tonnes of rice. However, when the analysis was conducted at the sub-district level, there were four sub-districts with food insecurity, where food needs surpass rice production: Kasihan Sub-district (-5,598.4 t), Banguntapan Sub-district (-2,483.4 t), Pajangan Sub-district (-1,039.6 t) and Dlingo Sub-district (-798.7 t).

  14. Design of an image encryption scheme based on a multiple chaotic map

    NASA Astrophysics Data System (ADS)

    Tong, Xiao-Jun

    2013-07-01

    To address the degradation of chaos under limited computer precision and the small key space of the Cat map, this paper presents a chaotic map based on topological conjugacy, whose chaotic characteristics are proved using Devaney's definition. To produce a large key space, a Cat map variant named the block Cat map is also designed for the permutation process based on multiple-dimensional chaotic maps. The image encryption algorithm follows a permutation-substitution structure, and each key is controlled by a different chaotic map. Entropy analysis, differential analysis, weak-key analysis, statistical analysis, cipher randomness analysis, and cipher sensitivity analysis with respect to key and plaintext are used to test the security of the new image encryption scheme. Comparison of the proposed scheme with AES, DES and logistic-map encryption methods leads to the conclusion that the image encryption method overcomes the low precision of one-dimensional chaotic functions and offers higher speed and higher security.
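
    A toy permutation-substitution cipher driven by a logistic map, included only to illustrate the general structure described above; it is not the authors' block Cat map scheme and is not secure for real use.

    import numpy as np

    def logistic_stream(x0, n, mu=3.99):
        """Iterate the logistic map x -> mu*x*(1-x) and return n chaotic values."""
        x, out = x0, np.empty(n)
        for i in range(n):
            x = mu * x * (1.0 - x)
            out[i] = x
        return out

    def encrypt(img, key=0.3456):
        flat = img.flatten()
        stream = logistic_stream(key, flat.size)
        perm = np.argsort(stream)                     # permutation stage
        keystream = (stream * 255).astype(np.uint8)   # substitution stage (XOR)
        return flat[perm] ^ keystream

    def decrypt(cipher, key=0.3456):
        stream = logistic_stream(key, cipher.size)
        perm = np.argsort(stream)
        keystream = (stream * 255).astype(np.uint8)
        flat = np.empty_like(cipher)
        flat[perm] = cipher ^ keystream               # undo permutation after XOR
        return flat

    img = np.arange(64, dtype=np.uint8).reshape(8, 8)
    cipher = encrypt(img)
    assert np.array_equal(decrypt(cipher).reshape(8, 8), img)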

  15. Tularosa Basin Play Fairway Analysis: Weights of Evidence; Mineralogy, and Temperature Anomaly Maps

    DOE Data Explorer

    Adam Brandt

    2015-11-15

    This submission has two shapefiles and a TIFF image. The weights of evidence analysis was applied to data representing heat of the earth and fracture permeability, using training sites around the Southwest; this is shown in the TIFF image. A shapefile of surface temperature anomalies was derived from statistical analysis of Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) thermal infrared data that had been converted to surface temperatures; these anomalies have not been field checked. The second shapefile shows outcrop mineralogy that was originally mapped by the New Mexico Bureau of Geology and Mineral Resources and supplemented with mineralogic information related to rock fracability risk for EGS. Further metadata can be found within each file.

  16. Wide-field lensing mass maps from Dark Energy Survey science verification data: Methodology and detailed analysis

    DOE PAGES

    Vikram, V.

    2015-07-29

    Weak gravitational lensing allows one to reconstruct the spatial distribution of the projected mass density across the sky. These “mass maps” provide a powerful tool for studying cosmology as they probe both luminous and dark matter. In this paper, we present a weak lensing mass map reconstructed from shear measurements in a 139 deg² area from the Dark Energy Survey (DES) science verification data. We compare the distribution of mass with that of the foreground distribution of galaxies and clusters. The overdensities in the reconstructed map correlate well with the distribution of optically detected clusters. We demonstrate that candidate superclusters and voids along the line of sight can be identified, exploiting the tight scatter of the cluster photometric redshifts. We cross-correlate the mass map with a foreground magnitude-limited galaxy sample from the same data. Our measurement gives results consistent with mock catalogs from N-body simulations that include the primary sources of statistical uncertainties in the galaxy, lensing, and photo-z catalogs. The statistical significance of the cross-correlation is at the 6.8σ level with 20 arcminute smoothing. We find that the contribution of systematics to the lensing mass maps is generally within measurement uncertainties. In this study, we analyze less than 3% of the final area that will be mapped by the DES; the tools and analysis techniques developed in this paper can be applied to forthcoming larger data sets from the survey.

  17. Assessing the methods needed for improved dengue mapping: a SWOT analysis.

    PubMed

    Attaway, David Frost; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Waters, Nigel M

    2014-01-01

    Dengue fever, a mosquito-borne viral infection, is a growing threat to human health in tropical and subtropical areas worldwide. There is a demand from public officials for maps that capture the current distribution of dengue and maps that analyze risk factors to predict the future burden of disease. To identify relevant articles, we searched Google Scholar, PubMed, BioMed Central, and WHOLIS (World Health Organization Library Database) for published articles with a specific set of dengue criteria between January 2002 and July 2013. After evaluating the currently available dengue models, we identified four key barriers to the creation of high-quality dengue maps: (1) data limitations related to the expense of diagnosing and reporting dengue cases in places where health information systems are underdeveloped; (2) issues related to the use of socioeconomic proxies in places with limited dengue incidence data; (3) mosquito ranges which may be changing as a result of climate changes; and (4) the challenges of mapping dengue events at a variety of scales. An ideal dengue map will present endemic and epidemic dengue information from both rural and urban areas. Overcoming the current barriers requires expanded collaboration and data sharing by geographers, epidemiologists, and entomologists. Enhanced mapping techniques would allow for improved visualizations of dengue rates and risks.

  18. Scanning evanescent electro-magnetic microscope

    DOEpatents

    Xiang, Xiao-Dong; Gao, Chen; Schultz, Peter G.; Wei, Tao

    2003-01-01

    A novel scanning microscope is described that uses near-field evanescent electromagnetic waves to probe sample properties. The novel microscope is capable of high resolution imaging and quantitative measurements of the electrical properties of the sample. The inventive scanning evanescent wave electromagnetic microscope (SEMM) can map dielectric constant, tangent loss, conductivity, complex electrical impedance, and other electrical parameters of materials. The quantitative map corresponds to the imaged detail. The novel microscope can be used to measure electrical properties of both dielectric and electrically conducting materials.

  19. Scanning evanescent electro-magnetic microscope

    DOEpatents

    Xiang, Xiao-Dong; Gao, Chen

    2001-01-01

    A novel scanning microscope is described that uses near-field evanescent electromagnetic waves to probe sample properties. The novel microscope is capable of high resolution imaging and quantitative measurements of the electrical properties of the sample. The inventive scanning evanescent wave electromagnetic microscope (SEMM) can map dielectric constant, tangent loss, conductivity, complex electrical impedance, and other electrical parameters of materials. The quantitative map corresponds to the imaged detail. The novel microscope can be used to measure electrical properties of both dielectric and electrically conducting materials.

  20. GIS-aided low flow mapping

    NASA Astrophysics Data System (ADS)

    Saghafian, B.; Mohammadi, A.

    2003-04-01

    Most studies involving water resources allocation, water quality, hydropower generation, and allowable water withdrawal and transfer require estimation of low flows. Normally, frequency analysis of at-station D-day low flow data is performed to derive various T-yr return period values. However, this analysis is restricted to the locations of hydrometric stations where flow discharge is measured. Regional analysis is therefore conducted to relate the at-station low flow quantiles to watershed characteristics, which enables the transposition of low flow quantiles to ungauged sites. Nevertheless, a procedure to map the regional regression relations for the entire stream network, within the bounds of the relations, is particularly helpful when weighing alternative sites for a water resources project. In this study, we used a GIS-aided procedure for low flow mapping in Gilan province, part of the northern region of Iran. Gilan has a humid climate with an average annual precipitation of 1100 mm. Although rich in water resources, the highly populated area is quite dependent on a minimum amount of water to sustain the vast rice farming and to maintain the required flow discharge for water quality purposes. To carry out the low flow analysis, a total of 36 hydrometric stations with sufficient and reliable discharge data were identified in the region; the average watershed area was 250 sq. km. Log Pearson type 3 was found to be the best distribution for flow durations over 60 days, while the log-normal distribution fitted the shorter-duration series well. Low flows with return periods of 2, 5, 10, 25, 50, and 100 years were then computed. Cluster analysis identified two homogeneous areas. Although various watershed parameters were examined in the factor analysis, the results showed that watershed area, length of the main stream, and annual precipitation were the most effective low flow parameters. The regression equations were then mapped with the aid of GIS based on flow accumulation maps
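
    A minimal sketch of the at-station step, fitting a log-Pearson type 3 distribution to annual minimum flows and reading off T-year low-flow quantiles; the flow series is synthetic and scipy's pearson3 distribution is assumed as the fitting tool.

    import numpy as np
    from scipy.stats import pearson3

    rng = np.random.default_rng(0)
    annual_min_flow = rng.lognormal(mean=1.5, sigma=0.4, size=36)   # m^3/s, hypothetical

    log_q = np.log10(annual_min_flow)
    skew, loc, scale = pearson3.fit(log_q)          # Pearson 3 fitted to log-flows

    # T-year low flow: the quantile with non-exceedance probability 1/T.
    for T in (2, 5, 10, 25, 50, 100):
        q_T = 10 ** pearson3.ppf(1.0 / T, skew, loc=loc, scale=scale)
        print(f"{T:>3}-yr low flow ~ {q_T:.2f} m^3/s")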

  1. Comparative mapping of Raphanus sativus genome using Brassica markers and quantitative trait loci analysis for the Fusarium wilt resistance trait.

    PubMed

    Yu, Xiaona; Choi, Su Ryun; Ramchiary, Nirala; Miao, Xinyang; Lee, Su Hee; Sun, Hae Jeong; Kim, Sunggil; Ahn, Chun Hee; Lim, Yong Pyo

    2013-10-01

    Fusarium wilt (FW), caused by the soil-borne fungal pathogen Fusarium oxysporum is a serious disease in cruciferous plants, including the radish (Raphanus sativus). To identify quantitative trait loci (QTL) or gene(s) conferring resistance to FW, we constructed a genetic map of R. sativus using an F2 mapping population derived by crossing the inbred lines '835' (susceptible) and 'B2' (resistant). A total of 220 markers distributed in 9 linkage groups (LGs) were mapped in the Raphanus genome, covering a distance of 1,041.5 cM with an average distance between adjacent markers of 4.7 cM. Comparative analysis of the R. sativus genome with that of Arabidopsis thaliana and Brassica rapa revealed 21 and 22 conserved syntenic regions, respectively. QTL mapping detected a total of 8 loci conferring FW resistance that were distributed on 4 LGs, namely, 2, 3, 6, and 7 of the Raphanus genome. Of the detected QTL, 3 QTLs (2 on LG 3 and 1 on LG 7) were constitutively detected throughout the 2-year experiment. QTL analysis of LG 3, flanked by ACMP0609 and cnu_mBRPGM0085, showed a comparatively higher logarithm of the odds (LOD) value and percentage of phenotypic variation. Synteny analysis using the linked markers to this QTL showed homology to A. thaliana chromosome 3, which contains disease-resistance gene clusters, suggesting conservation of resistance genes between them.

  2. Geologic Mapping of Ascraeus Mons, Mars

    NASA Technical Reports Server (NTRS)

    Mohr, K. J.; Williams, D. A.; Garry, W. B.

    2016-01-01

    Ascraeus Mons (AM) is the northeasternmost large shield volcano in the Tharsis province on Mars. We are funded by NASA's Mars Data Analysis Program to complete a digital geologic map based on the mapping style. Previous mapping of a limited area of these volcanoes using HRSC images (13-25 m/pixel) revealed a diverse distribution of volcanic landforms within the calderas, along the flanks, rift aprons, and surrounding plains. The general scientific objective on which this mapping is based is to show the different lava flow morphologies across AM, to better understand its evolution and geologic history.

  3. Adjusting stream-sediment geochemical maps in the Austrian Bohemian Massif by analysis of variance

    USGS Publications Warehouse

    Davis, J.C.; Hausberger, G.; Schermann, O.; Bohling, G.

    1995-01-01

    The Austrian portion of the Bohemian Massif is a Precambrian terrane composed mostly of highly metamorphosed rocks intruded by a series of granitoids that are petrographically similar. Rocks are exposed poorly and the subtle variations in rock type are difficult to map in the field. A detailed geochemical survey of stream sediments in this region has been conducted and included as part of the Geochemischer Atlas der Republik Österreich, and the variations in stream sediment composition may help refine the geological interpretation. In an earlier study, multivariate analysis of variance (MANOVA) was applied to the stream-sediment data in order to minimize unwanted sampling variation and emphasize relationships between stream sediments and rock types in sample catchment areas. The estimated coefficients were used successfully to correct for the sampling effects throughout most of the region, but also introduced an overcorrection in some areas that seems to result from consistent but subtle differences in composition of specific rock types. By expanding the model to include an additional factor reflecting the presence of a major tectonic unit, the Rohrbach block, the overcorrection is removed. This iterative process simultaneously refines both the geochemical map by removing extraneous variation and the geological map by suggesting a more detailed classification of rock types. © 1995 International Association for Mathematical Geology.

  4. Design and analysis for thematic map accuracy assessment: Fundamental principles

    Treesearch

    Stephen V. Stehman; Raymond L. Czaplewski

    1998-01-01

    Land-cover maps are used in numerous natural resource applications to describe the spatial distribution and pattern of land-cover, to estimate areal extent of various cover classes, or as input into habitat suitability models, land-cover change analyses, hydrological models, and risk analyses. Accuracy assessment quantifies data quality so that map users may evaluate...

  5. Physical mapping and BAC-end sequence analysis provide initial insights into the flax (Linum usitatissimum L.) genome

    PubMed Central

    2011-01-01

    Background Flax (Linum usitatissimum L.) is an important source of oil rich in omega-3 fatty acids, which have proven health benefits and utility as an industrial raw material. Flax seeds also contain lignans which are associated with reducing the risk of certain types of cancer. Its bast fibres have broad industrial applications. However, genomic tools needed for molecular breeding were non existent. Hence a project, Total Utilization Flax GENomics (TUFGEN) was initiated. We report here the first genome-wide physical map of flax and the generation and analysis of BAC-end sequences (BES) from 43,776 clones, providing initial insights into the genome. Results The physical map consists of 416 contigs spanning ~368 Mb, assembled from 32,025 fingerprints, representing roughly 54.5% to 99.4% of the estimated haploid genome (370-675 Mb). The N50 size of the contigs was estimated to be ~1,494 kb. The longest contig was ~5,562 kb comprising 437 clones. There were 96 contigs containing more than 100 clones. Approximately 54.6 Mb representing 8-14.8% of the genome was obtained from 80,337 BES. Annotation revealed that a large part of the genome consists of ribosomal DNA (~13.8%), followed by known transposable elements at 6.1%. Furthermore, ~7.4% of sequence was identified to harbour novel repeat elements. Homology searches against flax-ESTs and NCBI-ESTs suggested that ~5.6% of the transcriptome is unique to flax. A total of 4064 putative genomic SSRs were identified and are being developed as novel markers for their use in molecular breeding. Conclusion The first genome-wide physical map of flax constructed with BAC clones provides a framework for accessing target loci with economic importance for marker development and positional cloning. Analysis of the BES has provided insights into the uniqueness of the flax genome. Compared to other plant genomes, the proportion of rDNA was found to be very high whereas the proportion of known transposable elements was low. The SSRs

  6. Physical mapping and BAC-end sequence analysis provide initial insights into the flax (Linum usitatissimum L.) genome.

    PubMed

    Ragupathy, Raja; Rathinavelu, Rajkumar; Cloutier, Sylvie

    2011-05-09

    Flax (Linum usitatissimum L.) is an important source of oil rich in omega-3 fatty acids, which have proven health benefits and utility as an industrial raw material. Flax seeds also contain lignans which are associated with reducing the risk of certain types of cancer. Its bast fibres have broad industrial applications. However, genomic tools needed for molecular breeding were non existent. Hence a project, Total Utilization Flax GENomics (TUFGEN) was initiated. We report here the first genome-wide physical map of flax and the generation and analysis of BAC-end sequences (BES) from 43,776 clones, providing initial insights into the genome. The physical map consists of 416 contigs spanning ~368 Mb, assembled from 32,025 fingerprints, representing roughly 54.5% to 99.4% of the estimated haploid genome (370-675 Mb). The N50 size of the contigs was estimated to be ~1,494 kb. The longest contig was ~5,562 kb comprising 437 clones. There were 96 contigs containing more than 100 clones. Approximately 54.6 Mb representing 8-14.8% of the genome was obtained from 80,337 BES. Annotation revealed that a large part of the genome consists of ribosomal DNA (~13.8%), followed by known transposable elements at 6.1%. Furthermore, ~7.4% of sequence was identified to harbour novel repeat elements. Homology searches against flax-ESTs and NCBI-ESTs suggested that ~5.6% of the transcriptome is unique to flax. A total of 4064 putative genomic SSRs were identified and are being developed as novel markers for their use in molecular breeding. The first genome-wide physical map of flax constructed with BAC clones provides a framework for accessing target loci with economic importance for marker development and positional cloning. Analysis of the BES has provided insights into the uniqueness of the flax genome. Compared to other plant genomes, the proportion of rDNA was found to be very high whereas the proportion of known transposable elements was low. The SSRs identified from BES will be

  7. Pressure Mapping and Efficiency Analysis of an EPPLER 857 Hydrokinetic Turbine

    NASA Astrophysics Data System (ADS)

    Clark, Tristan

    A conceptual energy ship is presented to provide renewable energy. The ship, driven by the wind, drags a hydrokinetic turbine through the water. The power generated is used to run electrolysis on board, and the resultant hydrogen is taken back to shore to be used as an energy source. The basin efficiency (power / (thrust × velocity)) of the hydrokinetic turbine plays a vital role in this process. In order to extract the maximum allowable power from the flow, the blades need to be optimized. The structural analysis of the blade is important, as the blade will undergo high pressure loads from the water. A procedure for analysis of a preliminary hydrokinetic turbine blade design is developed. The blade was designed by a non-optimized Blade Element Momentum Theory (BEMT) code. Six simulations were run, with varying mesh resolution, turbulence models, and flow region size. The procedure provides a detailed explanation of the entire process, from geometry and mesh generation to post-processing analysis tools. The efficiency results from the simulations are used to study the effects of mesh resolution, flow region size, and turbulence model, and are compared to the BEMT model design targets. Static pressure maps are created that can be used for structural analysis of the blades.

  8. Mapping Forest Inventory and Analysis forest land use: timberland, reserved forest land, and other forest land

    Treesearch

    Mark D. Nelson; John Vissage

    2007-01-01

    The Forest Inventory and Analysis (FIA) program produces area estimates of forest land use within three subcategories: timberland, reserved forest land, and other forest land. Mapping these subcategories of forest land requires the ability to spatially distinguish productive from unproductive land, and reserved from nonreserved land. FIA field data were spatially...

  9. Mapping Argonaute and conventional RNA-binding protein interactions with RNA at single-nucleotide resolution using HITS-CLIP and CIMS analysis

    PubMed Central

    Moore, Michael; Zhang, Chaolin; Gantman, Emily Conn; Mele, Aldo; Darnell, Jennifer C.; Darnell, Robert B.

    2014-01-01

    Summary Identifying sites where RNA binding proteins (RNABPs) interact with target RNAs opens the door to understanding the vast complexity of RNA regulation. UV-crosslinking and immunoprecipitation (CLIP) is a transformative technology in which RNAs purified from in vivo cross-linked RNA-protein complexes are sequenced to reveal footprints of RNABP:RNA contacts. CLIP combined with high throughput sequencing (HITS-CLIP) is a generalizable strategy to produce transcriptome-wide RNA binding maps with higher accuracy and resolution than standard RNA immunoprecipitation (RIP) profiling or purely computational approaches. Applying CLIP to Argonaute proteins has expanded the utility of this approach to mapping binding sites for microRNAs and other small regulatory RNAs. Finally, recent advances in data analysis take advantage of crosslinked-induced mutation sites (CIMS) to refine RNA-binding maps to single-nucleotide resolution. Once IP conditions are established, HITS-CLIP takes approximately eight days to prepare RNA for sequencing. Established pipelines for data analysis, including for CIMS, take 3-4 days. PMID:24407355

  10. Seismic slope-performance analysis: from hazard map to decision support system

    USGS Publications Warehouse

    Miles, Scott B.; Keefer, David K.; Ho, Carlton L.

    1999-01-01

    In response to the growing recognition of engineers and decision-makers of the regional effects of earthquake-induced landslides, this paper presents a general approach to conducting seismic landslide zonation, based on the popular Newmark's sliding block analogy for modeling coherent landslides. Four existing models based on the sliding block analogy are compared. The comparison shows that the models forecast notably different levels of slope performance. Considering this discrepancy along with the limitations of static maps as a decision tool, a spatial decision support system (SDSS) for seismic landslide analysis is proposed, which will support investigations over multiple scales for any number of earthquake scenarios and input conditions. Most importantly, the SDSS will allow use of any seismic landslide analysis model and zonation approach. Developments associated with the SDSS will produce an object-oriented model for encapsulating spatial data, an object-oriented specification to allow construction of models using modular objects, and a direct-manipulation, dynamic user-interface that adapts to the particular seismic landslide model configuration.

  11. Robot Acquisition of Active Maps Through Teleoperation and Vector Space Analysis

    NASA Technical Reports Server (NTRS)

    Peters, Richard Alan, II

    2003-01-01

    The work performed under this contract was in the area of intelligent robotics. The problem being studied was the acquisition of intelligent behaviors by a robot. The method was to acquire action maps that describe tasks as sequences of reflexive behaviors. Action maps (a.k.a. topological maps) are graphs whose nodes represent sensorimotor states and whose edges represent the motor actions that cause the robot to proceed from one state to the next. The maps were acquired by the robot after being teleoperated or otherwise guided by a person through a task several times. During a guided task, the robot records all its sensorimotor signals. The signals from several task trials are partitioned into episodes of static behavior. The corresponding episodes from each trial are averaged to produce a task description as a sequence of characteristic episodes. The sensorimotor states that indicate episode boundaries become the nodes, and the static behaviors, the edges. It was demonstrated that if compound maps are constructed from a set of tasks then the robot can perform new tasks in which it was never explicitly trained.
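
    A minimal sketch of the episode-to-graph idea: a recorded motor signal is split into episodes of roughly constant behavior and the episode sequence becomes a directed graph. The signal, the change threshold, and the use of networkx are illustrative assumptions, not the contract's actual implementation.

    import numpy as np
    import networkx as nx

    # Hypothetical 1-D motor command recorded during a teleoperated trial.
    signal = np.concatenate([np.full(50, 0.0), np.full(80, 0.7), np.full(60, 0.2)])

    # Episode boundaries = samples where the commanded behavior changes.
    boundaries = np.flatnonzero(np.abs(np.diff(signal)) > 0.05) + 1
    episodes = np.split(signal, boundaries)

    graph = nx.DiGraph()
    for i, ep in enumerate(episodes):
        graph.add_node(i, mean_command=float(ep.mean()), length=len(ep))
        if i > 0:
            graph.add_edge(i - 1, i)    # motor action leading to the next state

    print(graph.nodes(data=True))
    print(list(graph.edges))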

  12. Spectral Mixture Analysis to map burned areas in Brazil's deforestation arc from 1992 to 2011

    NASA Astrophysics Data System (ADS)

    Antunes Daldegan, G.; Ribeiro, F.; Roberts, D. A.

    2017-12-01

    The two most extensive biomes in South America, the Amazon and the Cerrado, are subject to several fire events every dry season. Both are known for their ecological and environmental importance. However, due to intensive human occupation over the last four decades, they have been facing high deforestation rates. The Cerrado biome is adapted to fire and is considered a fire-dependent landscape. In contrast, the Amazon, as a tropical moist broadleaf forest, does not display similar characteristics and is classified as a fire-sensitive landscape. Nonetheless, studies have shown that forest areas that have already been burned become more prone to recurrent burns. Remote sensing has been extensively used by a large number of researchers studying fire occurrence at the global scale, as well as in both landscapes mentioned above. Digital image processing aimed at mapping fire activity has been applied to imagery from sensors of various spatial, temporal, and spectral resolutions. More specifically, several studies have used Landsat data to map fire scars in the Amazon forest and in the Cerrado. An advantage of using Landsat data is the potential to map fire scars at a finer spatial resolution, compared with products derived from sensors featuring better temporal resolution but coarser spatial resolution, such as MODIS (Moderate Resolution Imaging Spectroradiometer) and GOES (Geostationary Operational Environmental Satellite). This study aimed to map burned areas in the Amazon-Cerrado transition zone by applying Spectral Mixture Analysis to Landsat imagery over a period of 20 years (1992-2011). The study area is a subset of this ecotone, centered on the State of Mato Grosso. Taking advantage of the Landsat 5 TM and Landsat 7 ETM+ imagery collections available on the Google Earth Engine platform and applying Spectral Mixture Analysis (SMA) techniques to them made it possible to model fire scar fractions and delimit burned areas. Overlaying
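
    A minimal sketch of linear spectral mixture analysis for a single pixel, solving for endmember fractions with non-negative least squares; the endmember spectra, the band count, and the burned-area thresholds are invented for illustration only.

    import numpy as np
    from scipy.optimize import nnls

    # Columns = hypothetical endmember reflectance spectra over 6 bands:
    # green vegetation, soil, burned/char.
    endmembers = np.array([
        [0.04, 0.10, 0.30],
        [0.06, 0.14, 0.28],
        [0.05, 0.20, 0.25],
        [0.45, 0.28, 0.22],
        [0.25, 0.35, 0.18],
        [0.12, 0.30, 0.15],
    ])

    # A mixed pixel: 20% vegetation, 10% soil, 70% char.
    pixel = 0.2 * endmembers[:, 0] + 0.1 * endmembers[:, 1] + 0.7 * endmembers[:, 2]

    fractions, residual = nnls(endmembers, pixel)
    print("fractions:", np.round(fractions, 2), "residual:", round(residual, 4))

    # A simple burned-area rule: flag the pixel when the char fraction dominates
    # and the mixture model fits well (both thresholds are illustrative).
    is_burned = fractions[2] > 0.5 and residual < 0.05
    print("burned:", is_burned)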

  13. Cost Analysis of Spatial Data Production as Part of Business Intelligence Within the Mapping Department

    NASA Astrophysics Data System (ADS)

    Kisa, A.; Erkek, B.; Çolak, S.

    2012-07-01

    performance criteria are redefined, improvements to existing software are specified, and cost analysis is implemented as part of business intelligence. This paper describes activities such as cost analysis and its reflection in the Mapping Department as an example to share in the context of reorganization.

  14. Analysis of tsunami disaster map by Geographic Information System (GIS): Aceh Singkil-Indonesia

    NASA Astrophysics Data System (ADS)

    Farhan, A.; Akhyar, H.

    2017-02-01

    A tsunami risk map is used by stakeholders as a basis for deciding evacuation plans and evaluating disasters. Disaster maps for Aceh Singkil district, Aceh, Indonesia have been developed and analyzed using GIS tools. Overlay methods implemented through algorithms are used to produce hazard, vulnerability, and capacity maps, and finally a disaster risk map. The spatial inputs are topographic maps, an administrative map, and SRTM elevation data. The parameters include social, economic, and physical-environmental vulnerability, the level of exposed people, houses, public buildings, critical facilities, productive land, population density, sex ratio, poverty ratio, disability ratio, age group ratio, protected forest, natural forest, and mangrove forest. The results show nine villages at a high level of tsunami disaster risk, seventeen villages at a moderate level, and the remaining villages at a low level.
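
    A minimal raster-overlay sketch of the common risk formulation risk = hazard × vulnerability / capacity, followed by a three-class split; the 4 × 4 grids and class breaks are invented, and a real workflow would read co-registered GIS layers instead.

    import numpy as np

    hazard        = np.array([[0.9, 0.8, 0.3, 0.1],
                              [0.7, 0.6, 0.2, 0.1],
                              [0.4, 0.3, 0.2, 0.1],
                              [0.2, 0.1, 0.1, 0.0]])
    vulnerability = np.array([[0.8, 0.7, 0.5, 0.4],
                              [0.9, 0.6, 0.4, 0.3],
                              [0.5, 0.5, 0.3, 0.2],
                              [0.4, 0.3, 0.2, 0.2]])
    capacity      = np.full((4, 4), 0.5)          # shelters, evacuation routes, ...

    risk = hazard * vulnerability / capacity
    levels = np.digitize(risk, bins=[0.3, 0.8])   # 0 = low, 1 = moderate, 2 = high
    print(levels)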

  15. Analysis of Biogenic VOCs Emissions During the MAPS-Seoul Aircraft Field Campaign

    NASA Astrophysics Data System (ADS)

    Lee, Y.; Woo, J. H.; Kim, Y.; Bu, C.; Kim, J.; Kim, H. K.; Lee, M. H.; Eo, Y.

    2016-12-01

    The MAPS-Seoul (Megacity Air Pollution Studies-Seoul) aircraft mission was conducted in May-June 2016 to understand the atmospheric environment over South Korea. BVOC emission forecasts, along with other components, were produced daily in support of aircraft mission planning. Biogenic emissions, as well as anthropogenic ones, are a very important factor in modeling and analyzing the atmospheric environment, since more than 80% of global VOC emissions come from biogenic sources. This is also likely true for South Korea, since more than 70% of its land area is vegetated, such as forest and cropland. For model-based BVOC emission estimation, the geographical distributions of PFT (plant functional type) and LAI (leaf area index) are very important driving variables. In most cases, PFTs and LAI have been derived from low-resolution satellite-based information, which is not ideal for a relatively small area like South Korea. In this study, we developed more reliable Korean PFT and LAI covers derived from Korean land-cover maps and modeled satellite images. The WRF-MEGAN modeling framework over South Korea for the period of May to June 2016 was used to estimate a re-analysis BVOC emission field. An analysis of how different PFT and LAI inputs affect local and national biogenic emission estimates will be presented. Acknowledgements: This subject is supported by the Korea Ministry of Environment as the "Climate Change Correspondence Program". This work was supported by a grant from the National Institute of Environment Research (NIER), funded by the Ministry of Environment (MOE) of the Republic of Korea.

  16. Effects of habitat map generalization in biodiversity assessment

    NASA Technical Reports Server (NTRS)

    Stoms, David M.

    1992-01-01

    Species richness is being mapped as part of an inventory of biological diversity in California (i.e., gap analysis). Species distributions are modeled with a GIS on the basis of maps of each species' preferred habitats. Species richness is then tallied in equal-area sampling units. A GIS sensitivity analysis examined the effects of the level of generalization of the habitat map on the predicted distribution of species richness in the southern Sierra Nevada. As the habitat map was generalized, the number of habitat types mapped within grid cells tended to decrease with a corresponding decline in numbers of species predicted. Further, the ranking of grid cells in order of predicted numbers of species changed dramatically between levels of generalization. Areas predicted to be of greatest conservation value on the basis of species richness may therefore be sensitive to GIS data resolution.

  17. ASTER spectral analysis and lithologic mapping of the Khanneshin carbonatite volcano, Afghanistan

    USGS Publications Warehouse

    Mars, John C.; Rowan, Lawrence C.

    2011-01-01

    Advanced Spaceborne Thermal and Reflection Radiometer (ASTER) data of the early Quaternary Khanneshin carbonatite volcano located in southern Afghanistan were used to identify carbonate rocks within the volcano and to distinguish them from Neogene ferruginous polymict sandstone and argillite. The carbonatitic rocks are characterized by diagnostic CO3 absorption near 11.2 μm and 2.31–2.33 μm, whereas the sandstone, argillite, and adjacent alluvial deposits exhibit intense Si-O absorption near 8.7 μm caused mainly by quartz and Al-OH absorption near 2.20 μm due to muscovite and illite.Calcitic carbonatite was distinguished from ankeritic carbonatite in the short wave infrared (SWIR) region of the ASTER data due to a slight shift of the CO3 absorption feature toward 2.26 μm (ASTER band 7) in the ankeritic carbonatite spectra. Spectral assessment using ASTER SWIR data suggests that the area is covered by extensive carbonatite flows that contain calcite, ankerite, and muscovite, though some areas mapped as ankeritic carbonatite on a preexisting geologic map were not identified in the ASTER data. A contact aureole shown on the geologic map was defined using an ASTER false color composite image (R = 6, G = 3, B = 1) and a logical operator byte image. The contact aureole rocks exhibit Fe2+, Al-OH, and Fe, Mg-OH spectral absorption features at 1.65, 2.2, and 2.33 μm, respectively, which suggest that the contact aureole rocks contain muscovite, epidote, and chlorite. The contact aureole rocks were mapped using an Interactive Data Language (IDL) logical operator.A visible through short wave infrared (VNIR-SWIR) mineral and rock-type map based on matched filter, band ratio, and logical operator analysis illustrates: (1) laterally extensive calcitic carbonatite that covers most of the crater and areas northeast of the crater; (2) ankeritic carbonatite located southeast and north of the crater and some small deposits located within the crater; (3) agglomerate that
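
    A minimal sketch of the band-ratio plus logical-operator style of mapping: a relative-band-depth ratio targeting the 2.31-2.33 μm CO3 feature combined with a threshold mask. The synthetic band arrays, band choices, and thresholds are illustrative, not the published VNIR-SWIR mapping rules.

    import numpy as np

    rng = np.random.default_rng(0)
    band7, band8, band9 = (rng.random((50, 50)) * 0.3 + 0.1 for _ in range(3))

    # Relative band depth: shoulders (bands 7 and 9) over the CO3 absorption
    # assumed to fall in band 8; deeper absorption gives a larger ratio.
    carbonate_depth = (band7 + band9) / (2.0 * band8)

    # "Logical operator" style byte image combining the ratio with a brightness test.
    mask = (carbonate_depth > 1.1) & (band7 > 0.2)
    print("pixels flagged as carbonatite-like:", int(mask.sum()))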

  18. Advanced concentration analysis of atom probe tomography data: Local proximity histograms and pseudo-2D concentration maps.

    PubMed

    Felfer, Peter; Cairney, Julie

    2018-06-01

    Analysing the distribution of selected chemical elements with respect to interfaces is one of the most common tasks in data mining in atom probe tomography. This can be represented by 1D concentration profiles, 2D concentration maps or proximity histograms, which represent the concentration, density etc. of selected species as a function of the distance from a reference surface/interface. These are some of the most useful tools for the analysis of solute distributions in atom probe data. In this paper, we present extensions to the proximity histogram in the form of 'local' proximity histograms, calculated for selected parts of a surface, and pseudo-2D concentration maps, which are 2D concentration maps calculated on non-flat surfaces. This way, local concentration changes at interfaces and other structures can be assessed more effectively. Copyright © 2018 Elsevier B.V. All rights reserved.
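
    A minimal sketch of a proximity histogram: solute concentration binned by signed distance from a reference interface, here a flat plane at z = 0 with synthetic atom positions; real atom probe data would use the reconstructed positions and an iso-surface instead.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 20000
    positions = rng.uniform(-10, 10, size=(n, 3))           # nm, hypothetical reconstruction
    # Solute atoms enriched on one side of the interface at z = 0.
    is_solute = rng.random(n) < np.where(positions[:, 2] > 0, 0.15, 0.02)

    distance = positions[:, 2]                               # signed distance to the plane
    bins = np.arange(-10.0, 10.5, 0.5)
    idx = np.digitize(distance, bins)

    centres, conc = [], []
    for b in range(1, len(bins)):
        in_bin = idx == b
        if in_bin.any():
            centres.append(0.5 * (bins[b - 1] + bins[b]))
            conc.append(100.0 * is_solute[in_bin].mean())    # at.% of solute in this bin

    # 'centres' vs 'conc' is the proximity histogram (proxigram) for this interface.
    print(list(zip(np.round(centres, 2)[:3], np.round(conc, 2)[:3])))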

  19. Assessing the methods needed for improved dengue mapping: a SWOT analysis

    PubMed Central

    Attaway, David Frost; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Waters, Nigel M

    2014-01-01

    Introduction Dengue fever, a mosquito-borne viral infection, is a growing threat to human health in tropical and subtropical areas worldwide. There is a demand from public officials for maps that capture the current distribution of dengue and maps that analyze risk factors to predict the future burden of disease. Methods To identify relevant articles, we searched Google Scholar, PubMed, BioMed Central, and WHOLIS (World Health Organization Library Database) for published articles with a specific set of dengue criteria between January 2002 and July 2013. Results After evaluating the currently available dengue models, we identified four key barriers to the creation of high-quality dengue maps: (1) data limitations related to the expense of diagnosing and reporting dengue cases in places where health information systems are underdeveloped; (2) issues related to the use of socioeconomic proxies in places with limited dengue incidence data; (3) mosquito ranges which may be changing as a result of climate changes; and (4) the challenges of mapping dengue events at a variety of scales. Conclusion An ideal dengue map will present endemic and epidemic dengue information from both rural and urban areas. Overcoming the current barriers requires expanded collaboration and data sharing by geographers, epidemiologists, and entomologists. Enhanced mapping techniques would allow for improved visualizations of dengue rates and risks. PMID:25328585

  20. Mapping knowledge domains: Characterizing PNAS

    PubMed Central

    Boyack, Kevin W.

    2004-01-01

    A review of data mining and analysis techniques that can be used for the mapping of knowledge domains is given. Literature mapping techniques can be based on authors, documents, journals, words, and/or indicators. Most mapping questions are related to research assessment or to the structure and dynamics of disciplines or networks. Several mapping techniques are demonstrated on a data set comprising 20 years of papers published in PNAS. Data from a variety of sources are merged to provide unique indicators of the domain bounded by PNAS. By using funding source information and citation counts, it is shown that, on an aggregate basis, papers funded jointly by the U.S. Public Health Service (which includes the National Institutes of Health) and non-U.S. government sources outperform papers funded by other sources, including by the U.S. Public Health Service alone. Grant data from the National Institute on Aging show that, on average, papers from large grants are cited more than those from small grants, with performance increasing with grant amount. A map of the highest performing papers over the 20-year period was generated by using citation analysis. Changes and trends in the subjects of highest impact within the PNAS domain are described. Interactions between topics over the most recent 5-year period are also detailed. PMID:14963238

  1. Multifractal and Singularity Maps of soil surface moisture distribution derived from 2D image analysis.

    NASA Astrophysics Data System (ADS)

    Cumbrera, Ramiro; Millán, Humberto; Martín-Sotoca, Juan Jose; Pérez Soto, Luis; Sanchez, Maria Elena; Tarquis, Ana Maria

    2016-04-01

    Soil moisture distribution usually presents extreme variation at multiple spatial scales. Image analysis could be a useful tool for investigating these spatial patterns of apparent soil moisture at multiple resolutions. The objectives of the present work were (i) to describe the local scaling of apparent soil moisture distribution and (ii) to define apparent soil moisture patterns from vertical planes of Vertisol pit images. Two soil pits (0.70 m long × 0.60 m wide × 0.30 m deep) were excavated on a bare Mazic Pellic Vertisol. One was excavated in April 2011 and the other in May 2011, after 3 days of a moderate rainfall event. Digital photographs were taken of each Vertisol pit using a Kodak™ digital camera. The mean image size was 1600 × 945 pixels, with one physical pixel ≈373 μm of the photographed soil pit. For more details see Cumbrera et al. (2012). Geochemical exploration has found increasing interest in, and benefit from, using fractal (power-law) models to characterize geochemical distributions, notably the concentration-area (C-A) model (Cheng et al., 1994; Cheng, 2012). This method is based on singularity maps of a measure that, at each point, define areas with self-similar properties revealed as power-law relationships in concentration-area plots (the C-A method). The C-A method together with the singularity map (the "Singularity-CA" method) defines thresholds that can be applied to segment the map. We applied it to each soil image. The results show that, in spite of some computational and practical limitations, image analysis of apparent soil moisture patterns could be used to study the dynamical change of soil moisture, in agreement with previous results (Millán et al., 2016). REFERENCES Cheng, Q., Agterberg, F. P. and Ballantyne, S. B. (1994). The separation of geochemical anomalies from background by fractal methods. Journal of Geochemical Exploration, 51, 109-130. Cheng, Q. (2012). Singularity theory and
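
    A minimal sketch of the concentration-area (C-A) idea on a synthetic image: for each threshold, the area of pixels at or above it is computed, and breaks in slope on log-log axes suggest segmentation thresholds. The lognormal test image, the pixel size taken from the text, and the crude two-segment slope fit are illustrative only.

    import numpy as np

    rng = np.random.default_rng(0)
    moisture = rng.lognormal(mean=0.0, sigma=0.5, size=(400, 400))   # stand-in image
    pixel_area = (373e-6) ** 2                                       # m^2 per pixel

    thresholds = np.quantile(moisture, np.linspace(0.05, 0.995, 40))
    areas = [(moisture >= s).sum() * pixel_area for s in thresholds]

    log_s, log_a = np.log10(thresholds), np.log10(areas)
    # Piecewise power-law behaviour appears as straight segments in (log_s, log_a);
    # a change in fitted slope between the two halves hints at a C-A threshold.
    slope_low = np.polyfit(log_s[:20], log_a[:20], 1)[0]
    slope_high = np.polyfit(log_s[20:], log_a[20:], 1)[0]
    print(round(slope_low, 3), round(slope_high, 3))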

  2. OPTIMA: sensitive and accurate whole-genome alignment of error-prone genomic maps by combinatorial indexing and technology-agnostic statistical analysis.

    PubMed

    Verzotto, Davide; M Teo, Audrey S; Hillmer, Axel M; Nagarajan, Niranjan

    2016-01-01

    Resolution of complex repeat structures and rearrangements in the assembly and analysis of large eukaryotic genomes is often aided by a combination of high-throughput sequencing and genome-mapping technologies (for example, optical restriction mapping). In particular, mapping technologies can generate sparse maps of large DNA fragments (150 kilo base pairs (kbp) to 2 Mbp) and thus provide a unique source of information for disambiguating complex rearrangements in cancer genomes. Despite their utility, combining high-throughput sequencing and mapping technologies has been challenging because of the lack of efficient and sensitive map-alignment algorithms for robustly aligning error-prone maps to sequences. We introduce a novel seed-and-extend glocal (short for global-local) alignment method, OPTIMA (and a sliding-window extension for overlap alignment, OPTIMA-Overlap), which is the first to create indexes for continuous-valued mapping data while accounting for mapping errors. We also present a novel statistical model, agnostic with respect to technology-dependent error rates, for conservatively evaluating the significance of alignments without relying on expensive permutation-based tests. We show that OPTIMA and OPTIMA-Overlap outperform other state-of-the-art approaches (1.6-2 times more sensitive) and are more efficient (170-200 %) and precise in their alignments (nearly 99 % precision). These advantages are independent of the quality of the data, suggesting that our indexing approach and statistical evaluation are robust, provide improved sensitivity and guarantee high precision.

  3. US Topo: topographic maps for the nation

    USGS Publications Warehouse

    Carswell, William J.

    2013-01-01

    US Topo is the next generation of topographic maps from the U.S. Geological Survey (USGS). Arranged in the familiar 7.5-minute quadrangle format, digital US Topo maps are designed to look and feel (and perform) like the traditional paper topographic maps for which the USGS is so well known. In contrast to paper-based maps, US Topo maps provide modern technical advantages that support faster, wider public distribution and enable basic, on-screen geographic analysis for all users. The US Topo quadrangle map has been redesigned so that map elements are visually distinguishable with the imagery turned on and off, while keeping the file size as small as possible. The US Topo map redesign includes improvements to various display factors, including symbol definitions (color, line thickness, line symbology, area fills), layer order, and annotation fonts. New features for 2013 include the following: a raster shaded relief layer, military boundaries, cemeteries and post offices, and a US Topo cartographic symbols legend as an attachment. US Topo quadrangle maps are available free on the Web. Each map quadrangle is constructed in GeoPDF® format using key layers of geographic data (orthoimagery, roads, geographic names, topographic contours, and hydrographic features) from The National Map databases. US Topo quadrangle maps can be printed from personal computers or plotters as complete, full-sized maps or in customized sections, in a user-desired format. Paper copies of the maps can also be purchased from the USGS Store. Download links and a users guide are featured on the US Topo Web site. US Topo users can turn geographic data layers on and off as needed; they can zoom in and out to highlight specific features or see a broader area. The file size of each digital 7.5-minute quadrangle is about 30 megabytes. Associated electronic tools for geographic analysis are available free for download. The US Topo provides the Nation with a topographic product that users can

  4. Improved resolution in the position of drought-related QTLs in a single mapping population of rice by meta-analysis

    PubMed Central

    Khowaja, Farkhanda S; Norton, Gareth J; Courtois, Brigitte; Price, Adam H

    2009-01-01

    Background Meta-analysis of QTLs combines the results of several QTL detection studies and provides narrow confidence intervals for meta-QTLs, permitting easier positional candidate gene identification. It is usually applied to multiple mapping populations, but can be applied to one. Here, a meta-analysis of drought related QTLs in the Bala × Azucena mapping population compiles data from 13 experiments and 25 independent screens providing 1,650 individual QTLs separated into 5 trait categories; drought avoidance, plant height, plant biomass, leaf morphology and root traits. A heat map of the overlapping 1 LOD confidence intervals provides an overview of the distribution of QTLs. The programme BioMercator is then used to conduct a formal meta-analysis at example QTL clusters to illustrate the value of meta-analysis of QTLs in this population. Results The heat map graphically illustrates the genetic complexity of drought related traits in rice. QTLs can be linked to their physical position on the rice genome using Additional file 1 provided. Formal meta-analysis on chromosome 1, where clusters of QTLs for all trait categories appear close, established that the sd1 semi-dwarfing gene coincided with a plant height meta-QTL, that the drought avoidance meta-QTL was not likely to be associated with this gene, and that this meta-QTL was not pleiotropic with close meta-QTLs for leaf morphology and root traits. On chromosome 5, evidence suggests that a drought avoidance meta-QTL was pleiotropic with leaf morphology and plant biomass meta-QTLs, but not with meta-QTLs for root traits and plant height 10 cM lower down. A region of dense root QTL activity graphically visible on chromosome 9 was dissected into three meta-QTLs within a space of 35 cM. The confidence intervals for meta-QTLs obtained ranged from 5.1 to 14.5 cM with an average of 9.4 cM, which is approximately 180 genes in rice. Conclusion The meta-analysis is valuable in providing improved ability to dissect the

  5. Development of predictive mapping techniques for soil survey and salinity mapping

    NASA Astrophysics Data System (ADS)

    Elnaggar, Abdelhamid A.

    Conventional soil maps represent a valuable source of information about soil characteristics; however, they are subjective, very expensive, and time-consuming to prepare. Also, they do not include explicit information about the conceptual mental model used in developing them, information about their accuracy, or the error associated with them. Decision tree analysis (DTA) was successfully used in retrieving the expert knowledge embedded in old soil survey data. This knowledge was efficiently used in developing predictive soil maps for the study areas in Benton and Malheur Counties, Oregon, and in assessing their consistency. A retrieved soil-landscape model from a reference area in Harney County was extrapolated to develop a preliminary soil map for the neighboring unmapped part of Malheur County. The developed map had a low prediction accuracy and only a few soil map units (SMUs) were predicted with significant accuracy, mostly those shallow SMUs that have either a lithic contact with the bedrock or developed on a duripan. On the other hand, the developed soil map based on field data was predicted with very high accuracy (overall accuracy was about 97%). Salt-affected areas of the Malheur County study area are indicated by their high spectral reflectance and they are easily discriminated from the remote sensing data. However, remote sensing data fails to distinguish between the different classes of soil salinity. Using the DTA method, five classes of soil salinity were successfully predicted with an overall accuracy of about 99%. Moreover, the calculated area of salt-affected soil was overestimated when mapped using remote sensing data compared to that predicted by using DTA. Hence, DTA could be a very helpful approach in developing soil survey and soil salinity maps in more objective, effective, less expensive and quicker ways based on field data.
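
    As a rough sketch of the DTA idea (not the study's actual predictors or software), a generic decision-tree classifier can be trained on pixels digitized from an existing survey and then applied to unmapped terrain; all feature names and values below are hypothetical.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      # Hypothetical terrain/spectral predictors per pixel (e.g. elevation, slope,
      # wetness index, band reflectance) and soil map unit (SMU) labels digitized
      # from an existing survey; values are placeholders only.
      rng = np.random.default_rng(0)
      X_train = rng.random((500, 4))
      y_train = rng.integers(0, 5, size=500)      # 5 hypothetical SMU classes

      # Fit a decision tree to retrieve rule-like knowledge from the training data.
      tree = DecisionTreeClassifier(max_depth=6, min_samples_leaf=20)
      tree.fit(X_train, y_train)

      # Apply the tree to unmapped pixels to produce a predictive soil map.
      X_unmapped = rng.random((1000, 4))
      predicted_smu = tree.predict(X_unmapped)
      print(predicted_smu[:10])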

  6. Systems thinking, the Swiss Cheese Model and accident analysis: a comparative systemic analysis of the Grayrigg train derailment using the ATSB, AcciMap and STAMP models.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2014-07-01

    The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. polymapR - linkage analysis and genetic map construction from F1 populations of outcrossing polyploids.

    PubMed

    Bourke, Peter M; van Geest, Geert; Voorrips, Roeland E; Jansen, Johannes; Kranenburg, Twan; Shahin, Arwa; Visser, Richard G F; Arens, Paul; Smulders, Marinus J M; Maliepaard, Chris

    2018-05-02

    Polyploid species carry more than two copies of each chromosome, a condition found in many of the world's most important crops. Genetic mapping in polyploids is more complex than in diploid species, resulting in a lack of available software tools. These are needed if we are to realise all the opportunities offered by modern genotyping platforms for genetic research and breeding in polyploid crops. polymapR is an R package for genetic linkage analysis and integrated genetic map construction from bi-parental populations of outcrossing autopolyploids. It can currently analyse triploid, tetraploid and hexaploid marker datasets and is applicable to various crops including potato, leek, alfalfa, blueberry, chrysanthemum, sweet potato or kiwifruit. It can detect, estimate and correct for preferential chromosome pairing, and has been tested on high-density marker datasets from potato, rose and chrysanthemum, generating high-density integrated linkage maps in all of these crops. polymapR is freely available under the General Public License from the Comprehensive R Archive Network (CRAN) at http://cran.r-project.org/package=polymapR. Contact: Chris Maliepaard (chris.maliepaard@wur.nl) or Roeland E. Voorrips (roeland.voorrips@wur.nl). Supplementary data are available at Bioinformatics online.

  8. Gap Analysis of Benthic Mapping at Three National Parks: Assateague Island National Seashore, Channel Islands National Park, and Sleeping Bear Dunes National Lakeshore

    USGS Publications Warehouse

    Rose, Kathryn V.; Nayegandhi, Amar; Moses, Christopher S.; Beavers, Rebecca; Lavoie, Dawn; Brock, John C.

    2012-01-01

    The National Park Service (NPS) Inventory and Monitoring (I&M) Program initiated a benthic habitat mapping program in ocean and coastal parks in 2008-2009 in alignment with the NPS Ocean Park Stewardship 2007-2008 Action Plan. With more than 80 ocean and Great Lakes parks encompassing approximately 2.5 million acres of submerged territory and approximately 12,000 miles of coastline (Curdts, 2011), this Servicewide Benthic Mapping Program (SBMP) is essential. This report presents an initial gap analysis of three pilot parks under the SBMP: Assateague Island National Seashore (ASIS), Channel Islands National Park (CHIS), and Sleeping Bear Dunes National Lakeshore (SLBE) (fig. 1). The recommended SBMP protocols include servicewide standards (for example, gap analysis, minimum accuracy, final products) as well as standards that can be adapted to fit network and park unit needs (for example, minimum mapping unit, mapping priorities). The SBMP requires the inventory and mapping of critical components of coastal and marine ecosystems: bathymetry, geoforms, surface geology, and biotic cover. In order for a park unit benthic inventory to be considered complete, maps of bathymetry and other key components must be combined into a final report (Moses and others, 2010). By this standard, none of the three pilot parks are mapped (inventoried) to completion with respect to submerged resources. After compiling the existing benthic datasets for these parks, this report has concluded that CHIS, with 49 percent of its submerged area mapped, has the most complete benthic inventory of the three. The ASIS submerged inventory is 41 percent complete, and SLBE is 17.5 percent complete.

  9. Shear Wave Velocity, Depth to Bedrock, and Fundamental Resonance Applied to Bedrock Mapping using MASW and H/V Analysis

    NASA Astrophysics Data System (ADS)

    Gonsiewski, J.

    2015-12-01

    Mapping bedrock depth is useful for earthquake hazard analysis, subsurface water transport, and other applications. Recently, collaborative experimentation provided an opportunity to explore a mapping method. Near-surface glacial till shear wave velocity (Vs) was studied in an area where data are available from an array of 3-component seismometers. Vs is related to depth to bedrock (h) and fundamental resonance (Fo) by Fo = Vs/(4h). The H/V spectral peak frequency of recordings from a 3-component seismometer yields a fundamental resonance estimate. Where a suitable average Vs is established, the depth to bedrock can be calculated at every seismometer. The 3-component seismometer data were provided by Spectraseis. Geophones, seismographs, and an extra 3-component seismometer were provided by Wright State University for this study. For Vs analysis, three MASW surveys were conducted near the seismometer array. SurfSeis3© was used for processing MASW data. Overtone images degraded by complicated bedrock structure and great bedrock depth were improved by combining data from multiple source offsets in each survey. From the MASW Vs and depth-to-bedrock results, the theoretical fundamental resonance (Fo) was calculated and compared with the H/V peak spectral frequency measured by a seismometer at selected sites and processed with the Geopsy processing software. Calculated bedrock depths from all geophysical data were compared with measured bedrock depths at nearby water wells and oil and gas wells provided by ODNR. Vs and depth-to-bedrock results from MASW produced calculated fundamental resonances similar to the H/V approximations from the respective seismometers. Bedrock mapping was performed upon verifying the correlation between the theoretical fundamental resonance and H/V peak frequencies. Contour maps were generated using ArcGIS®. Contour lines interpolated from local wells were compared with the depths calculated from H/V analysis. Bedrock depths calculated from the seismometer array
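
    A minimal sketch of the quarter-wavelength relation used above, Fo = Vs/(4h), inverted to estimate depth to bedrock from an H/V peak frequency; the velocity and frequency values are illustrative only.

      import numpy as np

      def bedrock_depth(vs_avg, f0):
          """Depth to bedrock h from the quarter-wavelength relation Fo = Vs/(4h)."""
          return vs_avg / (4.0 * np.asarray(f0))

      # Hypothetical average till shear-wave velocity (m/s) from MASW and H/V peak
      # frequencies (Hz) at three seismometer sites; values are illustrative only.
      vs_avg = 350.0
      f0_peaks = [2.1, 1.4, 3.0]
      print(bedrock_depth(vs_avg, f0_peaks))   # depths in metres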

  10. System Analysis by Mapping a Fault-tree into a Bayesian-network

    NASA Astrophysics Data System (ADS)

    Sheng, B.; Deng, C.; Wang, Y. H.; Tang, L. H.

    2018-05-01

    In view of the limitations of fault tree analysis in reliability assessment, the Bayesian Network (BN) has been studied as an alternative technology. After a brief introduction to the method for mapping a Fault Tree (FT) into an equivalent BN, equations used to calculate the structure importance degree, the probability importance degree and the critical importance degree are presented. Furthermore, the correctness of these equations is proved mathematically. Taking an aircraft landing gear's FT as an example, an equivalent BN is developed and analysed. The results show that richer and more accurate information has been obtained through the BN method than through the FT, which demonstrates that the BN is a superior technique in both reliability assessment and fault diagnosis.
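
    A minimal sketch of the idea, using a toy fault tree rather than the landing-gear example from the paper: the top event is expressed as a deterministic function of the basic events (as it would be in the CPTs of the equivalent BN), the top-event probability is computed by exact enumeration, and the probability (Birnbaum) importance of each basic event follows as P(top | event occurs) minus P(top | event absent). The failure probabilities are hypothetical.

      from itertools import product

      # Toy fault tree: TOP = OR( AND(A, B), C ).  In the equivalent Bayesian
      # network, A, B, C are root nodes and the gates become deterministic CPTs.
      p = {"A": 0.02, "B": 0.05, "C": 0.01}    # hypothetical failure probabilities

      def top_event(a, b, c):
          return (a and b) or c

      def p_top(fixed=None):
          """Exact P(TOP=1), optionally with one basic event clamped to 0 or 1."""
          fixed = fixed or {}
          total = 0.0
          for a, b, c in product([0, 1], repeat=3):
              states = {"A": a, "B": b, "C": c}
              if any(states[k] != v for k, v in fixed.items()):
                  continue
              prob = 1.0
              for k, s in states.items():
                  if k in fixed:
                      continue
                  prob *= p[k] if s else 1.0 - p[k]
              total += prob * top_event(a, b, c)
          return total

      print("P(TOP) =", p_top())
      # Probability (Birnbaum) importance of each basic event.
      for k in p:
          print(k, p_top({k: 1}) - p_top({k: 0}))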

  11. Magnetoencephalographic Mapping of Epileptic Spike Population Using Distributed Source Analysis: Comparison With Intracranial Electroencephalographic Spikes.

    PubMed

    Tanaka, Naoaki; Papadelis, Christos; Tamilia, Eleonora; Madsen, Joseph R; Pearl, Phillip L; Stufflebeam, Steven M

    2018-04-27

    This study evaluates magnetoencephalographic (MEG) spike population as compared with intracranial electroencephalographic (IEEG) spikes using a quantitative method based on distributed source analysis. We retrospectively studied eight patients with medically intractable epilepsy who had an MEG and subsequent IEEG monitoring. Fifty MEG spikes were analyzed in each patient using minimum norm estimate. For individual spikes, each vertex in the source space was considered activated when its source amplitude at the peak latency was higher than a threshold, which was set at 50% of the maximum amplitude over all vertices. We mapped the total count of activation at each vertex. We also analyzed 50 IEEG spikes in the same manner over the intracranial electrodes and created the activation count map. The location of the electrodes was obtained in the MEG source space by coregistering postimplantation computed tomography to MRI. We estimated the MEG- and IEEG-active regions associated with the spike populations using the vertices/electrodes with a count over 25. The activation count maps of MEG spikes demonstrated the localization associated with the spike population by variable count values at each vertex. The MEG-active region overlapped with 65 to 85% of the IEEG-active region in our patient group. Mapping the MEG spike population is valid for demonstrating the trend of spikes clustering in patients with epilepsy. In addition, comparison of MEG and IEEG spikes quantitatively may be informative for understanding their relationship.
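
    The counting scheme described above can be sketched in a few lines of array code; the source amplitudes below are random placeholders, while the 50% threshold and the count cut-off of 25 follow the description in the abstract.

      import numpy as np

      # Hypothetical minimum-norm source amplitudes at spike peak latency:
      # one row per analysed spike, one column per source-space vertex.
      rng = np.random.default_rng(1)
      amplitudes = np.abs(rng.normal(size=(50, 8000)))     # 50 spikes x 8000 vertices

      # A vertex is "activated" for a spike if its amplitude at the peak latency
      # exceeds 50% of the maximum amplitude over all vertices for that spike.
      thresholds = 0.5 * amplitudes.max(axis=1, keepdims=True)
      activated = amplitudes > thresholds

      # Activation count map: number of spikes activating each vertex.
      count_map = activated.sum(axis=0)

      # Active region: vertices activated in more than 25 of the 50 spikes.
      active_region = count_map > 25
      print(count_map.max(), active_region.sum())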

  12. Changing Predictors of Map Use in Wayfinding.

    ERIC Educational Resources Information Center

    Scholnick, Ellin Kofsky; And Others

    Using a map for guiding travel requires: (1) skills in encoding information from a terrain and a map; (2) finding a match between the two; and (3) maintaining the match despite directional shifts from turns on a route. In order to test this analysis, 94 children between the ages of 4 and 6 used maps to locate the route to a goal through a network…

  13. Land cover map for map zones 8 and 9 developed from SAGEMAP, GNN, and SWReGAP: a pilot for NWGAP

    Treesearch

    James S. Kagan; Janet L. Ohmann; Matthew Gregory; Claudine Tobalske

    2008-01-01

    As part of the Northwest Gap Analysis Project, land cover maps were generated for most of eastern Washington and eastern Oregon. The maps were derived from regional SAGEMAP and SWReGAP data sets using decision tree classifiers for nonforest areas, and Gradient Nearest Neighbor imputation modeling for forests and woodlands. The maps integrate data from regional...

  14. Experimental and Automated Analysis Techniques for High-resolution Electrical Mapping of Small Intestine Slow Wave Activity

    PubMed Central

    Angeli, Timothy R; O'Grady, Gregory; Paskaranandavadivel, Niranchan; Erickson, Jonathan C; Du, Peng; Pullan, Andrew J; Bissett, Ian P

    2013-01-01

    Background/Aims Small intestine motility is governed by an electrical slow wave activity, and abnormal slow wave events have been associated with intestinal dysmotility. High-resolution (HR) techniques are necessary to analyze slow wave propagation, but progress has been limited by few available electrode options and laborious manual analysis. This study presents novel methods for in vivo HR mapping of small intestine slow wave activity. Methods Recordings were obtained from along the porcine small intestine using flexible printed circuit board arrays (256 electrodes; 4 mm spacing). Filtering options were compared, and analysis was automated through adaptations of the falling-edge variable-threshold (FEVT) algorithm and graphical visualization tools. Results A Savitzky-Golay filter was chosen with polynomial-order 9 and window size 1.7 seconds, which maintained 94% of slow wave amplitude, 57% of gradient and achieved a noise correction ratio of 0.083. Optimized FEVT parameters achieved 87% sensitivity and 90% positive-predictive value. Automated activation mapping and animation successfully revealed slow wave propagation patterns, and frequency, velocity, and amplitude were calculated and compared at 5 locations along the intestine (16.4 ± 0.3 cpm, 13.4 ± 1.7 mm/sec, and 43 ± 6 µV, respectively, in the proximal jejunum). Conclusions The methods developed and validated here will greatly assist small intestine HR mapping, and will enable experimental and translational work to evaluate small intestine motility in health and disease. PMID:23667749
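
    A minimal sketch of the smoothing step with the reported filter settings (polynomial order 9, window of about 1.7 seconds); the sampling rate and the signals themselves are assumed placeholders, not the study's recordings.

      import numpy as np
      from scipy.signal import savgol_filter

      fs = 512                                   # assumed sampling rate (Hz)
      window = int(1.7 * fs) | 1                 # ~1.7 s window, forced to be odd
      polyorder = 9

      # Hypothetical raw electrode signals: electrodes x samples.
      rng = np.random.default_rng(2)
      raw = rng.normal(size=(256, 60 * fs))

      # Smooth each channel to suppress high-frequency noise while preserving
      # most of the slow wave amplitude and gradient.
      filtered = savgol_filter(raw, window_length=window, polyorder=polyorder, axis=-1)
      print(filtered.shape)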

  15. ReMap 2018: an updated atlas of regulatory regions from an integrative analysis of DNA-binding ChIP-seq experiments

    PubMed Central

    Chèneby, Jeanne; Gheorghe, Marius; Artufel, Marie

    2018-01-01

    Abstract With this latest release of ReMap (http://remap.cisreg.eu), we present a unique collection of regulatory regions in human, as a result of a large-scale integrative analysis of ChIP-seq experiments for hundreds of transcriptional regulators (TRs) such as transcription factors, transcriptional co-activators and chromatin regulators. In 2015, we introduced the ReMap database to capture the genome regulatory space by integrating public ChIP-seq datasets, covering 237 TRs across 13 million (M) peaks. In this release, we have extended this catalog to constitute a unique collection of regulatory regions. Specifically, we have collected, analyzed and retained after quality control a total of 2829 ChIP-seq datasets available from public sources, covering a total of 485 TRs with a catalog of 80M peaks. Additionally, the updated database includes new search features for TR names as well as aliases, including cell line names and the ability to navigate the data directly within genome browsers via public track hubs. Finally, full access to this catalog is available online together with a TR binding enrichment analysis tool. ReMap 2018 provides a significant update of the ReMap database, providing an in depth view of the complexity of the regulatory landscape in human. PMID:29126285

  16. The use of error-category mapping in pharmacokinetic model analysis of dynamic contrast-enhanced MRI data.

    PubMed

    Gill, Andrew B; Anandappa, Gayathri; Patterson, Andrew J; Priest, Andrew N; Graves, Martin J; Janowitz, Tobias; Jodrell, Duncan I; Eisen, Tim; Lomas, David J

    2015-02-01

    This study introduces the use of 'error-category mapping' in the interpretation of pharmacokinetic (PK) model parameter results derived from dynamic contrast-enhanced (DCE-) MRI data. Eleven patients with metastatic renal cell carcinoma were enrolled in a multiparametric study of the treatment effects of bevacizumab. For the purposes of the present analysis, DCE-MRI data from two identical pre-treatment examinations were analysed by application of the extended Tofts model (eTM), using in turn a model arterial input function (AIF), an individually-measured AIF and a sample-average AIF. PK model parameter maps were calculated. Errors in the signal-to-gadolinium concentration ([Gd]) conversion process and the model-fitting process itself were assigned to category codes on a voxel-by-voxel basis, thereby forming a colour-coded 'error-category map' for each imaged slice. These maps were found to be repeatable between patient visits and showed that the eTM converged adequately in the majority of voxels in all the tumours studied. However, the maps also clearly indicated sub-regions of low Gd uptake and of non-convergence of the model in nearly all tumours. The non-physical condition ve ≥ 1 was the most frequently indicated error category and appeared sensitive to the form of AIF used. This simple method for visualisation of errors in DCE-MRI could be used as a routine quality-control technique and also has the potential to reveal otherwise hidden patterns of failure in PK model applications. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
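
    The per-voxel bookkeeping behind such an error-category map can be sketched as follows; the category codes, thresholds and fitted values are hypothetical stand-ins, not the coding scheme used in the study.

      import numpy as np

      # Hypothetical per-voxel outputs of an extended Tofts model fit for one slice.
      rng = np.random.default_rng(3)
      shape = (128, 128)
      peak_gd = rng.random(shape)            # peak [Gd] reached in each voxel (mM)
      converged = rng.random(shape) > 0.1    # did the model fit converge?
      ve = rng.random(shape) * 1.2           # fitted extravascular volume fraction

      # Assign one category code per voxel; later categories overwrite earlier ones.
      # 0 = fit OK, 1 = low Gd uptake, 2 = fit did not converge, 3 = non-physical ve >= 1
      category = np.zeros(shape, dtype=np.uint8)
      category[peak_gd < 0.05] = 1           # hypothetical low-uptake threshold
      category[~converged] = 2
      category[ve >= 1.0] = 3

      # The integer map can be displayed with a discrete colour map to give the
      # colour-coded 'error-category map' for the slice.
      print(np.bincount(category.ravel(), minlength=4))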

  17. Exploring Students' Mapping Behaviors and Interactive Discourses in a Case Diagnosis Problem: Sequential Analysis of Collaborative Causal Map Drawing Processes

    ERIC Educational Resources Information Center

    Lee, Woon Jee

    2012-01-01

    The purpose of this study was to explore the nature of students' mapping and discourse behaviors while constructing causal maps to articulate their understanding of a complex, ill-structured problem. In this study, six graduate-level students were assigned to one of three pair groups, and each pair used the causal mapping software program,…

  18. Uncertainty Analysis in Large Area Aboveground Biomass Mapping

    NASA Astrophysics Data System (ADS)

    Baccini, A.; Carvalho, L.; Dubayah, R.; Goetz, S. J.; Friedl, M. A.

    2011-12-01

    Satellite and aircraft-based remote sensing observations are being more frequently used to generate spatially explicit estimates of aboveground carbon stock of forest ecosystems. Because deforestation and forest degradation account for circa 10% of anthropogenic carbon emissions to the atmosphere, policy mechanisms are increasingly recognized as a low-cost mitigation option to reduce carbon emissions. They are, however, contingent upon the capacity to accurately measure carbon stored in the forests. Here we examine the sources of uncertainty and error propagation in generating maps of aboveground biomass. We focus on characterizing uncertainties associated with maps at the pixel and spatially aggregated national scales. We pursue three strategies to describe the error and uncertainty properties of aboveground biomass maps, including: (1) model-based assessment using confidence intervals derived from linear regression methods; (2) data-mining algorithms such as regression trees and ensembles of these; (3) empirical assessments using independently collected data sets. The latter effort explores error propagation using field data acquired within satellite-based lidar (GLAS) acquisitions versus alternative in situ methods that rely upon field measurements that have not been systematically collected for this purpose (e.g. from forest inventory data sets). A key goal of our effort is to provide multi-level characterizations that provide both pixel and biome-level estimates of uncertainties at different scales.
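
    Strategy (1) above can be illustrated with a minimal per-pixel prediction-interval sketch for an ordinary least-squares biomass model; the predictor, the synthetic calibration data and the critical t value are assumptions for illustration only.

      import numpy as np

      # Hypothetical calibration data: a remote-sensing metric vs field biomass.
      rng = np.random.default_rng(4)
      x = rng.uniform(0, 40, 200)                      # e.g. lidar canopy height (m)
      y = 8.0 * x + rng.normal(0, 25, x.size)          # biomass (Mg/ha), synthetic

      X = np.column_stack([np.ones_like(x), x])        # design matrix with intercept
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      resid = y - X @ beta
      dof = x.size - X.shape[1]
      sigma2 = resid @ resid / dof
      XtX_inv = np.linalg.inv(X.T @ X)

      def prediction_interval(x0, t=1.97):             # t ~ t(0.975, dof) for dof ~ 198
          """Approximate 95% prediction interval for biomass at a new pixel value x0."""
          x0v = np.array([1.0, x0])
          se = np.sqrt(sigma2 * (1.0 + x0v @ XtX_inv @ x0v))
          mean = x0v @ beta
          return mean - t * se, mean + t * se

      print(prediction_interval(25.0))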

  19. Analysis Of Direct Numerical Simulation Results Of Adverse Pressure Gradient Boundary Layer Through Anisotropy Invariant Mapping And Comparison With The Rans Simulations

    NASA Astrophysics Data System (ADS)

    Gungor, Ayse Gul; Nural, Ozan Ekin; Ertunc, Ozgur

    2017-11-01

    The purpose of this study is to analyze the direct numerical simulation data of a turbulent boundary layer subjected to a strong adverse pressure gradient through anisotropy invariant mapping. A RANS simulation using the "Elliptic Blending Model" of Manceau and Hanjalic (2002) is also conducted for the same flow case with the commercial software Star-CCM+, and the results are compared with the DNS data. The RANS simulation captures the general trends in the velocity field, but significant deviations are found when skin friction coefficients are compared. The anisotropy invariant map of Lumley and Newman (1977) and the barycentric map of Banerjee et al. (2007) are used for the analysis. Invariant mapping of the DNS data shows that, at locations away from the wall, the flow is close to the one-component turbulence state. In the vicinity of the wall, turbulence is at the two-component limit, which is one border of the barycentric map, and as the flow evolves along the streamwise direction it approaches the two-component turbulence state. Additionally, at locations away from the wall, turbulence approaches the two-component limit. Furthermore, analysis of the invariants of the RANS simulations shows dissimilar results: in the RANS simulations, the invariants do not approach any of the limit states, unlike in the DNS.
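
    A minimal sketch of how a single Reynolds stress tensor is placed on these maps: form the anisotropy tensor, compute its invariant measures for the Lumley-type map, and compute the barycentric weights of Banerjee et al. (2007) from its ordered eigenvalues. The stress tensor below is a hypothetical example, and sign/normalisation conventions for the invariants vary between authors.

      import numpy as np

      # Hypothetical Reynolds stress tensor <u_i' u_j'> at one point in the flow.
      R = np.array([[2.0, 0.3, 0.0],
                    [0.3, 1.0, 0.1],
                    [0.0, 0.1, 0.5]])

      k = 0.5 * np.trace(R)                         # turbulent kinetic energy
      b = R / (2.0 * k) - np.eye(3) / 3.0           # anisotropy tensor b_ij

      # Invariant measures used in anisotropy-invariant (Lumley-type) maps.
      II = np.einsum("ij,ji->", b, b)               # ~ second invariant magnitude
      III = np.einsum("ij,jk,ki->", b, b, b)        # ~ third invariant magnitude

      # Barycentric map of Banerjee et al. (2007): weights of the one-component,
      # two-component and isotropic limit states from the ordered eigenvalues of b.
      lam = np.sort(np.linalg.eigvalsh(b))[::-1]    # lam1 >= lam2 >= lam3
      C1 = lam[0] - lam[1]
      C2 = 2.0 * (lam[1] - lam[2])
      C3 = 3.0 * lam[2] + 1.0
      print(II, III, C1, C2, C3, C1 + C2 + C3)      # the three weights sum to 1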

  20. Modeling and Analysis of Information Product Maps

    ERIC Educational Resources Information Center

    Heien, Christopher Harris

    2012-01-01

    Information Product Maps are visual diagrams used to represent the inputs, processing, and outputs of data within an Information Manufacturing System. A data unit, drawn as an edge, symbolizes a grouping of raw data as it travels through this system. Processes, drawn as vertices, transform each data unit input into various forms prior to delivery…

  1. Combining geostatistics with Moran's I analysis for mapping soil heavy metals in Beijing, China.

    PubMed

    Huo, Xiao-Ni; Li, Hong; Sun, Dan-Feng; Zhou, Lian-Di; Li, Bao-Guo

    2012-03-01

    Production of high-quality interpolation maps of heavy metals is important for risk assessment of environmental pollution. In this paper, the spatial correlation information obtained from Moran's I analysis was used to supplement traditional geostatistics. From the Moran's I analysis, four characteristic distances were obtained and used as active lag distances to calculate the semivariance. Validation of the optimality of the semivariance demonstrated that using the two distances at which Moran's I and the standardized Moran's I, Z(I), reached their maxima as the active lag distance improved the fitting accuracy of the semivariance. Then, spatial interpolation was produced based on the two distances and their nested model. The comparative analysis of estimation accuracy and of the measured and predicted pollution status showed that the method combining geostatistics with Moran's I analysis was better than traditional geostatistics. Thus, Moran's I analysis is a useful complement to geostatistics for improving the spatial interpolation accuracy of heavy metals.
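
    A minimal sketch of the global Moran's I computation for one lag-distance class, which can be scanned over distances to locate characteristic distances where I peaks (the standardization to Z(I) is omitted here); the coordinates and concentrations below are synthetic.

      import numpy as np

      def morans_i(values, coords, max_dist):
          """Global Moran's I with binary weights w_ij = 1 when 0 < d_ij <= max_dist."""
          x = values - values.mean()
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          w = ((d > 0) & (d <= max_dist)).astype(float)
          n, s0 = len(values), w.sum()
          return (n / s0) * (x @ w @ x) / (x @ x)

      # Hypothetical sample locations (m) and heavy-metal concentrations (mg/kg).
      rng = np.random.default_rng(5)
      coords = rng.uniform(0, 10_000, size=(300, 2))
      conc = rng.lognormal(mean=3.0, sigma=0.4, size=300)

      # Scan lag distances to find characteristic distances where I peaks; these
      # can then be used as active lag distances when fitting the semivariogram.
      for lag in (1_000, 2_000, 4_000, 8_000):
          print(lag, morans_i(conc, coords, lag))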

  2. An ultra-high density linkage map and QTL mapping for sex and growth-related traits of common carp (Cyprinus carpio)

    PubMed Central

    Peng, Wenzhu; Xu, Jian; Zhang, Yan; Feng, Jianxin; Dong, Chuanju; Jiang, Likun; Feng, Jingyan; Chen, Baohua; Gong, Yiwen; Chen, Lin; Xu, Peng

    2016-01-01

    High density genetic linkage maps are essential for QTL fine mapping, comparative genomics and high quality genome sequence assembly. In this study, we constructed a high-density and high-resolution genetic linkage map with 28,194 SNP markers on 14,146 distinct loci for common carp based on high-throughput genotyping with the carp 250 K single nucleotide polymorphism (SNP) array in a mapping family. The genetic length of the consensus map was 10,595.94 cM with an average locus interval of 0.75 cM and an average marker interval of 0.38 cM. Comparative genomic analysis revealed high level of conserved syntenies between common carp and the closely related model species zebrafish and medaka. The genome scaffolds were anchored to the high-density linkage map, spanning 1,357 Mb of common carp reference genome. QTL mapping and association analysis identified 22 QTLs for growth-related traits and 7 QTLs for sex dimorphism. Candidate genes underlying growth-related traits were identified, including important regulators such as KISS2, IGF1, SMTLB, NPFFR1 and CPE. Candidate genes associated with sex dimorphism were also identified including 3KSR and DMRT2b. The high-density and high-resolution genetic linkage map provides an important tool for QTL fine mapping and positional cloning of economically important traits, and improving common carp genome assembly. PMID:27225429

  3. Mapping Cultural Ecosystem Services in Vilnius using Hot-Spot Analysis.

    NASA Astrophysics Data System (ADS)

    Pereira, Paulo; Depellegrin, Daniel; Egarter-Vigl, Lukas; Oliva, Marc; Misiune, Ieva; Keesstra, Saskia; Estebaranz, Ferran; Cerda, Artemi

    2017-04-01

    Cultural services in urban areas are very important to promote tourism activities and develop the economy. These activities are fundamental for the sustainability of urban areas since they can represent an important monetary source. However, one of the major threats to the sustainability of cultural services is the high number of visitors, which can lead to a degradation of the services provided (Depellegrin et al., 2016). Mapping the potential of cultural ecosystem services is fundamental to assess the capacity of the territory to provide them. Previous works used land use classification to identify ecosystem services potential, which proved to be a good methodology for attributing a specific capacity to each type of land use (Burkhard et al., 2008). The objective of this work is to map the cultural services in the Vilnius area using a hot-spot analysis. Ecosystem services potential was assessed using the matrix developed by Burkhard et al. (2009), which ranks ES capacity for each land use type from 0 (no capacity) to 5 (very high relevant capacity). The results showed that, with the exception of Cultural Heritage ecosystem services, which had a random pattern (Z-score=0.62, p<0.530), all the others had a clustered pattern: Recreation and Tourism (Z-score=4.02, p<0.001), Landscape Aesthetics (Z-score=4.44, p<0.001), Knowledge Systems (Z-score=4.15, p<0.001), Religious and Spiritual (Z-score=3.80, p<0.001) and Natural Heritage (Z-score=5.64, p<0.001). The incremental Moran's I result showed that Recreation and Tourism ecosystem services had the maximum spatial correlation at a distance of 5125.12 m, Landscape Aesthetics at 3495.70 m, Knowledge Systems at 5218.66 m, Religious and Spiritual at 3495.70 m, Cultural Heritage at 6746.17 m and Natural Heritage at 6205.82 m. This shows that the cultural services studied have different spatial correlation structures. References Burkhard B, Kroll F, Müller F, Windhorst W. 2009. Landscapes' capacities to provide ecosystem services

  4. Localization of the Netherton Syndrome Gene to Chromosome 5q32, by Linkage Analysis and Homozygosity Mapping

    PubMed Central

    Chavanas, Stéphane; Garner, Chad; Bodemer, Christine; Ali, Mohsin; Teillac, Dominique Hamel-; Wilkinson, John; Bonafé, Jean-Louis; Paradisi, Mauro; Kelsell, David P.; Ansai, Shin-ichi; Mitsuhashi, Yoshihiko; Larrègue, Marc; Leigh, Irene M.; Harper, John I.; Taïeb, Alain; Prost, Yves de; Cardon, Lon R.; Hovnanian, Alain

    2000-01-01

    Netherton syndrome (NS [MIM 256500]) is a rare and severe autosomal recessive disorder characterized by congenital ichthyosis, a specific hair-shaft defect (trichorrhexis invaginata), and atopic manifestations. Infants with this syndrome often fail to thrive; life-threatening complications result in high postnatal mortality. We report the assignment of the NS gene to chromosome 5q32, by linkage analysis and homozygosity mapping in 20 families affected with NS. Significant evidence for linkage (maximum multipoint LOD score 10.11) between markers D5S2017 and D5S413 was obtained, with no evidence for locus heterogeneity. Analysis of critical recombinants mapped the NS locus between markers D5S463 and D5S2013, within an <3.5-cM genetic interval. The NS locus is telomeric to the cytokine gene cluster in 5q31. The five known genes encoding casein kinase Iα, the α subunit of retinal rod cGMP phosphodiesterase, the regulator of mitotic-spindle assembly, adrenergic receptor β2, and the diastrophic dysplasia sulfate–transporter gene, as well as the 38 expressed-sequence tags mapped within the critical region, are not obvious candidates. Our study is the first step toward the positional cloning of the NS gene. This finding promises a better understanding of the molecular mechanisms that control epidermal differentiation and immunity. PMID:10712206

  5. A new MAP for Mars

    NASA Technical Reports Server (NTRS)

    Zubrin, Robert; Price, Steve; Clark, Ben; Cantrell, Jim; Bourke, Roger

    1993-01-01

    A Mars Aerial Platform (MAP) mission capable of generating thousands of very-high-resolution (20 cm/pixel) pictures of the Martian surface is considered. The MAP entry vehicle will map the global circulation of the planet's atmosphere and examine the surface and subsurface. Data acquisition will use instruments carried aboard balloons flying at nominal altitude of about 7 km over the Martian surface. The MAP balloons will take high- and medium-resolution photographs of Mars, sound its surface with radar, and provide tracking data to chart its winds. Mars vehicle design is based on the fourth-generation NTP, NEP, SEP vehicle set that provides a solid database for determining transportation system costs. Interference analysis and 3D image generation are performed using manual system sizing and sketching in conjunction with precise CAD modeling.

  6. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  7. Analysis of the Precision of Variable Flip Angle T1 Mapping with Emphasis on the Noise Propagated from RF Transmit Field Maps.

    PubMed

    Lee, Yoojin; Callaghan, Martina F; Nagy, Zoltan

    2017-01-01

    In magnetic resonance imaging, precise measurement of the longitudinal relaxation time (T1) is crucial to acquire useful information that is applicable to numerous clinical and neuroscience applications. In this work, we investigated the precision of the T1 relaxation time as measured using the variable flip angle method, with emphasis on the noise propagated from radiofrequency transmit field (B1+) measurements. The analytical solution for T1 precision was derived by standard error propagation methods incorporating the noise from the three input sources: two spoiled gradient echo (SPGR) images and a B1+ map. Repeated in vivo experiments were performed to estimate the total variance in T1 maps, and we compared these experimentally obtained values with the theoretical predictions to validate the established theoretical framework. Both the analytical and experimental results showed that variance in the B1+ map propagated comparable noise levels into the T1 maps as either of the two SPGR images. Improving the precision of the B1+ measurements significantly reduced the variance in the estimated T1 map. The variance estimated from the repeatedly measured in vivo T1 maps agreed well with the theoretically calculated variance in T1 estimates, thus validating the analytical framework for realistic in vivo experiments. We concluded that for T1 mapping experiments, the error propagated from the B1+ map must be considered. Optimizing the SPGR signals while neglecting to improve the precision of the B1+ map may result in grossly overestimating the precision of the estimated T1 values.
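
    A minimal sketch of the variable flip angle estimate and of a Monte Carlo style noise propagation, assuming the common two-flip-angle (DESPOT1-type) linearisation and treating the transmit field map as a multiplicative flip-angle correction; TR, flip angles and noise levels are illustrative, not those of the study.

      import numpy as np

      TR = 0.015                                   # repetition time (s), assumed
      alphas = np.deg2rad([4.0, 20.0])             # nominal flip angles, assumed

      def spgr(m0, t1, alpha):
          e1 = np.exp(-TR / t1)
          return m0 * np.sin(alpha) * (1 - e1) / (1 - e1 * np.cos(alpha))

      def t1_two_angle(s1, s2, b1):
          """DESPOT1-style linearisation with B1-corrected flip angles."""
          a1, a2 = b1 * alphas
          y = np.array([s1 / np.sin(a1), s2 / np.sin(a2)])
          x = np.array([s1 / np.tan(a1), s2 / np.tan(a2)])
          slope = (y[1] - y[0]) / (x[1] - x[0])    # slope = exp(-TR/T1)
          return -TR / np.log(slope)

      # Monte Carlo propagation of noise from the two SPGR signals and the B1 map.
      m0, t1_true, b1_true = 1000.0, 1.2, 1.0
      s_true = spgr(m0, t1_true, b1_true * alphas)
      rng = np.random.default_rng(6)
      n = 10_000
      s_noisy = s_true + rng.normal(0, 2.0, size=(n, 2))       # signal noise (a.u.)
      b1_noisy = b1_true + rng.normal(0, 0.02, size=n)         # B1 map noise
      t1_est = np.array([t1_two_angle(s[0], s[1], b) for s, b in zip(s_noisy, b1_noisy)])
      print(t1_est.mean(), t1_est.std())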

  8. Global trends in research related to social media in psychology: mapping and bibliometric analysis.

    PubMed

    Zyoud, Sa'ed H; Sweileh, Waleed M; Awang, Rahmat; Al-Jabi, Samah W

    2018-01-01

    Social media, defined as interactive Web applications, have been on the rise globally, particularly among adults. The objective of this study was to investigate the trend of the literature related to the most used social networks worldwide (i.e. Facebook, Twitter, LinkedIn, Snapchat, and Instagram) in the field of psychology. Specifically, this study will assess the growth in publications, citation analysis, international collaboration, author productivity, emerging topics and the mapping of frequent terms in publications pertaining to social media in the field of psychology. Publications related to social media in the field of psychology published between 2004 and 2014 were obtained from the Web of Science. The records extracted were analysed for bibliometric characteristics such as the growth in publications, citation analysis, international collaboration, emerging topics and the mapping of frequent terms in publications pertaining to social media in the field of psychology. VOSviewer v.1.6.5 was used to construct scientific maps. Overall, 959 publications were retrieved during the period between 2004 and 2015. The number of research publications in social media in the field of psychology showed a steady upward growth. Publications from the USA accounted for 57.14% of the total publications and had the highest h-index (48). The most common document type was research articles (873; 91.03%). Over 99.06% of the publications were published in English. Computers in Human Behavior was the most prolific journal. The University of Wisconsin - Madison ranked first in terms of the total publications (n = 39). A visualisation analysis showed that personality psychology, experimental psychology, psychological risk factors, and developmental psychology were continual concerns of the research. This is the first study reporting the global trends in the research related to social media in the psychology field. Based on the raw data from the Web of Science, publication

  9. Spatial cluster analysis of nanoscopically mapped serotonin receptors for classification of fixed brain tissue

    NASA Astrophysics Data System (ADS)

    Sams, Michael; Silye, Rene; Göhring, Janett; Muresan, Leila; Schilcher, Kurt; Jacak, Jaroslaw

    2014-01-01

    We present a cluster spatial analysis method using nanoscopic dSTORM images to determine changes in protein cluster distributions within brain tissue. Such methods are suitable to investigate human brain tissue and will help to achieve a deeper understanding of brain disease along with aiding drug development. Human brain tissue samples are usually treated postmortem via standard fixation protocols, which are established in clinical laboratories. Therefore, our localization microscopy-based method was adapted to characterize protein density and protein cluster localization in samples fixed using different protocols followed by common fluorescent immunohistochemistry techniques. The localization microscopy allows nanoscopic mapping of serotonin 5-HT1A receptor groups within a two-dimensional image of a brain tissue slice. These nanoscopically mapped proteins can be confined to clusters by applying the proposed statistical spatial analysis. Selected features of such clusters were subsequently used to characterize and classify the tissue. Samples were obtained from different types of patients, fixed with different preparation methods, and finally stored in a human tissue bank. To verify the proposed method, samples of a cryopreserved healthy brain have been compared with epitope-retrieved and paraffin-fixed tissues. Furthermore, samples of healthy brain tissues were compared with data obtained from patients suffering from mental illnesses (e.g., major depressive disorder). Our work demonstrates the applicability of localization microscopy and image analysis methods for comparison and classification of human brain tissues at a nanoscopic level. Furthermore, the presented workflow marks a unique technological advance in the characterization of protein distributions in brain tissue sections.
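
    The abstract does not spell out the clustering algorithm used, so the sketch below stands in with a common density-based choice (DBSCAN) applied to 2-D localization coordinates, followed by simple per-cluster features; all coordinates and parameter values are synthetic.

      import numpy as np
      from sklearn.cluster import DBSCAN

      # Hypothetical dSTORM localization coordinates (nm): a few dense receptor
      # clusters plus a uniform background of isolated localizations.
      rng = np.random.default_rng(7)
      centers = rng.uniform(0, 20_000, size=(30, 2))
      clustered = centers[rng.integers(0, 30, 4000)] + rng.normal(0, 40, (4000, 2))
      background = rng.uniform(0, 20_000, size=(1000, 2))
      localizations = np.vstack([clustered, background])

      # Density-based clustering: points with enough neighbours within 'eps' nm are
      # grouped into receptor clusters; label -1 marks background localizations.
      labels = DBSCAN(eps=60.0, min_samples=8).fit_predict(localizations)

      # Simple per-cluster features (size and radius of gyration) that could feed
      # a downstream tissue classifier.
      for lab in set(labels) - {-1}:
          pts = localizations[labels == lab]
          rg = np.sqrt(((pts - pts.mean(axis=0)) ** 2).sum(axis=1).mean())
          print(lab, len(pts), round(float(rg), 1))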

  10. A High-throughput AFLP-based Method for Constructing Integrated Genetic and Physical Maps: Progress Toward a Sorghum Genome Map

    PubMed Central

    Klein, Patricia E.; Klein, Robert R.; Cartinhour, Samuel W.; Ulanch, Paul E.; Dong, Jianmin; Obert, Jacque A.; Morishige, Daryl T.; Schlueter, Shannon D.; Childs, Kevin L.; Ale, Melissa; Mullet, John E.

    2000-01-01

    Sorghum is an important target for plant genomic mapping because of its adaptation to harsh environments, diverse germplasm collection, and value for comparing the genomes of grass species such as corn and rice. The construction of an integrated genetic and physical map of the sorghum genome (750 Mbp) is a primary goal of our sorghum genome project. To help accomplish this task, we have developed a new high-throughput PCR-based method for building BAC contigs and locating BAC clones on the sorghum genetic map. This task involved pooling 24,576 sorghum BAC clones (∼4× genome equivalents) in six different matrices to create 184 pools of BAC DNA. DNA fragments from each pool were amplified using amplified fragment length polymorphism (AFLP) technology, resolved on a LI-COR dual-dye DNA sequencing system, and analyzed using Bionumerics software. On average, each set of AFLP primers amplified 28 single-copy DNA markers that were useful for identifying overlapping BAC clones. Data from 32 different AFLP primer combinations identified ∼2400 BACs and ordered ∼700 BAC contigs. Analysis of a sorghum RIL mapping population using the same primer pairs located ∼200 of the BAC contigs on the sorghum genetic map. Restriction endonuclease fingerprinting of the entire collection of sorghum BAC clones was applied to test and extend the contigs constructed using this PCR-based methodology. Analysis of the fingerprint data allowed for the identification of 3366 contigs each containing an average of 5 BACs. BACs in ∼65% of the contigs aligned by AFLP analysis had sufficient overlap to be confirmed by DNA fingerprint analysis. In addition, 30% of the overlapping BACs aligned by AFLP analysis provided information for merging contigs and singletons that could not be joined using fingerprint data alone. Thus, the combination of fingerprinting and AFLP-based contig assembly and mapping provides a reliable, high-throughput method for building an integrated genetic and physical map

  11. Construction of a genetic map using EST-SSR markers and QTL analysis of major agronomic characters in hexaploid sweet potato (Ipomoea batatas (L.) Lam).

    PubMed

    Kim, Jin-Hee; Chung, Il Kyung; Kim, Kyung-Min

    2017-01-01

    The sweet potato, Ipomoea batatas (L.) Lam, is difficult to study in genetics and genomics because it is a hexaploid, and such studies have rarely been performed domestically or internationally. This study was performed to construct a genetic map and to carry out quantitative trait loci (QTL) analysis. A total of 245 EST-SSR markers were developed, and the map was constructed using 210 of those markers. The total map length was 1508.1 cM, and the mean distance between markers was 7.2 cM. Fifteen characteristics were investigated for QTL analysis, from which four QTLs were identified at a LOD score of 3.0. Further development of EST-SSR molecular markers is needed to enable efficient breeding. The genetic map created here using EST-SSR markers will facilitate planned breeding of sweet potato cultivars with various desirable traits.

  12. Conceptualizing physical activity parenting practices using expert informed concept mapping analysis.

    PubMed

    Mâsse, Louise C; O'Connor, Teresia M; Tu, Andrew W; Hughes, Sheryl O; Beauchamp, Mark R; Baranowski, Tom

    2017-06-14

    Parents are widely recognized as playing a central role in the development of child behaviors such as physical activity. As there is little agreement as to the dimensions of physical activity-related parenting practices that should be measured or how they should be operationalized, this study engaged experts to develop an integrated conceptual framework for assessing parenting practices that influence multiple aspects of 5 to 12 year old children's participation in physical activity. The ultimate goal of this study is to inform the development of an item bank (repository of calibrated items) aimed at measuring physical activity parenting practices. Twenty four experts from 6 countries (Australia, Canada, England, Scotland, the Netherlands, & United States (US)) sorted 77 physical activity parenting practice concepts identified from our previously published synthesis of the literature (74 measures) and survey of Canadian and US parents. Concept Mapping software was used to conduct the multi-dimensional scaling (MDS) analysis and a cluster analysis of the MDS solution of the Expert's sorting which was qualitatively reviewed and commented on by the Experts. The conceptual framework includes 12 constructs which are presented using three main domains of parenting practices (neglect/control, autonomy support, and structure). The neglect/control domain includes two constructs: permissive and pressuring parenting practices. The autonomy supportive domain includes four constructs: encouragement, guided choice, involvement in child physical activities, and praises/rewards for their child's physical activity. Finally, the structure domain includes six constructs: co-participation, expectations, facilitation, modeling, monitoring, and restricting physical activity for safety or academic concerns. The concept mapping analysis provided a useful process to engage experts in re-conceptualizing physical activity parenting practices and identified key constructs to include in
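
    A minimal sketch of the concept-mapping computation: expert sortings are turned into a co-occurrence dissimilarity matrix, embedded in two dimensions with MDS, and clustered; the sorting data here are random placeholders rather than the experts' actual sorts, and the number of piles is arbitrary.

      import numpy as np
      from sklearn.manifold import MDS
      from sklearn.cluster import AgglomerativeClustering

      n_concepts, n_experts = 77, 24
      rng = np.random.default_rng(8)

      # Hypothetical sorting data: each expert assigns every concept to one pile.
      sorts = rng.integers(0, 10, size=(n_experts, n_concepts))

      # Similarity = fraction of experts placing two concepts in the same pile.
      same_pile = (sorts[:, :, None] == sorts[:, None, :]).mean(axis=0)
      dissimilarity = 1.0 - same_pile

      # Two-dimensional MDS configuration of the concepts, then hierarchical
      # clustering of the configuration into candidate constructs.
      xy = MDS(n_components=2, dissimilarity="precomputed",
               random_state=0).fit_transform(dissimilarity)
      labels = AgglomerativeClustering(n_clusters=12).fit_predict(xy)
      print(xy.shape, np.bincount(labels))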

  13. Three-dimensional dominant frequency mapping using autoregressive spectral analysis of atrial electrograms of patients in persistent atrial fibrillation.

    PubMed

    Salinet, João L; Masca, Nicholas; Stafford, Peter J; Ng, G André; Schlindwein, Fernando S

    2016-03-08

    Areas with high-frequency activity within the atrium are thought to be 'drivers' of the rhythm in patients with atrial fibrillation (AF), and ablation of these areas seems to be an effective therapy for eliminating the DF gradient and restoring sinus rhythm. Clinical groups have applied the traditional FFT-based approach to generate three-dimensional dominant frequency (3D DF) maps during electrophysiology (EP) procedures, but the literature on alternative spectral estimation techniques, which can have better frequency resolution than FFT-based spectral estimation, is limited. Autoregressive (AR) model-based spectral estimation techniques, with emphasis on selection of appropriate sampling rate and AR model order, were implemented to generate high-density 3D DF maps of atrial electrograms (AEGs) in persistent atrial fibrillation (persAF). For each patient, 2048 simultaneous AEGs were recorded for 20.478-s-long segments in the left atrium (LA) and exported for analysis, together with their anatomical locations. After the DFs were identified using AR-based spectral estimation, they were colour coded to produce sequential 3D DF maps. These maps were systematically compared with maps found using the Fourier-based approach. 3D DF maps can be obtained using AR-based spectral estimation after AEG downsampling (DS), and the resulting maps are very similar to those obtained using FFT-based spectral estimation (mean 90.23 %). There were no significant differences between AR techniques (p = 0.62). The processing time for the AR-based approach was considerably shorter (from 5.44 to 5.05 s) when lower sampling frequencies and model order values were used. Higher levels of DS presented higher rates of DF agreement (sampling frequency of 37.5 Hz). We have demonstrated the feasibility of using AR spectral estimation methods for producing 3D DF maps and characterised their differences to the maps produced using the FFT technique, offering an alternative approach for 3D DF computation
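
    A minimal sketch of a Yule-Walker AR power spectrum and dominant-frequency pick for a single downsampled AEG; the model order, frequency search band and test signal are illustrative assumptions, and the paper's AR techniques may differ in detail.

      import numpy as np
      from scipy.linalg import solve_toeplitz

      def ar_dominant_frequency(x, fs, order=12, band=(4.0, 10.0)):
          """Dominant frequency from a Yule-Walker AR power spectrum."""
          x = x - x.mean()
          # Biased autocorrelation estimates r[0..order].
          r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(order + 1)]) / len(x)
          # Solve the Yule-Walker equations R a = -r[1:].
          a = solve_toeplitz(r[:order], -r[1:order + 1])
          sigma2 = r[0] + np.dot(a, r[1:order + 1])
          # AR power spectral density on a frequency grid up to Nyquist.
          freqs = np.linspace(0.0, fs / 2.0, 512)
          k = np.arange(1, order + 1)
          denom = np.abs(1.0 + np.exp(-2j * np.pi * np.outer(freqs, k) / fs) @ a) ** 2
          psd = sigma2 / denom
          # Dominant frequency restricted to the band of interest.
          mask = (freqs >= band[0]) & (freqs <= band[1])
          return freqs[mask][np.argmax(psd[mask])]

      # Hypothetical downsampled atrial electrogram: a 6 Hz component plus noise.
      fs = 37.5
      t = np.arange(0, 20.478, 1.0 / fs)
      rng = np.random.default_rng(9)
      aeg = np.sin(2 * np.pi * 6.0 * t) + 0.5 * rng.normal(size=t.size)
      print(ar_dominant_frequency(aeg, fs))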

  14. Polarization-correlation analysis of maps of optical anisotropy biological layers

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. A.; Dubolazov, A. V.; Prysyazhnyuk, V. S.; Marchuk, Y. F.; Pashkovskaya, N. V.; Motrich, A. V.; Novakovskaya, O. Y.

    2014-08-01

    A new information-optical technique for diagnosing the structure of polycrystalline films of bile is proposed. A Mueller-matrix model is suggested for describing the mechanisms of optical anisotropy of such objects, namely optical activity, birefringence, and linear and circular dichroism. An ensemble of informationally relevant, azimuthally stable Mueller-matrix invariants is determined. Statistical analysis of the distributions of these parameters yielded objective criteria for differentiating bile films taken from healthy donors and from patients with type 2 diabetes. From the standpoint of evidence-based medicine, the operational characteristics (sensitivity, specificity and accuracy) of the information-optical method of Mueller-matrix mapping of polycrystalline bile films were determined, and its efficiency in diagnosing the extent of type 2 diabetes was demonstrated. Prospects for applying this method to the diagnosis of cirrhosis are considered.

  15. Human pose tracking from monocular video by traversing an image motion mapped body pose manifold

    NASA Astrophysics Data System (ADS)

    Basu, Saurav; Poulin, Joshua; Acton, Scott T.

    2010-01-01

    Tracking human pose from monocular video sequences is a challenging problem due to the large number of independent parameters affecting image appearance and nonlinear relationships between generating parameters and the resultant images. Unlike the current practice of fitting interpolation functions to point correspondences between underlying pose parameters and image appearance, we exploit the relationship between pose parameters and image motion flow vectors in a physically meaningful way. Change in image appearance due to pose change is realized as navigating a low dimensional submanifold of the infinite dimensional Lie group of diffeomorphisms of the two dimensional sphere S2. For small changes in pose, image motion flow vectors lie on the tangent space of the submanifold. Any observed image motion flow vector field is decomposed into the basis motion vector flow fields on the tangent space and combination weights are used to update corresponding pose changes in the different dimensions of the pose parameter space. Image motion flow vectors are largely invariant to style changes in experiments with synthetic and real data where the subjects exhibit variation in appearance and clothing. The experiments demonstrate the robustness of our method (within +/-4° of ground truth) to style variance.
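
    The core decomposition step can be sketched as an ordinary least-squares projection of the observed flow field onto the basis flow fields (the tangent directions of the pose submanifold), with the recovered weights driving the pose update; all fields below are synthetic.

      import numpy as np

      # Hypothetical flattened motion flow fields: each basis field is the image
      # flow produced by a small change in one pose parameter (a tangent direction).
      rng = np.random.default_rng(10)
      n_pixels, n_pose_dims = 64 * 64 * 2, 6            # 2 flow components per pixel
      basis_flows = rng.normal(size=(n_pixels, n_pose_dims))

      # Observed flow = combination of basis flows plus noise (synthetic example).
      true_weights = np.array([0.5, -0.2, 0.0, 0.1, 0.0, 0.3])
      observed_flow = basis_flows @ true_weights + 0.05 * rng.normal(size=n_pixels)

      # Least-squares decomposition onto the tangent-space basis; the recovered
      # weights drive the update of the corresponding pose parameters.
      weights, *_ = np.linalg.lstsq(basis_flows, observed_flow, rcond=None)
      pose = np.zeros(n_pose_dims)
      pose += weights                                   # pose update for this frame
      print(np.round(weights, 3))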

  16. Spatiotemporal analysis and mapping of oral cancer risk in changhua county (taiwan): an application of generalized bayesian maximum entropy method.

    PubMed

    Yu, Hwa-Lung; Chiang, Chi-Ting; Lin, Shu-De; Chang, Tsun-Kuo

    2010-02-01

    The incidence rate of oral cancer in Changhua County was the highest among the 23 counties of Taiwan during 2001. However, in health data analysis, crude or adjusted incidence rates of a rare event (e.g., cancer) for small populations often exhibit high variances and are, thus, less reliable. We proposed a generalized Bayesian Maximum Entropy (GBME) analysis of spatiotemporal disease mapping under conditions of considerable data uncertainty. GBME was used to study the oral cancer population incidence in Changhua County (Taiwan). Methodologically, GBME is based on an epistemic principles framework and generates spatiotemporal estimates of oral cancer incidence rates. In a way, it accounts for the multi-sourced uncertainty of rates, including small population effects, and the composite space-time dependence of rare events in terms of an extended Poisson-based semivariogram. The results showed that GBME analysis alleviates the noise in the oral cancer data arising from the population size effect. Compared to the raw incidence data, the maps of GBME-estimated results can identify high-risk oral cancer regions in Changhua County, where the prevalence of betel quid chewing and cigarette smoking is relatively higher than in the rest of the areas. The GBME method is a valuable tool for spatiotemporal disease mapping under conditions of uncertainty. 2010 Elsevier Inc. All rights reserved.

  17. Temporal analysis and spatial mapping of Lymantria dispar nuclear polyhedrosis virus transcripts and in vitro translation polypeptides

    Treesearch

    James M. Slavicek

    1991-01-01

    Genomic expression of the Lymantria dispar multinucleocapsid nuclear polyhedrosis virus (LdMNPV) was studied. Virus-specific transcripts expressed in cell culture at various times from 2 through 72 h postinfection were identified and their genomic origins mapped through Northern analysis. Sixty-five distinct transcripts were identified in this...

  18. Usability Evaluation of Public Web Mapping Sites

    NASA Astrophysics Data System (ADS)

    Wang, C.

    2014-04-01

    Web mapping sites are interactive maps that are accessed via Web pages. With the rapid development of the Internet and the Geographic Information System (GIS) field, public web mapping sites have become familiar to most people. Nowadays, people use these web mapping sites for various reasons, as an increasing number of maps and related map services are freely available to end users. The growing number of users of web mapping sites has, in turn, led to more usability studies. Usability Engineering (UE), for instance, is an approach for analyzing and improving the usability of websites by examining and evaluating an interface. In this research, the UE method was employed to explore usability problems of four public web mapping sites, analyze the problems quantitatively and provide guidelines for future design based on the test results. Firstly, the development of usability studies is described, and several usability evaluation methods such as Usability Engineering (UE), User-Centered Design (UCD) and Human-Computer Interaction (HCI) are briefly introduced. Then the method and procedure of the usability test experiments are presented in detail. In this usability evaluation experiment, four public web mapping sites (Google Maps, Bing Maps, MapQuest, Yahoo Maps) were chosen as the test websites, and 42 people with different GIS skills (test users or experts), genders, ages and nationalities participated in the test, completing several test tasks in different teams. The test comprised three parts: a pretest background information questionnaire, several test tasks for quantitative statistics and progress analysis, and a posttest questionnaire. The pretest and posttest questionnaires focused on gaining qualitative verbal explanations of the participants' actions, while the test tasks were designed to gather quantitative data on the errors and problems of the websites. Then, the results mainly from the test part were analyzed. The

  19. Detection of myocardial ischemia by automated, motion-corrected, color-encoded perfusion maps compared with visual analysis of adenosine stress cardiovascular magnetic resonance imaging at 3 T: a pilot study.

    PubMed

    Doesch, Christina; Papavassiliu, Theano; Michaely, Henrik J; Attenberger, Ulrike I; Glielmi, Christopher; Süselbeck, Tim; Fink, Christian; Borggrefe, Martin; Schoenberg, Stefan O

    2013-09-01

    The purpose of this study was to compare automated, motion-corrected, color-encoded (AMC) perfusion maps with qualitative visual analysis of adenosine stress cardiovascular magnetic resonance imaging for detection of flow-limiting stenoses. Myocardial perfusion measurements applying the standard adenosine stress imaging protocol and a saturation-recovery temporal generalized autocalibrating partially parallel acquisition (t-GRAPPA) turbo fast low angle shot (Turbo FLASH) magnetic resonance imaging sequence were performed in 25 patients using a 3.0-T MAGNETOM Skyra (Siemens Healthcare Sector, Erlangen, Germany). Perfusion studies were analyzed using AMC perfusion maps and qualitative visual analysis. Angiographically detected coronary artery (CA) stenoses greater than 75% or 50% or more with a myocardial perfusion reserve index less than 1.5 were considered as hemodynamically relevant. Diagnostic performance and time requirement for both methods were compared. Interobserver and intraobserver reliability were also assessed. A total of 29 CA stenoses were included in the analysis. Sensitivity, specificity, positive predictive value, negative predictive value, and accuracy for detection of ischemia on a per-patient basis were comparable using the AMC perfusion maps compared to visual analysis. On a per-CA territory basis, the attribution of an ischemia to the respective vessel was facilitated using the AMC perfusion maps. Interobserver and intraobserver reliability were better for the AMC perfusion maps (concordance correlation coefficient, 0.94 and 0.93, respectively) compared to visual analysis (concordance correlation coefficient, 0.73 and 0.79, respectively). In addition, in comparison to visual analysis, the AMC perfusion maps were able to significantly reduce analysis time from 7.7 (3.1) to 3.2 (1.9) minutes (P < 0.0001). The AMC perfusion maps yielded a diagnostic performance on a per-patient and on a per-CA territory basis comparable with the visual analysis

  20. MareyMap Online: A User-Friendly Web Application and Database Service for Estimating Recombination Rates Using Physical and Genetic Maps.

    PubMed

    Siberchicot, Aurélie; Bessy, Adrien; Guéguen, Laurent; Marais, Gabriel A B

    2017-10-01

    Given the importance of meiotic recombination in biology, there is a need to develop robust methods to estimate meiotic recombination rates. A popular approach, called the Marey map approach, relies on comparing genetic and physical maps of a chromosome to estimate local recombination rates. In the past, we have implemented this approach in an R package called MareyMap, which includes many functionalities useful to get reliable recombination rate estimates in a semi-automated way. MareyMap has been used repeatedly in studies looking at the effect of recombination on genome evolution. Here, we propose a simpler user-friendly web service version of MareyMap, called MareyMap Online, which allows a user to get recombination rates from her/his own data or from a publicly available database that we offer in a few clicks. When the analysis is done, the user is asked whether her/his curated data can be placed in the database and shared with other users, which we hope will make meta-analysis on recombination rates including many species easy in the future. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
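
    A minimal sketch of the Marey map idea itself, estimating the local recombination rate as the slope of genetic position against physical position in a sliding window; this is a simplification of MareyMap's interpolation methods, and the marker data below are synthetic.

      import numpy as np

      # Hypothetical Marey map data for one chromosome: physical (Mb) and genetic
      # (cM) positions of mapped markers, both monotonically increasing.
      rng = np.random.default_rng(11)
      phys = np.sort(rng.uniform(0, 30, 200))                    # Mb
      gen = np.cumsum(rng.exponential(0.5, 200))                 # cM, synthetic

      def local_rate(phys, gen, window_mb=4.0, step_mb=1.0):
          """Sliding-window slope of genetic vs physical position, in cM/Mb."""
          centres = np.arange(phys.min() + window_mb / 2, phys.max(), step_mb)
          rates = []
          for c in centres:
              sel = np.abs(phys - c) <= window_mb / 2
              if sel.sum() >= 3:
                  slope = np.polyfit(phys[sel], gen[sel], 1)[0]
                  rates.append((c, slope))
          return np.array(rates)

      for centre, rate in local_rate(phys, gen)[:5]:
          print(f"{centre:5.1f} Mb : {rate:4.2f} cM/Mb")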