Sample records for likelihood continuity mapping

  1. Speech processing using conditional observable maximum likelihood continuity mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogden, John; Nix, David

    A computer implemented method enables the recognition of speech and speech characteristics. Parameters are initialized of first probability density functions that map between the symbols in the vocabulary of one or more sequences of speech codes that represent speech sounds and a continuity map. Parameters are also initialized of second probability density functions that map between the elements in the vocabulary of one or more desired sequences of speech transcription symbols and the continuity map. The parameters of the probability density functions are then trained to maximize the probabilities of the desired sequences of speech-transcription symbols. A new sequence of speech codes is then input to the continuity map having the trained first and second probability function parameters. A smooth path is identified on the continuity map that has the maximum probability for the new sequence of speech codes. The probability of each speech transcription symbol for each input speech code can then be output.

  2. Speech processing using maximum likelihood continuity mapping

    DOEpatents

    Hogden, John E.

    2000-01-01

    Speech processing is obtained that, given a probabilistic mapping between static speech sounds and pseudo-articulator positions, allows sequences of speech sounds to be mapped to smooth sequences of pseudo-articulator positions. In addition, a method for learning a probabilistic mapping between static speech sounds and pseudo-articulator position is described. The method for learning the mapping between static speech sounds and pseudo-articulator position uses a set of training data composed only of speech sounds. The said speech processing can be applied to various speech analysis tasks, including speech recognition, speaker recognition, speech coding, speech synthesis, and voice mimicry.

  3. Speech processing using maximum likelihood continuity mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogden, J.E.

    Speech processing is obtained that, given a probabilistic mapping between static speech sounds and pseudo-articulator positions, allows sequences of speech sounds to be mapped to smooth sequences of pseudo-articulator positions. In addition, a method for learning a probabilistic mapping between static speech sounds and pseudo-articulator position is described. The method for learning the mapping between static speech sounds and pseudo-articulator position uses a set of training data composed only of speech sounds. The said speech processing can be applied to various speech analysis tasks, including speech recognition, speaker recognition, speech coding, speech synthesis, and voice mimicry.

  4. Likelihood-based modification of experimental crystal structure electron density maps

    DOEpatents

    Terwilliger, Thomas C. [Santa Fe, NM]

    2005-04-16

    A maximum-likelihood method improves an electron density map of an experimental crystal structure. A likelihood of a set of structure factors {F_h} is formed for the experimental crystal structure as (1) the likelihood of having obtained an observed set of structure factors {F_h^OBS} if structure factor set {F_h} was correct, and (2) the likelihood that an electron density map resulting from {F_h} is consistent with selected prior knowledge about the experimental crystal structure. The set of structure factors {F_h} is then adjusted to maximize the likelihood of {F_h} for the experimental crystal structure. An improved electron density map is constructed with the maximized structure factors.

  5. A maximum likelihood map of chromosome 1.

    PubMed Central

    Rao, D C; Keats, B J; Lalouel, J M; Morton, N E; Yee, S

    1979-01-01

    Thirteen loci are mapped on chromosome 1 from genetic evidence. The maximum likelihood map presented permits confirmation that Scianna (SC) and a fourteenth locus, phenylketonuria (PKU), are on chromosome 1, although the location of the latter on the PGM1-AMY segment is uncertain. Eight other controversial genetic assignments are rejected, providing a practical demonstration of the resolution which maximum likelihood theory brings to mapping. PMID:293128

  6. Geological mapping in northwestern Saudi Arabia using LANDSAT multispectral techniques

    NASA Technical Reports Server (NTRS)

    Blodget, H. W.; Brown, G. F.; Moik, J. G.

    1975-01-01

    Various computer enhancement and data extraction systems using LANDSAT data were assessed and used to complement a continuing geologic mapping program. Interactive digital classification techniques using both the parallelepiped and maximum-likelihood statistical approaches achieved very limited success in areas of highly dissected terrain. Computer-enhanced imagery developed by color compositing stretched MSS ratio data was constructed for a test site in northwestern Saudi Arabia. Initial results indicate that several igneous and sedimentary rock types can be discriminated.

  7. Distributed multimodal data fusion for large scale wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Ertin, Emre

    2006-05-01

    Sensor network technology has enabled new surveillance systems where sensor nodes equipped with processing and communication capabilities can collaboratively detect, classify and track targets of interest over a large surveillance area. In this paper we study distributed fusion of multimodal sensor data for extracting target information from a large scale sensor network. Optimal tracking, classification, and reporting of threat events require joint consideration of multiple sensor modalities. Multiple sensor modalities improve tracking by reducing the uncertainty in the track estimates as well as resolving track-sensor data association problems. Our approach to solving the fusion problem with a large number of multimodal sensors is the construction of likelihood maps. The likelihood maps provide summary data for the solution of the detection, tracking and classification problem. The likelihood map presents the sensory information in a format that is easy for decision makers to interpret and is suitable for fusion with spatial prior information such as maps and imaging data from stand-off imaging sensors. We follow a statistical approach to combine sensor data at different levels of uncertainty and resolution. Each sensor data stream is transformed into a spatio-temporal likelihood map ideally suited for fusion with imaging sensor outputs and prior geographic information about the scene. We also discuss distributed computation of the likelihood map using a gossip-based algorithm and present simulation results.
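
    As an illustration of the distributed-fusion idea, the sketch below (not from the paper) applies randomized gossip averaging to per-node log-likelihood maps, so every node converges to the network-wide average map. The map size, number of rounds, and pairwise node selection are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gossip_average(local_maps, rounds=200):
    """Randomized gossip: repeatedly pick two nodes at random and average
    their local log-likelihood maps; every node converges to the global
    mean map without any central fusion node."""
    maps = [m.copy() for m in local_maps]
    n = len(maps)
    for _ in range(rounds):
        i, j = rng.choice(n, size=2, replace=False)
        avg = 0.5 * (maps[i] + maps[j])
        maps[i], maps[j] = avg, avg.copy()
    return maps

# Toy example: 5 sensors, each holding a 20x20 log-likelihood map of the scene.
local = [rng.normal(size=(20, 20)) for _ in range(5)]
fused = gossip_average(local)
print(np.allclose(fused[0], np.mean(local, axis=0), atol=1e-2))  # True
```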

  8. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    PubMed

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumors and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
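
    A minimal sketch of the weighting step described above: posterior tissue probabilities (atlas prior times MR-intensity likelihood) weight nominal class attenuation coefficients. The coefficient values and the three-class setup are illustrative assumptions, not the authors' exact numbers.

```python
import numpy as np

# Nominal 511 keV linear attenuation coefficients (1/cm); illustrative values.
MU = np.array([0.0, 0.096, 0.151])  # air, soft tissue, bone

def continuous_mu_map(prior, likelihood):
    """prior, likelihood: arrays of shape (..., 3) over (air, soft, bone).
    Posterior is proportional to prior * likelihood; the mu-map is the
    posterior-weighted average of the class attenuation coefficients."""
    post = prior * likelihood
    post /= post.sum(axis=-1, keepdims=True)
    return post @ MU

prior = np.array([[0.1, 0.7, 0.2]])   # atlas: voxel is probably soft tissue
lik = np.array([[0.05, 0.15, 0.80]])  # MR intensity: looks like bone
print(continuous_mu_map(prior, lik))  # value between soft tissue and bone
```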

  9. Map and map database of susceptibility to slope failure by sliding and earthflow in the Oakland area, California

    USGS Publications Warehouse

    Pike, R.J.; Graymer, R.W.; Roberts, Sebastian; Kalman, N.B.; Sobieszczyk, Steven

    2001-01-01

    Map data that predict the varying likelihood of landsliding can help public agencies make informed decisions on land use and zoning. This map, prepared in a geographic information system from a statistical model, estimates the relative likelihood of local slopes to fail by two processes common to an area of diverse geology, terrain, and land use centered on metropolitan Oakland. The model combines the following spatial data: (1) 120 bedrock and surficial geologic-map units, (2) ground slope calculated from a 30-m digital elevation model, (3) an inventory of 6,714 old landslide deposits (not distinguished by age or type of movement and excluding debris flows), and (4) the locations of 1,192 post-1970 landslides that damaged the built environment. The resulting index of likelihood, or susceptibility, plotted as a 1:50,000-scale map, is computed as a continuous variable over a large area (872 km²) at a comparatively fine (30 m) resolution. This new model complements landslide inventories by estimating susceptibility between existing landslide deposits, and improves upon prior susceptibility maps by quantifying the degree of susceptibility within those deposits. Susceptibility is defined for each geologic-map unit as the spatial frequency (areal percentage) of terrain occupied by old landslide deposits, adjusted locally by steepness of the topography. Susceptibility of terrain between the old landslide deposits is read directly from a slope histogram for each geologic-map unit, as the percentage (0.00 to 0.90) of 30-m cells in each one-degree slope interval that coincides with the deposits. Susceptibility within landslide deposits (0.00 to 1.33) is this same percentage raised by a multiplier (1.33) derived from the comparative frequency of recent failures within and outside the old deposits. Positive results from two evaluations of the model encourage its extension to the 10-county San Francisco Bay region and elsewhere. A similar map could be prepared for any area where the three basic constituents, a geologic map, a landslide inventory, and a slope map, are available in digital form. Added predictive power of the new susceptibility model may reside in attributes that remain to be explored: among them seismic shaking, distance to nearest road, and terrain elevation, aspect, relief, and curvature.
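
    The susceptibility computation described above can be sketched as follows. Only the one-degree slope intervals and the 1.33 multiplier come from the abstract; the array names, grid layout, and 0-90 degree binning are assumptions.

```python
import numpy as np

def susceptibility(unit, slope_deg, in_deposit, target_unit):
    """Susceptibility for one geologic-map unit: the fraction of the unit's
    30-m cells in each one-degree slope interval that lie inside old
    landslide deposits, raised by 1.33 within the deposits themselves."""
    sel = unit == target_unit
    bins = np.clip(np.floor(slope_deg).astype(int), 0, 90)
    frac = np.zeros(91)
    for b in np.unique(bins[sel]):
        cells = sel & (bins == b)
        frac[b] = in_deposit[cells].mean()
    s = frac[bins]                         # look up each cell's slope bin
    s = np.where(in_deposit, 1.33 * s, s)  # multiplier within old deposits
    return np.where(sel, s, np.nan)

# Toy grid: one unit, slopes 0-30 degrees, 20% of cells mapped as deposits.
rng = np.random.default_rng(2)
unit = np.zeros(1000, dtype=int)
slope = rng.uniform(0, 30, size=1000)
deposit = rng.random(1000) < 0.2
print(np.nanmax(susceptibility(unit, slope, deposit, target_unit=0)))
```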

  10. Robust Likelihoods for Inflationary Gravitational Waves from Maps of Cosmic Microwave Background Polarization

    NASA Technical Reports Server (NTRS)

    Switzer, Eric Ryan; Watts, Duncan J.

    2016-01-01

    The B-mode polarization of the cosmic microwave background provides a unique window into tensor perturbations from inflationary gravitational waves. Survey effects complicate the estimation and description of the power spectrum on the largest angular scales. The pixel-space likelihood yields parameter distributions without the power spectrum as an intermediate step, but it does not have the large suite of tests available to power spectral methods. Searches for primordial B-modes must rigorously reject and rule out contamination. Many forms of contamination vary or are uncorrelated across epochs, frequencies, surveys, or other data treatment subsets. The cross power and the power spectrum of the difference of subset maps provide approaches to reject and isolate excess variance. We develop an analogous joint pixel-space likelihood. Contamination not modeled in the likelihood produces parameter-dependent bias and complicates the interpretation of the difference map. We describe a null test that consistently weights the difference map. Excess variance should either be explicitly modeled in the covariance or be removed through reprocessing the data.

  11. Urinary bladder segmentation in CT urography using deep-learning convolutional neural network and level sets

    PubMed Central

    Cha, Kenny H.; Hadjiiski, Lubomir; Samala, Ravi K.; Chan, Heang-Ping; Caoili, Elaine M.; Cohan, Richard H.

    2016-01-01

    Purpose: The authors are developing a computerized system for bladder segmentation in CT urography (CTU) as a critical component for computer-aided detection of bladder cancer. Methods: A deep-learning convolutional neural network (DL-CNN) was trained to distinguish between the inside and the outside of the bladder using 160,000 regions of interest (ROIs) from CTU images. The trained DL-CNN was used to estimate the likelihood of an ROI being inside the bladder for ROIs centered at each voxel in a CTU case, resulting in a likelihood map. Thresholding and hole-filling were applied to the map to generate the initial contour for the bladder, which was then refined by 3D and 2D level sets. The segmentation performance was evaluated using 173 cases: 81 cases in the training set (42 lesions, 21 wall thickenings, and 18 normal bladders) and 92 cases in the test set (43 lesions, 36 wall thickenings, and 13 normal bladders). The computerized segmentation accuracy using the DL likelihood map was compared to that using a likelihood map generated by Haar features and a random forest classifier, and that using our previous conjoint level set analysis and segmentation system (CLASS) without using a likelihood map. All methods were evaluated relative to the 3D hand-segmented reference contours. Results: With DL-CNN-based likelihood map and level sets, the average volume intersection ratio, average percent volume error, average absolute volume error, average minimum distance, and the Jaccard index for the test set were 81.9% ± 12.1%, 10.2% ± 16.2%, 14.0% ± 13.0%, 3.6 ± 2.0 mm, and 76.2% ± 11.8%, respectively. With the Haar-feature-based likelihood map and level sets, the corresponding values were 74.3% ± 12.7%, 13.0% ± 22.3%, 20.5% ± 15.7%, 5.7 ± 2.6 mm, and 66.7% ± 12.6%, respectively. With our previous CLASS with local contour refinement (LCR) method, the corresponding values were 78.0% ± 14.7%, 16.5% ± 16.8%, 18.2% ± 15.0%, 3.8 ± 2.3 mm, and 73.9% ± 13.5%, respectively. Conclusions: The authors demonstrated that the DL-CNN can overcome the strong boundary between two regions that have a large difference in gray levels and provides a seamless mask to guide level set segmentation, which has been a problem for many gradient-based segmentation methods. Compared to our previous CLASS with LCR method, which required two user inputs to initialize the segmentation, DL-CNN with level sets achieved better segmentation performance while using a single user input. Compared to the Haar-feature-based likelihood map, the DL-CNN-based likelihood map could guide the level sets to achieve better segmentation. The results demonstrate the feasibility of our new approach of using DL-CNN in combination with level sets for segmentation of the bladder. PMID:27036584
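
    A minimal sketch of the initial-contour step (thresholding plus hole-filling of the DL-CNN likelihood map). The threshold value and the largest-connected-component selection are assumptions, not details from the paper.

```python
import numpy as np
from scipy import ndimage

def initial_bladder_mask(likelihood_map, thresh=0.5):
    """Threshold the voxel-wise inside-bladder likelihood map, keep the
    largest connected component, and fill holes to get the level-set seed."""
    mask = likelihood_map > thresh
    labels, n = ndimage.label(mask)
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    mask = labels == (1 + int(np.argmax(sizes)))
    return ndimage.binary_fill_holes(mask)

# Toy volume: a 3x3x3 high-likelihood cube with one interior "hole" voxel.
lik = np.zeros((5, 5, 5)); lik[1:4, 1:4, 1:4] = 0.9; lik[2, 2, 2] = 0.1
print(initial_bladder_mask(lik)[2, 2, 2])  # hole filled -> True
```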

  12. Lod scores for gene mapping in the presence of marker map uncertainty.

    PubMed

    Stringham, H M; Boehnke, M

    2001-07-01

    Multipoint lod scores are typically calculated for a grid of locus positions, moving the putative disease locus across a fixed map of genetic markers. Changing the order of a set of markers and/or the distances between the markers can make a substantial difference in the resulting lod score curve and the location and height of its maximum. The typical approach of using the best maximum likelihood marker map is not easily justified if other marker orders are nearly as likely and give substantially different lod score curves. To deal with this problem, we propose three weighted multipoint lod score statistics that make use of information from all plausible marker orders. In each of these statistics, the information conditional on a particular marker order is included in a weighted sum, with weight equal to the posterior probability of that order. We evaluate the type 1 error rate and power of these three statistics on the basis of results from simulated data, and compare these results to those obtained using the best maximum likelihood map and the map with the true marker order. We find that the lod score based on a weighted sum of maximum likelihoods improves on using only the best maximum likelihood map, having a type 1 error rate and power closest to that of using the true marker order in the simulation scenarios we considered. Copyright 2001 Wiley-Liss, Inc.
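
    A hedged sketch of one of the weighted statistics described (a posterior-weighted sum of per-order maximum likelihoods, re-expressed as a lod curve); the exact form of the authors' three statistics may differ.

```python
import numpy as np

def weighted_lod(lods, log_posterior):
    """lods: per-order multipoint lod curves, shape (n_orders, n_positions);
    log_posterior: log posterior probability of each marker order.
    Converts lods back to likelihood ratios, forms the posterior-weighted
    sum across orders, and re-takes log10."""
    w = np.exp(log_posterior - np.max(log_posterior))
    w /= w.sum()
    return np.log10(np.tensordot(w, 10.0 ** np.asarray(lods), axes=1))

# Two plausible marker orders with posterior weights 0.7 and 0.3.
lods = [[1.2, 2.5, 1.9], [0.8, 1.7, 2.2]]
print(weighted_lod(lods, np.log([0.7, 0.3])))
```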

  13. GRASS 3.0 Programmer’s Manual

    DTIC Science & Technology

    1989-09-01

    GRASS continues its development with several key objectives as a guide. The programmer should be aware of... They do NOT assume anything about byte ordering in the CPU. This means that the value is stored using as many bytes as required by an integer on... maximum likelihood classifier to produce a landcover map. Derived cell files can be the results of image classification procedures such as clustering

  14. Neural networks for learning and prediction with applications to remote sensing and speech perception

    NASA Astrophysics Data System (ADS)

    Gjaja, Marin N.

    1997-11-01

    Neural networks for supervised and unsupervised learning are developed and applied to problems in remote sensing, continuous map learning, and speech perception. Adaptive Resonance Theory (ART) models are real-time neural networks for category learning, pattern recognition, and prediction. Unsupervised fuzzy ART networks synthesize fuzzy logic and neural networks, and supervised ARTMAP networks incorporate ART modules for prediction and classification. New ART and ARTMAP methods resulting from analyses of data structure, parameter specification, and category selection are developed. Architectural modifications providing flexibility for a variety of applications are also introduced and explored. A new methodology for automatic mapping from Landsat Thematic Mapper (TM) and terrain data, based on fuzzy ARTMAP, is developed. System capabilities are tested on a challenging remote sensing problem, prediction of vegetation classes in the Cleveland National Forest from spectral and terrain features. After training at the pixel level, performance is tested at the stand level, using sites not seen during training. Results are compared to those of maximum likelihood classifiers, back propagation neural networks, and K-nearest neighbor algorithms. Best performance is obtained using a hybrid system based on a convex combination of fuzzy ARTMAP and maximum likelihood predictions. This work forms the foundation for additional studies exploring fuzzy ARTMAP's capability to estimate class mixture composition for non-homogeneous sites. Exploratory simulations apply ARTMAP to the problem of learning continuous multidimensional mappings. A novel system architecture retains basic ARTMAP properties of incremental and fast learning in an on-line setting while adding components to solve this class of problems. The perceptual magnet effect is a language-specific phenomenon arising early in infant speech development that is characterized by a warping of speech sound perception. An unsupervised neural network model is proposed that embodies two principal hypotheses supported by experimental data--that sensory experience guides language-specific development of an auditory neural map and that a population vector can predict psychological phenomena based on map cell activities. Model simulations show how a nonuniform distribution of map cell firing preferences can develop from language-specific input and give rise to the magnet effect.

  15. Efficient Bit-to-Symbol Likelihood Mappings

    NASA Technical Reports Server (NTRS)

    Moision, Bruce E.; Nakashima, Michael A.

    2010-01-01

    This innovation is an efficient algorithm designed to perform bit-to-symbol and symbol-to-bit likelihood mappings that represent a significant portion of the complexity of an error-correction code decoder for high-order constellations. Recent implementation of the algorithm in hardware has yielded an 8-percent reduction in overall area relative to the prior design.
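
    The record does not detail the algorithm itself, but the standard symbol-to-bit likelihood mapping it accelerates can be sketched with the common max-log approximation; the bit-labeling convention and the toy constellation are assumptions.

```python
import numpy as np

def bit_llrs(symbol_loglik, bits_per_symbol):
    """Symbol-to-bit likelihood mapping (max-log approximation).
    symbol_loglik: log-likelihood of each of 2**m constellation symbols,
    indexed by the integer whose bits label the symbol. Returns one
    log-likelihood ratio per bit (positive means the bit is likely 0)."""
    m = bits_per_symbol
    labels = np.arange(2 ** m)
    llr = np.empty(m)
    for b in range(m):
        bit = (labels >> b) & 1
        llr[b] = symbol_loglik[bit == 0].max() - symbol_loglik[bit == 1].max()
    return llr

# 4 symbols (2 bits): symbol 2 (binary 10) is most likely, so bit0 -> 0, bit1 -> 1.
print(bit_llrs(np.log([0.05, 0.1, 0.7, 0.15]), 2))
```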

  16. Contamination of food products with Mycobacterium avium paratuberculosis: a systematic review.

    PubMed

    Eltholth, M M; Marsh, V R; Van Winden, S; Guitian, F J

    2009-10-01

    Although a causal link between Mycobacterium avium subspecies paratuberculosis (MAP) and Crohn's disease has not been proved, previous studies suggest that the potential routes of human exposure to MAP should be investigated. We conducted a systematic review of the literature concerning the likelihood of contamination of food products with MAP and the likely changes in the quantity of MAP in dairy and meat products along their respective production chains. Relevant data were extracted from 65 research papers and synthesized qualitatively. Although estimates of the prevalence of Johne's disease are scarce, particularly for non-dairy herds, the available data suggest that the likelihood of contamination of raw milk with MAP in most studied regions is substantial. The presence of MAP in raw and pasteurized milk has been the subject of several studies which show that pasteurized milk is not always MAP-free and that the effectiveness of pasteurization in inactivating MAP depends on the initial concentration of the agent in raw milk. The most recent studies indicated that beef can be contaminated with MAP via dissemination of the pathogen in the tissues of infected animals. Currently available data suggest that the likelihood of dairy and meat products being contaminated with MAP on retail sale should not be ignored.

  17. Spacecraft Charging and the Microwave Anisotropy Probe Spacecraft

    NASA Technical Reports Server (NTRS)

    Timothy, VanSant J.; Neergaard, Linda F.

    1998-01-01

    The Microwave Anisotropy Probe (MAP), a MIDEX mission built in partnership between Princeton University and the NASA Goddard Space Flight Center (GSFC), will study the cosmic microwave background. It will be inserted into a highly elliptical earth orbit for several weeks and then use a lunar gravity assist to orbit around the second Lagrangian point (L2), 1.5 million kilometers anti-sunward from the earth. The charging environment for the phasing loops and at L2 was evaluated. There is a limited set of data for L2; the GEOTAIL spacecraft measured relatively low spacecraft potentials (approx. 50 V maximum) near L2. The main area of concern for charging on the MAP spacecraft is the well-established threat posed by the "geosynchronous region" between 6-10 Re. The launch in the autumn of 2000 will coincide with the declining phase of the solar maximum, a period when the likelihood of a substorm is higher than usual. The likelihood of a substorm at that time has been roughly estimated to be on the order of 20% for a typical MAP mission profile. Because of the possibility of spacecraft charging, a requirement for conductive spacecraft surfaces was established early in the program. Subsequent NASCAP/GEO analyses for the MAP spacecraft demonstrated that a significant portion of the sunlit surface (solar cell cover glass and sunshade) could have nonconductive surfaces without significantly raising differential charging. The need for conductive materials on surfaces continually in eclipse has also been reinforced by NASCAP analyses.

  18. Laser-Based Slam with Efficient Occupancy Likelihood Map Learning for Dynamic Indoor Scenes

    NASA Astrophysics Data System (ADS)

    Li, Li; Yao, Jian; Xie, Renping; Tu, Jinge; Feng, Chen

    2016-06-01

    Location-Based Services (LBS) have attracted growing attention in recent years, especially in indoor environments. The fundamental technique behind LBS is map building for unknown environments, also known as simultaneous localization and mapping (SLAM) in the robotics community. In this paper, we propose a novel approach for SLAM in dynamic indoor scenes based on a 2D laser scanner mounted on a mobile Unmanned Ground Vehicle (UGV), with the help of a grid-based occupancy likelihood map. Instead of applying scan matching to two adjacent scans, we propose to match the current scan with the occupancy likelihood map learned from all previous scans at multiple scales, to avoid the accumulation of matching errors. Because the points in a scan are acquired sequentially rather than simultaneously, scan distortion unavoidably exists to varying extents. To compensate for the scan distortion caused by the motion of the UGV, we propose to integrate the velocity of the laser range finder (LRF) into the scan matching optimization framework. Besides, to reduce as much as possible the effect of dynamic objects such as walking pedestrians, which often exist in indoor scenes, we propose a new occupancy likelihood map learning strategy that increases or decreases the probability of each occupancy grid cell after each scan matching. Experimental results in several challenging indoor scenes demonstrate that our proposed approach is capable of providing high-precision SLAM results.
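
    A minimal sketch of the kind of per-cell probability update described in the last contribution, using the common log-odds parameterization; the constants and clamping bounds are illustrative assumptions.

```python
import numpy as np

L_OCC, L_FREE, L_MIN, L_MAX = 0.4, -0.2, -4.0, 4.0  # illustrative constants

def update_occupancy(logodds, hit_cells, miss_cells):
    """After each scan matching, raise the log-odds of cells the beam
    endpoints hit and lower those the beams passed through, so transient
    objects (e.g., pedestrians) fade from the likelihood map over time."""
    logodds[hit_cells] = np.minimum(logodds[hit_cells] + L_OCC, L_MAX)
    logodds[miss_cells] = np.maximum(logodds[miss_cells] + L_FREE, L_MIN)
    return logodds

def occupancy_prob(logodds):
    return 1.0 / (1.0 + np.exp(-logodds))

grid = np.zeros((10, 10))
grid = update_occupancy(grid, hit_cells=(np.array([2]), np.array([3])),
                        miss_cells=(np.array([2]), np.array([2])))
print(occupancy_prob(grid)[2, 3], occupancy_prob(grid)[2, 2])
```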

  19. A Maximum Likelihood Approach to Functional Mapping of Longitudinal Binary Traits

    PubMed Central

    Wang, Chenguang; Li, Hongying; Wang, Zhong; Wang, Yaqun; Wang, Ningtao; Wang, Zuoheng; Wu, Rongling

    2013-01-01

    Despite their importance in biology and biomedicine, genetic mapping of binary traits that change over time has not been well explored. In this article, we develop a statistical model for mapping quantitative trait loci (QTLs) that govern longitudinal responses of binary traits. The model is constructed within the maximum likelihood framework by which the association between binary responses is modeled in terms of conditional log odds-ratios. With this parameterization, the maximum likelihood estimates (MLEs) of marginal mean parameters are robust to the misspecification of time dependence. We implement an iterative procedure to obtain the MLEs of QTL genotype-specific parameters that define longitudinal binary responses. The usefulness of the model was validated by analyzing a real example in rice. Simulation studies were performed to investigate the statistical properties of the model, showing that the model has power to identify and map specific QTLs responsible for the temporal pattern of binary traits. PMID:23183762

  20. Elevation maps of the San Francisco Bay region, California, a digital database

    USGS Publications Warehouse

    Graham, Scott E.; Pike, Richard J.

    1998-01-01

    PREFACE: Topography, the configuration of the land surface, plays a major role in various natural processes that have helped shape the ten-county San Francisco Bay region and continue to affect its development. Such processes include a dangerous type of landslide, the debris flow (Ellen and others, 1997), as well as other modes of slope failure that damage property but rarely threaten life directly: slumping, translational sliding, and earthflow (Wentworth and others, 1997). Different types of topographic information at both local and regional scales are helpful in assessing the likelihood of slope failure and mapping the extent of its past activity, as well as addressing other issues in hazard mitigation and land-use policy. The most useful information is quantitative.

  1. Improving on hidden Markov models: An articulatorily constrained, maximum likelihood approach to speech recognition and speech coding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogden, J.

    The goal of the proposed research is to test a statistical model of speech recognition that incorporates the knowledge that speech is produced by relatively slow motions of the tongue, lips, and other speech articulators. This model is called Maximum Likelihood Continuity Mapping (Malcom). Many speech researchers believe that by using constraints imposed by articulator motions, we can improve or replace the current hidden Markov model based speech recognition algorithms. Unfortunately, previous efforts to incorporate information about articulation into speech recognition algorithms have suffered because (1) slight inaccuracies in our knowledge or the formulation of our knowledge about articulation may decrease recognition performance, (2) small changes in the assumptions underlying models of speech production can lead to large changes in the speech derived from the models, and (3) collecting measurements of human articulator positions in sufficient quantity for training a speech recognition algorithm is still impractical. The most interesting (and in fact, unique) quality of Malcom is that, even though Malcom makes use of a mapping between acoustics and articulation, Malcom can be trained to recognize speech using only acoustic data. By learning the mapping between acoustics and articulation using only acoustic data, Malcom avoids the difficulties involved in collecting articulator position measurements and does not require an articulatory synthesizer model to estimate the mapping between vocal tract shapes and speech acoustics. Preliminary experiments that demonstrate that Malcom can learn the mapping between acoustics and articulation are discussed. Potential applications of Malcom aside from speech recognition are also discussed. Finally, specific deliverables resulting from the proposed research are described.

  2. Genetic mapping in the presence of genotyping errors.

    PubMed

    Cartwright, Dustin A; Troggio, Michela; Velasco, Riccardo; Gutin, Alexander

    2007-08-01

    Genetic maps are built using the genotypes of many related individuals. Genotyping errors in these data sets can distort genetic maps, especially by inflating the distances. We have extended the traditional likelihood model used for genetic mapping to include the possibility of genotyping errors. Each individual marker is assigned an error rate, which is inferred from the data, just as the genetic distances are. We have developed a software package, called TMAP, which uses this model to find maximum-likelihood maps for phase-known pedigrees. We have tested our methods using a data set in Vitis and on simulated data and confirmed that our method dramatically reduces the inflationary effect caused by increasing the number of markers and leads to more accurate orders.
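
    A hedged sketch of how a per-marker error rate enters the likelihood; this symmetric three-genotype error model is an illustrative assumption, not necessarily TMAP's exact model.

```python
def obs_prob(observed, true, eps):
    """P(observed genotype | true genotype) with per-marker error rate eps.
    A minimal error model: the correct genotype is read with probability
    1 - eps, and each of the other two genotypes with probability eps / 2."""
    return 1.0 - eps if observed == true else eps / 2.0

# The likelihood of one marker's data is then summed over true genotypes,
#   L = sum_g P(g | map, pedigree) * obs_prob(observed, g, eps),
# and eps is estimated from the data jointly with the genetic distances.
print(obs_prob("AA", "AA", 0.01), obs_prob("AA", "Aa", 0.01))
```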

  3. Genetic Mapping in the Presence of Genotyping Errors

    PubMed Central

    Cartwright, Dustin A.; Troggio, Michela; Velasco, Riccardo; Gutin, Alexander

    2007-01-01

    Genetic maps are built using the genotypes of many related individuals. Genotyping errors in these data sets can distort genetic maps, especially by inflating the distances. We have extended the traditional likelihood model used for genetic mapping to include the possibility of genotyping errors. Each individual marker is assigned an error rate, which is inferred from the data, just as the genetic distances are. We have developed a software package, called TMAP, which uses this model to find maximum-likelihood maps for phase-known pedigrees. We have tested our methods using a data set in Vitis and on simulated data and confirmed that our method dramatically reduces the inflationary effect caused by increasing the number of markers and leads to more accurate orders. PMID:17277374

  4. Robust Multipoint Water-Fat Separation Using Fat Likelihood Analysis

    PubMed Central

    Yu, Huanzhou; Reeder, Scott B.; Shimakawa, Ann; McKenzie, Charles A.; Brittain, Jean H.

    2016-01-01

    Fat suppression is an essential part of routine MRI scanning. Multiecho chemical-shift based water-fat separation methods estimate and correct for B0 field inhomogeneity. However, they must contend with the intrinsic challenge of water-fat ambiguity that can result in water-fat swapping. This problem arises because the signals from two chemical species, when both are modeled as a single discrete spectral peak, may appear indistinguishable in the presence of B0 off-resonance. In conventional methods, the water-fat ambiguity is typically removed by enforcing field map smoothness using region growing based algorithms. In reality, the fat spectrum has multiple spectral peaks. Using this spectral complexity, we introduce a novel concept that identifies water and fat for multiecho acquisitions by exploiting the spectral differences between water and fat. A fat likelihood map is produced to indicate if a pixel is likely to be water-dominant or fat-dominant by comparing the fitting residuals of two different signal models. The fat likelihood analysis and field map smoothness provide complementary information, and we designed an algorithm (Fat Likelihood Analysis for Multiecho Signals) to exploit both mechanisms. It is demonstrated in a wide variety of data that the Fat Likelihood Analysis for Multiecho Signals algorithm offers highly robust water-fat separation for 6-echo acquisitions, particularly in some previously challenging applications. PMID:21842498
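
    A minimal sketch of turning the two models' fitting residuals into a per-pixel fat-likelihood value. The specific residual-to-likelihood mapping shown is an assumption; the abstract only states that the residuals of the two signal models are compared.

```python
import numpy as np

def fat_likelihood(residual_water_model, residual_fat_model):
    """Sketch of a fat-likelihood map: fit each pixel's multiecho signal
    twice, once assuming the dominant species is water and once fat, and
    compare the fitting residuals. Values near 1 mean 'fat-dominant'."""
    rw = np.asarray(residual_water_model, dtype=float)
    rf = np.asarray(residual_fat_model, dtype=float)
    return rw / (rw + rf + 1e-12)  # small constant avoids division by zero

# A pixel whose fat-model fit is much better (smaller residual) than its
# water-model fit gets a likelihood near 1.
print(fat_likelihood(0.9, 0.1))
```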

  5. Averaged kick maps: less noise, more signal…and probably less bias

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pražnikar, Jure; Afonine, Pavel V.; Gunčar, Gregor

    2009-09-01

    Averaged kick maps are the sum of a series of individual kick maps, where each map is calculated from atomic coordinates modified by random shifts. These maps offer the possibility of an improved and less model-biased map interpretation. Use of reliable density maps is crucial for rapid and successful crystal structure determination. Here, the averaged kick (AK) map approach is investigated, its application is generalized and it is compared with other map-calculation methods. AK maps are the sum of a series of kick maps, where each kick map is calculated from atomic coordinates modified by random shifts. As such, they are a numerical analogue of maximum-likelihood maps. AK maps can be unweighted or maximum-likelihood (σ_A) weighted. Analysis shows that they are comparable and correspond better to the final model than σ_A and simulated-annealing maps. The AK maps were challenged by a difficult structure-validation case, in which they were able to clarify the problematic region in the density without the need for model rebuilding. The conclusion is that AK maps can be useful throughout the entire progress of crystal structure determination, offering the possibility of improved map interpretation.

  6. Lung nodule malignancy prediction using multi-task convolutional neural network

    NASA Astrophysics Data System (ADS)

    Li, Xiuli; Kao, Yueying; Shen, Wei; Li, Xiang; Xie, Guotong

    2017-03-01

    In this paper, we investigated the problem of diagnostic lung nodule malignancy prediction using thoracic Computed Tomography (CT) screening. Unlike most existing studies, which classify nodules into two types, benign and malignant, we interpreted nodule malignancy prediction as a regression problem to predict a continuous malignancy level. We proposed a joint multi-task learning algorithm using a Convolutional Neural Network (CNN) to capture nodule heterogeneity by extracting discriminative features from alternatingly stacked layers. We trained a CNN regression model to predict the nodule malignancy, and designed a multi-task learning mechanism to simultaneously share knowledge among 9 different nodule characteristics (Subtlety, Calcification, Sphericity, Margin, Lobulation, Spiculation, Texture, Diameter, and Malignancy), improving the final prediction result. Each CNN generates characteristic-specific feature representations, and we then applied multi-task learning on the features to predict the corresponding likelihood for each characteristic. We evaluated the proposed method on CT scans of 2,620 nodules from the LIDC-IDRI dataset with a 5-fold cross-validation strategy. The multi-task CNN achieved a regression RMSE of 0.830 and a mapped classification accuracy of 83.03%, while single-task regression achieved an RMSE of 0.894 and a mapped classification accuracy of 74.9%. Experiments show that the proposed method predicts the lung nodule malignancy likelihood effectively and outperforms the state-of-the-art methods. The learning framework could easily be applied to other anomaly likelihood prediction problems, such as skin cancer and breast cancer. It demonstrates the potential of our method to assist radiologists in nodule staging assessment and individual therapeutic planning.

  7. Cosmological parameters from a re-analysis of the WMAP 7 year low-resolution maps

    NASA Astrophysics Data System (ADS)

    Finelli, F.; De Rosa, A.; Gruppuso, A.; Paoletti, D.

    2013-06-01

    Cosmological parameters from Wilkinson Microwave Anisotropy Probe (WMAP) 7 year data are re-analysed by substituting a pixel-based likelihood estimator for the one delivered publicly by the WMAP team. Our pixel-based estimator handles intensity and polarization exactly and jointly, allowing us to use low-resolution maps and noise covariance matrices in T, Q, U at the same resolution, which in this work is 3.6°. We describe the features and the performance of the code implementing our pixel-based likelihood estimator. We perform a battery of tests on the application of our pixel-based likelihood routine to WMAP publicly available low-resolution foreground-cleaned products, in combination with the WMAP high-ℓ likelihood, reporting the differences in cosmological parameters evaluated by the full WMAP likelihood public package. The differences are due not only to the treatment of polarization, but also to the marginalization over monopole and dipole uncertainties present in the WMAP pixel likelihood code for temperature. The central credible values for the cosmological parameters change by less than 1σ with respect to the evaluation by the full WMAP 7 year likelihood code, with the largest difference being a shift to smaller values of the scalar spectral index n_s.
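
    A minimal sketch of the joint pixel-based likelihood such codes evaluate: a multivariate Gaussian in the stacked T, Q, U pixel vector under a total covariance (signal plus noise). The toy covariance and map vector below are placeholders.

```python
import numpy as np

def pixel_loglike(m, C):
    """Gaussian pixel-space log-likelihood of a map vector m (stacked T, Q, U
    pixels) under a total covariance C = S(theta) + N, up to a constant:
    -0.5 * (m^T C^{-1} m + log det C)."""
    sign, logdet = np.linalg.slogdet(C)
    assert sign > 0, "covariance must be positive definite"
    return -0.5 * (m @ np.linalg.solve(C, m) + logdet)

rng = np.random.default_rng(1)
n = 50                                  # toy number of low-resolution pixels
A = rng.normal(size=(n, n))
C = A @ A.T + n * np.eye(n)             # a positive-definite toy covariance
m = rng.normal(size=n)
print(pixel_loglike(m, C))
```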

  8. Slope maps of the San Francisco Bay region, California a digital database

    USGS Publications Warehouse

    Graham, Scott E.; Pike, Richard J.

    1998-01-01

    PREFACE: Topography, the configuration of the land surface, plays a major role in various natural processes that have helped shape the ten-county San Francisco Bay region and continue to affect its development. Such processes include a dangerous type of landslide, the debris flow (Ellen and others, 1997), as well as other modes of slope failure that damage property but rarely threaten life directly: slumping, translational sliding, and earthflow (Wentworth and others, 1997). Different types of topographic information at both local and regional scales are helpful in assessing the likelihood of slope failure and mapping the extent of its past activity, as well as addressing other issues in hazard mitigation and land-use policy. The most useful information is quantitative. This report provides detailed digital data and plottable map files that depict in detail the most important single measure of ground-surface form for the Bay region, slope angle. We computed slope data for the entire region and each of its constituent counties from a new set of 35,000,000 digital elevations assembled from 200 local contour maps.

  9. Mapping grass communities based on multi-temporal Landsat TM imagery and environmental variables

    NASA Astrophysics Data System (ADS)

    Zeng, Yuandi; Liu, Yanfang; Liu, Yaolin; de Leeuw, Jan

    2007-06-01

    Information on the spatial distribution of grass communities in wetlands is increasingly recognized as important for effective wetland management and biological conservation. Remote sensing techniques have proved to be an effective alternative to intensive and costly ground surveys for mapping grass communities. However, the mapping accuracy of grass communities in wetlands remains unsatisfactory. The aim of this paper is to develop an effective method to map grass communities in the Poyang Lake Natural Reserve. Through statistical analysis, elevation was selected as an environmental variable because of its strong relationship with the distribution of grass communities; NDVI layers stacked from images of different months were used to generate the Carex community map, and the image from October was used to discriminate the Miscanthus and Cynodon communities. Classifications were first performed with a maximum likelihood classifier using a single-date satellite image with and without elevation; layered classifications were then performed using multi-temporal satellite imagery and elevation with a maximum likelihood classifier, a decision tree, and an artificial neural network separately. The results show that environmental variables can improve the mapping accuracy, and that classification with multi-temporal imagery and elevation is significantly better than that with a single-date image and elevation (p=0.001). Besides, maximum likelihood (a=92.71%, k=0.90) and artificial neural network (a=94.79%, k=0.93) perform significantly better than the decision tree (a=86.46%, k=0.83).

  10. A maximum likelihood algorithm for genome mapping of cytogenetic loci from meiotic configuration data.

    PubMed Central

    Reyes-Valdés, M H; Stelly, D M

    1995-01-01

    Frequencies of meiotic configurations in cytogenetic stocks are dependent on chiasma frequencies in segments defined by centromeres, breakpoints, and telomeres. The expectation maximization algorithm is proposed as a general method to perform maximum likelihood estimations of the chiasma frequencies in the intervals between such locations. The estimates can be translated via mapping functions into genetic maps of cytogenetic landmarks. One set of observational data was analyzed to exemplify application of these methods, results of which were largely concordant with other comparable data. The method was also tested by Monte Carlo simulation of frequencies of meiotic configurations from a monotelodisomic translocation heterozygote, assuming six different sample sizes. The estimate averages were always close to the values given initially to the parameters. The maximum likelihood estimation procedures can be extended readily to other kinds of cytogenetic stocks and allow the pooling of diverse cytogenetic data to collectively estimate lengths of segments, arms, and chromosomes. PMID:7568226

  11. Two models for evaluating landslide hazards

    USGS Publications Warehouse

    Davis, J.C.; Chung, C.-J.; Ohlmacher, G.C.

    2006-01-01

    Two alternative procedures for estimating landslide hazards were evaluated using data on topographic digital elevation models (DEMs) and bedrock lithologies in an area adjacent to the Missouri River in Atchison County, Kansas, USA. The two procedures are based on the likelihood ratio model but utilize different assumptions. The empirical likelihood ratio model is based on non-parametric empirical univariate frequency distribution functions under an assumption of conditional independence, while the multivariate logistic discriminant model assumes that likelihood ratios can be expressed in terms of logistic functions. The relative hazards of occurrence of landslides were estimated by an empirical likelihood ratio model and by multivariate logistic discriminant analysis. Predictor variables consisted of grids containing topographic elevations, slope angles, and slope aspects calculated from a 30-m DEM. An integer grid of coded bedrock lithologies taken from digitized geologic maps was also used as a predictor variable. Both statistical models yield relative estimates in the form of the proportion of total map area predicted to already contain or to be the site of future landslides. The stabilities of estimates were checked by cross-validation of results from random subsamples, using each of the two procedures. Cell-by-cell comparisons of hazard maps made by the two models show that the two sets of estimates are virtually identical. This suggests that the empirical likelihood ratio and the logistic discriminant analysis models are robust with respect to the conditional independence assumption and the logistic function assumption, respectively, and that either model can be used successfully to evaluate landslide hazards. © 2006.

  12. Land cover mapping after the tsunami event over Nanggroe Aceh Darussalam (NAD) province, Indonesia

    NASA Astrophysics Data System (ADS)

    Lim, H. S.; MatJafri, M. Z.; Abdullah, K.; Alias, A. N.; Mohd. Saleh, N.; Wong, C. J.; Surbakti, M. S.

    2008-03-01

    Remote sensing offers an important means of detecting and analyzing temporal changes occurring in our landscape. This research used remote sensing to quantify land use/land cover changes in the Nanggroe Aceh Darussalam (NAD) province, Indonesia, on a regional scale. The objective of this paper is to assess the changes produced from the analysis of Landsat TM data. A Landsat TM image was used to develop a land cover classification map for 27 March 2005. Four supervised classification techniques (Maximum Likelihood, Minimum Distance-to-Mean, Parallelepiped, and Parallelepiped with Maximum Likelihood Classifier Tiebreaker) were applied to the satellite image. Training sites and accuracy assessment were needed for the supervised classification techniques. The training sites were established using polygons based on the colour image. High detection accuracy (>80%) and overall Kappa (>0.80) were achieved by the Parallelepiped with Maximum Likelihood Classifier Tiebreaker classifier in this study. This preliminary study has produced a promising result, indicating that land cover mapping can be carried out using remote sensing classification of satellite digital imagery.

  13. A Bayesian-Based Novel Methodology to Generate Reliable Site Response Mapping Sensitive to Data Uncertainties

    NASA Astrophysics Data System (ADS)

    Chakraborty, A.; Goto, H.

    2017-12-01

    The 2011 off the Pacific coast of Tohoku earthquake caused severe damage in many areas farther inside the mainland because of site amplification. Furukawa district in Miyagi Prefecture, Japan recorded significant spatial differences in ground motion even at sub-kilometer scales. The site responses in the damage zone far exceeded the levels in the hazard maps. One reason for the mismatch is that maps follow only the mean value at the measurement locations, with no regard to the data uncertainties, and thus are not always reliable. Our research objective is to develop a methodology that incorporates data uncertainties in mapping and to propose a reliable map. The methodology is based on a hierarchical Bayesian model of normally-distributed site responses in space, where the mean (μ), site-specific variance (σ²), and between-sites variance (s²) parameters are treated as unknowns with a prior distribution. The observation data are artificially created site responses with varying means and variances for 150 seismic events across 50 locations in one-dimensional space. Spatially auto-correlated random effects were added to the mean (μ) using a conditionally autoregressive (CAR) prior. Inferences on the unknown parameters are made using Markov chain Monte Carlo methods from the posterior distribution. The goal is to find reliable estimates of μ that are sensitive to uncertainties. During initial trials, we observed that the τ (= 1/s²) parameter of the CAR prior controls the μ estimation. Using a constraint, s = 1/(k×σ), five spatial models with varying k-values were created. We define reliability to be measured by the model likelihood and propose the maximum-likelihood model as highly reliable. The model with maximum likelihood was selected using a 5-fold cross-validation technique. The results show that the maximum-likelihood model (μ*) follows the site-specific mean at low uncertainties and converges to the model mean at higher uncertainties (Fig. 1). This result is highly significant as it successfully incorporates the effect of data uncertainties in mapping. This novel approach can be applied to any research field using mapping techniques. The methodology is now being applied to real records from a very dense seismic network in Furukawa district, Miyagi Prefecture, Japan to generate a reliable map of the site responses.
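
    The reported behavior of μ* (following the site mean at low uncertainty and the model mean at high uncertainty) is exactly what precision weighting produces. A minimal sketch of that shrinkage behavior, not the full hierarchical CAR model:

```python
import numpy as np

def shrunk_mean(y, sigma2, m, s2):
    """Precision-weighted combination of a site-specific mean y (variance
    sigma2) and the spatial model mean m (between-sites variance s2).
    With small sigma2 the estimate follows the site mean; with large
    sigma2 it converges to the model mean, as described for mu*."""
    w = (1.0 / sigma2) / (1.0 / sigma2 + 1.0 / s2)
    return w * y + (1.0 - w) * m

print(shrunk_mean(y=2.0, sigma2=0.01, m=1.0, s2=0.5))  # ~2.0 (trust the site)
print(shrunk_mean(y=2.0, sigma2=10.0, m=1.0, s2=0.5))  # ~1.05 (toward model)
```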

  14. Mapping Relative Likelihood for the Presence of Naturally Occurring Asbestos in Placer and Eastern Sacramento Counties, California

    NASA Astrophysics Data System (ADS)

    Higgins, C. T.; Clinkenbeard, J. P.; Churchill, R. K.

    2006-12-01

    Naturally occurring asbestos (NOA) is a term applied to the geologic occurrence of six types of silicate minerals that have asbestiform habit. These include the serpentine mineral chrysotile and the amphibole minerals actinolite, amosite, anthophyllite, crocidolite, and tremolite; all are classified as known human carcinogens. NOA, which is likely to be present in at least 50 of the 58 counties of California, is most commonly associated with serpentinite, but has been identified in other geologic settings as well. Because of health concerns, knowledge of where NOA may be present is important to regulatory agencies and the public. To improve this knowledge, the California Geological Survey (CGS) has prepared NOA maps of Placer County and eastern Sacramento County; both counties contain geologic settings where NOA has been observed. The maps are based primarily on geologic information compiled and interpreted from existing geologic and soils maps and on limited fieldwork. The system of map units is modified from an earlier one developed by the CGS for an NOA map of nearby western El Dorado County. In the current system, the counties are subdivided into different areas based on relative likelihood for the presence of NOA. Three types of areas are defined as most likely, moderately likely, and least likely to contain NOA. A fourth type is defined as areas of faulting and shearing; these geologic structures may locally increase the likelihood for the presence of NOA within or adjacent to areas most likely or moderately likely to contain NOA. The maps do not indicate if NOA is present or absent in bedrock or soils at any particular location. Local air pollution control districts are using the maps to help determine where to minimize generation of and exposure to dust that may contain NOA. The maps and accompanying reports can be viewed at http://www.consrv.ca.gov/cgs/ under Hazardous Minerals.

  15. Five Methods for Estimating Angoff Cut Scores with IRT

    ERIC Educational Resources Information Center

    Wyse, Adam E.

    2017-01-01

    This article illustrates five different methods for estimating Angoff cut scores using item response theory (IRT) models. These include maximum likelihood (ML), expected a priori (EAP), modal a priori (MAP), and weighted maximum likelihood (WML) estimators, as well as the most commonly used approach based on translating ratings through the test…
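
    As an illustration of one of the five estimators named above, here is a minimal EAP sketch under a 2PL IRT model with a standard normal prior; the item parameters, response pattern, and quadrature settings are assumptions.

```python
import numpy as np

def eap_theta(responses, a, b, n_quad=61):
    """EAP ability estimate under a 2PL IRT model with a standard normal
    prior: the posterior mean of theta over a fixed quadrature grid."""
    theta = np.linspace(-4, 4, n_quad)
    prior = np.exp(-0.5 * theta ** 2)
    # P(correct | theta) for every (quadrature point, item) pair.
    p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))
    lik = np.prod(np.where(responses == 1, p, 1.0 - p), axis=1)
    post = prior * lik
    return float(np.sum(theta * post) / np.sum(post))

u = np.array([1, 1, 0, 1])          # illustrative response pattern
a = np.array([1.2, 0.8, 1.5, 1.0])  # item discriminations (assumed)
b = np.array([-0.5, 0.0, 0.5, 1.0]) # item difficulties (assumed)
print(eap_theta(u, a, b))
```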

  16. Integrating Entropy-Based Naïve Bayes and GIS for Spatial Evaluation of Flood Hazard.

    PubMed

    Liu, Rui; Chen, Yun; Wu, Jianping; Gao, Lei; Barrett, Damian; Xu, Tingbao; Li, Xiaojuan; Li, Linyi; Huang, Chang; Yu, Jia

    2017-04-01

    Regional flood risk caused by intensive rainfall under extreme climate conditions has increasingly attracted global attention. Mapping and evaluation of flood hazard are vital parts in flood risk assessment. This study develops an integrated framework for estimating spatial likelihood of flood hazard by coupling weighted naïve Bayes (WNB), geographic information system, and remote sensing. The north part of Fitzroy River Basin in Queensland, Australia, was selected as a case study site. The environmental indices, including extreme rainfall, evapotranspiration, net-water index, soil water retention, elevation, slope, drainage proximity, and density, were generated from spatial data representing climate, soil, vegetation, hydrology, and topography. These indices were weighted using the statistics-based entropy method. The weighted indices were input into the WNB-based model to delineate a regional flood risk map that indicates the likelihood of flood occurrence. The resultant map was validated by the maximum inundation extent extracted from moderate resolution imaging spectroradiometer (MODIS) imagery. The evaluation results, including mapping and evaluation of the distribution of flood hazard, are helpful in guiding flood inundation disaster responses for the region. The novel approach presented consists of weighted grid data, image-based sampling and validation, cell-by-cell probability inferring and spatial mapping. It is superior to an existing spatial naive Bayes (NB) method for regional flood hazard assessment. It can also be extended to other likelihood-related environmental hazard studies. © 2016 Society for Risk Analysis.
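
    A hedged sketch of the weighted naïve Bayes scoring step, with entropy-derived weights scaling each index's contribution in log space; the exact WNB formulation in the paper may differ.

```python
import numpy as np

def wnb_log_posterior(log_lik, weights, log_prior):
    """Weighted naive Bayes: each environmental index's log-likelihood
    enters the product raised to its entropy-derived weight, i.e. scaled
    in log space. log_lik: (n_cells, n_indices) per-cell log-likelihoods
    of the observed index values under one class."""
    return log_prior + np.asarray(log_lik) @ np.asarray(weights)

# Two classes (flood / no flood), three indices with entropy weights.
w = np.array([0.5, 0.3, 0.2])
cell_flood = np.log([[0.8, 0.6, 0.3]])    # P(index value | flood)
cell_dry = np.log([[0.2, 0.4, 0.7]])      # P(index value | no flood)
flood = wnb_log_posterior(cell_flood, w, np.log(0.5))
dry = wnb_log_posterior(cell_dry, w, np.log(0.5))
p_flood = np.exp(flood) / (np.exp(flood) + np.exp(dry))
print(p_flood)  # spatial likelihood of flood occurrence for this cell
```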

  17. NAVIS-An UGV Indoor Positioning System Using Laser Scan Matching for Large-Area Real-Time Applications

    PubMed Central

    Tang, Jian; Chen, Yuwei; Jaakkola, Anttoni; Liu, Jinbing; Hyyppä, Juha; Hyyppä, Hannu

    2014-01-01

    Laser scan matching with grid-based maps is a promising tool for real-time indoor positioning of mobile Unmanned Ground Vehicles (UGVs). However, there are critical implementation problems, such as the ability to estimate the position by sensing the unknown indoor environment with sufficient accuracy and low enough latency for stable vehicle control, and further development work is necessary. Unfortunately, most of the existing methods employ heuristics for quick positioning in which numerous accumulated errors easily lead to loss of positioning accuracy. This severely restricts their application in large areas and over lengthy periods of time. This paper introduces an efficient real-time mobile UGV indoor positioning system for large-area applications using laser scan matching with an improved probabilistically-motivated Maximum Likelihood Estimation (IMLE) algorithm, which is based on a multi-resolution patch-divided grid likelihood map. Compared with traditional methods, the improvements embodied in IMLE include: (a) Iterative Closest Point (ICP) preprocessing, which adaptively decreases the search scope; (b) an exhaustive (brute-force) search matching method on multi-resolution map layers, based on the likelihood value between the current laser scan and the grid map within the refined search scope, adopted to obtain the globally optimal position at each scan matching; and (c) a patch-divided likelihood map supporting a large indoor area. A UGV platform called NAVIS was designed, manufactured, and tested based on a low-cost robot integrating a LiDAR and an odometer sensor to verify the IMLE algorithm. A series of experiments based on simulated data and field tests with NAVIS proved that the proposed IMLE algorithm is a better way to perform local scan matching that can offer a quick and stable positioning solution with high accuracy, so it can be part of a large-area localization/mapping application. The NAVIS platform can reach an updating rate of 12 Hz in a feature-rich environment and 2 Hz even in a feature-poor environment. Therefore, it can be utilized in real-time applications. PMID:24999715

  18. NAVIS-An UGV indoor positioning system using laser scan matching for large-area real-time applications.

    PubMed

    Tang, Jian; Chen, Yuwei; Jaakkola, Anttoni; Liu, Jinbing; Hyyppä, Juha; Hyyppä, Hannu

    2014-07-04

    Laser scan matching with grid-based maps is a promising tool for real-time indoor positioning of mobile Unmanned Ground Vehicles (UGVs). However, there are critical implementation problems, such as the ability to estimate the position by sensing the unknown indoor environment with sufficient accuracy and low enough latency for stable vehicle control, and further development work is necessary. Unfortunately, most of the existing methods employ heuristics for quick positioning in which numerous accumulated errors easily lead to loss of positioning accuracy. This severely restricts their application in large areas and over lengthy periods of time. This paper introduces an efficient real-time mobile UGV indoor positioning system for large-area applications using laser scan matching with an improved probabilistically-motivated Maximum Likelihood Estimation (IMLE) algorithm, which is based on a multi-resolution patch-divided grid likelihood map. Compared with traditional methods, the improvements embodied in IMLE include: (a) Iterative Closest Point (ICP) preprocessing, which adaptively decreases the search scope; (b) an exhaustive (brute-force) search matching method on multi-resolution map layers, based on the likelihood value between the current laser scan and the grid map within the refined search scope, adopted to obtain the globally optimal position at each scan matching; and (c) a patch-divided likelihood map supporting a large indoor area. A UGV platform called NAVIS was designed, manufactured, and tested based on a low-cost robot integrating a LiDAR and an odometer sensor to verify the IMLE algorithm. A series of experiments based on simulated data and field tests with NAVIS proved that the proposed IMLE algorithm is a better way to perform local scan matching that can offer a quick and stable positioning solution with high accuracy, so it can be part of a large-area localization/mapping application. The NAVIS platform can reach an updating rate of 12 Hz in a feature-rich environment and 2 Hz even in a feature-poor environment. Therefore, it can be utilized in real-time applications.
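
    A minimal sketch of scoring one candidate pose against a grid likelihood map, the quantity an exhaustive multi-scale search like the one described would maximize; the resolution, frame conventions, and function names are assumptions.

```python
import numpy as np

def match_score(scan_xy, likelihood_map, pose, resolution=0.05):
    """Score one candidate pose: transform the scan points into the map
    frame and sum the likelihood-map values at the cells they hit."""
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    pts = scan_xy @ np.array([[c, s], [-s, c]]) + np.array([x, y])
    ij = np.floor(pts / resolution).astype(int)
    h, w = likelihood_map.shape
    ok = (ij[:, 0] >= 0) & (ij[:, 0] < h) & (ij[:, 1] >= 0) & (ij[:, 1] < w)
    return likelihood_map[ij[ok, 0], ij[ok, 1]].sum()

def brute_search(scan_xy, likelihood_map, poses):
    """Evaluate every candidate pose and keep the global optimum."""
    scores = [match_score(scan_xy, likelihood_map, p) for p in poses]
    return poses[int(np.argmax(scores))]
```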

  19. Improving estimates of genetic maps: a meta-analysis-based approach.

    PubMed

    Stewart, William C L

    2007-07-01

    Inaccurate genetic (or linkage) maps can reduce the power to detect linkage, increase type I error, and distort haplotype and relationship inference. To improve the accuracy of existing maps, I propose a meta-analysis-based method that combines independent map estimates into a single estimate of the linkage map. The method uses the variance of each independent map estimate to combine them efficiently, whether the map estimates use the same set of markers or not. As compared with a joint analysis of the pooled genotype data, the proposed method is attractive for three reasons: (1) it has comparable efficiency to the maximum likelihood map estimate when the pooled data are homogeneous; (2) relative to existing map estimation methods, it can have increased efficiency when the pooled data are heterogeneous; and (3) it avoids the practical difficulties of pooling human subjects data. On the basis of simulated data modeled after two real data sets, the proposed method can reduce the sampling variation of linkage maps commonly used in whole-genome linkage scans. Furthermore, when the independent map estimates are also maximum likelihood estimates, the proposed method performs as well as or better than when they are estimated by the program CRIMAP. Since variance estimates of maps may not always be available, I demonstrate the feasibility of three different variance estimators. Overall, the method should prove useful to investigators who need map positions for markers not contained in publicly available maps, and to those who wish to minimize the negative effects of inaccurate maps. Copyright 2007 Wiley-Liss, Inc.
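
    The core combination step is standard inverse-variance weighting. A brief sketch under that reading follows; function and variable names are illustrative, not taken from the paper or from CRIMAP.

```python
import numpy as np

def combine_map_estimates(positions, variances):
    """Inverse-variance weighted estimate of one marker's map position (cM)."""
    positions = np.asarray(positions, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # weights = 1 / variance
    combined = np.sum(w * positions) / np.sum(w)
    combined_var = 1.0 / np.sum(w)                 # variance of the combined estimate
    return combined, combined_var

# Example: three independent studies place the same marker at 12.1, 11.4, 12.6 cM
pos, var = combine_map_estimates([12.1, 11.4, 12.6], [0.30, 0.55, 0.40])
```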

  20. Mapping Quantitative Traits in Unselected Families: Algorithms and Examples

    PubMed Central

    Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David

    2009-01-01

    Linkage analysis has been widely used to identify, from family data, genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departure from normality assumptions. Regression-based approaches are more robust but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic, which, in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016

  1. A case study for the integration of predictive mineral potential maps

    NASA Astrophysics Data System (ADS)

    Lee, Saro; Oh, Hyun-Joo; Heo, Chul-Ho; Park, Inhye

    2014-09-01

    This study aims to produce mineral potential maps using various models and to verify their accuracy for epithermal gold (Au)-silver (Ag) deposits in a Geographic Information System (GIS) environment, assuming that all deposits share a common genesis. The maps of potential Au and Ag deposits were produced from geological data in the Taebaeksan mineralized area, Korea. The methodological framework consists of three main steps: 1) identification of spatial relationships; 2) quantification of such relationships; and 3) combination of multiple quantified relationships. A spatial database containing 46 Au-Ag deposits was constructed using GIS. The spatial associations between training deposits and 26 related factors were identified and quantified by probabilistic and statistical modelling. The mineral potential maps were generated by integrating all factors using the overlay method and were recombined afterwards using the likelihood ratio model. They were verified by comparison with test mineral deposit locations. The verification revealed that the combined mineral potential map had the greatest accuracy (83.97%), whereas the accuracies were 72.24%, 65.85%, 72.23% and 71.02% for the likelihood ratio, weight-of-evidence, logistic regression and artificial neural network models, respectively. The mineral potential map can provide useful information for mineral resource development.

  2. Linear functional minimization for inverse modeling

    DOE PAGES

    Barajas-Solano, David A.; Wohlberg, Brendt Egon; Vesselinov, Velimir Valentinov; ...

    2015-06-01

    In this paper, we present a novel inverse modeling strategy to estimate spatially distributed parameters of nonlinear models. The maximum a posteriori (MAP) estimators of these parameters are based on a likelihood functional, which contains spatially discrete measurements of the system parameters and spatiotemporally discrete measurements of the transient system states. The piecewise continuity prior for the parameters is expressed via Total Variation (TV) regularization. The MAP estimator is computed by minimizing a nonquadratic objective equipped with the TV operator. We apply this inversion algorithm to estimate hydraulic conductivity of a synthetic confined aquifer from measurements of conductivity and hydraulic head. The synthetic conductivity field is composed of a low-conductivity heterogeneous intrusion into a high-conductivity heterogeneous medium. Our algorithm accurately reconstructs the location, orientation, and extent of the intrusion from the steady-state data only. Finally, addition of transient measurements of hydraulic head improves the parameter estimation, accurately reconstructing the conductivity field in the vicinity of observation locations.
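
    As a toy illustration of the kind of objective being minimized (not the paper's implementation), the sketch below reconstructs a 1-D piecewise-constant field from noisy linear measurements with a smoothed TV penalty; the forward operator G and all parameter values are stand-ins of our own choosing.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 60
true = np.where((np.arange(n) > 20) & (np.arange(n) < 40), 1.0, 3.0)  # intrusion
G = rng.normal(size=(30, n)) / np.sqrt(n)       # stand-in linear forward map
d = G @ true + 0.01 * rng.normal(size=30)       # noisy "head" measurements

def objective(m, lam=0.05, eps=1e-6):
    misfit = 0.5 * np.sum((G @ m - d) ** 2)     # data-mismatch term
    tv = np.sum(np.sqrt(np.diff(m) ** 2 + eps)) # smoothed Total Variation penalty
    return misfit + lam * tv

m_map = minimize(objective, x0=np.full(n, 2.0), method="L-BFGS-B").x
print("max reconstruction error:", np.abs(m_map - true).max())
```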

  3. An Iterative Maximum a Posteriori Estimation of Proficiency Level to Detect Multiple Local Likelihood Maxima

    ERIC Educational Resources Information Center

    Magis, David; Raiche, Gilles

    2010-01-01

    In this article the authors focus on the issue of the nonuniqueness of the maximum likelihood (ML) estimator of proficiency level in item response theory (with special attention to logistic models). The usual maximum a posteriori (MAP) method offers a good alternative within that framework; however, this article highlights some drawbacks of its…

  4. Mapping the defoliation potential of gypsy moth

    Treesearch

    David A. Gansner; Stanford L. Arner; Rachel Riemann Hershey; Susan L. King

    1993-01-01

    A model that uses forest stand characteristics to estimate the likelihood of gypsy moth (Lymantria dispar) defoliation has been developed. It was applied to recent forest inventory plot data to produce susceptibility ratings and a map showing defoliation potential for counties in Pennsylvania and six adjacent states on new frontiers of infestation.

  5. Gaussian process inference for estimating pharmacokinetic parameters of dynamic contrast-enhanced MR images.

    PubMed

    Wang, Shijun; Liu, Peter; Turkbey, Baris; Choyke, Peter; Pinto, Peter; Summers, Ronald M

    2012-01-01

    In this paper, we propose a new pharmacokinetic model for parameter estimation of dynamic contrast-enhanced (DCE) MRI by using Gaussian process inference. Our model is based on the Tofts dual-compartment model for the description of tracer kinetics, and the observed time series from DCE-MRI is treated as a Gaussian stochastic process. The parameter estimation is done through a maximum likelihood approach, and we propose a variant of the coordinate descent method to solve this likelihood maximization problem. The new model was shown to outperform a baseline method on simulated data. Parametric maps generated on prostate DCE data with the new model also provided better enhancement of tumors, lower intensity on false positives, and better boundary delineation when compared with the baseline method. New statistical parameter maps from the process model were also found to be informative, particularly when paired with the PK parameter maps.
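
    The coordinate-descent idea itself is generic: optimize the likelihood one parameter at a time while holding the others fixed. A minimal sketch of such a scheme follows, with a stand-in objective rather than the Tofts-model likelihood; all names are ours.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def coordinate_ascent(log_lik, x0, bounds, sweeps=50, tol=1e-8):
    """Maximize log_lik by cyclic one-dimensional updates (generic sketch)."""
    x = np.array(x0, dtype=float)
    for _ in range(sweeps):
        old = log_lik(x)
        for k in range(len(x)):
            def slice_neg(v, k=k):          # 1-D slice through coordinate k
                xk = x.copy()
                xk[k] = v
                return -log_lik(xk)
            x[k] = minimize_scalar(slice_neg, bounds=bounds[k], method="bounded").x
        if log_lik(x) - old < tol:          # stop when a sweep no longer improves
            break
    return x

# Toy usage with a concave stand-in objective (not a DCE-MRI likelihood):
x_hat = coordinate_ascent(lambda x: -(x[0] - 1.0)**2 - (x[1] + 2.0)**2,
                          x0=[0.0, 0.0], bounds=[(-5, 5), (-5, 5)])
```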

  6. A LANDSAT study of ephemeral and perennial rangeland vegetation and soils

    NASA Technical Reports Server (NTRS)

    Bentley, R. G., Jr. (Principal Investigator); Salmon-Drexler, B. C.; Bonner, W. J.; Vincent, R. K.

    1976-01-01

    The author has identified the following significant results. Several methods of computer processing were applied to LANDSAT data for mapping vegetation characteristics of perennial rangeland in Montana and ephemeral rangeland in Arizona. The choice of optimal processing technique depended on the prescribed mapping task and site conditions. Single-channel level slicing and ratioing of channels were used for simple enhancement. Predictive models for mapping percent vegetation cover, based on data from field spectra and LANDSAT data, were generated by multiple linear regression of six unique LANDSAT spectral ratios. Ratio gating logic and maximum likelihood classification were applied successfully to recognize plant communities in Montana. Maximum likelihood classification did little to improve recognition of terrain features when compared to a single-channel density slice in sparsely vegetated Arizona. LANDSAT was found to be more sensitive to differences between plant communities based on percentages of vigorous vegetation than to actual physical or spectral differences among plant species.
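
    Gaussian maximum likelihood classification of multispectral pixels, as used here, reduces to assigning each pixel to the class whose fitted Gaussian gives the highest log-likelihood. A compact generic sketch (not the original 1976 processing code):

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_classes(training):
    """training: {label: (n_i, bands) array}; each class needs several pixels."""
    return {lab: (x.mean(axis=0), np.cov(x, rowvar=False))
            for lab, x in training.items()}

def classify(pixels, classes):
    """pixels: (n, bands) array; returns the most likely class label per pixel."""
    labels = list(classes)
    loglik = np.column_stack([
        multivariate_normal(mean=m, cov=c).logpdf(pixels)   # per-class log-likelihood
        for m, c in classes.values()])
    return [labels[i] for i in np.argmax(loglik, axis=1)]
```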

  7. Fuzzy fractals, chaos, and noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zardecki, A.

    1997-05-01

    To distinguish between chaotic and noisy processes, the authors analyze one- and two-dimensional chaotic mappings, supplemented by the additive noise terms. The predictive power of a fuzzy rule-based system allows one to distinguish ergodic and chaotic time series: in an ergodic series the likelihood of finding large numbers is small compared to the likelihood of finding them in a chaotic series. In the case of two dimensions, they consider the fractal fuzzy sets whose {alpha}-cuts are fractals, arising in the context of a quadratic mapping in the extended complex plane. In an example provided by the Julia set, the concept of Hausdorff dimension enables one to decide in favor of chaotic or noisy evolution.

  8. Consumer preferences for beef color and packaging did not affect eating satisfaction.

    PubMed

    Carpenter, C E; Cornforth, D P; Whittier, D

    2001-04-01

    We investigated whether consumer preferences for beef colors (red, purple, and brown) or for beef packaging systems (modified atmosphere, MAP; vacuum skin pack, VSP; or overwrap with polyvinyl chloride, PVC) influenced taste scores of beef steaks and patties. To test beef color effects, boneless beef top loin steaks (choice) and ground beef patties (20% fat) were packaged in different atmospheres to promote development of red, purple, and brown color. To test effects of package type, steaks and patties were pre-treated with carbon monoxide in MAP to promote development of red color, and some meat was repackaged using VSP or PVC overwrap. The differently colored and packaged meats were separately displayed for members of four consumer panels who evaluated appearance and indicated their likelihood to purchase similar meat. Next, the panelists tasted meat samples from what they had been told were the packaging treatments just observed. However, the meat samples actually served were from a single untreated steak or patty. Thus, any difference in taste scores should reflect expectations established during the visual evaluation. The same ballot and sample coding were used for both the visual and taste evaluations. Color and packaging influenced (P < 0.001) appearance scores and likelihood to purchase. Appearance scores were rated red > purple > brown and PVC > VSP > MAP. Appearance scores and likelihood to purchase were correlated (r = 0.9). However, color or packaging did not affect (P > 0.5) taste scores. Thus, consumer preferences for beef color and packaging influenced likelihood to purchase, but did not bias eating satisfaction.

  9. Modeling forest bird species' likelihood of occurrence in Utah with Forest Inventory and Analysis and Landfire map products and ecologically based pseudo-absence points

    Treesearch

    Phoebe L. Zarnetske; Thomas C., Jr. Edwards; Gretchen G. Moisen

    2007-01-01

    Estimating species likelihood of occurrence across extensive landscapes is a powerful management tool. Unfortunately, available occurrence data for landscape-scale modeling is often lacking and usually only in the form of observed presences. Ecologically based pseudo-absence points were generated from within habitat envelopes to accompany presence-only data in habitat...

  10. New estimates of the CMB angular power spectra from the WMAP 5 year low-resolution data

    NASA Astrophysics Data System (ADS)

    Gruppuso, A.; de Rosa, A.; Cabella, P.; Paci, F.; Finelli, F.; Natoli, P.; de Gasperis, G.; Mandolesi, N.

    2009-11-01

    A quadratic maximum likelihood (QML) estimator is applied to the Wilkinson Microwave Anisotropy Probe (WMAP) 5 year low-resolution maps to compute the cosmic microwave background angular power spectra (APS) at large scales for both temperature and polarization. Estimates and error bars for the six APS are provided up to l = 32 and compared, when possible, to those obtained by the WMAP team, without finding any inconsistency. The conditional likelihood slices are also computed for the C_l of all six power spectra from l = 2 to 10 through a pixel-based likelihood code. Both codes treat the covariance for (T, Q, U) in a single matrix without employing any approximation. The inputs of both codes (foreground-reduced maps, related covariances and masks) are provided by the WMAP team. The peaks of the likelihood slices are always consistent with the QML estimates within the error bars; however, an excellent agreement occurs when the QML estimates are used as a fiducial power spectrum instead of the best-fitting theoretical power spectrum. By the full computation of the conditional likelihood on the estimated spectra, the value of the temperature quadrupole C^TT_(l=2) is found to be less than 2σ away from the WMAP 5 year Λ cold dark matter best-fitting value. The BB spectrum is found to be well consistent with zero, and upper limits on the B modes are provided. The parity-odd signals TB and EB are found to be consistent with zero.

  11. Measurement of CIB power spectra with CAM-SPEC from Planck HFI maps

    NASA Astrophysics Data System (ADS)

    Mak, Suet Ying; Challinor, Anthony; Efstathiou, George; Lagache, Guilaine

    2015-08-01

    We present new measurements of the cosmic infrared background (CIB) anisotropies and its first likelihood using Planck HFI data at 353, 545, and 857 GHz. The measurements are based on cross-frequency power spectra and likelihood analysis using the CAM-SPEC package, rather than map-based template removal of foregrounds as done in previous Planck CIB analyses. We construct the likelihood of the CIB temperature fluctuations, an extension of the CAM-SPEC likelihood used in CMB analysis to higher frequencies, and use it to derive the best estimate of the CIB power spectrum over three decades in multipole moment, l, covering 50 ≤ l ≤ 2500. We adopt parametric models of the CIB and foreground contaminants (Galactic cirrus, infrared point sources, and cosmic microwave background anisotropies), and calibrate the dataset uniformly across frequencies with known Planck beam and noise properties in the likelihood construction. We validate our likelihood through simulations and an extensive suite of consistency tests, and assess the impact of instrumental and data-selection effects on the final CIB power spectrum constraints. Two approaches are developed for interpreting the CIB power spectrum. The first is based on a simple parametric model that describes the cross-frequency power using amplitudes, correlation coefficients, and a known multipole dependence. The second is based on physical models for galaxy clustering and the evolution of the infrared emission of galaxies. The new approaches fit all auto- and cross-power spectra very well, with a best fit of χ²_ν = 1.04 (parametric model). Using the best foreground solution, we find that the cleaned CIB power spectra are in good agreement with previous Planck and Herschel measurements.

  12. Root Cause Analysis: Learning from Adverse Safety Events.

    PubMed

    Brook, Olga R; Kruskal, Jonathan B; Eisenberg, Ronald L; Larson, David B

    2015-10-01

    Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts. © RSNA, 2015.

  13. An Investigation of the Standard Errors of Expected A Posteriori Ability Estimates.

    ERIC Educational Resources Information Center

    De Ayala, R. J.; And Others

    Expected a posteriori (EAP) estimation has a number of advantages over maximum likelihood or maximum a posteriori (MAP) estimation methods. These include ability estimates (thetas) for all response patterns, less regression towards the mean than MAP ability estimates, and a lower average squared error. R. D. Bock and R. J. Mislevy (1982) state that the…
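
    For reference, an EAP ability estimate is simply the posterior mean over a quadrature grid. A minimal sketch for the two-parameter logistic (2PL) model follows; the standard-normal prior and all names are our illustrative choices, not details from the report.

```python
import numpy as np

def eap_estimate(responses, a, b, grid=np.linspace(-4, 4, 81)):
    """EAP ability estimate for the 2PL model via quadrature (illustrative)."""
    responses = np.asarray(responses)
    z = np.outer(grid, a) - a * b            # a_j * (theta - b_j) on the grid
    p = 1.0 / (1.0 + np.exp(-z))             # P(correct | theta) per item
    lik = np.prod(np.where(responses, p, 1.0 - p), axis=1)
    post = lik * np.exp(-0.5 * grid**2)      # standard-normal prior (assumed)
    post /= post.sum()
    eap = np.sum(grid * post)                # posterior mean = EAP estimate
    se = np.sqrt(np.sum((grid - eap) ** 2 * post))   # posterior SD as its SE
    return eap, se

theta, se = eap_estimate([1, 0, 1], a=np.array([1.2, 0.8, 1.5]),
                         b=np.array([-0.5, 0.0, 1.0]))
```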

  14. Comparing Forest/Nonforest Classifications of Landsat TM Imagery for Stratifying FIA Estimates of Forest Land Area

    Treesearch

    Mark D. Nelson; Ronald E. McRoberts; Greg C. Liknes; Geoffrey R. Holden

    2005-01-01

    Landsat Thematic Mapper (TM) satellite imagery and Forest Inventory and Analysis (FIA) plot data were used to construct forest/nonforest maps of Mapping Zone 41, National Land Cover Dataset 2000 (NLCD 2000). Stratification approaches resulting from Maximum Likelihood, Fuzzy Convolution, Logistic Regression, and k-Nearest Neighbors classification/prediction methods were...

  15. Quasi- and pseudo-maximum likelihood estimators for discretely observed continuous-time Markov branching processes

    PubMed Central

    Chen, Rui; Hyrien, Ollivier

    2011-01-01

    This article deals with quasi- and pseudo-likelihood estimation in a class of continuous-time multi-type Markov branching processes observed at discrete points in time. “Conventional” and conditional estimation are discussed for both approaches. We compare their properties and identify situations where they lead to asymptotically equivalent estimators. Both approaches possess robustness properties, and coincide with maximum likelihood estimation in some cases. Quasi-likelihood functions involving only linear combinations of the data may be unable to estimate all model parameters. Remedial measures exist, including the resort either to non-linear functions of the data or to conditioning the moments on appropriate sigma-algebras. The method of pseudo-likelihood may also resolve this issue. We investigate the properties of these approaches in three examples: the pure birth process, the linear birth-and-death process, and a two-type process that generalizes the previous two examples. Simulation studies are conducted to evaluate performance in finite samples. PMID:21552356

  16. GRO/EGRET data analysis software: An integrated system of custom and commercial software using standard interfaces

    NASA Technical Reports Server (NTRS)

    Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.

    1992-01-01

    The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, take advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and statistical evaluation. This approach has several proven advantages including flexibility, a minimum of development effort, ease of use, and portability.

  17. Likelihood ratios for glaucoma diagnosis using spectral-domain optical coherence tomography.

    PubMed

    Lisboa, Renato; Mansouri, Kaweh; Zangwill, Linda M; Weinreb, Robert N; Medeiros, Felipe A

    2013-11-01

    To present a methodology for calculating likelihood ratios for glaucoma diagnosis for continuous retinal nerve fiber layer (RNFL) thickness measurements from spectral-domain optical coherence tomography (spectral-domain OCT). Observational cohort study. A total of 262 eyes of 187 patients with glaucoma and 190 eyes of 100 control subjects were included in the study. Subjects were recruited from the Diagnostic Innovations Glaucoma Study. Eyes with preperimetric and perimetric glaucomatous damage were included in the glaucoma group. The control group was composed of healthy eyes with normal visual fields from subjects recruited from the general population. All eyes underwent RNFL imaging with Spectralis spectral-domain OCT. Likelihood ratios for glaucoma diagnosis were estimated for specific global RNFL thickness measurements using a methodology based on estimating the tangents to the receiver operating characteristic (ROC) curve. Likelihood ratios could be determined for continuous values of average RNFL thickness. Average RNFL thickness values lower than 86 μm were associated with positive likelihood ratios (ie, likelihood ratios greater than 1), whereas RNFL thickness values higher than 86 μm were associated with negative likelihood ratios (ie, likelihood ratios smaller than 1). A modified Fagan nomogram was provided to assist calculation of posttest probability of disease from the calculated likelihood ratios and pretest probability of disease. The methodology allowed calculation of likelihood ratios for specific RNFL thickness values. By avoiding arbitrary categorization of test results, it potentially allows for an improved integration of test results into diagnostic clinical decision making. Copyright © 2013. Published by Elsevier Inc.
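
    The Fagan-nomogram step mentioned above is simple odds arithmetic: post-test odds equal pre-test odds times the likelihood ratio. A worked sketch with illustrative numbers (the LR values below are not taken from the study):

```python
def posttest_probability(pretest_p, likelihood_ratio):
    """Convert a pre-test probability to a post-test probability via odds."""
    pretest_odds = pretest_p / (1.0 - pretest_p)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1.0 + posttest_odds)

# RNFL thickness well below 86 um -> LR > 1 raises the probability of glaucoma
print(posttest_probability(0.20, 5.0))   # 0.20 pre-test -> ~0.56 post-test
# RNFL thickness above 86 um -> LR < 1 lowers it
print(posttest_probability(0.20, 0.3))   # 0.20 pre-test -> ~0.07 post-test
```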

  18. On equivalent parameter learning in simplified feature space based on Bayesian asymptotic analysis.

    PubMed

    Yamazaki, Keisuke

    2012-07-01

    Parametric models for sequential data, such as hidden Markov models, stochastic context-free grammars, and linear dynamical systems, are widely used in time-series analysis and structural data analysis. Computation of the likelihood function is one of the primary considerations in many learning methods. Iterative calculation of the likelihood, as required in model selection, is still time-consuming even though there are effective algorithms based on dynamic programming. The present paper studies parameter learning in a simplified feature space to reduce the computational cost. Simplifying data is a common technique in feature selection and dimension reduction, though an oversimplified space causes adverse learning results. We therefore mathematically investigate a condition under which a feature map, referred to as a vicarious map, yields an asymptotically equivalent convergence point of the estimated parameters. As a demonstration of finding vicarious maps, we consider a feature space that limits the length of the data and derive the length necessary for parameter learning in hidden Markov models. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Minimization for conditional simulation: Relationship to optimal transport

    NASA Astrophysics Data System (ADS)

    Oliver, Dean S.

    2014-05-01

    In this paper, we consider the problem of generating independent samples from a conditional distribution when independent samples from the prior distribution are available. Although there are exact methods for sampling from the posterior (e.g. Markov chain Monte Carlo or acceptance/rejection), these methods tend to be computationally demanding when evaluation of the likelihood function is expensive, as it is for most geoscience applications. As an alternative, in this paper we discuss deterministic mappings of variables distributed according to the prior to variables distributed according to the posterior. Although any deterministic mappings might be equally useful, we will focus our discussion on a class of algorithms that obtain implicit mappings by minimization of a cost function that includes measures of data mismatch and model variable mismatch. Algorithms of this type include quasi-linear estimation, randomized maximum likelihood, perturbed observation ensemble Kalman filter, and ensemble of perturbed analyses (4D-Var). When the prior pdf is Gaussian and the observation operators are linear, we show that these minimization-based simulation methods solve an optimal transport problem with a nonstandard cost function. When the observation operators are nonlinear, however, the mapping of variables from the prior to the posterior obtained from those methods is only approximate. Errors arise from neglect of the Jacobian determinant of the transformation and from the possibility of discontinuous mappings.
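
    For the linear-Gaussian case the paper highlights, the randomized-maximum-likelihood mapping has a closed form: each prior draw is mapped to the minimizer of a perturbed quadratic cost. A brief sketch under our own notation (G, Cd, Cm), not the paper's code:

```python
import numpy as np

def rml_sample(G, d, Cd, m_prior, Cm, rng):
    """Map one prior draw to one posterior draw by minimization (linear case)."""
    d_pert = d + rng.multivariate_normal(np.zeros(len(d)), Cd)   # perturb the data
    m_pert = rng.multivariate_normal(m_prior, Cm)                # draw from the prior
    Cd_inv, Cm_inv = np.linalg.inv(Cd), np.linalg.inv(Cm)
    H = G.T @ Cd_inv @ G + Cm_inv          # Hessian of the quadratic RML cost
    rhs = G.T @ Cd_inv @ d_pert + Cm_inv @ m_pert
    return np.linalg.solve(H, rhs)         # minimizer = one conditional sample
```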

  20. PepMapper: a collaborative web tool for mapping epitopes from affinity-selected peptides.

    PubMed

    Chen, Wenhan; Guo, William W; Huang, Yanxin; Ma, Zhiqiang

    2012-01-01

    Epitope mapping from affinity-selected peptides has become popular in epitope prediction, and correspondingly many Web-based tools have been developed in recent years. However, the performance of these tools varies in different circumstances. To address this problem, we employed an ensemble approach to incorporate two popular Web tools, MimoPro and Pep-3D-Search, together for taking advantages offered by both methods so as to give users more options for their specific purposes of epitope-peptide mapping. The combined operation of Union finds as many associated peptides as possible from both methods, which increases sensitivity in finding potential epitopic regions on a given antigen surface. The combined operation of Intersection achieves to some extent the mutual verification by the two methods and hence increases the likelihood of locating the genuine epitopic region on a given antigen in relation to the interacting peptides. The Consistency between Intersection and Union is an indirect sufficient condition to assess the likelihood of successful peptide-epitope mapping. On average from 27 tests, the combined operations of PepMapper outperformed either MimoPro or Pep-3D-Search alone. Therefore, PepMapper is another multipurpose mapping tool for epitope prediction from affinity-selected peptides. The Web server can be freely accessed at: http://informatics.nenu.edu.cn/PepMapper/

  1. Landslide susceptibility mapping for a landslide-prone area (Findikli, NE of Turkey) by likelihood-frequency ratio and weighted linear combination models

    NASA Astrophysics Data System (ADS)

    Akgun, Aykut; Dag, Serhat; Bulut, Fikri

    2008-05-01

    Landslides are very common natural problems in the Black Sea Region of Turkey due to the steep topography, improper use of land cover, and climatic conditions favourable to landslides. In the western part of the region, many studies have been carried out, especially in the last decade, for landslide susceptibility mapping using different evaluation methods such as deterministic approaches, landslide distribution, qualitative, statistical and distribution-free analyses. The purpose of this study is to produce landslide susceptibility maps of a landslide-prone area (Findikli district, Rize) located in the eastern part of the Black Sea Region of Turkey by the likelihood-frequency ratio (LRM) model and the weighted linear combination (WLC) model and to compare the results obtained. For this purpose, landslide inventory maps of the area were prepared for the years 1983 and 1995 by detailed field surveys and aerial-photography studies. Slope angle, slope aspect, lithology, distance from drainage lines, distance from roads and the land cover of the study area are considered as the landslide-conditioning parameters. The differences between the susceptibility maps derived by the LRM and the WLC models are relatively minor when broad-based classifications are taken into account. However, the WLC map showed more detail, whereas the LRM map produced weaker results; the reason is considered to be that the majority of pixels in the LRM map have higher values than in the WLC-derived susceptibility map. In order to validate the two susceptibility maps, both were compared with the landslide inventory map. Although no landslides fall in the very high susceptibility class of either map, 79% of the landslides fall into the high and very high susceptibility zones of the WLC map, compared with 49% for the LRM map. This shows that the WLC model exhibited higher performance than the LRM model.
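
    The likelihood-frequency ratio itself is an area-normalized ratio: for each class of a conditioning factor, the share of landslide pixels divided by the share of all pixels. A minimal sketch of how such ratings could be computed (names and structure are ours, not the authors' code):

```python
import numpy as np

def frequency_ratio(factor_class, landslide_mask):
    """factor_class: integer class raster; landslide_mask: boolean raster."""
    fr = {}
    n_slide, n_all = landslide_mask.sum(), factor_class.size
    for c in np.unique(factor_class):
        in_class = factor_class == c
        slide_share = landslide_mask[in_class].sum() / n_slide
        area_share = in_class.sum() / n_all
        fr[c] = slide_share / area_share   # >1: class favours landslides
    return fr

# The susceptibility index is then the sum of the per-factor FR values at each
# pixel; higher sums mark more susceptible ground.
```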

  2. Accurate motor mapping in awake common marmosets using micro-electrocorticographical stimulation and stochastic threshold estimation

    NASA Astrophysics Data System (ADS)

    Kosugi, Akito; Takemi, Mitsuaki; Tia, Banty; Castagnola, Elisa; Ansaldo, Alberto; Sato, Kenta; Awiszus, Friedemann; Seki, Kazuhiko; Ricci, Davide; Fadiga, Luciano; Iriki, Atsushi; Ushiba, Junichi

    2018-06-01

    Objective. The motor map has been widely used as an indicator of motor skills and learning, cortical injury, plasticity, and functional recovery. Cortical stimulation mapping using epidural electrodes has recently been adopted for animal studies. However, several technical limitations remain. Test-retest reliability of epidural cortical stimulation (ECS) mapping has not been examined in detail. Many previous studies defined evoked movements and motor thresholds by visual inspection and thus lacked quantitative measurement. A reliable and quantitative motor map is important for elucidating the mechanisms of motor cortical reorganization. The objective of the current study was to perform reliable ECS mapping of motor representations based on motor thresholds, which were stochastically estimated from motor evoked potentials recorded through chronically implanted micro-electrocorticographical (µECoG) electrode arrays, in common marmosets. Approach. ECS was applied using the implanted µECoG electrode arrays in three adult common marmosets under awake conditions. Motor evoked potentials were recorded through electromyographical electrodes implanted in upper limb muscles. The motor threshold was calculated through a modified maximum likelihood threshold-hunting algorithm fitted to the data recorded from the marmosets, and a computer simulation confirmed the reliability of the algorithm. Main results. The computer simulation suggested that the modified maximum likelihood threshold-hunting algorithm can estimate the motor threshold with acceptable precision. In vivo ECS mapping showed high test-retest reliability with respect to the excitability and location of the cortical forelimb motor representations. Significance. Using implanted µECoG electrode arrays and a modified motor threshold-hunting algorithm, we were able to achieve reliable motor mapping in common marmosets with the ECS system.

  3. Accurate motor mapping in awake common marmosets using micro-electrocorticographical stimulation and stochastic threshold estimation.

    PubMed

    Kosugi, Akito; Takemi, Mitsuaki; Tia, Banty; Castagnola, Elisa; Ansaldo, Alberto; Sato, Kenta; Awiszus, Friedemann; Seki, Kazuhiko; Ricci, Davide; Fadiga, Luciano; Iriki, Atsushi; Ushiba, Junichi

    2018-06-01

    The motor map has been widely used as an indicator of motor skills and learning, cortical injury, plasticity, and functional recovery. Cortical stimulation mapping using epidural electrodes has recently been adopted for animal studies. However, several technical limitations remain. Test-retest reliability of epidural cortical stimulation (ECS) mapping has not been examined in detail. Many previous studies defined evoked movements and motor thresholds by visual inspection and thus lacked quantitative measurement. A reliable and quantitative motor map is important for elucidating the mechanisms of motor cortical reorganization. The objective of the current study was to perform reliable ECS mapping of motor representations based on motor thresholds, which were stochastically estimated from motor evoked potentials recorded through chronically implanted micro-electrocorticographical (µECoG) electrode arrays, in common marmosets. ECS was applied using the implanted µECoG electrode arrays in three adult common marmosets under awake conditions. Motor evoked potentials were recorded through electromyographical electrodes implanted in upper limb muscles. The motor threshold was calculated through a modified maximum likelihood threshold-hunting algorithm fitted to the data recorded from the marmosets, and a computer simulation confirmed the reliability of the algorithm. The computer simulation suggested that the modified maximum likelihood threshold-hunting algorithm can estimate the motor threshold with acceptable precision. In vivo ECS mapping showed high test-retest reliability with respect to the excitability and location of the cortical forelimb motor representations. Using implanted µECoG electrode arrays and a modified motor threshold-hunting algorithm, we were able to achieve reliable motor mapping in common marmosets with the ECS system.
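
    Maximum-likelihood threshold hunting generally amounts to fitting a stimulus-response probability curve and reading off its midpoint. A generic sketch of that idea follows, using a plain logistic fit rather than the paper's modified algorithm; all names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def estimate_threshold(intensity, evoked):
    """intensity: stimulation currents; evoked: 1 if an MEP was observed."""
    intensity, evoked = np.asarray(intensity, float), np.asarray(evoked, float)

    def neg_log_lik(params):
        thresh, slope = params
        # logistic response curve; threshold = intensity at 50% response
        p = 1.0 / (1.0 + np.exp(-(intensity - thresh) / max(slope, 1e-6)))
        p = np.clip(p, 1e-9, 1 - 1e-9)            # numerical safety
        return -np.sum(evoked * np.log(p) + (1 - evoked) * np.log(1 - p))

    x0 = [intensity.mean(), intensity.std() or 1.0]
    return minimize(neg_log_lik, x0, method="Nelder-Mead").x[0]  # threshold
```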

  4. Empirical likelihood-based confidence intervals for the sensitivity of a continuous-scale diagnostic test at a fixed level of specificity.

    PubMed

    Gengsheng Qin; Davis, Angela E; Jing, Bing-Yi

    2011-06-01

    For a continuous-scale diagnostic test, it is often of interest to find the range of the sensitivity of the test at the cut-off that yields a desired specificity. In this article, we first define a profile empirical likelihood ratio for the sensitivity of a continuous-scale diagnostic test and show that its limiting distribution is a scaled chi-square distribution. We then propose two new empirical likelihood-based confidence intervals for the sensitivity of the test at a fixed level of specificity by using the scaled chi-square distribution. Simulation studies are conducted to compare the finite sample performance of the newly proposed intervals with the existing intervals for the sensitivity in terms of coverage probability. A real example is used to illustrate the application of the recommended methods.

  5. Implications of climate change on the distribution of the tick vector Ixodes scapularis and risk for Lyme disease in the Texas-Mexico transboundary region

    USDA-ARS?s Scientific Manuscript database

    Disease risk maps are important tools that help ascertain the likelihood of exposure to specific infectious agents. Understanding how climate change may affect the suitability of habitats for ticks will improve the accuracy of risk maps of tick-borne pathogen transmission in humans and domestic anim...

  6. Factors That Impact Registered Nurses' Decisions to Continue Providing Care to Older Adults

    ERIC Educational Resources Information Center

    Bosfield, Saundra

    2013-01-01

    The purpose of this study was to investigate if there is a significant difference in the following: (a) nurses' likelihood to remain in geriatrics between age groups (those over 40 years of age and those under 40 years of age); (b) nurses' likelihood to remain in geriatrics and personality traits; (c) nurses' likelihood to remain in geriatrics…

  7. An Activation Likelihood Estimation Meta-Analysis Study of Simple Motor Movements in Older and Young Adults

    PubMed Central

    Turesky, Ted K.; Turkeltaub, Peter E.; Eden, Guinevere F.

    2016-01-01

    The functional neuroanatomy of finger movements has been characterized with neuroimaging in young adults. However, less is known about the aging motor system. Several studies have contrasted movement-related activity in older versus young adults, but there is inconsistency among their findings. To address this, we conducted an activation likelihood estimation (ALE) meta-analysis on within-group data from older adults and young adults performing regularly paced right-hand finger movement tasks in response to external stimuli. We hypothesized that older adults would show a greater likelihood of activation in right cortical motor areas (i.e., ipsilateral to the side of movement) compared to young adults. ALE maps were examined for conjunction and between-group differences. Older adults showed overlapping likelihoods of activation with young adults in left primary sensorimotor cortex (SM1), bilateral supplementary motor area, bilateral insula, left thalamus, and right anterior cerebellum. Their ALE map differed from that of the young adults in right SM1 (extending into dorsal premotor cortex), right supramarginal gyrus, medial premotor cortex, and right posterior cerebellum. The finding that older adults uniquely use ipsilateral regions for right-hand finger movements and show age-dependent modulations in regions recruited by both age groups provides a foundation by which to understand age-related motor decline and motor disorders. PMID:27799910

  8. Adverse Effects of Electronic Cigarette Use: A Concept Mapping Approach

    PubMed Central

    Nasim, Aashir; Rosas, Scott

    2016-01-01

    Introduction: Electronic cigarette (ECIG) use has grown rapidly in popularity within a short period of time. As ECIG products continue to evolve and more individuals begin using ECIGs, it is important to understand the potential adverse effects that are associated with ECIG use. The purpose of this study was to examine and describe the acute adverse effects associated with ECIG use. Methods: This study used an integrated, mixed-method participatory approach called concept mapping (CM). Experienced ECIG users (n = 85) provided statements that answered the focus prompt “A specific negative or unpleasant effect (ie, physical or psychological) that I have experienced either during or immediately after using an electronic cigarette device is…” in an online program. Participants sorted these statements into piles of common themes and rated each statement. Using multidimensional scaling and hierarchical cluster analysis, a concept map of the adverse effects statements was created. Results: Participants generated 79 statements that completed the focus prompt and were retained by researchers. Analysis generated a map containing five clusters that characterized perceived adverse effects of ECIG use: Stigma, Worry/Guilt, Addiction Signs, Physical Effects, and Device/Vapor Problems. Conclusions: ECIG use is associated with adverse effects that should be monitored as ECIGs continue to grow in popularity. If ECIGs are to be regulated, policies should be created that minimize the likelihood of user-identified adverse effects. Implications: This article provides a list of adverse effects reported by experienced ECIG users and organizes them into a conceptual model that may be useful for better understanding the adverse outcomes associated with ECIG use. These identified adverse effects may be useful for health professionals and policy makers: health professionals should be aware of potential negative health effects that may be associated with ECIG use, and policy makers could design ECIG regulations that minimize the risk of the adverse effects reported by ECIG users in this study. PMID:26563262

  9. Clinical Paresthesia Atlas Illustrates Likelihood of Coverage Based on Spinal Cord Stimulator Electrode Location.

    PubMed

    Taghva, Alexander; Karst, Edward; Underwood, Paul

    2017-08-01

    Concordant paresthesia coverage is an independent predictor of pain relief following spinal cord stimulation (SCS). Using aggregate data, our objective is to produce a map of paresthesia coverage as a function of electrode location in SCS. This retrospective analysis used x-rays, SCS programming data, and paresthesia coverage maps from the EMPOWER registry of SCS implants for chronic neuropathic pain. Spinal level of dorsal column stimulation was determined by x-ray adjudication and active cathodes in patient programs. Likelihood of paresthesia coverage was determined as a function of stimulating electrode location. Segments of paresthesia coverage were grouped anatomically. Fisher's exact test was used to identify significant differences in likelihood of paresthesia coverage as a function of spinal stimulation level. In the 178 patients analyzed, the most prevalent areas of paresthesia coverage were buttocks, anterior and posterior thigh (each 98%), and low back (94%). Unwanted paresthesia at the ribs occurred in 8% of patients. There were significant differences in the likelihood of achieving paresthesia, with higher thoracic levels (T5, T6, and T7) more likely to achieve low back coverage but also more likely to introduce paresthesia felt at the ribs. Higher levels in the thoracic spine were associated with greater coverage of the buttocks, back, and thigh, and with lesser coverage of the leg and foot. This paresthesia atlas uses real-world, aggregate data to determine likelihood of paresthesia coverage as a function of stimulating electrode location. It represents an application of "big data" techniques, and a step toward achieving personalized SCS therapy tailored to the individual's chronic pain. © 2017 International Neuromodulation Society.
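
    Fisher's exact test on a 2 × 2 coverage table, as used in this analysis, is a one-liner in SciPy. A toy example with made-up counts (not the registry data):

```python
from scipy.stats import fisher_exact

#                    low back covered   not covered
table = [[40, 10],   # cathode at T5-T7
         [22, 28]]   # cathode at other levels
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio {odds_ratio:.2f}, p = {p_value:.4f}")
```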

  10. Functional reorganisation in chronic pain and neural correlates of pain sensitisation: A coordinate based meta-analysis of 266 cutaneous pain fMRI studies.

    PubMed

    Tanasescu, Radu; Cottam, William J; Condon, Laura; Tench, Christopher R; Auer, Dorothee P

    2016-09-01

    Maladaptive mechanisms of pain processing in chronic pain conditions (CP) are poorly understood. We used coordinate-based meta-analysis of 266 fMRI pain studies to study functional brain reorganisation in CP and experimental models of hyperalgesia. The pattern of nociceptive brain activation was similar in CP, hyperalgesia and normalgesia in controls. However, an elevated likelihood of activation was detected in the left putamen, left frontal gyrus and right insula in CP when comparing stimulation of the most painful site vs. other sites. Meta-analysis of contrast maps showed no difference between CP, controls, and mood conditions. In contrast, experimental hyperalgesia induced stronger activation in the bilateral insula, left cingulate and right frontal gyrus. Activation likelihood maps support a shared neural pain signature of cutaneous nociception in CP and controls. We also present a double dissociation between the neural correlates of transient and persistent pain sensitisation, with generally increased activation intensity but an unchanged pattern in experimental hyperalgesia and, by contrast, focally increased activation likelihood, but unchanged intensity, in CP when stimulated at the most painful body part. Copyright © 2016. Published by Elsevier Ltd.

  11. Satellite information on Orlando, Florida. [coordination of LANDSAT and Skylab data and EREP photography

    NASA Technical Reports Server (NTRS)

    Hannah, J. W.; Thomas, G. L.; Esparza, F.

    1975-01-01

    A land use map of Orange County, Florida was prepared from EREP photography while LANDSAT and EREP multispectral scanner data were used to provide more detailed information on Orlando and its suburbs. The generalized maps were prepared by tracing the patterns on an overlay, using an enlarging viewer. Digital analysis of the multispectral scanner data was basically the maximum likelihood classification method with training sample input and computer printer mapping of the results. Urban features delineated by the maps are discussed. It is concluded that computer classification, accompanied by human interpretation and manual simplification can produce land use maps which are useful on a regional, county, and city basis.

  12. Box-Cox transformation for QTL mapping.

    PubMed

    Yang, Runqing; Yi, Nengjun; Xu, Shizhong

    2006-01-01

    The maximum likelihood method of QTL mapping assumes that the phenotypic values of a quantitative trait follow a normal distribution. If the assumption is violated, some form of transformation should be applied to make the assumption approximately true. The Box-Cox transformation is a general transformation method which can be applied to many different types of data. The flexibility of the Box-Cox transformation is due to a parameter, called the transformation factor, appearing in the Box-Cox formula. We developed a maximum likelihood method that treats the transformation factor as an unknown parameter, which is estimated from the data simultaneously along with the QTL parameters. The method makes an objective choice of data transformation and thus can be applied to QTL analysis for many different types of data. Simulation studies show that (1) the Box-Cox transformation can substantially increase the power of QTL detection; (2) the Box-Cox transformation can replace some specialized transformation methods that are commonly used in QTL mapping; and (3) applying the Box-Cox transformation to data already normally distributed does not harm the result.
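
    The key move is profiling the likelihood over the transformation factor. A compact sketch for a single skewed sample follows (in QTL mapping the same profiling is done jointly with the QTL parameters); the data and grid of factors are our illustrative choices.

```python
import numpy as np

def boxcox(y, lam):
    return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

def profile_loglik(y, lam):
    z = boxcox(y, lam)
    n = len(y)
    # normal log-likelihood at the MLEs of mean/variance (constants dropped),
    # plus the Jacobian of the transformation: sum of (lam - 1) * log(y)
    return -0.5 * n * np.log(np.var(z)) + (lam - 1.0) * np.sum(np.log(y))

y = np.random.default_rng(1).gamma(shape=2.0, scale=3.0, size=200)  # skewed trait
lams = np.linspace(-1, 2, 61)
best = lams[np.argmax([profile_loglik(y, lam) for lam in lams])]
print("estimated transformation factor:", best)
```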

  13. Approximate likelihood approaches for detecting the influence of primordial gravitational waves in cosmic microwave background polarization

    NASA Astrophysics Data System (ADS)

    Pan, Zhen; Anderes, Ethan; Knox, Lloyd

    2018-05-01

    One of the major targets for next-generation cosmic microwave background (CMB) experiments is the detection of the primordial B-mode signal. Planning is under way for Stage-IV experiments that are projected to have instrumental noise small enough to make lensing and foregrounds the dominant sources of uncertainty for estimating the tensor-to-scalar ratio r from polarization maps. This makes delensing a crucial part of future CMB polarization science. In this paper we present a likelihood method for estimating the tensor-to-scalar ratio r from CMB polarization observations, which combines the benefits of a full-scale likelihood approach with the tractability of the quadratic delensing technique. This method is a pixel-space, all-order likelihood analysis of the quadratically delensed B modes, and it essentially builds upon the quadratic delenser by taking into account all-order lensing and pixel-space anomalies. Its tractability relies on a crucial factorization of the pixel-space covariance matrix of the polarization observations, which allows one to compute the full Gaussian approximate likelihood profile, as a function of r, at the computational cost of a single likelihood evaluation.

  14. Gaussian statistics of the cosmic microwave background: Correlation of temperature extrema in the COBE DMR two-year sky maps

    NASA Technical Reports Server (NTRS)

    Kogut, A.; Banday, A. J.; Bennett, C. L.; Hinshaw, G.; Lubin, P. M.; Smoot, G. F.

    1995-01-01

    We use the two-point correlation function of the extrema points (peaks and valleys) in the Cosmic Background Explorer (COBE) Differential Microwave Radiometers (DMR) 2 year sky maps as a test for non-Gaussian temperature distribution in the cosmic microwave background anisotropy. A maximum-likelihood analysis compares the DMR data to n = 1 toy models whose random-phase spherical harmonic components a(sub lm) are drawn from either Gaussian, chi-square, or log-normal parent populations. The likelihood of the 53 GHz (A+B)/2 data is greatest for the exact Gaussian model. There is less than 10% chance that the non-Gaussian models tested describe the DMR data, limited primarily by type II errors in the statistical inference. The extrema correlation function is a stronger test for this class of non-Gaussian models than topological statistics such as the genus.

  15. Inverse Ising problem in continuous time: A latent variable approach

    NASA Astrophysics Data System (ADS)

    Donner, Christian; Opper, Manfred

    2017-12-01

    We consider the inverse Ising problem: the inference of network couplings from observed spin trajectories for a model with continuous time Glauber dynamics. By introducing two sets of auxiliary latent random variables we render the likelihood into a form which allows for simple iterative inference algorithms with analytical updates. The variables are (1) Poisson variables to linearize an exponential term which is typical for point process likelihoods and (2) Pólya-Gamma variables, which make the likelihood quadratic in the coupling parameters. Using the augmented likelihood, we derive an expectation-maximization (EM) algorithm to obtain the maximum likelihood estimate of network parameters. Using a third set of latent variables we extend the EM algorithm to sparse couplings via L1 regularization. Finally, we develop an efficient approximate Bayesian inference algorithm using a variational approach. We demonstrate the performance of our algorithms on data simulated from an Ising model. For data which are simulated from a more biologically plausible network with spiking neurons, we show that the Ising model captures well the low order statistics of the data and how the Ising couplings are related to the underlying synaptic structure of the simulated network.

  16. A high-density transcript linkage map with 1,845 expressed genes positioned by microarray-based Single Feature Polymorphisms (SFP) in Eucalyptus

    PubMed Central

    2011-01-01

    Background: Technological advances are progressively increasing the application of genomics to a wider array of economically and ecologically important species. High-density maps enriched for transcribed genes facilitate the discovery of connections between genes and phenotypes. We report the construction of a high-density linkage map of expressed genes for the heterozygous genome of Eucalyptus using Single Feature Polymorphism (SFP) markers. Results: SFP discovery and mapping were achieved using pseudo-testcross screening and selective mapping to simultaneously optimize linkage mapping and microarray costs. SFP genotyping was carried out by hybridizing complementary RNA, prepared from the xylem of 4.5-year-old trees, to an SFP array containing 103,000 25-mer oligonucleotide probes representing 20,726 unigenes derived from a modest-sized collection of expressed sequence tags. An SFP-mapping microarray with 43,777 selected candidate SFP probes representing 15,698 genes was subsequently designed and used to genotype SFPs in a larger subset of the segregating population drawn by selective mapping. A total of 1,845 genes were mapped, with 884 of them ordered with high likelihood support on a framework map anchored to 180 microsatellites with an average density of 1.2 cM. Using more probes per unigene increased by two-fold the likelihood of detecting segregating SFPs, eventually resulting in more genes mapped. In silico validation showed that 87% of the SFPs map to the expected location on the 4.5X draft sequence of the Eucalyptus grandis genome. Conclusions: The Eucalyptus 1,845-gene map is the most highly enriched map for transcriptional information for any forest tree species to date. It represents a major improvement on the number of genes previously positioned on Eucalyptus maps and provides an initial glimpse at the gene space of this global tree genome. A general protocol is proposed to build high-density transcript linkage maps in less characterized plant species by SFP genotyping, with a concurrent objective of reducing microarray costs. High-density gene-rich maps represent a powerful resource to assist gene discovery endeavors when used in combination with QTL and association mapping, and should be especially valuable in assisting the assembly of reference genome sequences soon to come for several plant and animal species. PMID:21492453

  17. PROBABILISTIC CROSS-IDENTIFICATION IN CROWDED FIELDS AS AN ASSIGNMENT PROBLEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budavári, Tamás; Basu, Amitabh, E-mail: budavari@jhu.edu, E-mail: basu.amitabh@jhu.edu

    2016-10-01

    One of the outstanding challenges of cross-identification is multiplicity: detections in crowded regions of the sky are often linked to more than one candidate association of similar likelihood. We map the resulting maximum likelihood partitioning to the fundamental assignment problem of discrete mathematics and efficiently solve the two-way catalog-level matching in the realm of combinatorial optimization using the so-called Hungarian algorithm. We introduce the method, demonstrate its performance in a mock universe where the true associations are known, and discuss the applicability of the new procedure to large surveys.

  18. Probabilistic Cross-identification in Crowded Fields as an Assignment Problem

    NASA Astrophysics Data System (ADS)

    Budavári, Tamás; Basu, Amitabh

    2016-10-01

    One of the outstanding challenges of cross-identification is multiplicity: detections in crowded regions of the sky are often linked to more than one candidate association of similar likelihood. We map the resulting maximum likelihood partitioning to the fundamental assignment problem of discrete mathematics and efficiently solve the two-way catalog-level matching in the realm of combinatorial optimization using the so-called Hungarian algorithm. We introduce the method, demonstrate its performance in a mock universe where the true associations are known, and discuss the applicability of the new procedure to large surveys.
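
    The assignment step maps directly onto SciPy's Hungarian-algorithm solver: build a cost matrix of negative log-likelihoods for every detection pair and minimize the total cost. A toy sketch with a made-up cost matrix (not the paper's code or data):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# neg_log_lik[i, j]: cost of associating detection i in catalog A with j in B
neg_log_lik = np.array([[0.2, 3.1, 2.5],
                        [2.8, 0.4, 1.9],
                        [2.6, 1.7, 0.3]])
rows, cols = linear_sum_assignment(neg_log_lik)   # maximum-likelihood partition
for i, j in zip(rows, cols):
    print(f"A{i} <-> B{j}  (cost {neg_log_lik[i, j]:.1f})")
```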

  19. Mapping of quantitative trait loci using the skew-normal distribution.

    PubMed

    Fernandes, Elisabete; Pacheco, António; Penha-Gonçalves, Carlos

    2007-11-01

    In standard interval mapping (IM) of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. When this assumption of normality is violated, the most commonly adopted strategy is to use the previous model after data transformation. However, an appropriate transformation may not exist or may be difficult to find. This approach can also raise interpretation issues. An interesting alternative is to consider a skew-normal mixture model in standard IM; the resulting method is here denoted skew-normal IM. This flexible model, which includes the usual symmetric normal distribution as a special case, allows continuous variation from normality to non-normality. In this paper we briefly introduce the main properties of the skew-normal distribution. The maximum likelihood estimates of the parameters of the skew-normal distribution are obtained by the expectation-maximization (EM) algorithm. The proposed model is illustrated with real data from an intercross experiment that shows a significant departure from the normality assumption. The performance of skew-normal IM is assessed via stochastic simulation. The results indicate that skew-normal IM has higher power for QTL detection and better precision of QTL location as compared to standard IM and nonparametric IM.
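
    For readers who want to experiment with the distributional building block, here is a minimal sketch of maximum-likelihood fitting of a single skew-normal in Python using SciPy; this illustrates the likelihood machinery only and is not the authors' EM implementation for the mixture model:

      import numpy as np
      from scipy.stats import skewnorm

      rng = np.random.default_rng(0)
      data = skewnorm.rvs(a=4.0, loc=10.0, scale=2.0, size=500, random_state=rng)

      # skewnorm.fit maximizes the likelihood over shape a, location, and scale;
      # a = 0 recovers the ordinary normal as a special case.
      a_hat, loc_hat, scale_hat = skewnorm.fit(data)
      print(a_hat, loc_hat, scale_hat)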

  20. Using known map category marginal frequencies to improve estimates of thematic map accuracy

    NASA Technical Reports Server (NTRS)

    Card, D. H.

    1982-01-01

    By means of two simple sampling plans suggested in the accuracy-assessment literature, it is shown how one can use knowledge of map-category relative sizes to improve estimates of various probabilities. The fact that the maximum likelihood estimates of cell probabilities for simple random sampling and map-category-stratified sampling are identical permits a unified treatment of the contingency-table analysis. The results for the stratified case make possible a rigorous analysis of the effect of sampling independently within map categories. It is noted that matters such as optimal sample-size selection for achieving a desired level of precision in the various estimators are irrelevant here, since the derived estimators are valid irrespective of how sample sizes are chosen.
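
    A toy numerical sketch of the core idea, weighting per-category accuracies by known map-category marginal frequencies; all counts and proportions below are invented for illustration:

      import numpy as np

      w = np.array([0.60, 0.30, 0.10])      # known map-category proportions
      correct = np.array([45, 40, 30])      # correct labels among samples per category
      sampled = np.array([50, 50, 50])      # samples drawn within each category

      per_category_acc = correct / sampled               # accuracy within each stratum
      overall_acc = float((w * per_category_acc).sum())  # stratified overall estimate
      print(per_category_acc, overall_acc)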

  1. Maximum-Likelihood Detection Of Noncoherent CPM

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors proposed for use in maximum-likelihood-sequence detection of symbols in alphabet of size M transmitted by uncoded, full-response continuous phase modulation over radio channel with additive white Gaussian noise. Structures of receivers derived from particular interpretation of maximum-likelihood metrics. Receivers include front ends, structures of which depend only on M, analogous to those in receivers of coherent CPM. Parts of receivers following front ends have structures, complexity of which depends on the observation length N.

  2. Uncertainty in the profitability of fertilizer management based on various sampling designs.

    NASA Astrophysics Data System (ADS)

    Muhammed, Shibu; Marchant, Ben; Webster, Richard; Milne, Alice; Dailey, Gordon; Whitmore, Andrew

    2016-04-01

    Many farmers sample their soil to measure the concentrations of plant nutrients, including phosphorus (P), so as to decide how much fertilizer to apply. Now that fertilizer can be applied at variable rates, farmers want to know whether maps of nutrient concentration made from grid samples or from field subdivisions (zones within their fields) are merited: do such maps lead to greater profit than would a single measurement on a bulked sample for each field when all costs are taken into account? We have examined the merits of grid-based and zone-based sampling strategies over single field-based averages using continuous spatial data on wheat yields at harvest in six fields in southern England and simulated concentrations of P in the soil. Features of the spatial variation in the yields provide predictions about which sampling scheme is likely to be most cost effective, but there is uncertainty associated with these predictions that must be communicated to farmers. Where variograms of the yield have large variances and long effective ranges, grid sampling and mapping nutrients are likely to be cost-effective. Where effective ranges are short, sampling must be dense to reveal the spatial variation and may be expensive. In these circumstances variable-rate application of fertilizer is likely to be impracticable and almost certainly not cost-effective. We have explored several methods for communicating these results and found that the most effective method was using probability maps that show the likelihood of grid-based and zone-based sampling being more profitable than a field-based estimate.
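
    The variogram diagnostics the authors rely on can be computed with a few lines of code. A rough sketch of an empirical variogram, with hypothetical positions and a random stand-in for yield (a real analysis would fit a model to these bin averages to read off the sill and effective range):

      import numpy as np

      def empirical_variogram(xy, z, bin_edges):
          d = np.sqrt(((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1))
          g = 0.5 * (z[:, None] - z[None, :]) ** 2      # semivariance of each pair
          iu = np.triu_indices(len(z), k=1)             # unique pairs only
          d, g = d[iu], g[iu]
          return np.array([g[(d >= lo) & (d < hi)].mean()
                           for lo, hi in zip(bin_edges[:-1], bin_edges[1:])])

      xy = np.random.rand(200, 2) * 500                 # positions in metres
      z = np.random.rand(200)                           # stand-in for yield
      print(empirical_variogram(xy, z, np.linspace(0, 250, 6)))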

  3. Nonparametric modeling of longitudinal covariance structure in functional mapping of quantitative trait loci.

    PubMed

    Yap, John Stephen; Fan, Jianqing; Wu, Rongling

    2009-12-01

    Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L(2) penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the utilization of the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.
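
    A minimal sketch of the modified-Cholesky idea described above: each time point is regressed on its predecessors with an L2 (ridge) penalty and the covariance matrix is rebuilt from the fitted factors. The penalty strength lam is a placeholder; the paper embeds this estimator in a mixture-likelihood framework, which is omitted here:

      import numpy as np

      def penalized_cov(X, lam=1.0):
          # X: (n, p) trait measured at p time points; returns a (p, p) estimate
          n, p = X.shape
          Xc = X - X.mean(axis=0)
          T = np.eye(p)                     # unit lower-triangular factor
          d = np.empty(p)
          d[0] = Xc[:, 0].var()
          for t in range(1, p):
              Z, y = Xc[:, :t], Xc[:, t]
              phi = np.linalg.solve(Z.T @ Z + lam * np.eye(t), Z.T @ y)  # ridge fit
              T[t, :t] = -phi               # x_t regressed on its predecessors
              d[t] = ((y - Z @ phi) ** 2).mean()
          A = np.linalg.inv(T)
          return A @ np.diag(d) @ A.T       # Sigma = T^{-1} D T^{-T}

      print(penalized_cov(np.random.randn(100, 6)))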

  4. Landsat continuity: Issues and opportunities for land cover monitoring

    USGS Publications Warehouse

    Wulder, M.A.; White, Joanne C.; Goward, S.N.; Masek, J.G.; Irons, J.R.; Herold, M.; Cohen, W.B.; Loveland, Thomas R.; Woodcock, C.E.

    2008-01-01

    Initiated in 1972, the Landsat program has provided a continuous record of earth observation for 35 years. The assemblage of Landsat spatial, spectral, and temporal resolutions, over a reasonably sized image extent, results in imagery that can be processed to represent land cover over large areas with an amount of spatial detail that is absolutely unique and indispensable for monitoring, management, and scientific activities. Recent technical problems with the two existing Landsat satellites, and delays in the development and launch of a successor, increase the likelihood that a gap in Landsat continuity may occur. In this communication, we identify the key features of the Landsat program that have resulted in the extensive use of Landsat data for large area land cover mapping and monitoring. We then augment this list of key features by examining the data needs of existing large area land cover monitoring programs. Subsequently, we use this list as a basis for reviewing the current constellation of earth observation satellites to identify potential alternative data sources for large area land cover applications. Notions of a virtual constellation of satellites to meet large area land cover mapping and monitoring needs are also presented. Finally, research priorities that would facilitate the integration of these alternative data sources into existing large area land cover monitoring programs are identified. Continuity of the Landsat program and the measurements provided are critical for scientific, environmental, economic, and social purposes. It is difficult to overstate the importance of Landsat; there are no other systems in orbit, or planned for launch in the short-term, that can duplicate, or approach replication of, the measurements and information conferred by Landsat. While technical and political options are being pursued, there is no satellite image data stream poised to enter the National Satellite Land Remote Sensing Data Archive should system failures occur to Landsat-5 and -7.

  5. An improved non-blind image deblurring method based on FoEs

    NASA Astrophysics Data System (ADS)

    Zhu, Qidan; Sun, Lei

    2013-03-01

    Traditional non-blind image deblurring algorithms generally use maximum a posteriori (MAP) estimation. MAP estimates involving natural image priors can reduce ripples effectively, in contrast to maximum likelihood (ML); however, they have been found lacking in terms of restoration performance. To address this issue, we replace traditional MAP with MAP plus a KL penalty. We develop an image reconstruction algorithm that minimizes the KL divergence between the reference distribution and the prior distribution. The approximate KL penalty can restrain the over-smoothing caused by MAP. We use three groups of images and Harris corner detection to evaluate our method. The experimental results show that our non-blind image restoration algorithm can effectively reduce the ringing effect and exhibits state-of-the-art deblurring results.
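
    As a hedged illustration of the difference between ML and regularized MAP restoration, the sketch below runs gradient descent on a data-fidelity term plus a simple quadratic smoothness prior; the paper's FoE prior with a KL penalty is considerably more elaborate and is not reproduced here:

      import numpy as np
      from scipy.ndimage import convolve, laplace

      def map_deblur(y, kernel, lam=0.05, lr=0.5, iters=200):
          x = y.copy()
          flipped = kernel[::-1, ::-1]                  # adjoint of the blur
          for _ in range(iters):
              resid = convolve(x, kernel) - y           # data-fidelity residual
              grad = convolve(resid, flipped) - lam * laplace(x)  # MAP gradient
              x -= lr * grad
          return x

      rng = np.random.default_rng(7)
      k = np.ones((5, 5)) / 25.0                        # box-blur kernel
      blurred = convolve(rng.random((64, 64)), k) + 0.01 * rng.normal(size=(64, 64))
      restored = map_deblur(blurred, k)                 # lam = 0 reduces to plain ML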

  6. Planck 2015 results. I. Overview of products and scientific results

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Adam, R.; Ade, P. A. R.; Aghanim, N.; Akrami, Y.; Alves, M. I. R.; Argüeso, F.; Arnaud, M.; Arroja, F.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Ballardini, M.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Bartolo, N.; Basak, S.; Battaglia, P.; Battaner, E.; Battye, R.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bertincourt, B.; Bielewicz, P.; Bikmaev, I.; Bock, J. J.; Böhringer, H.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burenin, R.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Carvalho, P.; Casaponsa, B.; Castex, G.; Catalano, A.; Challinor, A.; Chamballu, A.; Chary, R.-R.; Chiang, H. C.; Chluba, J.; Chon, G.; Christensen, P. R.; Church, S.; Clemens, M.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Comis, B.; Contreras, D.; Couchot, F.; Coulais, A.; Crill, B. P.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Di Valentino, E.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Eisenhardt, P. R. M.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Falgarone, E.; Fantaye, Y.; Farhang, M.; Feeney, S.; Fergusson, J.; Fernandez-Cobos, R.; Feroz, F.; Finelli, F.; Florido, E.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschet, C.; Franceschi, E.; Frejsel, A.; Frolov, A.; Galeotta, S.; Galli, S.; Ganga, K.; Gauthier, C.; Génova-Santos, R. T.; Gerbino, M.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Giusarma, E.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Grainge, K. J. B.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hamann, J.; Handley, W.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Heavens, A.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huang, Z.; Huffenberger, K. M.; Hurier, G.; Ilić, S.; Jaffe, A. H.; Jaffe, T. R.; Jin, T.; Jones, W. C.; Juvela, M.; Karakci, A.; Keihänen, E.; Keskitalo, R.; Khamitov, I.; Kiiveri, K.; Kim, J.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Krachmalnicoff, N.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Langer, M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Le Jeune, M.; Leahy, J. P.; Lellouch, E.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Levrier, F.; Lewis, A.; Liguori, M.; Lilje, P. B.; Lilley, M.; Linden-Vørnle, M.; Lindholm, V.; Liu, H.; López-Caniego, M.; Lubin, P. M.; Ma, Y.-Z.; Macías-Pérez, J. F.; Maggio, G.; Maino, D.; Mak, D. S. Y.; Mandolesi, N.; Mangilli, A.; Marchini, A.; Marcos-Caballero, A.; Marinucci, D.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martinelli, M.; Martínez-González, E.; Masi, S.; Matarrese, S.; Mazzotta, P.; McEwen, J. D.; McGehee, P.; Mei, S.; Meinhold, P. R.; Melchiorri, A.; Melin, J.-B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mikkelsen, K.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Moreno, R.; Morgante, G.; Mortlock, D.; Moss, A.; Mottet, S.; Münchmeyer, M.; Munshi, D.; Murphy, J. A.; Narimani, A.; Naselsky, P.; Nastasi, A.; Nati, F.; Natoli, P.; Negrello, M.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Olamaie, M.; Oppermann, N.; Orlando, E.; Oxborrow, C. 
A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Pandolfi, S.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Peel, M.; Peiris, H. V.; Pelkonen, V.-M.; Perdereau, O.; Perotto, L.; Perrott, Y. C.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pogosyan, D.; Pointecouteau, E.; Polenta, G.; Popa, L.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Racine, B.; Reach, W. T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Roman, M.; Romelli, E.; Rosset, C.; Rossetti, M.; Rotti, A.; Roudier, G.; Rouillé d'Orfeuil, B.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Ruiz-Granados, B.; Rumsey, C.; Rusholme, B.; Said, N.; Salvatelli, V.; Salvati, L.; Sandri, M.; Sanghera, H. S.; Santos, D.; Saunders, R. D. E.; Sauvé, A.; Savelainen, M.; Savini, G.; Schaefer, B. M.; Schammel, M. P.; Scott, D.; Seiffert, M. D.; Serra, P.; Shellard, E. P. S.; Shimwell, T. W.; Shiraishi, M.; Smith, K.; Souradeep, T.; Spencer, L. D.; Spinelli, M.; Stanford, S. A.; Stern, D.; Stolyarov, V.; Stompor, R.; Strong, A. W.; Sudiwala, R.; Sunyaev, R.; Sutter, P.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Texier, D.; Toffolatti, L.; Tomasi, M.; Tornikoski, M.; Tramonte, D.; Tristram, M.; Troja, A.; Trombetti, T.; Tucci, M.; Tuovinen, J.; Türler, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, F.; Vassallo, T.; Vibert, L.; Vidal, M.; Viel, M.; Vielva, P.; Villa, F.; Wade, L. A.; Walter, B.; Wandelt, B. D.; Watson, R.; Wehus, I. K.; Welikala, N.; Weller, J.; White, M.; White, S. D. M.; Wilkinson, A.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.

    2016-09-01

    The European Space Agency's Planck satellite, which is dedicated to studying the early Universe and its subsequent evolution, was launched on 14 May 2009. It scanned the microwave and submillimetre sky continuously between 12 August 2009 and 23 October 2013. In February 2015, ESA and the Planck Collaboration released the second set of cosmology products based on data from the entire Planck mission, including both temperature and polarization, along with a set of scientific and technical papers and a web-based explanatory supplement. This paper gives an overview of the main characteristics of the data and the data products in the release, as well as the associated cosmological and astrophysical science results and papers. The data products include maps of the cosmic microwave background (CMB), the thermal Sunyaev-Zeldovich effect, diffuse foregrounds in temperature and polarization, catalogues of compact Galactic and extragalactic sources (including separate catalogues of Sunyaev-Zeldovich clusters and Galactic cold clumps), and extensive simulations of signals and noise used in assessing uncertainties and the performance of the analysis methods. The likelihood code used to assess cosmological models against the Planck data is described, along with a CMB lensing likelihood. Scientific results include cosmological parameters derived from CMB power spectra, gravitational lensing, and cluster counts, as well as constraints on inflation, non-Gaussianity, primordial magnetic fields, dark energy, and modified gravity, and new results on low-frequency Galactic foregrounds.

  7. Decoding fMRI events in sensorimotor motor network using sparse paradigm free mapping and activation likelihood estimates.

    PubMed

    Tan, Francisca M; Caballero-Gaudes, César; Mullinger, Karen J; Cho, Siu-Yeung; Zhang, Yaping; Dryden, Ian L; Francis, Susan T; Gowland, Penny A

    2017-11-01

    Most functional MRI (fMRI) studies map task-driven brain activity using a block or event-related paradigm. Sparse paradigm free mapping (SPFM) can detect the onset and spatial distribution of BOLD events in the brain without prior timing information, but relating the detected events to brain function remains a challenge. In this study, we developed a decoding method for SPFM using a coordinate-based meta-analysis method of activation likelihood estimation (ALE). We defined meta-maps of statistically significant ALE values that correspond to types of events and calculated a summation overlap between the normalized meta-maps and SPFM maps. As a proof of concept, this framework was applied to relate SPFM-detected events in the sensorimotor network (SMN) to six motor functions (left/right fingers, left/right toes, swallowing, and eye blinks). We validated the framework using simultaneous electromyography (EMG)-fMRI experiments and motor tasks with short and long duration, and random interstimulus interval. The decoding scores were considerably lower for eye movements relative to other movement types tested. The average success rates for short and long motor events were 77 ± 13% and 74 ± 16%, respectively, excluding eye movements. We found good agreement between the decoding results and EMG for most events and subjects, with a range in sensitivity between 55% and 100%, excluding eye movements. The proposed method was then used to classify the movement types of spontaneous single-trial events in the SMN during resting state, which produced an average success rate of 22 ± 12%. Finally, this article discusses methodological implications and improvements to increase the decoding performance. Hum Brain Mapp 38:5778-5794, 2017. © 2017 Wiley Periodicals, Inc.

  8. Decoding fMRI events in Sensorimotor Motor Network using Sparse Paradigm Free Mapping and Activation Likelihood Estimates

    PubMed Central

    Tan, Francisca M.; Caballero-Gaudes, César; Mullinger, Karen J.; Cho, Siu-Yeung; Zhang, Yaping; Dryden, Ian L.; Francis, Susan T.; Gowland, Penny A.

    2017-01-01

    Most fMRI studies map task-driven brain activity using a block or event-related paradigm. Sparse Paradigm Free Mapping (SPFM) can detect the onset and spatial distribution of BOLD events in the brain without prior timing information, but relating the detected events to brain function remains a challenge. In this study, we developed a decoding method for SPFM using a coordinate-based meta-analysis method of Activation Likelihood Estimation (ALE). We defined meta-maps of statistically significant ALE values that correspond to types of events and calculated a summation overlap between the normalized meta-maps and SPFM maps. As a proof of concept, this framework was applied to relate SPFM-detected events in the Sensorimotor Network (SMN) to six motor functions (left/right fingers, left/right toes, swallowing, and eye blinks). We validated the framework using simultaneous Electromyography-fMRI experiments and motor tasks with short and long duration, and random inter-stimulus interval. The decoding scores were considerably lower for eye movements relative to other movement types tested. The average success rates for short and long motor events were 77 ± 13% and 74 ± 16%, respectively, excluding eye movements. We found good agreement between the decoding results and EMG for most events and subjects, with a range in sensitivity between 55 and 100%, excluding eye movements. The proposed method was then used to classify the movement types of spontaneous single-trial events in the SMN during resting state, which produced an average success rate of 22 ± 12%. Finally, this paper discusses methodological implications and improvements to increase the decoding performance. PMID:28815863
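
    The decoding rule reduces to a weighted overlap score. A schematic sketch, with random arrays standing in for the normalized ALE meta-maps and the SPFM event map:

      import numpy as np

      rng = np.random.default_rng(1)
      types = ["l_fingers", "r_fingers", "swallow"]
      meta = {t: rng.random(1000) for t in types}       # stand-in ALE meta-maps
      meta = {t: m / m.sum() for t, m in meta.items()}  # normalize each meta-map

      spfm_map = rng.random(1000)                       # stand-in SPFM event map
      scores = {t: float((m * spfm_map).sum()) for t, m in meta.items()}
      decoded = max(scores, key=scores.get)             # best-overlap event type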

  9. Applying six classifiers to airborne hyperspectral imagery for detecting giant reed

    USDA-ARS?s Scientific Manuscript database

    This study evaluated and compared six different image classifiers, including minimum distance (MD), Mahalanobis distance (MAHD), maximum likelihood (ML), spectral angle mapper (SAM), mixture tuned matched filtering (MTMF) and support vector machine (SVM), for detecting and mapping giant reed (Arundo...

  10. Likelihood Ratios for Glaucoma Diagnosis Using Spectral Domain Optical Coherence Tomography

    PubMed Central

    Lisboa, Renato; Mansouri, Kaweh; Zangwill, Linda M.; Weinreb, Robert N.; Medeiros, Felipe A.

    2014-01-01

    Purpose To present a methodology for calculating likelihood ratios for glaucoma diagnosis for continuous retinal nerve fiber layer (RNFL) thickness measurements from spectral domain optical coherence tomography (spectral-domain OCT). Design Observational cohort study. Methods 262 eyes of 187 patients with glaucoma and 190 eyes of 100 control subjects were included in the study. Subjects were recruited from the Diagnostic Innovations Glaucoma Study. Eyes with preperimetric and perimetric glaucomatous damage were included in the glaucoma group. The control group was composed of healthy eyes with normal visual fields from subjects recruited from the general population. All eyes underwent RNFL imaging with Spectralis spectral-domain OCT. Likelihood ratios for glaucoma diagnosis were estimated for specific global RNFL thickness measurements using a methodology based on estimating the tangents to the Receiver Operating Characteristic (ROC) curve. Results Likelihood ratios could be determined for continuous values of average RNFL thickness. Average RNFL thickness values lower than 86μm were associated with positive LRs, i.e., LRs greater than 1; whereas RNFL thickness values higher than 86μm were associated with negative LRs, i.e., LRs smaller than 1. A modified Fagan nomogram was provided to assist calculation of post-test probability of disease from the calculated likelihood ratios and pretest probability of disease. Conclusion The methodology allowed calculation of likelihood ratios for specific RNFL thickness values. By avoiding arbitrary categorization of test results, it potentially allows for an improved integration of test results into diagnostic clinical decision-making. PMID:23972303
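
    The tangent-to-ROC construction can be approximated numerically: the likelihood ratio at a given thickness is the local slope of sensitivity versus (1 - specificity). A sketch on simulated RNFL values (the distributions below are invented, not the study data):

      import numpy as np

      rng = np.random.default_rng(2)
      glaucoma = rng.normal(70, 12, 400)    # simulated RNFL thickness, um
      healthy = rng.normal(95, 10, 300)

      def likelihood_ratio(t, h=2.0):
          sens = lambda c: (glaucoma < c).mean()   # thinner RNFL counts as positive
          fpr = lambda c: (healthy < c).mean()
          return ((sens(t + h) - sens(t - h)) /
                  max(fpr(t + h) - fpr(t - h), 1e-9))  # local ROC slope

      print(likelihood_ratio(86.0))   # values below 86 um gave LR > 1 in the study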

  11. Assessment of goals and priorities in patients with a chronic condition: a secondary quantitative analysis of determinants across 11 countries.

    PubMed

    Vermunt, Neeltje P C A; Westert, Gert P; Olde Rikkert, Marcel G M; Faber, Marjan J

    2018-03-01

    To assess the impact of patient characteristics, patient-professional engagement, communication and context on the probability that healthcare professionals will discuss goals or priorities with older patients. Secondary analysis of cross-sectional data from the 2014 Commonwealth Fund International Health Policy Survey of Older Adults. 11 western countries. Community-dwelling adults, aged 55 or older. Assessment of goals and priorities. The final sample size consisted of 17,222 respondents, 54% of whom reported an assessment of their goals and priorities (AGP) by healthcare professionals. In logistic regression model 1, which was used to analyse the entire population, the determinants found to have moderate to large effects on the likelihood of AGP were information exchange on stress, diet or exercise, or both. Country (living in Sweden) and continuity of care (no regular professional or organisation) had moderate to large negative effects on the likelihood of AGP. In model 2, which focussed on respondents who experienced continuity of care, country and information exchange on stress and lifestyle were the main determinants of AGP, with comparable odds ratios to model 1. Furthermore, a professional asking questions also increased the likelihood of AGP. Continuity of care and information exchange is associated with a higher probability of AGP, while people living in Sweden are less likely to experience these assessments. Further study is required to determine whether increasing information exchange and professionals asking more questions may improve goal setting with older patients. Key points: A patient goal-oriented approach can be beneficial for older patients with chronic conditions or multimorbidity; however, discussing goals with these patients is not a common practice. The likelihood of discussing goals varies by country, occurring most commonly in the USA, and least often in Sweden. Country-level differences in continuity of care and questions asked by a regularly visited professional affect the goal discussion probability. Patient characteristics, including age, have less impact than expected on the likelihood of sharing goals.

  12. Chemical landscape analysis with the OpenTox framework.

    PubMed

    Jeliazkova, Nina; Jeliazkov, Vedrin

    2012-01-01

    The Structure-Activity Relationships (SAR) landscape and activity cliffs concepts have their origins in medicinal chemistry and receptor-ligand interactions modelling. While intuitive, the definition of an activity cliff as a "pair of structurally similar compounds with large differences in potency" is commonly recognized as ambiguous. This paper proposes a new and efficient method for identifying activity cliffs and visualization of activity landscapes. The activity cliffs definition could be improved to reflect not the cliff steepness alone, but also the rate of the change of the steepness. The method requires explicitly setting similarity and activity difference thresholds, but provides means to explore multiple thresholds and to visualize in a single map how the thresholds affect the activity cliff identification. The identification of the activity cliffs is addressed by reformulating the problem as a statistical one, by introducing a probabilistic measure, namely, calculating the likelihood of a compound having large activity difference compared to other compounds, while being highly similar to them. The likelihood is effectively a quantification of a SAS Map with defined thresholds. Calculating the likelihood relies on four counts only, and does not require the pairwise matrix storage. This is a significant advantage, especially when processing large datasets. The method generates a list of individual compounds, ranked according to the likelihood of their involvement in the formation of activity cliffs, and goes beyond characterizing cliffs by structure pairs only. The visualisation is implemented by considering the activity plane fixed and analysing the irregularities of the similarity itself. It provides a convenient analogy to a topographic map and may help identifying the most appropriate similarity representation for each specific SAR space. The proposed method has been applied to several datasets, representing different biological activities. Finally, the method is implemented as part of an existing open source Ambit package and could be accessed via an OpenTox API compliant web service and via an interactive application, running within a modern, JavaScript enabled web browser. Combined with the functionalities already offered by the OpenTox framework, like data sharing and remote calculations, it could be a useful tool for exploring chemical landscapes online.

  13. Mixture Factor Analysis for Approximating a Nonnormally Distributed Continuous Latent Factor with Continuous and Dichotomous Observed Variables

    ERIC Educational Resources Information Center

    Wall, Melanie M.; Guo, Jia; Amemiya, Yasuo

    2012-01-01

    Mixture factor analysis is examined as a means of flexibly estimating nonnormally distributed continuous latent factors in the presence of both continuous and dichotomous observed variables. A simulation study compares mixture factor analysis with normal maximum likelihood (ML) latent factor modeling. Different results emerge for continuous versus…

  14. The Buccaneer software for automated model building. 1. Tracing protein chains.

    PubMed

    Cowtan, Kevin

    2006-09-01

    A new technique for the automated tracing of protein chains in experimental electron-density maps is described. The technique relies on the repeated application of an oriented electron-density likelihood target function to identify likely C(alpha) positions. This function is applied both in the location of a few promising 'seed' positions in the map and to grow those initial C(alpha) positions into extended chain fragments. Techniques for assembling the chain fragments into an initial chain trace are discussed.

  15. Mapping soil types from multispectral scanner data.

    NASA Technical Reports Server (NTRS)

    Kristof, S. J.; Zachary, A. L.

    1971-01-01

    Multispectral remote sensing and computer-implemented pattern recognition techniques were used for automatic 'mapping' of soil types. This approach involves subjective selection of a set of reference samples from a gray-level display of spectral variations which was generated by a computer. Each resolution element is then classified using a maximum likelihood ratio. Output is a computer printout on which the researcher assigns a different symbol to each class. Four soil test areas in Indiana were experimentally examined using this approach, and partially successful results were obtained.
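
    A compact sketch of this per-pixel maximum-likelihood classification: each class is modeled as a multivariate Gaussian estimated from reference samples and each pixel is assigned to the class with the highest log-likelihood. The synthetic 4-band training data are placeholders:

      import numpy as np
      from scipy.stats import multivariate_normal

      rng = np.random.default_rng(3)
      train = {c: rng.normal(10.0 * c, 2.0, size=(50, 4)) for c in range(3)}
      models = {c: multivariate_normal(X.mean(0), np.cov(X.T))
                for c, X in train.items()}              # per-class Gaussians

      pixels = rng.normal(10.0, 6.0, size=(6, 4))       # 4-band pixel vectors
      labels = [max(models, key=lambda c: models[c].logpdf(px)) for px in pixels]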

  16. Maximum a posteriori decoder for digital communications

    NASA Technical Reports Server (NTRS)

    Altes, Richard A. (Inventor)

    1997-01-01

    A system and method for decoding by identification of the most likely phase coded signal corresponding to received data. The present invention has particular application to communication with signals that experience spurious random phase perturbations. The generalized estimator-correlator uses a maximum a posteriori (MAP) estimator to generate phase estimates for correlation with incoming data samples and for correlation with mean phases indicative of unique hypothesized signals. The result is a MAP likelihood statistic for each hypothesized transmission, wherein the highest value statistic identifies the transmitted signal.

  17. Feature Statistics Modulate the Activation of Meaning during Spoken Word Processing

    ERIC Educational Resources Information Center

    Devereux, Barry J.; Taylor, Kirsten I.; Randall, Billi; Geertzen, Jeroen; Tyler, Lorraine K.

    2016-01-01

    Understanding spoken words involves a rapid mapping from speech to conceptual representations. One distributed feature-based conceptual account assumes that the statistical characteristics of concepts' features--the number of concepts they occur in ("distinctiveness/sharedness") and likelihood of co-occurrence ("correlational…

  18. Localizing multiple X chromosome-linked retinitis pigmentosa loci using multilocus homogeneity tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ott, J.; Terwilliger, J.D.; Bhattacharya, S.

    1990-01-01

    Multilocus linkage analysis of 62 family pedigrees with X chromosome-linked retinitis pigmentosa (XLRP) was undertaken to determine the presence of possible multiple disease loci and to reliably estimate their map locations. Multilocus homogeneity tests furnish convincing evidence for the presence of two XLRP loci, the likelihood ratio being 6.4 × 10^9:1 in favor of two versus a single XLRP locus, and give accurate estimates of their map locations. In 60-75% of the families, the location of an XLRP gene was estimated at 1 centimorgan distal to OTC, and in 25-40% of the families, an XLRP locus was located halfway between DXS14 (p58-1) and DXZ1 (Xcen), with an estimated recombination fraction of 25% between the two XLRP loci. There is also good evidence for a third XLRP locus, midway between DXS28 (C7) and DXS164 (pERT87), supported by a likelihood ratio of 293:1 for three versus two XLRP loci.

  19. Using variable rate models to identify genes under selection in sequence pairs: their validity and limitations for EST sequences.

    PubMed

    Church, Sheri A; Livingstone, Kevin; Lai, Zhao; Kozik, Alexander; Knapp, Steven J; Michelmore, Richard W; Rieseberg, Loren H

    2007-02-01

    Using likelihood-based variable selection models, we determined whether positive selection was acting on 523 EST sequence pairs from two lineages of sunflower and lettuce. Variable rate models are generally not used for comparisons of sequence pairs due to the limited information and the inaccuracy of estimates of specific substitution rates. However, previous studies have shown that the likelihood ratio test (LRT) is reliable for detecting positive selection, even with low numbers of sequences. These analyses identified 56 genes that show a signature of selection, of which 75% were not identified by simpler models that average selection across codons. Subsequent mapping studies in sunflower showed that four of the five positively selected genes identified by these methods mapped to domestication QTLs. We discuss the validity and limitations of using variable rate models for comparisons of sequence pairs, as well as the limitations of using ESTs for identification of positively selected genes.
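
    The test statistic itself is simple to compute once the two models have been fitted. A minimal sketch, with hypothetical log-likelihoods and degrees of freedom:

      from scipy.stats import chi2

      ll_null, ll_alt, extra_params = -1234.5, -1228.1, 2   # hypothetical values
      lrt = 2.0 * (ll_alt - ll_null)                        # likelihood ratio statistic
      p_value = chi2.sf(lrt, df=extra_params)
      print(lrt, p_value)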

  20. A heuristic multi-criteria classification approach incorporating data quality information for choropleth mapping

    PubMed Central

    Sun, Min; Wong, David; Kronenfeld, Barry

    2016-01-01

    Despite conceptual and technological advancements in cartography over the decades, choropleth map design and classification fail to address a fundamental issue: estimates that are not statistically different may be assigned to different classes on maps, or vice versa. Recently, the class separability concept was introduced as a map classification criterion to evaluate the likelihood that estimates in two classes are statistically different. Unfortunately, choropleth maps created according to the separability criterion usually have highly unbalanced classes. To produce reasonably separable but more balanced classes, we propose a heuristic classification approach that considers not just the class separability criterion but also other classification criteria such as evenness and intra-class variability. A geovisual-analytic package was developed to support the heuristic mapping process, to evaluate the trade-off between relevant criteria, and to select the most preferable classification. Class break values can be adjusted to improve the performance of a classification. PMID:28286426

  1. Audio Tracking in Noisy Environments by Acoustic Map and Spectral Signature.

    PubMed

    Crocco, Marco; Martelli, Samuele; Trucco, Andrea; Zunino, Andrea; Murino, Vittorio

    2018-05-01

    A novel method is proposed for generic target tracking by audio measurements from a microphone array. To cope with noisy environments characterized by persistent and high energy interfering sources, a classification map (CM) based on spectral signatures is calculated by means of a machine learning algorithm. Next, the CM is combined with the acoustic map, describing the spatial distribution of sound energy, in order to obtain a cleaned joint map in which contributions from the disturbing sources are removed. A likelihood function is derived from this map and fed to a particle filter yielding the target location estimation on the acoustic image. The method is tested on two real environments, addressing both speaker and vehicle tracking. The comparison with a couple of trackers, relying on the acoustic map only, shows a sharp improvement in performance, paving the way to the application of audio tracking in real challenging environments.

  2. MapEdit: solution to continuous raster map creation

    NASA Astrophysics Data System (ADS)

    Rančić, Dejan; Djordjević-Kajan, Slobodanka

    2003-03-01

    The paper describes MapEdit, MS Windows™ software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps which can be used as backgrounds in geographical information systems. The process of continuous raster map creation using the MapEdit "mosaicking" function is also described, as well as the georeferencing and rectification algorithms used in MapEdit. Our approach to georeferencing and rectification, using four control points and two linear transformations for each scanned map part, together with a nearest-neighbor resampling method, represents a low-cost, high-speed solution that produces continuous raster maps of satisfactory quality for many purposes (±1 pixel). A quality assessment of several continuous raster maps at different scales, created using our software and methodology, has been undertaken, and the results are presented in the paper. For the quality control of the produced raster maps we referred to three widely adopted standards: the US Standard for Digital Cartographic Data, the National Standard for Spatial Data Accuracy, and the US National Map Accuracy Standard. The results obtained during the quality assessment process show that our maps meet all three standards.
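
    A simplified single-transform version of the georeferencing step can be written as a least-squares affine fit from pixel to map coordinates (the paper applies two linear transformations per scanned map part; the control-point values below are invented):

      import numpy as np

      px = np.array([[0, 0], [4000, 0], [0, 3000], [4000, 3000]], float)  # pixels
      geo = np.array([[500000, 4700000], [502000, 4700010],
                      [500005, 4698500], [502010, 4698510]], float)       # map, m

      A = np.hstack([px, np.ones((4, 1))])             # [x, y, 1] design matrix
      coef, *_ = np.linalg.lstsq(A, geo, rcond=None)   # 3x2 affine parameters

      def to_map(p):                                   # apply the fitted transform
          return np.hstack([p, np.ones((len(p), 1))]) @ coef

      print(to_map(np.array([[2000.0, 1500.0]])))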

  3. Self-Organizing Hidden Markov Model Map (SOHMMM): Biological Sequence Clustering and Cluster Visualization.

    PubMed

    Ferles, Christos; Beaufort, William-Scott; Ferle, Vanessa

    2017-01-01

    The present study devises mapping methodologies and projection techniques that visualize and demonstrate biological sequence data clustering results. The Sequence Data Density Display (SDDD) and Sequence Likelihood Projection (SLP) visualizations represent the input symbolical sequences in a lower-dimensional space in such a way that the clusters and relations of data elements are depicted graphically. Both operate in combination/synergy with the Self-Organizing Hidden Markov Model Map (SOHMMM). The resulting unified framework is in a position to analyze raw sequence data automatically and directly. This analysis is carried out with little, or even complete absence of, prior information/domain knowledge.

  4. Maximum Likelihood Item Easiness Models for Test Theory without an Answer Key

    ERIC Educational Resources Information Center

    France, Stephen L.; Batchelder, William H.

    2015-01-01

    Cultural consensus theory (CCT) is a data aggregation technique with many applications in the social and behavioral sciences. We describe the intuition and theory behind a set of CCT models for continuous type data using maximum likelihood inference methodology. We describe how bias parameters can be incorporated into these models. We introduce…

  5. Planning applications in East Central Florida

    NASA Technical Reports Server (NTRS)

    Hannah, J. W. (Principal Investigator); Thomas, G. L.; Esparza, F.; Millard, J. J.

    1974-01-01

    The author has identified the following significant results. This is a study of applications of ERTS data to planning problems, especially as applicable to East Central Florida. The primary method has been computer analysis of digital data, with visual analysis of images serving to supplement the digital analysis. The principal method of analysis was supervised maximum likelihood classification, supplemented by density slicing and mapping of ratios of band intensities. Land-use maps have been prepared for several urban and non-urban sectors. Thematic maps have been found to be a useful form of the land-use maps. Change-monitoring has been found to be an appropriate and useful application. Mapping of marsh regions has been found effective and useful in this region. Local planners have participated in selecting training samples and in the checking and interpretation of results.

  6. Bi-orthogonal Symbol Mapping and Detection in Optical CDMA Communication System

    NASA Astrophysics Data System (ADS)

    Liu, Maw-Yang

    2017-12-01

    In this paper, the bi-orthogonal symbol mapping and detection scheme is investigated in a time-spreading wavelength-hopping optical CDMA communication system. The carrier-hopping prime code is exploited as the signature sequence, whose out-of-phase autocorrelation is zero. Based on the orthogonality of the carrier-hopping prime code, an equal-weight orthogonal signaling scheme can be constructed, and the proposed scheme using bi-orthogonal symbol mapping and detection can be developed. The transmitted binary data bits are mapped into corresponding bi-orthogonal symbols, where the orthogonal matrix code and its complement are utilized. In the receiver, the received bi-orthogonal data symbol is fed into the maximum likelihood decoder for detection. Under such symbol mapping and detection, the proposed scheme can greatly enlarge the Euclidean distance; hence, the system performance can be drastically improved.
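
    A hedged sketch of the detection rule: correlate the received vector with every orthogonal codeword and its complement and pick the largest correlation. A Hadamard codebook stands in for the carrier-hopping prime-code construction described above:

      import numpy as np
      from scipy.linalg import hadamard

      H = hadamard(8).astype(float)            # 8 orthogonal codewords
      codebook = np.vstack([H, -H])            # plus complements: bi-orthogonal set

      rng = np.random.default_rng(4)
      sent = codebook[5]                       # one of 16 symbols = 4 data bits
      received = sent + rng.normal(0, 0.8, size=8)      # AWGN channel

      symbol_hat = int(np.argmax(codebook @ received))  # ML = maximum correlation

    For equal-energy codewords, picking the maximum correlation is equivalent to maximum-likelihood detection under Gaussian noise.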

  7. A Real-World Study of Switching From Allopurinol to Febuxostat in a Health Plan Database.

    PubMed

    Altan, Aylin; Shiozawa, Aki; Bancroft, Tim; Singh, Jasvinder A

    2015-12-01

    The objective of this study was to assess the real-world comparative effectiveness of continuing on allopurinol versus switching to febuxostat. In a retrospective claims data study of enrollees in health plans affiliated with Optum, we evaluated patients from February 1, 2009, to May 31, 2012, with a gout diagnosis, a pharmacy claim for allopurinol or febuxostat, and at least 1 serum uric acid (SUA) result available during the follow-up period. Univariate and multivariable-adjusted analyses (controlling for patient demographics and clinical factors) assessed the likelihood of SUA lowering and achievement of target SUA of less than 6.0 mg/dL or less than 5.0 mg/dL in allopurinol continuers versus febuxostat switchers. The final study population included 748 subjects who switched to febuxostat from allopurinol and 4795 continuing users of allopurinol. The most common doses of allopurinol were 300 mg/d or less in 95% of allopurinol continuers and 93% of febuxostat switchers (prior to switching); the most common dose of febuxostat was 40 mg/d, in 77% of febuxostat switchers (after switching). Compared with allopurinol continuers, febuxostat switchers had greater (1) mean preindex SUA, 8.0 mg/dL versus 6.6 mg/dL (P < 0.001); (2) likelihood of postindex SUA of less than 6.0 mg/dL, 62.2% versus 58.7% (P = 0.072); (3) likelihood of postindex SUA of less than 5.0 mg/dL, 38.9% versus 29.6% (P < 0.001); and (4) decrease in SUA, 1.8 (SD, 2.2) mg/dL versus 0.4 (SD, 1.7) mg/dL (P < 0.001). In multivariable-adjusted analyses, compared with allopurinol continuers, febuxostat switchers had significantly higher likelihood of achieving SUA of less than 6.0 mg/dL (40% higher) and SUA of less than 5.0 mg/dL (83% higher). In this "real-world" setting, many patients with gout, not surprisingly, were not treated with the maximum permitted doses of allopurinol. Patients switched to febuxostat were more likely to achieve target SUA levels than those who continued on generally stable doses of allopurinol.

  8. The evaluation of multi-structure, multi-atlas pelvic anatomy features in a prostate MR lymphography CAD system

    NASA Astrophysics Data System (ADS)

    Meijs, M.; Debats, O.; Huisman, H.

    2015-03-01

    In prostate cancer, the detection of metastatic lymph nodes indicates progression from localized disease to metastasized cancer. The detection of positive lymph nodes is, however, a complex and time consuming task for experienced radiologists. Assistance of a two-stage Computer-Aided Detection (CAD) system in MR Lymphography (MRL) is not yet feasible due to the large number of false positives in the first stage of the system. By introducing a multi-structure, multi-atlas segmentation, using an affine transformation followed by a B-spline transformation for registration, the organ location is given by a mean density probability map. The atlas segmentation is semi-automatically drawn with ITK-SNAP, using Active Contour Segmentation. Each anatomic structure is identified by a label number. Registration is performed using Elastix, using Mutual Information and an Adaptive Stochastic Gradient optimization. The dataset consists of the MRL scans of ten patients, with lymph nodes manually annotated in consensus by two expert readers. The feature map of the CAD system consists of the Multi-Atlas and various other features (e.g. Normalized Intensity and multi-scale Blobness). The voxel-based Gentleboost classifier is evaluated using ROC analysis with cross validation. We show in a set of 10 studies that adding multi-structure, multi-atlas anatomical structure likelihood features improves the quality of the lymph node voxel likelihood map. Multiple structure anatomy maps may thus make MRL CAD more feasible.

  9. Using NASA Satellite Observations to Map Wildfire Risk in the United States for Allocation of Fire Management Resources

    NASA Astrophysics Data System (ADS)

    Farahmand, A.; Reager, J. T., II; Behrangi, A.; Stavros, E. N.; Randerson, J. T.

    2017-12-01

    Fires are a key disturbance globally, acting as a catalyst for terrestrial ecosystem change and contributing significantly to both carbon emissions and changes in surface albedo. The socioeconomic impacts of wildfires are also significant, with wildfire activity resulting in billions of dollars of losses every year. Fire size, area burned, and frequency are increasing; assessing the likelihood of fire danger, defined by the United States National Interagency Fire Center (NIFC) as the demand for fire management resources as a function of how far the flammability of fuels (a function of ignitability, consumability, and availability) departs from normal, is thus an important step toward reducing costs associated with wildfires. Numerous studies have aimed to predict the likelihood of fire danger, but few studies use remote sensing data to map fire danger at scales commensurate with regional management decisions (e.g., deployment of resources nationally throughout fire season with seasonal and monthly prediction). Here, we use NASA Gravity Recovery And Climate Experiment (GRACE) assimilated surface soil moisture, NASA Atmospheric Infrared Sounder (AIRS) vapor pressure deficit, and NASA Moderate Resolution Imaging Spectroradiometer (MODIS) enhanced vegetation index and land cover products, along with US Forest Service historical fire activity data, to generate probabilistic monthly fire potential maps for the United States. These maps can be useful not only in government operational allocation of fire management resources, but also in improving understanding of the Earth System and how it is changing, in order to refine predictions of fire extremes.

  10. Linkage disequilibrium interval mapping of quantitative trait loci.

    PubMed

    Boitard, Simon; Abdallah, Jihad; de Rochambeau, Hubert; Cierco-Ayrolles, Christine; Mangin, Brigitte

    2006-03-16

    For many years gene mapping studies have been performed through linkage analyses based on pedigree data. Recently, linkage disequilibrium methods based on unrelated individuals have been advocated as powerful tools to refine estimates of gene location. Many strategies have been proposed to deal with simply inherited disease traits. However, locating quantitative trait loci is statistically more challenging and considerable research is needed to provide robust and computationally efficient methods. Under a three-locus Wright-Fisher model, we derived approximate expressions for the expected haplotype frequencies in a population. We considered haplotypes comprising one trait locus and two flanking markers. Using these theoretical expressions, we built a likelihood-maximization method, called HAPim, for estimating the location of a quantitative trait locus. For each postulated position, the method only requires information from the two flanking markers. Over a wide range of simulation scenarios it was found to be more accurate than a two-marker composite likelihood method. It also performed as well as identity by descent methods, whilst being valuable in a wider range of populations. Our method makes efficient use of marker information, and can be valuable for fine mapping purposes. Its performance is increased if multiallelic markers are available. Several improvements can be developed to account for more complex evolution scenarios or provide robust confidence intervals for the location estimates.

  11. Measuring galaxy cluster masses with CMB lensing using a Maximum Likelihood estimator: statistical and systematic error budgets for future experiments

    NASA Astrophysics Data System (ADS)

    Raghunathan, Srinivasan; Patil, Sanjaykumar; Baxter, Eric J.; Bianchini, Federico; Bleem, Lindsey E.; Crawford, Thomas M.; Holder, Gilbert P.; Manzotti, Alessandro; Reichardt, Christian L.

    2017-08-01

    We develop a Maximum Likelihood estimator (MLE) to measure the masses of galaxy clusters through the impact of gravitational lensing on the temperature and polarization anisotropies of the cosmic microwave background (CMB). We show that, at low noise levels in temperature, this optimal estimator outperforms the standard quadratic estimator by a factor of two. For polarization, we show that the Stokes Q/U maps can be used instead of the traditional E- and B-mode maps without losing information. We test and quantify the bias in the recovered lensing mass for a comprehensive list of potential systematic errors. Using realistic simulations, we examine the cluster mass uncertainties from CMB-cluster lensing as a function of an experiment's beam size and noise level. We predict the cluster mass uncertainties will be 3 - 6% for SPT-3G, AdvACT, and Simons Array experiments with 10,000 clusters and less than 1% for the CMB-S4 experiment with a sample containing 100,000 clusters. The mass constraints from CMB polarization are very sensitive to the experimental beam size and map noise level: for a factor of three reduction in either the beam size or noise level, the lensing signal-to-noise improves by roughly a factor of two.

  12. Multivariate normal maximum likelihood with both ordinal and continuous variables, and data missing at random.

    PubMed

    Pritikin, Joshua N; Brick, Timothy R; Neale, Michael C

    2018-04-01

    A novel method for the maximum likelihood estimation of structural equation models (SEM) with both ordinal and continuous indicators is introduced using a flexible multivariate probit model for the ordinal indicators. A full information approach ensures unbiased estimates for data missing at random. Exceeding the capability of prior methods, up to 13 ordinal variables can be included before integration time increases beyond 1 s per row. The method relies on the axiom of conditional probability to split apart the distribution of continuous and ordinal variables. Due to the symmetry of the axiom, two similar methods are available. A simulation study provides evidence that the two similar approaches offer equal accuracy. A further simulation is used to develop a heuristic to automatically select the most computationally efficient approach. Joint ordinal continuous SEM is implemented in OpenMx, free and open-source software.

  13. Defoliation potential of gypsy moth

    Treesearch

    David A. Gansner; David A. Drake; Stanford L. Arner; Rachel R. Hershey; Susan L. King

    1993-01-01

    A model that uses forest stand characteristics to estimate the likelihood of gypsy moth (Lymantria dispar L.) defoliation has been developed. It was applied to recent forest inventory plot data to produce susceptibility ratings and maps showing current defoliation potential in a seven-state area where gypsy moth is an immediate threat.

  14. Maximum-likelihood block detection of noncoherent continuous phase modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K.; Divsalar, Dariush

    1993-01-01

    This paper examines maximum-likelihood block detection of uncoded full-response CPM over an additive white Gaussian noise (AWGN) channel. Both the maximum-likelihood metrics and the bit error probability performances of the associated detection algorithms are considered. The special and popular case of minimum-shift keying (MSK), corresponding to h = 0.5 and a constant-amplitude frequency pulse, is treated separately. The many new receiver structures that result from this investigation can be compared to the traditional ones that have been used in the past, both from the standpoint of simplicity of implementation and optimality of performance.

  15. Design of simplified maximum-likelihood receivers for multiuser CPM systems.

    PubMed

    Bing, Li; Bai, Baoming

    2014-01-01

    A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases) reduced complexity and marginal performance degradation.

  16. TU-FG-201-12: Designing a Risk-Based Quality Assurance Program for a Newly Implemented Y-90 Microspheres Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vile, D; Zhang, L; Cuttino, L

    2016-06-15

    Purpose: To create a quality assurance program based upon a risk-based assessment of a newly implemented SirSpheres Y-90 procedure. Methods: A process map was created for a newly implemented SirSpheres procedure at a community hospital. The process map documented each step of this collaborative procedure, as well as the roles and responsibilities of each member. From the process map, different potential failure modes were determined, as well as any current controls in place. From this list, a full failure mode and effects analysis (FMEA) was performed by grading each failure mode's likelihood of occurrence, likelihood of detection, and potential severity. These numbers were then multiplied to compute the risk priority number (RPN) for each potential failure mode. Failure modes were then ranked based on their RPN. Additional controls were then added, with failure modes corresponding to the highest RPNs taking priority. Results: A process map was created that succinctly outlined each step in the SirSpheres procedure in its current implementation. From this, 72 potential failure modes were identified and ranked according to their associated RPN. Quality assurance controls and safety barriers were then added, with failure modes associated with the highest risk being addressed first. Conclusion: A quality assurance program was created from a risk-based assessment of the SirSpheres process. Process mapping and FMEA were effective in identifying potential high-risk failure modes for this new procedure, which were prioritized for new quality assurance controls. TG 100 recommends the fault tree analysis methodology to design a comprehensive and effective QC/QM program, yet we found that simply introducing additional safety barriers to address high-RPN failure modes makes the whole process simpler and safer.
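
    The RPN arithmetic is straightforward; a tiny illustration with fabricated failure modes and scores (each scored for occurrence, detectability, and severity):

      failure_modes = [               # (name, occurrence, detectability, severity)
          ("wrong activity drawn", 3, 4, 9),
          ("catheter mispositioned", 2, 3, 8),
          ("dose record transcription", 4, 2, 5),
      ]
      ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
      for name, occ, det, sev in ranked:
          print(f"RPN={occ * det * sev:3d}  {name}")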

  17. Relative impact of previous disturbance history on the likelihood of additional disturbance in the Northern United States Forest Service USFS Region

    NASA Astrophysics Data System (ADS)

    Hernandez, A. J.

    2015-12-01

    The Landsat archive is increasingly being used to detect trends in the occurrence of forest disturbance. Beyond information about the amount of area affected, forest managers need to know if and how disturbance regimes change. The National Forest System (NFS) has developed a comprehensive plan for carbon monitoring that requires a detailed temporal mapping of forest disturbances across 75 million hectares. A long-term annual time series that shows the timing, extent, and type of disturbance beginning in 1990 and ending in 2011 has been prepared for several USFS Regions, including the Northern Region. Our mapping starts with an automated detection of annual disturbances using a time series of historical Landsat imagery. Automated detections are meticulously inspected, corrected, and labeled using various USFS ancillary datasets. The resulting maps of verified disturbance show the timing and type of each event; the types are fires, harvests, insect activity, disease, and abiotic (wind, drought, avalanche) damage. Also, the magnitude of each change event is modeled in terms of the proportion of canopy cover lost. The sequence of disturbances for every pixel since 1990 has been consistently mapped and is available across the entirety of NFS. Our datasets contain sufficient information to describe the frequency of stand replacement, as well as how often disturbance results in only a partial loss of canopy. This information provides empirical insight into how an initial disturbance may predispose a stand to further disturbance, and it also shows a climatic signal in the occurrence of processes such as fire and insect epidemics. Thus, we have the information to model the likelihood of occurrence of certain disturbances after a given event (i.e., if we have a fire in the past, what does that do to the likelihood of occurrence of insects in the future?). Here, we explore whether previous disturbance history is a reliable predictor of additional disturbance in the future, and we present results of applying logistic regression to obtain predicted probabilities of occurrence of additional disturbance types. We describe responses in additional disturbance and prominent trends for each major forest type.
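
    A schematic of the final modeling step: logistic regression of subsequent-disturbance occurrence on prior disturbance history. The feature names and data below are placeholders, not the authors' Landsat-derived dataset:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(5)
      X = rng.random((500, 3))   # e.g. years since fire, harvest flag, canopy loss
      y = (rng.random(500) < 0.2 + 0.5 * X[:, 2]).astype(int)  # later disturbance?

      model = LogisticRegression().fit(X, y)
      p_next = model.predict_proba(X[:5])[:, 1]  # predicted disturbance probability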

  18. Measuring and partitioning the high-order linkage disequilibrium by multiple order Markov chains.

    PubMed

    Kim, Yunjung; Feng, Sheng; Zeng, Zhao-Bang

    2008-05-01

    A map of the background levels of disequilibrium between nearby markers can be useful for association mapping studies. In order to assess the background levels of linkage disequilibrium (LD), multilocus LD measures are more advantageous than pairwise LD measures because the combined analysis of pairwise LD measures is not adequate to detect simultaneous allele associations among multiple markers. Various multilocus LD measures based on haplotypes have been proposed. However, most of these measures provide a single index of association among multiple markers and do not reveal the complex patterns and different levels of LD structure. In this paper, we employ non-homogeneous, multiple-order Markov chain models as a statistical framework to measure and partition the LD among multiple markers into components due to different orders of marker associations. Using a sliding window of multiple markers on phased haplotype data, we compute the corresponding likelihoods for different Markov chain (MC) orders in each window. The log-likelihood difference between the lowest-order model (MC0) and the highest-order model in each window is used as a measure of the total LD, or the overall deviation from gametic equilibrium, for the window. We then partition the total LD into lower-order disequilibria and estimate the effects from two-, three-, and higher-order disequilibria. The relationship between different orders of LD and the log-likelihood difference between two different orders of MC models is explored. By applying our method to the phased haplotype data in the ENCODE regions of the HapMap project, we are able to identify high- and low-multilocus-LD regions. Our results reveal that most of the LD in the HapMap data is attributable to LD between adjacent pairs of markers across the whole region. LD between adjacent pairs of markers appears to be more significant in high multilocus LD regions than in low multilocus LD regions. We also find that as the multilocus total LD increases, the effects of high-order LD tend to get weaker due to the lack of observed multilocus haplotypes. The overall estimates of first-, second-, third-, and fourth-order LD across the ENCODE regions are 64%, 23%, 9%, and 3%, respectively.
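
    The window-level computation described above can be sketched compactly. The snippet below is a simplified illustration rather than the authors' implementation: it estimates maximum-likelihood log-likelihoods for non-homogeneous Markov chain models of different orders and takes the MC0-versus-highest-order difference as the total LD for a hypothetical window.

```python
from collections import Counter
from math import log

def mc_loglik(haplotypes, order):
    """Maximum-likelihood log-likelihood of phased haplotypes under a
    non-homogeneous Markov chain of the given order (transition counts
    are keyed by marker position, matching the non-homogeneous setup)."""
    trans, context = Counter(), Counter()
    for h in haplotypes:
        for i, allele in enumerate(h):
            ctx = (i, h[max(0, i - order):i])
            trans[(ctx, allele)] += 1
            context[ctx] += 1
    return sum(n * log(n / context[ctx]) for (ctx, _), n in trans.items())

# Hypothetical sliding window of phased haplotypes, alleles coded 0/1.
window = ["0101", "0101", "1010", "0111", "0101", "1010"]

# Total LD: log-likelihood gain of the saturated (highest-order) model
# over the independence model MC0, as described in the abstract.
total_ld = mc_loglik(window, order=len(window[0]) - 1) - mc_loglik(window, order=0)
print(f"total LD (log-likelihood difference): {total_ld:.3f}")
```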

  19. Low Altitude AVIRIS Data for Mapping Land Cover in Yellowstone National Park: Use of Isodata Clustering Techniques

    NASA Technical Reports Server (NTRS)

    Spruce, Joe

    2001-01-01

    Yellowstone National Park (YNP) contains a diversity of land cover. YNP managers need site-specific land cover maps, which may be produced more effectively using high-resolution hyperspectral imagery. ISODATA clustering techniques have aided operational multispectral image classification and may benefit certain hyperspectral data applications if optimally applied. In response, a study was performed for an area in northeast YNP using 11 selected bands of low-altitude AVIRIS data calibrated to ground reflectance. These data were subjected to ISODATA clustering and Maximum Likelihood classification techniques to produce a moderately detailed land cover map. The resulting map shows good overall agreement with field surveys and aerial photo interpretation.
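
    For readers unfamiliar with the Maximum Likelihood classification step, here is a minimal Gaussian maximum-likelihood classifier sketch; the band count, class statistics, and pixels are hypothetical, and in the workflow above the class means and covariances would be derived from the ISODATA clusters.

```python
import numpy as np

def ml_classify(pixels, class_means, class_covs):
    """Gaussian maximum-likelihood classification: assign each pixel's
    spectral vector to the class with the highest log-likelihood."""
    scores = []
    for mu, cov in zip(class_means, class_covs):
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        d = pixels - mu
        # log N(x | mu, cov) up to a constant shared by all classes
        scores.append(-0.5 * (logdet + np.einsum("ij,jk,ik->i", d, inv, d)))
    return np.argmax(scores, axis=0)

# Hypothetical 2-band reflectance data and two spectral classes.
rng = np.random.default_rng(0)
means = [np.array([0.2, 0.6]), np.array([0.5, 0.3])]
covs = [0.01 * np.eye(2), 0.02 * np.eye(2)]
pixels = rng.normal(0.4, 0.2, size=(5, 2))
print(ml_classify(pixels, means, covs))
```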

  20. CarthaGene: multipopulation integrated genetic and radiation hybrid mapping.

    PubMed

    de Givry, Simon; Bouchez, Martin; Chabrier, Patrick; Milan, Denis; Schiex, Thomas

    2005-04-15

    CarthaGene is an integrated genetic and radiation hybrid (RH) mapping tool that can deal with multiple populations, including mixtures of genetic and RH data. CarthaGene performs multipoint maximum likelihood estimations with accelerated expectation-maximization algorithms for some pedigrees and has sophisticated algorithms for marker ordering. Dedicated heuristics for framework mapping are also included. CarthaGene can be used as a C++ library, through a shell command, and through a graphical interface. XML output for companion tools is integrated. The program is available free of charge from www.inra.fr/bia/T/CarthaGene for Linux, Windows and Solaris machines (with open source). tschiex@toulouse.inra.fr.

  1. A tree island approach to inferring phylogeny in the ant subfamily Formicinae, with especial reference to the evolution of weaving.

    PubMed

    Johnson, Rebecca N; Agapow, Paul-Michael; Crozier, Ross H

    2003-11-01

    The ant subfamily Formicinae is a large assemblage (2458 species; J. Nat. Hist. 29 (1995) 1037), including species that weave leaf nests together with larval silk and in which the metapleural gland (the ancestrally defining ant character) has been secondarily lost. We used sequences from two mitochondrial genes (cytochrome b and cytochrome oxidase 2) from 18 formicine and 4 outgroup taxa to derive a robust phylogeny, employing a search for tree islands using 10,000 randomly constructed trees as starting points and deriving a maximum likelihood consensus tree from the ML tree and those not significantly different from it. Non-parametric bootstrapping showed that the ML consensus tree fit the data significantly better than three scenarios based on morphology, with that of Bolton (Identification Guide to the Ant Genera of the World, Harvard University Press, Cambridge, MA) being the best among these alternative trees. Trait mapping showed that weaving had arisen at least four times and possibly been lost once. A maximum likelihood analysis showed that loss of the metapleural gland is significantly associated with the weaver life-pattern. The graph of the frequencies with which trees were discovered versus their likelihood indicates that trees with high likelihoods have much larger basins of attraction than those with lower likelihoods. While this result indicates that single searches are more likely to find high- than low-likelihood tree islands, it also indicates that searching only for the single best tree may lose important information.

  2. Indoor Positioning System Using Magnetic Field Map Navigation and an Encoder System

    PubMed Central

    Kim, Han-Sol; Seo, Woojin; Baek, Kwang-Ryul

    2017-01-01

    In the indoor environment, variation of the magnetic field is caused by building structures, and magnetic field map navigation is based on this feature. In order to estimate position using this navigation, a three-axis magnetic field must be measured at every point to build a magnetic field map. After the magnetic field map is obtained, the position of the mobile robot can be estimated with a likelihood function that compares the measured magnetic field data with the magnetic field map. However, if only magnetic field map navigation is used, the estimated position can have large errors. In order to improve performance, we propose a particle filter system that integrates magnetic field map navigation and an encoder system. In this paper, multiple magnetic sensors and three magnetic field maps (a horizontal intensity map, a vertical intensity map, and a direction information map) are used to update the weights of particles. As a result, the proposed system estimates the position and orientation of a mobile robot more accurately than previous systems. We also show that system performance improves as the number of magnetic sensors increases. Finally, experimental results from the implemented and evaluated system are presented. PMID:28327513

  3. Indoor Positioning System Using Magnetic Field Map Navigation and an Encoder System.

    PubMed

    Kim, Han-Sol; Seo, Woojin; Baek, Kwang-Ryul

    2017-03-22

    In the indoor environment, variation of the magnetic field is caused by building structures, and magnetic field map navigation is based on this feature. In order to estimate position using this navigation, a three-axis magnetic field must be measured at every point to build a magnetic field map. After the magnetic field map is obtained, the position of the mobile robot can be estimated with a likelihood function that compares the measured magnetic field data with the magnetic field map. However, if only magnetic field map navigation is used, the estimated position can have large errors. In order to improve performance, we propose a particle filter system that integrates magnetic field map navigation and an encoder system. In this paper, multiple magnetic sensors and three magnetic field maps (a horizontal intensity map, a vertical intensity map, and a direction information map) are used to update the weights of particles. As a result, the proposed system estimates the position and orientation of a mobile robot more accurately than previous systems. We also show that system performance improves as the number of magnetic sensors increases. Finally, experimental results from the implemented and evaluated system are presented.
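
    A minimal sketch of the particle-filter update these two records describe: encoder odometry propagates position hypotheses, and the likelihood of the measured three-axis field given the map value at each particle reweights them. The field model, noise levels, and Gaussian likelihood form are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def field_map(p):
    """Hypothetical smooth stand-in for a measured three-axis magnetic
    field map, returning a field vector for each (x, y) position."""
    return np.stack([np.sin(p[:, 0]), np.cos(p[:, 1]), 0.1 * p[:, 0]], axis=1)

def update_weights(particles, weights, measured_B, sigma=2.0):
    """Reweight particles by the Gaussian likelihood of the measured
    field given the map value at each particle's position."""
    err2 = np.sum((field_map(particles) - measured_B) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * err2 / sigma**2)
    return weights / weights.sum()

particles = rng.uniform(0, 5, size=(1000, 2))           # (x, y) hypotheses
weights = np.full(len(particles), 1.0 / len(particles))

# Encoder odometry propagates every hypothesis, then a reading reweights.
particles += np.array([0.1, 0.0]) + rng.normal(0, 0.02, particles.shape)
weights = update_weights(particles, weights, measured_B=np.array([0.5, 0.8, 0.05]))
print("effective sample size:", 1.0 / np.sum(weights**2))
```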

  4. ATAC Autocuer Modeling Analysis.

    DTIC Science & Technology

    1981-01-01

    The analysis of the simple rectangular segmentation (1) is based on detection and estimation theory (2). This approach uses the concept of maximum ...continuous waveforms. In order to develop the principles of maximum likelihood, it is convenient to develop the principles for the "classical...the concept of maximum likelihood is significant in that it provides the optimum performance of the detection/estimation problem. With a knowledge of

  5. Planck 2015 results: I. Overview of products and scientific results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adam, R.; Ade, P. A. R.; Aghanim, N.

    The European Space Agency’s Planck satellite, which is dedicated to studying the early Universe and its subsequent evolution, was launched on 14 May 2009. It scanned the microwave and submillimetre sky continuously between 12 August 2009 and 23 October 2013. In February 2015, ESA and the Planck Collaboration released the second set of cosmology products based on data from the entire Planck mission, including both temperature and polarization, along with a set of scientific and technical papers and a web-based explanatory supplement. This study gives an overview of the main characteristics of the data and the data products in the release, as well as the associated cosmological and astrophysical science results and papers. The data products include maps of the cosmic microwave background (CMB), the thermal Sunyaev-Zeldovich effect, diffuse foregrounds in temperature and polarization, catalogues of compact Galactic and extragalactic sources (including separate catalogues of Sunyaev-Zeldovich clusters and Galactic cold clumps), and extensive simulations of signals and noise used in assessing uncertainties and the performance of the analysis methods. The likelihood code used to assess cosmological models against the Planck data is described, along with a CMB lensing likelihood. Finally, scientific results include cosmological parameters derived from CMB power spectra, gravitational lensing, and cluster counts, as well as constraints on inflation, non-Gaussianity, primordial magnetic fields, dark energy, and modified gravity, and new results on low-frequency Galactic foregrounds.

  6. Planck 2015 results: I. Overview of products and scientific results

    DOE PAGES

    Adam, R.; Ade, P. A. R.; Aghanim, N.; ...

    2016-09-20

    The European Space Agency’s Planck satellite, which is dedicated to studying the early Universe and its subsequent evolution, was launched on 14 May 2009. It scanned the microwave and submillimetre sky continuously between 12 August 2009 and 23 October 2013. In February 2015, ESA and the Planck Collaboration released the second set of cosmology products based on data from the entire Planck mission, including both temperature and polarization, along with a set of scientific and technical papers and a web-based explanatory supplement. This study gives an overview of the main characteristics of the data and the data products in the release, as well as the associated cosmological and astrophysical science results and papers. The data products include maps of the cosmic microwave background (CMB), the thermal Sunyaev-Zeldovich effect, diffuse foregrounds in temperature and polarization, catalogues of compact Galactic and extragalactic sources (including separate catalogues of Sunyaev-Zeldovich clusters and Galactic cold clumps), and extensive simulations of signals and noise used in assessing uncertainties and the performance of the analysis methods. The likelihood code used to assess cosmological models against the Planck data is described, along with a CMB lensing likelihood. Finally, scientific results include cosmological parameters derived from CMB power spectra, gravitational lensing, and cluster counts, as well as constraints on inflation, non-Gaussianity, primordial magnetic fields, dark energy, and modified gravity, and new results on low-frequency Galactic foregrounds.

  7. Data integration in physiology using Bayes’ rule and minimum Bayes’ factors: deubiquitylating enzymes in the renal collecting duct

    PubMed Central

    Xue, Zhe; Chen, Jia-Xu; Zhao, Yue; Medvar, Barbara

    2017-01-01

    A major challenge in physiology is to exploit the many large-scale data sets available from “-omic” studies to seek answers to key physiological questions. In previous studies, Bayes’ theorem has been used for this purpose. This approach requires a means to map continuously distributed experimental data to probabilities (likelihood values) to derive posterior probabilities from the combination of prior probabilities and new data. Here, we introduce the use of minimum Bayes’ factors for this purpose and illustrate the approach by addressing a physiological question, “Which deubiquitylating enzymes (DUBs) encoded by mammalian genomes are most likely to regulate plasma membrane transport processes in renal cortical collecting duct principal cells?” To do this, we have created a comprehensive online database of 110 DUBs present in the mammalian genome (https://hpcwebapps.cit.nih.gov/ESBL/Database/DUBs/). We used Bayes’ theorem to integrate available information from large-scale data sets derived from proteomic and transcriptomic studies of renal collecting duct cells to rank the 110 known DUBs with regard to likelihood of interacting with and regulating transport processes. The top-ranked DUBs were OTUB1, USP14, PSMD7, PSMD14, USP7, USP9X, OTUD4, USP10, and UCHL5. Among these USP7, USP9X, OTUD4, and USP10 are known to be involved in endosomal trafficking and have potential roles in endosomal recycling of plasma membrane proteins in the mammalian cortical collecting duct. PMID:28039431

  8. Data integration in physiology using Bayes' rule and minimum Bayes' factors: deubiquitylating enzymes in the renal collecting duct.

    PubMed

    Xue, Zhe; Chen, Jia-Xu; Zhao, Yue; Medvar, Barbara; Knepper, Mark A

    2017-03-01

    A major challenge in physiology is to exploit the many large-scale data sets available from "-omic" studies to seek answers to key physiological questions. In previous studies, Bayes' theorem has been used for this purpose. This approach requires a means to map continuously distributed experimental data to probabilities (likelihood values) to derive posterior probabilities from the combination of prior probabilities and new data. Here, we introduce the use of minimum Bayes' factors for this purpose and illustrate the approach by addressing a physiological question, "Which deubiquitylating enzymes (DUBs) encoded by mammalian genomes are most likely to regulate plasma membrane transport processes in renal cortical collecting duct principal cells?" To do this, we have created a comprehensive online database of 110 DUBs present in the mammalian genome (https://hpcwebapps.cit.nih.gov/ESBL/Database/DUBs/). We used Bayes' theorem to integrate available information from large-scale data sets derived from proteomic and transcriptomic studies of renal collecting duct cells to rank the 110 known DUBs with regard to likelihood of interacting with and regulating transport processes. The top-ranked DUBs were OTUB1, USP14, PSMD7, PSMD14, USP7, USP9X, OTUD4, USP10, and UCHL5. Among these USP7, USP9X, OTUD4, and USP10 are known to be involved in endosomal trafficking and have potential roles in endosomal recycling of plasma membrane proteins in the mammalian cortical collecting duct. Copyright © 2017 the American Physiological Society.
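
    A minimal sketch of the kind of Bayes'-rule ranking described in these two records, using Goodman's minimum Bayes factor, exp(-z²/2), to bound the evidence from each continuous measurement. The candidate names, z-scores, and prior odds are hypothetical, and dividing by the minimum Bayes factor yields the most favorable (bounding) update rather than an exact posterior.

```python
import numpy as np

def min_bayes_factor(z):
    """Goodman's minimum Bayes factor for a z-score: exp(-z^2 / 2).
    Values below 1 indicate evidence against the null hypothesis."""
    return np.exp(-0.5 * z**2)

# Hypothetical candidate DUBs with z-scores from two "-omic" data sets.
candidates = {"DUB_A": (2.5, 1.8), "DUB_B": (0.4, 0.9), "DUB_C": (3.1, 2.2)}

prior_odds = 0.1  # assumed prior odds that a given DUB is involved
posterior = {}
for name, z_scores in candidates.items():
    odds = prior_odds
    for z in z_scores:
        # Dividing by the minimum Bayes factor gives the most favorable
        # (bounding) update toward the alternative hypothesis.
        odds /= min_bayes_factor(z)
    posterior[name] = odds / (1.0 + odds)

for name, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"{name}: bounding posterior probability {p:.3f}")
```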

  9. Five-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Bayesian Estimation of Cosmic Microwave Background Polarization Maps

    NASA Astrophysics Data System (ADS)

    Dunkley, J.; Spergel, D. N.; Komatsu, E.; Hinshaw, G.; Larson, D.; Nolta, M. R.; Odegard, N.; Page, L.; Bennett, C. L.; Gold, B.; Hill, R. S.; Jarosik, N.; Weiland, J. L.; Halpern, M.; Kogut, A.; Limon, M.; Meyer, S. S.; Tucker, G. S.; Wollack, E.; Wright, E. L.

    2009-08-01

    We describe a sampling method to estimate the polarized cosmic microwave background (CMB) signal from observed maps of the sky. We use a Metropolis-within-Gibbs algorithm to estimate the polarized CMB map, containing Q and U Stokes parameters at each pixel, and its covariance matrix. These can be used as inputs for cosmological analyses. The polarized sky signal is parameterized as the sum of three components: CMB, synchrotron emission, and thermal dust emission. The polarized Galactic components are modeled with spatially varying power-law spectral indices for the synchrotron, and a fixed power law for the dust, and their component maps are estimated as by-products. We apply the method to simulated low-resolution maps with pixels of side 7.2 deg, using diagonal and full noise realizations drawn from the WMAP noise matrices. The CMB maps are recovered with goodness of fit consistent with errors. Computing the likelihood of the E-mode power in the maps as a function of optical depth to reionization, τ, for fixed temperature anisotropy power, we recover τ = 0.091 ± 0.019 for a simulation with input τ = 0.1, and mean τ = 0.098 averaged over 10 simulations. A "null" simulation with no polarized CMB signal has maximum likelihood consistent with τ = 0. The method is applied to the five-year WMAP data, using the K, Ka, Q, and V channels. We find τ = 0.090 ± 0.019, compared to τ = 0.086 ± 0.016 from the template-cleaned maps used in the primary WMAP analysis. The synchrotron spectral index, β, averaged over high signal-to-noise pixels with standard deviation σ(β) < 0.25, but excluding ~6% of the sky masked in the Galactic plane, is -3.03 ± 0.04. This estimate does not vary significantly with Galactic latitude, although it includes an informative prior. WMAP is the result of a partnership between Princeton University and NASA's Goddard Space Flight Center. Scientific guidance is provided by the WMAP Science Team.

  10. Wildfire potential mapping over the state of Mississippi: A land surface modeling approach

    Treesearch

    William H. Cooke; Georgy V. Mostovoy; Valentine G. Anantharaj; W. Matt Jolly

    2012-01-01

    A relationship between the likelihood of wildfires and various drought metrics (soil moisture-based fire potential indices) was examined over the southern part of Mississippi. The following three indices were tested and used to simulate spatial and temporal wildfire probability changes: (1) the accumulated difference between daily precipitation and potential...

  11. Categorical likelihood method for combining NDVI and elevation information for cotton precision agricultural applications

    USDA-ARS?s Scientific Manuscript database

    This presentation investigates an algorithm to fuse the Normalized Difference Vegetation Index (NDVI) with LiDAR elevation data to produce a map useful for the site-specific scouting and pest management (Willers et al. 1999; 2005; 2009) of the cotton insect pests, the tarnished plant bug (Lygus lin...

  12. Mapping benthic macroalgal communities in the coastal zone using CHRIS-PROBA mode 2 images

    NASA Astrophysics Data System (ADS)

    Casal, G.; Kutser, T.; Domínguez-Gómez, J. A.; Sánchez-Carnero, N.; Freire, J.

    2011-09-01

    The ecological importance of benthic macroalgal communities in coastal ecosystems has been recognised worldwide, and the application of remote sensing to study these communities presents certain advantages with respect to in situ methods. The present study used three CHRIS-PROBA images to analyse the distribution of macroalgal communities in the Seno de Corcubión (NW Spain). The use of this sensor represents a challenge given that its design, build, and deployment programme is intended to follow the principles of "faster, better, cheaper". To assess the application of this sensor to macroalgal mapping, two types of classification were carried out: Maximum Likelihood and Spectral Angle Mapper (SAM). The Maximum Likelihood classifier showed positive results, reaching overall accuracy percentages higher than 90% and kappa coefficients higher than 0.80 for the bottom classes shallow submerged sand, deep submerged sand, macroalgae at less than 5 m depth, and macroalgae between 5 and 10 m depth. The differentiation among macroalgal groups using SAM classification showed positive results for green seaweeds, although the differentiation between brown and red algae was not clear in the study area.

  13. A visualization tool to support decision making in environmental and biological planning

    USGS Publications Warehouse

    Romañach, Stephanie S.; McKelvy, James M.; Conzelmann, Craig; Suir, Kevin J.

    2014-01-01

    Large-scale ecosystem management involves consideration of many factors for informed decision making. The EverVIEW Data Viewer is a cross-platform desktop decision support tool to help decision makers compare simulation model outputs from competing plans for restoring Florida's Greater Everglades. The integration of NetCDF metadata conventions into EverVIEW allows end-users from multiple institutions within and beyond the Everglades restoration community to share information and tools. Our development process incorporates continuous interaction with targeted end-users for increased likelihood of adoption. One of EverVIEW's signature features is side-by-side map panels, which can be used to simultaneously compare species or habitat impacts from alternative restoration plans. Other features include examination of potential restoration plan impacts across multiple geographic or tabular displays, and animation through time. As a result of an iterative, standards-driven approach, EverVIEW is relevant to large-scale planning beyond Florida, and is used in multiple biological planning efforts in the United States.

  14. Back to Normal! Gaussianizing posterior distributions for cosmological probes

    NASA Astrophysics Data System (ADS)

    Schuhmann, Robert L.; Joachimi, Benjamin; Peiris, Hiranya V.

    2014-05-01

    We present a method to map multivariate non-Gaussian posterior probability densities into Gaussian ones via nonlinear Box-Cox transformations, and generalizations thereof. This is analogous to the search for normal parameters in the CMB, but can in principle be applied to any probability density that is continuous and unimodal. The search for the optimally Gaussianizing transformation amongst the Box-Cox family is performed via a maximum likelihood formalism. We can judge the quality of the found transformation a posteriori: qualitatively via statistical tests of Gaussianity, and more illustratively by how well it reproduces the credible regions. The method permits an analytical reconstruction of the posterior from a sample, e.g. a Markov chain, and simplifies the subsequent joint analysis with other experiments. Furthermore, it permits the characterization of a non-Gaussian posterior in a compact and efficient way. The expression for the non-Gaussian posterior can be employed to find analytic formulae for the Bayesian evidence, and consequently be used for model comparison.
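
    As a one-dimensional illustration of the Gaussianization idea, scipy's boxcox finds the Box-Cox parameter by maximum likelihood; the paper generalizes this search to multivariate posteriors, so the snippet below is only a sketch on a hypothetical skewed sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical skewed posterior sample for one parameter (must be positive
# for the standard Box-Cox family).
sample = rng.gamma(shape=2.0, scale=1.5, size=5000)

# scipy finds the Box-Cox lambda by maximum likelihood in one dimension.
transformed, lam = stats.boxcox(sample)
print(f"ML Box-Cox lambda: {lam:.3f}")

# A posteriori quality check: a normality test on the transformed sample.
stat, p = stats.normaltest(transformed)
print(f"normality test p-value after transformation: {p:.3f}")
```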

  15. Mapping quantitative trait loci for binary trait in the F2:3 design.

    PubMed

    Zhu, Chengsong; Zhang, Yuan-Ming; Guo, Zhigang

    2008-12-01

    In the analysis of inheritance of quantitative traits with low heritability, an F(2:3) design that genotypes plants in F(2) and phenotypes plants in F(2:3) progeny is often used in plant genetics. Although statistical approaches for mapping quantitative trait loci (QTL) in the F(2:3) design have been well developed, those for binary traits of biological interest and economic importance are seldom addressed. In this study, an attempt was made to map binary trait loci (BTL) in the F(2:3) design. The fundamental idea was that the F(2) plants were genotyped, all phenotypic values of each F(2:3) progeny were measured for the binary trait, and these binary trait values and the marker genotype information were used to detect BTL under the penetrance and liability models. The proposed method was verified by a series of Monte Carlo simulation experiments. These results showed that maximum likelihood approaches under the penetrance and liability models provide accurate estimates of the effects and locations of BTL with high statistical power, even under low heritability. Moreover, the penetrance model is as efficient as the liability model, and the F(2:3) design is more efficient than the classical F(2) design, even though only a single progeny is collected from each F(2:3) family. With the maximum likelihood approaches under the penetrance and liability models developed in this study, we can map binary traits as we do quantitative traits in the F(2:3) design.

  16. A Real-World Study of Switching From Allopurinol to Febuxostat in a Health Plan Database

    PubMed Central

    Altan, Aylin; Shiozawa, Aki; Bancroft, Tim; Singh, Jasvinder A.

    2015-01-01

    Objective The objective of this study was to assess the real-world comparative effectiveness of continuing on allopurinol versus switching to febuxostat. Methods In a retrospective claims data study of enrollees in health plans affiliated with Optum, we evaluated patients from February 1, 2009, to May 31, 2012, with a gout diagnosis, a pharmacy claim for allopurinol or febuxostat, and at least 1 serum uric acid (SUA) result available during the follow-up period. Univariate and multivariable-adjusted analyses (controlling for patient demographics and clinical factors) assessed the likelihood of SUA lowering and achievement of target SUA of less than 6.0 mg/dL or less than 5.0 mg/dL in allopurinol continuers versus febuxostat switchers. Results The final study population included 748 subjects who switched to febuxostat from allopurinol and 4795 continuing users of allopurinol. The most common doses of allopurinol were 300 mg/d or less in 95% of allopurinol continuers and 93% of febuxostat switchers (prior to switching); the most common dose of febuxostat was 40 mg/d, in 77% of febuxostat switchers (after switching). Compared with allopurinol continuers, febuxostat switchers had greater (1) mean preindex SUA, 8.0 mg/dL versus 6.6 mg/dL (P < 0.001); (2) likelihood of postindex SUA of less than 6.0 mg/dL, 62.2% versus 58.7% (P = 0.072); (3) likelihood of postindex SUA of less than 5.0 mg/dL, 38.9% versus 29.6% (P < 0.001); and (4) decrease in SUA, 1.8 (SD, 2.2) mg/dL versus 0.4 (SD, 1.7) mg/dL (P < 0.001). In multivariable-adjusted analyses, compared with allopurinol continuers, febuxostat switchers had significantly higher likelihood of achieving SUA of less than 6.0 mg/dL (40% higher) and SUA of less than 5.0 mg/dL (83% higher). Conclusions In this “real-world” setting, many patients with gout not surprisingly were not treated with maximum permitted doses of allopurinol. Patients switched to febuxostat were more likely to achieve target SUA levels than those who continued on generally stable doses of allopurinol. PMID:26580304

  17. White Matter Injuries in Mild Traumatic Brain Injury and Posttraumatic Migraines: Diffusion Entropy Analysis.

    PubMed

    Delic, Joseph; Alhilali, Lea M; Hughes, Marion A; Gumus, Serter; Fakhran, Saeed

    2016-06-01

    Purpose To determine the performance of Shannon entropy (SE) as a diagnostic tool in patients with mild traumatic brain injury (mTBI) with posttraumatic migraines (PTMs) and those without PTMs on the basis of analysis of fractional anisotropy (FA) maps. Materials and Methods The institutional review board approved this retrospective study, with waiver of informed consent. FA maps were obtained and neurocognitive testing was performed in 74 patients with mTBI (57 with PTM, 17 without PTM). FA maps were obtained in 22 healthy control subjects and in 20 control patients with migraine headaches. Mean FA and SE were extracted from total brain FA histograms and were compared between patients with mTBI and control subjects and between patients with and those without PTM. Mean FA and SE were correlated with clinical variables and were used to determine the areas under the receiver operating characteristic curve (AUCs) and likelihood ratios for mTBI and development of PTM. Results Patients with mTBI had significantly lower SE (P < .001) and trended toward lower mean FA (P = .07) compared with control subjects. SE inversely correlated with time to recovery (TTR) (r = -0.272, P = .02). Patients with mTBI with PTM had significantly lower SE (P < .001) but not mean FA (P = .15) than did other patients with mTBI. SE provided better discrimination between patients with mTBI and control subjects than mean FA (AUC = 0.92; P = .01), as well as better discrimination between patients with mTBI with PTM and those without PTM (AUC = 0.85; P < .001). SE of less than 0.751 resulted in a 16.1-fold increased likelihood of having experienced mTBI and a 3.2-fold increased likelihood of developing PTM. Conclusion SE reveals mTBI more accurately than mean FA, more accurately identifies which patients with mTBI develop PTM, and inversely correlates with TTR. (©) RSNA, 2016.
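
    Computing SE from an FA histogram is straightforward; the sketch below uses hypothetical FA distributions (a broader control-like one and a narrower injury-like one) and an assumed 100-bin histogram, since the abstract does not state the binning.

```python
import numpy as np

def shannon_entropy(fa_values, bins=100):
    """Shannon entropy (in bits) of a fractional anisotropy histogram."""
    counts, _ = np.histogram(fa_values, bins=bins, range=(0.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]                      # 0 * log(0) terms contribute nothing
    return -np.sum(p * np.log2(p))

# Hypothetical FA maps: healthy-like (broad) vs injured-like (narrowed).
rng = np.random.default_rng(7)
fa_control = np.clip(rng.normal(0.45, 0.15, 100_000), 0, 1)
fa_mtbi = np.clip(rng.normal(0.42, 0.10, 100_000), 0, 1)

print(f"SE control: {shannon_entropy(fa_control):.3f}")
print(f"SE mTBI:    {shannon_entropy(fa_mtbi):.3f}")  # lower, as in the study
```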

  18. Alternative stable states and phase shifts in coral reefs under anthropogenic stress.

    PubMed

    Fung, Tak; Seymour, Robert M; Johnson, Craig R

    2011-04-01

    Ecosystems with alternative stable states (ASS) may shift discontinuously from one stable state to another as environmental parameters cross a threshold. Reversal can then be difficult due to hysteresis effects. This contrasts with continuous state changes in response to changing environmental parameters, which are less difficult to reverse. Worldwide degradation of coral reefs, involving "phase shifts" from coral to algal dominance, highlights the pressing need to determine the likelihood of discontinuous phase shifts in coral reefs, in contrast to continuous shifts with no ASS. However, there is little evidence either for or against the existence of ASS for coral reefs. We use dynamic models to investigate the likelihood of continuous and discontinuous phase shifts in coral reefs subject to sustained environmental perturbation by fishing, nutrification, and sedimentation. Our modeling results suggest that coral reefs with or without anthropogenic stress can exhibit ASS, such that discontinuous phase shifts can occur. We also find evidence to support the view that high macroalgal growth rates and low grazing rates on macroalgae favor ASS in coral reefs. Further, our results suggest that the three stressors studied, either alone or in combination, can increase the likelihood of both continuous and discontinuous phase shifts by altering the competitive balance between corals and algae. However, in contrast to continuous phase shifts, we find that discontinuous shifts occur only in model coral reefs with parameter values near the extremes of their empirically determined ranges. This suggests that continuous shifts are more likely than discontinuous shifts in coral reefs. Our results also suggest that, for ecosystems in general, tackling multiple human stressors simultaneously maximizes resilience to phase shifts, ASS, and hysteresis, leading to improvements in ecosystem health and functioning.

  19. Measuring galaxy cluster masses with CMB lensing using a Maximum Likelihood estimator: statistical and systematic error budgets for future experiments

    DOE PAGES

    Raghunathan, Srinivasan; Patil, Sanjaykumar; Baxter, Eric J.; ...

    2017-08-25

    We develop a Maximum Likelihood estimator (MLE) to measure the masses of galaxy clusters through the impact of gravitational lensing on the temperature and polarization anisotropies of the cosmic microwave background (CMB). We show that, at low noise levels in temperature, this optimal estimator outperforms the standard quadratic estimator by a factor of two. For polarization, we show that the Stokes Q/U maps can be used instead of the traditional E- and B-mode maps without losing information. We test and quantify the bias in the recovered lensing mass for a comprehensive list of potential systematic errors. Using realistic simulations, we examine the cluster mass uncertainties from CMB-cluster lensing as a function of an experiment’s beam size and noise level. We predict the cluster mass uncertainties will be 3 - 6% for SPT-3G, AdvACT, and Simons Array experiments with 10,000 clusters and less than 1% for the CMB-S4 experiment with a sample containing 100,000 clusters. The mass constraints from CMB polarization are very sensitive to the experimental beam size and map noise level: for a factor of three reduction in either the beam size or noise level, the lensing signal-to-noise improves by roughly a factor of two.

  20. Measuring galaxy cluster masses with CMB lensing using a Maximum Likelihood estimator: statistical and systematic error budgets for future experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raghunathan, Srinivasan; Patil, Sanjaykumar; Baxter, Eric J.

    We develop a Maximum Likelihood estimator (MLE) to measure the masses of galaxy clusters through the impact of gravitational lensing on the temperature and polarization anisotropies of the cosmic microwave background (CMB). We show that, at low noise levels in temperature, this optimal estimator outperforms the standard quadratic estimator by a factor of two. For polarization, we show that the Stokes Q/U maps can be used instead of the traditional E- and B-mode maps without losing information. We test and quantify the bias in the recovered lensing mass for a comprehensive list of potential systematic errors. Using realistic simulations, we examine the cluster mass uncertainties from CMB-cluster lensing as a function of an experiment’s beam size and noise level. We predict the cluster mass uncertainties will be 3 - 6% for SPT-3G, AdvACT, and Simons Array experiments with 10,000 clusters and less than 1% for the CMB-S4 experiment with a sample containing 100,000 clusters. The mass constraints from CMB polarization are very sensitive to the experimental beam size and map noise level: for a factor of three reduction in either the beam size or noise level, the lensing signal-to-noise improves by roughly a factor of two.

  1. Measuring galaxy cluster masses with CMB lensing using a Maximum Likelihood estimator: statistical and systematic error budgets for future experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raghunathan, Srinivasan; Patil, Sanjaykumar; Bianchini, Federico

    We develop a Maximum Likelihood estimator (MLE) to measure the masses of galaxy clusters through the impact of gravitational lensing on the temperature and polarization anisotropies of the cosmic microwave background (CMB). We show that, at low noise levels in temperature, this optimal estimator outperforms the standard quadratic estimator by a factor of two. For polarization, we show that the Stokes Q/U maps can be used instead of the traditional E- and B-mode maps without losing information. We test and quantify the bias in the recovered lensing mass for a comprehensive list of potential systematic errors. Using realistic simulations, we examine the cluster mass uncertainties from CMB-cluster lensing as a function of an experiment's beam size and noise level. We predict the cluster mass uncertainties will be 3 - 6% for SPT-3G, AdvACT, and Simons Array experiments with 10,000 clusters and less than 1% for the CMB-S4 experiment with a sample containing 100,000 clusters. The mass constraints from CMB polarization are very sensitive to the experimental beam size and map noise level: for a factor of three reduction in either the beam size or noise level, the lensing signal-to-noise improves by roughly a factor of two.
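
    As a toy illustration of a maximum-likelihood mass fit (not the authors' estimator, which lenses CMB temperature and polarization maps), one can scan a mass parameter that scales a signal template and maximize a Gaussian likelihood against a noisy map:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy forward model: the lensing signal template scales linearly with
# cluster mass (a simplification of the real lensed-CMB model).
template = np.exp(-np.linspace(-3, 3, 61) ** 2)   # radial profile, arbitrary units
noise_sigma = 0.5

true_mass = 2.0
data = true_mass * template + rng.normal(0, noise_sigma, template.size)

def log_likelihood(mass):
    resid = data - mass * template
    return -0.5 * np.sum((resid / noise_sigma) ** 2)

masses = np.linspace(0, 4, 401)
logL = np.array([log_likelihood(m) for m in masses])
mle = masses[np.argmax(logL)]
print(f"true mass {true_mass}, ML estimate {mle:.2f}")
```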

  2. Exact likelihood evaluations and foreground marginalization in low resolution WMAP data

    NASA Astrophysics Data System (ADS)

    Slosar, Anže; Seljak, Uroš; Makarov, Alexey

    2004-06-01

    The large scale anisotropies of Wilkinson Microwave Anisotropy Probe (WMAP) data have attracted a lot of attention and have been a source of controversy, with many favorite cosmological models being apparently disfavored by the power spectrum estimates at low l. All the existing analyses of theoretical models are based on approximations for the likelihood function, which are likely to be inaccurate on large scales. Here we present exact evaluations of the likelihood of the low multipoles by direct inversion of the theoretical covariance matrix for low resolution WMAP maps. We project out the unwanted galactic contaminants using the WMAP derived maps of these foregrounds. This improves over the template based foreground subtraction used in the original analysis, which can remove some of the cosmological signal and may lead to a suppression of power. As a result we find an increase in power at low multipoles. For the quadrupole the maximum likelihood values are rather uncertain and vary between 140 and 220 μK². On the other hand, the probability distribution away from the peak is robust and, assuming a uniform prior between 0 and 2000 μK², the probability of having the true value above 1200 μK² (as predicted by the simplest cold dark matter model with a cosmological constant) is 10%, a factor of 2.5 higher than predicted by the WMAP likelihood code. We do not find the correlation function to be unusual beyond the low quadrupole value. We develop a fast likelihood evaluation routine that can be used instead of WMAP routines for low l values. We apply it to the Markov chain Monte Carlo analysis to compare the cosmological parameters between the two cases. The new analysis of WMAP either alone or jointly with the Sloan Digital Sky Survey (SDSS) and the Very Small Array (VSA) data reduces the evidence for running to less than 1σ, giving α_s = -0.022 ± 0.033 for the combined case. The new analysis prefers about a 1σ lower value of Ω_m, a consequence of an increased integrated Sachs-Wolfe (ISW) effect contribution required by the increase in the spectrum at low l. These results suggest that the details of foreground removal and full likelihood analysis are important for parameter estimation from the WMAP data. They are robust in the sense that they do not change significantly with frequency, mask, or details of foreground template marginalization. The marginalization approach presented here is the most conservative method to remove the foregrounds and should be particularly useful in the analysis of polarization, where foreground contamination may be much more severe.
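
    The two key ingredients here, exact Gaussian likelihood evaluation by direct covariance inversion and foreground removal by template marginalization, can be sketched in a few lines. Adding a large variance along each template direction is one standard way to marginalize over an unknown template amplitude; the toy covariance, template, and data below are illustrative assumptions.

```python
import numpy as np

def exact_log_likelihood(data, cov, templates=None, big=1e6):
    """Exact Gaussian log-likelihood of a map by direct covariance
    inversion, with foreground templates marginalized by adding a large
    variance along each template direction."""
    C = cov.copy()
    if templates is not None:
        for t in templates:
            C += big * np.outer(t, t)   # amplitude effectively unconstrained
    _, logdet = np.linalg.slogdet(C)
    alpha = np.linalg.solve(C, data)
    return -0.5 * (data @ alpha + logdet + len(data) * np.log(2 * np.pi))

# Toy low-resolution "map": 50 pixels, a correlated signal-plus-noise
# covariance, and one hypothetical galactic foreground template.
rng = np.random.default_rng(0)
n = 50
idx = np.arange(n)
cov = np.eye(n) + 0.5 * np.exp(-np.abs(idx[:, None] - idx[None, :]) / 5.0)
template = np.linspace(-1, 1, n)
data = rng.multivariate_normal(np.zeros(n), cov) + 0.3 * template

print(exact_log_likelihood(data, cov, templates=[template]))
```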

  3. Seismic risk maps

    USGS Publications Warehouse

    Perkins, David M.; Mallis, Robert

    1974-01-01

    What is the possibility that an earthquake will occur near you? Have you ever wondered what the chances were that an earthquake could affect you at home or at work? Perhaps you are planning to move to a part of the country subject to frequent earthquakes. How do you determine the relative likelihood that an earthquake will cause damage at a particular place?

  4. Scaffolding Learner-Centered Curricular Coherence Using Learning Maps and Diagnostic Assessments Designed around Mathematics Learning Trajectories

    ERIC Educational Resources Information Center

    Confrey, Jere; Gianopulos, Garron; McGowan, William; Shah, Meetal; Belcher, Michael

    2017-01-01

    The paper describes how designers used the construct of learning trajectories to create a tool, Math-Mapper 6-8, to help scaffold curricula toward increased learner-centered coherence. It defines "learner-centered curricular coherence" as "an organizational means to promote a high likelihood that each learner traverses one of many…

  5. Terrain Classification Using Multi-Wavelength Lidar Data

    DTIC Science & Technology

    2015-09-01

    Excerpt (figure captions and acronym definitions recovered from the report): Figure 9, pseudo-NDVI of three layers within the vertical structure of the forest; the top panel shows the first return from the LiDAR instrument, including the ground, illustrating change in NDVI throughout the vertical canopy. Figure 10, Optech Titan operating wavelengths. Acronyms: LiDAR, Light Detection and Ranging; LMS, LiDAR Mapping Suite; ML, Maximum Likelihood; NIR, Near Infrared; N-D VIS, n-Dimensional Visualizer; NDVI, Normalized Difference Vegetation Index.

  6. Achieving behaviour change for detection of Lynch syndrome using the Theoretical Domains Framework Implementation (TDFI) approach: a study protocol.

    PubMed

    Taylor, Natalie; Long, Janet C; Debono, Deborah; Williams, Rachel; Salisbury, Elizabeth; O'Neill, Sharron; Eykman, Elizabeth; Braithwaite, Jeffrey; Chin, Melvin

    2016-03-12

    Lynch syndrome is an inherited disorder associated with a range of cancers, and found in 2-5% of colorectal cancers. Lynch syndrome is diagnosed through a combination of significant family and clinical history and pathology. The definitive diagnostic germline test requires formal patient consent after genetic counselling. If diagnosed early, carriers of Lynch syndrome can undergo increased surveillance for cancers, which in turn can prevent late stage cancers, optimise treatment and decrease mortality for themselves and their relatives. However, over the past decade, international studies have reported that only a small proportion of individuals with suspected Lynch syndrome were referred for genetic consultation and possible genetic testing. The aim of this project is to use behaviour change theory and implementation science approaches to increase the number and speed of healthcare professional referrals of colorectal cancer patients with a high-likelihood risk of Lynch syndrome to appropriate genetic counselling services. The six-step Theoretical Domains Framework Implementation (TDFI) approach will be used at two large, metropolitan hospitals treating colorectal cancer patients. Steps are: 1) form local multidisciplinary teams to map current referral processes; 2) identify target behaviours that may lead to increased referrals using discussion supported by a retrospective audit; 3) identify barriers to those behaviours using the validated Influences on Patient Safety Behaviours Questionnaire and TDFI guided focus groups; 4) co-design interventions to address barriers using focus groups; 5) co-implement interventions; and 6) evaluate intervention impact. Chi-square analysis will be used to test the difference in the proportion of high-likelihood risk Lynch syndrome patients being referred for genetic testing before and after intervention implementation. A paired t-test will be used to assess the mean time from the pathology test results to referral for high-likelihood Lynch syndrome patients pre-post intervention. Run charts will be used to continuously monitor change in referrals over time, based on scheduled monthly audits. This project is based on a tested and refined implementation strategy (TDFI approach). Enhancing the process of identifying and referring people at high-likelihood risk of Lynch syndrome for genetic counselling will improve outcomes for patients and their relatives, and potentially save public money.
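
    The planned chi-square comparison of referral proportions can be sketched with scipy; the 2x2 counts below are hypothetical placeholders for the pre- and post-intervention audit data.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of high-likelihood Lynch syndrome patients
# referred vs not referred, before and after the intervention.
table = [
    [12, 28],   # pre-intervention:  referred, not referred
    [31,  9],   # post-intervention: referred, not referred
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```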

  7. Predictors of continued problem drinking and substance use following military discharge.

    PubMed

    Norman, Sonya B; Schmied, Emily; Larson, Gerald E

    2014-07-01

    The goals of the present study were to (a) examine change in rates of problem alcohol/substance use among a sample of veterans between their last year of military service and their first year following separation, (b) identify predictors of continued problem use in the first year after separation, and (c) evaluate the hypothesis that avoidant coping, posttraumatic stress disorder (PTSD) symptoms, and chronic stress place individuals at particularly high risk for continued problem use. Participants (N = 1,599) completed self-report measures before and during the year following separation. Participants who endorsed either having used more than intended or wanting or needing to cut down during the past year were considered to have problem use. Of 742 participants reporting problem substance use at baseline, 42% reported continued problem substance use at follow-up ("persistors"). Persistors reported more trouble adjusting to civilian life, had a greater likelihood of driving while intoxicated, and had a greater likelihood of aggression. Multivariate analyses showed that avoidant coping score at baseline and higher PTSD symptom score and greater sensation seeking at follow up predicted continued problem use. Understanding risk factors for continued problem use is a prerequisite for targeted prevention of chronic problems and associated negative life consequences.

  8. Global mapping of infectious disease

    PubMed Central

    Hay, Simon I.; Battle, Katherine E.; Pigott, David M.; Smith, David L.; Moyes, Catherine L.; Bhatt, Samir; Brownstein, John S.; Collier, Nigel; Myers, Monica F.; George, Dylan B.; Gething, Peter W.

    2013-01-01

    The primary aim of this review was to evaluate the state of knowledge of the geographical distribution of all infectious diseases of clinical significance to humans. A systematic review was conducted to enumerate cartographic progress, with respect to the data available for mapping and the methods currently applied. The results helped define the minimum information requirements for mapping infectious disease occurrence, and a quantitative framework for assessing the mapping opportunities for all infectious diseases. This revealed that of 355 infectious diseases identified, 174 (49%) have a strong rationale for mapping and of these only 7 (4%) had been comprehensively mapped. A variety of ambitions, such as the quantification of the global burden of infectious disease, international biosurveillance, assessing the likelihood of infectious disease outbreaks and exploring the propensity for infectious disease evolution and emergence, are limited by these omissions. An overview of the factors hindering progress in disease cartography is provided. It is argued that rapid improvement in the landscape of infectious diseases mapping can be made by embracing non-conventional data sources, automation of geo-positioning and mapping procedures enabled by machine learning and information technology, respectively, in addition to harnessing labour of the volunteer ‘cognitive surplus’ through crowdsourcing. PMID:23382431

  9. An unsupervised video foreground co-localization and segmentation process by incorporating motion cues and frame features

    NASA Astrophysics Data System (ADS)

    Zhang, Chao; Zhang, Qian; Zheng, Chi; Qiu, Guoping

    2018-04-01

    Video foreground segmentation is one of the key problems in video processing. In this paper, we propose a novel, fully unsupervised approach for foreground object co-localization and segmentation in unconstrained videos. We first compute both the actual edges and the motion boundaries of the video frames, and then align them by their HOG feature maps. By filling the occlusions generated by the aligned edges, we obtain more precise masks of the foreground object. These motion-based masks yield a motion-based likelihood, and a color-based likelihood is also adopted for the segmentation process. Experimental results show that our approach outperforms most state-of-the-art algorithms.
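
    A minimal sketch of fusing per-pixel motion-based and color-based likelihoods into a foreground mask, assuming (as an illustration, not the paper's exact formulation) that the two cues are independent and expressed as foreground probabilities:

```python
import numpy as np

def fuse_likelihoods(motion_lik, color_lik, prior=0.5):
    """Per-pixel foreground posterior from independent motion- and
    color-based foreground probabilities, combined as likelihood ratios."""
    ratio = (motion_lik / (1 - motion_lik)) * (color_lik / (1 - color_lik))
    odds = ratio * prior / (1 - prior)
    return odds / (1 + odds)

# Hypothetical likelihood maps for a 4x4 frame, values in (0, 1).
rng = np.random.default_rng(5)
motion = np.clip(rng.random((4, 4)), 0.01, 0.99)
color = np.clip(rng.random((4, 4)), 0.01, 0.99)

mask = fuse_likelihoods(motion, color) > 0.5   # binary foreground mask
print(mask.astype(int))
```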

  10. ILP-based maximum likelihood genome scaffolding

    PubMed Central

    2014-01-01

    Background Interest in de novo genome assembly has been renewed in the past decade due to rapid advances in high-throughput sequencing (HTS) technologies which generate relatively short reads resulting in highly fragmented assemblies consisting of contigs. Additional long-range linkage information is typically used to orient, order, and link contigs into larger structures referred to as scaffolds. Due to library preparation artifacts and erroneous mapping of reads originating from repeats, scaffolding remains a challenging problem. In this paper, we provide a scalable scaffolding algorithm (SILP2) employing a maximum likelihood model capturing read mapping uncertainty and/or non-uniformity of contig coverage which is solved using integer linear programming. A Non-Serial Dynamic Programming (NSDP) paradigm is applied to render our algorithm useful in the processing of larger mammalian genomes. To compare scaffolding tools, we employ novel quantitative metrics in addition to the extant metrics in the field. We have also expanded the set of experiments to include scaffolding of low-complexity metagenomic samples. Results SILP2 achieves better scalability through a more efficient NSDP algorithm than the previous release of SILP. The results show that SILP2 compares favorably to previous methods OPERA and MIP in both scalability and accuracy for scaffolding single genomes of up to human size, and significantly outperforms them on scaffolding low-complexity metagenomic samples. Conclusions Equipped with NSDP, SILP2 is able to scaffold large mammalian genomes, resulting in the longest and most accurate scaffolds. The ILP formulation for the maximum likelihood model is shown to be flexible enough to handle metagenomic samples. PMID:25253180

  11. Planck intermediate results: XLVI. Reduction of large-scale systematic effects in HFI polarization maps and estimation of the reionization optical depth

    DOE PAGES

    Aghanim, N.; Ashdown, M.; Aumont, J.; ...

    2016-12-12

    This study describes the identification, modelling, and removal of previously unexplained systematic effects in the polarization data of the Planck High Frequency Instrument (HFI) on large angular scales, including new mapmaking and calibration procedures, new and more complete end-to-end simulations, and a set of robust internal consistency checks on the resulting maps. These maps, at 100, 143, 217, and 353 GHz, are early versions of those that will be released in final form later in 2016. The improvements allow us to determine the cosmic reionization optical depth τ using, for the first time, the low-multipole EE data from HFI, reducing significantly the central value and uncertainty, and hence the upper limit. Two different likelihood procedures are used to constrain τ from two estimators of the CMB E- and B-mode angular power spectra at 100 and 143 GHz, after debiasing the spectra from a small remaining systematic contamination. These all give fully consistent results. A further consistency test is performed using cross-correlations derived from the Low Frequency Instrument maps of the Planck 2015 data release and the new HFI data. For this purpose, end-to-end analyses of systematic effects from the two instruments are used to demonstrate the near independence of their dominant systematic error residuals. The tightest result comes from the HFI-based τ posterior distribution using the maximum likelihood power spectrum estimator from EE data only, giving a value 0.055 ± 0.009. Finally, in a companion paper these results are discussed in the context of the best-fit Planck ΛCDM cosmological model and recent models of reionization.

  12. Planck intermediate results: XLVI. Reduction of large-scale systematic effects in HFI polarization maps and estimation of the reionization optical depth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aghanim, N.; Ashdown, M.; Aumont, J.

    This study describes the identification, modelling, and removal of previously unexplained systematic effects in the polarization data of the Planck High Frequency Instrument (HFI) on large angular scales, including new mapmaking and calibration procedures, new and more complete end-to-end simulations, and a set of robust internal consistency checks on the resulting maps. These maps, at 100, 143, 217, and 353 GHz, are early versions of those that will be released in final form later in 2016. The improvements allow us to determine the cosmic reionization optical depth τ using, for the first time, the low-multipole EE data from HFI, reducing significantly the central value and uncertainty, and hence the upper limit. Two different likelihood procedures are used to constrain τ from two estimators of the CMB E- and B-mode angular power spectra at 100 and 143 GHz, after debiasing the spectra from a small remaining systematic contamination. These all give fully consistent results. A further consistency test is performed using cross-correlations derived from the Low Frequency Instrument maps of the Planck 2015 data release and the new HFI data. For this purpose, end-to-end analyses of systematic effects from the two instruments are used to demonstrate the near independence of their dominant systematic error residuals. The tightest result comes from the HFI-based τ posterior distribution using the maximum likelihood power spectrum estimator from EE data only, giving a value 0.055 ± 0.009. Finally, in a companion paper these results are discussed in the context of the best-fit Planck ΛCDM cosmological model and recent models of reionization.

  13. Planck intermediate results. XLVI. Reduction of large-scale systematic effects in HFI polarization maps and estimation of the reionization optical depth

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Aghanim, N.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Ballardini, M.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battye, R.; Benabed, K.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Carron, J.; Challinor, A.; Chiang, H. C.; Colombo, L. P. L.; Combet, C.; Comis, B.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Di Valentino, E.; Dickinson, C.; Diego, J. M.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Falgarone, E.; Fantaye, Y.; Finelli, F.; Forastieri, F.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frolov, A.; Galeotta, S.; Galli, S.; Ganga, K.; Génova-Santos, R. T.; Gerbino, M.; Ghosh, T.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Helou, G.; Henrot-Versillé, S.; Herranz, D.; Hivon, E.; Huang, Z.; Ilić, S.; Jaffe, A. H.; Jones, W. C.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Knox, L.; Krachmalnicoff, N.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamarre, J.-M.; Langer, M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Le Jeune, M.; Leahy, J. P.; Levrier, F.; Liguori, M.; Lilje, P. B.; López-Caniego, M.; Ma, Y.-Z.; Macías-Pérez, J. F.; Maggio, G.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Matarrese, S.; Mauri, N.; McEwen, J. D.; Meinhold, P. R.; Melchiorri, A.; Mennella, A.; Migliaccio, M.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Moss, A.; Mottet, S.; Naselsky, P.; Natoli, P.; Oxborrow, C. A.; Pagano, L.; Paoletti, D.; Partridge, B.; Patanchon, G.; Patrizii, L.; Perdereau, O.; Perotto, L.; Pettorino, V.; Piacentini, F.; Plaszczynski, S.; Polastri, L.; Polenta, G.; Puget, J.-L.; Rachen, J. P.; Racine, B.; Reinecke, M.; Remazeilles, M.; Renzi, A.; Rocha, G.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Ruiz-Granados, B.; Salvati, L.; Sandri, M.; Savelainen, M.; Scott, D.; Sirri, G.; Sunyaev, R.; Suur-Uski, A.-S.; Tauber, J. A.; Tenti, M.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Valiviita, J.; Van Tent, F.; Vibert, L.; Vielva, P.; Villa, F.; Vittorio, N.; Wandelt, B. D.; Watson, R.; Wehus, I. K.; White, M.; Zacchei, A.; Zonca, A.

    2016-12-01

    This paper describes the identification, modelling, and removal of previously unexplained systematic effects in the polarization data of the Planck High Frequency Instrument (HFI) on large angular scales, including new mapmaking and calibration procedures, new and more complete end-to-end simulations, and a set of robust internal consistency checks on the resulting maps. These maps, at 100, 143, 217, and 353 GHz, are early versions of those that will be released in final form later in 2016. The improvements allow us to determine the cosmic reionization optical depth τ using, for the first time, the low-multipole EE data from HFI, reducing significantly the central value and uncertainty, and hence the upper limit. Two different likelihood procedures are used to constrain τ from two estimators of the CMB E- and B-mode angular power spectra at 100 and 143 GHz, after debiasing the spectra from a small remaining systematic contamination. These all give fully consistent results. A further consistency test is performed using cross-correlations derived from the Low Frequency Instrument maps of the Planck 2015 data release and the new HFI data. For this purpose, end-to-end analyses of systematic effects from the two instruments are used to demonstrate the near independence of their dominant systematic error residuals. The tightest result comes from the HFI-based τ posterior distribution using the maximum likelihood power spectrum estimator from EE data only, giving a value 0.055 ± 0.009. In a companion paper these results are discussed in the context of the best-fit Planck ΛCDM cosmological model and recent models of reionization.

  14. An automated approach to mapping corn from Landsat imagery

    USGS Publications Warehouse

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process, resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn,’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.
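
    As a minimal sketch of the idea, assuming invented per-pixel corn probabilities and an invented county-level areal fraction (neither taken from the paper), the three likelihood classes can be obtained from quantile thresholds chosen so that the mapped area matches the reported acreage:

```python
import numpy as np

# Hypothetical per-pixel corn probabilities and a county corn-acreage fraction;
# thresholds are set so the 'highly likely corn' area matches the areal estimate.
rng = np.random.default_rng(9)
p_corn = rng.beta(2, 5, size=10_000)      # stand-in classifier output per pixel
areal_fraction = 0.25                     # corn fraction from agricultural statistics

hi_cut = np.quantile(p_corn, 1 - areal_fraction)      # top fraction: highly likely
lo_cut = np.quantile(p_corn, 1 - 2 * areal_fraction)  # next band: likely
labels = np.where(p_corn >= hi_cut, "highly likely corn",
         np.where(p_corn >= lo_cut, "likely corn", "unlikely corn"))
```

    This is not the authors' algorithm, only one way areal statistics could anchor class thresholds without ground reference data.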

  15. State Space Modeling of Time-Varying Contemporaneous and Lagged Relations in Connectivity Maps

    PubMed Central

    Molenaar, Peter C. M.; Beltz, Adriene M.; Gates, Kathleen M.; Wilson, Stephen J.

    2017-01-01

    Most connectivity mapping techniques for neuroimaging data assume stationarity (i.e., network parameters are constant across time), but this assumption does not always hold true. The authors provide a description of a new approach for simultaneously detecting time-varying (or dynamic) contemporaneous and lagged relations in brain connectivity maps. Specifically, they use a novel raw data likelihood estimation technique (involving a second-order extended Kalman filter/smoother embedded in a nonlinear optimizer) to determine the variances of the random walks associated with state space model parameters and their autoregressive components. The authors illustrate their approach with simulated and blood oxygen level-dependent functional magnetic resonance imaging data from 30 daily cigarette smokers performing a verbal working memory task, focusing on seven regions of interest (ROIs). Twelve participants had dynamic directed functional connectivity maps: Eleven had one or more time-varying contemporaneous ROI state loadings, and one had a time-varying autoregressive parameter. Compared to smokers without dynamic maps, smokers with dynamic maps performed the task with greater accuracy. Thus, accurate detection of dynamic brain processes is meaningfully related to behavior in a clinical sample. PMID:26546863
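
    A much simpler linear analogue conveys the core mechanism, a model parameter treated as a random walk and tracked by a Kalman filter; the AR(1) observation model and the noise variances below are assumptions of this sketch, not the authors' second-order extended Kalman filter/smoother:

```python
import numpy as np

# Track a slowly drifting AR(1) coefficient phi_t (random-walk state) from the
# observed series y_t = phi_t * y_{t-1} + noise, using a scalar Kalman filter.
rng = np.random.default_rng(0)
T = 300
phi_true = 0.5 + np.cumsum(rng.normal(0, 0.005, T))   # time-varying coefficient
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true[t] * y[t - 1] + rng.normal(0, 0.5)

q, r = 1e-4, 0.25           # random-walk and observation noise variances (assumed)
phi_hat, P = 0.0, 1.0       # initial state mean and variance
est = np.zeros(T)
for t in range(1, T):
    P += q                                  # predict: coefficient random walk
    H = y[t - 1]                            # time-varying observation "matrix"
    K = P * H / (H * P * H + r)             # Kalman gain
    phi_hat += K * (y[t] - H * phi_hat)     # update with the innovation
    P *= 1 - K * H
    est[t] = phi_hat
```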

  16. Hybrid ICA-Bayesian Network approach reveals distinct effective connectivity differences in schizophrenia

    PubMed Central

    Kim, D.; Burge, J.; Lane, T.; Pearlson, G. D; Kiehl, K. A; Calhoun, V. D.

    2008-01-01

    We utilized a discrete dynamic Bayesian network (dDBN) approach (Burge et al., 2007) to determine differences in brain regions between patients with schizophrenia and healthy controls on a measure of effective connectivity, termed the approximate conditional likelihood score (ACL) (Burge and Lane, 2005). The ACL score represents a class-discriminative measure of effective connectivity by measuring the relative likelihood of the correlation between brain regions in one group versus another. The algorithm is capable of finding non-linear relationships between brain regions because it uses discrete rather than continuous values and attempts to model temporal relationships with a first-order Markov and stationary assumption constraint (Papoulis, 1991). Since Bayesian networks are overly sensitive to noisy data, we introduced an independent component analysis (ICA) filtering approach that attempted to reduce the noise found in fMRI data by unmixing the raw datasets into a set of independent spatial component maps. Components that represented noise were removed and the remaining components reconstructed into the dimensions of the original fMRI datasets. We applied the dDBN algorithm to a group of 35 patients with schizophrenia and 35 matched healthy controls using an ICA filtered and unfiltered approach. We determined that filtering the data significantly improved the magnitude of the ACL score. Patients showed the greatest ACL scores in several regions, most markedly the cerebellar vermis and hemispheres. Our findings suggest that schizophrenia patients exhibit weaker connectivity than healthy controls in multiple regions, including bilateral temporal and frontal cortices, plus cerebellum during an auditory paradigm. PMID:18602482

  17. Fragment assignment in the cloud with eXpress-D

    PubMed Central

    2013-01-01

    Background Probabilistic assignment of ambiguously mapped fragments produced by high-throughput sequencing experiments has been demonstrated to greatly improve accuracy in the analysis of RNA-Seq and ChIP-Seq, and is an essential step in many other sequence census experiments. A maximum likelihood method using the expectation-maximization (EM) algorithm for optimization is commonly used to solve this problem. However, batch EM-based approaches do not scale well with the size of sequencing datasets, which have been increasing dramatically over the past few years. Thus, current approaches to fragment assignment rely on heuristics or approximations for tractability. Results We present an implementation of a distributed EM solution to the fragment assignment problem using Spark, a data analytics framework that can scale by leveraging compute clusters within datacenters ("the cloud"). We demonstrate that our implementation easily scales to billions of sequenced fragments, while providing the exact maximum likelihood assignment of ambiguous fragments. The accuracy of the method is shown to be an improvement over the most widely used tools available, and it can be run in a constant amount of time when cluster resources are scaled linearly with the amount of input data. Conclusions The cloud offers one solution for the difficulties faced in the analysis of massive high-throughput sequencing data, which continue to grow rapidly. Researchers in bioinformatics must follow developments in distributed systems, such as new frameworks like Spark, for ways to port existing methods to the cloud and help them scale to the datasets of the future. Our software, eXpress-D, is freely available at: http://github.com/adarob/express-d. PMID:24314033
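
    For a toy problem, the batch EM that eXpress-D distributes reduces to a few lines; the fragment-to-transcript compatibility sets and the uniform within-transcript fragment probabilities here are simplifying assumptions:

```python
import numpy as np

# EM for probabilistic fragment assignment: each fragment is compatible with a
# set of transcripts; estimate transcript abundances theta by maximum likelihood.
compat = [{0, 1}, {1}, {0, 2}, {2}, {1, 2}]   # hypothetical fragment -> transcripts
K = 3
theta = np.full(K, 1.0 / K)
for _ in range(200):
    counts = np.zeros(K)
    for frag in compat:                # E-step: split each fragment by its posterior
        idx = list(frag)
        counts[idx] += theta[idx] / theta[idx].sum()
    theta = counts / counts.sum()      # M-step: renormalise expected counts
print(theta)
```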

  18. United States Geological Survey fire science: fire danger monitoring and forecasting

    USGS Publications Warehouse

    Eidenshink, Jeff C.; Howard, Stephen M.

    2012-01-01

    Each day, the U.S. Geological Survey produces 7-day forecasts for all Federal lands of the distributions of the number of ignitions, the number of fires above a given size, and the conditional probability of fires growing larger than a specified size. The large fire probability map is an estimate of the likelihood that ignitions will become large fires. The large fire forecast map is a probability estimate of the number of fires on Federal land exceeding 100 acres in the forthcoming week. The ignition forecast map is a probability estimate of the number of fires on Federal land greater than 1 acre in the forthcoming week. The extreme event forecast is the probability estimate of the number of fires on Federal land that may exceed 5,000 acres in the forthcoming week.

  19. Inferring the parameters of a Markov process from snapshots of the steady state

    NASA Astrophysics Data System (ADS)

    Dettmer, Simon L.; Berg, Johannes

    2018-02-01

    We seek to infer the parameters of an ergodic Markov process from samples taken independently from the steady state. Our focus is on non-equilibrium processes, where the steady state is not described by the Boltzmann measure, but is generally unknown and hard to compute, which prevents the application of established equilibrium inference methods. We propose a quantity we call propagator likelihood, which takes on the role of the likelihood in equilibrium processes. This propagator likelihood is based on fictitious transitions between those configurations of the system which occur in the samples. The propagator likelihood can be derived by minimising the relative entropy between the empirical distribution and a distribution generated by propagating the empirical distribution forward in time. Maximising the propagator likelihood leads to an efficient reconstruction of the parameters of the underlying model in different systems, both with discrete configurations and with continuous configurations. We apply the method to non-equilibrium models from statistical physics and theoretical biology, including the asymmetric simple exclusion process (ASEP), the kinetic Ising model, and replicator dynamics.
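
    The mechanics can be sketched for a small discrete system; the logit parameterisation of the transition matrix and the single fictitious propagation step are assumptions of this sketch, and with marginal steady-state data alone the parameters are identifiable only once structural constraints, as in the models above, are imposed:

```python
import numpy as np
from scipy.optimize import minimize

# Propagator likelihood for a 3-state chain: maximise the likelihood of the
# steady-state samples after one fictitious step, sum_x n(x) log [p_emp T](x).
rng = np.random.default_rng(1)
samples = rng.choice(3, size=2000, p=[0.5, 0.3, 0.2])    # stand-in steady-state data
p_emp = np.bincount(samples, minlength=3) / len(samples)

def transition(theta):
    logits = theta.reshape(3, 3)
    T = np.exp(logits - logits.max(axis=1, keepdims=True))
    return T / T.sum(axis=1, keepdims=True)              # row-stochastic T[y, x]

def neg_propagator_loglik(theta):
    prop = p_emp @ transition(theta)                     # propagate empirical law
    return -len(samples) * np.sum(p_emp * np.log(prop))

res = minimize(neg_propagator_loglik, np.zeros(9), method="L-BFGS-B")
print(transition(res.x))
```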

  20. A Semiparametric Approach for Composite Functional Mapping of Dynamic Quantitative Traits

    PubMed Central

    Yang, Runqing; Gao, Huijiang; Wang, Xin; Zhang, Ji; Zeng, Zhao-Bang; Wu, Rongling

    2007-01-01

    Functional mapping has emerged as a powerful tool for mapping quantitative trait loci (QTL) that control developmental patterns of complex dynamic traits. Original functional mapping has been constructed within the context of simple interval mapping, without consideration of separate multiple linked QTL for a dynamic trait. In this article, we present a statistical framework for mapping QTL that affect dynamic traits by capitalizing on the strengths of functional mapping and composite interval mapping. Within this so-called composite functional-mapping framework, functional mapping models the time-dependent genetic effects of a QTL tested within a marker interval using a biologically meaningful parametric function, whereas composite interval mapping models the time-dependent genetic effects of the markers outside the test interval to control the genome background using a flexible nonparametric approach based on Legendre polynomials. Such a semiparametric framework was formulated by a maximum-likelihood model and implemented with the EM algorithm, allowing for the estimation and the test of the mathematical parameters that define the QTL effects and the regression coefficients of the Legendre polynomials that describe the marker effects. Simulation studies were performed to investigate the statistical behavior of composite functional mapping and compare its advantage in separating multiple linked QTL as compared to functional mapping. We used the new mapping approach to analyze a genetic mapping example in rice, leading to the identification of multiple QTL, some of which are linked on the same chromosome, that control the developmental trajectory of leaf age. PMID:17947431

  1. STOCHASTIC DUELS WITH HOMING,

    DTIC Science & Technology

    Duels where both marksmen ’home’ or ’zero in’ on one another are here considered, and the effect of this on the win probability is determined. It is...leads to win probabilities that can be straightforwardly evaluated. Maximum-likelihood estimation of the hit probability and homing from field data is outlined. The solutions of the duels are displayed as contour maps. (Author)

  2. Mineral resource potential map of the Spanish Peaks Wilderness Study Area, Huerfano and Las Animas counties, Colorado

    USGS Publications Warehouse

    Budding, Karin E.; Kluender, Steven E.

    1983-01-01

    The depth of several thousand feet at which coal may underlie the surface rocks of the study area makes it a resource with little likelihood of development. The potential for oil and gas appears low because of the apparent lack of structural traps and the intense igneous activity in the area.

  3. Analysis of the quality of image data required by the LANDSAT-4 Thematic Mapper and Multispectral Scanner. [agricultural and forest cover types in California

    NASA Technical Reports Server (NTRS)

    Colwell, R. N. (Principal Investigator)

    1984-01-01

    The spatial, geometric, and radiometric qualities of LANDSAT 4 thematic mapper (TM) and multispectral scanner (MSS) data were evaluated by interpreting, through visual and computer means, film and digital products for selected agricultural and forest cover types in California. Multispectral analyses employing Bayesian maximum likelihood, discrete relaxation, and unsupervised clustering algorithms were used to compare the usefulness of TM and MSS data for discriminating individual cover types. Some of the significant results are as follows: (1) for maximizing the interpretability of agricultural and forest resources, TM color composites should contain spectral bands in the visible, near-reflectance infrared, and middle-reflectance infrared regions, namely TM 4 and TM 5, and must contain TM 4 in all cases, even at the expense of excluding TM 5; (2) using enlarged TM film products, planimetric accuracy of mapped points was within 91 meters (RMSE east) and 117 meters (RMSE north); (3) using TM digital products, planimetric accuracy of mapped points was within 12.0 meters (RMSE east) and 13.7 meters (RMSE north); and (4) applying a contextual classification algorithm to TM data provided classification accuracies competitive with Bayesian maximum likelihood.

  4. Does Availability of Worksite Supports for Physical Activity Differ by Industry and Occupation?

    PubMed

    Dodson, Elizabeth A; Hipp, J Aaron; Lee, Jung Ae; Yang, Lin; Marx, Christine M; Tabak, Rachel G; Brownson, Ross C

    2018-03-01

    To explore combinations of worksite supports (WSS) for physical activity (PA) that may assist employees in meeting PA recommendations and to investigate how availability of WSS differs across industries and occupations. Cross-sectional. Several Missouri metropolitan areas. Adults employed >20 h/wk outside the home. Survey utilized existing self-reported measures (eg, presence of WSS for PA) and the International Physical Activity Questionnaire. Logistic regression was conducted for 2 outcome variables: leisure and transportation PA. Independent variables included 16 WSS. Of particular interest were interaction effects between WSS variables. Analyses were stratified by 5 occupation and 7 industry types. Overall, 2013 people completed the survey (46% response rate). Often, availability of 1 WSS did not increase the likelihood of meeting PA recommendations, but several pairs of WSS did. For example, in business occupations, the odds of meeting PA recommendations through transportation PA increased when employees had access to showers and incentives to bike/walk (odds ratio [OR] = 1.6; 95% confidence interval [CI] = 1.16-2.22); showers and maps (OR = 1.25; 1.02-1.55); maps and incentives to bike/walk (OR = 1.48; 1.04-2.12). Various combinations of WSS may increase the likelihood that employees will meet PA recommendations. Many are of low or no cost, including flexible time for exercise and maps of worksite-adjacent walk/bike routes. Findings may be instructive for employers seeking to improve employee health through worksite PA.

  5. Analysis of correlations between the occurrence of anti-MAP antibodies in blood serum and the presence of DNA-MAP in milk.

    PubMed

    Wiszniewska-Łaszczych, A; Szteyn, J; Smolińska, A

    2009-01-01

    Paratuberculosis (Johne's disease) is a chronic, infectious enteritis of both domestic and wild ruminants. Unfortunately, the problem of MAP infections is not linked only with the health status of animals and potential direct and indirect economic losses in bovine herds (of dairy cattle in particular). MAP bacilli present in food of animal origin (milk in particular) are likely to lead to the development of the disease in humans. Fast and effective diagnosis of the disease in animals, especially of its subclinical form, may prevent the transmission of the germ to humans. The study was aimed at analyzing the correlations between the occurrence of seropositive and serodoubtful reactions in the ELISA test and the presence of DNA-MAP in udder milk. The results suggest that half of the population of animals with positive and doubtful serological responses against Johne's disease are likely to be a potential source of germ transmission into humans. The fact of detecting DNA-MAP in 1/3 of all milk samples points to the likelihood of occurrence of MAP bacilli in milk of animals not displaying seropositive or serodoubtful responses.

  6. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE PAGES

    Ye, Xin; Garikapati, Venu M.; You, Daehyun; ...

    2017-11-08

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and the multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.
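
    Once the nesting is established, the test itself reduces to standard likelihood-ratio mechanics; the log-likelihood values and the number of extra semi-nonparametric terms below are placeholders:

```python
from scipy.stats import chi2

# LR test of the standard-Gumbel MNL against the semi-nonparametric model that
# nests it: twice the log-likelihood gain vs. chi-squared with df = extra terms.
ll_mnl, ll_snp, extra_terms = -1523.4, -1508.9, 4        # placeholder values
lr_stat = 2 * (ll_snp - ll_mnl)
p_value = chi2.sf(lr_stat, df=extra_terms)
print(lr_stat, p_value)   # a small p rejects the Gumbel assumption
```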

  7. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Xin; Garikapati, Venu M.; You, Daehyun

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and the multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.

  8. Zero entropy continuous interval maps and MMLS-MMA property

    NASA Astrophysics Data System (ADS)

    Jiang, Yunping

    2018-06-01

    We prove that the flow generated by any continuous interval map with zero topological entropy is minimally mean-attractable and minimally mean-L-stable. One of the consequences is that any oscillating sequence is linearly disjoint from all flows generated by all continuous interval maps with zero topological entropy. In particular, the Möbius function is linearly disjoint from all flows generated by all continuous interval maps with zero topological entropy (Sarnak’s conjecture for continuous interval maps). Another consequence is a non-trivial example of a flow having discrete spectrum. We also define a log-uniform oscillating sequence and show a result in ergodic theory for comparison. This material is based upon work supported by the National Science Foundation. It is also partially supported by a collaboration grant from the Simons Foundation (grant number 523341) and PSC-CUNY awards and a grant from NSFC (grant number 11571122).

  9. Attitude determination and calibration using a recursive maximum likelihood-based adaptive Kalman filter

    NASA Technical Reports Server (NTRS)

    Kelly, D. A.; Fermelia, A.; Lee, G. K. F.

    1990-01-01

    An adaptive Kalman filter design that utilizes recursive maximum likelihood parameter identification is discussed. At the center of this design is the Kalman filter itself, which has the responsibility for attitude determination. At the same time, the identification algorithm is continually identifying the system parameters. The approach is applicable to nonlinear, as well as linear systems. This adaptive Kalman filter design has much potential for real time implementation, especially considering the fast clock speeds, cache memory and internal RAM available today. The recursive maximum likelihood algorithm is discussed in detail, with special attention directed towards its unique matrix formulation. The procedure for using the algorithm is described along with comments on how this algorithm interacts with the Kalman filter.
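
    One common concrete realisation of the idea is innovation-based noise adaptation; the covariance-matching update and the forgetting factor below are stand-ins for the recursive maximum likelihood identification described above, chosen only to keep the sketch short:

```python
import numpy as np

# Scalar-measurement constant-velocity Kalman filter whose measurement-noise
# variance R is re-estimated online from the innovation sequence.
rng = np.random.default_rng(2)
dt, T = 1.0, 200
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-3 * np.eye(2)
R_true, R_hat = 4.0, 1.0                  # filter starts with the wrong R
lam = 0.98                                # forgetting factor (assumed)
x_true = np.zeros(2); x = np.zeros(2); P = np.eye(2)
for t in range(T):
    x_true = F @ x_true + rng.multivariate_normal([0, 0], Q)
    z = H @ x_true + rng.normal(0, np.sqrt(R_true))
    x = F @ x; P = F @ P @ F.T + Q        # predict
    nu = z - H @ x                        # innovation
    hph = (H @ P @ H.T).item()
    R_hat = max(lam * R_hat + (1 - lam) * ((nu**2).item() - hph), 1e-6)
    K = (P @ H.T) / (hph + R_hat)         # gain with the adapted R
    x = x + (K * nu).ravel()
    P = (np.eye(2) - K @ H) @ P
print(R_hat)                              # should approach R_true
```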

  10. 40 CFR 146.9 - Criteria for establishing permitting priorities.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS (CONTINUED) UNDERGROUND INJECTION CONTROL PROGRAM: CRITERIA AND STANDARDS General Provisions § 146...) Likelihood of contamination of underground sources of drinking water; (d) Potentially affected population; (e...

  11. 40 CFR 146.9 - Criteria for establishing permitting priorities.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) UNDERGROUND INJECTION CONTROL PROGRAM: CRITERIA AND STANDARDS General Provisions § 146...) Likelihood of contamination of underground sources of drinking water; (d) Potentially affected population; (e...

  12. 40 CFR 146.9 - Criteria for establishing permitting priorities.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) UNDERGROUND INJECTION CONTROL PROGRAM: CRITERIA AND STANDARDS General Provisions § 146...) Likelihood of contamination of underground sources of drinking water; (d) Potentially affected population; (e...

  13. 40 CFR 146.9 - Criteria for establishing permitting priorities.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) UNDERGROUND INJECTION CONTROL PROGRAM: CRITERIA AND STANDARDS General Provisions § 146...) Likelihood of contamination of underground sources of drinking water; (d) Potentially affected population; (e...

  14. The random continued fraction transformation

    NASA Astrophysics Data System (ADS)

    Kalle, Charlene; Kempton, Tom; Verbitskiy, Evgeny

    2017-03-01

    We introduce a random dynamical system related to continued fraction expansions. It uses random combinations of the Gauss map and the Rényi (or backwards) continued fraction map. We explore the continued fraction expansions that this system produces, as well as the dynamical properties of the system.
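
    The system itself is easy to simulate; a minimal sketch, with the digit bookkeeping of the resulting continued fraction expansions omitted:

```python
import numpy as np

# Random composition of the Gauss map G(x) = 1/x mod 1 and the Rényi
# (backwards) map R(x) = 1/(1 - x) mod 1, applied with probability p and 1 - p.
rng = np.random.default_rng(3)

def gauss(x):
    return (1.0 / x) % 1.0

def renyi(x):
    return (1.0 / (1.0 - x)) % 1.0

def random_orbit(x0, n, p=0.5):
    x, orbit = x0, []
    for _ in range(n):
        x = gauss(x) if rng.random() < p else renyi(x)   # random choice of map
        orbit.append(x)
    return orbit

print(random_orbit(np.sqrt(2) - 1, 10))   # irrational start avoids division by zero
```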

  15. Methodology and method and apparatus for signaling with capacity optimized constellations

    NASA Technical Reports Server (NTRS)

    Barsoum, Maged F. (Inventor); Jones, Christopher R. (Inventor)

    2011-01-01

    A communication system includes a transmitter with a coder configured to receive user bits and output encoded bits at an expanded output encoded bit rate, a mapper configured to map encoded bits to symbols in a symbol constellation, and a modulator configured to generate a signal for transmission via the communication channel using symbols generated by the mapper. In addition, the receiver includes a demodulator configured to demodulate the received signal via the communication channel, a demapper configured to estimate likelihoods from the demodulated signal, and a decoder configured to estimate decoded bits from the likelihoods generated by the demapper. Furthermore, the symbol constellation is a capacity-optimized, geometrically spaced symbol constellation that provides a given capacity at a reduced signal-to-noise ratio compared to a signal constellation that maximizes d_min.

  16. Linkage analysis by genotyping of sibling populations: a genetic map for the potato cyst nematode constructed using a "pseudo-F2" mapping strategy.

    PubMed

    Rouppe van der Voort, J N; van Eck, H J; van Zandvoort, P M; Overmars, H; Helder, J; Bakker, J

    1999-07-01

    A mapping strategy is described for the construction of a linkage map of a non-inbred species in which individual offspring genotypes are not amenable to marker analysis. After one extra generation of random mating, the segregating progeny was propagated, and bulked populations of offspring were analyzed. Although the resulting population structure is different from that of commonly used mapping populations, we show that the maximum likelihood formula for a normal F2 is applicable for the estimation of recombination. This "pseudo-F2" mapping strategy, in combination with the development of an AFLP assay for single cysts, facilitated the construction of a linkage map for the potato cyst nematode Globodera rostochiensis. Using 12 pre-selected AFLP primer combinations, a total of 66 segregating markers were identified, 62 of which were mapped to nine linkage groups. These 62 AFLP markers are randomly distributed and cover about 65% of the genome. An estimate of the physical size of the Globodera genome was obtained from comparisons of the number of AFLP fragments obtained with the values for Caenorhabditis elegans. The methodology presented here resulted in the first genomic map for a cyst nematode. The low value of the kilobase/centimorgan (kb/cM) ratio for the Globodera genome will facilitate map-based cloning of genes that mediate the interaction between the nematode and its host plant.
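
    For intuition, the normal-F2 maximum likelihood estimate of the recombination fraction can be computed numerically from two-locus class counts; the counts below are invented, and treating gamete pairs as directly observable is a simplification of the bulked-population analysis described above:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# F2 likelihood for two linked codominant markers: F1 gametes AB/ab are parental
# with probability (1-r)/2 each, Ab/aB recombinant with probability r/2 each.
def f2_probs(r):
    g = {"AB": (1 - r) / 2, "ab": (1 - r) / 2, "Ab": r / 2, "aB": r / 2}
    probs = {}
    for g1, p1 in g.items():
        for g2, p2 in g.items():
            k = tuple(sorted((g1, g2)))       # unordered gamete pair = class
            probs[k] = probs.get(k, 0.0) + p1 * p2
    return probs

def neg_loglik(r, counts):
    probs = f2_probs(r)
    return -sum(n * np.log(probs[k]) for k, n in counts.items())

def key(a, b):
    return tuple(sorted((a, b)))

counts = {key("AB", "AB"): 40, key("AB", "ab"): 78, key("ab", "ab"): 42,
          key("AB", "aB"): 12, key("Ab", "ab"): 10, key("Ab", "aB"): 3}
res = minimize_scalar(neg_loglik, bounds=(1e-4, 0.5), args=(counts,), method="bounded")
print(res.x)   # maximum likelihood recombination fraction
```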

  17. Automatic Modulation Classification of Common Communication and Pulse Compression Radar Waveforms using Cyclic Features

    DTIC Science & Technology

    2013-03-01

    intermediate frequency; LFM, linear frequency modulation; MAP, maximum a posteriori; MATLAB®, matrix laboratory; ML, maximum likelihood; OFDM, orthogonal frequency...spectrum, frequency hopping, and orthogonal frequency division multiplexing (OFDM) modulations. Feature analysis would be a good research thrust to...determine feature relevance and decide if removing any features improves performance. Also, extending the system for simulations using a MIMO receiver or...

  18. Genetic Modifiers of Ovarian Cancer

    DTIC Science & Technology

    2014-08-01

    samples from many countries. To account for population stratification, the genotyping data in combination with HapMap data (CEU, Yoruban, Han Chinese...Cambridge, we evaluated associations with both breast and ovarian cancer using a retrospective likelihood model. This accounts for the age extremes of...carriers we used a competing risk analysis that accounted for the effects on breast and ovarian cancer in parallel. In this competing risk analysis

  19. Integrated Efforts for Analysis of Geophysical Measurements and Models.

    DTIC Science & Technology

    1997-09-26

    This contract supported investigations of integrated applications of physics, ephemerides... [table-of-contents fragments omitted; section titles include "Regions and GPS Data Validations" and "PL-SCINDA: Visualization and Analysis Techniques"] ...and IR data, about cloudy pixels. Clustering and maximum likelihood classification algorithms categorize up to four cloud layers into stratiform or...

  20. WE-AB-BRA-05: Fully Automatic Segmentation of Male Pelvic Organs On CT Without Manual Intervention

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Y; Lian, J; Chen, R

    Purpose: We aim to develop a fully automatic tool for accurate contouring of major male pelvic organs in CT images for radiotherapy, without any manual initialization, yet still achieving superior performance to existing tools. Methods: A learning-based 3D deformable shape model was developed for automatic contouring. Specifically, we utilized a recent machine learning method, random forest, to jointly learn both an image regressor and a classifier for each organ. In particular, the image regressor is trained to predict the 3D displacement from each vertex of the 3D shape model towards the organ boundary based on the local image appearance around the location of this vertex. The predicted 3D displacements are then used to drive the 3D shape model towards the target organ. Once the shape model is deformed close to the target organ, it is further refined by an organ likelihood map estimated by the learned classifier. As the organ likelihood map provides a good guideline for the organ boundary, a precise contouring result can be achieved by deforming the 3D shape model locally to fit boundaries in the organ likelihood map. Results: We applied our method to 29 previously-treated prostate cancer patients, each with one planning CT scan. Compared with manually delineated pelvic organs, our method obtains overlap ratios of 85.2%±3.74% for the prostate, 94.9%±1.62% for the bladder, and 84.7%±1.97% for the rectum, respectively. Conclusion: This work demonstrated the feasibility of a novel machine-learning based approach for accurate and automatic contouring of major male pelvic organs. It shows the potential to replace the time-consuming and inconsistent manual contouring in the clinic. Also, compared with existing works, our method is more accurate and also efficient, since it does not require any manual intervention, such as manual landmark placement. Moreover, our method obtained very similar contouring results as the clinical experts. The project is partially supported by a grant from NCI (1R01CA140413).

  1. Efficient, adaptive estimation of two-dimensional firing rate surfaces via Gaussian process methods.

    PubMed

    Rad, Kamiar Rahnama; Paninski, Liam

    2010-01-01

    Estimating two-dimensional firing rate maps is a common problem, arising in a number of contexts: the estimation of place fields in hippocampus, the analysis of temporally nonstationary tuning curves in sensory and motor areas, the estimation of firing rates following spike-triggered covariance analyses, etc. Here we introduce methods based on Gaussian process nonparametric Bayesian techniques for estimating these two-dimensional rate maps. These techniques offer a number of advantages: the estimates may be computed efficiently, come equipped with natural error bars, adapt their smoothness automatically to the local density and informativeness of the observed data, and permit direct fitting of the model hyperparameters (e.g., the prior smoothness of the rate map) via maximum marginal likelihood. We illustrate the method's flexibility and performance on a variety of simulated and real data.
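
    A rough sketch of the approach with standard tooling, assuming a Gaussian likelihood on square-root counts in place of the full Poisson model (the hyperparameters are still fit by maximising the marginal likelihood):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# GP estimate of a 2D firing-rate surface from binned spike counts at visited
# positions; kernel hyperparameters (prior smoothness) are optimised by
# maximum marginal likelihood inside fit().
rng = np.random.default_rng(4)
xy = rng.uniform(0, 1, size=(400, 2))                       # visited positions
rate = 20 * np.exp(-((xy - 0.5) ** 2).sum(axis=1) / 0.05)   # hypothetical place field
counts = rng.poisson(rate * 0.1)                            # spikes per 100-ms bin

kernel = 1.0 * RBF(length_scale=0.2) + WhiteKernel(noise_level=0.5)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(xy, np.sqrt(counts))

grid = np.stack(np.meshgrid(np.linspace(0, 1, 50),
                            np.linspace(0, 1, 50)), -1).reshape(-1, 2)
mean, std = gp.predict(grid, return_std=True)   # posterior mean and error bars
rate_map = (mean ** 2).reshape(50, 50)          # back to the rate scale
```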

  2. AFLP-based genetic mapping of the “bud-flowering” trait in heather (Calluna vulgaris)

    PubMed Central

    2013-01-01

    Background Calluna vulgaris is one of the most important landscaping plants produced in Germany. Its enormous economic success is due to the prolonged flower attractiveness of mutants in flower morphology, the so-called bud-bloomers. In this study, we present the first genetic linkage map of C. vulgaris in which we mapped a locus of the economically highly desired trait “flower type”. Results The map was constructed in JoinMap 4.1. using 535 AFLP markers from a single mapping population. A large fraction (40%) of markers showed distorted segregation. To test the effect of segregation distortion on linkage estimation, these markers were sorted regarding their segregation ratio and added in groups to the data set. The plausibility of group formation was evaluated by comparison of the “two-way pseudo-testcross” and the “integrated” mapping approach. Furthermore, regression mapping was compared to the multipoint-likelihood algorithm. The majority of maps constructed by different combinations of these methods consisted of eight linkage groups corresponding to the chromosome number of C. vulgaris. Conclusions All maps confirmed the independent inheritance of the most important horticultural traits “flower type”, “flower colour”, and “leaf colour”. An AFLP marker for the most important breeding target “flower type” was identified. The presented genetic map of C. vulgaris can now serve as a basis for further molecular marker selection and map-based cloning of the candidate gene encoding the unique flower architecture of C. vulgaris bud-bloomers. PMID:23915059

  3. Additive hazards regression and partial likelihood estimation for ecological monitoring data across space.

    PubMed

    Lin, Feng-Chang; Zhu, Jun

    2012-01-01

    We develop continuous-time models for the analysis of environmental or ecological monitoring data such that subjects are observed at multiple monitoring time points across space. Of particular interest are additive hazards regression models where the baseline hazard function can take on flexible forms. We consider time-varying covariates and take into account spatial dependence via autoregression in space and time. We develop statistical inference for the regression coefficients via partial likelihood. Asymptotic properties, including consistency and asymptotic normality, are established for parameter estimates under suitable regularity conditions. Feasible algorithms utilizing existing statistical software packages are developed for computation. We also consider a simpler additive hazards model with homogeneous baseline hazard and develop hypothesis testing for homogeneity. A simulation study demonstrates that the statistical inference using partial likelihood has sound finite-sample properties and offers a viable alternative to maximum likelihood estimation. For illustration, we analyze data from an ecological study that monitors bark beetle colonization of red pines in a plantation of Wisconsin.

  4. Epidemiologic programs for computers and calculators. A microcomputer program for multiple logistic regression by unconditional and conditional maximum likelihood methods.

    PubMed

    Campos-Filho, N; Franco, E L

    1989-02-01

    A frequent procedure in matched case-control studies is to report results from the multivariate unmatched analyses if they do not differ substantially from the ones obtained after conditioning on the matching variables. Although conceptually simple, this rule requires that an extensive series of logistic regression models be evaluated by both the conditional and unconditional maximum likelihood methods. Most computer programs for logistic regression employ only one maximum likelihood method, which requires that the analyses be performed in separate steps. This paper describes a Pascal microcomputer (IBM PC) program that performs multiple logistic regression by both maximum likelihood estimation methods, which obviates the need for switching between programs to obtain relative risk estimates from both matched and unmatched analyses. The program calculates most standard statistics and allows factoring of categorical or continuous variables by two distinct methods of contrast. A built-in, descriptive statistics option allows the user to inspect the distribution of cases and controls across categories of any given variable.
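
    The unconditional fit is a small optimisation problem; a sketch on simulated data (the conditional, matched-set likelihood replaces the sum below with one term per matched set, conditioning on the number of cases in each set):

```python
import numpy as np
from scipy.optimize import minimize

# Unconditional maximum likelihood logistic regression via direct optimisation.
rng = np.random.default_rng(5)
X = np.column_stack([np.ones(500), rng.normal(size=(500, 2))])
beta_true = np.array([-0.5, 1.0, -0.8])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))

def neg_loglik(beta):
    eta = X @ beta
    return -np.sum(y * eta - np.log1p(np.exp(eta)))   # Bernoulli log-likelihood

fit = minimize(neg_loglik, np.zeros(3), method="BFGS")
print(fit.x)        # ML estimates; fit.hess_inv approximates their covariance
```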

  5. Mapping of uncertainty relations between continuous and discrete time

    NASA Astrophysics Data System (ADS)

    Chiuchiù, Davide; Pigolotti, Simone

    2018-03-01

    Lower bounds on fluctuations of thermodynamic currents depend on the nature of time, discrete or continuous. To understand the physical reason, we compare current fluctuations in discrete-time Markov chains and continuous-time master equations. We prove that current fluctuations in the master equations are always more likely, due to random timings of transitions. This comparison leads to a mapping of the moments of a current between discrete and continuous time. We exploit this mapping to obtain uncertainty bounds. Our results reduce the quests for uncertainty bounds in discrete and continuous time to a single problem.

  6. Mapping of uncertainty relations between continuous and discrete time.

    PubMed

    Chiuchiù, Davide; Pigolotti, Simone

    2018-03-01

    Lower bounds on fluctuations of thermodynamic currents depend on the nature of time, discrete or continuous. To understand the physical reason, we compare current fluctuations in discrete-time Markov chains and continuous-time master equations. We prove that current fluctuations in the master equations are always more likely, due to random timings of transitions. This comparison leads to a mapping of the moments of a current between discrete and continuous time. We exploit this mapping to obtain uncertainty bounds. Our results reduce the quests for uncertainty bounds in discrete and continuous time to a single problem.
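
    The effect is easy to check numerically in the simplest setting, a biased random walk with a fixed number of steps (discrete time) versus a Poisson-distributed number of jumps with the same mean (continuous time); the bias and step counts below are illustrative:

```python
import numpy as np

# Net displacement (an integrated current) after n steps vs. after a Poisson(n)
# number of jumps: the random timings add variance in the continuous-time case.
rng = np.random.default_rng(8)
p_fwd, n_steps, trials = 0.7, 1000, 5000

disc = 2 * rng.binomial(n_steps, p_fwd, size=trials) - n_steps        # discrete time
n_jumps = rng.poisson(n_steps, size=trials)
cont = np.array([2 * rng.binomial(n, p_fwd) - n for n in n_jumps])    # continuous time

print(disc.var(), cont.var())   # continuous-time current fluctuates more
```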

  7. Heralded processes on continuous-variable spaces as quantum maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreyrol, Franck; Spagnolo, Nicolò; Blandino, Rémi

    2014-12-04

    Heralded processes, which only work when a measurement on part of the system gives the desired result, are particularly interesting for continuous variables. They permit non-Gaussian transformations that are necessary for several continuous-variable quantum information tasks. However, while maps and quantum process tomography are commonly used to describe quantum transformations in discrete-variable spaces, they are much rarer in the continuous-variable domain. Also, no convenient tool for representing maps in a way more adapted to the particularities of continuous variables has yet been explored. In this paper we try to fill this gap by presenting such a tool.

  8. A Maximum Likelihood Approach for Multisample Nonlinear Structural Equation Models with Missing Continuous and Dichotomous Data

    ERIC Educational Resources Information Center

    Song, Xin-Yuan; Lee, Sik-Yum

    2006-01-01

    Structural equation models are widely appreciated in social-psychological research and other behavioral research to model relations between latent constructs and manifest variables and to control for measurement error. Most applications of SEMs are based on fully observed continuous normal data and models with a linear structural equation.…

  9. Understanding Evaluation of Learning Support in Mathematics and Statistics

    ERIC Educational Resources Information Center

    MacGillivray, Helen; Croft, Tony

    2011-01-01

    With rapid and continuing growth of learning support initiatives in mathematics and statistics found in many parts of the world, and with the likelihood that this trend will continue, there is a need to ensure that robust and coherent measures are in place to evaluate the effectiveness of these initiatives. The nature of learning support brings…

  10. Identification and Mapping of Tree Species in Urban Areas Using WORLDVIEW-2 Imagery

    NASA Astrophysics Data System (ADS)

    Mustafa, Y. T.; Habeeb, H. N.; Stein, A.; Sulaiman, F. Y.

    2015-10-01

    Monitoring and mapping of urban trees are essential to provide urban forestry authorities with timely and consistent information. Modern techniques increasingly facilitate these tasks, but require the development of semi-automatic tree detection and classification methods. In this article, we propose an approach to delineate and map the crowns of 15 tree species in the city of Duhok, Kurdistan Region of Iraq, using WorldView-2 (WV-2) imagery. A tree crown object is identified first and is subsequently delineated as an image object (IO) using vegetation indices and texture measurements. Next, three classification methods: Maximum Likelihood, Neural Network, and Support Vector Machine, were used to classify IOs using selected IO features. The best results are obtained with Support Vector Machine classification, which gives the most accurate map of urban tree species in Duhok. The overall accuracy was between 60.93% and 88.92%, and the κ-coefficient was between 0.57 and 0.75. We conclude that fifteen tree species were identified and mapped at a satisfactory accuracy in urban areas of this study.

  11. SModelS v1.1 user manual: Improving simplified model constraints with efficiency maps

    NASA Astrophysics Data System (ADS)

    Ambrogi, Federico; Kraml, Sabine; Kulkarni, Suchita; Laa, Ursula; Lessa, Andre; Magerl, Veronika; Sonneveld, Jory; Traub, Michael; Waltenberger, Wolfgang

    2018-06-01

    SModelS is an automatized tool for the interpretation of simplified model results from the LHC. It allows one to decompose models of new physics obeying a Z2 symmetry into simplified model components, and to compare these against a large database of experimental results. The first release of SModelS, v1.0, used only cross section upper limit maps provided by the experimental collaborations. In this new release, v1.1, we extend the functionality of SModelS to efficiency maps. This increases the constraining power of the software, as efficiency maps allow one to combine contributions to the same signal region from different simplified models. Other new features of version 1.1 include likelihood and χ² calculations, extended information on the topology coverage, an extended database of experimental results, as well as major speed upgrades for both the code and the database. We describe in detail the concepts and procedures used in SModelS v1.1, explaining in particular how upper limits and efficiency map results are dealt with in parallel. Detailed instructions for code usage are also provided.
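
    The kind of likelihood an efficiency-map entry enables can be sketched as a Poisson signal-region count with an uncertain background; all numbers below are invented, and SModelS's actual likelihood construction should be taken from its documentation rather than from this sketch:

```python
import numpy as np
from scipy.stats import poisson, norm
from scipy.optimize import minimize_scalar

# Signal-region likelihood: n_obs ~ Poisson(sigma * lumi * eff + b), with the
# background b marginalised over a Gaussian uncertainty on a coarse grid.
n_obs, b, db = 7, 4.2, 1.1          # observed events, background and its error
lumi, eff = 36.0, 0.013             # luminosity (/fb) and efficiency-map entry

def neg_loglik(sigma):
    bs = np.linspace(max(b - 4 * db, 0.01), b + 4 * db, 200)
    like = poisson.pmf(n_obs, sigma * lumi * eff + bs) * norm.pdf(bs, b, db)
    return -np.log(like.sum())

res = minimize_scalar(neg_loglik, bounds=(0.0, 50.0), method="bounded")
print(res.x)   # best-fit signal cross section in fb
```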

  12. Planck 2013 results. XV. CMB power spectra and likelihood

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Chiang, L.-Y.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Gaier, T. C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jewell, J.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Laureijs, R. J.; Lawrence, C. R.; Le Jeune, M.; Leach, S.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I. J.; Orieux, F.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. 
A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; White, M.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-11-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best estimate of the CMB angular power spectrum from Planck over three decades in multipole moment, ℓ, covering 2 ≤ ℓ ≤ 2500. The main source of uncertainty at ℓ ≲ 1500 is cosmic variance. Uncertainties in small-scale foreground modelling and instrumental noise dominate the error budget at higher ℓs. For ℓ < 50, our likelihood exploits all Planck frequency channels from 30 to 353 GHz, separating the cosmological CMB signal from diffuse Galactic foregrounds through a physically motivated Bayesian component separation technique. At ℓ ≥ 50, we employ a correlated Gaussian likelihood approximation based on a fine-grained set of angular cross-spectra derived from multiple detector combinations between the 100, 143, and 217 GHz frequency channels, marginalising over power spectrum foreground templates. We validate our likelihood through an extensive suite of consistency tests, and assess the impact of residual foreground and instrumental uncertainties on the final cosmological parameters. We find good internal agreement among the high-ℓ cross-spectra with residuals below a few μK² at ℓ ≲ 1000, in agreement with estimated calibration uncertainties. We compare our results with foreground-cleaned CMB maps derived from all Planck frequencies, as well as with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. We further show that the best-fit ΛCDM cosmology is in excellent agreement with preliminary Planck EE and TE polarisation spectra. We find that the standard ΛCDM cosmology is well constrained by Planck from the measurements at ℓ ≲ 1500. One specific example is the spectral index of scalar perturbations, for which we report a 5.4σ deviation from scale invariance, n_s = 1. Increasing the multipole range beyond ℓ ≃ 1500 does not increase our accuracy for the ΛCDM parameters, but instead allows us to study extensions beyond the standard model. We find no indication of significant departures from the ΛCDM framework. Finally, we report a tension between the Planck best-fit ΛCDM model and the low-ℓ spectrum in the form of a power deficit of 5-10% at ℓ ≲ 40, with a statistical significance of 2.5-3σ. Without a theoretically motivated model for this power deficit, we do not elaborate further on its cosmological implications, but note that this is our most puzzling finding in an otherwise remarkably consistent data set.

  13. Cosmic Microwave Background Anisotropy Measurement from Python V

    NASA Astrophysics Data System (ADS)

    Coble, K.; Dodelson, S.; Dragovan, M.; Ganga, K.; Knox, L.; Kovac, J.; Ratra, B.; Souradeep, T.

    2003-02-01

    We analyze observations of the microwave sky made with the Python experiment in its fifth year of operation at the Amundsen-Scott South Pole Station in Antarctica. After modeling the noise and constructing a map, we extract the cosmic signal from the data. We simultaneously estimate the angular power spectrum in eight bands ranging from large (l~40) to small (l~260) angular scales, with power detected in the first six bands. There is a significant rise in the power spectrum from large to smaller (l~200) scales, consistent with that expected from acoustic oscillations in the early universe. We compare this Python V map to a map made from data taken in the third year of Python. Python III observations were made at a frequency of 90 GHz and covered a subset of the region of the sky covered by Python V observations, which were made at 40 GHz. Good agreement is obtained both visually (with a filtered version of the map) and via a likelihood ratio test.

  14. The Planck Legacy Archive

    NASA Astrophysics Data System (ADS)

    Dupac, X.; Arviset, C.; Fernandez Barreiro, M.; Lopez-Caniego, M.; Tauber, J.

    2015-12-01

    In 2015, the Planck Collaboration released its second major dataset through the Planck Legacy Archive (PLA). It includes cosmological, Extragalactic and Galactic science data in temperature (intensity) and polarization. Full-sky maps are provided with unprecedented angular resolution and sensitivity, together with a large number of ancillary maps, catalogues (generic, SZ clusters and Galactic cold clumps), time-ordered data and other information. The extensive cosmological likelihood package allows cosmologists to fully explore the plausible parameters of the Universe. A new web-based PLA user interface has been public since Dec. 2014, allowing easier and faster access to all Planck data and replacing the previous Java-based software. Numerous additional improvements to the PLA are also being developed through the so-called PLA Added-Value Interface, making use of an external contract with the Planetek Hellas and Expert Analytics software companies. This will allow users to process time-ordered data into sky maps, separate astrophysical components in existing maps, simulate the microwave and infrared sky through the Planck Sky Model, and use a number of other functionalities.

  15. Automatic mapping of event landslides at basin scale in Taiwan using a Montecarlo approach and synthetic land cover fingerprints

    NASA Astrophysics Data System (ADS)

    Mondini, Alessandro C.; Chang, Kang-Tsung; Chiang, Shou-Hao; Schlögel, Romy; Notarnicola, Claudia; Saito, Hitoshi

    2017-12-01

    We propose a framework to systematically generate event landslide inventory maps from satellite images in southern Taiwan, where landslides are frequent and abundant. The spectral information is used to assess the pixel land cover class membership probability through a Maximum Likelihood classifier trained with randomly generated synthetic land cover spectral fingerprints, which are obtained from an independent training images dataset. Pixels are classified as landslides when the calculated landslide class membership probability, weighted by a susceptibility model, is higher than the membership probabilities of the other classes. We generated synthetic fingerprints from two FORMOSAT-2 images acquired in 2009 and tested the procedure on two other images, one from 2005 and the other from 2009. We also obtained two landslide maps through manual interpretation. The agreement between the two sets of inventories is given by Cohen's κ coefficients of 0.62 and 0.64, respectively. This procedure can now classify a new FORMOSAT-2 image automatically, facilitating the production of landslide inventory maps.

  16. Comparison of two Classification methods (MLC and SVM) to extract land use and land cover in Johor Malaysia

    NASA Astrophysics Data System (ADS)

    Rokni Deilmai, B.; Ahmad, B. Bin; Zabihi, H.

    2014-06-01

    Mapping is essential for the analysis of land use and land cover, which influence many environmental processes and properties. For the creation of land cover maps, it is important to minimize error, because errors will propagate into later analyses based on these maps. The reliability of land cover maps derived from remotely sensed data depends on an accurate classification. In this study, we analyzed multispectral data using two different classifiers: the Maximum Likelihood Classifier (MLC) and the Support Vector Machine (SVM). To pursue this aim, Landsat Thematic Mapper data and identical field-based training sample datasets in Johor, Malaysia, were used for each classification method, yielding five land cover classes: forest, oil palm, urban area, water, and rubber. Classification results indicate that SVM was more accurate than MLC. With a demonstrated capability to produce reliable results, SVM methods should be especially useful for land cover classification.
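
    The comparison is easy to reproduce in outline; a Gaussian maximum likelihood classifier is equivalent to quadratic discriminant analysis, and the simulated band values below are stand-ins for the Landsat spectra:

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Five synthetic spectral classes in six bands; compare MLC (QDA) with an SVM.
rng = np.random.default_rng(6)
n_classes, n_bands = 5, 6
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(200, n_bands))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), 200)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

mlc = QuadraticDiscriminantAnalysis().fit(Xtr, ytr)   # Gaussian ML classifier
svm = SVC(kernel="rbf", C=10.0).fit(Xtr, ytr)
print("MLC accuracy:", accuracy_score(yte, mlc.predict(Xte)))
print("SVM accuracy:", accuracy_score(yte, svm.predict(Xte)))
```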

  17. Suicide Prevention

    MedlinePlus

    ... association between early recovery and increased likelihood of suicide. As depression begins to lift, a person's energy and planning ... encourage continued treatment. Talk about suicide. Talking about ... with depression can provide valuable perspective and reassurance to your ...

  18. State space modeling of time-varying contemporaneous and lagged relations in connectivity maps.

    PubMed

    Molenaar, Peter C M; Beltz, Adriene M; Gates, Kathleen M; Wilson, Stephen J

    2016-01-15

    Most connectivity mapping techniques for neuroimaging data assume stationarity (i.e., network parameters are constant across time), but this assumption does not always hold true. The authors provide a description of a new approach for simultaneously detecting time-varying (or dynamic) contemporaneous and lagged relations in brain connectivity maps. Specifically, they use a novel raw data likelihood estimation technique (involving a second-order extended Kalman filter/smoother embedded in a nonlinear optimizer) to determine the variances of the random walks associated with state space model parameters and their autoregressive components. The authors illustrate their approach with simulated and blood oxygen level-dependent functional magnetic resonance imaging data from 30 daily cigarette smokers performing a verbal working memory task, focusing on seven regions of interest (ROIs). Twelve participants had dynamic directed functional connectivity maps: Eleven had one or more time-varying contemporaneous ROI state loadings, and one had a time-varying autoregressive parameter. Compared to smokers without dynamic maps, smokers with dynamic maps performed the task with greater accuracy. Thus, accurate detection of dynamic brain processes is meaningfully related to behavior in a clinical sample. Published by Elsevier Inc.

  19. Map of assessed continuous (unconventional) oil resources in the United States, 2014

    USGS Publications Warehouse

    ,; Biewick, Laura R. H.

    2015-01-01

    The U.S. Geological Survey (USGS) conducts quantitative assessments of potential oil and gas resources of the onshore United States and associated coastal State waters. Since 2000, the USGS has completed assessments of continuous (unconventional) resources in the United States based on geologic studies and analysis of well-production data and has compiled digital maps of the assessment units classified into four categories: shale gas, tight gas, coalbed gas, and shale oil or tight oil (continuous oil). This is the fourth digital map product in a series of USGS unconventional oil and gas resource maps; its focus is shale-oil or tight-oil (continuous-oil) assessments. The map plate included in this report can be printed in hardcopy form or downloaded in a Geographic Information System (GIS) data package, which includes an ArcGIS ArcMap document (.mxd), geodatabase (.gdb), and a published map file (.pmf). Supporting geologic studies of total petroleum systems and assessment units, as well as studies of the methodology used in the assessment of continuous-oil resources in the United States, are listed with hyperlinks in table 1. Assessment results and geologic reports are available at the USGS website http://energy.usgs.gov/OilGas/AssessmentsData/NationalOilGasAssessment.aspx.

  20. Sparsity-constrained PET image reconstruction with learned dictionaries

    NASA Astrophysics Data System (ADS)

    Tang, Jing; Yang, Bao; Wang, Yanhua; Ying, Leslie

    2016-09-01

    PET imaging plays an important role in scientific and clinical measurement of biochemical and physiological processes. Model-based PET image reconstruction, such as the iterative expectation maximization algorithm seeking the maximum likelihood solution, leads to increased noise at higher iterations. The maximum a posteriori (MAP) estimate removes this divergence at higher iterations. However, a conventional smoothing prior or a total-variation (TV) prior in a MAP reconstruction algorithm causes over-smoothing or blocky artifacts in the reconstructed images. We propose to use dictionary learning (DL) based sparse signal representation to form the prior for MAP PET image reconstruction. The dictionary used to sparsify the PET images during reconstruction is learned from various training images, including the corresponding MR structural image and a self-created hollow sphere. Using simulated and patient brain PET data with corresponding MR images, we study the performance of the DL-MAP algorithm and compare it quantitatively with a conventional MAP algorithm, a TV-MAP algorithm, and a patch-based algorithm. The DL-MAP algorithm achieves improved bias and contrast (or regional mean values) at noise comparable to that of the other MAP algorithms. The dictionary learned from the hollow sphere leads to results similar to those from the dictionary learned from the corresponding MR image. Achieving robust performance in various noise-level simulations and patient studies, the DL-MAP algorithm with a general dictionary demonstrates its potential for quantitative PET imaging.
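
    A generic form of the dictionary-sparsity MAP objective sketched above (the notation is ours, not necessarily the paper's):

        \hat{x} = \arg\max_{x \ge 0} \; L(y \mid x) \;-\; \beta \sum_{j} \bigl\| R_j x - D \alpha_j \bigr\|_2^2,
        \qquad \text{subject to } \|\alpha_j\|_0 \le T \ \text{for all } j,

    where L(y | x) is the Poisson log-likelihood of the measured sinogram y given image x, R_j extracts the j-th image patch, D is the learned dictionary, and the α_j are sparse patch codes.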

  1. A radiation hybrid map of the distal short arm of human chromosome 11, containing the Beckwith-Wiedemann and associated embryonal tumor disease loci

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard, C.W. III; Berg, D.J.; Meeker, T.C.

    1993-05-01

    The authors describe a high-resolution radiation hybrid (RH) map of the distal short arm of human chromosome 11 containing the Beckwith-Wiedemann gene and the associated embryonal tumor disease loci. Thirteen human 11p15 genes and 17 new anonymous probes were mapped by a statistical analysis of the cosegregation of markers in 102 rodent-human radiation hybrids retaining fragments of human chromosome 11. The 17 anonymous probes were generated from lambda phage containing human 11p15.5 inserts, using ALU-PCR. A comprehensive map of all 30 loci and a framework map of nine clusters of loci ordered at odds of 1,000:1 were constructed by a multipoint maximum-likelihood approach using the computer program RHMAP. This RH map localizes one new gene to chromosome 11p15 (WEE1), provides more precise order information for several 11p15 genes (CTSD, H19, HPX, ST5, RNH, and SMPD1), confirms previous map orders for other 11p15 genes (CALCA, PTH, HBBC, TH, HRAS, and DRD4), and maps 17 new anonymous probes within the 11p15.5 region. This RH map should prove useful in better defining the positions of the Beckwith-Wiedemann and associated embryonal tumor disease-gene loci. 41 refs., 1 fig., 2 tabs.

  2. Toward fully automated processing of dynamic susceptibility contrast perfusion MRI for acute ischemic cerebral stroke.

    PubMed

    Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T

    2010-05-01

    We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24 h after onset of symptoms from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow and cerebral blood volume, and the mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms including the piecewise continuous regression method to determine bolus arrival time (BAT), log-linear curve fitting, an arrival-time-independent deconvolution method, and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with BAT estimated by a single observer. In addition, gamma-variate curve-fitting errors of the AIF and inter-subject variability of AIFs were analyzed. Lastly, two observers independently assessed the quality and area of hypoperfusion mismatched with the restricted diffusion area from motion-corrected MTT maps and compared them with time-to-peak (TTP) maps generated using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9 +/- 2.5 s (mean +/- s.d.) without motion correction and 267 +/- 80 s (mean +/- s.d.) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise of the PWI when compared with the TTP map. Results of image quality assessment by two observers revealed that the MTT maps exhibited superior quality over the TTP maps (88% good rating of MTT as compared to 68% of TTP). Our software allowed fully automated deconvolution analysis of DSC PWI using proven efficient algorithms that can be applied to acute stroke treatment decisions. Our streamlined method also offers promise for further development of automated quantitative analysis of the ischemic penumbra. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.
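
    For concreteness, the gamma-variate model commonly used for DSC bolus curves can be fitted with SciPy as below; the model form is standard, but the parameter values and noise level are invented for illustration and are not from the paper's software:

        import numpy as np
        from scipy.optimize import curve_fit

        def gamma_variate(t, t0, K, alpha, beta):
            """Standard gamma-variate bolus model: zero before arrival time t0."""
            dt = np.clip(t - t0, 0, None)
            return K * dt**alpha * np.exp(-dt / beta)

        # Synthetic concentration-time curve with noise.
        t = np.linspace(0, 60, 120)
        true = gamma_variate(t, 10, 0.5, 2.0, 3.0)
        noisy = true + np.random.default_rng(2).normal(scale=0.05, size=t.size)

        popt, _ = curve_fit(gamma_variate, t, noisy, p0=[8, 1, 1.5, 2.5])
        print("estimated bolus arrival time:", popt[0])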

  3. Using a Novel Spatial Tool to Inform Invasive Species Early Detection and Rapid Response Efforts

    NASA Astrophysics Data System (ADS)

    Davidson, Alisha D.; Fusaro, Abigail J.; Kashian, Donna R.

    2015-07-01

    Management of invasive species has increasingly emphasized the importance of early detection and rapid response (EDRR) programs in limiting introductions, establishment, and impacts. These programs require an understanding of vector and species spatial dynamics to prioritize monitoring sites and efficiently allocate resources. Yet managers often lack the empirical data necessary to make these decisions. We developed an empirical mapping tool that can facilitate development of EDRR programs through identifying high-risk locations, particularly within the recreational boating vector. We demonstrated the utility of this tool in the Great Lakes watershed. We surveyed boaters to identify trips among water bodies and to quantify behaviors associated with high likelihood of species transfer (e.g., not removing organic materials from boat trailers) during that trip. We mapped water bodies with high-risk inbound and outbound boater movements using ArcGIS. We also tested for differences in high-risk behaviors based on demographic variables to understand risk differences among boater groups. Incorporation of boater behavior led to identification of additional high-risk water bodies compared to using the number of trips alone. Therefore, the number of trips itself may not fully reflect the likelihood of invasion. This tool can be broadly applied in other geographic contexts and with different taxa, and can be adjusted according to varying levels of information concerning the vector or species of interest. The methodology is straightforward and can be followed after a basic introduction to ArcGIS software. The visual nature of the mapping tool will facilitate site prioritization by managers and stakeholders from diverse backgrounds.

  4. Enhancing pilot situational awareness of the airport surface movement area

    NASA Technical Reports Server (NTRS)

    Jones, D. R.; Young, S. D.

    1994-01-01

    Two studies are being conducted to address airport surface movement area safety and capacity issues by providing enhanced situational awareness information to pilots. One study focuses on obtaining pilot opinion of the Runway Status Light System (RSLS). This system has been designed to reduce the likelihood of runway incursions by informing pilots when a runway is occupied. The second study is a flight demonstration of an integrated system consisting of an electronic moving map in the cockpit and a display of aircraft identification to the controller. Taxi route and hold warning information will be sent to the aircraft via data link for display on the electronic moving map. This paper describes the plans for the two studies.

  5. The structure of mode-locking regions of piecewise-linear continuous maps: II. Skew sawtooth maps

    NASA Astrophysics Data System (ADS)

    Simpson, D. J. W.

    2018-05-01

    In two-parameter bifurcation diagrams of piecewise-linear continuous maps on ℝ^N, mode-locking regions typically have points of zero width known as shrinking points. Near any shrinking point, but outside the associated mode-locking region, a significant proportion of parameter space can be usefully partitioned into a two-dimensional array of annular sectors. The purpose of this paper is to show that in these sectors the dynamics is well approximated by a three-parameter family of skew sawtooth circle maps, where the relationship between the skew sawtooth maps and the N-dimensional map is fixed within each sector. The skew sawtooth maps are continuous, degree-one, and piecewise-linear, with two different slopes. They approximate the stable dynamics of the N-dimensional map with an error that goes to zero with the distance from the shrinking point. The results explain the complicated radial pattern of periodic, quasi-periodic, and chaotic dynamics that occurs near shrinking points.
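
    One convenient parametrization of such a map (ours, not necessarily the paper's) writes the lift F of a continuous, degree-one, piecewise-linear circle map with two slopes as

        F(x) =
        \begin{cases}
        \Omega + s_L\, x, & 0 \le x < d, \\
        \Omega + s_L d + s_R (x - d), & d \le x < 1,
        \end{cases}
        \qquad F(x+1) = F(x) + 1,

    where the degree-one condition forces s_L d + s_R (1 - d) = 1, leaving three free parameters (for example Ω, d, and s_L).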

  6. An Atlas of ShakeMaps for Landslide and Liquefaction Modeling

    NASA Astrophysics Data System (ADS)

    Johnson, K. L.; Nowicki, M. A.; Mah, R. T.; Garcia, D.; Harp, E. L.; Godt, J. W.; Lin, K.; Wald, D. J.

    2012-12-01

    The human consequences of a seismic event are often a result of subsequent hazards induced by the earthquake, such as landslides. While the United States Geological Survey (USGS) ShakeMap and Prompt Assessment of Global Earthquakes for Response (PAGER) systems are, in conjunction, capable of estimating the damage potential of earthquake shaking in near-real time, they do not currently provide estimates for the potential of further damage by secondary processes. We are developing a sound basis for providing estimates of the likelihood and spatial distribution of landslides for any global earthquake under the PAGER system. Here we discuss several important ingredients in this effort. First, we report on the development of a standardized hazard layer from which to calibrate observed landslide distributions; in contrast, prior studies have used a wide variety of means for estimating the hazard input. This layer now takes the form of a ShakeMap, a standardized approach for computing geospatial estimates for a variety of shaking metrics (both peak ground motions and shaking intensity) from any well-recorded earthquake. We have created ShakeMaps for about 20 historical landslide "case history" events, significant in terms of their landslide occurrence, as part of an updated release of the USGS ShakeMap Atlas. We have also collected digitized landslide data from open-source databases for many of the earthquake events of interest. When these are combined with up-to-date topographic and geologic maps, we have the basic ingredients for calibrating landslide probabilities for a significant collection of earthquakes. In terms of modeling, rather than focusing on mechanistic models of landsliding, we adopt a strictly statistical approach to quantify landslide likelihood. We incorporate geology, slope, peak ground acceleration, and landslide data as variables in a logistic regression, selecting the best explanatory variables given the standardized new hazard layers (see Nowicki et al., this meeting, for more detail on the regression). To make the ShakeMap and PAGER systems more comprehensive in terms of secondary losses, we are working to calibrate a similarly constrained regression for liquefaction estimation using a suite of well-studied earthquakes for which detailed, digitized liquefaction datasets are available; here variants of wetness index and soil strength replace geology and slope. We expect that this Atlas of ShakeMaps for landslide and liquefaction case history events, which will soon be publicly available via the internet, will aid in improving the accuracy of loss-modeling systems such as PAGER, as well as allow for a common framework for numerous other mechanistic and empirical studies.
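
    A toy version of the strictly statistical approach described above: logistic regression of landslide occurrence on shaking and terrain predictors. The predictor names, coefficients, and data are synthetic illustrations, not the actual regression of Nowicki et al.:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Synthetic grid cells: peak ground acceleration (g), slope (deg),
        # and a categorical geology class -- the kinds of predictors named above.
        rng = np.random.default_rng(3)
        n = 2000
        pga = rng.uniform(0.05, 1.0, n)
        slope = rng.uniform(0, 45, n)
        geology = rng.integers(0, 3, n)
        logit = -6 + 4 * pga + 0.12 * slope + 0.5 * (geology == 2)
        landslide = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([pga, slope, geology == 2])
        model = LogisticRegression().fit(X, landslide)
        print("fitted coefficients:", model.coef_)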

  7. Debris flow hazard mapping, Hobart, Tasmania, Australia

    NASA Astrophysics Data System (ADS)

    Mazengarb, Colin; Rigby, Ted; Stevenson, Michael

    2015-04-01

    Our mapping of the many dolerite-capped mountains in Tasmania indicates that debris flows are a significant geomorphic process operating there. Hobart, the largest city in the State, lies at the foot of one of these mountains, and our work is focussed on identifying areas that are susceptible to these events and estimating hazard in the valley systems where residential developments have been established. Geomorphic mapping, with the benefit of recent LiDAR and GIS-enabled stereo-imagery, has allowed us to add to and refine a landslide inventory in our study area. In addition, a dominant geomorphic model has been recognised, involving headward gully retreat in colluvial materials associated with rainstorms, which explains why many past events have occurred and where they may occur in future. In this paper we review the landslide inventory, including a large event (~200,000 m3) in 1872 that affected an area lightly populated at the time but heavily urbanised since. From this inventory we have derived volume-mobility relationships, magnitude-frequency curves, and likelihood estimates. Volume has been challenging to estimate, given that the area of depletion for each debris flow feature is typically difficult to distinguish from the total affected area. However, where LiDAR data exist, this uncertainty is substantially reduced, and we develop width-length relationships (area of depletion) and area-volume relationships to estimate volume for the whole dataset, which exceeds 300 features. The volume-mobility relationship determined is comparable to international studies and, in the absence of reliable eyewitness accounts, suggests that most of the features can be explained as single-event debris flows, without requiring more complex mechanisms (such as those that form temporary debris dams that subsequently fail) proposed by others previously. Likelihood estimates have also been challenging to derive, given that almost all of the events were not witnessed, some are constrained by aerial photographs only to decade precision, and many predate regional photography (pre-1940s). We have performed runout modelling using 2D hydraulic modelling software (RiverFlow2D with the Mud and Debris module) in order to calibrate our model against real events and gain confidence in the choice of parameters. Runout modelling was undertaken in valley systems with volumes calibrated to existing flood model likelihoods for each catchment. The hazard outputs from our models require a translation to the hazard models used in Australia. By linking to flood mapping we aim to demonstrate to emergency managers where existing mitigation measures may be inadequate and how they can be adapted to address multiple hazards.

  8. Landslide susceptibility estimations in the Gerecse hills (Hungary).

    NASA Astrophysics Data System (ADS)

    Gerzsenyi, Dávid; Gáspár, Albert

    2017-04-01

    Surface movement processes constantly pose a threat to property in populated and agricultural areas of the Gerecse hills (Hungary). The affected geological formations are mainly unconsolidated sediments. Pleistocene loess and alluvial terrace sediments are overwhelmingly present, but fluvio-lacustrine sediments of the latest Miocene, as well as consolidated Eocene and Mesozoic limestones and marls, can also be found in the area. Landslides and other surface movement processes have been studied in the area for a long time, but a comprehensive GIS-based geostatistical analysis has not yet been made for the whole area. This was the reason for choosing the Gerecse as the focus area of the study. Moreover, the base data of our study are freely accessible from online servers, so the method can be applied to other regions in Hungary. Qualitative data were acquired from the landslide-inventory map of the Hungarian Surface Movement Survey and from the Geological Map of Hungary (1 : 100 000). Morphometric parameters derived from the SRTM-1 DEM were used as quantitative variables. Using these parameters, the distributions of elevation, slope gradient, aspect, and categorized geological features were computed, both for areas affected and not affected by slope movements. Likelihood values were then computed for each parameter by comparing its distribution in the two areas. By combining the likelihood values of the four parameters, relative hazard values were computed for each cell. This method is known as "empirical probability estimation", originally published by Chung (2005). The map created this way shows each cell's place in a ranking based on the relative hazard values, expressed as a percentage of the whole study area (787 km2). These values provide information about how similar a certain area is to the areas already affected by landslides, based on the four predictor variables. This map can also serve as a base for more complex landslide vulnerability studies involving economic factors. The landslide-inventory database used in the research provides information regarding the state of activity of past surface movements; however, the activity of many sites is recorded as unknown. A complementary field survey has been carried out to categorize these areas - near the villages of Dunaszentmiklós and Neszmély - in one of the most landslide-affected parts of the Gerecse. Reference: Chung, C. (2005). Using likelihood ratio functions for modeling the conditional probability of occurrence of future landslides for risk assessment. Computers & Geosciences, 32, 1052-1068.
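
    A compact sketch of the likelihood-ratio combination described above (in the spirit of Chung 2005); the binning, variable names, and synthetic rasters are illustrative choices, not the study's data:

        import numpy as np

        def likelihood_ratio(values, affected_mask, bins=20):
            """Per-cell ratio of a parameter's density inside landslide-affected
            areas to its density in unaffected areas (Chung-style likelihood ratio)."""
            edges = np.histogram_bin_edges(values, bins=bins)
            p_in, _ = np.histogram(values[affected_mask], bins=edges, density=True)
            p_out, _ = np.histogram(values[~affected_mask], bins=edges, density=True)
            idx = np.clip(np.digitize(values, edges) - 1, 0, bins - 1)
            return p_in[idx] / (p_out[idx] + 1e-12)

        # Synthetic stand-ins for two of the four predictor rasters (flattened to 1-D).
        rng = np.random.default_rng(4)
        n = 10000
        elevation = rng.uniform(100, 600, n)
        slope = rng.uniform(0, 40, n)
        affected = rng.random(n) < np.clip(slope / 80, 0, 1)   # steeper -> more slides

        # Combine per-parameter ratios multiplicatively, then rank cells.
        score = likelihood_ratio(elevation, affected) * likelihood_ratio(slope, affected)
        rank_pct = score.argsort().argsort() / n * 100          # each cell's rank percentage
        print("mean rank of affected cells:", rank_pct[affected].mean())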

  9. Planck 2015 results: XI. CMB power spectra, likelihoods, and robustness of parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aghanim, N.; Arnaud, M.; Ashdown, M.

    This study presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (ℓ < 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, ns, is confirmed smaller than unity at more than 5σ from Planck alone. We further validate the robustness of the likelihood results against specific extensions to the baseline cosmology, which are particularly sensitive to data at high multipoles. For instance, the effective number of neutrino species remains compatible with the canonical value of 3.046. For this first detailed analysis of Planck polarization spectra, we concentrate at high multipoles on the E modes, leaving the analysis of the weaker B modes to future work. At low multipoles we use temperature maps at all Planck frequencies along with a subset of polarization data. These data take advantage of Planck's wide frequency coverage to improve the separation of CMB and foreground emission. Within the baseline ΛCDM cosmology this requires τ = 0.078 ± 0.019 for the reionization optical depth, which is significantly lower than estimates without the use of high-frequency data for explicit monitoring of dust emission. At high multipoles we detect residual systematic errors in E polarization, typically at the μK2 level; we therefore choose to retain temperature information alone for high multipoles as the recommended baseline, in particular for testing non-minimal models. Nevertheless, the high-multipole polarization spectra from Planck are already good enough to enable a separate high-precision determination of the parameters of the ΛCDM model, showing consistency with those established independently from temperature information alone.

  10. Planck 2015 results. XI. CMB power spectra, likelihoods, and robustness of parameters

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Aghanim, N.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Bartolo, N.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chiang, H. C.; Christensen, P. R.; Clements, D. L.; Colombo, L. P. L.; Combet, C.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Di Valentino, E.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Gauthier, C.; Gerbino, M.; Giard, M.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hamann, J.; Hansen, F. K.; Harrison, D. L.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Holmes, W. A.; Hornstrup, A.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Le Jeune, M.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Lewis, A.; Liguori, M.; Lilje, P. B.; Lilley, M.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Macías-Pérez, J. F.; Maffei, B.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Meinhold, P. R.; Melchiorri, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Mottet, S.; Munshi, D.; Murphy, J. A.; Narimani, A.; Naselsky, P.; Nati, F.; Natoli, P.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Pratt, G. W.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rossetti, M.; Roudier, G.; Rouillé d'Orfeuil, B.; Rubiño-Martín, J. A.; Rusholme, B.; Salvati, L.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Serra, P.; Spencer, L. D.; Spinelli, M.; Stolyarov, V.; Stompor, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, F.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zonca, A.

    2016-09-01

    This paper presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (ℓ < 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, ns, is confirmed smaller than unity at more than 5σ from Planck alone. We further validate the robustness of the likelihood results against specific extensions to the baseline cosmology, which are particularly sensitive to data at high multipoles. For instance, the effective number of neutrino species remains compatible with the canonical value of 3.046. For this first detailed analysis of Planck polarization spectra, we concentrate at high multipoles on the E modes, leaving the analysis of the weaker B modes to future work. At low multipoles we use temperature maps at all Planck frequencies along with a subset of polarization data. These data take advantage of Planck's wide frequency coverage to improve the separation of CMB and foreground emission. Within the baseline ΛCDM cosmology this requires τ = 0.078 ± 0.019 for the reionization optical depth, which is significantly lower than estimates without the use of high-frequency data for explicit monitoring of dust emission. At high multipoles we detect residual systematic errors in E polarization, typically at the μK2 level; we therefore choose to retain temperature information alone for high multipoles as the recommended baseline, in particular for testing non-minimal models. Nevertheless, the high-multipole polarization spectra from Planck are already good enough to enable a separate high-precision determination of the parameters of the ΛCDM model, showing consistency with those established independently from temperature information alone.

  11. Planck 2015 results: XI. CMB power spectra, likelihoods, and robustness of parameters

    DOE PAGES

    Aghanim, N.; Arnaud, M.; Ashdown, M.; ...

    2016-09-20

    This study presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (ℓ < 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, ns, is confirmed smaller than unity at more than 5σ from Planck alone. We further validate the robustness of the likelihood results against specific extensions to the baseline cosmology, which are particularly sensitive to data at high multipoles. For instance, the effective number of neutrino species remains compatible with the canonical value of 3.046. For this first detailed analysis of Planck polarization spectra, we concentrate at high multipoles on the E modes, leaving the analysis of the weaker B modes to future work. At low multipoles we use temperature maps at all Planck frequencies along with a subset of polarization data. These data take advantage of Planck's wide frequency coverage to improve the separation of CMB and foreground emission. Within the baseline ΛCDM cosmology this requires τ = 0.078 ± 0.019 for the reionization optical depth, which is significantly lower than estimates without the use of high-frequency data for explicit monitoring of dust emission. At high multipoles we detect residual systematic errors in E polarization, typically at the μK2 level; we therefore choose to retain temperature information alone for high multipoles as the recommended baseline, in particular for testing non-minimal models. Nevertheless, the high-multipole polarization spectra from Planck are already good enough to enable a separate high-precision determination of the parameters of the ΛCDM model, showing consistency with those established independently from temperature information alone.

  12. Comparing the performance of geostatistical models with additional information from covariates for sewage plume characterization.

    PubMed

    Del Monego, Maurici; Ribeiro, Paulo Justiniano; Ramos, Patrícia

    2015-04-01

    In this work, kriging with covariates is used to model and map the spatial distribution of salinity measurements gathered by an autonomous underwater vehicle in a sea outfall monitoring campaign, aiming to distinguish the effluent plume from the receiving waters and characterize its spatial variability in the vicinity of the discharge. Four different geostatistical linear models for salinity were assumed, where the distance to diffuser, the west-east positioning, and the south-north positioning were used as covariates. Sample variograms were fitted by Matérn models using the weighted least squares and maximum likelihood estimation methods, as a way to detect possible discrepancies. Typically, the maximum likelihood method estimated very low ranges, which limited the kriging process. So, at least for these data sets, weighted least squares proved to be the most appropriate estimation method for variogram fitting. The kriged maps clearly show the spatial variation of salinity, and it is possible to identify the effluent plume in the study area. The results obtained provide some guidelines for sewage monitoring when a geostatistical analysis of the data is intended. It is important to treat properly the existence of anomalous values and to adopt a sampling strategy that includes transects parallel and perpendicular to the effluent dispersion.
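
    As a small illustration of weighted-least-squares variogram fitting, the sketch below fits the exponential model (the Matérn family with smoothness 1/2) to a hypothetical empirical semivariogram, weighting each lag by its pair count; all numbers are invented:

        import numpy as np
        from scipy.optimize import curve_fit

        def exp_variogram(h, nugget, sill, rng_):
            """Exponential model -- the Matérn family with smoothness 1/2."""
            return nugget + sill * (1 - np.exp(-h / rng_))

        # Hypothetical empirical semivariogram: lag distances (m), semivariances,
        # and pair counts used as weights (more pairs -> more reliable estimate).
        lags = np.array([10, 20, 40, 80, 160, 320, 640], float)
        gamma = np.array([0.02, 0.05, 0.09, 0.14, 0.17, 0.19, 0.20])
        npairs = np.array([500, 900, 1400, 1800, 1600, 1200, 800])

        # Weighted least squares: curve_fit's sigma scales as 1/sqrt(weight).
        popt, _ = curve_fit(exp_variogram, lags, gamma,
                            p0=[0.01, 0.2, 100], sigma=1 / np.sqrt(npairs))
        print("nugget, sill, range:", popt)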

  13. 40 CFR 86.1332-90 - Engine mapping procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    40 Protection of Environment 19 2011-07-01 2011-07-01 false Engine mapping procedures. 86.1332-90... (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES (CONTINUED) Emission... § 86.1332-90 Engine mapping procedures. (a) Mount test engine on the engine dynamometer. (b) Determine...

  14. 40 CFR 86.1332-90 - Engine mapping procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    40 Protection of Environment 20 2012-07-01 2012-07-01 false Engine mapping procedures. 86.1332-90... (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES (CONTINUED) Emission... § 86.1332-90 Engine mapping procedures. (a) Mount test engine on the engine dynamometer. (b) Determine...

  15. 40 CFR 86.1332-90 - Engine mapping procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    40 Protection of Environment 20 2013-07-01 2013-07-01 false Engine mapping procedures. 86.1332-90... (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES (CONTINUED) Emission... § 86.1332-90 Engine mapping procedures. (a) Mount test engine on the engine dynamometer. (b) Determine...

  16. The Role of CMR in Cardiomyopathies

    PubMed Central

    Kramer, Christopher M.

    2015-01-01

    Cardiac magnetic resonance imaging (CMR) has made major inroads in the new millennium in the diagnosis and assessment of prognosis for patients with cardiomyopathies. Imaging of left and right ventricular structure and function and tissue characterization with late gadolinium enhancement (LGE) as well as T1 and T2 mapping enable accurate diagnosis of the underlying etiology. In the setting of coronary artery disease, either the transmurality of LGE or contractile reserve in response to dobutamine can assess the likelihood of recovery of function after revascularization. The presence of scar reduces the likelihood of response to medical therapy and to cardiac resynchronization therapy in heart failure. The presence and extent of LGE relate to overall cardiovascular outcome in cardiomyopathies. An emerging major role for CMR in cardiomyopathies is to identify myocardial scar for diagnostic and prognostic purposes. PMID:26033902

  17. A Fast and Scalable Radiation Hybrid Map Construction and Integration Strategy

    PubMed Central

    Agarwala, Richa; Applegate, David L.; Maglott, Donna; Schuler, Gregory D.; Schäffer, Alejandro A.

    2000-01-01

    This paper describes a fast and scalable strategy for constructing a radiation hybrid (RH) map from data on different RH panels. The maps on each panel are then integrated to produce a single RH map for the genome. Recurring problems in using maps from several sources are that the maps use different markers, the maps do not place the overlapping markers in the same order, and the objective functions for map quality are incomparable. We use methods from combinatorial optimization to develop a strategy that addresses these issues. We show that, by the standard objective functions of obligate chromosome breaks and maximum likelihood, software for the traveling salesman problem produces RH maps of better quality much more quickly than software specifically tailored for RH mapping. We use known algorithms for the longest common subsequence problem as part of our map integration strategy. We demonstrate our methods by reconstructing and integrating maps for markers typed on the Genebridge 4 (GB4) and the Stanford G3 panels publicly available from the RH database. We compare the quality of our integrated map with published maps for the GB4 and G3 panels by considering whether markers occur in the same order on a map and in DNA sequence contigs submitted to GenBank. We find that all of the maps are inconsistent with the sequence data for at least 50% of the contigs, but our integrated maps are more consistent. The map integration strategy scales not only to multiple RH maps but also to any maps that have comparable criteria for measuring map quality. Our software improves on current RH mapping technology in computation time and in algorithms for considering a large number of markers. The essential impediments to producing dense high-quality RH maps are data quality and panel size, not computation. PMID:10720576
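
    A toy sketch of the obligate-breaks objective together with the 2-opt segment-reversal move used by many traveling-salesman heuristics; real RH mapping must also handle untyped entries and uses far stronger TSP solvers than this greedy loop:

        import numpy as np

        # Rows: radiation hybrids; columns: markers; entries: 1 retained, 0 lost.
        rng = np.random.default_rng(5)
        data = rng.integers(0, 2, size=(90, 12))

        def obligate_breaks(order):
            """Total adjacent retention changes across all hybrids for an ordering."""
            cols = data[:, order]
            return int(np.sum(cols[:, 1:] != cols[:, :-1]))

        # Greedy 2-opt: reverse marker segments while the break count improves.
        order = list(range(data.shape[1]))
        improved = True
        while improved:
            improved = False
            for i in range(len(order) - 1):
                for j in range(i + 2, len(order) + 1):
                    cand = order[:i] + order[i:j][::-1] + order[j:]
                    if obligate_breaks(cand) < obligate_breaks(order):
                        order, improved = cand, True
        print("marker order:", order, "breaks:", obligate_breaks(order))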

  18. Mapping sea ice leads with a coupled numeric/symbolic system

    NASA Technical Reports Server (NTRS)

    Key, J.; Schweiger, A. J.; Maslanik, J. A.

    1990-01-01

    A method is presented which facilitates the detection and delineation of leads with single-channel Landsat data by coupling numeric and symbolic procedures. The procedure consists of three steps: (1) using the dynamic threshold method, an image is mapped to a lead/no-lead binary image; (2) the likelihood that fragments are real leads is examined with a set of numeric rules; and (3) pairs of objects are examined geometrically and merged where possible. The processing ends when all fragments are merged and statistical characteristics are determined, leaving a map of valid lead objects that summarizes useful physical information about the lead complexes. Direct implementation of domain knowledge and rapid prototyping are two benefits of the rule-based system. The approach is found to be more successfully applied to mid- and high-level processing, and the system can retrieve statistics about sea-ice leads as well as detect the leads.

  19. Mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Messmore, J.; Copeland, G. E.; Levy, G. F.

    1975-01-01

    This study was undertaken with the intent of elucidating the forest mapping capabilities of ERTS-1 MSS data when analyzed with the aid of LARS' automatic data processing techniques. The site for this investigation was the Great Dismal Swamp, a 210,000 acre wilderness area located on the Middle Atlantic coastal plain. Due to inadequate ground truth information on the distribution of vegetation within the swamp, an unsupervised classification scheme was utilized. Initially pictureprints, resembling low resolution photographs, were generated in each of the four ERTS-1 channels. Data found within rectangular training fields was then clustered into 13 spectral groups and defined statistically. Using a maximum likelihood classification scheme, the unknown data points were subsequently classified into one of the designated training classes. Training field data was classified with a high degree of accuracy (greater than 95%), and progress is being made towards identifying the mapped spectral classes.

  20. Mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Messmore, J.; Copeland, G. E.; Levy, G. F.

    1975-01-01

    This study was undertaken with the intent of elucidating the forest mapping capabilities of ERTS-1 MSS data when analyzed with the aid of LARS' automatic data processing techniques. The site for this investigation was the Great Dismal Swamp, a 210,000 acre wilderness area located on the Middle Atlantic coastal plain. Due to inadequate ground truth information on the distribution of vegetation within the swamp, an unsupervised classification scheme was utilized. Initially pictureprints, resembling low resolution photographs, were generated in each of the four ERTS-1 channels. Data found within rectangular training fields was then clustered into 13 spectral groups and defined statistically. Using a maximum likelihood classification scheme, the unknown data points were subsequently classified into one of the designated training classes. Training field data was classified with a high degree of accuracy (greater than 95 percent), and progress is being made towards identifying the mapped spectral classes.

  1. Cardiac conduction velocity estimation from sequential mapping assuming known Gaussian distribution for activation time estimation error.

    PubMed

    Shariat, Mohammad Hassan; Gazor, Saeed; Redfearn, Damian

    2016-08-01

    In this paper, we study the problem of cardiac conduction velocity (CCV) estimation for sequential intracardiac mapping. We assume that the intracardiac electrograms of several cardiac sites are sequentially recorded, their activation times (ATs) are extracted, and the corresponding wavefronts are specified. The locations of the mapping catheter's electrodes and the ATs of the wavefronts are used here for the CCV estimation. We assume that the extracted ATs include some estimation errors, which we model with zero-mean white Gaussian noise values with known variances. Assuming stable planar wavefront propagation, we derive the maximum likelihood CCV estimator for the case where the synchronization times between the various recording sites are unknown. We analytically evaluate the performance of the CCV estimator and provide its mean square estimation error. Our simulation results confirm the accuracy of the proposed method and the error analysis of the proposed CCV estimator.
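
    Under the stated assumptions (planar wavefront, Gaussian activation-time errors with equal variances), the maximum likelihood fit reduces to least squares on a slowness vector s in t = t0 + s·p. The sketch below illustrates that reduction with synthetic electrode positions; it ignores the unknown inter-recording synchronization times the paper additionally estimates:

        import numpy as np

        # Electrode positions (mm) and noisy activation times (ms) for one
        # planar wavefront; with equal-variance Gaussian AT errors, maximum
        # likelihood estimation of t0 and the slowness s is ordinary least squares.
        rng = np.random.default_rng(6)
        pos = rng.uniform(0, 20, size=(10, 2))
        s_true = np.array([0.8, 0.3])                 # slowness (ms/mm)
        t = 5.0 + pos @ s_true + rng.normal(scale=0.2, size=10)

        A = np.column_stack([np.ones(len(pos)), pos])
        coef, *_ = np.linalg.lstsq(A, t, rcond=None)
        s_hat = coef[1:]
        print("conduction velocity (mm/ms):", 1 / np.linalg.norm(s_hat))
        print("propagation direction:", s_hat / np.linalg.norm(s_hat))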

  2. The Development of a Long-Term, Continually Updated Global Solar Resource at 10 km Resolution: Preliminary Results From Test Processing and Continuing Plans

    NASA Technical Reports Server (NTRS)

    Stackhouse, P.; Perez, R.; Sengupta, M.; Knapp, K.; Cox, Stephen; Mikovitz, J. Colleen; Zhang, T.; Hemker, K.; Schlemmer, J.; Kivalov, S.

    2014-01-01

    Background: Considering the likelihood of global climatic weather pattern changes and the global competition for energy resources, there is an increasing need to provide improved and continuously updated global Earth surface solar resource information. Toward this end, a project was funded under the NASA Applied Science program involving the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC), the National Renewable Energy Laboratory (NREL), the State University of New York/Albany (SUNY), and the NOAA National Climatic Data Center (NCDC) to provide NREL with a long-term, advanced global solar mapping production system for improved depiction of historical solar resources and variability, and to provide a mechanism for continual updates of solar resource information. This new production system is made possible by the efforts of NOAA and NASA to completely reprocess the International Satellite Cloud Climatology Project (ISCCP) data set, which provides satellite visible and infrared radiances together with retrieved cloud and surface properties on a 3-hourly basis beginning from July 1983. The old version of the ISCCP data provided this information from all the world's available geosynchronous satellite systems and NOAA's AVHRR data sets at a 30 km effective resolution. The new version aims to provide new and improved satellite calibration at an effective 10 km resolution. Thus, working with SUNY, NASA will develop and test an improved production system that will enable NREL to continually update the Earth's solar resource. Objective and Methods: In this presentation, we provide a general overview of the project together with samples of the new solar irradiance mapped data products and comparisons to surface measurements at various locations across the world. An assessment of the solar resource values relative to calibration uncertainty and assumptions is presented. Errors resulting from assumptions about snow cover and background aerosol amount are described. These uncertainties and the statistics of the agreement between the measurements and new satellite estimates are also reviewed and compared to other solar data sets. Findings and Conclusions: Preliminary results show that insolation values have an overall small bias (less than 1%) with an RMS of 25% relative to surface measurements. Exceptions at certain locations were found and will be discussed relative to the uncertainties identified above. Lastly, we identify the next steps in the development and improvement of this production system, including some accuracy goals, in preparation for ultimate delivery to NREL.

  3. Accurate estimation of short read mapping quality for next-generation genome sequencing

    PubMed Central

    Ruffalo, Matthew; Koyutürk, Mehmet; Ray, Soumya; LaFramboise, Thomas

    2012-01-01

    Motivation: Several software tools specialize in the alignment of short next-generation sequencing reads to a reference sequence. Some of these tools report a mapping quality score for each alignment—in principle, this quality score tells researchers the likelihood that the alignment is correct. However, the reported mapping quality often correlates weakly with actual accuracy and the qualities of many mappings are underestimated, encouraging researchers to discard correct mappings. Further, these low-quality mappings tend to correlate with variations in the genome (both single nucleotide and structural), and such mappings are important in accurately identifying genomic variants. Approach: We develop a machine learning tool, LoQuM (LOgistic regression tool for calibrating the Quality of short read mappings), to assign reliable mapping quality scores to mappings of Illumina reads returned by any alignment tool. LoQuM uses statistics on the read (base quality scores reported by the sequencer) and the alignment (number of matches, mismatches and deletions, mapping quality score returned by the alignment tool, if available, and number of mappings) as features for classification and uses simulated reads to learn a logistic regression model that relates these features to actual mapping quality. Results: We test the predictions of LoQuM on an independent dataset generated by the ART short read simulation software and observe that LoQuM can ‘resurrect’ many mappings that are assigned zero quality scores by the alignment tools and are therefore likely to be discarded by researchers. We also observe that the recalibration of mapping quality scores greatly enhances the precision of called single nucleotide polymorphisms. Availability: LoQuM is available as open source at http://compbio.case.edu/loqum/. Contact: matthew.ruffalo@case.edu. PMID:22962451
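
    A schematic of the recalibration idea: fit a logistic regression from alignment features to correctness labels (known for simulated reads), then convert the predicted probability back to a Phred-scaled quality. Feature names and coefficients are invented; this is not LoQuM's actual model:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Synthetic alignment features: mean base quality, mismatch count,
        # aligner-reported MAPQ, and number of candidate mappings.
        rng = np.random.default_rng(7)
        n = 5000
        feats = np.column_stack([
            rng.uniform(20, 40, n),      # mean base quality
            rng.poisson(2, n),           # mismatch count
            rng.integers(0, 61, n),      # aligner-reported MAPQ
            rng.integers(1, 5, n),       # number of reported mappings
        ])
        correct = rng.random(n) < 1 / (1 + np.exp(-(0.1 * feats[:, 0]
                                                    - 0.5 * feats[:, 1]
                                                    + 0.05 * feats[:, 2] - 2)))

        clf = LogisticRegression(max_iter=1000).fit(feats, correct)
        p_correct = clf.predict_proba(feats)[:, 1]
        # Recalibrated quality on the Phred scale: Q = -10 * log10 P(incorrect).
        mapq_new = -10 * np.log10(np.clip(1 - p_correct, 1e-6, None))
        print("recalibrated MAPQ range:", mapq_new.min(), mapq_new.max())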

  4. Stable learning of functional maps in self-organizing spiking neural networks with continuous synaptic plasticity

    PubMed Central

    Srinivasa, Narayan; Jiang, Qin

    2013-01-01

    This study describes a spiking model that self-organizes for stable formation and maintenance of orientation and ocular dominance maps in the visual cortex (V1). This self-organization process simulates three development phases: an early experience-independent phase, a late experience-independent phase, and a subsequent refinement phase during which experience acts to shape the map properties. The ocular dominance maps that emerge accommodate the two sets of monocular inputs that arise from the lateral geniculate nucleus (LGN) to layer 4 of V1. The orientation selectivity maps that emerge feature well-developed iso-orientation domains and fractures. During the last two phases of development the orientation preferences at some locations appear to rotate continuously through ±180° along circular paths; these are referred to as pinwheel-like patterns, but without any corresponding point discontinuities in the orientation gradient maps. The formation of these functional maps is driven by balanced excitatory and inhibitory currents that are established via synaptic plasticity based on spike timing for both excitatory and inhibitory synapses. The stability and maintenance of the formed maps under continuous synaptic plasticity are enabled by homeostasis caused by inhibitory plasticity. However, prolonged exposure to repeated stimuli does alter the formed maps over time due to plasticity. The results from this study suggest that continuous synaptic plasticity in both excitatory neurons and interneurons could play a critical role in the formation, stability, and maintenance of functional maps in the cortex. PMID:23450808

  5. Summit-to-sea mapping and change detection using satellite imagery: tools for conservation and management of coral reefs.

    PubMed

    Shapiro, A C; Rohmann, S O

    2005-05-01

    Continuous summit-to-sea maps showing both land features and shallow-water coral reefs have been completed for Puerto Rico and the U.S. Virgin Islands using circa-2000 Landsat 7 Enhanced Thematic Mapper (ETM+) imagery. Continuous land/sea terrain was mapped by merging Digital Elevation Models (DEM) with satellite-derived bathymetry. Benthic habitat characterizations were created by unsupervised classifications of Landsat imagery clustered using field data, producing maps with an estimated overall accuracy of >75% (Tau coefficient >0.65). These were merged with Geocover-LC (land use/land cover) data to create continuous land/sea cover maps. Image pairs from different dates were analyzed using Principal Component Analysis (PCA) in order to detect areas of change in the marine environment over two different time intervals: 2000 to 2001, and 1991 to 2003. This activity demonstrates the capability of Landsat imagery to produce continuous summit-to-sea maps, as well as to detect certain changes in the shallow-water marine environment, providing a valuable tool for efficient coastal zone monitoring and effective management and conservation.
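
    A minimal two-date PCA change-detection sketch: stack the co-registered images per pixel and read change from the minor principal component. The images and the 3-sigma threshold are illustrative only:

        import numpy as np

        # Two co-registered single-band images from different dates; in two-date
        # PCA change detection, the minor component captures what changed.
        rng = np.random.default_rng(8)
        img_t1 = rng.normal(size=(100, 100))
        img_t2 = 0.9 * img_t1 + rng.normal(scale=0.1, size=(100, 100))
        img_t2[40:50, 40:50] += 2.0            # synthetic patch of real change

        X = np.column_stack([img_t1.ravel(), img_t2.ravel()])
        X = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        change = (X @ Vt[1]).reshape(100, 100)  # second (minor) principal component
        mask = np.abs(change) > 3 * change.std()
        print("flagged change pixels:", int(mask.sum()))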

  6. MAP Estimators for Piecewise Continuous Inversion

    DTIC Science & Technology

    2016-08-08

    MAP estimators for piecewise continuous inversion. M. M. Dunlop and A. M. Stuart, Mathematics Institute, University of Warwick, Coventry, CV4 7AL, UK. Published 8 August 2016. Abstract: We study the inverse problem of estimating a field ua from data comprising a finite set of nonlinear functionals of ua... It is then natural to study maximum a posteriori (MAP) estimators. Recently (Dashti et al 2013 Inverse Problems 29 095017) it has been shown that MAP...

  7. Smoke Signals: Adolescent Smoking and School Continuation. Working Papers Series. SAN06-05

    ERIC Educational Resources Information Center

    Cook, Philip J.; Hutchinson, Rebecca

    2006-01-01

    This paper presents an exploratory analysis using NLSY97 data of the relationship between the likelihood of school continuation and the choices of whether to smoke or drink. We demonstrate that in the United States as of the late 1990s, smoking in 11th-grade was a uniquely powerful predictor of whether the student finished high school, and if so…

  8. When Can Categorical Variables Be Treated as Continuous? A Comparison of Robust Continuous and Categorical SEM Estimation Methods under Suboptimal Conditions

    ERIC Educational Resources Information Center

    Rhemtulla, Mijke; Brosseau-Liard, Patricia E.; Savalei, Victoria

    2012-01-01

    A simulation study compared the performance of robust normal theory maximum likelihood (ML) and robust categorical least squares (cat-LS) methodology for estimating confirmatory factor analysis models with ordinal variables. Data were generated from 2 models with 2-7 categories, 4 sample sizes, 2 latent distributions, and 5 patterns of category…

  9. Spatial Statistics of Large Astronomical Databases: An Algorithmic Approach

    NASA Technical Reports Server (NTRS)

    Szapudi, Istvan

    2004-01-01

    In this AISRP we have demonstrated that the correlation function (i) can be calculated for MAP in minutes (about 45 minutes for Planck) on a modest 500 MHz workstation, and (ii) the corresponding method, although theoretically suboptimal, produces nearly optimal results for realistic noise and cut sky. This trillion-fold improvement in speed over the standard maximum likelihood technique opens up tremendous new possibilities, which will be pursued in the follow-up.

  10. A low-cost drone based application for identifying and mapping of coastal fish nursery grounds

    NASA Astrophysics Data System (ADS)

    Ventura, Daniele; Bruno, Michele; Jona Lasinio, Giovanna; Belluscio, Andrea; Ardizzone, Giandomenico

    2016-03-01

    Acquiring seabed, landform, or other topographic data plays a pivotal role in defining and mapping key marine habitats in the field of marine ecology. However, obtaining this kind of data at a high level of detail for very shallow and inaccessible marine habitats has often been challenging and time-consuming, and spatial and temporal coverage often has to be compromised to make the monitoring routine more cost-effective. Nowadays, emerging technologies can overcome many of these constraints. Here we describe a recent development in remote sensing based on a small unmanned drone (UAV) that produces very fine-scale maps of fish nursery areas. This technology is simple to use, inexpensive, and timely in producing aerial photographs of marine areas. Technical details regarding aerial photo acquisition (drone and camera settings) and the post-processing workflow (3D model generation with the Structure from Motion algorithm and photo-stitching) are given. Finally, by applying modern algorithms of semi-automatic image analysis and classification (Maximum Likelihood, ECHO, and Object-Based Image Analysis), we compare the results of three thematic maps of a nursery area for juvenile sparid fishes, highlighting the potential of this method for mapping and monitoring coastal marine habitats.

  11. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    PubMed

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  12. Stochastic maps, continuous approximation, and stable distribution

    NASA Astrophysics Data System (ADS)

    Kessler, David A.; Burov, Stanislav

    2017-10-01

    A continuous approximation framework for general nonlinear stochastic as well as deterministic discrete maps is developed. For the stochastic map with uncorrelated Gaussian noise, by successively applying the Itô lemma, we obtain a Langevin-type equation. Specifically, we show how nonlinear maps give rise to a Langevin description that involves multiplicative noise. The multiplicative nature of the noise induces an additional effective force, not present in the absence of noise. We further exploit the continuum description and provide an explicit formula for the stable distribution of the stochastic map and conditions for its existence. Our results are in good agreement with numerical simulations of several maps.

  13. Mapping global cropland and field size.

    PubMed

    Fritz, Steffen; See, Linda; McCallum, Ian; You, Liangzhi; Bun, Andriy; Moltchanova, Elena; Duerauer, Martina; Albrecht, Fransizka; Schill, Christian; Perger, Christoph; Havlik, Petr; Mosnier, Aline; Thornton, Philip; Wood-Sichra, Ulrike; Herrero, Mario; Becker-Reshef, Inbal; Justice, Chris; Hansen, Matthew; Gong, Peng; Abdel Aziz, Sheta; Cipriani, Anna; Cumani, Renato; Cecchi, Giuliano; Conchedda, Giulia; Ferreira, Stefanus; Gomez, Adriana; Haffani, Myriam; Kayitakire, Francois; Malanding, Jaiteh; Mueller, Rick; Newby, Terence; Nonguierma, Andre; Olusegun, Adeaga; Ortner, Simone; Rajak, D Ram; Rocha, Jansle; Schepaschenko, Dmitry; Schepaschenko, Maria; Terekhov, Alexey; Tiangwa, Alex; Vancutsem, Christelle; Vintrou, Elodie; Wenbin, Wu; van der Velde, Marijn; Dunwoody, Antonia; Kraxner, Florian; Obersteiner, Michael

    2015-05-01

    A new 1 km global IIASA-IFPRI cropland percentage map for the baseline year 2005 has been developed which integrates a number of individual cropland maps at global to regional to national scales. The individual map products include existing global land cover maps such as GlobCover 2005 and MODIS v.5, regional maps such as AFRICOVER and national maps from mapping agencies and other organizations. The different products are ranked at the national level using crowdsourced data from Geo-Wiki to create a map that reflects the likelihood of cropland. Calibration with national and subnational crop statistics was then undertaken to distribute the cropland within each country and subnational unit. The new IIASA-IFPRI cropland product has been validated using very high-resolution satellite imagery via Geo-Wiki and has an overall accuracy of 82.4%. It has also been compared with the EarthStat cropland product and shows a lower root mean square error on an independent data set collected from Geo-Wiki. The first ever global field size map was produced at the same resolution as the IIASA-IFPRI cropland map based on interpolation of field size data collected via a Geo-Wiki crowdsourcing campaign. A validation exercise of the global field size map revealed satisfactory agreement with control data, particularly given the relatively modest size of the field size data set used to create the map. Both are critical inputs to global agricultural monitoring in the frame of GEOGLAM and will serve the global land modelling and integrated assessment community, in particular for improving land use models that require baseline cropland information. These products are freely available for downloading from the http://cropland.geo-wiki.org website. © 2015 John Wiley & Sons Ltd.

  14. Simultaneous Control of Error Rates in fMRI Data Analysis

    PubMed Central

    Kang, Hakmook; Blume, Jeffrey; Ombao, Hernando; Badre, David

    2015-01-01

    The key idea of statistical hypothesis testing is to fix, and thereby control, the Type I error (false positive) rate across samples of any size. Multiple comparisons inflate the global (family-wise) Type I error rate and the traditional solution to maintaining control of the error rate is to increase the local (comparison-wise) Type II error (false negative) rates. However, in the analysis of human brain imaging data, the number of comparisons is so large that this solution breaks down: the local Type II error rate ends up being so large that scientifically meaningful analysis is precluded. Here we propose a novel solution to this problem: allow the Type I error rate to converge to zero along with the Type II error rate. It works because when the Type I error rate per comparison is very small, the accumulation (or global) Type I error rate is also small. This solution is achieved by employing the Likelihood paradigm, which uses likelihood ratios to measure the strength of evidence on a voxel-by-voxel basis. In this paper, we provide theoretical and empirical justification for a likelihood approach to the analysis of human brain imaging data. In addition, we present extensive simulations that show the likelihood approach is viable, leading to ‘cleaner’ looking brain maps and operational superiority (lower average error rates). Finally, we include a case study on cognitive control related activation in the prefrontal cortex of the human brain. PMID:26272730
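
    A minimal sketch of the voxel-by-voxel likelihood-ratio idea follows, assuming a simple Gaussian model with known noise level and a fixed alternative effect size delta; the benchmarks k = 8 and 1/8 follow the usual likelihood-paradigm convention. None of this is the paper's exact pipeline.

```python
# Sketch: voxel-wise likelihood ratios of H1 (beta = delta) vs H0 (beta = 0)
# under a Gaussian error model, thresholded at the conventional k = 8.
import numpy as np

def voxel_likelihood_ratios(betas, se, delta):
    """Likelihood ratio per voxel for effect delta vs no effect."""
    ll1 = -0.5 * ((betas - delta) / se) ** 2   # log-likelihood under H1
    ll0 = -0.5 * (betas / se) ** 2             # log-likelihood under H0
    return np.exp(ll1 - ll0)

rng = np.random.default_rng(3)
betas = rng.normal(0.0, 1.0, size=(64, 64))
betas[20:30, 20:30] += 2.0                     # a synthetic "active" patch
lr = voxel_likelihood_ratios(betas, se=1.0, delta=2.0)
evidence_for_activation = lr >= 8              # strong evidence for H1
evidence_for_null = lr <= 1 / 8                # strong evidence for H0
print(evidence_for_activation.sum(), evidence_for_null.sum())
```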

  15. GRID-BASED EXPLORATION OF COSMOLOGICAL PARAMETER SPACE WITH SNAKE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mikkelsen, K.; Næss, S. K.; Eriksen, H. K., E-mail: kristin.mikkelsen@astro.uio.no

    2013-11-10

    We present a fully parallelized grid-based parameter estimation algorithm for investigating multidimensional likelihoods called Snake, and apply it to cosmological parameter estimation. The basic idea is to map out the likelihood grid-cell by grid-cell according to decreasing likelihood, and stop when a certain threshold has been reached. This approach improves vastly on the 'curse of dimensionality' problem plaguing standard grid-based parameter estimation simply by disregarding grid cells with negligible likelihood. The main advantages of this method compared to standard Metropolis-Hastings Markov Chain Monte Carlo methods include (1) trivial extraction of arbitrary conditional distributions; (2) direct access to Bayesian evidences; (3) better sampling of the tails of the distribution; and (4) nearly perfect parallelization scaling. The main disadvantage is, as in the case of brute-force grid-based evaluation, a dependency on the number of parameters, N_par. One of the main goals of the present paper is to determine how large N_par can be, while still maintaining reasonable computational efficiency; we find that N_par = 12 is well within the capabilities of the method. The performance of the code is tested by comparing cosmological parameters estimated using Snake and the WMAP-7 data with those obtained using CosmoMC, the current standard code in the field. We find fully consistent results, with similar computational expenses, but shorter wall time due to the perfect parallelization scheme.
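
    A minimal sketch of the core Snake idea — visiting grid cells in order of decreasing likelihood with a priority queue and stopping once newly reached cells fall below a threshold relative to the running peak — is given below, under the assumption of a toy Gaussian log-likelihood; the real code's parallelization and evidence computation are omitted.

```python
# Sketch: grid exploration in order of decreasing likelihood, Snake-style.
import heapq
import numpy as np

def snake_explore(loglike, start, step, threshold=-10.0):
    """Map out cells with loglike >= peak + threshold; returns {cell: value}."""
    visited = {start: loglike(np.array(start) * step)}
    heap = [(-visited[start], start)]          # max-heap via negated values
    peak = visited[start]
    while heap:
        neg_val, cell = heapq.heappop(heap)
        if -neg_val < peak + threshold:
            break                              # all remaining cells are lower
        for dim in range(len(cell)):           # expand the 2*D neighbours
            for d in (-1, 1):
                nb = list(cell); nb[dim] += d; nb = tuple(nb)
                if nb not in visited:
                    visited[nb] = loglike(np.array(nb) * step)
                    peak = max(peak, visited[nb])
                    heapq.heappush(heap, (-visited[nb], nb))
    return visited

gauss = lambda x: -0.5 * np.sum(x**2)          # toy 3-D log-likelihood
cells = snake_explore(gauss, start=(0, 0, 0), step=0.2)
print(len(cells), "cells evaluated")
```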

  16. Hyperspherical von Mises-Fisher mixture (HvMF) modelling of high angular resolution diffusion MRI.

    PubMed

    Bhalerao, Abhir; Westin, Carl-Fredrik

    2007-01-01

    A mapping of unit vectors onto a 5D hypersphere is used to model and partition ODFs from HARDI data. This mapping has a number of useful and interesting properties and we make a link to interpretation of the second order spherical harmonic decompositions of HARDI data. The paper presents the working theory and experiments of using a von Mises-Fisher mixture model for directional samples. The MLE of the second moment of the HvMF pdf can also be related to fractional anisotropy. We perform error analysis of the estimation scheme in single and multi-fibre regions and then show how a penalised-likelihood model selection method can be employed to differentiate single and multiple fibre regions.
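
    As a sketch of the basic estimation step, the maximum-likelihood parameters of a single von Mises-Fisher component can be approximated in closed form; the kappa formula below is the standard approximation of Banerjee et al. (2005), an assumption here, and the record's mixture fitting and model selection would wrap an EM loop around this single-component step.

```python
# Sketch: approximate ML estimation of a von Mises-Fisher distribution
# on the unit hypersphere from directional samples.
import numpy as np

def vmf_mle(x):
    """x: (n, d) unit vectors. Returns (mean direction, concentration)."""
    d = x.shape[1]
    s = x.mean(axis=0)
    r_bar = np.linalg.norm(s)                  # mean resultant length
    mu = s / r_bar
    kappa = r_bar * (d - r_bar**2) / (1.0 - r_bar**2)   # Banerjee et al. approx.
    return mu, kappa

rng = np.random.default_rng(4)
# Toy data: unit vectors clustered around e_1 on the 5-D hypersphere.
x = rng.normal(0, 0.2, (500, 5)); x[:, 0] += 1.0
x /= np.linalg.norm(x, axis=1, keepdims=True)
print(vmf_mle(x))
```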

  17. Neuroanatomical substrates of action perception and understanding: an anatomic likelihood estimation meta-analysis of lesion-symptom mapping studies in brain injured patients

    PubMed Central

    Urgesi, Cosimo; Candidi, Matteo; Avenanti, Alessio

    2014-01-01

    Several neurophysiologic and neuroimaging studies suggested that motor and perceptual systems are tightly linked along a continuum rather than providing segregated mechanisms supporting different functions. Using correlational approaches, these studies demonstrated that action observation activates not only visual but also motor brain regions. On the other hand, brain stimulation and brain lesion evidence allows tackling the critical question of whether our action representations are necessary to perceive and understand others’ actions. In particular, recent neuropsychological studies have shown that patients with temporal, parietal, and frontal lesions exhibit a number of possible deficits in the visual perception and the understanding of others’ actions. The specific anatomical substrates of such neuropsychological deficits however, are still a matter of debate. Here we review the existing literature on this issue and perform an anatomic likelihood estimation meta-analysis of studies using lesion-symptom mapping methods on the causal relation between brain lesions and non-linguistic action perception and understanding deficits. The meta-analysis encompassed data from 361 patients tested in 11 studies and identified regions in the inferior frontal cortex, the inferior parietal cortex and the middle/superior temporal cortex, whose damage is consistently associated with poor performance in action perception and understanding tasks across studies. Interestingly, these areas correspond to the three nodes of the action observation network that are strongly activated in response to visual action perception in neuroimaging research and that have been targeted in previous brain stimulation studies. Thus, brain lesion mapping research provides converging causal evidence that premotor, parietal and temporal regions play a crucial role in action recognition and understanding. PMID:24910603
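
    A minimal sketch of the core anatomic likelihood estimation computation follows: each study's reported foci are smoothed into a "modelled activation" map, and the maps are combined as a probabilistic union. The 2-D grid, smoothing width and normalisation are simplifications, and the permutation-based significance testing of real ALE analyses is omitted.

```python
# Sketch: ALE map as the probabilistic union of per-study activation maps,
# ale = 1 - prod_i (1 - MA_i).
import numpy as np
from scipy.ndimage import gaussian_filter

def ale_map(studies, shape, sigma=2.0):
    """studies: list of arrays of focus coordinates, one array per study."""
    one_minus = np.ones(shape)
    for foci in studies:
        ma = np.zeros(shape)
        for (i, j) in foci:
            ma[i, j] = 1.0
        ma = gaussian_filter(ma, sigma)        # smooth foci into blobs
        ma /= ma.max() if ma.max() > 0 else 1.0  # crude rescale to [0, 1]
        one_minus *= 1.0 - ma
    return 1.0 - one_minus

studies = [np.array([[20, 20], [40, 45]]), np.array([[21, 19]])]
print(ale_map(studies, shape=(64, 64)).max())
```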

  18. Vulnerabilities to Rock-Slope Failure Impacts from Christchurch, NZ Case History Analysis

    NASA Astrophysics Data System (ADS)

    Grant, A.; Wartman, J.; Massey, C. I.; Olsen, M. J.; Motley, M. R.; Hanson, D.; Henderson, J.

    2015-12-01

    Rock-slope failures during the 2010/11 Canterbury (Christchurch), New Zealand Earthquake Sequence resulted in 5 fatalities and caused an estimated US$400 million of damage to buildings and infrastructure. Reducing losses from rock-slope failures requires consideration of both hazard (i.e. likelihood of occurrence) and risk (i.e. likelihood of losses given an occurrence). Risk assessment thus requires information on the vulnerability of structures to rock or boulder impacts. Here we present 32 case histories of structures impacted by boulders triggered during the 2010/11 Canterbury earthquake sequence, in the Port Hills region of Christchurch, New Zealand. The consequences of rock fall impacts on structures, taken as penetration distance into structures, are shown to follow a power-law distribution with impact energy. Detailed mapping of rock fall sources and paths from field mapping, aerial lidar digital elevation model (DEM) data, and high-resolution aerial imagery produced 32 well-constrained runout paths of boulders that impacted structures. Impact velocities used for structural analysis were developed using lumped mass 2-D rock fall runout models using 1-m resolution lidar elevation data. Model inputs were based on calibrated surface parameters from mapped runout paths of 198 additional boulder runouts. Terrestrial lidar scans and structure from motion (SfM) imagery generated 3-D point cloud data used to measure structural damage and impacting boulders. Combining velocity distributions from 2-D analysis and high-precision boulder dimensions, kinetic energy distributions were calculated for all impacts. Calculated impact energy versus penetration distance for all cases suggests a power-law relationship between damage and impact energy. These case histories and resulting fragility curve should serve as a foundation for future risk analysis of rock fall hazards by linking vulnerability data to the predicted energy distributions from the hazard analysis.
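
    A minimal sketch of fitting the reported power-law relation between penetration distance and impact energy, by ordinary least squares in log-log space, is shown below; the numbers are synthetic stand-ins, not the study's 32 case histories.

```python
# Sketch: fit penetration ~ a * E^b by linear regression on log-log data.
import numpy as np

rng = np.random.default_rng(5)
energy = 10 ** rng.uniform(1, 4, 32)                    # toy impact energies
penetration = 0.05 * energy**0.7 * rng.lognormal(0, 0.3, 32)  # synthetic data

slope, intercept = np.polyfit(np.log(energy), np.log(penetration), 1)
a, b = np.exp(intercept), slope
print(f"penetration ~ {a:.3f} * E^{b:.2f}")
```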

  19. A Bayesian modelling framework for tornado occurrences in North America

    NASA Astrophysics Data System (ADS)

    Cheng, Vincent Y. S.; Arhonditsis, George B.; Sills, David M. L.; Gough, William A.; Auld, Heather

    2015-03-01

    Tornadoes represent one of nature’s most hazardous phenomena that have been responsible for significant destruction and devastating fatalities. Here we present a Bayesian modelling approach for elucidating the spatiotemporal patterns of tornado activity in North America. Our analysis shows a significant increase in the Canadian Prairies and the Northern Great Plains during the summer, indicating a clear transition of tornado activity from the United States to Canada. The linkage between monthly-averaged atmospheric variables and likelihood of tornado events is characterized by distinct seasonality; the convective available potential energy is the predominant factor in the summer; vertical wind shear appears to have a strong signature primarily in the winter and secondarily in the summer; and storm relative environmental helicity is most influential in the spring. The present probabilistic mapping can be used to draw inference on the likelihood of tornado occurrence in any location in North America within a selected time period of the year.

  20. A Bayesian modelling framework for tornado occurrences in North America.

    PubMed

    Cheng, Vincent Y S; Arhonditsis, George B; Sills, David M L; Gough, William A; Auld, Heather

    2015-03-25

    Tornadoes represent one of nature's most hazardous phenomena that have been responsible for significant destruction and devastating fatalities. Here we present a Bayesian modelling approach for elucidating the spatiotemporal patterns of tornado activity in North America. Our analysis shows a significant increase in the Canadian Prairies and the Northern Great Plains during the summer, indicating a clear transition of tornado activity from the United States to Canada. The linkage between monthly-averaged atmospheric variables and likelihood of tornado events is characterized by distinct seasonality; the convective available potential energy is the predominant factor in the summer; vertical wind shear appears to have a strong signature primarily in the winter and secondarily in the summer; and storm relative environmental helicity is most influential in the spring. The present probabilistic mapping can be used to draw inference on the likelihood of tornado occurrence in any location in North America within a selected time period of the year.

  1. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting

    PubMed Central

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen; Wald, Lawrence L.

    2017-01-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization. PMID:26915119
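
    For contrast with the ML framework, here is a minimal sketch of the conventional MR fingerprinting reconstruction it generalises: per-voxel matching of measured signal evolutions against a precomputed dictionary by normalised inner product. Dictionary contents, sizes and the parameter ranges are illustrative assumptions.

```python
# Sketch: conventional MRF reconstruction as dictionary matching.
import numpy as np

def dictionary_match(signals, dictionary, params):
    """signals: (n_voxels, T); dictionary: (n_atoms, T); params: (n_atoms, 2)."""
    d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    s = signals / np.linalg.norm(signals, axis=1, keepdims=True)
    best = np.abs(s @ d.conj().T).argmax(axis=1)   # max correlation per voxel
    return params[best]                             # e.g. (T1, T2) per voxel

rng = np.random.default_rng(6)
dictionary = rng.normal(size=(1000, 500)) + 1j * rng.normal(size=(1000, 500))
params = np.stack([rng.uniform(100, 3000, 1000),    # hypothetical T1 (ms)
                   rng.uniform(10, 300, 1000)], 1)  # hypothetical T2 (ms)
voxels = dictionary[[3, 42]] + 0.1 * rng.normal(size=(2, 500))
print(dictionary_match(voxels, dictionary, params))
```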

  2. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting.

    PubMed

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen F; Wald, Lawrence L

    2016-08-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple MR tissue parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization.

  3. [Application of biotope mapping model integrated with vegetation cover continuity attributes in urban biodiversity conservation].

    PubMed

    Gao, Tian; Qiu, Ling; Chen, Cun-gen

    2010-09-01

    Based on the biotope classification system with vegetation structure as the framework, a modified biotope mapping model integrated with vegetation cover continuity attributes was developed, and applied to the study of the greenbelts in Helsingborg in southern Sweden. An evaluation of the vegetation cover continuity in the greenbelts was carried out by the comparisons of the vascular plant species richness in long- and short-continuity forests, based on the identification of woodland continuity by using ancient woodland indicator species (AWIS). In the test greenbelts, long-continuity woodlands had more AWIS. Among the forests where the dominant trees were more than 30-year-old, the long-continuity ones had a higher biodiversity of vascular plants, compared with the short-continuity ones with the similar vegetation structure. The modified biotope mapping model integrated with the continuity features of vegetation cover could be an important tool in investigating urban biodiversity, and provide corresponding strategies for future urban biodiversity conservation.

  4. Mapping aerial metal deposition in metropolitan areas from tree bark: a case study in Sheffield, England.

    PubMed

    Schelle, E; Rawlins, B G; Lark, R M; Webster, R; Staton, I; McLeod, C W

    2008-09-01

    We investigated the use of metals accumulated on tree bark for mapping their deposition across metropolitan Sheffield by sampling 642 trees of three common species. Mean concentrations of metals were generally an order of magnitude greater than in samples from a remote uncontaminated site. We found trivially small differences among tree species with respect to metal concentrations on bark, and in subsequent statistical analyses did not discriminate between them. We mapped the concentrations of As, Cd and Ni by lognormal universal kriging using parameters estimated by residual maximum likelihood (REML). The concentrations of Ni and Cd were greatest close to a large steel works, their probable source, and declined markedly within 500 m of it and from there more gradually over several kilometres. Arsenic was much more evenly distributed, probably as a result of locally mined coal burned in domestic fires for many years. Tree bark seems to integrate airborne pollution over time, and our findings show that sampling and analysing it are cost-effective means of mapping and identifying sources.
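
    A minimal sketch of the interpolation step is given below, using a Gaussian process on log-transformed concentrations as a rough stand-in for the study's lognormal universal kriging. Note the assumptions: scikit-learn fits covariance parameters by maximum marginal likelihood rather than the REML the authors used, the linear trend of universal kriging is omitted, and the naive exponential back-transform ignores the lognormal bias correction.

```python
# Sketch: GP interpolation of log metal concentrations over a study area.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(7)
coords = rng.uniform(0, 10_000, (200, 2))             # toy tree locations (m)
conc = np.exp(2 + 0.0003 * coords[:, 0] + rng.normal(0, 0.5, 200))  # toy Ni

gp = GaussianProcessRegressor(RBF(1000.0) + WhiteKernel(0.1),
                              normalize_y=True).fit(coords, np.log(conc))
grid = np.mgrid[0:10_000:50j, 0:10_000:50j].reshape(2, -1).T
pred = np.exp(gp.predict(grid))                       # naive back-transform
print(pred.min(), pred.max())
```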

  5. The Atacama Cosmology Telescope (ACT): Beam Profiles and First SZ Cluster Maps

    NASA Technical Reports Server (NTRS)

    Hincks, A. D.; Acquaviva, V.; Ade, P. A.; Aguirre, P.; Amiri, M.; Appel, J. W.; Barrientos, L. F.; Battistelli, E. S.; Bond, J. R.; Brown, B.; et al.

    2010-01-01

    The Atacama Cosmology Telescope (ACT) is currently observing the cosmic microwave background with arcminute resolution at 148 GHz, 218 GHz, and 277 GHz. In this paper, we present ACT's first results. Data have been analyzed using a maximum-likelihood map-making method which uses B-splines to model and remove the atmospheric signal. It has been used to make high-precision beam maps from which we determine the experiment's window functions. This beam information directly impacts all subsequent analyses of the data. We also used the method to map a sample of galaxy clusters via the Sunyaev-Zel'dovich (SZ) effect, and show five clusters previously detected with X-ray or SZ observations. We provide integrated Compton-y measurements for each cluster. Of particular interest is our detection of the z = 0.44 component of A3128 and our current non-detection of the low-redshift part, providing strong evidence that the more distant cluster is more massive, as suggested by X-ray measurements. This is a compelling example of the redshift-independent mass selection of the SZ effect.

  6. Mapping proteins in the presence of paralogs using units of coevolution

    PubMed Central

    2013-01-01

    Background: We study the problem of mapping proteins between two protein families in the presence of paralogs. This problem occurs as a difficult subproblem in coevolution-based computational approaches for protein-protein interaction prediction. Results: Similar to prior approaches, our method is based on the idea that coevolution implies equal rates of sequence evolution among the interacting proteins, and we provide a first attempt to quantify this notion in a formal statistical manner. We call the units that are central to this quantification scheme the units of coevolution. A unit consists of two mapped protein pairs and its score quantifies the coevolution of the pairs. This quantification allows us to provide a maximum likelihood formulation of the paralog mapping problem and to cast it into a binary quadratic programming formulation. Conclusion: CUPID, our software tool based on a Lagrangian relaxation of this formulation, makes it, for the first time, possible to compute state-of-the-art quality pairings in a few minutes of runtime. In summary, we suggest a novel alternative to the earlier available approaches, which is statistically sound and computationally feasible. PMID:24564758

  7. A GIS/Remote Sensing-based methodology for groundwater potentiality assessment in Tirnavos area, Greece

    NASA Astrophysics Data System (ADS)

    Oikonomidis, D.; Dimogianni, S.; Kazakis, N.; Voudouris, K.

    2015-06-01

    The aim of this paper is to assess groundwater potentiality by combining Geographic Information Systems and Remote Sensing with data obtained from the field, as an additional tool for hydrogeological research. The present study was carried out in the broader area of Tirnavos, covering 419.4 km2. The study area is located in Thessaly (central Greece) and is crossed by two rivers, Pinios and Titarisios. Agriculture is one of the main elements of Thessaly's economy, resulting in intense agricultural activity and consequently increased exploitation of groundwater resources. Geographic Information Systems (GIS) and Remote Sensing (RS) were used to create a map depicting the likelihood of groundwater occurrence, consisting of five potentiality classes ranging from very high to very low. The extraction of this map is based on input data such as rainfall, potential recharge, lithology, lineament density, slope, drainage density and depth to groundwater. Weights were assigned to all these factors according to their relevance to groundwater potential, and a map based on a weighted spatial modelling system was created. Furthermore, a groundwater quality suitability map was produced by overlaying the groundwater potentiality map with the map showing the potential zones for drinking groundwater in the study area. The results provide significant information, and the maps could be used by local authorities for groundwater exploitation and management.
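
    A minimal sketch of the weighted spatial overlay underlying such a potentiality map follows; the factor weights below are illustrative assumptions, not the study's calibrated values.

```python
# Sketch: weighted overlay of suitability-rescaled raster layers,
# then classification into five potentiality classes.
import numpy as np

def rescale_1_to_5(layer):
    """Linearly rescale a raster to suitability scores in [1, 5]."""
    lo, hi = layer.min(), layer.max()
    return 1 + 4 * (layer - lo) / (hi - lo)

rng = np.random.default_rng(8)
shape = (100, 100)
layers = {"rainfall": rng.random(shape), "lineament_density": rng.random(shape),
          "slope": -rng.random(shape),   # steeper slope -> lower potential
          "drainage_density": rng.random(shape)}
weights = {"rainfall": 0.35, "lineament_density": 0.25,
           "slope": 0.20, "drainage_density": 0.20}   # assumed weights

potential = sum(weights[k] * rescale_1_to_5(v) for k, v in layers.items())
classes = np.digitize(potential, np.quantile(potential, [0.2, 0.4, 0.6, 0.8]))
print(np.bincount(classes.ravel()))   # five classes, very low -> very high
```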

  8. Bayesian B-spline mapping for dynamic quantitative traits.

    PubMed

    Xing, Jun; Li, Jiahan; Yang, Runqing; Zhou, Xiaojing; Xu, Shizhong

    2012-04-01

    Owing to their ability and flexibility to describe individual gene expression at different time points, random regression (RR) analyses have become a popular procedure for the genetic analysis of dynamic traits whose phenotypes are collected over time. Specifically, when modelling the dynamic patterns of gene expressions in the RR framework, B-splines have been proved successful as an alternative to orthogonal polynomials. In the so-called Bayesian B-spline quantitative trait locus (QTL) mapping, B-splines are used to characterize the patterns of QTL effects and individual-specific time-dependent environmental errors over time, and the Bayesian shrinkage estimation method is employed to estimate model parameters. Extensive simulations demonstrate that (1) in terms of statistical power, Bayesian B-spline mapping outperforms the interval mapping based on the maximum likelihood; (2) for the simulated dataset with complicated growth curve simulated by B-splines, Legendre polynomial-based Bayesian mapping is not capable of identifying the designed QTLs accurately, even when higher-order Legendre polynomials are considered and (3) for the simulated dataset using Legendre polynomials, the Bayesian B-spline mapping can find the same QTLs as those identified by Legendre polynomial analysis. All simulation results support the necessity and flexibility of B-spline in Bayesian mapping of dynamic traits. The proposed method is also applied to a real dataset, where QTLs controlling the growth trajectory of stem diameters in Populus are located.

  9. High-confidence coding and noncoding transcriptome maps

    PubMed Central

    2017-01-01

    The advent of high-throughput RNA sequencing (RNA-seq) has led to the discovery of unprecedentedly immense transcriptomes encoded by eukaryotic genomes. However, the transcriptome maps are still incomplete partly because they were mostly reconstructed based on RNA-seq reads that lack their orientations (known as unstranded reads) and certain boundary information. Methods to expand the usability of unstranded RNA-seq data by predetermining the orientation of the reads and precisely determining the boundaries of assembled transcripts could significantly benefit the quality of the resulting transcriptome maps. Here, we present a high-performing transcriptome assembly pipeline, called CAFE, that significantly improves the original assemblies, respectively assembled with stranded and/or unstranded RNA-seq data, by orienting unstranded reads using the maximum likelihood estimation and by integrating information about transcription start sites and cleavage and polyadenylation sites. Applying large-scale transcriptomic data comprising 230 billion RNA-seq reads from the ENCODE, Human BodyMap 2.0, The Cancer Genome Atlas, and GTEx projects, CAFE enabled us to predict the directions of about 220 billion unstranded reads, which led to the construction of more accurate transcriptome maps, comparable to the manually curated map, and a comprehensive lncRNA catalog that includes thousands of novel lncRNAs. Our pipeline should not only help to build comprehensive, precise transcriptome maps from complex genomes but also to expand the universe of noncoding genomes. PMID:28396519

  10. Comparison of Pixel-Based and Object-Based Classification Using Parameters and Non-Parameters Approach for the Pattern Consistency of Multi Scale Landcover

    NASA Astrophysics Data System (ADS)

    Juniati, E.; Arrofiqoh, E. N.

    2017-09-01

    Information extraction from remote sensing data, especially land cover, can be obtained by digital classification. In practice, some people are more comfortable using visual interpretation to retrieve land cover information. However, visual interpretation is highly influenced by the subjectivity and knowledge of the interpreter, and it is time consuming. Digital classification can be done in several ways, depending on the mapping approach and the assumptions made about the data distribution. This study compared several classification methods applied to different data types at the same location. The data used were Landsat 8 satellite imagery, SPOT 6 imagery and orthophotos. In practice, these data are used to produce land cover maps at 1:50,000 scale for Landsat, 1:25,000 for SPOT and 1:5,000 for orthophotos, but with visual interpretation to retrieve the information. A maximum likelihood classifier (MLC), which is pixel-based and parametric, and an artificial neural network classifier, which is pixel-based and non-parametric, were applied to these data; in addition, object-based classifiers were applied. The classification system implemented is the land cover classification of the Indonesian topographic map. The classification was applied to each data source in order to recognize the pattern and to assess the consistency of the land cover maps produced from each data set. Furthermore, the study analyses the benefits and limitations of each method.

  11. Maximum Likelihood Item Easiness Models for Test Theory Without an Answer Key

    PubMed Central

    Batchelder, William H.

    2014-01-01

    Cultural consensus theory (CCT) is a data aggregation technique with many applications in the social and behavioral sciences. We describe the intuition and theory behind a set of CCT models for continuous type data using maximum likelihood inference methodology. We describe how bias parameters can be incorporated into these models. We introduce two extensions to the basic model in order to account for item rating easiness/difficulty. The first extension is a multiplicative model and the second is an additive model. We show how the multiplicative model is related to the Rasch model. We describe several maximum-likelihood estimation procedures for the models and discuss issues of model fit and identifiability. We describe how the CCT models could be used to give alternative consensus-based measures of reliability. We demonstrate the utility of both the basic and extended models on a set of essay rating data and give ideas for future research. PMID:29795812

  12. Absolute continuity for operator valued completely positive maps on C∗-algebras

    NASA Astrophysics Data System (ADS)

    Gheondea, Aurelian; Kavruk, Ali Şamil

    2009-02-01

    Motivated by applicability to quantum operations, quantum information, and quantum probability, we investigate the notion of absolute continuity for operator valued completely positive maps on C∗-algebras, previously introduced by Parthasarathy [in Athens Conference on Applied Probability and Time Series Analysis I (Springer-Verlag, Berlin, 1996), pp. 34-54]. We obtain an intrinsic definition of absolute continuity, we show that the Lebesgue decomposition defined by Parthasarathy is the maximal one among all other Lebesgue-type decompositions and that this maximal Lebesgue decomposition does not depend on the jointly dominating completely positive map, we obtain more flexible formulas for calculating the maximal Lebesgue decomposition, and we point out the nonuniqueness of the Lebesgue decomposition as well as a sufficient condition for uniqueness. In addition, we consider Radon-Nikodym derivatives for absolutely continuous completely positive maps that, in general, are unbounded positive self-adjoint operators affiliated to a certain von Neumann algebra, and we obtain a spectral approximation by bounded Radon-Nikodym derivatives. An application to the existence of the infimum of two completely positive maps is indicated, and formulas in terms of Choi's matrices for the Lebesgue decomposition of completely positive maps in matrix algebras are obtained.

  13. Linkage maps of grapevine displaying the chromosomal locations of 420 microsatellite markers and 82 markers for R-gene candidates.

    PubMed

    Di Gaspero, G; Cipriani, G; Adam-Blondon, A-F; Testolin, R

    2007-05-01

    Genetic maps functionally oriented towards disease resistance have been constructed in grapevine by analysing 502 markers, including microsatellites and resistance gene analogs (RGAs), with a simultaneous maximum-likelihood estimation of linkage. Mapping material consisted of two pseudo-testcrosses, 'Chardonnay' x 'Bianca' and 'Cabernet Sauvignon' x '20/3' where the seed parents were Vitis vinifera genotypes and the male parents were Vitis hybrids carrying resistance to mildew diseases. Individual maps included 320-364 markers each. The simultaneous use of two mapping crosses made with two pairs of distantly related parents allowed mapping as much as 91% of the markers tested. The integrated map included 420 Simple Sequence Repeat (SSR) markers that identified 536 SSR loci and 82 RGA markers that identified 173 RGA loci. This map consisted of 19 linkage groups (LGs) corresponding to the grape haploid chromosome number, had a total length of 1,676 cM and a mean distance between adjacent loci of 3.6 cM. Single-locus SSR markers were randomly distributed over the map (CD = 1.12). RGA markers were found in 18 of the 19 LGs but most of them (83%) were clustered on seven LGs, namely groups 3, 7, 9, 12, 13, 18 and 19. Several RGA clusters mapped to chromosomal regions where phenotypic traits of resistance to fungal diseases such as downy mildew and powdery mildew, bacterial diseases such as Pierce's disease, and pests such as dagger and root-knot nematode, were previously mapped in different segregating populations. The high number of RGA markers integrated into this new map will help find markers linked to genetic determinants of different pest and disease resistances in grape.

  14. An Analysis of Factor Extraction Strategies: A Comparison of the Relative Strengths of Principal Axis, Ordinary Least Squares, and Maximum Likelihood in Research Contexts That Include Both Categorical and Continuous Variables

    ERIC Educational Resources Information Center

    Coughlin, Kevin B.

    2013-01-01

    This study is intended to provide researchers with empirically derived guidelines for conducting factor analytic studies in research contexts that include dichotomous and continuous levels of measurement. This study is based on the hypotheses that ordinary least squares (OLS) factor analysis will yield more accurate parameter estimates than…

  15. Louisiana Barrier Island Comprehensive Monitoring (BICM) Program Summary Report: Data and Analyses 2006 through 2010

    USGS Publications Warehouse

    Kindinger, Jack G.; Buster, Noreen A.; Flocks, James G.; Bernier, Julie C.; Kulp, Mark A.

    2013-01-01

    The Barrier Island Comprehensive Monitoring (BICM) program was implemented under the Louisiana Coastal Area Science and Technology (LCA S&T) office as a component of the System Wide Assessment and Monitoring (SWAMP) program. The BICM project was developed by the State of Louisiana (Coastal Protection Restoration Authority [CPRA], formerly Department of Natural Resources [DNR]) to complement other Louisiana coastal monitoring programs such as the Coastwide Reference Monitoring System-Wetlands (CRMS-Wetlands) and was a collaborative research effort by CPRA, University of New Orleans (UNO), and the U.S. Geological Survey (USGS). The goal of the BICM program was to provide long-term data on the barrier islands of Louisiana that could be used to plan, design, evaluate, and maintain current and future barrier-island restoration projects. The BICM program used both historical and newly acquired (2006 to 2010) data to assess and monitor changes in the aerial and subaqueous extent of islands, habitat types, sediment texture and geotechnical properties, environmental processes, and vegetation composition. BICM datasets included aerial still and video photography (multiple time series) for shoreline positions, habitat mapping, and land loss; light detection and ranging (lidar) surveys for topographic elevations; single-beam and swath bathymetry; and sediment grab samples. Products produced using BICM data and analyses included (but were not limited to) storm-impact assessments, rate of shoreline and bathymetric change, shoreline-erosion and accretion maps, high-resolution elevation maps, coastal-shoreline and barrier-island habitat-classification maps, and coastal surficial-sediment characterization maps. Discussions in this report summarize the extensive data-collection efforts and present brief interpretive analyses for four coastal Louisiana geographic regions. In addition, several coastal-wide and topical themes were selected that integrate the data and analyses within a broader coastal context: (1) barrier-shoreline evolution driven by rapid relative sea-level rise (RSLR), (2) hurricane impacts to the Chandeleur Islands and likelihood of island recovery, (3) impact of tropical storms on barrier shorelines, (4) Barataria Bay tidal-inlet management, and (5) habitat changes related to RSLR. The final theme addresses potential future goals of the BICM program, including rotational annual to semi-decadal monitoring, proposed new-data collection, how to incorporate technological advances with previous data-collection and monitoring protocols, and standardizing methods and quality-control assessments for continued coastal monitoring and restoration.

  16. Exponential series approaches for nonparametric graphical models

    NASA Astrophysics Data System (ADS)

    Janofsky, Eric

    Markov Random Fields (MRFs) or undirected graphical models are parsimonious representations of joint probability distributions. This thesis studies high-dimensional, continuous-valued pairwise Markov Random Fields. We are particularly interested in approximating pairwise densities whose logarithm belongs to a Sobolev space. For this problem we propose the method of exponential series which approximates the log density by a finite-dimensional exponential family with the number of sufficient statistics increasing with the sample size. We consider two approaches to estimating these models. The first is regularized maximum likelihood. This involves optimizing the sum of the log-likelihood of the data and a sparsity-inducing regularizer. We then propose a variational approximation to the likelihood based on tree-reweighted, nonparametric message passing. This approximation allows for upper bounds on risk estimates, leverages parallelization and is scalable to densities on hundreds of nodes. We show how the regularized variational MLE may be estimated using a proximal gradient algorithm. We then consider estimation using regularized score matching. This approach uses an alternative scoring rule to the log-likelihood, which obviates the need to compute the normalizing constant of the distribution. For general continuous-valued exponential families, we provide parameter and edge consistency results. As a special case we detail a new approach to sparse precision matrix estimation which has statistical performance competitive with the graphical lasso and computational performance competitive with the state-of-the-art glasso algorithm. We then describe results for model selection in the nonparametric pairwise model using exponential series. The regularized score matching problem is shown to be a convex program; we provide scalable algorithms based on consensus alternating direction method of multipliers (ADMM) and coordinate-wise descent. We use simulations to compare our method to others in the literature as well as the aforementioned TRW estimator.
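
    As a point of comparison for the precision-matrix result mentioned above, here is a minimal sketch of sparse precision-matrix estimation with the graphical lasso, the baseline estimator the thesis compares against; the regularized score-matching estimator itself is not reproduced.

```python
# Sketch: graphical lasso on toy Gaussian data with a chain-structured
# precision matrix, recovering the sparsity pattern (the graph edges).
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(9)
p = 10
prec = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))  # chain graph
cov = np.linalg.inv(prec)
x = rng.multivariate_normal(np.zeros(p), cov, size=500)

model = GraphicalLasso(alpha=0.1).fit(x)
est_edges = np.abs(model.precision_) > 1e-3
print(est_edges.sum() - p, "off-diagonal entries kept")
```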

  17. Assessing state efforts to integrate transportation, land use and climate change.

    DOT National Transportation Integrated Search

    2016-12-01

    Climate change is increasingly recognized as a threat to life on earth. Continued emission of greenhouse gases will cause further : warming and long-lasting changes in all components of the climate system, increasing the likelihood of severe, perv...

  18. Use of LANDSAT imagery for wildlife habitat mapping in northeast and east central Alaska

    NASA Technical Reports Server (NTRS)

    Lent, P. C. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Two scenes were analyzed by applying an iterative cluster analysis to a 2% random data sample and then using the resulting clusters as a training set basis for maximum likelihood classification. Twenty-six and twenty-seven categorical classes, respectively resulted from this process. The majority of classes in each case were quite specific vegetation types; each of these types has specific value as moose habitat.

  19. The use of LANDSAT data to monitor the urban growth of Sao Paulo Metropolitan area

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Niero, M.; Lombardo, M. A.; Foresti, C.

    1982-01-01

    Urban growth from 1977 to 1979 of the region between Billings and the Guarapiranga reservoir was mapped and the problematic urban areas identified using several LANDSAT products. Visual and automatic interpretation techniques were applied to the data. Computer compatible tapes of LANDSAT multispectral scanner data were analyzed through the maximum likelihood Gaussian algorithm. The feasibility of monitoring fast urban growth by remote sensing techniques for efficient urban planning and control is demonstrated.

  20. Mapping grey matter reductions in schizophrenia: an anatomical likelihood estimation analysis of voxel-based morphometry studies.

    PubMed

    Fornito, A; Yücel, M; Patti, J; Wood, S J; Pantelis, C

    2009-03-01

    Voxel-based morphometry (VBM) is a popular tool for mapping neuroanatomical changes in schizophrenia patients. Several recent meta-analyses have identified the brain regions in which patients most consistently show grey matter reductions, although they have not examined whether such changes reflect differences in grey matter concentration (GMC) or grey matter volume (GMV). These measures assess different aspects of grey matter integrity, and may therefore reflect different pathological processes. In this study, we used the Anatomical Likelihood Estimation procedure to analyse significant differences reported in 37 VBM studies of schizophrenia patients, incorporating data from 1646 patients and 1690 controls, and compared the findings of studies using either GMC or GMV to index grey matter differences. Analysis of all studies combined indicated that grey matter reductions in a network of frontal, temporal, thalamic and striatal regions are among the most frequently reported in literature. GMC reductions were generally larger and more consistent than GMV reductions, and were more frequent in the insula, medial prefrontal, medial temporal and striatal regions. GMV reductions were more frequent in dorso-medial frontal cortex, and lateral and orbital frontal areas. These findings support the primacy of frontal, limbic, and subcortical dysfunction in the pathophysiology of schizophrenia, and suggest that the grey matter changes observed with MRI may not necessarily result from a unitary pathological process.

  1. Assessing Landscape Scale Wildfire Exposure for Highly Valued Resources in a Mediterranean Area

    NASA Astrophysics Data System (ADS)

    Alcasena, Fermín J.; Salis, Michele; Ager, Alan A.; Arca, Bachisio; Molina, Domingo; Spano, Donatella

    2015-05-01

    We used a fire simulation modeling approach to assess landscape scale wildfire exposure for highly valued resources and assets (HVR) in a fire-prone area of 680 km2 located in central Sardinia, Italy. The study area was affected by several wildfires in the last half century: some large and intense fire events threatened wildland urban interfaces as well as other socioeconomic and cultural values. Historical wildfire and weather data were used to inform wildfire simulations, which were based on the minimum travel time algorithm as implemented in FlamMap. We simulated 90,000 fires that replicated recent large fire events in the area spreading under severe weather conditions to generate detailed maps of wildfire likelihood and intensity. Then, we linked fire modeling outputs to a geospatial risk assessment framework focusing on buffer areas around HVR. The results highlighted a large variation in burn probability and fire intensity in the vicinity of HVRs, and allowed us to identify the areas most exposed to wildfires and thus to a higher potential damage. Fire intensity in the HVR buffers was mainly related to fuel types, while wind direction, topographic features, and historically based ignition pattern were the key factors affecting fire likelihood. The methodology presented in this work can have numerous applications, in the study area and elsewhere, particularly to address and inform fire risk management, landscape planning and people safety in the vicinity of HVRs.

  2. Ecological statistics of Gestalt laws for the perceptual organization of contours.

    PubMed

    Elder, James H; Goldberg, Richard M

    2002-01-01

    Although numerous studies have measured the strength of visual grouping cues for controlled psychophysical stimuli, little is known about the statistical utility of these various cues for natural images. In this study, we conducted experiments in which human participants trace perceived contours in natural images. These contours are automatically mapped to sequences of discrete tangent elements detected in the image. By examining relational properties between pairs of successive tangents on these traced curves, and between randomly selected pairs of tangents, we are able to estimate the likelihood distributions required to construct an optimal Bayesian model for contour grouping. We employed this novel methodology to investigate the inferential power of three classical Gestalt cues for contour grouping: proximity, good continuation, and luminance similarity. The study yielded a number of important results: (1) these cues, when appropriately defined, are approximately uncorrelated, suggesting a simple factorial model for statistical inference; (2) moderate image-to-image variation of the statistics indicates the utility of general probabilistic models for perceptual organization; (3) these cues differ greatly in their inferential power, proximity being by far the most powerful; and (4) statistical modeling of the proximity cue indicates a scale-invariant power law in close agreement with prior psychophysics.
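
    A minimal sketch of the factorial inference model suggested by finding (1) above: with approximately uncorrelated cues, the posterior odds that two tangent elements belong to the same contour are the prior odds times a product of per-cue likelihood ratios. The likelihood-ratio functions below are illustrative stand-ins for the empirically estimated distributions.

```python
# Sketch: naive-Bayes (factorial) combination of Gestalt grouping cues.
import numpy as np

def posterior_odds(prior_odds, cues, likelihood_ratios):
    """cues: dict cue_name -> value; likelihood_ratios: cue_name -> callable."""
    odds = prior_odds
    for name, value in cues.items():
        odds *= likelihood_ratios[name](value)
    return odds

lr = {  # toy monotone likelihood-ratio functions, one per cue
    "proximity": lambda gap: 20.0 * np.exp(-gap / 2.0),             # pixels
    "good_continuation": lambda angle: 3.0 * np.exp(-angle / 0.5),  # radians
    "luminance_similarity": lambda dl: 1.5 * np.exp(-abs(dl) / 10.0),
}
odds = posterior_odds(0.01, {"proximity": 1.0, "good_continuation": 0.2,
                             "luminance_similarity": 2.0}, lr)
print("posterior probability of same contour:", odds / (1 + odds))
```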

  3. A multifractal analysis of equilibrium measures for conformal expanding maps and Moran-like geometric constructions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pesin, Y.; Weiss, H.

    1997-01-01

    In this paper we establish the complete multifractal formalism for equilibrium measures for Hölder continuous conformal expanding maps and expanding Markov Moran-like geometric constructions. Examples include Markov maps of an interval, beta transformations of an interval, rational maps with hyperbolic Julia sets, and conformal toral endomorphisms. We also construct a Hölder continuous homeomorphism of a compact metric space with an ergodic invariant measure of positive entropy for which the dimension spectrum is not convex, and hence the multifractal formalism fails.

  4. Efficient computation of the phylogenetic likelihood function on multi-gene alignments and multi-core architectures.

    PubMed

    Stamatakis, Alexandros; Ott, Michael

    2008-12-27

    The continuous accumulation of sequence data, for example, due to novel wet-laboratory techniques such as pyrosequencing, coupled with the increasing popularity of multi-gene phylogenies and emerging multi-core processor architectures that face problems of cache congestion, poses new challenges with respect to the efficient computation of the phylogenetic maximum-likelihood (ML) function. Here, we propose two approaches that can significantly speed up likelihood computations that typically represent over 95 per cent of the computational effort conducted by current ML or Bayesian inference programs. Initially, we present a method and an appropriate data structure to efficiently compute the likelihood score on 'gappy' multi-gene alignments. By 'gappy' we denote sampling-induced gaps owing to missing sequences in individual genes (partitions), i.e. not real alignment gaps. A first proof-of-concept implementation in RAxML indicates that this approach can accelerate inferences on large and gappy alignments by approximately one order of magnitude. Moreover, we present insights and initial performance results on multi-core architectures obtained during the transition from an OpenMP-based to a Pthreads-based fine-grained parallelization of the ML function.

  5. Multifrequency InSAR height reconstruction through maximum likelihood estimation of local planes parameters.

    PubMed

    Pascazio, Vito; Schirinzi, Gilda

    2002-01-01

    In this paper, a technique that is able to reconstruct highly sloped and discontinuous terrain height profiles, starting from multifrequency wrapped phase acquired by interferometric synthetic aperture radar (SAR) systems, is presented. We propose an innovative unwrapping method, based on a maximum likelihood estimation technique, which uses multifrequency independent phase data, obtained by filtering the interferometric SAR raw data pair through nonoverlapping band-pass filters, and approximating the unknown surface by means of local planes. Since the method does not exploit the phase gradient, it assures the uniqueness of the solution, even in the case of highly sloped or piecewise continuous elevation patterns with strong discontinuities.
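
    A minimal sketch of the multifrequency ML idea, reduced to grid-searching a single height value under an assumed von Mises phase-noise model (so the log-likelihood is a sum of cosines), is shown below; the paper's local-plane parameterisation and band-pass filtering of the raw data are omitted, and the height ambiguities and noise concentrations are illustrative.

```python
# Sketch: ML height estimation from several wrapped interferometric phases.
import numpy as np

def ml_height(phases, ambiguities, kappas, h_grid):
    """phases[i]: wrapped phase of channel i; ambiguities[i]: metres/cycle."""
    k = 2 * np.pi / np.asarray(ambiguities)            # rad per metre
    ll = sum(kap * np.cos(phi - ki * h_grid)           # von Mises log-likelihood
             for phi, ki, kap in zip(phases, k, kappas))
    return h_grid[np.argmax(ll)]

rng = np.random.default_rng(10)
true_h, ambiguities, kappas = 123.4, [40.0, 55.0, 90.0], [8.0, 8.0, 8.0]
phases = [(2 * np.pi * true_h / L + rng.normal(0, 0.2)) % (2 * np.pi)
          for L in ambiguities]
print(ml_height(phases, ambiguities, kappas, np.linspace(0, 300, 30_001)))
```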

  6. Understanding the Relationship Between Exergame Play Experiences, Enjoyment, and Intentions for Continued Play.

    PubMed

    Limperos, Anthony M; Schmierbach, Mike

    2016-04-01

    Although it is generally understood that exergames can be beneficial, more research is needed to understand how in-game experiences influence enjoyment and the likelihood for continued use of these types of games. Therefore, the objective of this research is to understand how player performance in an exergame affects psychological responses (autonomy, competence, and presence), enjoyment of the experience, and likelihood for future play. Sixty-two college students (mean age, 20.32 years) participated in an experiment where they played a challenge event on the "Biggest Loser" exergame for the Nintendo (Kyoto, Japan) Wii™ console. Participants were given up to two chances to see if they could "win" the challenge event. A lab assistant recorded player performance for each session in minutes and seconds (range, 30 seconds-10 minutes). The attempt in which the participant achieved the greatest amount of time playing was used as a measure of player performance. After playing, subjects filled out a questionnaire with items pertaining to enjoyment, competence, autonomy, presence, and future intentions for continued use of the exergame. The results suggest that player achievement (longer time spent playing) directly and indirectly predicts feelings of autonomy, competence, presence, enjoyment, and future intentions to play. Individuals who performed better felt more autonomous and experienced greater presence, leading to greater enjoyment. Enjoyment and presence were found to mediate the relationship between player performance and future intentions to play an exergame. This study suggests that performance in exergames is related to psychological experiences that fuel enjoyment and the likelihood for future exergame use. The theoretical and practical significances of these findings are discussed, as well as future research involving exergames.

  7. The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments

    NASA Astrophysics Data System (ADS)

    Chen, Fajing; Jiao, Meiyan; Chen, Jing

    2013-04-01

    Public weather services are trending toward providing users with probabilistic weather forecasts, in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecast (BPF), a new statistical method for probabilistic forecast, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. The Bayes theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
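
    A minimal sketch of the normal-linear BPF special case mentioned above: with a normal climatological prior on the predictand W and a linear-normal likelihood for the forecast X given W, the posterior follows from a standard conjugate update. In the meta-Gaussian BPF the same update is applied after transforming both variables to standard normal, a step omitted here; all numbers are toy values.

```python
# Sketch: normal-linear Bayesian Processor of Forecast (conjugate update).
import numpy as np

def normal_linear_bpf(x, prior_mean, prior_var, a, b, sigma2):
    """Posterior of W given forecast X = a + b*W + N(0, sigma2) noise."""
    post_var = 1.0 / (1.0 / prior_var + b**2 / sigma2)
    post_mean = post_var * (prior_mean / prior_var + b * (x - a) / sigma2)
    return post_mean, post_var

# Toy numbers: climatology of 0000 UTC surface temperature, one forecast.
mean, var = normal_linear_bpf(x=-2.0, prior_mean=1.0, prior_var=16.0,
                              a=0.5, b=0.9, sigma2=4.0)
print(f"posterior: N({mean:.2f}, {var:.2f})")
```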

  8. An integrated Landsat/ancillary data classification of desert rangeland

    NASA Technical Reports Server (NTRS)

    Price, K. P.; Ridd, M. K.; Merola, J. A.

    1985-01-01

    Range inventorying methods using Landsat MSS data, coupled with ancillary data were examined. The study area encompassed nearly 20,000 acres in Rush Valley, UT. The vegetation is predominately desert shrub and annual grasses, with some annual forbs. Three Landsat scenes were evaluated using a Kauth-Thomas brightness/greenness data transformation (May, June, and August dates). The data was classified using a four-band maximum-likelihood classifier. A print map was taken into the field to determine the relationship between print symbols and vegetation. It was determined that classification confusion could be greatly reduced by incorporating geomorphic units and soil texture (coarse vs fine) into the classification. Spectral data, geomorphic units, and soil texture were combined in a GIS format to produce a final vegetation map identifying 12 vegetation types.

  9. An integrated LANDSAT/ancillary data classification of desert rangeland

    NASA Technical Reports Server (NTRS)

    Price, K. P.; Ridd, M. K.; Merola, J. A.

    1984-01-01

    Range inventorying methods using LANDSAT MSS data, coupled with ancillary data, were examined. The study area encompassed nearly 20,000 acres in Rush Valley, Utah. The vegetation is predominately desert shrub and annual grasses, with some annual forbs. Three LANDSAT scenes were evaluated using a Kauth-Thomas brightness/greenness data transformation (May, June, and August dates). The data were classified using a four-band maximum-likelihood classifier. A print map was taken into the field to determine the relationship between print symbols and vegetation. It was determined that classification confusion could be greatly reduced by incorporating geomorphic units and soil texture (coarse vs fine) into the classification. Spectral data, geomorphic units, and soil texture were combined in a GIS format to produce a final vegetation map identifying 12 vegetation types.

  10. Nonlinear BCJR equalizer for suppression of intrachannel nonlinearities in 40 Gb/s optical communications systems.

    PubMed

    Djordjevic, Ivan B; Vasic, Bane

    2006-05-29

    Maximum a posteriori probability (MAP) symbol decoding supplemented with iterative decoding is proposed as an effective means of suppressing intrachannel nonlinearities. The MAP detector, based on the Bahl-Cocke-Jelinek-Raviv (BCJR) algorithm, operates on the channel trellis, a dynamical model of intersymbol interference, and provides soft-decision outputs that are processed further in an iterative decoder. A dramatic performance improvement is demonstrated. The main reason is that the conventional maximum-likelihood sequence detector based on the Viterbi algorithm provides hard-decision outputs only, hence preventing soft iterative decoding. The proposed scheme operates very well in the presence of strong intrachannel intersymbol interference, where other advanced forward error correction schemes fail, and it is also suitable for 40 Gb/s upgrades over existing 10 Gb/s infrastructure.
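
    The BCJR forward-backward recursion that produces these soft outputs can be sketched on a toy two-tap ISI channel. Channel taps, noise level, and block length below are arbitrary choices for illustration, not the paper's optical channel model:

```python
import numpy as np

h = np.array([1.0, 0.5])          # known channel taps: y_k = h0*x_k + h1*x_{k-1} + noise
sigma = 0.5                       # AWGN standard deviation
symbols = np.array([+1, -1])      # BPSK alphabet; trellis state = previous symbol

def bcjr_app(y):
    """Return P(x_k = symbols[j] | y) for all k via the forward-backward recursion."""
    n = len(y)
    mean = h[0] * symbols[None, :] + h[1] * symbols[:, None]          # mean[s, x]
    gamma = 0.5 * np.exp(-(y[:, None, None] - mean) ** 2 / (2 * sigma**2))
    alpha = np.zeros((n + 1, 2)); alpha[0] = 0.5                      # forward pass
    for k in range(n):
        alpha[k + 1] = alpha[k] @ gamma[k]                            # next state = current symbol
        alpha[k + 1] /= alpha[k + 1].sum()                            # normalize for stability
    beta = np.zeros((n + 1, 2)); beta[n] = 1.0                        # backward pass
    for k in range(n - 1, -1, -1):
        beta[k] = gamma[k] @ beta[k + 1]
        beta[k] /= beta[k].sum()
    post = np.einsum('ks,ksx,kx->kx', alpha[:-1], gamma, beta[1:])    # APPs up to a constant
    return post / post.sum(axis=1, keepdims=True)

rng = np.random.default_rng(2)
x = rng.choice(symbols, 200)
y = h[0] * x + h[1] * np.concatenate(([1], x[:-1])) + rng.normal(0, sigma, 200)
app = bcjr_app(y)
print("symbol accuracy:", (symbols[np.argmax(app, axis=1)] == x).mean())
```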

  11. Comparison of statistical sampling methods with ScannerBit, the GAMBIT scanning module

    NASA Astrophysics Data System (ADS)

    Martinez, Gregory D.; McKay, James; Farmer, Ben; Scott, Pat; Roebber, Elinore; Putze, Antje; Conrad, Jan

    2017-11-01

    We introduce ScannerBit, the statistics and sampling module of the public, open-source global fitting framework GAMBIT. ScannerBit provides a standardised interface to different sampling algorithms, enabling the use and comparison of multiple computational methods for inferring profile likelihoods, Bayesian posteriors, and other statistical quantities. The current version offers random, grid, raster, nested sampling, differential evolution, Markov Chain Monte Carlo (MCMC) and ensemble Monte Carlo samplers. We also announce the release of a new standalone differential evolution sampler, Diver, and describe its design, usage and interface to ScannerBit. We subject Diver and three other samplers (the nested sampler MultiNest, the MCMC GreAT, and the native ScannerBit implementation of the ensemble Monte Carlo algorithm T-Walk) to a battery of statistical tests. For this we use a realistic physical likelihood function, based on the scalar singlet model of dark matter. We examine the performance of each sampler as a function of its adjustable settings, and the dimensionality of the sampling problem. We evaluate performance on four metrics: optimality of the best fit found, completeness in exploring the best-fit region, number of likelihood evaluations, and total runtime. For Bayesian posterior estimation at high resolution, T-Walk provides the most accurate and timely mapping of the full parameter space. For profile likelihood analysis in less than about ten dimensions, we find that Diver and MultiNest score similarly in terms of best fit and speed, outperforming GreAT and T-Walk; in ten or more dimensions, Diver substantially outperforms the other three samplers on all metrics.
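
    As background, the differential evolution scheme that underlies samplers like Diver can be sketched in a few lines (the standard rand/1/bin variant). This is a generic illustration with an invented target function and settings, not the Diver code or the ScannerBit interface:

```python
import numpy as np

def neg_loglike(theta):                      # toy target: a shifted quadratic -log L
    return np.sum((theta - np.array([1.0, -2.0])) ** 2)

def differential_evolution(f, bounds, pop_size=20, f_w=0.8, cr=0.9, gens=200, seed=3):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, (pop_size, len(lo)))
    fit = np.apply_along_axis(f, 1, pop)
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + f_w * (b - c), lo, hi)        # rand/1 mutation
            cross = rng.random(len(lo)) < cr                   # binomial crossover
            cross[rng.integers(len(lo))] = True                # keep at least one gene
            trial = np.where(cross, mutant, pop[i])
            ft = f(trial)
            if ft <= fit[i]:                                   # greedy selection
                pop[i], fit[i] = trial, ft
    return pop[np.argmin(fit)], fit.min()

best, val = differential_evolution(neg_loglike, np.array([[-10.0, 10.0], [-10.0, 10.0]]))
print(best, val)
```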

  12. Multi-Contrast Multi-Atlas Parcellation of Diffusion Tensor Imaging of the Human Brain

    PubMed Central

    Tang, Xiaoying; Yoshida, Shoko; Hsu, John; Huisman, Thierry A. G. M.; Faria, Andreia V.; Oishi, Kenichi; Kutten, Kwame; Poretti, Andrea; Li, Yue; Miller, Michael I.; Mori, Susumu

    2014-01-01

    In this paper, we propose a novel method for parcellating the human brain into 193 anatomical structures based on diffusion tensor images (DTIs). This was accomplished in the setting of multi-contrast diffeomorphic likelihood fusion using multiple DTI atlases. DTI images are modeled as high-dimensional fields, with each voxel exhibiting a vector-valued feature comprising mean diffusivity (MD), fractional anisotropy (FA), and fiber angle. For each structure, the probability distribution of each element in the feature vector is modeled as a mixture of Gaussians, the parameters of which are estimated from the labeled atlases. The structure-specific feature vector is then used to parcellate the test image. For each atlas, a likelihood is iteratively computed based on the structure-specific feature vector. The likelihoods from multiple atlases are then fused. The updating and fusing of the likelihoods is achieved based on the expectation-maximization (EM) algorithm for maximum a posteriori (MAP) estimation problems. We first demonstrate the performance of the algorithm by examining the parcellation accuracy of 18 structures from 25 subjects with varying degrees of structural abnormality. Dice values ranging from 0.8 to 0.9 were obtained. In addition, a strong correlation was found between the volumes of the automated and the manual parcellations. Then, we present scan-rescan reproducibility based on another dataset of 16 DTI images: an average of 3.73%, 1.91%, and 1.79% for volume, mean FA, and mean MD, respectively. Finally, the range of anatomical variability in the normal population was quantified for each structure. PMID:24809486
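
    The fusion step can be illustrated with a toy example: each atlas contributes a class-conditional Gaussian likelihood for a voxel feature, and the log-likelihoods are summed across atlases before taking the most likely label. The structures, means, and the single FA feature below are invented stand-ins for the paper's full feature vector and EM updates:

```python
import numpy as np
from scipy.stats import norm

atlases = [                                   # per atlas: label -> (mean FA, sd)
    {"thalamus": (0.30, 0.05), "internal_capsule": (0.60, 0.06)},
    {"thalamus": (0.32, 0.04), "internal_capsule": (0.58, 0.07)},
]

def fused_map_label(fa):
    """Sum per-atlas Gaussian log-likelihoods, then pick the best label."""
    fused = {label: sum(norm(*a[label]).logpdf(fa) for a in atlases)
             for label in atlases[0]}
    return max(fused, key=fused.get)

print(fused_map_label(0.55), fused_map_label(0.31))
```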

  13. A Regional Analysis of Non-Methane Hydrocarbons And Meteorology of The Rural Southeast United States

    DTIC Science & Technology

    1996-01-01

    Zt is an ARIMA time series. This is a typical regression model, except that it allows for autocorrelation in the error term Z. In this work, an ARMA... data=folder; var residual; run; II Statistical output of 1992 regression model on 1993 ozone data ARIMA Procedure Maximum Likelihood Estimation Approx... at each of the sites, and to show the effect of synoptic meteorology on high ozone by examining NOAA daily weather maps and climatic data

  14. Three-dimensional quantitative T1 and T2 mapping of the carotid artery: Sequence design and in vivo feasibility.

    PubMed

    Coolen, Bram F; Poot, Dirk H J; Liem, Madieke I; Smits, Loek P; Gao, Shan; Kotek, Gyula; Klein, Stefan; Nederveen, Aart J

    2016-03-01

    A novel three-dimensional (3D) T1 and T2 mapping protocol for the carotid artery is presented. A 3D black-blood imaging sequence was adapted to allow carotid T1 and T2 mapping using multiple flip angles and echo time (TE) preparation times. B1 mapping was performed to correct for spatially varying deviations from the nominal flip angle. The protocol was optimized using simulations and phantom experiments. In vivo scans were performed on six healthy volunteers in two sessions, and in a patient with advanced atherosclerosis. Compensation for patient motion was achieved by 3D registration of the inter/intrasession scans. Subsequently, T1 and T2 maps were obtained by maximum likelihood estimation. Simulations and phantom experiments showed that the bias in T1 and T2 estimation was < 10% within the range of physiological values. In vivo T1 and T2 values for the carotid vessel wall were 844 ± 96 and 39 ± 5 ms, with good repeatability across scans. Patient data revealed altered T1 and T2 values in regions of atherosclerotic plaque. 3D T1 and T2 mapping of the carotid artery is feasible using variable flip angle and variable TE preparation acquisitions. We foresee application of this technique for plaque characterization and monitoring plaque progression in atherosclerotic patients. © 2015 Wiley Periodicals, Inc.
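
    Under Gaussian noise, the maximum-likelihood fit of T1 from variable-flip-angle data reduces to nonlinear least squares on the spoiled gradient-echo signal model. A minimal sketch with an assumed TR, assumed flip angles, and synthetic data (only the target T1 echoes the reported vessel-wall value); this is not the paper's protocol:

```python
import numpy as np
from scipy.optimize import curve_fit

TR = 10.0                                     # repetition time (ms), assumed
alphas = np.deg2rad([2, 5, 10, 15, 20])       # flip angles, assumed

def spgr(alpha, m0, t1):
    """Spoiled gradient-echo signal model."""
    e1 = np.exp(-TR / t1)
    return m0 * np.sin(alpha) * (1 - e1) / (1 - e1 * np.cos(alpha))

rng = np.random.default_rng(4)
y = spgr(alphas, 1000.0, 844.0) + rng.normal(0, 2.0, alphas.size)   # synthetic data
(m0_hat, t1_hat), _ = curve_fit(spgr, alphas, y, p0=[500.0, 500.0]) # ML = least squares
print(f"T1 estimate: {t1_hat:.0f} ms")
```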

  15. Fine Mapping Causal Variants with an Approximate Bayesian Method Using Marginal Test Statistics

    PubMed Central

    Chen, Wenan; Larrabee, Beth R.; Ovsyannikova, Inna G.; Kennedy, Richard B.; Haralambieva, Iana H.; Poland, Gregory A.; Schaid, Daniel J.

    2015-01-01

    Two recently developed fine-mapping methods, CAVIAR and PAINTOR, demonstrate better performance over other fine-mapping methods. They also have the advantage of using only the marginal test statistics and the correlation among SNPs. Both methods leverage the fact that the marginal test statistics asymptotically follow a multivariate normal distribution and are likelihood based. However, their relationship with Bayesian fine mapping, such as BIMBAM, is not clear. In this study, we first show that CAVIAR and BIMBAM are actually approximately equivalent to each other. This leads to a fine-mapping method using marginal test statistics in the Bayesian framework, which we call CAVIAR Bayes factor (CAVIARBF). Another advantage of the Bayesian framework is that it can answer both association and fine-mapping questions. We also used simulations to compare CAVIARBF with other methods under different numbers of causal variants. The results showed that both CAVIARBF and BIMBAM have better performance than PAINTOR and other methods. Compared to BIMBAM, CAVIARBF has the advantage of using only marginal test statistics and takes about one-quarter to one-fifth of the running time. We applied different methods on two independent cohorts of the same phenotype. Results showed that CAVIARBF, BIMBAM, and PAINTOR selected the same top 3 SNPs; however, CAVIARBF and BIMBAM had better consistency in selecting the top 10 ranked SNPs between the two cohorts. Software is available at https://bitbucket.org/Wenan/caviarbf. PMID:25948564
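
    The kind of Bayes factor computed from marginal statistics can be sketched directly: under a causal configuration the z-scores are modeled as N(0, R + RWR), against N(0, R) under the null, where R is the SNP correlation matrix and W holds prior effect variances. The toy LD matrix, z-scores, and prior variance below are invented, and this is an illustration of the general construction rather than the exact CAVIARBF formula:

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_bf(z, R, causal_idx, w=0.25):
    """log Bayes factor of a causal configuration vs the null, from summary stats."""
    W = np.zeros_like(R)
    W[causal_idx, causal_idx] = w                     # prior effect variance at causal SNPs
    null = multivariate_normal(np.zeros(len(z)), R, allow_singular=True)
    alt = multivariate_normal(np.zeros(len(z)), R + R @ W @ R, allow_singular=True)
    return alt.logpdf(z) - null.logpdf(z)

R = np.array([[1.0, 0.6, 0.3],                        # toy SNP correlation (LD) matrix
              [0.6, 1.0, 0.5],
              [0.3, 0.5, 1.0]])
z = np.array([4.5, 3.0, 1.5])                         # marginal z-scores
print(log_bf(z, R, [0]), log_bf(z, R, [2]))           # SNP 0 should be favoured
```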

  16. Vegetation mapping from high-resolution satellite images in the heterogeneous arid environments of Socotra Island (Yemen)

    NASA Astrophysics Data System (ADS)

    Malatesta, Luca; Attorre, Fabio; Altobelli, Alfredo; Adeeb, Ahmed; De Sanctis, Michele; Taleb, Nadim M.; Scholte, Paul T.; Vitale, Marcello

    2013-01-01

    Socotra Island (Yemen), a global biodiversity hotspot, is characterized by high geomorphological and biological diversity. In this study, we present a high-resolution vegetation map of the island based on combining vegetation analysis and classification with remote sensing. Two different image classification approaches were tested to assess the most accurate one for mapping the vegetation mosaic of Socotra. Spectral signatures of the vegetation classes were obtained through a Gaussian mixture distribution model, and a sequential maximum a posteriori (SMAP) classification was applied to account for the heterogeneity and the complex spatial pattern of the arid vegetation. This approach was compared to the traditional maximum likelihood (ML) classification. Satellite data were represented by a RapidEye image with 5 m pixel resolution and five spectral bands. Classified vegetation relevés were used to obtain the training and evaluation sets for the main plant communities. Postclassification sorting was performed to adjust the classification through various rule-based operations. Twenty-eight classes were mapped, and SMAP, with an accuracy of 87%, proved to be more effective than ML (accuracy: 66%). The resulting map will represent an important instrument for the elaboration of conservation strategies and the sustainable use of natural resources on the island.

  17. Mapping Quantitative Field Resistance Against Apple Scab in a 'Fiesta' x 'Discovery' Progeny.

    PubMed

    Liebhard, R; Koller, B; Patocchi, A; Kellerhals, M; Pfammatter, W; Jermini, M; Gessler, C

    2003-04-01

    Breeding of resistant apple cultivars (Malus x domestica) as a disease management strategy relies on the knowledge and understanding of the underlying genetics. The availability of molecular markers and genetic linkage maps enables the detection and the analysis of major resistance genes as well as of quantitative trait loci (QTL) contributing to the resistance of a genotype. Such a genetic linkage map was constructed, based on a segregating population of the cross between apple cvs. Fiesta (syn. Red Pippin) and Discovery. The progeny was observed for 3 years at three different sites in Switzerland, and field resistance against apple scab (Venturia inaequalis) was assessed. Only a weak correlation was detected between leaf scab and fruit scab. A QTL analysis was performed, based on the genetic linkage map consisting of 804 molecular markers and covering all 17 chromosomes of apple. With the maximum likelihood-based interval mapping method, eight genomic regions were identified, six conferring resistance against leaf scab and two conferring fruit scab resistance. Although cv. Discovery showed a much stronger resistance against scab in the field, most QTL identified were attributed to the more susceptible parent 'Fiesta'. This indicated a high degree of homozygosity at the scab resistance loci in 'Discovery', preventing their detection in the progeny due to the lack of segregation.

  18. 32 CFR 154.60 - Evaluating continued security eligibility.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....60 Section 154.60 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE SECURITY... assess the future trustworthiness of an individual in terms of the likelihood of the individual... that any human being will remain trustworthy. Accordingly the issuance of a personnel security...

  19. 32 CFR 154.60 - Evaluating continued security eligibility.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ....60 Section 154.60 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE SECURITY... assess the future trustworthiness of an individual in terms of the likelihood of the individual... that any human being will remain trustworthy. Accordingly the issuance of a personnel security...

  20. 32 CFR 154.60 - Evaluating continued security eligibility.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ....60 Section 154.60 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE SECURITY... assess the future trustworthiness of an individual in terms of the likelihood of the individual... that any human being will remain trustworthy. Accordingly the issuance of a personnel security...

  1. LIKELIHOOD MODELS FOR CLUSTERED BINARY AND CONTINUOUS OUTCOMES: APPLICATION TO DEVELOPMENTAL TOXICOLOGY. (R824757)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  2. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    NASA Astrophysics Data System (ADS)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods used to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for the spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic, reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and the range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches to portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
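
    The Monte-Carlo construction of a probabilistic inundation map can be sketched with a stand-in stage-discharge relation in place of a real hydraulics code; all distributions and parameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
ground = np.linspace(0.0, 4.0, 50)                 # cell elevations across a floodplain (m)

def water_surface(q, n_rough):
    """Toy rating curve: stage rises with discharge and roughness."""
    return 1.2 * (q / 100.0) ** 0.4 * (n_rough / 0.03) ** 0.3

n_sims = 5000
q = rng.lognormal(np.log(150.0), 0.3, n_sims)      # uncertain peak discharge
n_rough = rng.uniform(0.025, 0.045, n_sims)        # uncertain Manning roughness
stage = water_surface(q, n_rough)

prob = (stage[:, None] > ground[None, :]).mean(axis=0)   # inundation probability per cell
print(prob[::10].round(2))
```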

  3. The Inferred Distribution of Liquid Water in Europa's Ice Shell: Implications for the Europa Lander Mission

    NASA Astrophysics Data System (ADS)

    Noviello, J. L.; Torrano, Z. A.; Rhoden, A.; Manga, M.

    2017-12-01

    A key objective of the Europa lander mission is to identify liquid water within 30 km of the lander (Europa Lander SDT report, 2017), to provide essential context with which to evaluate samples and enable assessment of Europa's overall habitability. To inform lander mission development, we utilize a model of surface feature formation that invokes liquid water within Europa's ice shell to map out the implied 3D distribution of liquid water and assess the likelihood that a lander would be within 30 km of liquid water given regional variability. Europa's surface displays a variety of microfeatures, also called lenticulae, including pits, domes, spots, and microchaos. A recent model by Manga and Michaut (2017) attributes these features to various stages in the thermal-mechanical evolution of liquid water intrusions (i.e. sills) within the ice shell, from sill emplacement to surface breaching (in the case of microchaos) to freezing of the sill. Pits are of particular interest because they appear only when liquid water is still present. Another key feature of the model is that the size of a microfeature at the surface is controlled by the depth of the sill. Hence, we can apply this model to regions of Europa that contain microfeatures to infer the size, depth, and spatial distribution of liquid water within the ice shell. We are creating a database of microfeatures that includes digitized, collated data from previous mapping efforts along with our own mapping study. We focus on images with 220 m/pixel resolution, which includes the regional mapping data sets. Analysis of a preliminary study area suggests that sills are typically located at depths of 2 km or less from the surface. We will present analysis of the full database of microfeatures and the corresponding 3D distribution of sills implied by the model. Our preliminary analysis also shows that pits are clustered in some regions, consistent with previous results, although individual pits are also observed. We apply a statistical method, using the distribution of nearest neighbor distances, to quantify the degree of clustering and to determine the typical spatial separation among and between microfeature types. We will create density maps of microfeatures in several regions of Europa, and determine the likelihood that a lander will be within 30 km of a sill, assuming an arbitrary landing site.

  4. Geology of a Portion of the Martian Highlands: MTMs -20002, -20007, -25002 and -25007

    NASA Technical Reports Server (NTRS)

    Fortezzo, C. M.; Williams, K. K.

    2009-01-01

    As part of a continuing study to understand the relationship between valleys and highland resurfacing through geologic mapping, we are continuing to map seven MTM quads in portions of the Margaritifer, Arabia, and Noachis Terrae. Results from this mapping will also help constrain the role and extent of past water in the region. The MTMs are grouped in two different areas: a 4-quadrangle area (-20002, -20007, -25002, -25007) and an L-shaped area (-15017, -20017, -20022) within the region [1-5]. This abstract focuses on the geologic units and history from mapping in the 4-quadrangle area, but includes a brief update on the L-shaped map area.

  5. Pulse Oximeter Derived Blood Pressure Measurement in Patients With a Continuous Flow Left Ventricular Assist Device.

    PubMed

    Hellman, Yaron; Malik, Adnan S; Lane, Kathleen A; Shen, Changyu; Wang, I-Wen; Wozniak, Thomas C; Hashmi, Zubair A; Munson, Sarah D; Pickrell, Jeanette; Caccamo, Marco A; Gradus-Pizlo, Irmina; Hadi, Azam

    2017-05-01

    Currently, blood pressure (BP) measurement is obtained noninvasively in patients with continuous flow left ventricular assist device (LVAD) by placing a Doppler probe over the brachial or radial artery with inflation and deflation of a manual BP cuff. We hypothesized that replacing the Doppler probe with a finger-based pulse oximeter can yield BP measurements similar to the Doppler derived mean arterial pressure (MAP). We conducted a prospective study consisting of patients with contemporary continuous flow LVADs. In a small pilot phase I inpatient study, we compared direct arterial line measurements with an automated blood pressure (ABP) cuff, Doppler and pulse oximeter derived MAP. Our main phase II study included LVAD outpatients with a comparison between Doppler, ABP, and pulse oximeter derived MAP. A total of five phase I and 36 phase II patients were recruited during February-June 2014. In phase I, the average MAP measured by pulse oximeter was closer to arterial line MAP rather than Doppler (P = 0.06) or ABP (P < 0.01). In phase II, pulse oximeter MAP (96.6 mm Hg) was significantly closer to Doppler MAP (96.5 mm Hg) when compared to ABP (82.1 mm Hg) (P = 0.0001). Pulse oximeter derived blood pressure measurement may be as reliable as Doppler in patients with continuous flow LVADs. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  6. Mixture model based joint-MAP reconstruction of attenuation and activity maps in TOF-PET

    NASA Astrophysics Data System (ADS)

    Hemmati, H.; Kamali-Asl, A.; Ghafarian, P.; Ay, M. R.

    2018-06-01

    A challenge in obtaining quantitative positron emission tomography (PET) images is providing an accurate, patient-specific photon attenuation correction. In PET/MR scanners, the nature of MR signals and hardware limitations make extraction of the attenuation map a real challenge. Except for a constant scaling factor, the activity and attenuation maps can be determined from emission data on a TOF-PET system by the maximum likelihood reconstruction of attenuation and activity (MLAA) approach. The aim of the present study is to constrain the joint estimation of activity and attenuation using a mixture-model prior based on the attenuation-map histogram. This novel prior enforces non-negativity, and its hyperparameters can be estimated using a mixture decomposition step from the current estimate of the attenuation map. The proposed method can also help resolve the scaling problem and is capable of assigning predefined regional attenuation coefficients, with some degree of confidence, to the attenuation map, similar to segmentation-based attenuation correction approaches. The performance of the algorithm is studied with numerical and Monte Carlo simulations and a phantom experiment, and is compared with the MLAA algorithm with and without the smoothing prior. The results demonstrate that the proposed algorithm is capable of producing cross-talk-free activity and attenuation images from emission data. The proposed approach has the potential to be a practical and competitive method for joint reconstruction of activity and attenuation maps from emission data on PET/MR and can be integrated with other methods.
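
    The hyperparameter-estimation step can be illustrated by fitting a Gaussian mixture to an attenuation-map histogram, so that known tissue classes anchor the prior. The class means below are rough 511 keV attenuation coefficients and the voxel counts are synthetic; this is a sketch of the idea, not the paper's algorithm:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(8)
mu_map = np.concatenate([rng.normal(0.000, 0.002, 4000),    # air
                         rng.normal(0.096, 0.004, 5000),    # soft tissue (1/cm)
                         rng.normal(0.130, 0.006, 1000)])   # bone
mu_map = mu_map.clip(min=0.0).reshape(-1, 1)                # enforce non-negativity

gmm = GaussianMixture(n_components=3, random_state=0).fit(mu_map)
print(np.sort(gmm.means_.ravel()))     # recovered class centres -> prior hyperparameters
```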

  7. Geographic information systems and logistic regression for high-resolution malaria risk mapping in a rural settlement of the southern Brazilian Amazon.

    PubMed

    de Oliveira, Elaine Cristina; dos Santos, Emerson Soares; Zeilhofer, Peter; Souza-Santos, Reinaldo; Atanaka-Santos, Marina

    2013-11-15

    In Brazil, 99% of malaria cases are concentrated in the Amazon region, where transmission levels are high. The objectives of the study were to use geographic information systems (GIS) analysis and logistic regression to identify and analyse the relative likelihood of malaria infection and its socio-environmental determinants in the Vale do Amanhecer rural settlement, Brazil. A GIS database of georeferenced malaria cases recorded in 2005 and multiple explanatory data layers was built, based on a multispectral Landsat 5 TM image, a digital map of the settlement blocks and an SRTM digital elevation model. Satellite imagery was used to map the spatial patterns of land use and cover (LUC) and to derive spectral indices of vegetation density (NDVI) and soil/vegetation humidity (VSHI). A Euclidean distance operator was applied to measure proximity of domiciles to potential mosquito breeding habitats and gold-mining areas. The malaria risk model was generated by multiple logistic regression, in which environmental factors were the independent variables and the number of cases, binarized by a threshold value, was the dependent variable. Out of a total of 336 cases of malaria, 133 positive slides were from inhabitants of Road 08, corresponding to 37.60% of the notifications. The southern region of the settlement presented 276 cases and a greater number of domiciles in which more than ten cases/home were notified. Of these, 102 (30.36%) cases were caused by Plasmodium falciparum and 174 (51.79%) by Plasmodium vivax. Malaria risk is highest in the south of the settlement, associated with proximity to gold-mining sites, intense land use, high levels of soil/vegetation humidity and low vegetation density. Mid-resolution remote sensing data and GIS-derived distance measures can be successfully combined with digital maps of the housing locations of (non-)infected inhabitants to predict the relative likelihood of infection through logistic regression analysis. The findings on the relation between malaria cases and environmental factors should be applied in future land-use planning in rural settlements in the southern Amazon to minimize risks of disease transmission.
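
    The statistical core of this workflow is ordinary logistic regression on per-household covariates. Here is a minimal sketch with synthetic covariates loosely mirroring the study's predictors (distance to mining, NDVI, VSHI); the coefficients and data are invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 400
dist_km = rng.uniform(0.0, 5.0, n)          # distance to nearest gold-mining site (km)
ndvi = rng.uniform(0.1, 0.8, n)             # vegetation density index
vshi = rng.uniform(0.0, 1.0, n)             # soil/vegetation humidity index
logit = 1.0 - 0.8 * dist_km - 2.0 * ndvi + 2.5 * vshi       # "true" synthetic model
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))            # simulated infection status

X = np.column_stack([dist_km, ndvi, vshi])
model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]         # relative likelihood, one value per domicile
print(model.coef_.round(2), risk[:5].round(2))
```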

  8. Efficient crop type mapping based on remote sensing in the Central Valley, California

    NASA Astrophysics Data System (ADS)

    Zhong, Liheng

    Most agricultural systems in California's Central Valley are purposely flexible and intentionally designed to meet the demands of dynamic markets. Agricultural land use is also impacted by climate change and urban development. As a result, crops change annually and semiannually, which makes estimating agricultural water use difficult, especially given the existing method by which agricultural land use is identified and mapped. A minor portion of agricultural land is surveyed annually for land-use type, and every 5 to 8 years the entire valley is completely evaluated. So far no effort has been made to effectively and efficiently identify specific crop types on an annual basis in this area. The potential of satellite imagery to map agricultural land cover and estimate water usage in the Central Valley is explored. Efforts are made to minimize the cost and reduce the time of production during the mapping process. The land use change analysis shows that a remote sensing-based mapping method is the only practical means of mapping the frequent changes of major crop types. The traditional maximum likelihood classification approach is first utilized to map crop types to test the classification capacity of existing algorithms. High accuracy is achieved with sufficient ground truth data for training, and crop maps of moderate quality can be produced in time to facilitate a near-real-time water use estimate. However, the large set of ground truth data required by this method results in high costs in data collection. It is difficult to reduce the cost because a trained classification algorithm is not transferable between different years or different regions. A phenology-based classification (PBC) approach is developed, which extracts phenological metrics from annual vegetation index profiles and identifies crop types based on these metrics using decision trees. According to the comparison with traditional maximum likelihood classification, this phenology-based approach shows great advantages when the size of the training set is limited by ground truth availability. Once developed, the classifier can be applied to different years and a vast area with only a few adjustments according to local agricultural and annual weather conditions. 250 m MODIS imagery is utilized as the main input to the PBC algorithm and displays promising capacity in crop identification in several counties in the Central Valley. A time series of Landsat TM/ETM+ images at a 30 m resolution is necessary in the crop mapping of counties with smaller land parcels, although the processing time is longer. Spectral characteristics are also employed to identify crops in PBC. Spectral signatures are associated with phenological stages instead of imaging dates, which greatly increases the stability of classifier performance and mitigates over-fitting. Moderate accuracies are achieved by PBC, with confusions mostly within the same crop categories. Based on a quantitative analysis, misclassification in PBC has only trivial impacts on the accuracy of the agricultural water use estimate. The cost of the entire PBC procedure is kept very low, enabling its use in routine annual crop mapping in the Central Valley.
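
    The PBC idea can be sketched by reducing each annual NDVI profile to a few phenological metrics and classifying those with a decision tree. The metrics, crop labels, and synthetic profiles below are illustrative, not the dissertation's actual feature set:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

doy = np.arange(1, 366, 16)                 # 16-day composite dates, MODIS-like
rng = np.random.default_rng(7)

def metrics(ndvi):
    """Reduce an NDVI profile to (peak day, amplitude, season length)."""
    amplitude = ndvi.max() - ndvi.min()
    above = ndvi > ndvi.min() + 0.5 * amplitude
    season = doy[above][-1] - doy[above][0] if above.any() else 0
    return [doy[np.argmax(ndvi)], amplitude, season]

def profile(peak_doy, amp):                 # synthetic single-season NDVI curve
    return 0.2 + amp * np.exp(-((doy - peak_doy) / 40.0) ** 2) \
           + rng.normal(0, 0.02, doy.size)

X = [metrics(profile(180, 0.6)) for _ in range(50)]      # "summer" crop examples
X += [metrics(profile(110, 0.5)) for _ in range(50)]     # "winter" crop examples
y = ["summer"] * 50 + ["winter"] * 50
clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(clf.predict([metrics(profile(175, 0.55))]))
```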

  9. CGBayesNets: Conditional Gaussian Bayesian Network Learning and Inference with Mixed Discrete and Continuous Data

    PubMed Central

    Weiss, Scott T.

    2014-01-01

    Bayesian Networks (BN) have been a popular predictive modeling formalism in bioinformatics, but their application in modern genomics has been slowed by an inability to cleanly handle domains with mixed discrete and continuous variables. Existing free BN software packages either discretize continuous variables, which can lead to information loss, or do not include inference routines, which makes prediction with the BN impossible. We present CGBayesNets, a BN package focused around prediction of a clinical phenotype from mixed discrete and continuous variables, which fills these gaps. CGBayesNets implements Bayesian likelihood and inference algorithms for the conditional Gaussian Bayesian network (CGBNs) formalism, one appropriate for predicting an outcome of interest from, e.g., multimodal genomic data. We provide four different network learning algorithms, each making a different tradeoff between computational cost and network likelihood. CGBayesNets provides a full suite of functions for model exploration and verification, including cross validation, bootstrapping, and AUC manipulation. We highlight several results obtained previously with CGBayesNets, including predictive models of wood properties from tree genomics, leukemia subtype classification from mixed genomic data, and robust prediction of intensive care unit mortality outcomes from metabolomic profiles. We also provide detailed example analysis on public metabolomic and gene expression datasets. CGBayesNets is implemented in MATLAB and available as MATLAB source code, under an Open Source license and anonymous download at http://www.cgbayesnets.com. PMID:24922310

  10. CGBayesNets: conditional Gaussian Bayesian network learning and inference with mixed discrete and continuous data.

    PubMed

    McGeachie, Michael J; Chang, Hsun-Hsien; Weiss, Scott T

    2014-06-01

    Bayesian Networks (BN) have been a popular predictive modeling formalism in bioinformatics, but their application in modern genomics has been slowed by an inability to cleanly handle domains with mixed discrete and continuous variables. Existing free BN software packages either discretize continuous variables, which can lead to information loss, or do not include inference routines, which makes prediction with the BN impossible. We present CGBayesNets, a BN package focused around prediction of a clinical phenotype from mixed discrete and continuous variables, which fills these gaps. CGBayesNets implements Bayesian likelihood and inference algorithms for the conditional Gaussian Bayesian network (CGBNs) formalism, one appropriate for predicting an outcome of interest from, e.g., multimodal genomic data. We provide four different network learning algorithms, each making a different tradeoff between computational cost and network likelihood. CGBayesNets provides a full suite of functions for model exploration and verification, including cross validation, bootstrapping, and AUC manipulation. We highlight several results obtained previously with CGBayesNets, including predictive models of wood properties from tree genomics, leukemia subtype classification from mixed genomic data, and robust prediction of intensive care unit mortality outcomes from metabolomic profiles. We also provide detailed example analysis on public metabolomic and gene expression datasets. CGBayesNets is implemented in MATLAB and available as MATLAB source code, under an Open Source license and anonymous download at http://www.cgbayesnets.com.
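
    The conditional Gaussian building block behind CGBNs can be illustrated with the smallest possible network: a discrete phenotype node whose continuous child is modeled by class-conditional Gaussians, with prediction by Bayes' rule. This is a Python illustration of the formalism, not the CGBayesNets API (which is MATLAB), and all numbers are invented:

```python
from scipy.stats import norm

prior = {"case": 0.4, "control": 0.6}                 # discrete phenotype node
cond = {"case": (2.0, 1.0), "control": (0.0, 1.0)}    # metabolite | phenotype: (mean, sd)

def posterior(x):
    """P(phenotype | continuous observation x) by Bayes' rule."""
    joint = {k: prior[k] * norm(mu, sd).pdf(x) for k, (mu, sd) in cond.items()}
    z = sum(joint.values())
    return {k: v / z for k, v in joint.items()}

print(posterior(1.5))
```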

  11. Determining and representing width of soil boundaries using electrical conductivity and MultiGrid

    NASA Astrophysics Data System (ADS)

    Greve, Mogens Humlekrog; Greve, Mette Balslev

    2004-07-01

    In classical soil mapping, map unit boundaries are considered crisp, even though all experienced survey personnel are aware that soil boundaries are really transition zones of varying width. However, classifying transition-zone width on site is difficult in a practical survey. The objective of this study is to present a method for determining soil boundary width and a way of representing continuous soil boundaries in GIS. A survey was performed using the non-contact conductivity meter EM38 from Geonics Inc., which measures the bulk soil electromagnetic conductivity (SEC). The EM38 provides an opportunity to classify the width of transition zones in an unbiased manner. By calculating the spatial rate of change in the interpolated EM38 map across the crisp map unit delineations from a classical soil mapping, a measure of transition-zone width can be extracted. The map unit delineations are represented as transition zones in a GIS through a concept of multiple grid layers, a MultiGrid. Each layer corresponds to a soil type, and the values in a layer represent the percentage of that soil type in each cell. As a test, the subsoil texture was mapped at the Vindum field in Denmark using both the classical mapping method with crisp representation of the boundaries and the new MultiGrid map with continuous boundaries. These maps were then compared to an independent reference map of subsoil texture. In the case of the Vindum field, using continuous boundaries instead of crisp ones improved the prediction of subsoil texture by 15%.
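
    The width-extraction step can be sketched on a one-dimensional transect: take the spatial rate of change of the interpolated conductivity signal and call the transition zone the stretch where that rate stays high. The transect shape and threshold below are invented:

```python
import numpy as np

x = np.linspace(0.0, 100.0, 201)                   # metres along a transect
sec = 20 + 15 / (1 + np.exp(-(x - 50) / 5.0))      # smooth step between two soil units (mS/m)
rate = np.abs(np.gradient(sec, x))                 # spatial rate of change

in_zone = rate > 0.3 * rate.max()                  # transition = where change stays strong
width = x[in_zone][-1] - x[in_zone][0]
print(f"estimated transition-zone width: {width:.1f} m")
```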

  12. Landsat-facilitated vegetation classification of the Kenai National Wildlife Refuge and adjacent areas, Alaska

    USGS Publications Warehouse

    Talbot, S. S.; Shasby, M.B.; Bailey, T.N.

    1985-01-01

    A Landsat-based vegetation map was prepared for the Kenai National Wildlife Refuge and adjacent lands, 2 million and 2.5 million acres respectively. The refuge lies within the middle boreal subzone of south-central Alaska. Seven major classes and sixteen subclasses were recognized: forest (closed needleleaf, needleleaf woodland, mixed); deciduous scrub (lowland and montane, subalpine); dwarf scrub (dwarf shrub tundra, lichen tundra, dwarf shrub and lichen tundra, dwarf shrub peatland, string bog/wetlands); herbaceous (graminoid meadows and marshes); scarcely vegetated areas; water (clear, moderately turbid, highly turbid); and glaciers. The methodology employed a cluster-block technique. Sample areas were described based on a combination of helicopter-ground survey, aerial photo interpretation, and digital Landsat data. Major steps in the Landsat analysis involved: preprocessing (geometric correction), spectral class labeling of sample areas, derivation of statistical parameters for spectral classes, preliminary classification of the entire study area using a maximum-likelihood algorithm, and final classification through ancillary information such as digital elevation data. The vegetation map (scale 1:250,000) was a pioneering effort, since there were no intermediate-scale maps of the area. Representative of distinctive regional patterns, the map was suitable for use in comprehensive conservation planning and wildlife management.

  13. ShakeMap-based prediction of earthquake-induced mass movements in Switzerland calibrated on historical observations

    USGS Publications Warehouse

    Cauzzi, Carlo; Fah, Donat; Wald, David J.; Clinton, John; Losey, Stephane; Wiemer, Stefan

    2018-01-01

    In Switzerland, nearly all historical Mw ~ 6 earthquakes have induced damaging landslides, rockslides and snow avalanches that, in some cases, also resulted in damage to infrastructure and loss of lives. We describe the customisation to Swiss conditions of a globally calibrated statistical approach originally developed to rapidly assess earthquake-induced landslide likelihoods worldwide. The probability of occurrence of such earthquake-induced effects is modelled through a set of geospatial susceptibility proxies and peak ground acceleration. The predictive model is tuned to capture the observations from past events and optimised for near-real-time estimates based on USGS-style ShakeMaps routinely produced by the Swiss Seismological Service. Our emphasis is on the use of high-resolution geospatial datasets along with additional local information on ground failure susceptibility. Even if calibrated on historical events with moderate magnitudes, the methodology presented in this paper also yields sensible results for recent low-magnitude events. The model is integrated in the Swiss ShakeMap framework. This study has a high practical relevance to many Swiss ShakeMap stakeholders, especially those managing lifeline systems, and to other global users interested in conducting a similar customisation for their region of interest.

  14. Input-output mapping reconstruction of spike trains at dorsal horn evoked by manual acupuncture

    NASA Astrophysics Data System (ADS)

    Wei, Xile; Shi, Dingtian; Yu, Haitao; Deng, Bin; Lu, Meili; Han, Chunxiao; Wang, Jiang

    2016-12-01

    In this study, a generalized linear model (GLM) is used to reconstruct the mapping from acupuncture stimulation to spike trains, driven by action potential data. The electrical signals are recorded in the spinal dorsal horn after manual acupuncture (MA) manipulations at different frequencies applied at the “Zusanli” point of experimental rats. The maximum-likelihood method is adopted to estimate the parameters of the GLM and the quantified value of the assumed model input. By validating the accuracy of firings generated from the established GLM, it is found that the input-output mapping of spike trains evoked by acupuncture can be successfully reconstructed for different frequencies. Furthermore, comparing the performance of several GLMs based on distinct inputs suggests that an input in the form of a half-sine with noise can describe well the generator potential induced by the mechanical action of acupuncture. In particular, the comparison of reproducing the experimental spikes for five selected inputs is in accordance with the phenomenon found in Hodgkin-Huxley (H-H) model simulation, which indicates that the mapping from the half-sine-with-noise input to the experimental spikes captures the real encoding scheme to some extent. These studies provide us a new insight into the coding processes and information transfer of acupuncture.
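
    Fitting a GLM from a stimulus to spike counts by maximum likelihood is standard Poisson regression. The following is a minimal sketch with a synthetic half-sine-plus-noise input in the spirit of the paper's assumed generator potential; all numbers are invented:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
t = np.linspace(0.0, 1.0, 500)
stim = np.clip(np.sin(2 * np.pi * 2 * t), 0, None) + rng.normal(0, 0.05, t.size)
rate = np.exp(-1.0 + 2.0 * stim)                    # true log-linear firing rate
spikes = rng.poisson(rate)                          # spike counts per bin

X = sm.add_constant(stim)
fit = sm.GLM(spikes, X, family=sm.families.Poisson()).fit()   # maximum likelihood
print(fit.params)                                   # should recover roughly [-1, 2]
```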

  15. Family Caregiver Identity: A Literature Review

    ERIC Educational Resources Information Center

    Eifert, Elise K.; Adams, Rebecca; Dudley, William; Perko, Michael

    2015-01-01

    Background: Despite the multitude of available resources, family caregivers of those with chronic disease continually underutilize support services to cope with the demands of caregiving. Several studies have linked self-identification as a caregiver to the increased likelihood of support service use. Purpose: The present study reviewed the…

  16. Univariate and bivariate likelihood-based meta-analysis methods performed comparably when marginal sensitivity and specificity were the targets of inference.

    PubMed

    Dahabreh, Issa J; Trikalinos, Thomas A; Lau, Joseph; Schmid, Christopher H

    2017-03-01

    To compare statistical methods for meta-analysis of sensitivity and specificity of medical tests (e.g., diagnostic or screening tests). We constructed a database of PubMed-indexed meta-analyses of test performance from which 2 × 2 tables for each included study could be extracted. We reanalyzed the data using univariate and bivariate random effects models fit with inverse variance and maximum likelihood methods. Analyses were performed using both normal and binomial likelihoods to describe within-study variability. The bivariate model using the binomial likelihood was also fit using a fully Bayesian approach. We use two worked examples (thoracic computerized tomography to detect aortic injury and rapid prescreening of Papanicolaou smears to detect cytological abnormalities) to highlight that different meta-analysis approaches can produce different results. We also present results from reanalysis of 308 meta-analyses of sensitivity and specificity. Models using the normal approximation produced sensitivity and specificity estimates closer to 50% and smaller standard errors compared to models using the binomial likelihood; absolute differences of 5% or greater were observed in 12% and 5% of meta-analyses for sensitivity and specificity, respectively. Results from univariate and bivariate random effects models were similar, regardless of estimation method. Maximum likelihood and Bayesian methods produced almost identical summary estimates under the bivariate model; however, Bayesian analyses indicated greater uncertainty around those estimates. Bivariate models produced imprecise estimates of the between-study correlation of sensitivity and specificity. Differences between methods were larger with increasing proportion of studies that were small or required a continuity correction. The binomial likelihood should be used to model within-study variability. Univariate and bivariate models give similar estimates of the marginal distributions for sensitivity and specificity. Bayesian methods fully quantify uncertainty and their ability to incorporate external evidence may be useful for imprecisely estimated parameters. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Continuous properties of the data-to-solution map for a generalized μ-Camassa-Holm integrable equation

    NASA Astrophysics Data System (ADS)

    Yu, Shengqi

    2018-05-01

    This work studies a generalized μ-type integrable equation with both quadratic and cubic nonlinearities; the μ-Camassa-Holm and modified μ-Camassa-Holm equations are members of this family of equations. It has been shown that the Cauchy problem for this generalized μ-Camassa-Holm integrable equation is locally well-posed for initial data u_0 ∈ H^s, s > 5/2. In this work, we further investigate the continuity properties of this equation. It is proved in this work that the data-to-solution map of the proposed equation is not uniformly continuous. It is also found that the solution map is Hölder continuous in the H^r-topology when 0 ≤ r < s, with Hölder exponent α depending on both s and r.

  18. Topographic Independent Component Analysis reveals random scrambling of orientation in visual space

    PubMed Central

    Martinez-Garcia, Marina; Martinez, Luis M.

    2017-01-01

    Neurons at primary visual cortex (V1) in humans and other species are edge filters organized in orientation maps. In these maps, neurons with similar orientation preference are clustered together in iso-orientation domains. These maps have two fundamental properties: (1) retinotopy, i.e. correspondence between displacements at the image space and displacements at the cortical surface, and (2) a trade-off between good coverage of the visual field with all orientations and continuity of iso-orientation domains in the cortical space. There is an active debate on the origin of these locally continuous maps. While most of the existing descriptions take purely geometric/mechanistic approaches which disregard the network function, a clear exception to this trend in the literature is the original approach of Hyvärinen and Hoyer based on infomax and Topographic Independent Component Analysis (TICA). Although TICA successfully addresses a number of other properties of V1 simple and complex cells, in this work we question the validity of the orientation maps obtained from TICA. We argue that the maps predicted by TICA can be analyzed in the retinal space, and when doing so, it is apparent that they lack the required continuity and retinotopy. Here we show that in the orientation maps reported in the TICA literature it is easy to find examples of violation of the continuity between similarly tuned mechanisms in the retinal space, which suggest a random scrambling incompatible with the maps in primates. The new experiments in the retinal space presented here confirm this guess: TICA basis vectors actually follow a random salt-and-pepper organization back in the image space. Therefore, the interesting clusters found in the TICA topology cannot be interpreted as the actual cortical orientation maps found in cats, primates or humans. In conclusion, Topographic ICA does not reproduce cortical orientation maps. PMID:28640816

  19. Topographic Independent Component Analysis reveals random scrambling of orientation in visual space.

    PubMed

    Martinez-Garcia, Marina; Martinez, Luis M; Malo, Jesús

    2017-01-01

    Neurons at primary visual cortex (V1) in humans and other species are edge filters organized in orientation maps. In these maps, neurons with similar orientation preference are clustered together in iso-orientation domains. These maps have two fundamental properties: (1) retinotopy, i.e. correspondence between displacements at the image space and displacements at the cortical surface, and (2) a trade-off between good coverage of the visual field with all orientations and continuity of iso-orientation domains in the cortical space. There is an active debate on the origin of these locally continuous maps. While most of the existing descriptions take purely geometric/mechanistic approaches which disregard the network function, a clear exception to this trend in the literature is the original approach of Hyvärinen and Hoyer based on infomax and Topographic Independent Component Analysis (TICA). Although TICA successfully addresses a number of other properties of V1 simple and complex cells, in this work we question the validity of the orientation maps obtained from TICA. We argue that the maps predicted by TICA can be analyzed in the retinal space, and when doing so, it is apparent that they lack the required continuity and retinotopy. Here we show that in the orientation maps reported in the TICA literature it is easy to find examples of violation of the continuity between similarly tuned mechanisms in the retinal space, which suggest a random scrambling incompatible with the maps in primates. The new experiments in the retinal space presented here confirm this guess: TICA basis vectors actually follow a random salt-and-pepper organization back in the image space. Therefore, the interesting clusters found in the TICA topology cannot be interpreted as the actual cortical orientation maps found in cats, primates or humans. In conclusion, Topographic ICA does not reproduce cortical orientation maps.

  20. Multiplicative Forests for Continuous-Time Processes

    PubMed Central

    Weiss, Jeremy C.; Natarajan, Sriraam; Page, David

    2013-01-01

    Learning temporal dependencies between variables over continuous time is an important and challenging task. Continuous-time Bayesian networks effectively model such processes but are limited by the number of conditional intensity matrices, which grows exponentially in the number of parents per variable. We develop a partition-based representation using regression trees and forests whose parameter spaces grow linearly in the number of node splits. Using a multiplicative assumption we show how to update the forest likelihood in closed form, producing efficient model updates. Our results show multiplicative forests can be learned from few temporal trajectories with large gains in performance and scalability. PMID:25284967

  1. Multiplicative Forests for Continuous-Time Processes.

    PubMed

    Weiss, Jeremy C; Natarajan, Sriraam; Page, David

    2012-01-01

    Learning temporal dependencies between variables over continuous time is an important and challenging task. Continuous-time Bayesian networks effectively model such processes but are limited by the number of conditional intensity matrices, which grows exponentially in the number of parents per variable. We develop a partition-based representation using regression trees and forests whose parameter spaces grow linearly in the number of node splits. Using a multiplicative assumption we show how to update the forest likelihood in closed form, producing efficient model updates. Our results show multiplicative forests can be learned from few temporal trajectories with large gains in performance and scalability.

  2. Interval mapping of high growth (hg), a major locus that increases weight gain in mice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horvat, S.; Medrano, J.F.

    1995-04-01

    The high growth locus (hg) causes a major increase in weight gain and body size in mice. As a first step to map-based cloning of hg, we developed a genetic map of the hg-containing region using interval mapping of 403 F2 from a C57BL/6J-hghg x CAST/EiJ cross. The maximum likelihood position of hg was at the chromosome 10 marker D10Mit41 (LOD = 24.8) in the F2 females and 1.5 cM distal to D10Mit41 (LOD = 9.56) in the F2 males, with corresponding LOD 2 support intervals of 3.7 and 5.4 cM, respectively. The peak LOD scores were significantly higher than the estimated empirical threshold LOD values. The localization of hg by interval mapping was supported by a test cross of F2 mice recombinant between the LOD 2 support interval and the flanking marker. The interval mapping and test-cross indicate that hg is not allelic with candidate genes Igf1 or decorin (Dcn), a gene that was mapped close to hg in this study. The hg inheritance was recessive in females, although we could not reject recessive or additive inheritance in males. Possible causes for sex differences in peak LOD scores and for the distortion of transmission ratios observed in F2 males are discussed. The genetic map of the hg region will facilitate further fine mapping and cloning of hg, and allow searches for a homologous quantitative trait locus affecting growth in humans and domestic animals. 48 refs., 3 figs., 3 tabs.
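
    The LOD statistic that interval mapping maximizes can be illustrated at a single marker: compare the residual fit of a genotype-dependent mean against a no-effect null for a Gaussian trait. The F2 genotype frequencies and effect size below are invented:

```python
import numpy as np

rng = np.random.default_rng(10)
n = 400
geno = rng.choice([0, 1, 2], n, p=[0.25, 0.5, 0.25])        # F2 genotypes at one marker
weight = 30 + 4.0 * (geno == 2) + rng.normal(0, 3, n)       # recessive-like effect on gain

rss0 = np.sum((weight - weight.mean()) ** 2)                # null: one common mean
rss1 = sum(np.sum((weight[geno == g] - weight[geno == g].mean()) ** 2)
           for g in np.unique(geno))                        # full: one mean per genotype
lod = (n / 2) * np.log10(rss0 / rss1)                       # Gaussian-likelihood LOD
print(f"LOD = {lod:.1f}")
```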

  3. Mapping invasive wetland plants in the Hudson River National Estuarine Research Reserve using quickbird satellite imagery

    USGS Publications Warehouse

    Laba, M.; Downs, R.; Smith, S.; Welsh, S.; Neider, C.; White, S.; Richmond, M.; Philpot, W.; Baveye, P.

    2008-01-01

    The National Estuarine Research Reserve (NERR) program is a nationally coordinated research and monitoring program that identifies and tracks changes in ecological resources of representative estuarine ecosystems and coastal watersheds. In recent years, attention has focused on using high spatial and spectral resolution satellite imagery to map and monitor wetland plant communities in the NERRs, particularly invasive plant species. The utility of this technology for that purpose has yet to be assessed in detail. To that end, a specific high spatial resolution satellite imagery, QuickBird, was used to map plant communities and monitor invasive plants within the Hudson River NERR (HRNERR). The HRNERR contains four diverse tidal wetlands (Stockport Flats, Tivoli Bays, Iona Island, and Piermont), each with unique water chemistry (i.e., brackish, oligotrophic and fresh) and, consequently, unique assemblages of plant communities, including three invasive plants (Trapa natans, Phragmites australis, and Lythrum salicaria). A maximum-likelihood classification was used to produce 20-class land cover maps for each of the four marshes within the HRNERR. Conventional contingency tables and a fuzzy set analysis served as a basis for an accuracy assessment of these maps. The overall accuracies, as assessed by the contingency tables, were 73.6%, 68.4%, 67.9%, and 64.9% for Tivoli Bays, Stockport Flats, Piermont, and Iona Island, respectively. Fuzzy assessment tables lead to higher estimates of map accuracies of 83%, 75%, 76%, and 76%, respectively. In general, the open water/tidal channel class was the most accurately mapped class and Scirpus sp. was the least accurately mapped. These encouraging accuracies suggest that high-resolution satellite imagery offers significant potential for the mapping of invasive plant species in estuarine environments. © 2007 Elsevier Inc. All rights reserved.

  4. Using remote sensing in support of environmental management: A framework for selecting products, algorithms and methods.

    PubMed

    de Klerk, Helen M; Gilbertson, Jason; Lück-Vogel, Melanie; Kemp, Jaco; Munch, Zahn

    2016-11-01

    Traditionally, to map environmental features using remote sensing, practitioners will use training data to develop models on various satellite data sets using a number of classification approaches and use test data to select a single 'best performer' from which the final map is made. We use a combination of an omission/commission plot to evaluate various results and compile a probability map based on consistently strong performing models across a range of standard accuracy measures. We suggest that this easy-to-use approach can be applied in any study using remote sensing to map natural features for management action. We demonstrate this approach using optical remote sensing products of different spatial and spectral resolution to map the endemic and threatened flora of quartz patches in the Knersvlakte, South Africa. Quartz patches can be mapped using either SPOT 5 (used due to its relatively fine spatial resolution) or Landsat8 imagery (used because it is freely accessible and has higher spectral resolution). Of the variety of classification algorithms available, we tested maximum likelihood and support vector machine, and applied these to raw spectral data, the first three PCA summaries of the data, and the standard normalised difference vegetation index. We found that there is no 'one size fits all' solution to the choice of a 'best fit' model (i.e. combination of classification algorithm or data sets), which is in agreement with the literature that classifier performance will vary with data properties. We feel this lends support to our suggestion that rather than the identification of a 'single best' model and a map based on this result alone, a probability map based on the range of consistently top performing models provides a rigorous solution to environmental mapping. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. The Importance of Temporal and Spatial Vegetation Structure Information in Biotope Mapping Schemes: A Case Study in Helsingborg, Sweden

    NASA Astrophysics Data System (ADS)

    Gao, Tian; Qiu, Ling; Hammer, Mårten; Gunnarsson, Allan

    2012-02-01

    Temporal and spatial vegetation structure has an impact on biodiversity qualities. Yet current biotope mapping schemes incorporate these factors only to a limited extent. The purpose of this study is to evaluate the application of a modified biotope mapping scheme that includes temporal and spatial vegetation structure. A refined scheme was developed based on a biotope classification and applied to a green structure system in Helsingborg city in southern Sweden. It includes four parameters of vegetation structure: continuity of forest cover, age of dominant trees, horizontal structure, and vertical structure. The major green structure sites were determined by interpretation of panchromatic aerial photographs assisted by a field survey. A set of biotope maps was constructed on the basis of each level of the modified classification. The evaluation of the scheme covered two aspects in particular: comparison of species richness between long-continuity and short-continuity forests, based on identification of woodland continuity using ancient woodland indicator (AWI) species and related historical documents, and the spatial distribution of animals in the green space in relation to vegetation structure. The results indicate that (1) regarding forest continuity, the richness of AWI species, verified against historical documents, was higher in long-continuity forests; Simpson's diversity differed significantly between long- and short-continuity forests; and total species richness and Shannon's diversity were much higher in long-continuity forests, showing a very significant difference; and (2) the spatial vegetation structure and age of stands influence the richness and abundance of the avian fauna and rabbits, and distance to the nearest tree and shrub was a strong determinant of presence for these animal groups. It is concluded that continuity of forest cover, age of dominant trees, and horizontal and vertical structures of vegetation should now be included in urban biotope classifications.

  6. Assessing landscape scale wildfire exposure for highly valued resources in a Mediterranean area.

    PubMed

    Alcasena, Fermín J; Salis, Michele; Ager, Alan A; Arca, Bachisio; Molina, Domingo; Spano, Donatella

    2015-05-01

    We used a fire simulation modeling approach to assess landscape scale wildfire exposure for highly valued resources and assets (HVR) in a fire-prone area of 680 km² located in central Sardinia, Italy. The study area was affected by several wildfires in the last half century: some large and intense fire events threatened wildland-urban interfaces as well as other socioeconomic and cultural values. Historical wildfire and weather data were used to inform wildfire simulations, which were based on the minimum travel time algorithm as implemented in FlamMap. We simulated 90,000 fires that replicated recent large fire events in the area spreading under severe weather conditions to generate detailed maps of wildfire likelihood and intensity. We then linked the fire modeling outputs to a geospatial risk assessment framework focusing on buffer areas around HVRs. The results highlighted large variation in burn probability and fire intensity in the vicinity of HVRs, and allowed us to identify the areas most exposed to wildfires and thus subject to the highest potential damage. Fire intensity in the HVR buffers was mainly related to fuel types, while wind direction, topographic features, and the historically based ignition pattern were the key factors affecting fire likelihood. The methodology presented in this work can have numerous applications, in the study area and elsewhere, particularly to address and inform fire risk management, landscape planning, and public safety in the vicinity of HVRs.
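
    A burn probability surface of the kind described above is, at its core, a per-cell frequency over many simulated fires. The sketch below (mine, not FlamMap's minimum travel time computation) uses a toy elliptical spread model around random ignitions; grid size, fire count, ellipse shape, and the flame-length draw are all illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(1)
      ny, nx, n_fires = 100, 100, 9000
      yy, xx = np.mgrid[0:ny, 0:nx]
      burn_count = np.zeros((ny, nx))
      flame_len_sum = np.zeros((ny, nx))

      for _ in range(n_fires):
          # Toy "fire": an elliptical burned patch around a random ignition,
          # elongated along the prevailing wind direction (a crude stand-in
          # for a minimum-travel-time fire spread simulation).
          iy, ix = rng.integers(ny), rng.integers(nx)
          burned = ((yy - iy) / 8.0) ** 2 + ((xx - ix) / 18.0) ** 2 <= 1.0
          burn_count += burned
          flame_len_sum += burned * rng.uniform(0.5, 4.0)  # toy intensity (m)

      burn_prob = burn_count / n_fires          # wildfire likelihood map
      mean_flame = np.divide(flame_len_sum, burn_count,
                             out=np.zeros_like(flame_len_sum),
                             where=burn_count > 0)  # conditional intensity map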

  7. Bit Error Probability for Maximum Likelihood Decoding of Linear Block Codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Fossorier, Marc P. C.; Rhee, Dojun

    1996-01-01

    In this paper, the bit error probability P_b for maximum likelihood decoding of binary linear codes is investigated. The contribution of each information bit to P_b is considered. For randomly generated codes, it is shown that the conventional high-SNR approximation P_b ≈ (d_H/N)·P_s, where P_s represents the block error probability, holds for systematic encoding only. Systematic encoding also provides the minimum P_b when the inverse mapping corresponding to the generator matrix of the code is used to retrieve the information sequence. The bit error performances corresponding to other generator matrix forms are also evaluated. Although derived for codes with a randomly generated generator matrix, these results are shown to provide good approximations for codes used in practice. Finally, for decoding methods that require a generator matrix with a particular structure, such as trellis decoding or algebraic-based soft decision decoding, equivalent schemes that reduce the bit error probability are discussed.
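
    As a small numeric illustration of the approximation above (my example, not from the paper), take d_H to be the code's minimum Hamming distance and use the (7, 4) Hamming code with an assumed block error probability:

      # Toy check of P_b ≈ (d_H/N)·P_s for systematic encoding.
      # The (7,4) Hamming code and the value of P_s are illustrative only.
      N, d_H = 7, 3           # block length, minimum Hamming distance
      P_s = 1e-4              # assumed block error probability at high SNR
      P_b = (d_H / N) * P_s   # approximate bit error probability
      print(P_b)              # ~4.29e-05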

  8. Endangered Species Act and energy facility planning: compliance and conflict

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shreeve, D; Calef, C; Nagy, J

    1978-05-01

    New energy facilities such as coal mines, gasification plants, refineries, and power plants--because of their severe environmental impacts--may, if sited haphazardly, jeopardize endangered species. By law, conflicts between energy-facility siting and endangered species occurrence must be minimized. To assess the likelihood of such conflicts arising, the authors used data from the Fish and Wildlife Service, Endangered Species Office, that describe the species' ranges by county. This data set was matched with county-level occurrences of imminent energy developments to find counties of overlap and hence potential conflict. An index was developed to measure the likelihood of actual conflict occurring in such counties. Factors determining the index are: the number of endangered species inhabiting the county, the number of energy-related developments, and the degree to which the county remains in a wild or undeveloped state. Maps were prepared showing (1) geographic ranges of endangered species by taxonomic groups (mammals, fish, etc.) and (2) counties of conflict.

  9. Programmer's manual for MMLE3, a general FORTRAN program for maximum likelihood parameter estimation

    NASA Technical Reports Server (NTRS)

    Maine, R. E.

    1981-01-01

    MMLE3 is a maximum likelihood parameter estimation program capable of handling general bilinear dynamic equations of arbitrary order with measurement noise and/or state noise (process noise). The basic MMLE3 program is quite general and, therefore, applicable to a wide variety of problems. The basic program can interact with a set of user-written, problem-specific routines to simplify the use of the program on specific systems. A set of user routines for the aircraft stability and control derivative estimation problem is provided with the program. The implementation of the program on specific computer systems is discussed. The structure of the program is diagrammed, and the function and operation of individual routines are described. Complete listings and reference maps of the routines are included on microfiche as a supplement. Four test cases are discussed; listings of the input cards and program output for the test cases are included on microfiche as a supplement.
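
    The core of such a program can be illustrated compactly. The sketch below (mine, not MMLE3) estimates one parameter of a scalar linear system with both process and measurement noise: a Kalman filter evaluates the innovation-form log-likelihood, and a numerical optimizer maximizes it. The system, noise levels, and the single estimated parameter are toy choices.

      import numpy as np
      from scipy.optimize import minimize_scalar

      rng = np.random.default_rng(2)
      a_true, q, r, n = 0.8, 0.04, 0.25, 400
      x, ys = 0.0, []
      for _ in range(n):                            # x_{k+1} = a*x_k + w_k
          x = a_true * x + rng.normal(0, np.sqrt(q))
          ys.append(x + rng.normal(0, np.sqrt(r)))  # y_k = x_k + v_k
      ys = np.asarray(ys)

      def neg_log_lik(a):
          """Innovation-form negative log-likelihood via a Kalman filter."""
          x_hat, p, nll = 0.0, 1.0, 0.0
          for y in ys:
              x_pred, p_pred = a * x_hat, a * a * p + q
              s = p_pred + r                        # innovation variance
              e = y - x_pred                        # innovation
              nll += 0.5 * (np.log(2 * np.pi * s) + e * e / s)
              k = p_pred / s                        # Kalman gain
              x_hat, p = x_pred + k * e, (1 - k) * p_pred
          return nll

      a_ml = minimize_scalar(neg_log_lik, bounds=(0.0, 1.0), method="bounded").x
      print(a_ml)                                   # close to a_true = 0.8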

  10. Can policy analysis theories predict and inform policy change? Reflections on the battle for legal abortion in Indonesia

    PubMed Central

    Surjadjaja, Claudia; Mayhew, Susannah H

    2011-01-01

    The relevance and importance of research for understanding policy processes and influencing policies has been much debated, but studies on the effectiveness of policy theories for predicting and informing opportunities for policy change (i.e. prospective policy analysis) are rare. The case study presented in this paper is drawn from a policy analysis of a contemporary process of policy debate on legalization of abortion in Indonesia, which was in flux at the time of the research and provided a unique opportunity for prospective analysis. Applying a combination of policy analysis theories, this case study provides an analysis of processes, power and relationships between actors involved in the amendment of the Health Law in Indonesia. It uses a series of practical stakeholder mapping tools to identify power relations between key actors and what strategic approaches should be employed to manage these to enhance the possibility of policy change. The findings show how the moves to legalize abortion have been supported or constrained according to the balance of political and religious powers operating in a macro-political context defined increasingly by a polarized Islamic-authoritarian—Western-liberal agenda. The issue of reproductive health constituted a battlefield where these two ideologies met and the debate on the current health law amendment became a contest, which still continues, for the larger future of Indonesia. The findings confirm the utility of policy analysis theories and stakeholder mapping tools for predicting the likelihood of policy change and informing the strategic approaches for achieving such change. They also highlight opportunities and dilemmas in prospective policy analysis and raise questions about whether research on policy processes and actors can or should be used to inform, or even influence, policies in ‘real-time’. PMID:21183461

  11. Can policy analysis theories predict and inform policy change? Reflections on the battle for legal abortion in Indonesia.

    PubMed

    Surjadjaja, Claudia; Mayhew, Susannah H

    2011-09-01

    The relevance and importance of research for understanding policy processes and influencing policies has been much debated, but studies on the effectiveness of policy theories for predicting and informing opportunities for policy change (i.e. prospective policy analysis) are rare. The case study presented in this paper is drawn from a policy analysis of a contemporary process of policy debate on legalization of abortion in Indonesia, which was in flux at the time of the research and provided a unique opportunity for prospective analysis. Applying a combination of policy analysis theories, this case study provides an analysis of processes, power and relationships between actors involved in the amendment of the Health Law in Indonesia. It uses a series of practical stakeholder mapping tools to identify power relations between key actors and what strategic approaches should be employed to manage these to enhance the possibility of policy change. The findings show how the moves to legalize abortion have been supported or constrained according to the balance of political and religious powers operating in a macro-political context defined increasingly by a polarized Islamic-authoritarian-Western-liberal agenda. The issue of reproductive health constituted a battlefield where these two ideologies met and the debate on the current health law amendment became a contest, which still continues, for the larger future of Indonesia. The findings confirm the utility of policy analysis theories and stakeholder mapping tools for predicting the likelihood of policy change and informing the strategic approaches for achieving such change. They also highlight opportunities and dilemmas in prospective policy analysis and raise questions about whether research on policy processes and actors can or should be used to inform, or even influence, policies in 'real-time'.

  12. Likelihood-Based Clustering of Meta-Analytic SROC Curves

    ERIC Educational Resources Information Center

    Holling, Heinz; Bohning, Walailuck; Bohning, Dankmar

    2012-01-01

    Meta-analyses of diagnostic studies experience the common problem that different studies might not be comparable, since they may have used different cut-off values for the continuous or ordered categorical diagnostic test value, defining different regions for which the diagnostic test is considered positive. Hence specificities and…

  13. Campus-Based Practices for Promoting Student Success: Financial Aid. Research Brief

    ERIC Educational Resources Information Center

    Horn, Aaron S.; Reinert, Leah

    2014-01-01

    Financial aid may be particularly critical for promoting full-time enrollment, continuous enrollment, and a manageable balance of school and work responsibilities, which influence the likelihood of timely degree completion (Adelman, 2006; Attewell, Heil, & Reisel, 2012; Hossler et al., 2009). For example, Attewell, Heil, and Reisel (2012)…

  14. 24 CFR 570.456 - Ineligible activities and limitations on eligible activities.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... proposed project which includes speculative commercial or industrial space is intended to facilitate the... category for which such space is appropriate; and (B) There is a likelihood of continuation of the pattern...) The presumptions established in this paragraph (c)(1) will not apply if the speculative space...

  15. 24 CFR 570.456 - Ineligible activities and limitations on eligible activities.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... proposed project which includes speculative commercial or industrial space is intended to facilitate the... category for which such space is appropriate; and (B) There is a likelihood of continuation of the pattern...) The presumptions established in this paragraph (c)(1) will not apply if the speculative space...

  16. 24 CFR 570.456 - Ineligible activities and limitations on eligible activities.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... proposed project which includes speculative commercial or industrial space is intended to facilitate the... category for which such space is appropriate; and (B) There is a likelihood of continuation of the pattern...) The presumptions established in this paragraph (c)(1) will not apply if the speculative space...

  17. Reinventing School Finance: Falling Forward

    ERIC Educational Resources Information Center

    Picus, Lawrence O.; Odden, Allan R.

    2011-01-01

    States and school districts are facing unprecedented financial pressure due to the continued poor performance of the United States economy. Dramatic shortfalls in funding due to reduced tax collections were held off for 2 years thanks to federal stimulus funds, but with these revenues already consumed and little likelihood of more in the near…

  18. Social Background and School Continuation Decisions. Discussion Papers No. 462.

    ERIC Educational Resources Information Center

    Mare, Robert D.

    In this paper, logistic response models of the effects of parental socioeconomic characteristics and family structure on the probability of making selected school transitions for white American males are estimated by maximum likelihood. Transition levels examined include: (1) completion of elementary school; (2) attendance at high school; (3)…

  19. Motion-induced error reduction by combining Fourier transform profilometry with phase-shifting profilometry.

    PubMed

    Li, Beiwen; Liu, Ziping; Zhang, Song

    2016-10-03

    We propose a hybrid computational framework to reduce motion-induced measurement error by combining Fourier transform profilometry (FTP) and phase-shifting profilometry (PSP). The proposed method is composed of three major steps: Step 1 extracts continuous relative phase maps for each isolated object with the single-shot FTP method and spatial phase unwrapping; Step 2 obtains an absolute phase map of the entire scene using the PSP method, although motion-induced errors exist in the extracted absolute phase map; and Step 3 shifts the continuous relative phase maps from Step 1 to generate final absolute phase maps for each isolated object by referring to the error-prone absolute phase map from Step 2. Experiments demonstrate the success of the proposed computational framework for measuring multiple isolated, rapidly moving objects.
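
    Both phase-extraction steps can be sketched briefly. The code below is my illustration (not the authors' implementation): an N-step phase-shifting (PSP) arctangent phase and a Fourier-transform (FTP) phase from a single fringe image; the fringe frequency, image size, and band-pass width are assumptions.

      import numpy as np

      ny, nx, N, f0 = 64, 64, 3, 8            # image size, PSP steps, carrier freq.
      xcoord = np.arange(nx) / nx
      phi = 2 * np.pi * 0.3 * np.sin(2 * np.pi * xcoord)[None, :] * np.ones((ny, 1))

      # PSP: I_n = A + B*cos(carrier + phi + 2*pi*n/N); wrapped phase via arctan.
      imgs = [1 + 0.5 * np.cos(2 * np.pi * f0 * xcoord[None, :] + phi
                               + 2 * np.pi * n / N) for n in range(N)]
      num = sum(I * np.sin(2 * np.pi * n / N) for n, I in enumerate(imgs))
      den = sum(I * np.cos(2 * np.pi * n / N) for n, I in enumerate(imgs))
      phase_psp = -np.arctan2(num, den)       # wrapped carrier + object phase

      # FTP: band-pass the positive carrier lobe of a single image's spectrum,
      # then take the angle of the inverse transform (relative phase).
      F = np.fft.fft(imgs[0], axis=1)
      mask = np.zeros(nx)
      mask[f0 - 3:f0 + 4] = 1.0               # keep the +f0 lobe only
      phase_ftp = np.angle(np.fft.ifft(F * mask[None, :], axis=1))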

  20. Species distribution modelling for conservation of an endangered endemic orchid

    PubMed Central

    Wang, Hsiao-Hsuan; Wonkka, Carissa L.; Treglia, Michael L.; Grant, William E.; Smeins, Fred E.; Rogers, William E.

    2015-01-01

    Concerns regarding the long-term viability of threatened and endangered plant species are increasingly warranted given the potential impacts of climate change and habitat fragmentation on unstable and isolated populations. Orchidaceae is the largest and most diverse family of flowering plants, but it is currently facing unprecedented risks of extinction. Despite substantial conservation emphasis on rare orchids, populations continue to decline. Spiranthes parksii (Navasota ladies' tresses) is a federally and state-listed endangered terrestrial orchid endemic to central Texas. Hence, we aimed to identify potential factors influencing the distribution of the species, quantify the relative importance of each factor and determine suitable habitat for future surveys and targeted conservation efforts. We analysed several geo-referenced variables describing climatic conditions and landscape features to identify potential factors influencing the likelihood of occurrence of S. parksii using boosted regression trees. Our model classified 97 % of the cells correctly with regard to species presence and absence, and indicated that probability of existence was correlated with climatic conditions and landscape features. The most influential variables were mean annual precipitation, mean elevation, mean annual minimum temperature and mean annual maximum temperature. The most likely suitable range for S. parksii was the eastern portions of Leon and Madison Counties, the southern portion of Brazos County, a portion of northern Grimes County and along the borders between Burleson and Washington Counties. Our model can assist in the development of an integrated conservation strategy through: (i) focussing future survey and research efforts on areas with a high likelihood of occurrence, (ii) aiding in selection of areas for conservation and restoration and (iii) framing future research questions including those necessary for predicting responses to climate change. Our model could also incorporate new information on S. parksii as it becomes available to improve prediction accuracy, and our methodology could be adapted to develop distribution maps for other rare species of conservation concern. PMID:25900746

  1. Species distribution modelling for conservation of an endangered endemic orchid.

    PubMed

    Wang, Hsiao-Hsuan; Wonkka, Carissa L; Treglia, Michael L; Grant, William E; Smeins, Fred E; Rogers, William E

    2015-04-21

    Concerns regarding the long-term viability of threatened and endangered plant species are increasingly warranted given the potential impacts of climate change and habitat fragmentation on unstable and isolated populations. Orchidaceae is the largest and most diverse family of flowering plants, but it is currently facing unprecedented risks of extinction. Despite substantial conservation emphasis on rare orchids, populations continue to decline. Spiranthes parksii (Navasota ladies' tresses) is a federally and state-listed endangered terrestrial orchid endemic to central Texas. Hence, we aimed to identify potential factors influencing the distribution of the species, quantify the relative importance of each factor and determine suitable habitat for future surveys and targeted conservation efforts. We analysed several geo-referenced variables describing climatic conditions and landscape features to identify potential factors influencing the likelihood of occurrence of S. parksii using boosted regression trees. Our model classified 97 % of the cells correctly with regard to species presence and absence, and indicated that probability of existence was correlated with climatic conditions and landscape features. The most influential variables were mean annual precipitation, mean elevation, mean annual minimum temperature and mean annual maximum temperature. The most likely suitable range for S. parksii was the eastern portions of Leon and Madison Counties, the southern portion of Brazos County, a portion of northern Grimes County and along the borders between Burleson and Washington Counties. Our model can assist in the development of an integrated conservation strategy through: (i) focussing future survey and research efforts on areas with a high likelihood of occurrence, (ii) aiding in selection of areas for conservation and restoration and (iii) framing future research questions including those necessary for predicting responses to climate change. Our model could also incorporate new information on S. parksii as it becomes available to improve prediction accuracy, and our methodology could be adapted to develop distribution maps for other rare species of conservation concern. Published by Oxford University Press on behalf of the Annals of Botany Company.
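
    The modelling step both records describe can be sketched as follows (my illustration with synthetic data, not the authors' model): fit presence/absence against climatic predictors with boosted regression trees, inspect relative variable influence, and map the predicted likelihood of occurrence. The predictor ranges and the toy response are assumptions.

      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier

      rng = np.random.default_rng(3)
      n = 3000
      X = np.column_stack([rng.uniform(600, 1200, n),   # mean annual precip (mm)
                           rng.uniform(50, 150, n),     # mean elevation (m)
                           rng.uniform(-2, 10, n),      # mean annual min temp (C)
                           rng.uniform(25, 38, n)])     # mean annual max temp (C)
      logit = 0.01 * (X[:, 0] - 900) - 0.03 * (X[:, 1] - 100)
      y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))  # toy presence/absence

      brt = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05,
                                       max_depth=3).fit(X, y)
      print(dict(zip(["precip", "elev", "tmin", "tmax"],
                     brt.feature_importances_.round(3))))
      prob_occurrence = brt.predict_proba(X)[:, 1]   # likelihood-of-occurrence map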

  2. Saturated linkage map construction in Rubus idaeus using genotyping by sequencing and genome-independent imputation

    PubMed Central

    2013-01-01

    Background: Rapid development of highly saturated genetic maps aids molecular breeding, which can accelerate gain per breeding cycle in woody perennial plants such as Rubus idaeus (red raspberry). Recently, robust genotyping methods based on high-throughput sequencing were developed, which provide high marker density but result in some genotype errors and a large number of missing genotype values. Imputation can reduce the number of missing values and correct genotyping errors, but current methods of imputation require a reference genome and thus are not an option for most species. Results: Genotyping by Sequencing (GBS) was used to produce highly saturated maps for a R. idaeus pseudo-testcross progeny. While low coverage and high variance in sequencing resulted in a large number of missing values for some individuals, a novel method of imputation based on maximum likelihood marker ordering from initial marker segregation overcame the challenge of missing values and made map construction computationally tractable. The two resulting parental maps contained 4521 and 2391 molecular markers spanning 462.7 and 376.6 cM, respectively, over seven linkage groups. Detection of precise genomic regions with segregation distortion was possible because of map saturation. Microsatellites (SSRs) linked these results to published maps for cross-validation and map comparison. Conclusions: GBS together with genome-independent imputation provides a rapid method for genetic map construction in any pseudo-testcross progeny. Our method of imputation estimates the correct genotype call of missing values and corrects genotyping errors that lead to inflated map size and reduced precision in marker placement. Comparison of SSRs to published R. idaeus maps showed that the linkage maps constructed with GBS and our method of imputation were robust, and marker positioning reliable. The high marker density allowed identification of genomic regions with segregation distortion in R. idaeus, which may help to identify deleterious alleles that are the basis of inbreeding depression in the species. PMID:23324311

  3. Assessing reliability of protein-protein interactions by integrative analysis of data in model organisms.

    PubMed

    Lin, Xiaotong; Liu, Mei; Chen, Xue-wen

    2009-04-29

    Protein-protein interactions play vital roles in nearly all cellular processes and are involved in the construction of biological pathways such as metabolic and signal transduction pathways. Although large-scale experiments have enabled the discovery of thousands of previously unknown linkages among proteins in many organisms, high-throughput interaction data are often associated with high error rates. Since protein interaction networks have been utilized in numerous biological inferences, the experimental errors they include inevitably affect the quality of such predictions. Thus, it is essential to assess the quality of the protein interaction data. In this paper, a novel Bayesian network-based integrative framework is proposed to assess the reliability of protein-protein interactions. We develop a cross-species in silico model that assigns likelihood scores to individual protein pairs based on information extracted entirely from model organisms. Our proposed approach integrates multiple microarray datasets and novel features derived from gene ontology. Furthermore, the confidence scores for cross-species protein mappings are explicitly incorporated into our model. Applying our model to predict protein interactions in the human genome, we achieve 80% sensitivity and 70% specificity. Finally, we assess the overall quality of the experimentally determined yeast protein-protein interaction dataset. We observe that the more high-throughput experiments confirm an interaction, the higher its likelihood score, which confirms the effectiveness of our approach. This study demonstrates that model organisms provide important information for protein-protein interaction inference and assessment. The proposed method is able to assess not only the overall quality of an interaction dataset but also the quality of individual protein-protein interactions, and it is readily scalable to a genome-wide application. We expect the method to improve continually as more high-quality interaction data from more model organisms become available.

  4. Hybrid ICA-Bayesian network approach reveals distinct effective connectivity differences in schizophrenia.

    PubMed

    Kim, D; Burge, J; Lane, T; Pearlson, G D; Kiehl, K A; Calhoun, V D

    2008-10-01

    We utilized a discrete dynamic Bayesian network (dDBN) approach (Burge, J., Lane, T., Link, H., Qiu, S., Clark, V.P., 2007. Discrete dynamic Bayesian network analysis of fMRI data. Hum Brain Mapp.) to determine differences in brain regions between patients with schizophrenia and healthy controls on a measure of effective connectivity, termed the approximate conditional likelihood score (ACL) (Burge, J., Lane, T., 2005. Learning Class-Discriminative Dynamic Bayesian Networks. Proceedings of the International Conference on Machine Learning, Bonn, Germany, pp. 97-104.). The ACL score represents a class-discriminative measure of effective connectivity by measuring the relative likelihood of the correlation between brain regions in one group versus another. The algorithm is capable of finding non-linear relationships between brain regions because it uses discrete rather than continuous values and attempts to model temporal relationships with a first-order Markov and stationary assumption constraint (Papoulis, A., 1991. Probability, random variables, and stochastic processes. McGraw-Hill, New York.). Since Bayesian networks are overly sensitive to noisy data, we introduced an independent component analysis (ICA) filtering approach that attempted to reduce the noise found in fMRI data by unmixing the raw datasets into a set of independent spatial component maps. Components that represented noise were removed and the remaining components reconstructed into the dimensions of the original fMRI datasets. We applied the dDBN algorithm to a group of 35 patients with schizophrenia and 35 matched healthy controls using an ICA filtered and unfiltered approach. We determined that filtering the data significantly improved the magnitude of the ACL score. Patients showed the greatest ACL scores in several regions, most markedly the cerebellar vermis and hemispheres. Our findings suggest that schizophrenia patients exhibit weaker connectivity than healthy controls in multiple regions, including bilateral temporal, frontal, and cerebellar regions during an auditory paradigm.

  5. A quantitative trait locus mixture model that avoids spurious LOD score peaks.

    PubMed Central

    Feenstra, Bjarke; Skovgaard, Ib M

    2004-01-01

    In standard interval mapping of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. At any given location in the genome, the evidence of a putative QTL is measured by the likelihood ratio of the mixture model compared to a single normal distribution (the LOD score). This approach can occasionally produce spurious LOD score peaks in regions of low genotype information (e.g., widely spaced markers), especially if the phenotype distribution deviates markedly from a normal distribution. Such peaks are not indicative of a QTL effect; rather, they are caused by the fact that a mixture of normals always produces a better fit than a single normal distribution. In this study, a mixture model for QTL mapping that avoids the problems of such spurious LOD score peaks is presented. PMID:15238544

  6. A quantitative trait locus mixture model that avoids spurious LOD score peaks.

    PubMed

    Feenstra, Bjarke; Skovgaard, Ib M

    2004-06-01

    In standard interval mapping of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. At any given location in the genome, the evidence of a putative QTL is measured by the likelihood ratio of the mixture model compared to a single normal distribution (the LOD score). This approach can occasionally produce spurious LOD score peaks in regions of low genotype information (e.g., widely spaced markers), especially if the phenotype distribution deviates markedly from a normal distribution. Such peaks are not indicative of a QTL effect; rather, they are caused by the fact that a mixture of normals always produces a better fit than a single normal distribution. In this study, a mixture model for QTL mapping that avoids the problems of such spurious LOD score peaks is presented.
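
    The LOD computation both records refer to, and the artifact they warn about, can be reproduced in a few lines. In this sketch (mine, not the paper's corrected model), the phenotype is skewed and there is no QTL, and the mixture weights are uninformative (0.5 everywhere, i.e., no genotype information); the mixture nevertheless beats the single normal, inflating the LOD, which is precisely the spurious-peak behavior described above.

      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import minimize

      rng = np.random.default_rng(4)
      y = rng.gamma(2.0, 1.0, 200)        # skewed phenotype, no QTL present
      w = np.full(y.size, 0.5)            # P(genotype class A | markers): no info

      def nll_mixture(theta):
          """Negative log-likelihood of a two-mean, common-variance mixture."""
          mu1, mu2, log_s = theta
          s = np.exp(log_s)
          lik = w * norm.pdf(y, mu1, s) + (1 - w) * norm.pdf(y, mu2, s)
          return -np.sum(np.log(lik))

      fit = minimize(nll_mixture, x0=[1.0, 3.0, 0.0], method="Nelder-Mead")
      nll_single = -np.sum(norm.logpdf(y, y.mean(), y.std()))
      lod = (nll_single - fit.fun) / np.log(10)   # log10 likelihood ratio
      print(lod)                          # > 0 despite the absence of any QTL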

  7. Maximum-Likelihood Estimation for Frequency-Modulated Continuous-Wave Laser Ranging Using Photon-Counting Detectors

    DTIC Science & Technology

    2013-01-01

    …are calculated from coherently detected fields, e.g., coherent Doppler lidar. Our Cramér-Rao bound (CRB) results reveal that the best-case mean-square error scales as 1…

  8. FPGA Acceleration of the phylogenetic likelihood function for Bayesian MCMC inference methods.

    PubMed

    Zierke, Stephanie; Bakos, Jason D

    2010-04-12

    Maximum likelihood (ML)-based phylogenetic inference has become a popular method for estimating the evolutionary relationships among species based on genomic sequence data. This method is used in applications such as RAxML, GARLI, MrBayes, PAML, and PAUP. The Phylogenetic Likelihood Function (PLF) is an important kernel computation for this method. The PLF consists of a loop with no conditional behavior or dependencies between iterations; as such, it offers high potential for exploiting parallelism using micro-architectural techniques. In this paper, we describe a technique for mapping the PLF and supporting logic onto a Field Programmable Gate Array (FPGA)-based co-processor. By leveraging the FPGA's on-chip DSP modules and the high-bandwidth local memory attached to the FPGA, the resulting co-processor can accelerate ML-based methods and outperform state-of-the-art multi-core processors. We use the MrBayes 3 tool as a framework for designing our co-processor. For large datasets, we estimate that our accelerated MrBayes, if run on a current-generation FPGA, achieves a 10x speedup relative to software running on a state-of-the-art server-class microprocessor. The FPGA-based implementation achieves its performance by deeply pipelining the likelihood computations, performing multiple floating-point operations in parallel, and through a natural-log approximation chosen specifically to leverage a deeply pipelined custom architecture. Heterogeneous computing, which combines general-purpose processors with special-purpose co-processors such as FPGAs and GPUs, is a promising approach for high-performance phylogeny inference, as shown by the growing body of literature in this field. FPGAs in particular are well suited to this task because of their low power consumption compared to many-core processors and Graphics Processing Units (GPUs).
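
    The loop in question is the conditional-likelihood recursion of Felsenstein's pruning algorithm; each site is independent, which is what makes deep pipelining attractive. Below is a minimal software sketch of that kernel (mine, not the FPGA design); the transition matrices and child likelihoods are synthetic.

      import numpy as np

      rng = np.random.default_rng(5)
      n_sites, n_states = 1000, 4                        # DNA: A, C, G, T
      left = rng.dirichlet(np.ones(n_states), n_sites)   # child conditional likelihoods
      right = rng.dirichlet(np.ones(n_states), n_sites)
      P_left = np.full((n_states, n_states), 0.05) + 0.8 * np.eye(n_states)
      P_right = np.full((n_states, n_states), 0.02) + 0.92 * np.eye(n_states)

      # For each site i and parent state s:
      #   parent[i, s] = (sum_x P_left[s, x] * left[i, x])
      #                * (sum_y P_right[s, y] * right[i, y])
      parent = (left @ P_left.T) * (right @ P_right.T)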

  9. Free kick instead of cross-validation in maximum-likelihood refinement of macromolecular crystal structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pražnikar, Jure; University of Primorska; Turk, Dušan, E-mail: dusan.turk@ijs.si

    2014-12-01

    The maximum-likelihood free-kick target, which calculates model error estimates from the work set and a randomly displaced model, proved superior in the accuracy and consistency of refinement of crystal structures compared with the maximum-likelihood cross-validation target, which calculates error estimates from the test set and the unperturbed model. The refinement of a molecular model is a computational procedure by which the atomic model is fitted to the diffraction data. The commonly used target in the refinement of macromolecular structures is the maximum-likelihood (ML) function, which relies on the assessment of model errors. The current ML functions rely on cross-validation. They utilize phase-error estimates that are calculated from a small fraction of diffraction data, called the test set, that are not used to fit the model. An approach has been developed that uses the work set to calculate the phase-error estimates in the ML refinement by simulating the model errors via the random displacement of atomic coordinates. It is called ML free-kick refinement, as it uses the ML formulation of the target function and is based on the idea of freeing the model from the model bias imposed by the chemical energy restraints used in refinement. This approach to the calculation of error estimates is superior to the cross-validation approach: it reduces the phase error and increases the accuracy of molecular models, is more robust, provides clearer maps, and may use a smaller portion of data for the test set for the calculation of R_free or may leave it out completely.

  10. Satellite image based methods for fuels maps updating

    NASA Astrophysics Data System (ADS)

    Alonso-Benito, Alfonso; Hernandez-Leal, Pedro A.; Arbelo, Manuel; Gonzalez-Calvo, Alejandro; Moreno-Ruiz, Jose A.; Garcia-Lazaro, Jose R.

    2016-10-01

    Regular updating of fuels maps is important for forest fire management. Nevertheless, complex and time-consuming field work is usually necessary for this purpose, which prevents more frequent updates. That is why assessing the usefulness of satellite data, and developing remote sensing techniques that enable the automatic updating of these maps, is of vital interest. In this work, we tested the use of the spectral bands of the OLI (Operational Land Imager) sensor on board the Landsat 8 satellite for updating the fuels map of El Hierro Island (Spain). From the previously digitized map, a set of 200 reference plots for different fuel types was created. Half of the plots were randomly selected as a training set and the rest were used for validation. Six supervised and two unsupervised classification methods were applied, considering two levels of detail: a first level with only 5 classes (Meadow, Brushwood, Undergrowth canopy cover >50%, Undergrowth canopy cover <15%, and Xeric formations), and a second one containing 19 fuel types. The level 1 classification methods yielded an overall accuracy ranging from 44% for Parallelepiped to 84% for Maximum Likelihood. Meanwhile, the level 2 results showed, at best, an unacceptable overall accuracy of 34%, which prevents the use of these data for such a detailed characterization. In any case, it was demonstrated that under some conditions, images of medium spatial resolution, like Landsat 8-OLI, can be a valid tool for the automatic updating of fuels maps, minimizing costs and complementing traditional methodologies.

  11. Wide-cross whole-genome radiation hybrid mapping of cotton (Gossypium hirsutum L.).

    PubMed Central

    Gao, Wenxiang; Chen, Z Jeffrey; Yu, John Z; Raska, Dwaine; Kohel, Russell J; Womack, James E; Stelly, David M

    2004-01-01

    We report the development and characterization of a "wide-cross whole-genome radiation hybrid" (WWRH) panel from cotton (Gossypium hirsutum L.). Chromosomes were segmented by gamma-irradiation of G. hirsutum (n = 26) pollen, and segmented chromosomes were rescued after in vivo fertilization of G. barbadense egg cells (n = 26). A 5-krad gamma-ray WWRH mapping panel (N = 93) was constructed and genotyped at 102 SSR loci. SSR marker retention frequencies were higher than those for animal systems and marker retention patterns were informative. Using the program RHMAP, 52 of 102 SSR markers were mapped into 16 syntenic groups. Linkage group 9 (LG 9) SSR markers BNL0625 and BNL2805 had been colocalized by linkage analysis, but their order was resolved by differential retention among WWRH plants. Two linkage groups, LG 13 and LG 9, were combined into one syntenic group, and the chromosome 1 linkage group marker BNL4053 was reassigned to chromosome 9. Analyses of cytogenetic stocks supported synteny of LG 9 and LG 13 and localized them to the short arm of chromosome 17. They also supported reassignment of marker BNL4053 to the long arm of chromosome 9. A WWRH map of the syntenic group composed of linkage groups 9 and 13 was constructed by maximum-likelihood analysis under the general retention model. The results demonstrate not only the feasibility of WWRH panel construction and mapping, but also complementarity to traditional linkage mapping and cytogenetic methods. PMID:15280245

  12. Mapping the integrated Sachs-Wolfe effect

    NASA Astrophysics Data System (ADS)

    Manzotti, A.; Dodelson, S.

    2014-12-01

    On large scales, the anisotropies in the cosmic microwave background (CMB) reflect not only the primordial density field but also the energy gained when photons traverse the decaying gravitational potentials of large scale structure, the so-called integrated Sachs-Wolfe (ISW) effect. Decomposing the anisotropy signal into a primordial piece and an ISW component, the main secondary effect on large scales, is more urgent than ever as cosmologists strive to understand the Universe on those scales. We present a likelihood technique for extracting the ISW signal by combining measurements of the CMB, the distribution of galaxies, and maps of gravitational lensing. We test this technique with simulated data, showing that we can successfully reconstruct the ISW map using all the data sets together. We then present the ISW map obtained from a combination of real data: the NRAO VLA Sky Survey (NVSS) galaxy survey, and temperature anisotropy and lensing maps made by the Planck satellite. This map shows that, with the data sets used and assuming linear physics, there is no evidence from the reconstructed ISW signal in the Cold Spot region for an entirely ISW origin of this large scale anomaly in the CMB. However, a large scale structure origin from low-redshift voids outside the NVSS redshift range is still possible. Finally, we show that future surveys, thanks to better large scale lensing reconstruction, will be able to improve the reconstruction signal to noise, which now comes mainly from galaxy surveys.

  13. Fine Mapping Causal Variants with an Approximate Bayesian Method Using Marginal Test Statistics.

    PubMed

    Chen, Wenan; Larrabee, Beth R; Ovsyannikova, Inna G; Kennedy, Richard B; Haralambieva, Iana H; Poland, Gregory A; Schaid, Daniel J

    2015-07-01

    Two recently developed fine-mapping methods, CAVIAR and PAINTOR, demonstrate better performance over other fine-mapping methods. They also have the advantage of using only the marginal test statistics and the correlation among SNPs. Both methods leverage the fact that the marginal test statistics asymptotically follow a multivariate normal distribution and are likelihood based. However, their relationship with Bayesian fine mapping, such as BIMBAM, is not clear. In this study, we first show that CAVIAR and BIMBAM are actually approximately equivalent to each other. This leads to a fine-mapping method using marginal test statistics in the Bayesian framework, which we call CAVIAR Bayes factor (CAVIARBF). Another advantage of the Bayesian framework is that it can answer both association and fine-mapping questions. We also used simulations to compare CAVIARBF with other methods under different numbers of causal variants. The results showed that both CAVIARBF and BIMBAM have better performance than PAINTOR and other methods. Compared to BIMBAM, CAVIARBF has the advantage of using only marginal test statistics and takes about one-quarter to one-fifth of the running time. We applied different methods on two independent cohorts of the same phenotype. Results showed that CAVIARBF, BIMBAM, and PAINTOR selected the same top 3 SNPs; however, CAVIARBF and BIMBAM had better consistency in selecting the top 10 ranked SNPs between the two cohorts. Software is available at https://bitbucket.org/Wenan/caviarbf. Copyright © 2015 by the Genetics Society of America.

  14. Wildlife tradeoffs based on landscape models of habitat

    USGS Publications Warehouse

    Loehle, C.; Mitchell, M.S.

    2000-01-01

    It is becoming increasingly clear that the spatial structure of landscapes affects the habitat choices and abundance of wildlife. In contrast to wildlife management based on preservation of critical habitat features such as nest sites on a beach or mast trees, it has not been obvious how to incorporate spatial structure into management plans. We present techniques to accomplish this goal. We used multiscale logistic regression models developed previously for neotropical migrant bird species habitat use in South Carolina (USA) as a basis for these techniques. Based on these models we used a spatial optimization technique to generate optimal maps (probability of occurrence, P = 1.0) for each of seven species. To emulate management of a forest for maximum species diversity, we defined the objective function of the algorithm as the sum of probabilities over the seven species, resulting in a complex map that allowed all seven species to coexist. The map that allowed for coexistence is not obvious, must be computed algorithmically, and would be difficult to realize using rules of thumb for habitat management. To assess how management of a forest for a single species of interest might affect other species, we analyzed tradeoffs by gradually increasing the weighting on a single species in the objective function over a series of simulations. We found that as habitat was increasingly modified to favor that species, the probability of presence for two of the other species was driven to zero. This shows that whereas it is not possible to simultaneously maximize the likelihood of presence for multiple species with divergent habitat preferences, compromise solutions are possible at less than maximal likelihood in many cases. Our approach suggests that the efficiency of habitat management for species diversity can be maximized even for small landscapes by incorporating spatial context. The methods we present are suitable for wildlife management, endangered species conservation, and nature reserve design.
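
    The optimization idea above can be sketched with a greedy cell-flipping stand-in for the authors' spatial optimization technique (my illustration, not their algorithm): flip habitat cells one at a time and keep flips that raise the summed species probabilities. The two toy "species models" respond to local habitat density at different scales; all parameters are assumptions.

      import numpy as np

      rng = np.random.default_rng(8)
      grid = rng.integers(0, 2, (20, 20)).astype(float)   # 1 = forest, 0 = open

      def species_prob(g, r, beta0, beta1):
          """Logistic probability of presence from habitat density within radius r."""
          shifts = [np.roll(np.roll(g, dy, 0), dx, 1)
                    for dy in range(-r, r + 1) for dx in range(-r, r + 1)]
          dens = np.mean(shifts, axis=0)    # local habitat density (wraps at edges)
          return 1 / (1 + np.exp(-(beta0 + beta1 * dens)))

      def objective(g):
          # Species 1 prefers dense forest; species 2 prefers open ground.
          return (species_prob(g, 1, -2.0, 5.0).mean()
                  + species_prob(g, 2, 2.0, -5.0).mean())

      for _ in range(2000):                 # greedy hill climbing on cell flips
          y, x = rng.integers(20), rng.integers(20)
          base = objective(grid)
          grid[y, x] = 1 - grid[y, x]
          if objective(grid) < base:        # revert flips that hurt the objective
              grid[y, x] = 1 - grid[y, x]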

  15. A partial differential equation-based general framework adapted to Rayleigh's, Rician's and Gaussian's distributed noise for restoration and enhancement of magnetic resonance image.

    PubMed

    Yadav, Ram Bharos; Srivastava, Subodh; Srivastava, Rajeev

    2016-01-01

    The proposed framework is obtained by casting the noise removal problem in a variational setting. The framework automatically identifies the type of noise present in the magnetic resonance image and filters it by choosing an appropriate filter. This filter comprises two terms: the first is a data likelihood term and the second is a prior function. The first term is obtained by minimizing the negative log likelihood of the corresponding probability density function: Gaussian, Rayleigh, or Rician. Further, due to the ill-posedness of the likelihood term, a prior function is needed. This paper examines three partial differential equation based priors: a total variation (TV) based prior, an anisotropic diffusion based prior, and a complex diffusion (CD) based prior. A regularization parameter balances the trade-off between the data fidelity term and the prior. A finite difference scheme is used for discretization of the proposed method. The performance analysis and a comparative study of the proposed method against other standard methods are presented for the BrainWeb dataset at varying noise levels, in terms of peak signal-to-noise ratio, mean square error, structural similarity index map, and correlation parameter. From the simulation results, it is observed that the proposed framework with the CD based prior performs better than the other priors under consideration.
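
    For the Gaussian case, the two-term functional above reduces (up to constants) to a quadratic data term plus a regularized prior, which can be minimized by gradient descent. The sketch below (mine, not the paper's full framework) uses a smoothed total-variation prior; the step size, regularization weight, and smoothing epsilon are assumptions.

      import numpy as np

      rng = np.random.default_rng(6)
      clean = np.zeros((64, 64))
      clean[16:48, 16:48] = 1.0
      f = clean + rng.normal(0.0, 0.3, clean.shape)     # Gaussian-noise observation

      # E(u) = 0.5*||u - f||^2 + lam * sum(sqrt(|grad u|^2 + eps)); the first
      # term is the Gaussian negative log-likelihood (sigma^2 absorbed into
      # lam), the second a smoothed total-variation prior.
      u, lam, eps, tau = f.copy(), 0.1, 1e-2, 0.1
      for _ in range(300):
          ux = np.roll(u, -1, axis=1) - u               # forward differences
          uy = np.roll(u, -1, axis=0) - u
          mag = np.sqrt(ux ** 2 + uy ** 2 + eps)        # smoothed |grad u|
          px, py = ux / mag, uy / mag
          div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
          u -= tau * ((u - f) - lam * div)              # gradient-descent step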

  16. Towards the Optimal Pixel Size of dem for Automatic Mapping of Landslide Areas

    NASA Astrophysics Data System (ADS)

    Pawłuszek, K.; Borkowski, A.; Tarolli, P.

    2017-05-01

    Determining the appropriate spatial resolution of a digital elevation model (DEM) is a key step for effective landslide analysis based on remote sensing data. Several studies have demonstrated that choosing the finest DEM resolution is not always the best solution: various DEM resolutions can be applicable for diverse landslide applications. This study therefore aims to assess the influence of spatial resolution on automatic landslide mapping. A pixel-based approach using parametric and non-parametric classification methods, namely feed forward neural network (FFNN) and maximum likelihood classification (ML), was applied in this study. Additionally, this allowed us to determine the impact of the classification method used on the selection of DEM resolution. Landslide-affected areas were mapped based on four DEMs generated at 1 m, 2 m, 5 m and 10 m spatial resolution from airborne laser scanning (ALS) data. The performance of the landslide mapping was then evaluated by applying a landslide inventory map and computing confusion matrices. The results of this study suggest that the finest DEM scale is not always the best fit, although working at 1 m DEM resolution on the micro-topography scale can show different results. The best performance was found using the 5 m DEM resolution for FFNN and the 1 m DEM resolution for ML classification.

  17. Use of ERTS-1 data: Summary report of work on ten tasks. [solving natural resources and environmental quality problems using ERTS-1 MSS data

    NASA Technical Reports Server (NTRS)

    Thomson, F. J.; Polcyn, F. C.; Bryan, M. L.; Sattinger, I. J.; Malila, W. A.; Nalepka, R. F.; Wezernak, C. T.; Horvath, R.; Vincent, R. K. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. Depth mappings for a portion of Lake Michigan and at the Little Bahama Bank test site have been verified by use of navigation charts and on-site visits. A thirteen-category recognition map of Yellowstone Park has been prepared. Model calculations of atmospheric effects for various altitudes have been prepared. Radar, SLAR, and ERTS-1 data for flooded areas of Monroe County, Michigan are being studied. Water bodies can be reliably recognized and mapped using maximum likelihood processing of ERTS-1 digital data. Wetland mapping has been accomplished by slicing of a single band and/or ratio processing of two bands for a single observation date. Both analog and digital processing have been used to map the Lake Ontario basin using ERTS-1 data. Operating characteristic curves were developed for the proportion estimation algorithm to determine its performance in the measurement of surface water area. The signal in band MSS-5 was related to the sediment content of waters by a modelling approach and by relating surface measurements of water to processed ERTS data. Radiance anomalies in ERTS-1 data could be associated with the presence of oil on water in San Francisco Bay, but the anomalies were of the same order as those caused by variations in sediment concentration and tidal flushing.

  18. Iterative Demodulation and Decoding of Non-Square QAM

    NASA Technical Reports Server (NTRS)

    Li, Lifang; Divsalar, Dariush; Dolinar, Samuel

    2004-01-01

    It has been shown that a non-square (NS) 2^(2n+1)-ary (where n is a positive integer) quadrature amplitude modulation [NS 2^(2n+1)-QAM] has inherent memory that can be exploited to obtain coding gains. Moreover, it should not be necessary to build new hardware to realize these gains. The present scheme is a product of theoretical calculations directed toward reducing the computational complexity of decoding coded 2^(2n+1)-QAM. In the general case of 2^(2n+1)-QAM, the signal constellation is not square and it is impossible to have independent in-phase (I) and quadrature-phase (Q) mapping and demapping. However, independent I and Q mapping and demapping are desirable for reducing the complexity of computing the log likelihood ratio (LLR) between a bit and a received symbol (such computations are essential operations in iterative decoding). This is because in modulation schemes that include independent I and Q mapping and demapping, each bit of a signal point is involved in only one-dimensional mapping and demapping. As a result, the computation of the LLR is equivalent to that of a one-dimensional pulse amplitude modulation (PAM) system. Therefore, it is desirable to find a signal constellation that enables independent I and Q mapping and demapping for 2^(2n+1)-QAM.
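
    In the one-dimensional PAM case that independent I/Q mapping reduces to, the bit LLR under Gaussian noise is the log of a ratio of sums of symbol likelihoods. A minimal sketch (mine, not the paper's scheme), with an assumed Gray labelling of 4-PAM levels:

      import numpy as np

      levels = np.array([-3.0, -1.0, 1.0, 3.0])   # 4-PAM amplitudes (one QAM axis)
      bit_maps = {0: np.array([0, 0, 1, 1]),      # assumed Gray labels, bit 0
                  1: np.array([0, 1, 1, 0])}      # assumed Gray labels, bit 1

      def pam_llr(y, bit, sigma2=0.5):
          """LLR(b) = log(sum_{s: b=0} e^{-(y-s)^2/2s2} / sum_{s: b=1} ...)."""
          metric = np.exp(-(y - levels) ** 2 / (2 * sigma2))
          s0 = metric[bit_maps[bit] == 0].sum()   # symbols whose label has b = 0
          s1 = metric[bit_maps[bit] == 1].sum()   # symbols whose label has b = 1
          return np.log(s0 / s1)

      print(pam_llr(0.7, 0), pam_llr(0.7, 1))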

  19. BLANCO MOUNTAIN AND BLACK CANYON ROADLESS AREAS, CALIFORNIA.

    USGS Publications Warehouse

    Diggles, Michael F.; Rains, Richard L.

    1984-01-01

    The mineral survey of the Blanco Mountain and Black Canyon Roadless Areas, California indicated that areas of probable and substantiated mineral-resource potential exist only in the Black Canyon Roadless Area. Gold, with moderate amounts of lead, silver, zinc, and tungsten, occurs in vein deposits and in tactite. The nature of the geological terrain indicates little likelihood of the occurrence of energy resources in the roadless areas. Detailed geologic mapping might better define the extent of gold mineralization. Detailed stream-sediment sampling and analysis of heavy-mineral concentrations could better define the tungsten resource potential.

  20. NASA/BLM APT, phase 2. Volume 2: Technology demonstration. [Arizona

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Techniques described include: (1) steps in the preprocessing of LANDSAT data; (2) the training of a classifier; (3) maximum likelihood classification and precision; (4) geometric correction; (5) class description; (6) digitizing; (7) digital terrain data; (8) an overview of sample design; (9) allocation and selection of primary sample units; (10) interpretation of secondary sample units; (11) data collection ground plots; (12) data reductions; (13) analysis for productivity estimation and map verification; (14) cost analysis; and (15) LANDSAT digital products. The evaluation of the pre-inventory planning for P.J. is included.

  1. A 1.8-Mb YAC contig in Xp11.23: identification of CpG islands and physical mapping of CA repeats in a region of high gene density.

    PubMed

    Coleman, M P; Németh, A H; Campbell, L; Raut, C P; Weissenbach, J; Davies, K E

    1994-05-15

    The genes ARAF1, SYN1, TIMP, and PFC are clustered within 70 kb of one another, and, as reported in the accompanying paper (J. Knight et al., 1994, Genomics 21: 180-187), at least four more genes map within 400 kb: a cluster of Krüppel-type zinc finger genes (including ZNF21, ZNF41, and ZNF81) and ELK-1, a member of the ets oncogene superfamily. This gene-rich region is of particular interest because of the large number of disease genes mapping to Xp11.23: at least three eye diseases (retinitis pigmentosa type 2, congenital stationary night blindness CSNB1, and Aland Island eye disease), Wiskott-Aldrich syndrome, X-linked nephrolithiasis, and a translocation breakpoint associated with synovial sarcoma. We have constructed a 1.8-Mb YAC contig in this region, confirming the link between TIMP and OATL1 reported by Knight et al. (1994) and extending the map in the distal direction. To investigate the likelihood that more genes are located within this region, we have carried out detailed mapping of rare-cutter restriction sites in these YACs and identified seven CpG islands. At least six of these islands are located over 50 kb from any known gene locations, suggesting that the region contains at least this many as yet unidentified genes. We have also mapped the physical locations of six highly polymorphic CA repeats within the contig, thus integrating the physical, genetic, and transcriptional maps of the region and facilitating the mapping and identification of disease genes.(ABSTRACT TRUNCATED AT 250 WORDS)

  2. EMSAR: estimation of transcript abundance from RNA-seq data by mappability-based segmentation and reclustering.

    PubMed

    Lee, Soohyun; Seo, Chae Hwa; Alver, Burak Han; Lee, Sanghyuk; Park, Peter J

    2015-09-03

    RNA-seq has been widely used for genome-wide expression profiling. RNA-seq data typically consists of tens of millions of short sequenced reads from different transcripts. However, due to sequence similarity among genes and among isoforms, the source of a given read is often ambiguous. Existing approaches for estimating expression levels from RNA-seq reads tend to compromise between accuracy and computational cost. We introduce a new approach for quantifying transcript abundance from RNA-seq data. EMSAR (Estimation by Mappability-based Segmentation And Reclustering) groups reads according to the set of transcripts to which they are mapped and finds maximum likelihood estimates using a joint Poisson model for each optimal set of segments of transcripts. The method uses nearly all mapped reads, including those mapped to multiple genes. With an efficient transcriptome indexing based on modified suffix arrays, EMSAR minimizes the use of CPU time and memory while achieving accuracy comparable to the best existing methods. EMSAR is a method for quantifying transcripts from RNA-seq data with high accuracy and low computational cost. EMSAR is available at https://github.com/parklab/emsar.
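
    The joint Poisson step can be sketched directly (my illustration, not EMSAR): reads are grouped by the set of transcripts they map to, the expected count of each group is a linear function of transcript abundances, and abundances are found by maximizing the joint Poisson likelihood. The toy design matrix of mappable fractions is an assumption.

      import numpy as np
      from scipy.optimize import minimize

      # Two transcripts, three read groups: unique to T1, unique to T2, shared.
      A = np.array([[1.0, 0.0],      # group 1 draws only from transcript 1
                    [0.0, 1.0],      # group 2 draws only from transcript 2
                    [0.5, 0.8]])     # shared group: effective mappable fractions
      counts = np.array([120.0, 300.0, 260.0])

      def neg_log_lik(log_theta):
          theta = np.exp(log_theta)                  # abundances, kept positive
          mu = A @ theta                             # expected group counts
          return np.sum(mu - counts * np.log(mu))    # Poisson NLL up to constants

      fit = minimize(neg_log_lik, x0=np.log([100.0, 100.0]), method="Nelder-Mead")
      print(np.exp(fit.x))                           # ML abundance estimates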

  3. Evaluation of potential surface rupture and review of current seismic hazards program at the Los Alamos National Laboratory. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-12-09

    This report summarizes the authors' review and evaluation of the existing seismic hazards program at Los Alamos National Laboratory (LANL). The report recommends that the original program be augmented with a probabilistic analysis of seismic hazards involving assignment of weighted probabilities of occurrence to all potential sources. This approach yields a more realistic evaluation of the likelihood of large earthquake occurrence, particularly in regions where seismic sources may have recurrence intervals of several thousand years or more. The report reviews the locations and geomorphic expressions of identified fault lines along with the known displacements of these faults and the last known occurrence of seismic activity. Faults are mapped and categorized by their potential for actual movement. Based on geologic site characterization, recommendations are made for increased seismic monitoring; age-dating studies of faults and geomorphic features; increased use of remote sensing and aerial photography for surface mapping of faults; the development of a landslide susceptibility map; and the development of seismic design standards for all existing and proposed facilities at LANL.

  4. Object based technique for delineating and mapping 15 tree species using VHR WorldView-2 imagery

    NASA Astrophysics Data System (ADS)

    Mustafa, Yaseen T.; Habeeb, Hindav N.

    2014-10-01

    Monitoring and analyzing forests and trees are required tasks for managing and establishing a good plan for forest sustainability. To achieve such tasks, information and data collection on the trees are required. The fastest and relatively low-cost technique is satellite remote sensing. In this study, we propose an approach to identify and map 15 tree species in the Mangish sub-district, Kurdistan Region-Iraq. Image-objects (IOs) were used as the tree species mapping unit. This is achieved using the shadow index, normalized difference vegetation index, and texture measurements. Four classification methods (Maximum Likelihood, Mahalanobis Distance, Neural Network, and Spectral Angle Mapper) were used to classify IOs using selected IO features derived from WorldView-2 imagery. Results showed that overall accuracy increased by 5-8% using the Neural Network method compared with the other methods, with a Kappa coefficient of 69%. This technique gives reasonable results for various tree species classifications by applying the Neural Network method with IO techniques to WorldView-2 imagery.

  5. Vegetation mapping of Nowitna National Wildlife Refuge, Alaska using Landsat MSS digital data

    USGS Publications Warehouse

    Talbot, S. S.; Markon, Carl J.

    1986-01-01

    A Landsat-derived vegetation map was prepared for Nowitna National Wildlife Refuge. The refuge lies within the middle boreal subzone of north central Alaska. Seven major vegetation classes and sixteen subclasses were recognized: forest (closed needleleaf, open needleleaf, needleleaf woodland, mixed, and broadleaf); broadleaf scrub (lowland, alluvial, subalpine); dwarf scrub (prostrate dwarf shrub tundra, dwarf shrub-graminoid tussock peatland); herbaceous (graminoid bog, marsh and meadow); scarcely vegetated areas (scarcely vegetated scree and floodplain); water (clear, turbid); and other areas (mountain shadow). The methodology employed a cluster-block technique. Sample areas were described based on a combination of helicopter-ground survey, aerial photointerpretation, and digital Landsat data. Major steps in the Landsat analysis involved preprocessing (geometric correction), derivation of statistical parameters for spectral classes, spectral class labeling of sample areas, preliminary classification of the entire study area using a maximum-likelihood algorithm, and final classification utilizing ancillary information such as digital elevation data. The final product is a 1:250,000-scale vegetation map representative of distinctive regional patterns and suitable for use in comprehensive conservation planning.

  6. Markov-random-field-based super-resolution mapping for identification of urban trees in VHR images

    NASA Astrophysics Data System (ADS)

    Ardila, Juan P.; Tolpekin, Valentyn A.; Bijker, Wietske; Stein, Alfred

    2011-11-01

    Identification of tree crowns from remote sensing requires detailed spectral information and submeter spatial resolution imagery. Traditional pixel-based classification techniques do not fully exploit the spatial and spectral characteristics of remote sensing datasets. We propose a contextual and probabilistic method for detection of tree crowns in urban areas using a Markov-random-field-based super-resolution mapping (SRM) approach in very high resolution images. Our method defines an objective energy function in terms of the conditional probabilities of the panchromatic and multispectral images and locally optimizes the labeling of tree crown pixels. Energy and model parameter values are estimated from multiple implementations of SRM in tuning areas, and the method is applied to QuickBird images to produce a 0.6 m tree crown map for a city in The Netherlands. The SRM output shows an identification rate of 66%, with commission and omission errors occurring in small trees and shrub areas. The method outperforms tree crown identification results obtained with maximum likelihood, support vector machines, and SRM at nominal resolution (2.4 m).

  7. Terrain Classification on Venus from Maximum-Likelihood Inversion of Parameterized Models of Topography, Gravity, and their Relation

    NASA Astrophysics Data System (ADS)

    Eggers, G. L.; Lewis, K. W.; Simons, F. J.; Olhede, S.

    2013-12-01

    Venus does not possess a plate-tectonic system like that observed on Earth, and many surface features--such as tesserae and coronae--lack terrestrial equivalents. To understand Venus' tectonics is to understand its lithosphere, requiring a study of topography and gravity, and how they relate. Past studies of topography dealt with mapping and classification of visually observed features, and studies of gravity dealt with inverting the relation between topography and gravity anomalies to recover surface density and elastic thickness in either the space (correlation) or the spectral (admittance, coherence) domain. In the former case, geological features could be delineated but not classified quantitatively. In the latter case, rectangular or circular data windows were used, lacking geological definition. While the estimates of lithospheric strength on this basis were quantitative, they lacked robust error estimates. Here, we remapped the surface into 77 regions visually and qualitatively defined from a combination of Magellan topography, gravity, and radar images. We parameterize the spectral covariance of the observed topography, treating it as a Gaussian process assumed to be stationary over the mapped regions, using a three-parameter isotropic Matern model, and perform maximum-likelihood-based inversions for the parameters. We discuss the parameter distribution across the Venusian surface and across terrain types such as coronae, dorsae, and tesserae, and their relation to mean elevation and latitudinal position. We find that the three-parameter model, while mathematically established and applicable to Venus topography, is overparameterized, and thus reduce the results to a two-parameter description of the peak spectral variance and the range-to-half-peak variance (as a function of the wavenumber). With this reduction, the clustering of geological region types in two-parameter space becomes promising. Finally, we perform inversions for the JOINT spectral variance of topography and gravity, in which the INITIAL loading by topography retains the Matern form but the FINAL topography and gravity are the result of flexural compensation. In our modeling, we pay explicit attention to finite-field spectral estimation effects (and their remedy via tapering), and to the implementation of statistical tests (for anisotropy, for initial-loading process correlation, to ascertain the proper density contrasts and interface depth in a two-layer model), robustness assessment and uncertainty quantification, as well as to algorithmic intricacies related to low-dimensional but poorly scaled maximum-likelihood inversions. We conclude that Venusian geomorphic terrains are well described by their 2-D topographic and gravity (cross-)power spectra, and that the spectral properties of distinct geologic provinces on Venus are worth quantifying via maximum-likelihood-based methods under idealized three-parameter Matern distributions. Analysis of fitted parameters and the fitted-data residuals reveals natural variability in the (sub)surface properties on Venus, as well as some directional anisotropy. Geologic regions tend to cluster according to terrain type in our parameter space, which we analyze to confirm their shared geologic histories and utilize for guidance in ongoing mapping efforts of Venus and other terrestrial bodies.
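    As a sketch of the kind of likelihood such an inversion maximizes, the snippet below fits a three-parameter isotropic Matern covariance (variance, range, smoothness) to demeaned samples by minimizing a Gaussian negative log-likelihood. It works in the space domain for brevity; the study's spectral-domain estimation, tapering, and joint topography-gravity modeling are not reproduced, and all names and toy data are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist
from scipy.special import gamma, kv

def matern_cov(d, sigma2, rho, nu):
    """Isotropic Matern covariance at distances d (variance, range, smoothness)."""
    d = np.where(d == 0, 1e-10, d)
    s = np.sqrt(2 * nu) * d / rho
    return sigma2 * (2 ** (1 - nu) / gamma(nu)) * s ** nu * kv(nu, s)

def negloglik(theta, coords, z):
    """Gaussian negative log-likelihood (up to a constant) of demeaned samples z."""
    sigma2, rho, nu = np.exp(theta)          # log-parameters keep all three positive
    K = matern_cov(cdist(coords, coords), sigma2, rho, nu)
    np.fill_diagonal(K, sigma2 + 1e-8)       # exact variance plus a small nugget
    _, logdet = np.linalg.slogdet(K)
    return 0.5 * (logdet + z @ np.linalg.solve(K, z))

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(80, 2))   # toy sample locations
z = rng.normal(size=80)                      # stand-in for demeaned topography
res = minimize(negloglik, np.log([1.0, 30.0, 1.5]), args=(coords, z),
               method='Nelder-Mead')
print(np.exp(res.x))                         # fitted (variance, range, smoothness)
```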

  8. Phylogenetic Relationships of American Willows (Salix L., Salicaceae)

    PubMed Central

    Lauron-Moreau, Aurélien; Pitre, Frédéric E.; Argus, George W.; Labrecque, Michel; Brouillet, Luc

    2015-01-01

    Salix L. is the largest genus in the family Salicaceae (450 species). Several classifications have been published, but taxonomic subdivision has been under continuous revision. Our goal is to establish the phylogenetic structure of the genus using molecular data on all American willows, using three DNA markers. This complete phylogeny of American willows allows us to propose a biogeographic framework for the evolution of the genus. Material was obtained for the 122 native and introduced willow species of America. Sequences were obtained from the ITS (ribosomal nuclear DNA) and two plastid regions, matK and rbcL. Phylogenetic analyses (parsimony, maximum likelihood, Bayesian inference) were performed on the data. Geographic distribution was mapped onto the tree. The species tree provides strong support for a division of the genus into two subgenera, Salix and Vetrix. Subgenus Salix comprises temperate species from the Americas and Asia, and their disjunction may result from Tertiary events. Subgenus Vetrix is composed of boreo-arctic species of the Northern Hemisphere and their radiation may coincide with the Quaternary glaciations. Sixteen species have ambiguous positions; genetic diversity is lower in subg. Vetrix. A molecular phylogeny of all species of American willows has been inferred. It needs to be tested and further resolved using other molecular data. Nonetheless, the genus clearly has two clades that have distinct biogeographic patterns. PMID:25880993

  9. node2vec: Scalable Feature Learning for Networks

    PubMed Central

    Grover, Aditya; Leskovec, Jure

    2016-01-01

    Prediction tasks over nodes and edges in networks require careful effort in engineering features used by learning algorithms. Recent research in the broader field of representation learning has led to significant progress in automating prediction by learning the features themselves. However, present feature learning approaches are not expressive enough to capture the diversity of connectivity patterns observed in networks. Here we propose node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks. In node2vec, we learn a mapping of nodes to a low-dimensional space of features that maximizes the likelihood of preserving network neighborhoods of nodes. We define a flexible notion of a node’s network neighborhood and design a biased random walk procedure, which efficiently explores diverse neighborhoods. Our algorithm generalizes prior work which is based on rigid notions of network neighborhoods, and we argue that the added flexibility in exploring neighborhoods is the key to learning richer representations. We demonstrate the efficacy of node2vec over existing state-of-the-art techniques on multi-label classification and link prediction in several real-world networks from diverse domains. Taken together, our work represents a new way for efficiently learning state-of-the-art task-independent representations in complex networks. PMID:27853626
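    A minimal sketch of the biased second-order random walk at the core of node2vec, assuming an unweighted networkx graph; the alias-sampling speedups and the skip-gram optimization of the full algorithm are omitted.

```python
import random
import networkx as nx

def biased_walk(G, start, length, p=1.0, q=1.0):
    """One node2vec-style walk; p is the return parameter, q the in-out parameter."""
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        nbrs = list(G.neighbors(cur))
        if not nbrs:
            break
        if len(walk) == 1:
            walk.append(random.choice(nbrs))
            continue
        prev = walk[-2]
        weights = []
        for nxt in nbrs:
            if nxt == prev:                # distance 0: return to previous node
                weights.append(1.0 / p)
            elif G.has_edge(nxt, prev):    # distance 1: stays close (BFS-like)
                weights.append(1.0)
            else:                          # distance 2: moves outward (DFS-like)
                weights.append(1.0 / q)
        walk.append(random.choices(nbrs, weights=weights)[0])
    return walk

G = nx.karate_club_graph()
print(biased_walk(G, start=0, length=10, p=1.0, q=0.5))
```

    Walks generated this way are then fed to a skip-gram model (e.g., gensim's Word2Vec) to learn the node embeddings.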

  10. The Nonmetro Labor Force in the Seventies.

    ERIC Educational Resources Information Center

    Schaub, James D.

    The report identifies structural changes and trends in the composition of the nonmetro labor force between 1973 and 1979; evaluates the labor force performance by race, sex, and age; and suggests underlying causes of the major changes and the likelihood of particular trends continuing into the eighties. Tabular data indicate that: (1) metro and…

  11. Resilience Development of Preservice Teachers in Urban Schools

    ERIC Educational Resources Information Center

    Roselle, Rene

    2007-01-01

    Retention of teachers in urban schools continues to plague public schools. Could universities increase the likelihood that teachers will stay in urban schools longer by preparing them for some of the adversities they may face and helping them develop resilience in relation to these challenges? Could we produce resilient educators before they…

  12. Insecure Attachment Patterns at Five Years. What Do They Tell Us?

    ERIC Educational Resources Information Center

    Priddis, Lynn; Howieson, Noel D.

    2012-01-01

    Developmental outcomes for children whose primary caregivers are misattuned but not considered abusive are unclear. This paper argues that if by the pre-school years, insecure patterns of attachment are evident then a continuing dysfunctional attachment relationship is indicated and the likelihood of later difficulties is increased. The current…

  13. Adolescent Substance-Use Frequency following Self-Help Group Attendance and Outpatient Substance Abuse Treatment

    ERIC Educational Resources Information Center

    Gangi, Jennifer; Darling, Carol A.

    2012-01-01

    Despite the heterogeneity of posttreatment outcomes, the likelihood of relapse is often dependent on several factors, including participation in continuing care services such as self-help groups. However, few studies have examined the use of self-help groups among adolescent outpatients. Therefore, in this study, investigators examined self-help…

  14. Safety Study: The Performance and use of Child Restraint Systems, Seatbelts, and Air Bags for Children in Passenger Vehicles. Volume 1:Analysis

    DOT National Transportation Integrated Search

    1996-01-01

    Despite the effectiveness of child restraints and lap/shoulder belts to reduce the likelihood of severe and fatal injuries, accidents continue to occur in which restrained children are being injured and killed. The Safety Board conducted this study t...

  15. Effect of Performance Feedback on Perceived Knowledge and Likelihood to Pursue Continuing Education

    ERIC Educational Resources Information Center

    Eberman, Lindsey E.; Tripp, Brady L.

    2011-01-01

    Context: For practicing health care professionals, waiting for a teachable moment to identify a gap in knowledge could prove critical. Other methods are needed to help health care professionals identify their knowledge gaps. Objective: To assess the effect of performance feedback on Athletic Trainers' (AT) perceived knowledge (PK) and likelihood…

  16. Reassessing Education's Role within the Global Village: Where in the World Are We?

    ERIC Educational Resources Information Center

    Stohrer, Freda F.

    The rapid technological advancement that U.S. society is experiencing has increased the likelihood that workers will have to be trained and retrained throughout their working lives to meet continually changing job requirements. This situation has challenged instructors of technical communication who are faced with the overlap of traditional…

  17. Concussion reporting, sex, and conformity to traditional gender norms in young adults.

    PubMed

    Kroshus, Emily; Baugh, Christine M; Stein, Cynthia J; Austin, S Bryn; Calzo, Jerel P

    2017-01-01

    This study assessed whether between-sex differences in concussion reporting intention and behavior among young adults are explained by the extent to which the individual conforms to traditional masculine norms that often characterize contemporary sport culture. A survey of college athletes in the United States (n = 328) found greater symptom reporting intention among females as compared to males, but no difference in their likelihood of continued play while experiencing symptoms of a possible concussion. Greater conformity to the norms of risk-taking was associated with greater likelihood of continued play while symptomatic among female athletes but not among male athletes. These findings suggest that gendered behavior, rather than biologically determined sex, is an important consideration for concussion safety in this age group. Addressing elements of the contemporary sport ethos that reinforce risk taking in service of athletic achievement may be a relevant direction for interventions aimed at improving injury reporting among all athletes. Copyright © 2016 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  18. Flood mapping in ungauged basins using fully continuous hydrologic-hydraulic modeling

    NASA Astrophysics Data System (ADS)

    Grimaldi, Salvatore; Petroselli, Andrea; Arcangeletti, Ettore; Nardi, Fernando

    2013-04-01

    In this work, a fully-continuous hydrologic-hydraulic modeling framework for flood mapping is introduced and tested. It is characterized by a simulation of a long rainfall time series at sub-daily resolution that feeds a continuous rainfall-runoff model producing a discharge time series that is directly given as an input to a bi-dimensional hydraulic model. The main advantage of the proposed approach is to avoid the use of the design hyetograph and the design hydrograph that constitute the main source of subjective analysis and uncertainty for standard methods. The proposed procedure is optimized for small and ungauged watersheds where empirical models are commonly applied. Results of a simple real case study confirm that this experimental fully-continuous framework may pave the way for the implementation of a less subjective and potentially automated procedure for flood hazard mapping.

  19. Revision of an automated microseismic location algorithm for DAS - 3C geophone hybrid array

    NASA Astrophysics Data System (ADS)

    Mizuno, T.; LeCalvez, J.; Raymer, D.

    2017-12-01

    Application of distributed acoustic sensing (DAS) has been studied in several areas in seismology. One of these areas is microseismic reservoir monitoring (e.g., Molteni et al., 2017, First Break). Considering the present limitations of DAS, which include relatively low signal-to-noise ratio (SNR) and no 3C polarization measurements, a DAS - 3C geophone hybrid array is a practical option when using a single monitoring well. Considering the large volume of data from distributed sensing, microseismic event detection and location using a source-scanning type algorithm is a reasonable choice, especially for real-time monitoring. The algorithm must handle both strain rate along the borehole axis for DAS and particle velocity for 3C geophones. Only a small number of high-SNR events will be detected throughout a large aperture encompassing the hybrid array; therefore, the aperture is to be optimized dynamically to eliminate noisy channels for the majority of events. For such a hybrid array, coalescence microseismic mapping (CMM) (Drew et al., 2005, SPE) was revised. CMM forms a likelihood function of event location and origin time. At each receiver, a time function of event arrival likelihood is inferred using an SNR function, and it is migrated in time and space to determine hypocenter and origin-time likelihood. This algorithm was revised to dynamically optimize such a hybrid array by identifying receivers where a microseismic signal is possibly detected and using only those receivers to compute the likelihood function. Currently, peak SNR is used to select receivers. To prevent false results due to a small aperture, a minimum aperture threshold is employed. The algorithm refines the location likelihood using 3C geophone polarization. We tested this algorithm using a ray-based synthetic dataset. The method of Leaney (2014, PhD thesis, UBC) is used to compute particle velocity at the receivers. Strain rate along the borehole axis is computed from particle velocity as DAS microseismic synthetic data. The likelihood function formed by both DAS and geophone data behaves as expected, with the aperture dynamically selected depending on the SNR of the event. We conclude that this algorithm can be successfully applied to such hybrid arrays to monitor microseismic activity. A study using a recently acquired dataset is planned.
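    The following toy sketch shows the source-scanning idea behind CMM: per-receiver arrival-likelihood traces are shifted by modeled travel times and stacked over a grid of candidate locations, so the maximum over (source, origin time) picks the hypocenter. The dynamic aperture selection, SNR weighting, and polarization refinement described above are omitted, and all array shapes are assumptions.

```python
import numpy as np

def cmm_stack(arrival_like, travel_times, dt):
    """Toy coalescence stack. arrival_like: (n_rec, n_samp) per-receiver
    arrival-likelihood traces; travel_times: (n_src, n_rec) modeled travel
    times from each candidate source point to each receiver. Returns an
    (n_src, n_samp) grid of summed log-likelihood vs. origin-time sample."""
    n_rec, n_samp = arrival_like.shape
    loglike = np.log(arrival_like + 1e-12)
    shifts = np.round(travel_times / dt).astype(int)
    out = np.zeros((travel_times.shape[0], n_samp))
    for i in range(travel_times.shape[0]):
        for r in range(n_rec):
            s = shifts[i, r]
            shifted = np.full(n_samp, np.log(1e-12))  # pad beyond trace end
            if 0 <= s < n_samp:
                shifted[:n_samp - s] = loglike[r, s:]
            out[i] += shifted
    return out  # argmax over (source, time) -> hypocenter and origin time
```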

  20. Effects of arm elevation on radial artery pressure: a new method to distinguish hypovolemic shock and septic shock from hypotension.

    PubMed

    Xie, Zhiyi; Zhang, Zhenyu; Xu, Yuan; Zhou, Hua; Wu, Sheng; Wang, Zhong

    2018-06-01

    In this prospective observational study, we investigated the variability in radial artery invasive blood pressure associated with arm elevation in patients with different hemodynamic types. We carried out a prospective observational study using data from 73 general anesthesia hepatobiliary postoperative adult patients admitted to an ICU over a 1-year period. A standard procedure was used for the arm elevation test. The value of invasive radial arterial pressure was recorded at baseline, and 30 and 60 s after the arm had been raised from 0° to 90°. We compared the blood pressure before versus after arm elevation, and between hemodynamically stable, hypovolemic shock, and septic shock patient groups. In all 73 patients, systolic arterial pressure (SAP) decreased, diastolic arterial pressure (DAP) increased, and pulse pressure (PP) decreased at 30 and 60 s after arm elevation (P<0.01), but the mean arterial pressure (MAP) was unchanged (P>0.05). On comparing 30 and 60 s, there was no significant difference in SAP, DAP, PP, or MAP (P>0.05). In 40 hemodynamically stable patients, SAP and PP decreased, and DAP and MAP increased significantly at 30 and 60 s after arm elevation compared with baseline (P<0.01). In 16 hypovolemic patients, SAP, DAP, and MAP increased significantly compared with baseline at 30 and 60 s (P<0.01), but PP was unchanged (P>0.05). In 17 patients with septic shock, SAP, PP, and MAP decreased significantly versus baseline at 30 and 60 s (P<0.01), but DAP was unchanged (P>0.05). Comparison of the absolute value of pressure change of septic shock patients at 30 s after raising the arm showed that SAP, DAP, and MAP changes were significantly lower compared with those in hypovolemic shock and hemodynamically stable patients (P<0.01). The area under the receiver operating characteristic curve for predicting septic shock using the SAP change 30 s after arm elevation was 0.930 [95% confidence interval (CI): 0.867-0.992, P<0.001]. The best cut-off point for the SAP change value was -5 mmHg or less, with a sensitivity of 94.12%, a specificity of 80.36%, a positive likelihood ratio of 4.79 (95% CI: 2.8-8.2), and a negative likelihood ratio of 0.073 (95% CI: 0.01-0.5). Our study shows that hypovolemic shock and septic shock patients have significantly different radial artery invasive blood pressure changes in an arm elevation test, which could be applied as a new method to distinguish hypovolemic shock and septic shock from hypotension.
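    For reference, the reported likelihood ratios follow directly from the cut-off's sensitivity and specificity; a quick check, assuming the usual definitions:

```python
# LR+ = sensitivity / (1 - specificity); LR- = (1 - sensitivity) / specificity
sens, spec = 0.9412, 0.8036
lr_pos = sens / (1 - spec)        # -> 4.79, matching the reported value
lr_neg = (1 - sens) / spec        # -> 0.073, matching the reported value
print(round(lr_pos, 2), round(lr_neg, 3))
```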

  1. Mapping quantitative trait loci controlling milk production in dairy cattle by exploiting progeny testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Georges, M.; Nielsen, D.; Mackinnon, M.

    1995-02-01

    We have exploited "progeny testing" to map quantitative trait loci (QTL) underlying the genetic variation of milk production in a selected dairy cattle population. A total of 1,518 sires, with progeny tests based on the milking performances of >150,000 daughters jointly, was genotyped for 159 autosomal microsatellites bracketing 1645 centimorgan or approximately two thirds of the bovine genome. Using a maximum likelihood multilocus linkage analysis accounting for variance heterogeneity of the phenotypes, we identified five chromosomes giving very strong evidence (LOD score ≥ 3) for the presence of a QTL controlling milk production: chromosomes 1, 6, 9, 10 and 20. These findings demonstrate that loci with considerable effects on milk production are still segregating in highly selected populations and pave the way toward marker-assisted selection in dairy cattle breeding. 44 refs., 4 figs., 3 tabs.

  2. Potts glass reflection of the decoding threshold for qudit quantum error correcting codes

    NASA Astrophysics Data System (ADS)

    Jiang, Yi; Kovalev, Alexey A.; Pryadko, Leonid P.

    We map the maximum likelihood decoding threshold for qudit quantum error correcting codes to the multicritical point in generalized Potts gauge glass models, extending the map constructed previously for qubit codes. An n-qudit quantum LDPC code, where a qudit can be involved in up to m stabilizer generators, corresponds to a ℤ_d Potts model with n interaction terms which can couple up to m spins each. We analyze general properties of the phase diagram of the constructed model, give several bounds on the location of the transitions, bounds on the energy density of extended defects (non-local analogs of domain walls), and discuss the correlation functions which can be used to distinguish different phases in the original and the dual models. This research was supported in part by the Grants: NSF PHY-1415600 (AAK), NSF PHY-1416578 (LPP), and ARO W911NF-14-1-0272 (LPP).

  3. A family of chaotic pure analog coding schemes based on baker's map function

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Li, Jing; Lu, Xuanxuan; Yuen, Chau; Wu, Jun

    2015-12-01

    This paper considers a family of pure analog coding schemes constructed from dynamic systems which are governed by chaotic functions: the baker's map function and its variants. Various decoding methods, including maximum likelihood (ML), minimum mean square error (MMSE), and mixed ML-MMSE decoding algorithms, have been developed for these novel encoding schemes. The proposed mirrored baker's and single-input baker's analog codes achieve a balanced protection against fold error (large distortion) and weak distortion, and outperform the classical chaotic analog coding and analog joint source-channel coding schemes in the literature. Compared to the conventional digital communication system, where quantization and digital error correction codes are used, the proposed analog coding system has graceful performance evolution, low decoding latency, and no quantization noise. Numerical results show that under the same bandwidth expansion, the proposed analog system outperforms the digital ones over a wide signal-to-noise ratio (SNR) range.
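    A toy illustration of the underlying dynamics, assuming the standard baker's map on the unit square: a single analog source sample is spread over several transmitted analog symbols by iterating the map. The mirrored and single-input variants and the ML/MMSE decoders from the paper are not reproduced here.

```python
import numpy as np

def bakers_map(x, y):
    """One iteration of the standard baker's map on the unit square."""
    if x < 0.5:
        return 2 * x, y / 2
    return 2 * x - 1, (y + 1) / 2

def encode(source, n_symbols):
    """Toy analog encoder: spread one source sample in [0, 1) over
    n_symbols transmitted analog values by iterating the map."""
    x, y = source, 0.0
    out = []
    for _ in range(n_symbols):
        x, y = bakers_map(x, y)
        out.append(x)              # transmit the expanding coordinate
    return np.array(out)

print(encode(0.3, 5))
```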

  4. CSGRqtl: A Comparative Quantitative Trait Locus Database for Saccharinae Grasses.

    PubMed

    Zhang, Dong; Paterson, Andrew H

    2017-01-01

    Conventional biparental quantitative trait locus (QTL) mapping has led to some successes in the identification of causal genes in many organisms. QTL likelihood intervals not only provide "prior information" for finer-resolution approaches such as GWAS but also provide better statistical power than GWAS to detect variants with low/rare frequency in a natural population. Here, we describe a new element of an ongoing effort to provide online resources to facilitate study and improvement of the important Saccharinae clade. The primary goal of this new resource is the anchoring of published QTLs for this clade to the Sorghum genome. Genetic map alignments translate a wealth of genomic information from sorghum to Saccharum spp., Miscanthus spp., and other taxa. In addition, genome alignments facilitate comparison of the Saccharinae QTL sets to those of other taxa that enjoy comparable resources, exemplified herein by rice.

  5. Exploring the clinical potential of an automatic colonic polyp detection method based on the creation of energy maps.

    PubMed

    Fernández-Esparrach, Glòria; Bernal, Jorge; López-Cerón, Maria; Córdova, Henry; Sánchez-Montes, Cristina; Rodríguez de Miguel, Cristina; Sánchez, Francisco Javier

    2016-09-01

    Polyp miss-rate is a drawback of colonoscopy that increases significantly for small polyps. We explored the efficacy of an automatic computer-vision method for polyp detection. Our method relies on a model that defines polyp boundaries as valleys of image intensity. Valley information is integrated into energy maps that represent the likelihood of the presence of a polyp. In 24 videos containing polyps from routine colonoscopies, all polyps were detected in at least one frame. The mean of the maximum values on the energy map was higher for frames with polyps than without (P < 0.001). Performance improved in high quality frames (AUC = 0.79 [95 %CI 0.70 - 0.87] vs. 0.75 [95 %CI 0.66 - 0.83]). With 3.75 set as the maximum threshold value, sensitivity and specificity for the detection of polyps were 70.4 % (95 %CI 60.3 % - 80.8 %) and 72.4 % (95 %CI 61.6 % - 84.6 %), respectively. Energy maps performed well for colonic polyp detection, indicating their potential applicability in clinical practice. © Georg Thieme Verlag KG Stuttgart · New York.

  6. Inverting ion images without Abel inversion: maximum entropy reconstruction of velocity maps.

    PubMed

    Dick, Bernhard

    2014-01-14

    A new method for the reconstruction of velocity maps from ion images is presented, which is based on the maximum entropy concept. In contrast to other methods used for Abel inversion the new method never applies an inversion or smoothing to the data. Instead, it iteratively finds the map which is the most likely cause for the observed data, using the correct likelihood criterion for data sampled from a Poissonian distribution. The entropy criterion minimizes the information content in this map, which hence contains no information for which there is no evidence in the data. Two implementations are proposed, and their performance is demonstrated with simulated and experimental data: Maximum Entropy Velocity Image Reconstruction (MEVIR) obtains a two-dimensional slice through the velocity distribution and can be compared directly to Abel inversion. Maximum Entropy Velocity Legendre Reconstruction (MEVELER) finds one-dimensional distribution functions Q_l(v) in an expansion of the velocity distribution in Legendre polynomials P_l(cos θ) for the angular dependence. Both MEVIR and MEVELER can be used for the analysis of ion images with intensities as low as 0.01 counts per pixel, with MEVELER performing significantly better than MEVIR for images with low intensity. Both methods perform better than pBASEX, in particular for images with less than one average count per pixel.

  7. Correlation of diffusion and perfusion MRI with Ki-67 in high-grade meningiomas.

    PubMed

    Ginat, Daniel T; Mangla, Rajiv; Yeaney, Gabrielle; Wang, Henry Z

    2010-12-01

    Atypical and anaplastic meningiomas have a greater likelihood of recurrence than benign meningiomas. The risk for recurrence is often estimated using the Ki-67 labeling index. The purpose of this study was to determine the correlation between Ki-67 and regional cerebral blood volume (rCBV) and between Ki-67 and apparent diffusion coefficient (ADC) in atypical and anaplastic meningiomas. A retrospective review of the advanced imaging and immunohistochemical characteristics of atypical and anaplastic meningiomas was performed. The relative minimum ADC, relative maximum rCBV, and specimen Ki-67 index were measured. Pearson's correlation was used to compare these parameters. There were 23 cases with available ADC maps and 20 cases with available rCBV maps. The average Ki-67 among the cases with ADC maps and rCBV maps was 17.6% (range, 5-38%) and 16.7% (range, 3-38%), respectively. The mean minimum ADC ratio was 0.91 (SD, 0.26) and the mean maximum rCBV ratio was 22.5 (SD, 7.9). There was a significant positive correlation between maximum rCBV and Ki-67 (Pearson's correlation, 0.69; p = 0.00038). However, there was no significant correlation between minimum ADC and Ki-67 (Pearson's correlation, -0.051; p = 0.70). Maximum rCBV correlated significantly with Ki-67 in high-grade meningiomas.

  8. US Topo Maps 2014: Program updates and research

    USGS Publications Warehouse

    Fishburn, Kristin A.

    2014-01-01

    The U. S. Geological Survey (USGS) US Topo map program is now in year two of its second three-year update cycle. Since the program was launched in 2009, the product and the production system tools and processes have undergone enhancements that have made the US Topo maps a popular success story. Research and development continues with structural and content product enhancements, streamlined and more fully automated workflows, and the evaluation of a GIS-friendly US Topo GIS Packet. In addition, change detection methodologies are under evaluation to further streamline product maintenance and minimize resource expenditures for production in the future. The US Topo map program will continue to evolve in the years to come, providing traditional map users and Geographic Information System (GIS) analysts alike with a convenient, freely available product incorporating nationally consistent data that are quality assured to high standards.

  9. A forskolin derivative, colforsin daropate hydrochloride, inhibits the decrease in cortical renal blood flow induced by noradrenaline or angiotensin II in anesthetized rats.

    PubMed

    Ogata, Junichi; Minami, Kouichiro; Segawa, Kayoko; Uezono, Yasuhito; Shiraishi, Munehiro; Yamamoto, Chikako; Sata, Takeyoshi; Sung-Teh, Kim; Shigematsu, Akio

    2004-01-01

    A forskolin derivative, colforsin daropate hydrochloride (CDH), acts directly on adenylate cyclase to increase intracellular cyclic adenosine monophosphate levels, producing a positive inotropic effect and lowering blood pressure. However, little is known about the effects of CDH on renal function. We used laser Doppler flowmetry to measure the cortical renal blood flow (RBF) in male Wistar rats given a continuous intravenous infusion of CDH and evaluated the effects of CDH on the noradrenaline (NA)- and angiotensin II (AngII)-induced increases in blood pressure and reductions in RBF. Continuous intravenous administration of CDH at 0.25 microg/kg/min did not affect the mean arterial pressure (MAP), but increased heart rate and RBF. Continuous intravenous administration of CDH at high doses (0.5-0.75 microg/kg/min) decreased the MAP, with little effect on the RBF. The administration of exogenous NA (1.7 microg/kg) increased the MAP and decreased the RBF. However, a bolus injection of NA did not decrease the RBF during continuous intravenous administration of CDH, and CDH did not affect the NA-induced increase in MAP. The administration of exogenous AngII (100 ng/kg) increased MAP and decreased RBF and heart rate, but a bolus injection of AngII did not decrease RBF during continuous intravenous administration of CDH. These results suggest that CDH plays a protective role against the pressor effects and the decrease in RBF induced by NA or AngII. Copyright 2004 S. Karger AG, Basel

  10. Detection of breast cancer in automated 3D breast ultrasound

    NASA Astrophysics Data System (ADS)

    Tan, Tao; Platel, Bram; Mus, Roel; Karssemeijer, Nico

    2012-03-01

    Automated 3D breast ultrasound (ABUS) is a novel imaging modality, in which motorized scans of the breasts are made with a wide transducer through a membrane under modest compression. The technology has gained high interest and may become widely used in screening of dense breasts, where sensitivity of mammography is poor. ABUS has a high sensitivity for detecting solid breast lesions. However, reading ABUS images is time consuming, and subtle abnormalities may be missed. Therefore, we are developing a computer aided detection (CAD) system to help reduce reading time and errors. In the multi-stage system we propose, segmentations of the breast and nipple are performed, providing landmarks for the detection algorithm. Subsequently, voxel features characterizing coronal spiculation patterns, blobness, contrast, and locations with respect to landmarks are extracted. Using an ensemble of classifiers, a likelihood map indicating potential malignancies is computed. Local maxima in the likelihood map are determined using a local maxima detector and form a set of candidate lesions in each view. These candidates are further processed in a second detection stage, which includes region segmentation, feature extraction and a final classification. Region segmentation is performed using a 3D spiral-scanning dynamic programming method. Region features include descriptors of shape, acoustic behavior and texture. Performance was determined using a 78-patient dataset with 93 images, including 50 malignant lesions. We used 10-fold cross-validation. Using FROC analysis we found that the system obtains a lesion sensitivity of 60% and 70% at 2 and 4 false positives per image, respectively.
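    The candidate-generation step (local maxima of the likelihood map) can be sketched in a few lines, e.g. with scipy; the window size and threshold below are placeholder assumptions, not the system's settings.

```python
import numpy as np
from scipy import ndimage

def find_candidates(likelihood, threshold=0.5, size=5):
    """Voxel coordinates of thresholded local maxima in a 3-D likelihood map."""
    is_peak = ndimage.maximum_filter(likelihood, size=size) == likelihood
    return np.argwhere(is_peak & (likelihood > threshold))
```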

  11. Deformable MR Prostate Segmentation via Deep Feature Learning and Sparse Patch Matching

    PubMed Central

    Guo, Yanrong; Gao, Yaozong

    2016-01-01

    Automatic and reliable segmentation of the prostate is an important but difficult task for various clinical applications such as prostate cancer radiotherapy. The main challenges for accurate MR prostate localization lie in two aspects: (1) inhomogeneous and inconsistent appearance around prostate boundary, and (2) the large shape variation across different patients. To tackle these two problems, we propose a new deformable MR prostate segmentation method by unifying deep feature learning with the sparse patch matching. First, instead of directly using handcrafted features, we propose to learn the latent feature representation from prostate MR images by the stacked sparse auto-encoder (SSAE). Since the deep learning algorithm learns the feature hierarchy from the data, the learned features are often more concise and effective than the handcrafted features in describing the underlying data. To improve the discriminability of learned features, we further refine the feature representation in a supervised fashion. Second, based on the learned features, a sparse patch matching method is proposed to infer a prostate likelihood map by transferring the prostate labels from multiple atlases to the new prostate MR image. Finally, a deformable segmentation is used to integrate a sparse shape model with the prostate likelihood map for achieving the final segmentation. The proposed method has been extensively evaluated on the dataset that contains 66 T2-weighted prostate MR images. Experimental results show that the deep-learned features are more effective than the handcrafted features in guiding MR prostate segmentation. Moreover, our method shows superior performance compared with other state-of-the-art segmentation methods. PMID:26685226
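    As a simplified stand-in for the label-transfer step, the snippet below fuses atlas patch labels with similarity-derived weights to produce a voxel's prostate likelihood. The paper instead solves a sparse coding problem over the patch dictionary, which this sketch does not attempt; all names and shapes are assumptions.

```python
import numpy as np

def patch_label_fusion(target_patch, atlas_patches, atlas_labels, beta=1.0):
    """Similarity-weighted fusion of atlas patch labels for one voxel.
    target_patch: (d,) flattened intensity patch around the voxel;
    atlas_patches: (n, d); atlas_labels: (n,) prostate labels in {0, 1}."""
    d2 = np.sum((atlas_patches - target_patch) ** 2, axis=1)
    w = np.exp(-beta * d2 / (d2.mean() + 1e-12))      # closer patches weigh more
    return float(np.sum(w * atlas_labels) / w.sum())  # likelihood in [0, 1]
```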

  12. Fast estimation of diffusion tensors under Rician noise by the EM algorithm.

    PubMed

    Liu, Jia; Gasbarra, Dario; Railavo, Juha

    2016-01-15

    Diffusion tensor imaging (DTI) is widely used to characterize, in vivo, the white matter of the central nervous system (CNS). This biological tissue contains much anatomic, structural, and orientational information about fibers in the human brain. Spectral data from the displacement distribution of water molecules located in the brain tissue are collected by a magnetic resonance scanner and acquired in the Fourier domain. After the Fourier inversion, the noise distribution is Gaussian in both real and imaginary parts and, as a consequence, the recorded magnitude data are corrupted by Rician noise. Statistical estimation of diffusion leads to a non-linear regression problem. In this paper, we present a fast computational method for maximum likelihood estimation (MLE) of diffusivities under the Rician noise model based on the expectation-maximization (EM) algorithm. By using data augmentation, we are able to transform a non-linear regression problem into the generalized linear modeling framework, reducing dramatically the computational cost. The Fisher-scoring method is used for achieving fast convergence of the tensor parameter. The new method is implemented and applied using both synthetic and real data over a wide range of b-amplitudes up to 14,000 s/mm(2). Higher accuracy and precision of the Rician estimates are achieved compared with other log-normal based methods. In addition, we extend the maximum likelihood (ML) framework to the maximum a posteriori (MAP) estimation in DTI under the aforementioned scheme by specifying the priors. We describe how numerically close the estimators of model parameters obtained through MLE and MAP estimation are. Copyright © 2015 Elsevier B.V. All rights reserved.
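    To make the EM idea concrete, here is the scalar version of the Rician fixed-point update for a single amplitude with known noise level, using the Bessel-function ratio I1/I0; the paper embeds this in a generalized linear model over diffusion tensors, which is not reproduced here.

```python
import numpy as np
from scipy.special import i0e, i1e

def em_rician_mean(m, sigma, n_iter=50):
    """EM/fixed-point estimate of the underlying amplitude A from Rician
    magnitude samples m, assuming a known noise level sigma."""
    A = max(np.sqrt(max(np.mean(m**2) - 2 * sigma**2, 0.0)), 1e-6)  # moment init
    for _ in range(n_iter):
        z = A * m / sigma**2
        r = i1e(z) / i0e(z)       # I1(z)/I0(z) via scaled Bessels for stability
        A = np.mean(m * r)        # conditional-expectation update
    return A

rng = np.random.default_rng(1)
m = np.abs(3.0 + rng.normal(0, 1.0, 10000) + 1j * rng.normal(0, 1.0, 10000))
print(em_rician_mean(m, 1.0))     # close to the true amplitude 3.0
```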

  13. Semi-automated segmentation of solid and GGO nodules in lung CT images using vessel-likelihood derived from local foreground structure

    NASA Astrophysics Data System (ADS)

    Yaguchi, Atsushi; Okazaki, Tomoya; Takeguchi, Tomoyuki; Matsumoto, Sumiaki; Ohno, Yoshiharu; Aoyagi, Kota; Yamagata, Hitoshi

    2015-03-01

    Reflecting global interest in lung cancer screening, considerable attention has been paid to automatic segmentation and volumetric measurement of lung nodules on CT. Ground glass opacity (GGO) nodules deserve special consideration in this context, since it has been reported that they are more likely to be malignant than solid nodules. However, due to relatively low contrast and indistinct boundaries of GGO nodules, segmentation is more difficult for GGO nodules compared with solid nodules. To overcome this difficulty, we propose a method for accurately segmenting not only solid nodules but also GGO nodules without prior information about nodule types. First, the histogram of CT values in pre-extracted lung regions is modeled by a Gaussian mixture model and a threshold value for including high-attenuation regions is computed. Second, after setting up a region of interest around the nodule seed point, foreground regions are extracted by using the threshold and quick-shift-based mode seeking. Finally, for separating vessels from the nodule, a vessel-likelihood map derived from elongatedness of foreground regions is computed, and a region growing scheme starting from the seed point is applied to the map with the aid of fast marching method. Experimental results using an anthropomorphic chest phantom showed that our method yielded generally lower volumetric measurement errors for both solid and GGO nodules compared with other methods reported in preceding studies conducted using similar technical settings. Also, our method allowed reasonable segmentation of GGO nodules in low-dose images and could be applied to clinical CT images including part-solid nodules.

  14. A Bayesian Estimate of the CMB-Large-scale Structure Cross-correlation

    NASA Astrophysics Data System (ADS)

    Moura-Santos, E.; Carvalho, F. C.; Penna-Lima, M.; Novaes, C. P.; Wuensche, C. A.

    2016-08-01

    Evidence for late-time acceleration of the universe is provided by multiple probes, such as Type Ia supernovae, the cosmic microwave background (CMB), and large-scale structure (LSS). In this work, we focus on the integrated Sachs-Wolfe (ISW) effect, i.e., secondary CMB fluctuations generated by evolving gravitational potentials due to the transition between, e.g., the matter- and dark energy (DE)-dominated phases. Therefore, assuming a flat universe, DE properties can be inferred from ISW detections. We present a Bayesian approach to compute the CMB-LSS cross-correlation signal. The method is based on the estimate of the likelihood for measuring a combined set consisting of a CMB temperature map and a galaxy contrast map, provided that we have some information on the statistical properties of the fluctuations affecting these maps. The likelihood is estimated by a sampling algorithm, therefore avoiding the computationally demanding techniques of direct evaluation in either pixel or harmonic space. As local tracers of the matter distribution at large scales, we used the Two Micron All Sky Survey galaxy catalog and, for the CMB temperature fluctuations, the ninth-year data release of the Wilkinson Microwave Anisotropy Probe (WMAP9). The results show a dominance of cosmic variance over the weak recovered signal, due mainly to the shallowness of the catalog used, with systematics associated with the sampling algorithm playing a secondary role as sources of uncertainty. When combined with other complementary probes, the method presented in this paper is expected to be a useful tool for late-time acceleration studies in cosmology.

  15. Regional Estimates of Drought-Induced Tree Canopy Loss across Texas

    NASA Astrophysics Data System (ADS)

    Schwantes, A.; Swenson, J. J.; González-Roglich, M.; Johnson, D. M.; Domec, J. C.; Jackson, R. B.

    2015-12-01

    The severe drought of 2011 killed millions of trees across the state of Texas. Drought-induced tree mortality can have significant impacts on carbon cycling, regional biophysics, and community composition. We quantified canopy cover loss across the state using remotely sensed imagery from before and after the drought at multiple scales. First, we classified ~200 orthophotos (1-m spatial resolution) from the National Agriculture Imagery Program, using a supervised maximum likelihood classification. The area of canopy cover loss in these classifications was highly correlated (R2 = 0.8) with ground estimates of canopy cover loss, measured in 74 plots across 15 different sites in Texas. These 1-m orthophoto classifications were then used to calibrate and validate coarser-scale (30-m) Landsat imagery to create wall-to-wall tree canopy cover loss maps across the state of Texas. We quantified the percentages of dead and live canopy within each Landsat pixel to create continuous maps of dead and live tree cover, using two approaches: (1) a zero-inflated beta distribution model and (2) a random forest algorithm. Widespread canopy loss occurred across all the major natural systems of Texas, with the Edwards Plateau region most affected. In this region, on average, 10% of the forested area was lost to the 2011 drought. We also identified climatic thresholds that controlled the spatial distribution of tree canopy loss across the state. Surprisingly, however, there were many local hot spots of canopy loss, suggesting that climatic factors alone cannot explain the spatial patterns of canopy loss; other factors related to soil, landscape, management, and stand density likely also played a role. As extreme droughts are predicted to become more frequent with climate change, it will become important to define methods that can detect the associated drought-induced tree mortality across large regions. These maps could then be used (1) to quantify impacts on carbon cycling and regional biophysics, (2) to better understand the spatiotemporal dynamics of tree mortality, and (3) to calibrate and/or validate mortality algorithms in regional models.
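    A minimal sketch of the second (random forest) approach, with synthetic stand-ins for the Landsat predictors and the orthophoto-derived dead-canopy fractions; the band count and hyperparameters are assumptions, not the study's settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Stand-ins for the real calibration data: Landsat band values per 30 m pixel
# and the dead-canopy fraction derived from the 1 m orthophoto classifications.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))          # 6 hypothetical Landsat predictors
y = rng.uniform(0.0, 1.0, size=1000)    # fraction of dead canopy per pixel

rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=5, n_jobs=-1)
rf.fit(X, y)
dead_fraction = np.clip(rf.predict(X), 0.0, 1.0)  # continuous dead-canopy map
```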

  16. Errors in Seismic Hazard Assessment are Creating Huge Human Losses

    NASA Astrophysics Data System (ADS)

    Bela, J.

    2015-12-01

    The current practice of representing earthquake hazards to the public based upon their perceived likelihood or probability of occurrence is proven now by the global record of actual earthquakes to be not only erroneous and unreliable, but also too deadly! Earthquake occurrence is sporadic, and therefore assumptions of earthquake frequency and return period are not only misleading, but also categorically false. More than 700,000 people have now lost their lives (2000-2011), wherein 11 of the world's deadliest earthquakes have occurred in locations where probability-based seismic hazard assessments had predicted only low seismic hazard. Unless seismic hazard assessment and the setting of minimum earthquake design safety standards for buildings and bridges are based on a more realistic deterministic recognition of "what can happen" rather than on what mathematical models suggest is "most likely to happen", such huge human losses can only be expected to continue! The actual earthquake events that did occur were at or near the maximum potential-size event that either had already occurred in the past or was geologically known to be possible. Haiti's M7 earthquake, 2010 (with > 222,000 fatalities) meant the dead could not even be buried with dignity. Japan's catastrophic Tohoku earthquake, 2011, a M9 megathrust earthquake, unleashed a tsunami that not only obliterated coastal communities along the northern Japanese coast, but also claimed > 20,000 lives. This tsunami flooded nuclear reactors at Fukushima, causing 4 explosions and 3 reactors to melt down. But while this history of huge human losses due to erroneous and misleading seismic hazard estimates, despite its wrenching pain, cannot be unlived; if faced with courage and a more realistic deterministic estimate of "what is possible", it need not be lived again. An objective testing of the results of global probability-based seismic hazard maps against real occurrences has never been done by the GSHAP team, even though the obvious inadequacy of the GSHAP map could have been established in the course of a simple check before the project's completion. The doctrine of "PSHA exceptionalism" that created the maps can only be expunged by carefully examining the facts . . . which unfortunately include huge human losses!

  17. Anatomy assisted PET image reconstruction incorporating multi-resolution joint entropy

    NASA Astrophysics Data System (ADS)

    Tang, Jing; Rahmim, Arman

    2015-01-01

    A promising approach in PET image reconstruction is to incorporate high resolution anatomical information (measured from MR or CT), taking anato-functional similarity measures such as mutual information or joint entropy (JE) as the prior. These similarity measures only classify voxels based on intensity values, while neglecting structural spatial information. In this work, we developed an anatomy-assisted maximum a posteriori (MAP) reconstruction algorithm wherein the JE measure is supplied with spatial information generated using wavelet multi-resolution analysis. The proposed wavelet-based JE (WJE) MAP algorithm involves calculation of derivatives of the subband JE measures with respect to individual PET image voxel intensities, which we have shown can be computed very similarly to how the inverse wavelet transform is implemented. We performed a simulation study with the BrainWeb phantom creating PET data corresponding to different noise levels. Realistically simulated T1-weighted MR images provided by BrainWeb modeling were applied in the anatomy-assisted reconstruction with the WJE-MAP algorithm and the intensity-only JE-MAP algorithm. Quantitative analysis showed that the WJE-MAP algorithm performed similarly to the JE-MAP algorithm at low noise level in the gray matter (GM) and white matter (WM) regions in terms of noise versus bias tradeoff. When noise increased to medium level in the simulated data, the WJE-MAP algorithm started to surpass the JE-MAP algorithm in the GM region, which is less uniform with smaller isolated structures compared to the WM region. In the high noise level simulation, the WJE-MAP algorithm presented clear improvement over the JE-MAP algorithm in both the GM and WM regions. In addition to the simulation study, we applied the reconstruction algorithms to real patient studies involving DPA-173 PET data and Florbetapir PET data with corresponding T1-MPRAGE MRI images. Compared to the intensity-only JE-MAP algorithm, the WJE-MAP algorithm resulted in comparable regional mean values to those from the maximum likelihood algorithm while reducing noise. Achieving robust performance in various noise-level simulation and patient studies, the WJE-MAP algorithm demonstrates its potential in clinical quantitative PET imaging.
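    For orientation, plain intensity-based joint entropy of two aligned images can be computed from their 2-D histogram as below; the WJE prior evaluates this measure on wavelet subbands and differentiates it with respect to voxel intensities, which this sketch omits. The toy images are assumptions.

```python
import numpy as np

def joint_entropy(img_a, img_b, bins=64):
    """Joint entropy of two aligned images from their 2-D intensity histogram."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
pet = rng.normal(size=(64, 64))
mri = 0.7 * pet + 0.3 * rng.normal(size=(64, 64))   # partially matched anatomy
print(joint_entropy(pet, mri))
```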

  18. An imputed forest composition map for New England screened by species range boundaries

    Treesearch

    Matthew J. Duveneck; Jonathan R. Thompson; B. Tyler Wilson

    2015-01-01

    Initializing forest landscape models (FLMs) to simulate changes in tree species composition requires accurate fine-scale forest attribute information mapped continuously over large areas. Nearest-neighbor imputation maps, maps developed from multivariate imputation of field plots, have high potential for use as the initial condition within FLMs, but the tendency for...

  19. Approximating prediction uncertainty for random forest regression models

    Treesearch

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forest are increasingly used for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...

  20. Magnetic navigation and catheter ablation of right atrial ectopic tachycardia in the presence of a hemi-azygos continuation: a magnetic navigation case using 3D electroanatomical mapping.

    PubMed

    Ernst, Sabine; Chun, Julian K R; Koektuerk, Buelent; Kuck, Karl-Heinz

    2009-01-01

    We report on a 63-year-old female patient in whom an electrophysiologic study discovered a hemi-azygos continuation. Using the magnetic navigation system, remote-controlled ablation was performed in conjunction with the 3D electroanatomical mapping system. After a failed attempt to advance a diagnostic catheter from the femoral vein, a diagnostic catheter was advanced via the left subclavian vein into the coronary sinus. The soft magnetic catheter was positioned in the right atrium via the hemi-azygos vein, and 3D mapping demonstrated an ectopic atrial tachycardia. Successful ablation was performed entirely under remote control. Fluoroscopy time was only 7.1 minutes, of which 45 seconds were required during remote navigation. Remote-controlled catheter ablation using magnetic navigation in conjunction with the electroanatomical mapping system proved to be a valuable tool for performing successful ablation in the presence of a hemi-azygos continuation.

  1. Varieties of quantity estimation in children.

    PubMed

    Sella, Francesco; Berteletti, Ilaria; Lucangeli, Daniela; Zorzi, Marco

    2015-06-01

    In the number-to-position task, with increasing age and numerical expertise, children's pattern of estimates shifts from a biased (nonlinear) to a formal (linear) mapping. This widely replicated finding concerns symbolic numbers, whereas less is known about other types of quantity estimation. In Experiment 1, Preschool, Grade 1, and Grade 3 children were asked to map continuous quantities, discrete nonsymbolic quantities (numerosities), and symbolic (Arabic) numbers onto a visual line. Numerical quantity was matched for the symbolic and discrete nonsymbolic conditions, whereas cumulative surface area was matched for the continuous and discrete quantity conditions. Crucially, in the discrete condition children's estimation could rely on either cumulative area or numerosity. All children showed a linear mapping for continuous quantities, whereas a developmental shift from a logarithmic to a linear mapping was observed for both nonsymbolic and symbolic numerical quantities. Analyses on individual estimates suggested the presence of two distinct strategies in estimating discrete nonsymbolic quantities: one based on numerosity and the other based on spatial extent. In Experiment 2, a non-spatial continuous quantity (shades of gray) and new discrete nonsymbolic conditions were added to the set used in Experiment 1. Results confirmed the linear patterns for the continuous tasks, as well as the presence of a subset of children relying on numerosity for the discrete nonsymbolic numerosity conditions despite the availability of continuous visual cues. Overall, our findings demonstrate that estimation of numerical and non-numerical quantities is based on different processing strategies and follows different developmental trajectories. (c) 2015 APA, all rights reserved.
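    The logarithmic-to-linear shift is typically assessed by comparing fits of both mapping forms to each child's estimates; a generic sketch of that comparison (not necessarily the authors' exact procedure):

```python
import numpy as np

def compare_mappings(presented, estimated):
    """R^2 of linear vs. logarithmic fits to number-line estimates."""
    x = np.asarray(presented, float)
    y = np.asarray(estimated, float)
    def r2(pred):
        return 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    lin = np.polyval(np.polyfit(x, y, 1), x)
    log = np.polyval(np.polyfit(np.log(x), y, 1), np.log(x))
    return {'linear_R2': r2(lin), 'log_R2': r2(log)}

x = np.array([2, 5, 18, 34, 56, 78, 100])
print(compare_mappings(x, np.log(x) * 20))   # log-like estimates favor the log fit
```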

  2. Social networks, mental health problems, and mental health service utilization in OEF/OIF National Guard veterans.

    PubMed

    Sripada, Rebecca K; Bohnert, Amy S B; Teo, Alan R; Levine, Debra S; Pfeiffer, Paul N; Bowersox, Nicholas W; Mizruchi, Mark S; Chermack, Stephen T; Ganoczy, Dara; Walters, Heather; Valenstein, Marcia

    2015-09-01

    Low social support and small social network size have been associated with a variety of negative mental health outcomes, while their impact on mental health services use is less clear. To date, few studies have examined these associations in National Guard service members, where frequency of mental health problems is high, social support may come from military as well as other sources, and services use may be suboptimal. Surveys were administered to 1448 recently returned National Guard members. Multivariable regression models assessed the associations between social support characteristics, probable mental health conditions, and service utilization. In bivariate analyses, large social network size, high social network diversity, high perceived social support, and high military unit support were each associated with lower likelihood of having a probable mental health condition (p < .001). In adjusted analyses, high perceived social support (OR .90, CI .88-.92) and high unit support (OR .96, CI .94-.97) continued to be significantly associated with lower likelihood of mental health conditions. Two social support measures were associated with lower likelihood of receiving mental health services in bivariate analyses, but were not significant in adjusted models. General social support and military-specific support were robustly associated with reduced mental health symptoms in National Guard members. Policy makers, military leaders, and clinicians should attend to service members' level of support from both the community and their units and continue efforts to bolster these supports. Other strategies, such as focused outreach, may be needed to bring National Guard members with need into mental health care.

  3. Risk Management for Human Support Technology Development

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2005-01-01

    NASA requires continuous risk management for all programs and projects. The risk management process identifies risks, analyzes their impact, prioritizes them, develops and carries out plans to mitigate or accept them, tracks risks and mitigation plans, and communicates and documents risk information. Project risk management is driven by the project goal and is performed by the entire team. Risk management begins early in the formulation phase with initial risk identification and development of a risk management plan and continues throughout the project life cycle. This paper describes the risk management approach that is suggested for use in NASA's Human Support Technology Development. The first step in risk management is to identify the detailed technical and programmatic risks specific to a project. Each individual risk should be described in detail. The identified risks are summarized in a complete risk list. Risk analysis provides estimates of the likelihood and the qualitative impact of a risk. The likelihood and impact of the risk are used to define its priority location in the risk matrix. The approaches for responding to risk are to mitigate it by eliminating or reducing the effect or likelihood of a risk, to accept it with a documented rationale and contingency plan, or to research or monitor the risk. The Human Support Technology Development program includes many projects with independently achievable goals. Each project must do independent risk management, considering all its risks together and trading them against performance, budget, and schedule. Since the program can succeed even if some projects fail, the program risk has a complex dependence on the individual project risks.
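    A toy illustration of the likelihood-impact prioritization described above, assuming 1-5 ordinal scales and arbitrary cut points (the actual scales and thresholds are not specified in the source):

```python
def risk_priority(likelihood, impact):
    """Toy 5x5 risk matrix: both scores on a 1 (low) to 5 (high) ordinal scale."""
    score = likelihood * impact
    if score >= 15:
        return 'high'      # mitigate
    if score >= 6:
        return 'medium'    # research or monitor
    return 'low'           # accept with documented rationale

print(risk_priority(4, 5), risk_priority(2, 2))   # -> high low
```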

  4. Planning and conducting medical support to joint operations.

    PubMed

    Hughes, A S

    2000-01-01

    Operations are core business for all of us and the PJHQ medical cell is at the heart of this process. With the likelihood of a continuing UK presence in the Balkans for some time to come, the challenge of meeting this and any other new operational commitments will continue to demand a flexible and innovative approach from all concerned. These challenges together with the Joint and multinational aspects of the job make the PJHQ medical cell a demanding but rewarding place to work and provide a valuable Joint staff training opportunity for the RNMS.

  5. Variable selection in discrete survival models including heterogeneity.

    PubMed

    Groll, Andreas; Tutz, Gerhard

    2017-04-01

    Several variable selection procedures are available for continuous time-to-event data. However, if time is measured in a discrete way, and therefore many ties occur, models for continuous time are inadequate. We propose penalized likelihood methods that perform efficient variable selection in discrete survival modeling with explicit modeling of the heterogeneity in the population. The method is based on a combination of ridge and lasso type penalties that are tailored to the case of discrete survival. The performance is studied in simulation studies and an application to the birth of the first child.
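
    As a rough sketch of the approach, discrete survival can be fit as a penalized logistic regression on person-period data; below, scikit-learn's elastic net (a ridge/lasso combination) stands in for the authors' tailored penalty, the heterogeneity (frailty) term is omitted, and the data are simulated.

```python
# Discrete-time survival as penalized logistic regression on person-period
# data. Elastic net here approximates the ridge+lasso idea; it is NOT the
# paper's exact penalty, and no frailty term is included.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p, max_t = 500, 10, 6
X = rng.normal(size=(n, p))
beta = np.array([1.0, -0.8] + [0.0] * (p - 2))   # only two true effects

# Expand each subject into one row per discrete interval until the event.
rows, times, events = [], [], []
for i in range(n):
    hazard = 1.0 / (1.0 + np.exp(-(X[i] @ beta - 2.0)))
    for t in range(1, max_t + 1):
        event = rng.random() < hazard
        rows.append(X[i]); times.append(t); events.append(int(event))
        if event:
            break

# Interval dummies give the baseline hazard; covariates follow.
Xpp = np.column_stack([np.eye(max_t)[np.array(times) - 1], np.array(rows)])
y = np.array(events)

model = LogisticRegression(penalty="elasticnet", solver="saga", l1_ratio=0.5,
                           C=0.5, max_iter=5000, fit_intercept=False)
model.fit(Xpp, y)
print("selected covariates:",
      np.flatnonzero(np.abs(model.coef_[0][max_t:]) > 1e-3))
```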

  6. Cardiorespiratory dynamics measured from continuous ECG monitoring improves detection of deterioration in acute care patients: A retrospective cohort study

    PubMed Central

    Clark, Matthew T.; Calland, James Forrest; Enfield, Kyle B.; Voss, John D.; Lake, Douglas E.; Moorman, J. Randall

    2017-01-01

    Background: Charted vital signs and laboratory results represent intermittent samples of a patient’s dynamic physiologic state and have been used to calculate early warning scores to identify patients at risk of clinical deterioration. We hypothesized that the addition of cardiorespiratory dynamics measured from continuous electrocardiography (ECG) monitoring to intermittently sampled data improves the predictive validity of models trained to detect clinical deterioration prior to intensive care unit (ICU) transfer or unanticipated death. Methods and findings: We analyzed 63 patient-years of ECG data from 8,105 acute care patient admissions at a tertiary care academic medical center. We developed models to predict deterioration resulting in ICU transfer or unanticipated death within the next 24 hours using either vital signs, laboratory results, or cardiorespiratory dynamics from continuous ECG monitoring and also evaluated models using all available data sources. We calculated the predictive validity (C-statistic), the net reclassification improvement, and the probability of achieving the difference in likelihood ratio χ2 for the additional degrees of freedom. The primary outcome occurred 755 times in 586 admissions (7%). We analyzed 395 clinical deteriorations with continuous ECG data in the 24 hours prior to an event. Using only continuous ECG measures resulted in a C-statistic of 0.65, similar to models using only laboratory results and vital signs (0.63 and 0.69, respectively). Addition of continuous ECG measures to models using conventional measurements improved the C-statistic by 0.01 and 0.07; a model integrating all data sources had a C-statistic of 0.73 with categorical net reclassification improvement of 0.09 for a change of 1 decile in risk. The difference in likelihood ratio χ2 between integrated models with and without cardiorespiratory dynamics was 2158 (p value: <0.001). Conclusions: Cardiorespiratory dynamics from continuous ECG monitoring detect clinical deterioration in acute care patients and improve performance of conventional models that use only laboratory results and vital signs. PMID:28771487

  7. Cardiorespiratory dynamics measured from continuous ECG monitoring improves detection of deterioration in acute care patients: A retrospective cohort study.

    PubMed

    Moss, Travis J; Clark, Matthew T; Calland, James Forrest; Enfield, Kyle B; Voss, John D; Lake, Douglas E; Moorman, J Randall

    2017-01-01

    Charted vital signs and laboratory results represent intermittent samples of a patient's dynamic physiologic state and have been used to calculate early warning scores to identify patients at risk of clinical deterioration. We hypothesized that the addition of cardiorespiratory dynamics measured from continuous electrocardiography (ECG) monitoring to intermittently sampled data improves the predictive validity of models trained to detect clinical deterioration prior to intensive care unit (ICU) transfer or unanticipated death. We analyzed 63 patient-years of ECG data from 8,105 acute care patient admissions at a tertiary care academic medical center. We developed models to predict deterioration resulting in ICU transfer or unanticipated death within the next 24 hours using either vital signs, laboratory results, or cardiorespiratory dynamics from continuous ECG monitoring and also evaluated models using all available data sources. We calculated the predictive validity (C-statistic), the net reclassification improvement, and the probability of achieving the difference in likelihood ratio χ2 for the additional degrees of freedom. The primary outcome occurred 755 times in 586 admissions (7%). We analyzed 395 clinical deteriorations with continuous ECG data in the 24 hours prior to an event. Using only continuous ECG measures resulted in a C-statistic of 0.65, similar to models using only laboratory results and vital signs (0.63 and 0.69 respectively). Addition of continuous ECG measures to models using conventional measurements improved the C-statistic by 0.01 and 0.07; a model integrating all data sources had a C-statistic of 0.73 with categorical net reclassification improvement of 0.09 for a change of 1 decile in risk. The difference in likelihood ratio χ2 between integrated models with and without cardiorespiratory dynamics was 2158 (p value: <0.001). Cardiorespiratory dynamics from continuous ECG monitoring detect clinical deterioration in acute care patients and improve performance of conventional models that use only laboratory results and vital signs.
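
    The two model-comparison quantities used in both versions of this record, the C-statistic and the likelihood-ratio χ2 between nested models, can be computed as in the sketch below; the "vital sign" and "ECG" predictors are simulated stand-ins.

```python
# C-statistic (ROC AUC) and likelihood-ratio chi-square for nested logistic
# models, on simulated data shaped like the study's comparison.
import numpy as np
from scipy.stats import chi2
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, log_loss

rng = np.random.default_rng(1)
n = 2000
vitals = rng.normal(size=(n, 3))     # conventional measurements
ecg = rng.normal(size=(n, 2))        # added cardiorespiratory dynamics
logit = vitals[:, 0] + 0.8 * ecg[:, 0] - 2.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

def fit_ll(X, y):
    m = LogisticRegression(C=1e6, max_iter=1000).fit(X, y)  # ~unpenalized
    return m, -log_loss(y, m.predict_proba(X)[:, 1], normalize=False)

base, ll0 = fit_ll(vitals, y)
both = np.hstack([vitals, ecg])
full, ll1 = fit_ll(both, y)

print("C-statistic, base:", roc_auc_score(y, base.predict_proba(vitals)[:, 1]))
print("C-statistic, full:", roc_auc_score(y, full.predict_proba(both)[:, 1]))
lr = 2 * (ll1 - ll0)                 # difference in likelihood-ratio chi-square
print("LR chi2 =", lr, "p =", chi2.sf(lr, df=ecg.shape[1]))
```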

  8. Subject-specific bone attenuation correction for brain PET/MR: can ZTE-MRI substitute CT scan accurately?

    PubMed

    Khalifé, Maya; Fernandez, Brice; Jaubert, Olivier; Soussan, Michael; Brulon, Vincent; Buvat, Irène; Comtat, Claude

    2017-09-21

    In brain PET/MR applications, accurate attenuation maps are required for accurate PET image quantification. An implemented attenuation correction (AC) method for brain imaging is the single-atlas approach that estimates an AC map from an averaged CT template. As an alternative, we propose to use a zero echo time (ZTE) pulse sequence to segment bone, air and soft tissue. A linear relationship between histogram normalized ZTE intensity and measured CT density in Hounsfield units (HU) in bone has been established thanks to a CT-MR database of 16 patients. Continuous AC maps were computed based on the segmented ZTE by setting a fixed linear attenuation coefficient (LAC) to air and soft tissue and by using the linear relationship to generate continuous μ values for the bone. Additionally, for the purpose of comparison, four other AC maps were generated: a ZTE derived AC map with a fixed LAC for the bone, an AC map based on the single-atlas approach as provided by the PET/MR manufacturer, a soft-tissue only AC map and, finally, the CT derived attenuation map used as the gold standard (CTAC). All these AC maps were used with different levels of smoothing for PET image reconstruction with and without time-of-flight (TOF). The subject-specific AC map generated by combining ZTE-based segmentation and linear scaling of the normalized ZTE signal into HU was found to be a good substitute for the measured CTAC map in brain PET/MR when used with a Gaussian smoothing kernel of 4 mm corresponding to the PET scanner intrinsic resolution. As expected, TOF reduces AC error regardless of the AC method. The continuous ZTE-AC performed better than the other alternative MR derived AC methods, reducing the quantification error between the MRAC corrected PET image and the reference CTAC corrected PET image.

  9. Subject-specific bone attenuation correction for brain PET/MR: can ZTE-MRI substitute CT scan accurately?

    NASA Astrophysics Data System (ADS)

    Khalifé, Maya; Fernandez, Brice; Jaubert, Olivier; Soussan, Michael; Brulon, Vincent; Buvat, Irène; Comtat, Claude

    2017-10-01

    In brain PET/MR applications, accurate attenuation maps are required for accurate PET image quantification. An implemented attenuation correction (AC) method for brain imaging is the single-atlas approach that estimates an AC map from an averaged CT template. As an alternative, we propose to use a zero echo time (ZTE) pulse sequence to segment bone, air and soft tissue. A linear relationship between histogram normalized ZTE intensity and measured CT density in Hounsfield units (HU) in bone has been established thanks to a CT-MR database of 16 patients. Continuous AC maps were computed based on the segmented ZTE by setting a fixed linear attenuation coefficient (LAC) to air and soft tissue and by using the linear relationship to generate continuous μ values for the bone. Additionally, for the purpose of comparison, four other AC maps were generated: a ZTE derived AC map with a fixed LAC for the bone, an AC map based on the single-atlas approach as provided by the PET/MR manufacturer, a soft-tissue only AC map and, finally, the CT derived attenuation map used as the gold standard (CTAC). All these AC maps were used with different levels of smoothing for PET image reconstruction with and without time-of-flight (TOF). The subject-specific AC map generated by combining ZTE-based segmentation and linear scaling of the normalized ZTE signal into HU was found to be a good substitute for the measured CTAC map in brain PET/MR when used with a Gaussian smoothing kernel of 4 mm corresponding to the PET scanner intrinsic resolution. As expected, TOF reduces AC error regardless of the AC method. The continuous ZTE-AC performed better than the other alternative MR derived AC methods, reducing the quantification error between the MRAC corrected PET image and the reference CTAC corrected PET image.
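
    A minimal sketch of the continuous bone mapping described in these two records, assuming an illustrative soft-tissue normalization, a linear ZTE-to-HU fit inside a bone mask, and a simple HU-to-LAC conversion; none of the coefficients below are the calibrated values from the paper.

```python
# Continuous attenuation-correction (AC) map from a segmented ZTE volume.
# Normalization percentile, fallback coefficients, and the HU->LAC curve are
# illustrative assumptions, not the paper's calibration.
import numpy as np

def zte_to_mu_map(zte, bone_mask, air_mask, ct_hu=None):
    # Histogram-normalize so soft tissue sits near intensity 1.
    z = zte / np.percentile(zte[~air_mask], 85)
    if ct_hu is not None:   # calibrate slope/intercept on a CT-MR database
        a, b = np.polyfit(z[bone_mask], ct_hu[bone_mask], deg=1)
    else:                   # assumed coefficients, for illustration only
        a, b = -2000.0, 2000.0
    hu = np.zeros(zte.shape)               # soft tissue ~ 0 HU
    hu[air_mask] = -1000.0                 # air
    hu[bone_mask] = a * z[bone_mask] + b   # continuous bone HU
    # Simple piecewise HU -> 511 keV linear attenuation coefficient (cm^-1).
    mu = np.where(hu <= 0, 0.096 * (1.0 + hu / 1000.0), 0.096 + hu * 6.4e-5)
    return np.clip(mu, 0.0, None)
```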

  10. Studying the effects of fuel treatment based on burn probability on a boreal forest landscape.

    PubMed

    Liu, Zhihua; Yang, Jian; He, Hong S

    2013-01-30

    Fuel treatment is assumed to be a primary tactic to mitigate intense and damaging wildfires. However, how to place treatment units across a landscape and assess its effectiveness is difficult for landscape-scale fuel management planning. In this study, we used a spatially explicit simulation model (LANDIS) to conduct wildfire risk assessments and optimize the placement of fuel treatments at the landscape scale. We first calculated a baseline burn probability map from empirical data (fuel, topography, weather, and fire ignition and size data) to assess fire risk. We then prioritized landscape-scale fuel treatment based on maps of burn probability and fuel loads (calculated from the interactions among tree composition, stand age, and disturbance history), and compared their effects on reducing fire risk. The burn probability map described the likelihood of burning on a given location; the fuel load map described the probability that a high fuel load will accumulate on a given location. Fuel treatment based on the burn probability map specified that stands with high burn probability be treated first, while fuel treatment based on the fuel load map specified that stands with high fuel loads be treated first. Our results indicated that fuel treatment based on burn probability greatly reduced the burned area and number of fires of different intensities. Fuel treatment based on burn probability also produced more dispersed and smaller high-risk fire patches and therefore can improve efficiency of subsequent fire suppression. The strength of our approach is that more model components (e.g., succession, fuel, and harvest) can be linked into LANDIS to map the spatially explicit wildfire risk and its dynamics to fuel management, vegetation dynamics, and harvesting. Copyright © 2012 Elsevier Ltd. All rights reserved.
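
    As a toy version of the comparison in this record, the sketch below ranks simulated stands for treatment by burn probability versus fuel load, under the simplifying assumption that treating a stand fully removes its chance of burning.

```python
# Compare two treatment-placement rules on simulated stands. Assumes, for
# illustration only, that treatment fully protects a treated stand.
import numpy as np

rng = np.random.default_rng(2)
burn_prob = rng.beta(2, 8, size=1000)      # baseline burn probability per stand
fuel_load = rng.gamma(3.0, 2.0, size=1000)

budget = 100                               # stands we can afford to treat
by_bp = np.argsort(burn_prob)[::-1][:budget]    # highest burn probability first
by_fuel = np.argsort(fuel_load)[::-1][:budget]  # highest fuel load first

print("expected stand-burns avoided, BP rule:  ", burn_prob[by_bp].sum())
print("expected stand-burns avoided, fuel rule:", burn_prob[by_fuel].sum())
```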

  11. MAP Reconstruction for Fourier Rebinned TOF-PET Data

    PubMed Central

    Bai, Bing; Lin, Yanguang; Zhu, Wentao; Ren, Ran; Li, Quanzheng; Dahlbom, Magnus; DiFilippo, Frank; Leahy, Richard M.

    2014-01-01

    Time-of-flight (TOF) information improves the signal to noise ratio in Positron Emission Tomography (PET). Computation cost in processing TOF-PET sinograms is substantially higher than for nonTOF data because the data in each line of response are divided among multiple time-of-flight bins. This additional cost has motivated research into methods for rebinning TOF data into lower dimensional representations that exploit redundancies inherent in TOF data. We have previously developed approximate Fourier methods that rebin TOF data into either 3D nonTOF or 2D nonTOF formats. We refer to these methods respectively as FORET-3D and FORET-2D. Here we describe maximum a posteriori (MAP) estimators for use with FORET rebinned data. We first derive approximate expressions for the variance of the rebinned data. We then use these results to rescale the data so that the variance and mean are approximately equal, allowing us to use the Poisson likelihood model for MAP reconstruction. MAP reconstruction from these rebinned data uses a system matrix in which the detector response model accounts for the effects of rebinning. Using these methods we compare the performance of FORET-2D and 3D with TOF and nonTOF reconstructions using phantom and clinical data. Our phantom results show a small loss in contrast recovery at matched noise levels using FORET compared to reconstruction from the original TOF data. Clinical examples show FORET images that are qualitatively similar to those obtained from the original TOF-PET data but with a small increase in variance at matched resolution. Reconstruction time is reduced by factors of 5 and 30 using FORET-3D+MAP and FORET-2D+MAP, respectively, compared to 3D TOF MAP, which makes these methods attractive for clinical applications. PMID:24504374
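
    A sketch of the rescaling idea: if rebinned data y satisfy Var(y) ≈ k·E[y], dividing by k approximately restores the Poisson property that the mean equals the variance, after which Poisson-likelihood EM updates apply. The system matrix below is a toy, not the paper's detector-response model.

```python
# Variance-matched rescaling of rebinned data followed by Poisson EM (MLEM).
# y = k * Poisson(lambda/k) has E[y] = lambda and Var[y] = k * E[y], mimicking
# the inflated variance of rebinned data; y/k is exactly Poisson again.
import numpy as np

rng = np.random.default_rng(3)
m, n = 60, 20
A = rng.random((m, n)); A /= A.sum(axis=0)   # toy system matrix
x_true = rng.gamma(2.0, 5.0, n)
k = 3.5                                      # assumed variance inflation factor

y = k * rng.poisson((A @ x_true) / k)        # mock rebinned data
y_scaled = y / k                             # mean ~ variance, Poisson-like

x = np.ones(n)
sens = A.T @ np.ones(m)                      # sensitivity image
for _ in range(200):                         # EM updates under Poisson model
    x *= (A.T @ (y_scaled / np.maximum(A @ x, 1e-12))) / sens
x_hat = k * x                                # undo the data rescaling
print("relative error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```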

  12. Action-Based Dynamical Modelling For The Milky Way Disk

    NASA Astrophysics Data System (ADS)

    Trick, Wilma; Rix, Hans-Walter; Bovy, Jo

    2016-09-01

    We present RoadMapping, a full-likelihood dynamical modelling machinery that aims to recover the Milky Way's (MW) gravitational potential from large samples of stars in the Galactic disk. RoadMapping models the observed positions and velocities of stars with a parameterized, action-based distribution function (DF) in a parameterized axisymmetric gravitational potential (Binney & McMillan 2011, Binney 2012, Bovy & Rix 2013). In anticipation of the Gaia data release in autumn, we have fully tested RoadMapping and demonstrated its robustness against the breakdown of its assumptions. Using large suites of mock data, we investigated in isolated test cases how the modelling would be affected if the data's true potential or DF was not included in the families of potentials and DFs assumed by RoadMapping, or if we misjudged measurement errors or the spatial selection function (SF) (Trick et al., submitted to ApJ). We found that the potential can be robustly recovered, within the limitations of the assumed potential model, even for minor misjudgments in the DF or SF, or for proper-motion errors or distances known to within 10%. We were also able to demonstrate that RoadMapping is still successful if the strong assumption of axisymmetry breaks down (Trick et al., in preparation). Data drawn from a high-resolution simulation (D'Onghia et al. 2013) of a MW-like galaxy with pronounced spiral arms neither follow the assumed simple DF nor come from an axisymmetric potential. We found that as long as the survey volume is large enough, RoadMapping gives good average constraints on the galaxy's potential. We are planning to apply RoadMapping to a real data set, the Tycho-2 catalogue (Høg et al. 2000), very soon, and might be able to present some preliminary results at the conference.

  13. Legal substance use and the development of a DSM-IV cannabis use disorder during adolescence: the TRAILS study.

    PubMed

    Prince van Leeuwen, Andrea; Creemers, Hanneke E; Verhulst, Frank C; Vollebergh, Wilma A M; Ormel, Johan; van Oort, Floor; Huizink, Anja C

    2014-02-01

    To examine whether early onset of tobacco or alcohol use, and continued use of tobacco or alcohol in early adolescence, are related to a higher likelihood of developing a cannabis use disorder during adolescence. Data were used from four consecutive assessment waves of the TRacking Adolescents' Individual Lives Survey (TRAILS), a general Dutch population study. TRAILS is an ongoing longitudinal study that will follow the same group of adolescents from the ages of 10 to 24 years. The sample consisted of 1108 (58% female) adolescents (mean ages at the four assessment waves were 11.09, 13.56, 16.27 and 19.05 years, respectively). Cannabis use disorders were assessed using the Composite International Diagnostic Interview 3.0 (CIDI). Adolescent tobacco and alcohol use were assessed using self-report questionnaires. Early-onset tobacco use [odds ratio (OR) = 1.82, confidence interval (CI) = 1.05-3.14, P < 0.05], but not early-onset alcohol use (OR = 1.33, CI = 0.84-2.12, P > 0.05), was associated with a higher likelihood of developing a cannabis use disorder. Similarly, adolescents who reported continued use of tobacco (OR = 2.47, CI = 1.02-5.98, P < 0.05), but not continued use of alcohol (OR = 1.71, CI = 0.87-3.38, P > 0.05), were more likely to develop a cannabis use disorder. Early-onset and continued tobacco use appear to predict the development of a cannabis use disorder in adolescence, whereas early onset and continued alcohol use do not. © 2013 Society for the Study of Addiction.

  14. The Relationship of Dysthymia, Minor Depression, and Gender to Changes in Smoking for Current and Former Smokers: Longitudinal Evaluation in the U.S. Population

    PubMed Central

    Weinberger, Andrea H.; Pilver, Corey E.; Desai, Rani A.; Mazure, Carolyn M.; McKee, Sherry A.

    2012-01-01

    BACKGROUND Although data clearly link major depression and smoking, little is known about the association between dysthymia and minor depression and smoking behavior. The current study examined changes in smoking over three years for current and former smokers with and without dysthymia and minor depression. METHODS Participants who were current or former daily cigarette smokers at Wave 1 of the National Epidemiologic Survey on Alcohol and Related Conditions and completed the Wave 2 assessment were included in these analyses (n=11,973; 46% female). Analyses examined the main and gender-specific effects of current dysthymia, lifetime dysthymia, and minor depression (a single diagnostic category that denoted current and or lifetime prevalence) on continued smoking for Wave 1 current daily smokers and continued abstinence for Wave 1 former daily smokers. RESULTS Wave 1 current daily smokers with current dysthymia (OR=2.13, 95% CI=1.23, 3.70) or minor depression (OR=1.53, 95% CI=1.07, 2.18) were more likely than smokers without the respective diagnosis to report continued smoking at Wave 2. Wave 1 former daily smokers with current dysthymia (OR=0.44, 95% CI=0.20, 0.96) and lifetime dysthymia (OR=0.37, 95% CI=0.15, 0.91) were less likely than those without the diagnosis to remain abstinent from smoking at Wave 2. The gender-by-diagnosis interactions were not significant, suggesting that the impact of dysthymia and minor depression on smoking behavior is similar among men and women. CONCLUSIONS Current dysthymia and minor depression are associated with a greater likelihood of continued smoking; current and lifetime dysthymia are associated with a decreased likelihood of continued smoking abstinence. PMID:22809897

  15. The relationship of dysthymia, minor depression, and gender to changes in smoking for current and former smokers: longitudinal evaluation in the U.S. population.

    PubMed

    Weinberger, Andrea H; Pilver, Corey E; Desai, Rani A; Mazure, Carolyn M; McKee, Sherry A

    2013-01-01

    Although data clearly link major depression and smoking, little is known about the association between dysthymia and minor depression and smoking behavior. The current study examined changes in smoking over 3 years for current and former smokers with and without dysthymia and minor depression. Participants who were current or former daily cigarette smokers at Wave 1 of the National Epidemiologic Survey on Alcohol and Related Conditions and completed the Wave 2 assessment were included in these analyses (n=11,973; 46% female). Analyses examined the main and gender-specific effects of current dysthymia, lifetime dysthymia, and minor depression (a single diagnostic category that denoted current and/or lifetime prevalence) on continued smoking for Wave 1 current daily smokers and continued abstinence for Wave 1 former daily smokers. Wave 1 current daily smokers with current dysthymia (OR=2.13, 95% CI=1.23, 3.70) or minor depression (OR=1.53, 95% CI=1.07, 2.18) were more likely than smokers without the respective diagnosis to report continued smoking at Wave 2. Wave 1 former daily smokers with current dysthymia (OR=0.44, 95% CI=0.20, 0.96) and lifetime dysthymia (OR=0.37, 95% CI=0.15, 0.91) were less likely than those without the diagnosis to remain abstinent from smoking at Wave 2. The gender-by-diagnosis interactions were not significant, suggesting that the impact of dysthymia and minor depression on smoking behavior is similar among men and women. Current dysthymia and minor depression are associated with a greater likelihood of continued smoking; current and lifetime dysthymia are associated with a decreased likelihood of continued smoking abstinence. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  16. A Delphi Study to Operationalize Evidence-Based Predictors in Secondary Transition

    ERIC Educational Resources Information Center

    Rowe, Dawn A.; Alverson, Charlotte Y.; Unruh, Deanne K.; Fowler, Catherine H.; Kellems, Ryan; Test, David W.

    2015-01-01

    Although there are many activities (e.g., transition services), derived from correlational research, that occur while students are in school that increase the likelihood of positive post-school outcomes, many teachers continue to provide services shown to have little to no effect on outcomes of students with disabilities. The purpose of this study…

  17. Social Media and E-Learning in Response to Seismic Events: Resilient Practices

    ERIC Educational Resources Information Center

    Tull, Susan; Dabner, Nicki; Ayebi-Arthur, Kofi

    2017-01-01

    The motivation to adopt innovative communication and e-learning practices in education settings can be stimulated by events such as natural disasters. Education institutions in the Pacific Rim cannot avoid the likelihood of natural disasters that could close one or more buildings on a campus and affect their ability to continue current educational…

  18. Continuing HBCUs' Historical Commitment to Personnel Preparation: Preparing Transition Professionals to Serve Students of Color with Disabilities

    ERIC Educational Resources Information Center

    Grillo, Lisa Maria; Ellis, Antonio L.; Durham, Jaquial D.

    2017-01-01

    The presence of teachers of color in transition education initiatives increases the likelihood that students of color with disabilities will experience success in meeting their postsecondary goals. Proposing the inclusion of postsecondary transition in certificate or degree offerings at Historically Black Colleges and Universities (HBCUs) directly…

  19. 8 CFR 241.14 - Continued detention of removable aliens on account of special circumstances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... to a mental condition or personality disorder and behavior associated with that condition or disorder... personality disorder and behavior associated with that condition or disorder, the alien is likely to engage in... personality disorder; (iv) The likelihood that the alien will engage in acts of violence in the future; and (v...

  20. 8 CFR 241.14 - Continued detention of removable aliens on account of special circumstances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... to a mental condition or personality disorder and behavior associated with that condition or disorder... personality disorder and behavior associated with that condition or disorder, the alien is likely to engage in... personality disorder; (iv) The likelihood that the alien will engage in acts of violence in the future; and (v...

  1. 8 CFR 241.14 - Continued detention of removable aliens on account of special circumstances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... to a mental condition or personality disorder and behavior associated with that condition or disorder... personality disorder and behavior associated with that condition or disorder, the alien is likely to engage in... personality disorder; (iv) The likelihood that the alien will engage in acts of violence in the future; and (v...

  2. Multi-scale evaluation of the environmental controls on burn probability in a southern Sierra Nevada landscape

    Treesearch

    Sean A. Parks; Marc-Andre Parisien; Carol Miller

    2011-01-01

    We examined the scale-dependent relationship between spatial fire likelihood or burn probability (BP) and some key environmental controls in the southern Sierra Nevada, California, USA. Continuous BP estimates were generated using a fire simulation model. The correspondence between BP (dependent variable) and elevation, ignition density, fuels and aspect was evaluated...

  3. Differences in Health Behaviors of Overweight or Obese College Students Compared to Healthy Weight Students

    ERIC Educational Resources Information Center

    Harrington, M. Rachel; Ickes, Melinda J.

    2016-01-01

    Background: Obesity continues to be an epidemic in college students, yet research is warranted to determine whether obesity increases the likelihood of risky health behaviors in this population. Purpose: The purpose of this study was to examine the association between body mass index (BMI) and health behaviors in college students. Methods: A…

  4. Understanding Female Sport Attrition in a Stereotypical Male Sport within the Framework of Eccles's Expectancy-Value Model

    ERIC Educational Resources Information Center

    Guillet, Emma; Sarrazin, Philippe; Fontayne, Paul; Brustad, Robert J.

    2006-01-01

    An empirical research study based upon the expectancy-value model of Eccles and colleagues (1983) investigated the effect of gender-role orientations on psychological dimensions of female athletes' sport participation and the likelihood of their continued participation in a stereotypical masculine activity. The model (Eccles et al., 1983) posits…

  5. Chapter 13: Effects of fuel and vegetation management activities on nonnative invasive plants

    Treesearch

    Erik J. Martinson; Molly E. Hunter; Jonathan P. Freeman; Philip N. Omi

    2008-01-01

    Twentieth century land use and management practices have increased the vertical and horizontal continuity of fuels over expansive landscapes. Thus the likelihood of large, severe wildfires has increased, especially in forest types that previously experienced more frequent, less severe fire (Allen and others 2002). Disturbances such as fire may promote nonnative plant...

  6. Logic Models as a Way to Support Online Students and Their Projects

    ERIC Educational Resources Information Center

    Strycker, Jesse

    2016-01-01

    As online enrollment continues to grow, students may need additional pedagogical supports to increase their likelihood of success in online environments that don't offer the same supports as those found in face to face classrooms. Logic models are a way to provide such support to students by helping to model project expectations, allowing students…

  7. U. S. light duty vehicle fleet emissions performance and the emissions impact of technology changes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sabourin, M.; Platte, L.

    1988-01-01

    This paper determines the level of improvement in Federal Test Procedure (FTP) exhaust emissions realized by typical in-use vehicles over the last twenty years as emission standards have become increasingly stringent. Furthermore, this paper explores the likelihood that in-use emission performance improvements will continue now that emission standards have stabilized.

  8. The Likelihood of Collaboration Between Central American Transnational Gangs and Terrorist Organizations

    DTIC Science & Technology

    2007-03-01

    As Rolando Gamez, a resident of Escuintla – a town 28 miles southwest of the capital, Guatemala City – maintains, "This is a war and the gang...Director of Investigations in El Salvador, Douglas Omar Garcia Fumes, agrees, "They continue to operate even after they’re arrested. Orders to kill are

  9. 24 CFR 570.456 - Ineligible activities and limitations on eligible activities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... relocation of a plant or facility from one area to another, if it is demonstrated to HUD's satisfaction that... has been a significant current pattern of movement, to areas reasonably proximate, of jobs of the... significant pattern of job movement and the likelihood of continuation of such a pattern has been from a...

  10. 47 CFR 73.4108 - FM transmitter site map submissions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false FM transmitter site map submissions. 73.4108 Section 73.4108 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES RADIO BROADCAST SERVICES Rules Applicable to All Broadcast Stations § 73.4108 FM transmitter site map...

  11. Higher-dimensional attractors with absolutely continuous invariant probability

    NASA Astrophysics Data System (ADS)

    Bocker, Carlos; Bortolotti, Ricardo

    2018-05-01

    Consider a dynamical system given by T(x, y) = (E(x), C(y) + f(x)), where E is a linear expanding map of a torus, C is a linear contracting map of a Euclidean space, and f is a smooth map from the torus into that space. We provide sufficient conditions on E that imply the existence of an open set of pairs (C, f) for which the corresponding dynamic T admits a unique absolutely continuous invariant probability. A geometrical characteristic of transversality between self-intersections of images of the torus is present in the dynamics of the maps in this open set. In addition, we give a condition between E and C under which it is possible to perturb f to obtain a pair in the open set.

  12. Radar studies of the planets. [radar measurements of lunar surface, Mars, Mercury, and Venus

    NASA Technical Reports Server (NTRS)

    Ingalls, R. P.; Pettengill, G. H.; Rogers, A. E. E.; Sebring, P. B. (Editor); Shapiro, I. I.

    1974-01-01

    The radar measurements phase of the lunar studies, involving reflectivity and topographic mapping of the visible lunar surface, ended in December 1972, but studies of the data and production of maps have continued. This work was supported by the Manned Spacecraft Center, Houston. Topographic mapping of the equatorial regions of Mars has been carried out during the period of each opposition since that of 1967. The method comprised extended, precise travel-time measurements to a small area centered on the subradar point. As measurements continued, planetary motions caused this point to sweep out extensive areas in both latitude and longitude, permitting the development of a fairly extensive topographic map of the equatorial region. Radar observations of Mercury and Venus have also been made over the past few years. Refinements of planetary motions, reflectivity maps, and determinations of rotation rates have resulted.

  13. New algorithms and methods to estimate maximum-likelihood phylogenies: assessing the performance of PhyML 3.0.

    PubMed

    Guindon, Stéphane; Dufayard, Jean-François; Lefort, Vincent; Anisimova, Maria; Hordijk, Wim; Gascuel, Olivier

    2010-05-01

    PhyML is a phylogeny software based on the maximum-likelihood principle. Early PhyML versions used a fast algorithm performing nearest neighbor interchanges to improve a reasonable starting tree topology. Since the original publication (Guindon S., Gascuel O. 2003. A simple, fast and accurate algorithm to estimate large phylogenies by maximum likelihood. Syst. Biol. 52:696-704), PhyML has been widely used (>2500 citations in ISI Web of Science) because of its simplicity and a fair compromise between accuracy and speed. In the meantime, research around PhyML has continued, and this article describes the new algorithms and methods implemented in the program. First, we introduce a new algorithm to search the tree space with user-defined intensity using subtree pruning and regrafting topological moves. The parsimony criterion is used here to filter out the least promising topology modifications with respect to the likelihood function. The analysis of a large collection of real nucleotide and amino acid data sets of various sizes demonstrates the good performance of this method. Second, we describe a new test to assess the support of the data for internal branches of a phylogeny. This approach extends the recently proposed approximate likelihood-ratio test and relies on a nonparametric, Shimodaira-Hasegawa-like procedure. A detailed analysis of real alignments sheds light on the links between this new approach and the more classical nonparametric bootstrap method. Overall, our tests show that the last version (3.0) of PhyML is fast, accurate, stable, and ready to use. A Web server and binary files are available from http://www.atgc-montpellier.fr/phyml/.

  14. Automated thematic mapping and change detection of ERTS-A images. [digital interpretation of Arizona imagery

    NASA Technical Reports Server (NTRS)

    Gramenopoulos, N. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. For the recognition of terrain types, spatial signatures are developed from the diffraction patterns of small areas of ERTS-1 images. This knowledge is exploited for the measurement of a small number of meaningful spatial features from the digital Fourier transforms of ERTS-1 image cells containing 32 x 32 picture elements. Using these spatial features and a heuristic algorithm, the terrain types in the vicinity of Phoenix, Arizona, were recognized by the computer with high accuracy. When the spatial features were combined with spectral features under the maximum likelihood criterion, the recognition accuracy for terrain types increased substantially. It was determined that the recognition accuracy with the maximum likelihood criterion depends on the statistics of the feature vectors: nonlinear transformations of the feature vectors are required so that the terrain class statistics become approximately Gaussian. It was also determined that, for a given geographic area, the statistics of the classes remain invariable for a period of a month but vary substantially between seasons.
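
    A sketch of this kind of pipeline: ring-averaged (log) Fourier power serves as the spatial feature vector, and classification uses a Gaussian maximum-likelihood rule; the log transform plays the role of the Gaussianizing nonlinear transformation the abstract mentions. The cells below are simulated noise and all parameters are illustrative.

```python
# Fourier spatial features of 32x32 cells + Gaussian maximum-likelihood rule.
import numpy as np

def spatial_features(cell, n_rings=4):
    """Log of ring-averaged Fourier power (log ~ Gaussianizing transform)."""
    F = np.abs(np.fft.fftshift(np.fft.fft2(cell))) ** 2
    yy, xx = np.indices(cell.shape)
    r = np.hypot(yy - cell.shape[0] // 2, xx - cell.shape[1] // 2)
    edges = np.linspace(0.0, cell.shape[0] // 2, n_rings + 1)
    rings = [F[(r >= lo) & (r < hi)].mean()
             for lo, hi in zip(edges[:-1], edges[1:])]
    return np.log(np.array(rings) + 1e-12)

def gaussian_ml_classify(x, means, covs):
    """Assign x to the class with the largest Gaussian log-likelihood."""
    lls = []
    for mu, S in zip(means, covs):
        d = x - mu
        _, logdet = np.linalg.slogdet(S)
        lls.append(-0.5 * (d @ np.linalg.solve(S, d) + logdet))
    return int(np.argmax(lls))

# Toy demo: two texture classes that differ in noise scale.
rng = np.random.default_rng(7)
feats = [np.array([spatial_features(rng.normal(scale=s, size=(32, 32)))
                   for _ in range(50)]) for s in (1.0, 3.0)]
means = [f.mean(axis=0) for f in feats]
covs = [np.cov(f.T) for f in feats]
test = spatial_features(rng.normal(scale=3.0, size=(32, 32)))
print("assigned class:", gaussian_ml_classify(test, means, covs))
```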

  15. Maximum likelihood sequence estimation for optical complex direct modulation.

    PubMed

    Che, Di; Yuan, Feng; Shieh, William

    2017-04-17

    Semiconductor lasers are versatile optical transmitters in nature. Through direct modulation (DM), intensity modulation is realized by the linear mapping between the injection current and the light power, while various angle modulations are enabled by the frequency chirp. Limited by direct detection, DM lasers used to be exploited only as 1-D (intensity or angle) transmitters by suppressing or simply ignoring the other modulation. Nevertheless, through digital coherent detection, simultaneous intensity and angle modulation (namely, 2-D complex DM, CDM) can be realized by a single laser diode. The crucial technique of CDM is the joint demodulation of intensity and differential phase with maximum likelihood sequence estimation (MLSE), supported by a closed-form discrete signal approximation of the frequency chirp to characterize the MLSE transition probability. This paper proposes a statistical method for the transition probability that significantly enhances the accuracy of the chirp model. Using the statistical estimation, we demonstrate the first single-channel 100-Gb/s PAM-4 transmission over 1600-km fiber with only 10G-class DM lasers.
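
    A generic MLSE sketch: a Viterbi search over a memory-one trellis, with a plain Gaussian branch metric on an assumed two-tap channel. In the paper the transition probability instead comes from the statistical chirp model; the channel, noise level, and data below are illustrative.

```python
# Viterbi-based MLSE for PAM-4 over a toy memory-one channel.
import numpy as np

SYMS = np.array([-3.0, -1.0, 1.0, 3.0])           # PAM-4 levels

def mlse_viterbi(rx, h=(1.0, 0.4), sigma=0.3):
    """Most likely symbol sequence under a Gaussian branch metric."""
    n, S = len(rx), len(SYMS)
    cost = np.zeros(S)                            # best cost per trellis state
    back = np.zeros((n, S), dtype=int)            # backpointers
    for t in range(n):
        new = np.empty(S)
        for cur in range(S):
            pred = h[0] * SYMS[cur] + h[1] * SYMS   # one value per prev symbol
            metric = cost + (rx[t] - pred) ** 2 / (2 * sigma**2)
            back[t, cur] = int(np.argmin(metric))
            new[cur] = metric[back[t, cur]]
        cost = new
    path = [int(np.argmin(cost))]                 # trace back the best path
    for t in range(n - 1, 0, -1):
        path.append(back[t, path[-1]])
    return SYMS[path[::-1]]

rng = np.random.default_rng(4)
tx = SYMS[rng.integers(0, 4, 200)]
rx = tx + 0.4 * np.concatenate([[0.0], tx[:-1]]) + rng.normal(0, 0.3, 200)
print("symbol error rate:", np.mean(mlse_viterbi(rx) != tx))
```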

  16. Mapping chemicals in air using an environmental CAT scanning system: evaluation of algorithms

    NASA Astrophysics Data System (ADS)

    Samanta, A.; Todd, L. A.

    A new technique is being developed that creates near real-time maps of chemical concentrations in air for environmental and occupational applications. This technique, which we call Environmental CAT Scanning, combines the real-time measurement technique of open-path Fourier transform infrared spectroscopy with the mapping capabilities of computed tomography to produce two-dimensional concentration maps. With this system, a network of open-path measurements is obtained over an area; the measurements are then processed using a tomographic algorithm to reconstruct the concentrations. This research focused on the process of evaluating and selecting appropriate reconstruction algorithms, for use in the field, by using test concentration data from both computer simulation and laboratory chamber studies. Four algorithms were tested using three types of data: (1) experimental open-path data from studies that used a prototype open-path Fourier transform/computed tomography system in an exposure chamber; (2) synthetic open-path data generated from maps created by kriging point samples taken in the chamber studies (in 1); and (3) synthetic open-path data generated using a chemical dispersion model to create time-series maps. The iterative algorithms used to reconstruct the concentration data were: Algebraic Reconstruction Technique without Weights (ART1), Algebraic Reconstruction Technique with Weights (ARTW), Maximum Likelihood with Expectation Maximization (MLEM) and Multiplicative Algebraic Reconstruction Technique (MART). Maps were evaluated quantitatively and qualitatively. In general, MART and MLEM performed best, followed by ARTW and ART1; however, algorithm performance varied under different contaminant scenarios. This study showed the importance of using a variety of maps, particularly those generated using dispersion models. The time-series maps provided a more rigorous test of the algorithms and allowed distinctions to be made among the algorithms. A comprehensive evaluation of algorithms for the environmental application of tomography requires the use of a battery of test concentration data that models reality and tests the limits of the algorithms before field implementation.
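
    Of the four algorithms compared, additive ART reduces to Kaczmarz sweeps over the beam-path equations, as in the toy sketch below; each row of A holds the path lengths of one open-path measurement through the grid of concentration pixels, and the data are simulated.

```python
# Additive ART (Kaczmarz) reconstruction of pixel concentrations from
# simulated open-path measurements.
import numpy as np

def art(A, y, n_sweeps=50, relax=0.5):
    """ART iterations with nonnegativity clipping."""
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            if row_norms[i] > 0:
                x += relax * (y[i] - A[i] @ x) / row_norms[i] * A[i]
        x = np.clip(x, 0.0, None)   # concentrations cannot be negative
    return x

rng = np.random.default_rng(5)
A = (rng.random((40, 25)) < 0.3) * rng.random((40, 25))  # toy path lengths
x_true = rng.gamma(2.0, 1.0, 25)
x_hat = art(A, A @ x_true)
print("relative error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```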

  17. The National Map Pilot Projects

    USGS Publications Warehouse

    ,

    2002-01-01

    The U.S. Geological Survey (USGS) is developing The National Map to be a seamless, continuously maintained, and nationally consistent set of online, public domain, geographic base information. The National Map will serve as a foundation for integrating, sharing, and using other government and private sector data easily and consistently.

  18. Minimalism through intraoperative functional mapping.

    PubMed

    Berger, M S

    1996-01-01

    Intraoperative stimulation mapping may be used to avoid unnecessary risk to functional regions subserving language and sensori-motor pathways. Based on the data presented here, language localization is variable across the population, with certainty existing only for the inferior frontal region responsible for motor speech. Anatomical landmarks such as the anterior temporal tip for temporal lobe language sites and the posterior aspect of the lateral sphenoid wing for the frontal lobe language zones are unreliable in avoiding postoperative aphasias. Thus, individual mapping to identify essential language sites has the greatest likelihood of avoiding permanent deficits in naming, reading, and motor speech. In a similar approach, motor and sensory pathways from the cortex and underlying white matter may be reliably stimulated and mapped in both awake and asleep patients. Although these techniques require additional operative time and nominally priced equipment, the result is often gratifying, as postoperative morbidity has been greatly reduced in the process of incorporating these surgical strategies. The patient's quality of life is improved in terms of seizure control, with or without antiepileptic drugs. This avoids having to perform a second costly operative procedure, which is routinely done when extraoperative stimulation and recording is performed via subdural grids. In addition, an aggressive tumor resection at the initial operation lengthens the time to tumor recurrence and often obviates the need for a subsequent reoperation. Thus, intraoperative functional mapping may best be described as a surgical technique that results in "minimalism in the long term".

  19. Soil Salinity Mapping in Everglades National Park Using Remote Sensing Techniques

    NASA Astrophysics Data System (ADS)

    Su, H.; Khadim, F. K.; Blankenship, J.; Sobhan, K.

    2017-12-01

    The South Florida Everglades is a vast subtropical wetland with a globally unique hydrology and ecology, and it is designated as an International Biosphere Reserve and a Wetland of International Importance. Everglades National Park (ENP) is a hydro-ecologically enriched wetland with varying salinity contents, which is a concern for terrestrial ecosystem balance and sustainability. As such, in this study, time-series soil salinity mapping was carried out for the ENP area. The mapping first entailed a maximum likelihood classification of seven land cover classes for the ENP area—namely mangrove forest, mangrove scrub, low-density forest, sawgrass, prairies and marshes, barren lands with woodland hammock, and water—for the years 1996, 2000, 2006, 2010 and 2015. The classifications for 1996-2010 yielded accuracies of 82%-94%, and the 2015 classification was supported through ground truthing. Afterwards, electric conductivity (EC) tolerance thresholds for each vegetation class were established, which yielded soil salinity maps comprising four soil salinity classes: the non- (EC = 0-2 dS/m), low- (EC = 2-4 dS/m), moderate- (EC = 4-8 dS/m) and high-saline (EC > 8 dS/m) areas. The soil salinity maps visualized the spatial distribution of soil salinity with no significant temporal variations. The innovative approach of "land cover identification to salinity estimation" used in the study is pragmatic and application-oriented, and the study findings are also useful, considering the diversifying ecological context of the ENP area.
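
    Binning per-pixel electrical conductivity into these four classes is a one-liner with np.digitize, as sketched below; the EC grid here is random, whereas in the study EC follows from the vegetation-class tolerances.

```python
# Classify an EC grid (dS/m) into the study's four salinity classes.
import numpy as np

EDGES = [2.0, 4.0, 8.0]                      # class boundaries in dS/m
LABELS = ["non-saline", "low", "moderate", "high"]

ec = np.random.default_rng(6).gamma(2.0, 2.0, size=(4, 4))  # toy EC grid
classes = np.digitize(ec, EDGES)             # 0..3 per pixel
for row in classes:
    print([LABELS[c] for c in row])
```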

  20. Elements of the patient-centered medical home associated with health outcomes among veterans: the role of primary care continuity, expanded access, and care coordination.

    PubMed

    Nelson, Karin; Sun, Haili; Dolan, Emily; Maynard, Charles; Beste, Lauren; Bryson, Christopher; Schectman, Gordon; Fihn, Stephan D

    2014-01-01

    Care continuity, access, and coordination are important features of the patient-centered medical home model and have been emphasized in the Veterans Health Administration patient-centered medical home implementation, called the Patient Aligned Care Team. Data from more than 4.3 million Veterans were used to assess the relationship between these attributes of Patient Aligned Care Team and Veterans Health Administration hospitalization and mortality. Controlling for demographics and comorbidity, we found that continuity with a primary care provider was associated with a lower likelihood of hospitalization and mortality among a large population of Veterans receiving VA primary care.

  1. Impacts of the Interim Federal Health Program reforms: A stakeholder analysis of barriers to health care access and provision for refugees.

    PubMed

    Antonipillai, Valentina; Baumann, Andrea; Hunter, Andrea; Wahoush, Olive; O'Shea, Timothy

    2017-11-09

    Changes to the Interim Federal Health Program (IFHP) in 2012 reduced health care access for refugees and refugee claimants, generating concerns among key stakeholders. In 2014, a new IFHP temporarily reinstated access to some health services; however, little is known about these changes, and more information is needed to map the IFHP's impact. This study explores barriers occurring during the time period of the IFHP reforms to health care access and provision for refugees. A stakeholder analysis, using 23 semi-structured interviews, was conducted to obtain insight into stakeholder perceptions of the 2014 reforms, as well as stakeholders' position and their influence to assess the acceptability of the IFHP changes. The majority of stakeholders expressed concerns about the 2014 IFHP changes as a result of the continuing barriers posed by the 2012 retrenchments and the emergence of new barriers to health care access and provision for refugees. Key barriers identified included lack of communication and awareness, lack of continuity and comprehensive care, negative political discourse and increased costs. A few stakeholders supported the reforms as they represented some, but limited, access to health care. Overall, the reforms to the IFHP in 2014 generated barriers to health care access and provision that contributed to confusion among stakeholders, the transfer of refugee health responsibility to provincial authorities and the likelihood of increased health outcome disparities, as refugees and refugee claimants chose to delay seeking health care. The study recommends that policy-makers engage with refugee health stakeholders to formulate a policy that improves health care provision and access for refugee populations.

  2. The World Karst Aquifer Mapping project: concept, mapping procedure and map of Europe

    NASA Astrophysics Data System (ADS)

    Chen, Zhao; Auler, Augusto S.; Bakalowicz, Michel; Drew, David; Griger, Franziska; Hartmann, Jens; Jiang, Guanghui; Moosdorf, Nils; Richts, Andrea; Stevanovic, Zoran; Veni, George; Goldscheider, Nico

    2017-05-01

    Karst aquifers contribute substantially to freshwater supplies in many regions of the world, but are vulnerable to contamination and difficult to manage because of their unique hydrogeological characteristics. Many karst systems are hydraulically connected over wide areas and require transboundary exploration, protection and management. In order to obtain a better global overview of karst aquifers, to create a basis for sustainable international water-resources management, and to increase the awareness in the public and among decision makers, the World Karst Aquifer Mapping (WOKAM) project was established. The goal is to create a world map and database of karst aquifers, as a further development of earlier maps. This paper presents the basic concepts and the detailed mapping procedure, using France as an example to illustrate the step-by-step workflow, which includes generalization, differentiation of continuous and discontinuous carbonate and evaporite rock areas, and the identification of non-exposed karst aquifers. The map also shows selected caves and karst springs, which are collected in an associated global database. The draft karst aquifer map of Europe shows that 21.6% of the European land surface is characterized by the presence of (continuous or discontinuous) carbonate rocks; about 13.8% of the land surface is carbonate rock outcrop.

  3. Performance of Low-Density Parity-Check Coded Modulation

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon

    2010-01-01

    This paper reports the simulated performance of each of the nine accumulate-repeat-4-jagged-accumulate (AR4JA) low-density parity-check (LDPC) codes [3] when used in conjunction with binary phase-shift-keying (BPSK), quadrature PSK (QPSK), 8-PSK, 16-ary amplitude PSK (16-APSK), and 32-APSK. We also report the performance under various mappings of bits to modulation symbols, 16-APSK and 32-APSK ring scalings, log-likelihood ratio (LLR) approximations, and decoder variations. One of the simple and well-performing LLR approximations can be expressed in a general equation that applies to all of the modulation types.
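
    One simple, commonly used approximation of this kind is the max-log LLR rule, sketched below for an arbitrary labeled constellation; treating it as equivalent to the paper's general equation is an assumption.

```python
# Max-log bit LLRs for an arbitrary constellation with bit labels.
import numpy as np

def maxlog_llr(r, points, bits, noise_var):
    """LLR per bit position for received symbol r; bits[k] labels points[k]."""
    d2 = np.abs(r - points) ** 2
    llrs = []
    for b in range(bits.shape[1]):
        d0 = d2[bits[:, b] == 0].min()       # nearest symbol with bit b = 0
        d1 = d2[bits[:, b] == 1].min()       # nearest symbol with bit b = 1
        llrs.append((d1 - d0) / noise_var)   # positive values favor bit 0
    return np.array(llrs)

# Usage example: Gray-labeled QPSK.
pts = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]) / np.sqrt(2)
labels = np.array([[0, 0], [0, 1], [1, 1], [1, 0]])
print(maxlog_llr(0.9 + 0.2j, pts, labels, noise_var=0.5))
```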

  4. Earthquake outlook for the San Francisco Bay region 2014–2043

    USGS Publications Warehouse

    Aagaard, Brad T.; Blair, James Luke; Boatwright, John; Garcia, Susan H.; Harris, Ruth A.; Michael, Andrew J.; Schwartz, David P.; DiLeo, Jeanne S.; Jacques, Kate; Donlin, Carolyn

    2016-06-13

    Using information from recent earthquakes, improved mapping of active faults, and a new model for estimating earthquake probabilities, the 2014 Working Group on California Earthquake Probabilities updated the 30-year earthquake forecast for California. They concluded that there is a 72 percent probability (or likelihood) of at least one earthquake of magnitude 6.7 or greater striking somewhere in the San Francisco Bay region before 2043. Earthquakes this large are capable of causing widespread damage; therefore, communities in the region should take simple steps to help reduce injuries, damage, and disruption, as well as accelerate recovery from these earthquakes.

  5. The changing of coastal landform at Chikou barrier island and lagoon coast, Tainan, Southwestern Taiwan

    NASA Astrophysics Data System (ADS)

    Jen, C.-H.; Chyi, S.-J.; Hsiao, L.-L.; Wu, M.-S.; Lei, H.-F.

    2012-04-01

    The coast of southwestern Taiwan is mainly made of barriers and lagoons, which are prone to erosional and depositional processes. Using serial maps, historical survey data, and RTK-GPS survey data, the changes in coastal landforms are depicted. The maps used in this study include (1) a 1904 map (1:50000 scale), (2) a 1920 map (1:50000 scale), (3) a 1921 map (1:25000 scale), (4) a 1924 map (1:25000 scale), (5) a 1956 map (1:25000 scale), (6) a 1975 map with ortho-rectified image (1:5000 scale), (7) a 1983 map with ortho-rectified image (1:5000 scale), (8) a 1989 map with ortho-rectified image (1:5000 scale), (9) a 1992 map with ortho-rectified image (1:5000 scale), and (10) a 2001 map with ortho-rectified image (1:5000 scale). All maps were scanned and georeferenced to build a GIS archive for digitizing and further analysis. The results show that this coast was made of continuous sand barriers and lagoons. While the lagoons gradually shrank, the sand barriers remained stable from 1904 to 1924. After that, the lagoons filled substantially in the southern part and the sand barriers migrated landward. In the 1975 map, the lagoons have vanished, replaced by a tidal flat and tidal creeks. The subsequent maps show that lagoons began to form again while the sand barriers continued to move landward, a significant sign of serious coastal erosion. The RTK-GPS survey data of recent years show the coastal erosion and landform changes in more detail. The post-typhoon investigations show that the seaward side of the barrier island was heavily eroded, especially along two segments of its central part. Some deposition was found on top of the northern and central parts of the barrier dune, as well as washovers. On the southern barrier island, sediments were carried to the backshore and trapped in front of the bamboo piles and marine solid bags. Two months after Typhoon Megi, the survey indicated that the areas eroded by storm surge were gradually accreting, except for the beaches separated by plastic sheet piles and marine solid bags, especially in the north of the northern section. In late February 2011 there was some deposition on top of the primary dune, the backdune, and the tidal flat, but the parts of the seaward beach that waves can reach were continuously eroded; the central segment of the barrier island is the most vulnerable. In particular, the rear part of the southern beach accreted, in connection with alongshore current transport. In the late winter monsoon season, elevation changes were smaller than in mid-season, corresponding with the wave conditions, and the rear part of the southern section began to erode, its sediments possibly carried away by the southward current. Area A, located in the central part of the barrier island, was continuously attacked by waves; its dune elevation decreased steadily and it was frequently overwashed. Keywords: sand barrier and lagoon coast, archive map analysis, RTK-GPS survey, overwash

  6. Intention of Continuing to use the Hospital Information System: Integrating the elaboration-likelihood, social influence and cognitive learning.

    PubMed

    Farzandipour, Mehrdad; Mohamadian, Hashem; Sohrabi, Niloufar

    2016-12-01

    Anticipating the factors that drive information-system acceptance through persuasive messages is one of the issues that has received little attention so far. This is one of the first attempts at using the elaboration-likelihood model, combined with perceptions of emotional, cognitive, self-efficacy, informational and normative influence constructs, to investigate the determinants of intention to continue using the hospital information system in Iran. The present study is a cross-sectional survey conducted in 2014. 600 nursing staff were chosen from clinical sectors of public hospitals using purposive sampling. The questionnaire survey was in two parts: part one comprised demographic data, and part two included 52 questions pertaining to the constructs of the study model. To analyze the data, a structural equation model was fitted using LISREL 8.5. The findings suggest that self-efficacy (t = 6.01, β = 0.21), affective response (t = 5.84, β = 0.23), and cognitive response (t = 4.97, β = 0.21) explained 64% of the variance in the intention to continue using the hospital information system. Furthermore, the final model explained 46% of the variance in self-efficacy, 44% in normative social influence, 52% in affective response, 55% in informational social influence, and 53% in cognitive response. Designing the necessary mechanisms and making effective use of appropriate strategies to improve the emotional and cognitive understanding and self-efficacy of the nursing staff are required in order to increase the intention of continued use of the hospital information system in Iran.

  7. Dose-finding designs using a novel quasi-continuous endpoint for multiple toxicities

    PubMed Central

    Ezzalfani, Monia; Zohar, Sarah; Qin, Rui; Mandrekar, Sumithra J; Le Deley, Marie-Cécile

    2013-01-01

    The aim of a phase I oncology trial is to identify a dose with an acceptable safety profile. Most phase I designs use the dose-limiting toxicity, a binary endpoint, to assess the unacceptable level of toxicity. The dose-limiting toxicity might be incomplete for investigating molecularly targeted therapies as much useful toxicity information is discarded. In this work, we propose a quasi-continuous toxicity score, the total toxicity profile (TTP), to measure quantitatively and comprehensively the overall severity of multiple toxicities. We define the TTP as the Euclidean norm of the weights of toxicities experienced by a patient, where the weights reflect the relative clinical importance of each grade and toxicity type. We propose a dose-finding design, the quasi-likelihood continual reassessment method (CRM), incorporating the TTP score into the CRM, with a logistic model for the dose–toxicity relationship in a frequentist framework. Using simulations, we compared our design with three existing designs for quasi-continuous toxicity score (the Bayesian quasi-CRM with an empiric model and two nonparametric designs), all using the TTP score, under eight different scenarios. All designs using the TTP score to identify the recommended dose had good performance characteristics for most scenarios, with good overdosing control. For a sample size of 36, the percentage of correct selection for the quasi-likelihood CRM ranged from 80% to 90%, with similar results for the quasi-CRM design. These designs with TTP score present an appealing alternative to the conventional dose-finding designs, especially in the context of molecularly targeted agents. PMID:23335156
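
    The TTP itself follows directly from its definition as a Euclidean norm of toxicity weights, as in the sketch below; the weight table is invented for illustration.

```python
# Total toxicity profile (TTP) as the Euclidean norm of per-toxicity weights.
import numpy as np

# weight[toxicity_type][grade] -- illustrative values only
WEIGHTS = {"neutropenia": {1: 0.1, 2: 0.3, 3: 0.8, 4: 1.5},
           "neuropathy":  {1: 0.2, 2: 0.5, 3: 1.2, 4: 2.0}}

def ttp(observed):
    """observed: list of (toxicity_type, grade) pairs for one patient."""
    w = np.array([WEIGHTS[t][g] for t, g in observed])
    return float(np.linalg.norm(w))

print(ttp([("neutropenia", 3), ("neuropathy", 2)]))  # sqrt(0.8^2 + 0.5^2)
```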

  8. Continuing Long Term Optical and Infrared Reverberation Mapping of 17 Sloan Digital Sky Survey Quasars

    NASA Astrophysics Data System (ADS)

    Gorjian, Varoujan; Barth, Aaron; Brandt, Niel; Dawson, Kyle; Green, Paul; Ho, Luis; Horne, Keith; Jiang, Linhua; McGreer, Ian; Schneider, Donald; Shen, Yue; Tao, Charling

    2018-05-01

    Previous Spitzer reverberation monitoring projects searching for UV/optical light absorbed and re-emitted in the IR by dust have been limited to low-luminosity active galactic nuclei (AGN) that could potentially show reverberation within a single cycle (~1 year). Cycle 11-12's two-year baseline allowed for the reverberation mapping of 17 high-luminosity quasars from the Sloan Digital Sky Survey Reverberation Mapping project. We continued this monitoring in Cycle 13 and now propose to extend this program in Cycle 14. By combining ground-based monitoring from Pan-STARRS, CFHT, and Steward Observatory telescopes with Spitzer data we have for the first time detected dust reverberation in quasars. By continuing observations with this unique combination of resources we should detect reverberation in more objects and reduce the uncertainties for the remaining sources.

  9. Grids in topographic maps reduce distortions in the recall of learned object locations.

    PubMed

    Edler, Dennis; Bestgen, Anne-Kathrin; Kuchinke, Lars; Dickmann, Frank

    2014-01-01

    To date, it has been shown that cognitive map representations based on cartographic visualisations are systematically distorted. The grid is a traditional element of map graphics that has rarely been considered in research on perception-based spatial distortions. Grids not only support the map reader in finding coordinates or locations of objects; they also provide a systematic structure for clustering visual map information ("spatial chunks"). The aim of this study was to examine whether different cartographic kinds of grids reduce spatial distortions and improve recall memory for object locations. Recall performance was measured as both the percentage of correctly recalled objects (hit rate) and the mean distance errors of correctly recalled objects (spatial accuracy). Different kinds of grids (continuous lines, dashed lines, crosses) were applied to topographic maps. These maps were also varied in their type of characteristic areas (LANDSCAPE) and different information layer compositions (DENSITY) to examine the effects of map complexity. The study, involving 144 participants, shows that all experimental cartographic factors (GRID, LANDSCAPE, DENSITY) improve recall performance and spatial accuracy of learned object locations. Overlaying a topographic map with a grid significantly reduces the mean distance errors of correctly recalled map objects. The paper includes a discussion of a square grid's usefulness concerning object location memory, independent of whether the grid is clearly visible (continuous or dashed lines) or only indicated by crosses.

  10. Digital Mapping Techniques '11–12 workshop proceedings

    USGS Publications Warehouse

    Soller, David R.

    2014-01-01

    At these meetings, oral and poster presentations and special discussion sessions emphasized: (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase formats; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  11. Planck 2013 results. XXVI. Background geometry and topology of the Universe

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R. C.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Chiang, L.-Y.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Enßlin, T. A.; Eriksen, H. K.; Fabre, O.; Finelli, F.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Héraud, Y.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Laureijs, R. J.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; Leroy, C.; Lesgourgues, J.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; McEwen, J. D.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Pasian, F.; Patanchon, G.; Peiris, H. V.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pogosyan, D.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Riazuelo, A.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-11-01

    The new cosmic microwave background (CMB) temperature maps from Planck provide the highest-quality full-sky view of the surface of last scattering available to date. This allows us to detect possible departures from the standard model of a globally homogeneous and isotropic cosmology on the largest scales. We search for correlations induced by a possible non-trivial topology with a fundamental domain intersecting, or nearly intersecting, the last scattering surface (at comoving distance χrec), both via a direct search for matched circular patterns at the intersections and by an optimal likelihood search for specific topologies. For the latter we consider flat spaces with cubic toroidal (T3), equal-sided chimney (T2) and slab (T1) topologies, three multi-connected spaces of constant positive curvature (dodecahedral, truncated cube and octahedral) and two compact negative-curvature spaces. These searches yield no detection of a compact topology with a scale below the diameter of the last scattering surface. For most compact topologies studied, the likelihood maximized over the orientation of the space relative to the observed map shows some preference for multi-connected models just larger than the diameter of the last scattering surface. Since this effect is also present in simulated realizations of isotropic maps, we interpret it as the inevitable alignment of mild anisotropic correlations with chance features in a single sky realization; such a feature can also be present, in milder form, when the likelihood is marginalized over orientations. Thus marginalized, the limits on the radius ℛi of the largest sphere inscribed in the topological domain (at log-likelihood ratio Δln ℒ > -5 relative to a simply-connected flat Planck best-fit model) are: in a flat Universe, ℛi > 0.92χrec for the T3 cubic torus; ℛi > 0.71χrec for the T2 chimney; ℛi > 0.50χrec for the T1 slab; and in a positively curved Universe, ℛi > 1.03χrec for the dodecahedral space; ℛi > 1.0χrec for the truncated cube; and ℛi > 0.89χrec for the octahedral space. The limit for a wider class of topologies, i.e., those predicting matching pairs of back-to-back circles, among them the tori and the three spherical cases listed above, coming from the matched-circles search, is ℛi > 0.94χrec at the 99% confidence level. Similar limits apply to a wide, although not exhaustive, range of topologies. We also perform a Bayesian search for an anisotropic global Bianchi VIIh geometry. In the non-physical setting where the Bianchi cosmology is decoupled from the standard cosmology, Planck data favour the inclusion of a Bianchi component with a Bayes factor of at least 1.5 units of log-evidence. Indeed, the Bianchi pattern is quite efficient at accounting for some of the large-scale anomalies found in Planck data. However, the cosmological parameters that generate this pattern are in strong disagreement with those found from CMB anisotropy data alone. In the physically motivated setting where the Bianchi parameters are coupled to and fitted simultaneously with the standard cosmological parameters, we find no evidence for a Bianchi VIIh cosmology and constrain the vorticity of such models to (ω/H)0 < 8.1 × 10^-10 (95% confidence level).

  12. Continuous family of finite-dimensional representations of a solvable Lie algebra arising from singularities

    PubMed Central

    Yau, Stephen S.-T.

    1983-01-01

    A natural mapping from the set of complex analytic isolated hypersurface singularities to the set of finite-dimensional Lie algebras is first defined. It is proven that the image under this natural mapping is contained in the set of solvable Lie algebras. This approach gives rise to a continuous family of inequivalent finite-dimensional representations of a solvable Lie algebra. PMID:16593401

  13. Blind beam-hardening correction from Poisson measurements

    NASA Astrophysics Data System (ADS)

    Gu, Renliang; Dogandžić, Aleksandar

    2016-02-01

    We develop a sparse image reconstruction method for Poisson-distributed polychromatic X-ray computed tomography (CT) measurements under the blind scenario where the material of the inspected object and the incident energy spectrum are unknown. We employ our mass-attenuation spectrum parameterization of the noiseless measurements and express the mass-attenuation spectrum as a linear combination of B-spline basis functions of order one. A block coordinate-descent algorithm is developed for constrained minimization of a penalized Poisson negative log-likelihood (NLL) cost function, where constraints and penalty terms ensure nonnegativity of the spline coefficients and nonnegativity and sparsity of the density map image; the image sparsity is imposed using a convex total-variation (TV) norm penalty term. This algorithm alternates between a Nesterov proximal-gradient (NPG) step for estimating the density map image and a limited-memory Broyden-Fletcher-Goldfarb-Shanno with box constraints (L-BFGS-B) step for estimating the incident-spectrum parameters. To accelerate convergence of the density-map NPG steps, we apply function restart and a step-size selection scheme that accounts for varying local Lipschitz constants of the Poisson NLL. Real X-ray CT reconstruction examples demonstrate the performance of the proposed scheme.
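
    As a rough illustration of the L-BFGS-B ingredient, the sketch below minimizes a plain Poisson NLL for a generic linear model under nonnegativity box constraints with SciPy. It is a toy stand-in only: the random system matrix replaces the mass-attenuation parameterization, and the TV penalty, spline constraints, and NPG steps of the actual block coordinate-descent algorithm are omitted:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    A = rng.uniform(0, 1, size=(80, 20))      # toy system matrix (stand-in for the CT projector)
    x_true = np.abs(rng.normal(1.0, 0.5, 20))
    y = rng.poisson(A @ x_true)               # Poisson-distributed measurements

    def poisson_nll(x):
        mu = A @ x + 1e-9                     # mean counts; small offset avoids log(0)
        return np.sum(mu - y * np.log(mu))

    res = minimize(poisson_nll, x0=np.ones(20), method="L-BFGS-B",
                   bounds=[(0, None)] * 20)   # box constraints enforce nonnegativity
    print(res.x.round(2))
    ```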

  14. System dynamic modelling to assess economic viability and risk trade-offs for ecological restoration in South Africa.

    PubMed

    Crookes, D J; Blignaut, J N; de Wit, M P; Esler, K J; Le Maitre, D C; Milton, S J; Mitchell, S A; Cloete, J; de Abreu, P; Fourie nee Vlok, H; Gull, K; Marx, D; Mugido, W; Ndhlovu, T; Nowell, M; Pauw, M; Rebelo, A

    2013-05-15

    Can markets assist by providing support for ecological restoration, and if so, under what conditions? The first step in addressing this question is to develop a consistent methodology for the economic evaluation of ecological restoration projects. A risk analysis process was followed in which a system dynamics model was constructed for eight diverse case study sites where ecological restoration is currently being pursued. Restoration costs vary across each of these sites, as do the benefits associated with restored ecosystem functioning. The system dynamics model simulates the ecological, hydrological and economic benefits of ecological restoration and informs a portfolio mapping exercise where payoffs are matched against the likelihood of success of a project, as well as a number of other factors (such as project costs and risk measures). This is the first known application that couples ecological restoration with system dynamics and portfolio mapping. The results suggest an approach that is able to move beyond traditional indicators of project success, since the effect of discounting is virtually eliminated. We conclude that system dynamics modelling with portfolio mapping can guide decisions on when markets for restoration activities may be feasible. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Intermediate-scale vegetation mapping of Innoko National Wildlife Refuge, Alaska using Landsat MSS digital data

    USGS Publications Warehouse

    Talbot, Stephen S.; Markon, Carl J.

    1988-01-01

    A Landsat-derived vegetation map was prepared for Innoko National Wildlife Refuge. The refuge lies within the northern boreal subzone of northwestern central Alaska. Six major vegetation classes and 21 subclasses were recognized: forest (closed needleleaf, open needleleaf, needleleaf woodland, mixed, and broadleaf); broadleaf scrub (lowland, upland burn regeneration, subalpine); dwarf scrub (prostrate dwarf shrub tundra, erect dwarf shrub heath, dwarf shrub-graminoid peatland, dwarf shrub-graminoid tussock peatland, dwarf shrub raised bog with scattered trees, dwarf shrub-graminoid marsh); herbaceous (graminoid bog, graminoid marsh, graminoid tussock-dwarf shrub peatland); scarcely vegetated areas (scarcely vegetated scree and floodplain); and water (clear, sedimented). The methodology employed a cluster-block technique. Sample areas were described based on a combination of helicopter-ground survey, aerial photo-interpretation, and digital Landsat data. Major steps in the Landsat analysis involved preprocessing (geometric correction), derivation of statistical parameters for spectral classes, spectral class labeling of sample areas, preliminary classification of the entire study area using a maximum-likelihood algorithm, and final classification utilizing ancillary information such as digital elevation data. The final product is a 1:250,000-scale vegetation map representative of distinctive regional patterns and suitable for use in comprehensive conservation planning.
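
    The maximum-likelihood step mentioned above assigns each pixel to the spectral class whose Gaussian model (per-class mean and covariance) gives it the highest likelihood. A minimal sketch, with made-up training pixels standing in for the cluster-block samples:

    ```python
    import numpy as np

    def train_ml_classifier(X, y):
        """Per-class mean and covariance for a Gaussian maximum-likelihood classifier."""
        stats = {}
        for c in np.unique(y):
            Xc = X[y == c]
            stats[c] = (Xc.mean(axis=0), np.cov(Xc, rowvar=False))
        return stats

    def classify(stats, pixels):
        """Assign each pixel to the class maximizing the Gaussian log-likelihood."""
        scores, classes = [], list(stats)
        for c, (mu, cov) in stats.items():
            d = pixels - mu
            inv = np.linalg.inv(cov)
            sign, logdet = np.linalg.slogdet(cov)
            ll = -0.5 * (np.einsum("ij,jk,ik->i", d, inv, d) + logdet)
            scores.append(ll)
        return np.array(classes)[np.argmax(scores, axis=0)]

    # Toy 4-band pixels for two invented classes
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(3, 1, (50, 4))])
    y = np.repeat([0, 1], 50)
    print(classify(train_ml_classifier(X, y), X[:5]))
    ```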

  16. Inventory and mapping of flood inundation using interactive digital image analysis techniques

    USGS Publications Warehouse

    Rohde, Wayne G.; Nelson, Charles A.; Taranik, J.V.

    1979-01-01

    LANDSAT digital data and color infra-red photographs were used in a multiphase sampling scheme to estimate the area of agricultural land affected by a flood. The LANDSAT data were classified with a maximum likelihood algorithm. Stratification of the LANDSAT data, prior to classification, greatly reduced misclassification errors. The classification results were used to prepare a map overlay showing the areal extent of flooding. These data also provided statistics required to estimate sample size in a two-phase sampling scheme, and provided quick, accurate estimates of areas flooded for the first phase. The measurements made in the second phase, based on ground data and photo-interpretation, were used with two-phase sampling statistics to estimate the area of agricultural land affected by flooding. These results show that LANDSAT digital data can be used to prepare map overlays showing the extent of flooding on agricultural land and, with two-phase sampling procedures, can provide acreage estimates with sampling errors of about 5 percent. This procedure provides a technique for rapidly assessing the areal extent of flood conditions on agricultural land and would provide a basis for designing a sampling framework to estimate the impact of flooding on crop production.
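
    A small sketch of the two-phase idea, under the assumption that a ratio estimator is used (the abstract does not specify the estimator): a large first-phase sample of classified flood areas is combined with a small second-phase subsample for which the ground/photo-interpreted truth is also measured. All numbers below are simulated:

    ```python
    import numpy as np

    def two_phase_ratio_estimate(x_phase1, x_phase2, y_phase2):
        """Double-sampling ratio estimator: Y_hat = mean(x1) * mean(y2) / mean(x2),
        where x is the cheap classified measurement and y the expensive truth."""
        return np.mean(x_phase1) * np.mean(y_phase2) / np.mean(x_phase2)

    rng = np.random.default_rng(5)
    x1 = rng.gamma(4.0, 25.0, 400)             # phase 1: classified flood areas per unit
    idx = rng.choice(400, 40, replace=False)   # phase 2: small subsample
    x2 = x1[idx]
    y2 = 0.9 * x2 + rng.normal(0, 5, 40)       # ground/photo-interpreted truth
    print(round(two_phase_ratio_estimate(x1, x2, y2), 1))
    ```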

  17. Mapping of epistatic quantitative trait loci in four-way crosses.

    PubMed

    He, Xiao-Hong; Qin, Hongde; Hu, Zhongli; Zhang, Tianzhen; Zhang, Yuan-Ming

    2011-01-01

    Four-way crosses (4WC) involving four different inbred lines often appear in plant and animal commercial breeding programs. Direct mapping of quantitative trait loci (QTL) in these commercial populations is both economical and practical. However, the existing statistical methods for mapping QTL in a 4WC population are built on a single-QTL genetic model. This simple genetic model fails to take into account QTL interactions, which play an important role in the genetic architecture of complex traits. In this paper, therefore, we attempted to develop a statistical method to detect epistatic QTL in a 4WC population. Conditional probabilities of QTL genotypes, computed by the multi-point single-locus method, were used to sample the genotypes of all putative QTL in the entire genome. The sampled genotypes were used to construct the design matrix for QTL effects. All QTL effects, including main and epistatic effects, were simultaneously estimated by the penalized maximum likelihood method. The proposed method was confirmed by a series of Monte Carlo simulation studies and real data analysis of cotton. The new method will provide novel tools for the genetic dissection of complex traits, construction of QTL networks, and analysis of heterosis.

  18. Classifying spatially heterogeneous wetland communities using machine learning algorithms and spectral and textural features.

    PubMed

    Szantoi, Zoltan; Escobedo, Francisco J; Abd-Elrahman, Amr; Pearlstine, Leonard; Dewitt, Bon; Smith, Scot

    2015-05-01

    Mapping of wetlands (marsh vs. swamp vs. upland) is a common remote sensing application. Yet, discriminating between similar freshwater communities such as graminoid/sedge from remotely sensed imagery is more difficult. Most of this activity has been performed using medium to low resolution imagery. There are only a few studies using high spatial resolution imagery and machine learning image classification algorithms for mapping heterogeneous wetland plant communities. This study addresses this void by analyzing whether machine learning classifiers such as decision trees (DT) and artificial neural networks (ANN) can accurately classify graminoid/sedge communities using high resolution aerial imagery and image texture data in the Everglades National Park, Florida. In addition to spectral bands, the normalized difference vegetation index, and first- and second-order texture features derived from the near-infrared band were analyzed. Classifier accuracies were assessed using confusion tables and the calculated kappa coefficients of the resulting maps. The results indicated that an ANN (multilayer perceptron based on backpropagation) algorithm produced a statistically significantly higher accuracy (82.04%) than the DT (QUEST) algorithm (80.48%) or the maximum likelihood (80.56%) classifier (α<0.05). Findings show that using multiple window sizes provided the best results. First-order texture features also provided computational advantages and results that were not significantly different from those using second-order texture features.
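
    The kappa coefficients used for the accuracy assessment can be computed directly from a confusion table; the sketch below uses an invented 3-class table, not the study's data:

    ```python
    import numpy as np

    def kappa(confusion):
        """Cohen's kappa from a confusion matrix (rows: reference, cols: predicted)."""
        cm = np.asarray(confusion, dtype=float)
        n = cm.sum()
        po = np.trace(cm) / n                    # observed agreement
        pe = (cm.sum(0) @ cm.sum(1)) / n ** 2    # agreement expected by chance
        return (po - pe) / (1 - pe)

    # Illustrative 3-class confusion table
    print(round(kappa([[50, 5, 2], [4, 45, 6], [3, 7, 40]]), 3))
    ```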

  19. Tests of Independence in Contingency Tables with Small Samples: A Comparison of Statistical Power.

    ERIC Educational Resources Information Center

    Parshall, Cynthia G.; Kromrey, Jeffrey D.

    1996-01-01

    Power and Type I error rates were estimated for contingency tables with small sample sizes for the following four types of tests: (1) Pearson's chi-square; (2) chi-square with Yates's continuity correction; (3) the likelihood ratio test; and (4) Fisher's Exact Test. Various marginal distributions, sample sizes, and effect sizes were examined. (SLD)
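
    All four tests are available in SciPy, so the comparison is easy to reproduce on a toy table; the 2x2 counts below are invented:

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency, fisher_exact

    table = np.array([[8, 2],
                      [1, 5]])  # illustrative small-sample 2x2 table

    chi2, p, dof, _ = chi2_contingency(table, correction=False)    # Pearson's chi-square
    chi2_y, p_y, _, _ = chi2_contingency(table, correction=True)   # Yates's continuity correction
    g, p_g, _, _ = chi2_contingency(table, correction=False,
                                    lambda_="log-likelihood")      # likelihood ratio (G) test
    odds, p_f = fisher_exact(table)                                # Fisher's Exact Test

    print(p, p_y, p_g, p_f)
    ```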

  20. Delayed mortality of eastern hardwoods after prescribed fire

    Treesearch

    Daniel A. Yaussy; Thomas A. Waldrop

    2010-01-01

    The Southern Appalachian Mountain and the Ohio Hills sites of the National Fire and Fire Surrogate Study are located in hardwood-dominated forests. Tree mortality was anticipated in the first year after burning, but it unexpectedly continued for up to 4 years. Survival analysis showed that the likelihood of mortality was related to prior tree...

  1. Utilization of Lean Methodology to Refine Hiring Practices in a Clinical Research Center Setting

    ERIC Educational Resources Information Center

    Johnson, Marcus R.; Bullard, A. Jasmine; Whitley, R. Lawrence

    2018-01-01

    Background & Aims: Lean methodology is a continuous process improvement approach that is used to identify and eliminate unnecessary steps (or waste) in a process. It increases the likelihood that the highest level of value possible is provided to the end-user, or customer, in the form of the product delivered through that process. Lean…

  2. Perceived Barriers and Facilitators to Participation in Physical Activity during the School Lunch Break for Girls Aged 12-13 Years

    ERIC Educational Resources Information Center

    Watson, Amanda; Eliott, Jaklin; Mehta, Kaye

    2015-01-01

    Given the short- and long-term health implications associated with overweight and obesity, plus the likelihood of overweight or obesity continuing into adulthood, addressing the causes of overweight and obesity in childhood is a significant public health concern. One underlying cause of overweight and obesity is insufficient physical activity. The…

  3. The impact of primary care on emergency department presentation and hospital admission with pneumonia: a case–control study of preschool-aged children

    PubMed Central

    Emery, Diane P; Milne, Tania; Gilchrist, Catherine A; Gibbons, Megan J; Robinson, Elizabeth; Coster, Gregor D; Forrest, Christopher B; Harnden, Anthony; Mant, David; Grant, Cameron C

    2015-01-01

    Background: In children, community-acquired pneumonia is a frequent cause of emergency department (ED) presentation and hospital admission. Quality primary care may prevent some of these hospital visits. Aims: The aim of this study was to identify primary care factors associated with ED presentation and hospital admission of preschool-aged children with community-acquired pneumonia. Methods: A case–control study was conducted by enrolling three groups: children presenting to the ED with pneumonia who were admitted (n=326) or discharged home (n=179), and well neighbourhood controls (n=351). Interviews with parents and primary care staff were conducted and health record review was performed. The association of primary care factors with ED presentation and hospital admission, controlling for available confounding factors, was determined using logistic regression. Results: Children were more likely to present to the ED with pneumonia if they did not have a usual general practitioner (GP) (odds ratio (OR)=2.50, 95% confidence interval (CI)=1.67–3.70), their GP worked ⩽20 h/week (OR=1.86, 95% CI=1.10–3.13) or their GP practice lacked an immunisation recall system (OR=5.44, 95% CI=2.26–13.09). Lower parent ratings for continuity (OR=1.63, 95% CI=1.01–2.62), communication (OR=2.01, 95% CI=1.29–3.14) and overall satisfaction (OR=2.16, 95% CI=1.34–3.47) increased the likelihood of ED presentation. Children were more likely to be admitted when antibiotics were prescribed in primary care (OR=2.50, 95% CI=1.43–4.55). Hospital admission was less likely if children did not have a usual GP (OR=0.22, 95% CI=0.11–0.40) or self-referred to the ED (OR=0.48, 95% CI=0.26–0.89). Conclusions: Accessible and continuous primary care is associated with a decreased likelihood of preschool-aged children with pneumonia presenting to the ED and an increased likelihood of hospital admission, implying more appropriate referral. Lower parental satisfaction is associated with an increased likelihood of ED presentation. PMID:25654661
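
    As a sketch of how such odds ratios are obtained, the example below fits a logistic regression with statsmodels on simulated data for a single made-up binary predictor (a "no usual GP" indicator); the assumed true effect is arbitrary:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    no_usual_gp = rng.integers(0, 2, 500)            # hypothetical exposure indicator
    logit_p = -1.0 + 0.9 * no_usual_gp               # assumed true log-odds ratio of 0.9
    y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))  # simulated outcome (ED presentation)

    X = sm.add_constant(no_usual_gp)
    fit = sm.Logit(y, X).fit(disp=0)
    or_est = np.exp(fit.params[1])                   # odds ratio
    ci = np.exp(fit.conf_int()[1])                   # 95% CI on the OR scale
    print(f"OR = {or_est:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
    ```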

  4. Geographic information system (GIS)-based maps of Appalachian basin oil and gas fields: Chapter C.2 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Kinney, Scott A.; Suitt, Stephen E.; Merrill, Matthew D.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    In 2006 and 2007, the greenline Appalachian basin field maps were digitized under the supervision of Scott Kinney and converted to geographic information system (GIS) files for chapter I.1 (this volume). Converted to a digital format with field names maintained where noted, these oil and gas field maps are now available for a variety of oil and gas and possibly carbon-dioxide sequestration projects. Having historical names assigned to known digitized conventional fields provides a convenient classification scheme into which cumulative production and ultimate field-size databases can be organized. Moreover, as exploratory and development drilling expands across the basin, many previously named fields that were originally treated as conventional fields have evolved into large, commonly unnamed continuous-type accumulations. These new digital maps will facilitate a comparison between estimated ultimate recovery (EUR) values from recently drilled, unnamed parts of continuous accumulations and EUR values from named fields discovered early during the exploration cycle of continuous accumulations.

  5. Penalized likelihood and multi-objective spatial scans for the detection and inference of irregular clusters

    PubMed Central

    2010-01-01

    Background: Irregularly shaped spatial clusters are difficult to delineate. A cluster found by an algorithm often spreads through large portions of the map, impacting its geographical meaning. Penalized likelihood methods for Kulldorff's spatial scan statistic have been used to control the excessive freedom of the shape of clusters. Penalty functions based on cluster geometry and non-connectivity have been proposed recently. Another approach involves the use of a multi-objective algorithm to maximize two objectives: the spatial scan statistic and the geometric penalty function. Results & Discussion: We present a novel scan statistic algorithm employing a function based on the graph topology to penalize the presence of under-populated disconnection nodes in candidate clusters, the disconnection nodes cohesion function. A disconnection node is defined as a region within a cluster such that its removal disconnects the cluster. By applying this function, the most geographically meaningful clusters are sifted through the immense set of possible irregularly shaped candidate cluster solutions. To evaluate the statistical significance of solutions for multi-objective scans, a statistical approach based on the concept of the attainment function is used. In this paper we compare different penalized likelihoods employing the geometric and non-connectivity regularity functions and the novel disconnection nodes cohesion function. We also build multi-objective scans using those three functions and compare them with the previous penalized likelihood scans. An application is presented using comprehensive state-wide data for Chagas' disease in puerperal women in Minas Gerais state, Brazil. Conclusions: We show that, compared to the other single-objective algorithms, multi-objective scans present better performance regarding power, sensitivity and positive predictive value. The multi-objective non-connectivity scan is faster and better suited for the detection of moderately irregularly shaped clusters. The multi-objective cohesion scan is most effective for the detection of highly irregularly shaped clusters. PMID:21034451
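
    For concreteness, a minimal sketch of the Poisson log-likelihood ratio underlying Kulldorff's scan statistic, multiplied by a generic geometry penalty; the penalty form and all numbers are illustrative, not the cohesion function proposed in the paper:

    ```python
    import numpy as np

    def poisson_llr(c, E, C):
        """Kulldorff log-likelihood ratio for a candidate cluster:
        c observed and E expected cases inside the cluster, C total cases on the map."""
        if c <= E:
            return 0.0
        return c * np.log(c / E) + (C - c) * np.log((C - c) / (C - E))

    def penalized_llr(c, E, C, compactness, a=1.0):
        """A geometric penalty in [0, 1] multiplies the LLR, down-weighting
        sprawling candidate clusters -- illustrative form only."""
        return poisson_llr(c, E, C) * compactness ** a

    print(penalized_llr(c=30, E=15, C=200, compactness=0.6))
    ```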

  6. A smoothed stochastic earthquake rate model considering seismicity and fault moment release for Europe

    NASA Astrophysics Data System (ADS)

    Hiemer, S.; Woessner, J.; Basili, R.; Danciu, L.; Giardini, D.; Wiemer, S.

    2014-08-01

    We present a time-independent gridded earthquake rate forecast for the European region, including Turkey. The spatial component of our model is based on kernel density estimation techniques, which we applied to both past earthquake locations and fault moment release on mapped crustal faults and subduction zone interfaces with assigned slip rates. Our forecast relies on the assumptions that the locations of past seismicity are a good guide to future seismicity, and that future large-magnitude events are more likely to occur in the vicinity of known faults. We show that the optimal weighted sum of the corresponding two spatial densities depends on the magnitude range considered. The kernel bandwidths and density weighting function are optimized using retrospective likelihood-based forecast experiments. We computed earthquake activity rates (a- and b-value) of the truncated Gutenberg-Richter distribution separately for crustal and subduction seismicity, based on a maximum likelihood approach that considers the spatial and temporal completeness history of the catalogue. The final annual rate of our forecast is purely driven by the maximum likelihood fit of activity rates to the catalogue data, whereas its spatial component incorporates contributions from both earthquake and fault moment-rate densities. Our model constitutes one branch of the earthquake source model logic tree of the 2013 European seismic hazard model released by the EU-FP7 project `Seismic HAzard haRmonization in Europe' (SHARE) and contributes to the assessment of epistemic uncertainties in earthquake activity rates. We performed retrospective and pseudo-prospective likelihood consistency tests to underline the reliability of our model and of SHARE's area source model (ASM), using the testing algorithms applied in the Collaboratory for the Study of Earthquake Predictability (CSEP). We comparatively tested our model's forecasting skill against the ASM and find a statistically significant better performance for testing periods of 10-20 yr. The testing results suggest that our model is a viable candidate model to serve for long-term forecasting on timescales of years to decades for the European region.
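
    A compact sketch of the spatial ingredient, assuming Gaussian kernels via SciPy: two kernel density estimates (one from past epicentres, one from points sampled along faults) are combined with a weight that, in the paper, depends on the magnitude range; all coordinates below are invented:

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(2)
    quakes = rng.normal([10.0, 45.0], 0.5, size=(300, 2)).T   # toy epicentres (lon, lat)
    faults = rng.normal([10.4, 45.2], 0.2, size=(120, 2)).T   # toy fault moment-release points

    kde_seis = gaussian_kde(quakes)    # density from past earthquake locations
    kde_fault = gaussian_kde(faults)   # density from fault moment release

    def spatial_density(xy, w):
        """Weighted sum of the two spatial densities; w is fixed here
        but magnitude-dependent in the actual model."""
        return w * kde_fault(xy) + (1 - w) * kde_seis(xy)

    grid = np.vstack([np.full(50, 10.2), np.linspace(44.0, 46.0, 50)])
    print(spatial_density(grid, w=0.7).round(4))
    ```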

  7. Prospectively Evaluating the Collaboratory for the Study of Earthquake Predictability: An Evaluation of the UCERF2 and Updated Five-Year RELM Forecasts

    NASA Astrophysics Data System (ADS)

    Strader, Anne; Schneider, Max; Schorlemmer, Danijel; Liukis, Maria

    2016-04-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) was developed to rigorously test earthquake forecasts retrospectively and prospectively through reproducible, completely transparent experiments within a controlled environment (Zechar et al., 2010). During 2006-2011, thirteen five-year time-invariant prospective earthquake mainshock forecasts developed by the Regional Earthquake Likelihood Models (RELM) working group were evaluated through the CSEP testing center (Schorlemmer and Gerstenberger, 2007). The number, spatial, and magnitude components of the forecasts were compared to the respective observed seismicity components using a set of consistency tests (Schorlemmer et al., 2007, Zechar et al., 2010). In the initial experiment, all but three forecast models passed every test at the 95% significance level, with all forecasts displaying consistent log-likelihoods (L-test) and magnitude distributions (M-test) with the observed seismicity. In the ten-year RELM experiment update, we reevaluate these earthquake forecasts over an eight-year period from 2008-2016, to determine the consistency of previous likelihood testing results over longer time intervals. Additionally, we test the Uniform California Earthquake Rupture Forecast (UCERF2), developed by the U.S. Geological Survey (USGS), and the earthquake rate model developed by the California Geological Survey (CGS) and the USGS for the National Seismic Hazard Mapping Program (NSHMP) against the RELM forecasts. Both the UCERF2 and NSHMP forecasts pass all consistency tests, though the Helmstetter et al. (2007) and Shen et al. (2007) models exhibit greater information gain per earthquake according to the T- and W-tests (Rhoades et al., 2011). Though all but three RELM forecasts pass the spatial likelihood test (S-test), multiple forecasts fail the M-test due to overprediction of the number of earthquakes during the target period. Though there is no significant difference between the UCERF2 and NSHMP models, residual scores show that the NSHMP model is preferred in locations with earthquake occurrence, due to the lower seismicity rates forecasted by the UCERF2 model.
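
    Of the CSEP consistency tests mentioned, the number test (N-test) is the simplest to sketch: it checks whether the observed earthquake count is plausible under the forecast's total rate, assumed Poisson. The two-sided quantile form below follows the usual CSEP convention; the rates are invented:

    ```python
    from scipy.stats import poisson

    def n_test(forecast_rate, observed_count, alpha=0.05):
        """CSEP-style number test: is the observed event count consistent
        with the forecast total rate under a Poisson assumption?"""
        delta1 = 1 - poisson.cdf(observed_count - 1, forecast_rate)  # P(N >= obs)
        delta2 = poisson.cdf(observed_count, forecast_rate)          # P(N <= obs)
        passed = (delta1 > alpha / 2) and (delta2 > alpha / 2)
        return delta1, delta2, passed

    print(n_test(forecast_rate=12.0, observed_count=19))
    ```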

  8. Continuity of drunk and drugged driving behaviors four years post-college.

    PubMed

    Caldeira, Kimberly M; Arria, Amelia M; Allen, Hannah K; Bugbee, Brittany A; Vincent, Kathryn B; O'Grady, Kevin E

    2017-11-01

    Driving under the influence of alcohol is a leading cause of injury and premature death among young adults, and college-educated individuals are at particularly high risk. Less is known about driving under the influence of other drugs, which is on the rise. This study describes prospective seven-year trends in alcohol and other drug (AOD)-involved driving among a young-adult sample beginning with their second year of college (i.e., Years 2-8), and documents the extent of continuity in such behaviors across time. Originally recruited as incoming first-year students at one large public university, participants (n=1194) were interviewed annually about how frequently they drove while drunk/intoxicated (DWI), after drinking any alcohol (DAD), and/or while under the influence of other drugs (DD). Follow-up rates were high (>75% annually). Among participants with access to drive a car, annual prevalence peaked in Year 4 (modal age 21) for both DWI (24.3%wt) and DD (19.1%wt) and declined significantly thereafter through Year 8 (both ps<0.05). DAD was far more prevalent than DWI or DD, increasing from 40.5%wt in Year 2 to 66.9%wt in Year 5, and plateauing thereafter. Among marijuana-using participants, the likelihood of DD was consistently greater than the likelihood of DWI among Heavy Episodic and Light-to-Moderate drinkers, and it declined significantly during Years 5-8 (p<0.05). Post-college declines in heavy drinking and DWI prevalence were encouraging but did not necessarily translate to reductions in the likelihood of engaging in DWI, depending on drinking pattern. College-educated individuals represent an important target for AOD-involved driving prevention. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Social identity continuity and mental health among Syrian refugees in Turkey.

    PubMed

    Smeekes, Anouk; Verkuyten, Maykel; Çelebi, Elif; Acartürk, Ceren; Onkun, Samed

    2017-10-01

    Building upon social psychological work on social identity and mental health, this study among Syrian refugees in Turkey examined the importance of multiple group memberships and identity continuity for mental health and well-being. A survey study was conducted among the very difficult-to-reach population of Syrian refugees (N = 361). With path analysis in AMOS, the associations were examined between multiple group memberships, social identity continuity, and mental health and psychological well-being. Results indicate that belonging to multiple groups before migration was related to a higher likelihood of having preserved group memberships after migration (i.e., a sense of social identity continuity), which, in turn, predicted greater life satisfaction and lower levels of depression. Multiple group membership, however, was also directly related to higher depression. Findings are discussed in relation to the importance of multiple group memberships and feelings of identity continuity for refugees.

  10. 27 CFR 9.188 - Horse Heaven Hills.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... of viticultural significance. (b) Approved Maps. The appropriate maps for determining the boundaries... Canyon, Spring Canyon, Sand Ridge, and Willow Creek) to the point where the 1,700-foot contour line intersects Sand Ridge Road in section 4, T5N, R22E, on the Douty Canyon map; then (4) Continue north...

  11. Concept Mapping as a Tool to Develop and Measure Students' Understanding in Science

    ERIC Educational Resources Information Center

    Tan, Sema; Erdimez, Omer; Zimmerman, Robert

    2017-01-01

    Concept maps measure a student's understanding of the complexity of concepts and their interrelationships. Novak and Gowin (1984) claimed that the continuous use of concept maps increased the complexity and interconnectedness of students' understanding of relationships between concepts in a particular science domain. This study has two purposes; the…

  12. The Strategic Design Inquiry: A Formal Methodology For Approaching, Designing, Integrating, And Articulating National Strategy

    DTIC Science & Technology

    2014-04-01

    [Figure 4: Example cognitive map] … map, aligning planning efforts throughout the government. Even after strategy implementation, SDI calls for continuing, iterative learning and … the design before total commitment to it. Capturing this analysis on a cognitive map allows strategists to articulate a design to government …

  13. Seep Detection using E/V Nautilus Integrated Seafloor Mapping and Remotely Operated Vehicles on the United States West Coast

    NASA Astrophysics Data System (ADS)

    Gee, L. J.; Raineault, N.; Kane, R.; Saunders, M.; Heffron, E.; Embley, R. W.; Merle, S. G.

    2017-12-01

    Exploration Vessel (E/V) Nautilus has been mapping the seafloor off the west coast of the United States, from Washington to California, for the past three years with a Kongsberg EM302 multibeam sonar. This system simultaneously collects bathymetry, seafloor backscatter, and water column backscatter data, allowing an integrated approach to mapping that more completely characterizes a region, and has identified over 1,000 seafloor seeps. Hydrographic multibeam sonars like the EM302 were designed for mapping bathymetry; it is only in the last decade that major mapping projects have included an integrated approach that utilizes the seabed and water column backscatter information in addition to the bathymetry. Nautilus mapping in the Eastern Pacific over the past three years has included a number of seep-specific expeditions, and utilized and adapted the preliminary mapping guidelines that have emerged from research. The likelihood of seep detection is affected by many factors: the environment (seabed geomorphology, surficial sediment, seep location/depth, regional oceanography and biology); the nature of the seeps themselves (size variation, varying flux, depth, and transience); the detection system (the design of hydrographic multibeam sonars limits their use for water column detection); and the platform (variations in the vessel and operations, such as noise, speed, and swath overlap). Nautilus integrated seafloor mapping provided multiple indicators of seep locations, but it remains difficult to assess the probability of seep detection. Even when seeps were detected, they have not always been located during ROV dives. However, the presence of associated features (methane hydrate and bacterial mats) serves as evidence of potential seep activity and reinforces the transient nature of the seeps. Not detecting a seep in the water column data does not necessarily indicate that there is not a seep at a given location, but with multiple passes over an area and the use of other contextual data, an area may be classified as likely or unlikely to host seeps.

  14. Comparative map and trait viewer (CMTV): an integrated bioinformatic tool to construct consensus maps and compare QTL and functional genomics data across genomes and experiments.

    PubMed

    Sawkins, M C; Farmer, A D; Hoisington, D; Sullivan, J; Tolopko, A; Jiang, Z; Ribaut, J-M

    2004-10-01

    In the past few decades, a wealth of genomic data has been produced in a wide variety of species using a diverse array of functional and molecular marker approaches. In order to unlock the full potential of the information contained in these independent experiments, researchers need efficient and intuitive means to identify common genomic regions and genes involved in the expression of target phenotypic traits across diverse conditions. To address this need, we have developed a Comparative Map and Trait Viewer (CMTV) tool that can be used to construct dynamic aggregations of a variety of types of genomic datasets. By algorithmically determining correspondences between sets of objects on multiple genomic maps, the CMTV can display syntenic regions across taxa, combine maps from separate experiments into a consensus map, or project data from different maps into a common coordinate framework using dynamic coordinate translations between source and target maps. We present a case study that illustrates the utility of the tool for managing large and varied datasets by integrating data collected by CIMMYT in maize drought tolerance research with data from public sources. This example will focus on one of the visualization features for Quantitative Trait Locus (QTL) data, using likelihood ratio (LR) files produced by generic QTL analysis software and displaying the data in a unique visual manner across different combinations of traits, environments and crosses. Once a genomic region of interest has been identified, the CMTV can search and display additional QTLs meeting a particular threshold for that region, or other functional data such as sets of differentially expressed genes located in the region; it thus provides an easily used means for organizing and manipulating data sets that have been dynamically integrated under the focus of the researcher's specific hypothesis.
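
    The coordinate-translation idea can be sketched simply: markers shared between a source and a target map anchor a fitted transformation, which then projects any position (for example, a QTL peak) into the target frame. The linear fit below is only an illustration; CMTV's actual translation logic may well be piecewise or more elaborate, and the marker names and positions are made up:

    ```python
    import numpy as np

    # Hypothetical shared markers: (source-map cM, target-map cM)
    shared = {"m1": (12.0, 15.5), "m2": (33.5, 40.1), "m3": (61.0, 72.8)}

    src = np.array([p[0] for p in shared.values()])
    tgt = np.array([p[1] for p in shared.values()])
    slope, intercept = np.polyfit(src, tgt, 1)   # simple linear translation between maps

    def project(source_pos):
        """Project a position from source-map to target-map coordinates."""
        return slope * source_pos + intercept

    print(round(project(45.0), 1))   # e.g., a QTL peak at 45 cM on the source map
    ```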

  15. Neural networks for satellite remote sensing and robotic sensor interpretation

    NASA Astrophysics Data System (ADS)

    Martens, Siegfried

    Remote sensing of forests and robotic sensor fusion can be viewed, in part, as supervised learning problems, mapping from sensory input to perceptual output. This dissertation develops ARTMAP neural networks for real-time category learning, pattern recognition, and prediction tailored to remote sensing and robotics applications. Three studies are presented. The first two use ARTMAP to create maps from remotely sensed data, while the third uses an ARTMAP system for sensor fusion on a mobile robot. The first study uses ARTMAP to predict vegetation mixtures in the Plumas National Forest based on spectral data from the Landsat Thematic Mapper satellite. While most previous ARTMAP systems have predicted discrete output classes, this project develops new capabilities for multi-valued prediction. On the mixture prediction task, the new network is shown to perform better than maximum likelihood and linear mixture models. The second remote sensing study uses an ARTMAP classification system to evaluate the relative importance of spectral and terrain data for map-making. This project has produced a large-scale map of remotely sensed vegetation in the Sierra National Forest. Network predictions are validated with ground truth data, and maps produced using the ARTMAP system are compared to a map produced by human experts. The ARTMAP Sierra map was generated in an afternoon, while the labor-intensive expert method required nearly a year to perform the same task. The robotics research uses an ARTMAP system to integrate visual and ultrasonic sensory information on a B14 mobile robot. The goal is to produce a more accurate measure of distance than is provided by the raw sensors. ARTMAP effectively combines sensory sources both within and between modalities. The improved distance percept is used to produce occupancy-grid visualizations of the robot's environment. The maps produced point to specific problems of raw sensory information processing and demonstrate the benefits of using a neural network system for sensor fusion.

  16. Use of a tracing task to assess visuomotor performance for evidence of concussion and recuperation.

    PubMed

    Kelty-Stephen, Damian G; Qureshi Ahmad, Mona; Stirling, Leia

    2015-12-01

    The likelihood of suffering a concussion while playing a contact sport ranges from 15-45% per year of play. These rates are highly variable because athletes seldom report concussive symptoms or do not recognize them. We performed a prospective cohort study (n = 206, aged 10-17) to examine visuomotor tracing and determine its sensitivity for detecting neuromotor components of concussion. Tracing variability measures were investigated for a mean shift with presentation of concussion-related symptoms and a linear return toward baseline over subsequent return visits. Furthermore, previous research relating brain injury to the dissociation of smooth movements into "submovements" led to the expectation that cumulative micropause duration, a measure of motion continuity, might detect likelihood of injury. Separate linear mixed-effects regressions of tracing measures indicated that 4 of the 5 tracing measures captured both short-term effects of injury and longer-term effects of recovery with subsequent visits. Cumulative micropause duration has a positive relationship with the likelihood of participants having had a concussion. The present results suggest that future research should evaluate how well the coefficients for the tracing parameter in the logistic regression help to detect concussion in novel cases. (c) 2015 APA, all rights reserved.
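
    The abstract does not define cumulative micropause duration precisely, so the sketch below encodes one plausible reading: the total time the tracing speed drops below a threshold for brief stretches. Both thresholds are invented for illustration:

    ```python
    import numpy as np

    def cumulative_micropause(t, x, y, speed_thresh=0.9, max_pause=0.5):
        """Total time the trace moves slower than speed_thresh (units/s),
        counting only slow runs up to max_pause seconds long -- one plausible
        operationalization of 'cumulative micropause duration'."""
        dt = np.diff(t)
        speed = np.hypot(np.diff(x), np.diff(y)) / dt
        slow = speed < speed_thresh
        total, run = 0.0, 0.0
        for is_slow, step in zip(slow, dt):
            run = run + step if is_slow else 0.0
            if is_slow and run <= max_pause:
                total += step
        return total

    # Toy trace: roughly circular motion with small speed fluctuations
    t = np.linspace(0, 10, 1001)
    x, y = np.cos(t), np.sin(t) * (1 + 0.1 * np.sin(5 * t))
    print(round(cumulative_micropause(t, x, y), 2))
    ```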

  17. 8D likelihood effective Higgs couplings extraction framework in h → 4ℓ

    DOE PAGES

    Chen, Yi; Di Marco, Emanuele; Lykken, Joe; ...

    2015-01-23

    We present an overview of a comprehensive analysis framework aimed at performing direct extraction of all possible effective Higgs couplings to neutral electroweak gauge bosons in the decay to electrons and muons, the so-called 'golden channel'. Our framework is based primarily on a maximum likelihood method constructed from analytic expressions of the fully differential cross sections for h → 4ℓ and for the dominant irreducible $q\overline{q}$ → 4ℓ background, where 4ℓ = 2e2μ, 4e, 4μ. Detector effects are included by an explicit convolution of these analytic expressions with the appropriate transfer function over all center-of-mass variables. Utilizing the full set of observables, we construct an unbinned detector-level likelihood which is continuous in the effective couplings. We consider possible ZZ, Zγ, and γγ couplings simultaneously, allowing for general CP odd/even admixtures. A broad overview is given of how the convolution is performed, and we discuss the principles and theoretical basis of the framework. This framework can be used in a variety of ways to study Higgs couplings in the golden channel using data obtained at the LHC and other future colliders.
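
    The core idea of an unbinned likelihood that is continuous in a coupling can be shown in one dimension. The sketch below is a cartoon only: a toy observable whose pdf interpolates between a SM-like and an anomalous shape as a single coupling g varies, nothing like the full 8-dimensional h → 4ℓ likelihood with detector convolution:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(7)

    def pdf(x, g):
        """Toy event pdf: mixture of a SM-like and an anomalous shape,
        with the anomalous fraction growing continuously with coupling g."""
        sm = np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)
        bsm = np.exp(-0.5 * (x - 1.0) ** 2) / np.sqrt(2 * np.pi)
        w = g ** 2 / (1 + g ** 2)
        return (1 - w) * sm + w * bsm

    # Simulated events with a 10% anomalous fraction (g_true ~ 1/3)
    events = np.concatenate([rng.normal(0, 1, 900), rng.normal(1, 1, 100)])

    nll = lambda g: -np.sum(np.log(pdf(events, g)))   # unbinned negative log-likelihood
    fit = minimize_scalar(nll, bounds=(0.01, 5.0), method="bounded")
    print(round(fit.x, 2))   # maximum-likelihood estimate of the coupling
    ```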

  18. The likelihood of achieving quantified road safety targets: a binary logistic regression model for possible factors.

    PubMed

    Sze, N N; Wong, S C; Lee, C Y

    2014-12-01

    In the past several decades, many countries have set quantified road safety targets to motivate transport authorities to develop systematic road safety strategies and measures and to facilitate continuous road safety improvement. Studies have been conducted to evaluate the association between the setting of quantified road safety targets and road fatality reduction, in both the short and long run, by comparing road fatalities before and after the implementation of a quantified road safety target. However, not much work has been done to evaluate whether quantified road safety targets are actually achieved. In this study, we used a binary logistic regression model to examine the factors - including vehicle ownership, fatality rate, and national income, in addition to the level of ambition and duration of the target - that contribute to a target's success. We analyzed 55 quantified road safety targets set by 29 countries from 1981 to 2009, and the results indicate that targets that are in progress and have lower levels of ambition had a higher likelihood of eventually being achieved. Moreover, possible interaction effects on the association between level of ambition and the likelihood of success are also revealed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Coupling Self-Organizing Maps with a Naïve Bayesian classifier: A case study for classifying Vermont streams using geomorphic, habitat and biological assessment data

    NASA Astrophysics Data System (ADS)

    Fytilis, N.; Rizzo, D. M.

    2012-12-01

    Environmental managers are increasingly required to forecast the long-term effects and the resilience or vulnerability of biophysical systems to human-generated stresses. Mitigation strategies for hydrological and environmental systems need to be assessed in the presence of uncertainty. An important aspect of such complex systems is the assessment of variable uncertainty on the model response outputs. We develop a new classification tool that couples a Naïve Bayesian Classifier with a modified Kohonen Self-Organizing Map to tackle this challenge. For proof of concept, we use rapid geomorphic and reach-scale habitat assessment data from over 2500 Vermont stream reaches (~1371 stream miles) assessed by the Vermont Agency of Natural Resources (VTANR). In addition, the Vermont Department of Environmental Conservation (VTDEC) estimates stream habitat biodiversity indices (macro-invertebrates and fish) and a variety of water quality data. Our approach fully utilizes the existing VTANR and VTDEC data sets to improve classification of stream-reach habitat and biological integrity. The combined SOM-Naïve Bayesian architecture is sufficiently flexible to allow for continual updates and increased accuracy associated with acquiring new data. The Kohonen Self-Organizing Map (SOM) is an unsupervised artificial neural network that autonomously analyzes properties inherent in a given set of data. It is typically used to cluster data vectors into similar categories when a priori classes do not exist. The ability of the SOM to convert nonlinear, high-dimensional data to some user-defined lower dimension and to mine large amounts of data types (i.e., discrete or continuous, biological or geomorphic data) makes it ideal for characterizing the sensitivity of river networks in a variety of contexts. The procedure is data-driven, and therefore does not require the development of site-specific, process-based stream classification models, or sets of if-then-else rules associated with expert systems. This has the potential to save time and resources, while enabling a truly adaptive management approach using existing knowledge (expressed as prior probabilities) and new information (expressed as likelihood functions) to update estimates (i.e., in this case, improved stream classifications expressed as posterior probabilities). The distribution parameters of these posterior probabilities are used to quantify uncertainty associated with environmental data. Since classification plays a leading role in the future development of data-enabled science and engineering, such a computational tool is applicable to a variety of engineering applications. The ability of the new classification neural network to characterize streams with high environmental risk is essential for a proactive adaptive watershed management approach.
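
    A minimal Kohonen SOM training loop, to make the clustering step concrete; this is a generic implementation, not the modified SOM of the paper, and the grid size, rates, and schedules are arbitrary:

    ```python
    import numpy as np

    def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
        """Minimal Kohonen SOM: each input pulls its best-matching unit (and that
        unit's grid neighbours) toward itself, with shrinking radius and rate."""
        rng = np.random.default_rng(seed)
        rows, cols = grid
        w = rng.normal(size=(rows, cols, data.shape[1]))
        rr, cc = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
        for e in range(epochs):
            lr = lr0 * (1 - e / epochs)
            sigma = sigma0 * (1 - e / epochs) + 0.5
            for v in rng.permutation(data):
                bmu = np.unravel_index(np.argmin(((w - v) ** 2).sum(-1)), grid)
                dist2 = (rr - bmu[0]) ** 2 + (cc - bmu[1]) ** 2
                h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]  # neighbourhood kernel
                w += lr * h * (v - w)
        return w

    data = np.random.default_rng(1).normal(size=(200, 5))  # toy reach-scale features
    print(train_som(data).shape)  # (8, 8, 5) codebook
    ```

    One way the Naïve Bayesian coupling could then work is to keep per-node class counts as priors and update them as labelled reaches arrive, although the paper's exact coupling is not spelled out in this abstract.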

  20. SLAMM: Visual monocular SLAM with continuous mapping using multiple maps

    PubMed Central

    Md. Sabri, Aznul Qalid; Loo, Chu Kiong; Mansoor, Ali Mohammed

    2018-01-01

    This paper presents the concept of Simultaneous Localization and Multi-Mapping (SLAMM). It is a system that ensures continuous mapping and information preservation despite failures in tracking due to corrupted frames or sensor malfunction, making it suitable for real-world applications. It works with single or multiple robots. In a single-robot scenario the algorithm generates a new map at the time of tracking failure, and later merges maps at the event of loop closure. Similarly, maps generated from multiple robots are merged without prior knowledge of their relative poses, which makes this algorithm flexible. The system works in real time at frame-rate speed. The proposed approach was tested on the KITTI and TUM RGB-D public datasets and showed superior results compared to the state of the art in calibrated visual monocular keyframe-based SLAM. The mean tracking time is around 22 milliseconds. The initialization is twice as fast as in ORB-SLAM, and the retrieved map can reach up to 90 percent more in terms of information preservation, depending on tracking loss and loop closure events. For the benefit of the community, the source code along with a framework to be run with a Bebop drone are made available at https://github.com/hdaoud/ORBSLAMM. PMID:29702697
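
    The map-merging step at loop closure can be sketched as expressing one map's landmarks in the other's frame once a relative transform is known. The sketch below assumes that transform has already been estimated (e.g., from matched keyframes) and uses a toy 2-D rigid case; SLAMM itself merges full keyframe maps, not bare point sets:

    ```python
    import numpy as np

    def merge_maps(points_a, points_b, R, t, s=1.0):
        """At loop closure, express map B's landmarks in map A's frame via the
        estimated relative similarity transform (R, t, scale s), then concatenate.
        Estimating (R, t, s) itself is out of scope for this sketch."""
        aligned_b = s * points_b @ R.T + t
        return np.vstack([points_a, aligned_b])

    # Toy example: map B is map A rotated 90 degrees and shifted by (2, 0)
    theta = np.pi / 2
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    A = np.array([[0.0, 0.0], [1.0, 0.0]])
    B = A @ R.T + np.array([2.0, 0.0])

    # Inverse transform (R^T, -R^T t) maps B back into A's frame
    print(merge_maps(A, B, R.T, -R.T @ np.array([2.0, 0.0])))
    ```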

  1. Autosomal dominant retinitis pigmentosa: No evidence for nonallelic genetic heterogeneity on 3q

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar-Singh, R.; He Wang; Humphries, P.

    1993-02-01

    Since the initial report of linkage of autosomal dominant retinitis pigmentosa (adRP) to the long arm of chromosome 3, several mutations in the gene encoding rhodopsin, which also maps to 3q, have been reported in adRP pedigrees. However, there has been some discussion as to the possibility of a second adRP locus on 3q. This suggestion has important diagnostic and research implications and must raise doubts about the usefulness of linked markers for reliable diagnosis of RP patients. In order to address this issue the authors have performed an admixture test (A-test) on 10 D3S47-linked adRP pedigrees and have found a likelihood ratio of heterogeneity versus homogeneity of 4.90. They performed a second A-test, combining the data from all families with known rhodopsin mutations. In this test they obtained a reduced likelihood ratio of heterogeneity versus homogeneity of 1.0. On the basis of these statistical analyses they have found no significant support for two adRP loci on chromosome 3q. Furthermore, using 40 CEPH families, they have localized the rhodopsin gene to the D3S47-D3S20 interval, with a maximum lod score (Z_m) of 20, and have found that the order qter-D3S47-rhodopsin-D3S20-cen is significantly more likely than any other order. In addition, they have mapped (Z_m = 30) the microsatellite marker D3S621 relative to other loci in this region of the genome. 27 refs., 3 figs., 3 tabs.
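
    A two-point lod score of the kind reported here compares the likelihood of linkage at recombination fraction θ against free recombination (θ = 0.5). A minimal sketch with invented counts of recombinant and non-recombinant informative meioses:

    ```python
    import numpy as np

    def lod(theta, n_rec, n_nonrec):
        """Two-point lod score: log10 likelihood of theta versus theta = 0.5,
        given recombinant and non-recombinant informative meioses."""
        n = n_rec + n_nonrec
        l_theta = n_rec * np.log10(theta) + n_nonrec * np.log10(1 - theta)
        return l_theta - n * np.log10(0.5)

    thetas = np.linspace(0.01, 0.49, 49)
    scores = lod(thetas, n_rec=2, n_nonrec=48)   # illustrative counts, not the paper's data
    print(f"Z_m = {scores.max():.2f} at theta = {thetas[scores.argmax()]:.2f}")
    ```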

  2. DEVELOPING ATMOSPHERIC RETRIEVAL METHODS FOR DIRECT IMAGING SPECTROSCOPY OF GAS GIANTS IN REFLECTED LIGHT. I. METHANE ABUNDANCES AND BASIC CLOUD PROPERTIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lupu, Roxana E.; Marley, Mark S.; Zahnle, Kevin

    Upcoming space-based coronagraphic instruments in the next decade will perform reflected light spectroscopy and photometry of cool directly imaged extrasolar giant planets. We are developing a new atmospheric retrieval methodology to help assess the science return and inform the instrument design for such future missions, and ultimately interpret the resulting observations. Our retrieval technique employs a geometric albedo model coupled with both a Markov chain Monte Carlo Ensemble Sampler (emcee) and a multimodal nested sampling algorithm (MultiNest) to map the posterior distribution. This combination makes the global evidence calculation more robust for any given model and highlights possible discrepancies in the likelihood maps. As a proof of concept, our current atmospheric model contains one or two cloud layers, methane as a major absorber, and an H2–He background gas. This 6-to-9 parameter model is appropriate for Jupiter-like planets and can be easily expanded in the future. In addition to deriving the marginal likelihood distribution and confidence intervals for the model parameters, we perform model selection to determine the significance of methane and cloud detection as a function of expected signal-to-noise ratio in the presence of spectral noise correlations. After internal validation, the method is applied to realistic spectra of Jupiter, Saturn, and HD 99492 c, a model observing target. We find that the presence or absence of clouds and methane can be determined with high confidence, while parameter uncertainties are model dependent and correlated. Such general methods will also be applicable to the interpretation of direct imaging spectra of cloudy terrestrial planets.

  3. Developing Atmospheric Retrieval Methods for Direct Imaging Spectroscopy of Gas Giants in Reflected Light I: Methane Abundances and Basic Cloud Properties

    NASA Technical Reports Server (NTRS)

    Lupu, R. E.; Marley, M. S.; Lewis, N.; Line, M.; Traub, W.; Zahnle, K.

    2016-01-01

    Reflected light spectroscopy and photometry of cool, directly imaged extrasolar giant planets are expected to be performed in the next decade by space-based telescopes equipped with optical wavelength coronagraphs and integral field spectrographs, such as the Wide-Field Infrared Survey Telescope (WFIRST). We are developing a new atmospheric retrieval methodology to help assess the science return and inform the instrument design for such future missions, and ultimately interpret the resulting observations. Our retrieval technique employs an albedo model coupled with both a Markov chain Monte Carlo Ensemble Sampler (emcee) and a multimodal nested sampling algorithm (MultiNest) to map the posterior distribution. This combination makes the global evidence calculation more robust for any given model, and highlights possible discrepancies in the likelihood maps. Here we apply this methodology to simulated spectra of cool giant planets. As a proof-of-concept, our current atmospheric model contains 1 or 2 cloud layers, methane as a major absorber, and a H2-He background gas. This 6-to-9 parameter model is appropriate for Jupiter-like planets and can be easily expanded in the future. In addition to deriving the marginal likelihood distribution and confidence intervals for the model parameters, we perform model selection to determine the significance of methane and cloud detection as a function of expected signal-to-noise, in the presence of spectral noise correlations. After internal validation, the method is applied to realistic reflected-light spectra of Jupiter, Saturn, and HD 99492 c, a likely observing target. We find that the presence or absence of clouds and methane can be determined with high accuracy, while parameter uncertainties are model-dependent.
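
    As a concrete illustration of the sampling step shared by both versions of this work, the sketch below draws posterior samples for a two-parameter toy albedo model with emcee. It is not the authors' pipeline: the model, the simulated observation, the priors, and all variable names are assumptions made for this example; only the emcee calls follow the library's real API.

        import numpy as np
        import emcee

        # Toy forward model: a flat cloud deck times a single methane absorption
        # band (an illustrative stand-in for the paper's 6-to-9 parameter model).
        wav = np.linspace(0.5, 1.0, 50)                   # wavelength, microns
        band = np.exp(-0.5 * ((wav - 0.89) / 0.01) ** 2)  # fixed CH4 band shape

        def albedo_model(theta):
            cloud_albedo, ch4_depth = theta
            return cloud_albedo * (1.0 - ch4_depth * band)

        # Simulated observation with uncorrelated Gaussian noise.
        rng = np.random.default_rng(0)
        truth, sigma = (0.6, 0.7), 0.02
        obs = albedo_model(truth) + rng.normal(0.0, sigma, wav.size)

        def log_prob(theta):
            if not all(0.0 < t < 1.0 for t in theta):     # uniform box priors
                return -np.inf
            resid = obs - albedo_model(theta)
            return -0.5 * np.sum((resid / sigma) ** 2)    # Gaussian log-likelihood

        nwalkers, ndim = 32, 2
        p0 = truth + 1e-3 * rng.normal(size=(nwalkers, ndim))
        sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
        sampler.run_mcmc(p0, 2000, progress=False)
        samples = sampler.get_chain(discard=500, flat=True)
        print(np.percentile(samples, [16, 50, 84], axis=0))  # credible intervals

    Running MultiNest on the same log-probability would additionally return the global evidence used for the methane and cloud model-selection tests.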

  4. Drivers of wetland conversion: a global meta-analysis.

    PubMed

    van Asselen, Sanneke; Verburg, Peter H; Vermaat, Jan E; Janse, Jan H

    2013-01-01

    Meta-analysis of case studies has become an important tool for synthesizing case study findings in land change. Meta-analyses of deforestation, urbanization, desertification and change in shifting cultivation systems have been published. The present study adds to this literature with an analysis of the proximate causes and underlying forces of wetland conversion at a global scale, using two complementary approaches of systematic review. Firstly, a meta-analysis of 105 case-study papers describing wetland conversion was performed, showing that different combinations of multiple-factor proximate causes and underlying forces drive wetland conversion. Agricultural development has been the main proximate cause of wetland conversion, and economic growth and population density are the most frequently identified underlying forces. Secondly, to add a more quantitative component to the study, a logistic meta-regression analysis was performed to estimate the likelihood of wetland conversion worldwide, using globally consistent biophysical and socioeconomic location factor maps. Significant factors explaining wetland conversion, in order of importance, are market influence, total wetland area (lower conversion probability), mean annual temperature, and cropland or built-up area. The regression results support the outcomes of the meta-analysis of the conversion processes mentioned in the individual case studies. In other meta-analyses of land change, similar factors (e.g., agricultural development, population growth, market/economic factors) are also identified as important causes of various types of land change (e.g., deforestation, desertification). Meta-analysis helps to identify commonalities across the various local case studies and to identify which variables may lead individual cases to behave differently. The meta-regression provides maps indicating the likelihood of wetland conversion worldwide based on the location factors that have determined historic conversions.
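
    The meta-regression step lends itself to a compact sketch. The code below fits a logistic model to synthetic location factors and turns it into per-cell conversion probabilities; the factor names, coefficients, and data are invented for illustration and do not reproduce the study's dataset.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Synthetic stand-ins for gridded location factors (one row per cell).
        rng = np.random.default_rng(1)
        n = 1000
        X = np.column_stack([
            rng.normal(size=n),  # market influence
            rng.normal(size=n),  # total wetland area
            rng.normal(size=n),  # mean annual temperature
            rng.normal(size=n),  # cropland / built-up fraction
        ])
        # Assumed signs follow the abstract: larger wetland area lowers the odds.
        logit = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2] + 0.4 * X[:, 3]
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))  # observed conversion

        model = LogisticRegression().fit(X, y)
        p_convert = model.predict_proba(X)[:, 1]          # per-cell likelihood
        # Reshaping p_convert onto the grid rows/columns yields the global map.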

  5. Event-related fMRI studies of episodic encoding and retrieval: meta-analyses using activation likelihood estimation.

    PubMed

    Spaniol, Julia; Davidson, Patrick S R; Kim, Alice S N; Han, Hua; Moscovitch, Morris; Grady, Cheryl L

    2009-07-01

    The recent surge in event-related fMRI studies of episodic memory has generated a wealth of information about the neural correlates of encoding and retrieval processes. However, interpretation of individual studies is hampered by methodological differences, and by the fact that sample sizes are typically small. We submitted results from studies of episodic memory in healthy young adults, published between 1998 and 2007, to a voxel-wise quantitative meta-analysis using activation likelihood estimation [Laird, A. R., McMillan, K. M., Lancaster, J. L., Kochunov, P., Turkeltaub, P. E., & Pardo, J. V., et al. (2005). A comparison of label-based review and ALE meta-analysis in the stroop task. Human Brain Mapping, 25, 6-21]. We conducted separate meta-analyses for four contrasts of interest: episodic encoding success as measured in the subsequent-memory paradigm (subsequent Hit vs. Miss), episodic retrieval success (Hit vs. Correct Rejection), objective recollection (e.g., Source Hit vs. Item Hit), and subjective recollection (e.g., Remember vs. Know). Concordance maps revealed significant cross-study overlap for each contrast. In each case, the left hemisphere showed greater concordance than the right hemisphere. Both encoding and retrieval success were associated with activation in medial-temporal, prefrontal, and parietal regions. Left ventrolateral prefrontal cortex (PFC) and medial-temporal regions were more strongly involved in encoding, whereas left superior parietal and dorsolateral and anterior PFC regions were more strongly involved in retrieval. Objective recollection was associated with activation in multiple PFC regions, as well as multiple posterior parietal and medial-temporal areas, but not hippocampus. Subjective recollection, in contrast, showed left hippocampal involvement. In summary, these results identify broadly consistent activation patterns associated with episodic encoding and retrieval, and subjective and objective recollection, but also subtle differences among these processes.
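
    The essence of activation likelihood estimation is short enough to sketch. The one-dimensional toy below treats each reported focus as a Gaussian blob and takes the voxel-wise union across studies; real ALE operates on a 3-D brain volume with sample-size-dependent kernel widths, and the foci here are invented.

        import numpy as np

        # One modeled-activation (MA) map per study: each focus becomes a Gaussian
        # blob; the ALE value at a voxel is the union of the studies' MA values.
        grid = np.linspace(0, 100, 501)  # 1-D stand-in for the brain volume
        fwhm = 10.0                      # real ALE shrinks this with sample size
        sd = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))

        def ma_map(foci):
            blobs = np.exp(-0.5 * ((grid[:, None] - np.asarray(foci)) / sd) ** 2)
            return 1.0 - np.prod(1.0 - blobs, axis=1)   # union within one study

        studies = [[30.0, 55.0], [32.0], [29.0, 80.0]]  # hypothetical foci
        ale = 1.0 - np.prod([1.0 - ma_map(f) for f in studies], axis=0)
        # Significance is then assessed against a null of spatially random foci.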

  6. Drivers of Wetland Conversion: a Global Meta-Analysis

    PubMed Central

    van Asselen, Sanneke; Verburg, Peter H.; Vermaat, Jan E.; Janse, Jan H.

    2013-01-01

    Meta-analysis of case studies has become an important tool for synthesizing case study findings in land change. Meta-analyses of deforestation, urbanization, desertification and change in shifting cultivation systems have been published. The present study adds to this literature with an analysis of the proximate causes and underlying forces of wetland conversion at a global scale, using two complementary approaches of systematic review. Firstly, a meta-analysis of 105 case-study papers describing wetland conversion was performed, showing that different combinations of multiple-factor proximate causes and underlying forces drive wetland conversion. Agricultural development has been the main proximate cause of wetland conversion, and economic growth and population density are the most frequently identified underlying forces. Secondly, to add a more quantitative component to the study, a logistic meta-regression analysis was performed to estimate the likelihood of wetland conversion worldwide, using globally consistent biophysical and socioeconomic location factor maps. Significant factors explaining wetland conversion, in order of importance, are market influence, total wetland area (lower conversion probability), mean annual temperature, and cropland or built-up area. The regression results support the outcomes of the meta-analysis of the conversion processes mentioned in the individual case studies. In other meta-analyses of land change, similar factors (e.g., agricultural development, population growth, market/economic factors) are also identified as important causes of various types of land change (e.g., deforestation, desertification). Meta-analysis helps to identify commonalities across the various local case studies and to identify which variables may lead individual cases to behave differently. The meta-regression provides maps indicating the likelihood of wetland conversion worldwide based on the location factors that have determined historic conversions. PMID:24282580

  7. Housing arrangement and location determine the likelihood of housing loss due to wildfire

    USGS Publications Warehouse

    Syphard, Alexandra D.; Keeley, Jon E.; Massada, Avi Bar; Brennan, Teresa J.; Radeloff, Volker C.

    2012-01-01

    Surging wildfires across the globe are contributing to escalating residential losses and have major social, economic, and ecological consequences. The highest losses in the U.S. occur in southern California, where nearly 1000 homes per year have been destroyed by wildfires since 2000. Wildfire risk reduction efforts focus primarily on fuel reduction and, to a lesser degree, on house characteristics and homeowner responsibility. However, the extent to which land use planning could alleviate wildfire risk has been largely missing from the debate despite large numbers of homes being placed in the most hazardous parts of the landscape. Our goal was to examine how housing location and arrangement affect the likelihood that a home will be lost when a wildfire occurs. We developed an extensive geographic dataset of structure locations, including more than 5500 structures that were destroyed or damaged by wildfire since 2001, and identified the main contributors to property loss in two extensive, fire-prone regions in southern California. The arrangement and location of structures strongly affected their susceptibility to wildfire, with property loss most likely at low to intermediate structure densities and in areas with a history of frequent fire. Rates of structure loss were higher when structures were surrounded by wildland vegetation, and were generally higher in herbaceous fuel types than in higher fuel-volume woody types. Empirically based maps developed using housing pattern and location performed better in distinguishing hazardous from non-hazardous areas than maps based on fuel distribution. The strong importance of housing arrangement and location indicates that land use planning may be a critical tool for reducing fire risk, but it will require reliable delineations of the most hazardous locations.

  8. Mapping the “What” and “Where” Visual Cortices and Their Atrophy in Alzheimer's Disease: Combined Activation Likelihood Estimation with Voxel-Based Morphometry

    PubMed Central

    Deng, Yanjia; Shi, Lin; Lei, Yi; Liang, Peipeng; Li, Kuncheng; Chu, Winnie C. W.; Wang, Defeng

    2016-01-01

    The human cortical regions for processing high-level visual (HLV) functions of different categories remain ambiguous, especially in terms of their conjunctions and specifications. Moreover, the neurobiology of declined HLV functions in patients with Alzheimer's disease (AD) has not been fully investigated. This study provides a functionally sorted overview of HLV cortices for processing “what” and “where” visual perceptions, and it investigates their atrophy in patients with AD and mild cognitive impairment (MCI). Based upon activation likelihood estimation (ALE), brain regions responsible for processing five categories of visual perceptions included in “what” and “where” visions (i.e., object, face, word, motion, and spatial visions) were analyzed, and subsequent contrast analyses were performed to show regions with conjunctive and specific activations for processing these visual functions. Next, based on the resulting ALE maps, the atrophy of HLV cortices in AD and MCI patients was evaluated using voxel-based morphometry. Our ALE results showed brain regions for processing visual perception across the five categories, as well as areas of conjunction and specification. Our comparisons of gray matter (GM) volume demonstrated atrophy of three “where” visual cortices in the late MCI group and extensive atrophy of HLV cortices (25 regions in both “what” and “where” visual cortices) in the AD group. In addition, the GM volume of atrophied visual cortices in AD and MCI subjects was found to be correlated to the deterioration of overall cognitive status and to the cognitive performances related to memory, execution, and object recognition functions. In summary, these findings may add to our understanding of HLV network organization and of the evolution of visual perceptual dysfunction in AD as the disease progresses. PMID:27445770

  9. Imaging different components of a tectonic tremor sequence in southwestern Japan using an automatic statistical detection and location method

    NASA Astrophysics Data System (ADS)

    Poiata, Natalia; Vilotte, Jean-Pierre; Bernard, Pascal; Satriano, Claudio; Obara, Kazushige

    2018-06-01

    In this study, we demonstrate the capability of an automatic network-based detection and location method to extract and analyse different components of tectonic tremor activity by analysing a 9-day energetic tectonic tremor sequence occurring at the downdip extension of the subducting slab in southwestern Japan. The applied method exploits the coherency of multiscale, frequency-selective characteristics of non-stationary signals recorded across the seismic network. Use of different characteristic functions in the signal processing step of the method allows extracting and locating the sources of short-duration impulsive signal transients associated with low-frequency earthquakes, and of longer-duration energy transients during the tectonic tremor sequence. Frequency-dependent characteristic functions, based on higher-order statistical properties of the seismic signals, are used for the detection and location of low-frequency earthquakes. This yields a more complete (~6.5 times more events) and time-resolved catalogue of low-frequency earthquakes than the routine catalogue provided by the Japan Meteorological Agency. As such, this catalogue allows resolving the space-time evolution of low-frequency earthquake activity in great detail, unravelling spatial and temporal clustering, modulation in response to tide, and different scales of space-time migration patterns. In the second part of the study, the detection and source location of longer-duration signal energy transients within the tectonic tremor sequence is performed using characteristic functions built from smoothed frequency-dependent energy envelopes. This leads to a catalogue of longer-duration energy sources during the tectonic tremor sequence, characterized by their durations and 3-D spatial likelihood maps of the energy-release source regions. The summary 3-D likelihood map for the 9-day tectonic tremor sequence, built from this catalogue, exhibits an along-strike spatial segmentation of the long-duration energy-release regions, matching the large-scale clustering features evidenced from the low-frequency earthquake activity analysis. Further examination of the two catalogues showed that the extracted short-duration low-frequency earthquake activity coincides in space, within about 10-15 km, with the longer-duration energy sources during the tectonic tremor sequence. This observation provides a potential constraint on the size of the longer-duration energy-radiating source region in relation to the clustering of low-frequency earthquake activity during the analysed tectonic tremor sequence. We show that advanced statistical network-based methods offer new capabilities for automatic high-resolution detection, location and monitoring of different scale-components of tectonic tremor activity, enriching existing slow earthquake catalogues. Systematic application of such methods to large continuous data sets will allow imaging the slow transient seismic energy-release activity at higher resolution and, therefore, provide new insights into the underlying multiscale mechanisms of slow earthquake generation.
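
    The flavor of a higher-order-statistics characteristic function can be sketched quickly: running kurtosis of a trace spikes on impulsive, LFE-like arrivals while staying near zero on stationary noise. The window length and synthetic trace below are illustrative assumptions; the actual method uses multiscale, frequency-selective functions across a whole network.

        import numpy as np

        def running_kurtosis(x, win=200):
            # Trailing-window excess kurtosis: a simple impulsiveness detector.
            out = np.zeros_like(x)
            for i in range(win, x.size):
                w = x[i - win:i]
                z = (w - w.mean()) / w.std()
                out[i] = np.mean(z ** 4) - 3.0
            return out

        rng = np.random.default_rng(8)
        trace = rng.normal(size=5000)                  # stationary background noise
        trace[2500:2520] += 5.0 * rng.normal(size=20)  # impulsive transient ("LFE")
        cf = running_kurtosis(trace)
        print(cf.argmax())                             # peaks just after sample 2500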

  10. Imaging different components of a tectonic tremor sequence in southwestern Japan using an automatic statistical detection and location method

    NASA Astrophysics Data System (ADS)

    Poiata, Natalia; Vilotte, Jean-Pierre; Bernard, Pascal; Satriano, Claudio; Obara, Kazushige

    2018-02-01

    In this study, we demonstrate the capability of an automatic network-based detection and location method to extract and analyse different components of tectonic tremor activity by analysing a 9-day energetic tectonic tremor sequence occurring at the down-dip extension of the subducting slab in southwestern Japan. The applied method exploits the coherency of multi-scale, frequency-selective characteristics of non-stationary signals recorded across the seismic network. Use of different characteristic functions in the signal processing step of the method allows extracting and locating the sources of short-duration impulsive signal transients associated with low-frequency earthquakes, and of longer-duration energy transients during the tectonic tremor sequence. Frequency-dependent characteristic functions, based on higher-order statistical properties of the seismic signals, are used for the detection and location of low-frequency earthquakes. This yields a more complete (~6.5 times more events) and time-resolved catalogue of low-frequency earthquakes than the routine catalogue provided by the Japan Meteorological Agency. As such, this catalogue allows resolving the space-time evolution of low-frequency earthquake activity in great detail, unravelling spatial and temporal clustering, modulation in response to tide, and different scales of space-time migration patterns. In the second part of the study, the detection and source location of longer-duration signal energy transients within the tectonic tremor sequence is performed using characteristic functions built from smoothed frequency-dependent energy envelopes. This leads to a catalogue of longer-duration energy sources during the tectonic tremor sequence, characterized by their durations and 3-D spatial likelihood maps of the energy-release source regions. The summary 3-D likelihood map for the 9-day tectonic tremor sequence, built from this catalogue, exhibits an along-strike spatial segmentation of the long-duration energy-release regions, matching the large-scale clustering features evidenced from the low-frequency earthquake activity analysis. Further examination of the two catalogues showed that the extracted short-duration low-frequency earthquake activity coincides in space, within about 10-15 km, with the longer-duration energy sources during the tectonic tremor sequence. This observation provides a potential constraint on the size of the longer-duration energy-radiating source region in relation to the clustering of low-frequency earthquake activity during the analysed tectonic tremor sequence. We show that advanced statistical network-based methods offer new capabilities for automatic high-resolution detection, location and monitoring of different scale-components of tectonic tremor activity, enriching existing slow earthquake catalogues. Systematic application of such methods to large continuous data sets will allow imaging the slow transient seismic energy-release activity at higher resolution and, therefore, provide new insights into the underlying multi-scale mechanisms of slow earthquake generation.

  11. Mapping cancer mortality-to-incidence ratios to illustrate racial and sex disparities in a high-risk population.

    PubMed

    Hébert, James R; Daguise, Virginie G; Hurley, Deborah M; Wilkerson, Rebecca C; Mosley, Catishia M; Adams, Swann A; Puett, Robin; Burch, James B; Steck, Susan E; Bolick-Aldrich, Susan W

    2009-06-01

    Comparisons of incidence and mortality rates are the metrics used most commonly to define cancer-related racial disparities. In the US, and particularly in South Carolina, these largely disfavor African Americans (AAs). Computed from readily available data sources, the mortality-to-incidence rate ratio (MIR) provides a population-based indicator of survival. South Carolina Central Cancer Registry incidence data and Vital Registry death data were used to construct MIRs. ArcGIS 9.2 mapping software was used to map cancer MIRs by sex and race for 8 Health Regions within South Carolina for all cancers combined and for breast, cervical, colorectal, lung, oral, and prostate cancers. Racial differences in cancer MIRs were observed for both sexes for all cancers combined and for most individual sites. The largest racial differences were observed for female breast, prostate, and oral cancers, and AAs had MIRs nearly twice those of European Americans (EAs). Comparing and mapping race- and sex-specific cancer MIRs provides a powerful way to observe the scope of the cancer problem. By using these methods, in the current study, AAs had much higher cancer MIRs compared with EAs for most cancer sites in nearly all regions of South Carolina. Future work must be directed at explaining and addressing the underlying differences in cancer outcomes by region and race. MIR mapping allows for pinpointing areas where future research has the greatest likelihood of identifying the causes of large, persistent, cancer-related disparities. Other regions with access to high-quality data may find it useful to compare MIRs and conduct MIR mapping. (c) 2009 American Cancer Society.
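
    The MIR itself is a one-line computation once the registry rates are in hand; the sketch below uses invented rates for two regions, with the mapping left to the GIS step described above.

        import pandas as pd

        # Hypothetical age-adjusted rates per 100,000, two regions x two groups.
        df = pd.DataFrame({
            "region": ["R1", "R1", "R2", "R2"],
            "race":   ["AA", "EA", "AA", "EA"],
            "mortality_rate": [34.0, 24.0, 38.0, 25.0],
            "incidence_rate": [120.0, 130.0, 118.0, 128.0],
        })
        df["MIR"] = df["mortality_rate"] / df["incidence_rate"]
        # Joining MIR to the Health Region polygons in a GIS yields the map.
        print(df.pivot(index="region", columns="race", values="MIR").round(2))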

  12. Analysis of Multispectral Time Series for supporting Forest Management Plans

    NASA Astrophysics Data System (ADS)

    Simoniello, T.; Carone, M. T.; Costantini, G.; Frattegiani, M.; Lanfredi, M.; Macchiato, M.

    2010-05-01

    Adequate forest management requires specific plans based on updated and detailed mapping. Multispectral satellite time series have been widely applied to forest monitoring and studies at different scales, thanks to their capability of providing synoptic information on some basic parameters describing vegetation distribution and status. As a low-cost tool for supporting forest management plans in an operational context, we tested the use of Landsat-TM/ETM time series (1987-2006) in the high Agri Valley (Southern Italy) for planning field surveys as well as for the integration of existing cartography. As a preliminary step, no-change regression normalization was applied to make all scenes in the time series radiometrically consistent; then all the available data on forest maps, municipal boundaries, water basins, rivers, and roads were overlaid in a GIS environment. From the 2006 image we derived the NDVI map and analyzed the distribution for each land cover class. To separate the physiological variability and identify the anomalous areas, a threshold on the distributions was applied. To label the non-homogeneous areas, a multitemporal analysis was performed by separating heterogeneity due to cover changes from that linked to basilar unit mapping and classification labelling aggregations. A map of priority areas was then produced to support the field survey plan. To analyze the territorial evolution, historical land cover maps were derived by adopting a hybrid classification approach based on a preliminary segmentation, the identification of training areas, and a subsequent maximum likelihood categorization. Such an analysis was fundamental for the general assessment of the territorial dynamics and, in particular, for evaluating the efficacy of past intervention activities.

  13. Land cover mapping with emphasis to burnt area delineation using co-orbital ALI and Landsat TM imagery

    NASA Astrophysics Data System (ADS)

    Petropoulos, George P.; Kontoes, Charalambos C.; Keramitsoglou, Iphigenia

    2012-08-01

    In this study, the potential of the EO-1 Advanced Land Imager (ALI) radiometer for land cover and especially burnt area mapping from a single image analysis is investigated. Co-orbital imagery from the Landsat Thematic Mapper (TM) was also utilised for comparison purposes. Both images were acquired shortly after the suppression of a fire that occurred during the summer of 2009 north-east of Athens, the capital of Greece. The Maximum Likelihood (ML), Artificial Neural Networks (ANNs) and Support Vector Machines (SVMs) classifiers were parameterised and subsequently applied to the acquired satellite datasets. Evaluation of the land use/cover mapping accuracy was based on the error matrix statistics. Also, the McNemar test was used to evaluate the statistical significance of the differences between the approaches tested. Derived burnt area estimates were validated against the operationally deployed Services and Applications For Emergency Response (SAFER) Burnt Scar Mapping service. All classifiers applied to either ALI or TM imagery proved flexible enough to map land cover and also to extract the burnt area from other land surface types. The highest total classification accuracy and burnt area detection capability was returned from the application of SVMs to ALI data. This was due to the SVMs' ability to identify an optimal separating hyperplane for class separation, which better exploited ALI's advanced technological characteristics in comparison to those of the TM sensor. This study is, to our knowledge, the first of its kind, effectively demonstrating the benefits of applying SVMs to ALI data and further implying that ALI technology may prove highly valuable in mapping burnt areas and land use/cover if it is incorporated into the development of the Landsat 8 mission, planned for launch in the coming years.

  14. Development of a transformation model to derive general population-based utility: Mapping the pruritus-visual analog scale (VAS) to the EQ-5D utility.

    PubMed

    Park, Sun-Young; Park, Eun-Ja; Suh, Hae Sun; Ha, Dongmun; Lee, Eui-Kyung

    2017-08-01

    Although nonpreference-based disease-specific measures are widely used in clinical studies, they cannot generate utilities for economic evaluation. A solution to this problem is to estimate utilities from disease-specific instruments using the mapping function. This study aimed to develop a transformation model for mapping the pruritus-visual analog scale (VAS) to the EuroQol 5-Dimension 3-Level (EQ-5D-3L) utility index in pruritus. A cross-sectional survey was conducted with a sample (n = 268) drawn from the general population of South Korea. Data were randomly divided into 2 groups, one for estimating and the other for validating mapping models. To select the best model, we developed and compared 3 separate models using demographic information and the pruritus-VAS as independent variables. The predictive performance was assessed using the mean absolute deviation and root mean square error in a separate dataset. Among the 3 models, model 2 using age, age squared, sex, and the pruritus-VAS as independent variables had the best performance based on the goodness of fit and model simplicity, with a log likelihood of 187.13. The 3 models had similar precision errors based on mean absolute deviation and root mean square error in the validation dataset. No statistically significant difference was observed between the mean observed and predicted values in all models. In conclusion, model 2 was chosen as the preferred mapping model. Outcomes measured as the pruritus-VAS can be transformed into the EQ-5D-3L utility index using this mapping model, which makes an economic evaluation possible when only pruritus-VAS data are available. © 2017 John Wiley & Sons, Ltd.
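
    The selected model has a simple closed form, which the sketch below fits by ordinary least squares on synthetic survey rows; the coefficients and data are invented, and only the covariate set (intercept, age, age squared, sex, VAS) mirrors "model 2".

        import numpy as np

        # Hypothetical survey: age, sex (1 = male), pruritus-VAS (0-100), utility.
        rng = np.random.default_rng(2)
        n = 268
        age = rng.uniform(20, 70, n)
        sex = rng.integers(0, 2, n).astype(float)
        vas = rng.uniform(0, 100, n)
        utility = (0.95 - 0.003 * vas - 1e-5 * (age - 45) ** 2 + 0.01 * sex
                   + rng.normal(0, 0.05, n))

        # Design matrix mirroring "model 2": intercept, age, age^2, sex, VAS.
        X = np.column_stack([np.ones(n), age, age ** 2, sex, vas])
        beta, *_ = np.linalg.lstsq(X, utility, rcond=None)

        def predict_utility(age, sex, vas):
            return beta @ np.array([1.0, age, age ** 2, sex, vas])

        print(predict_utility(50, 1, 60))  # mapped EQ-5D utility, one respondent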

  15. Mapping Cancer Mortality-to-Incidence Ratios to Illustrate Racial and Sex Disparities in a High-risk Population

    PubMed Central

    Hébert, James R.; Daguise, Virginie G.; Hurley, Deborah M.; Wilkerson, Rebecca C.; Mosley, Catishia M.; Adams, Swann A.; Puett, Robin; Burch, James B.; Steck, Susan E.; Bolick-Aldrich, Susan W.

    2009-01-01

    Background Comparisons of incidence and mortality rates are the metrics used most commonly to define cancer-related racial disparities. In the US, and particularly in South Carolina, these largely disfavor African Americans (AAs). Computed from readily available data sources, the mortality-to-incidence rate ratio (MIR) provides a population-based indicator of survival. Methods South Carolina Central Cancer Registry incidence data and Vital Registry death data were used to construct MIRs. ArcGIS 9.2 mapping software was used to map cancer MIRs by sex and race for 8 Health Regions within South Carolina for all cancers combined and for breast, cervical, colorectal, lung, oral, and prostate cancers. Results Racial differences in cancer MIRs were observed for both sexes for all cancers combined and for most individual sites. The largest racial differences were observed for female breast, prostate, and oral cancers, and AAs had MIRs nearly twice those of European Americans (EAs). Conclusions Comparing and mapping race- and sex-specific cancer MIRs provides a powerful way to observe the scope of the cancer problem. By using these methods, in the current study, AAs had much higher cancer MIRs compared with EAs for most cancer sites in nearly all regions of South Carolina. Future work must be directed at explaining and addressing the underlying differences in cancer outcomes by region and race. MIR mapping allows for pinpointing areas where future research has the greatest likelihood of identifying the causes of large, persistent, cancer-related disparities. Other regions with access to high-quality data may find it useful to compare MIRs and conduct MIR mapping. PMID:19296515

  16. Direct Reconstruction of CT-Based Attenuation Correction Images for PET With Cluster-Based Penalties

    NASA Astrophysics Data System (ADS)

    Kim, Soo Mee; Alessio, Adam M.; De Man, Bruno; Kinahan, Paul E.

    2017-03-01

    Extremely low-dose (LD) CT acquisitions used for PET attenuation correction have high levels of noise and potential bias artifacts due to photon starvation. This paper explores the use of a priori knowledge for iterative image reconstruction of the CT-based attenuation map. We investigate a maximum a posteriori framework with a cluster-based multinomial penalty for direct iterative coordinate descent (dICD) reconstruction of the PET attenuation map. The objective function for direct iterative attenuation map reconstruction used a Poisson log-likelihood data fit term and evaluated two image penalty terms of spatial and mixture distributions. The spatial regularization is based on a quadratic penalty. For the mixture penalty, we assumed that the attenuation map may consist of four material clusters: air + background, lung, soft tissue, and bone. Using simulated noisy sinogram data, dICD reconstruction was performed with different strengths of the spatial and mixture penalties. The combined spatial and mixture penalties reduced the root mean squared error (RMSE) by roughly a factor of two compared with weighted least squares and filtered backprojection reconstruction of the CT images. The combined spatial and mixture penalties resulted in only slightly lower RMSE compared with a spatial quadratic penalty alone. For direct PET attenuation map reconstruction from ultra-LD CT acquisitions, the combination of spatial and mixture penalties offers regularization of both variance and bias and is a potential method to reconstruct attenuation maps with negligible patient dose. The presented results, using a best-case histogram, suggest that the mixture penalty does not offer a substantive benefit over conventional quadratic regularization, and diminish enthusiasm for exploring future applications of the mixture penalty.
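
    The structure of the penalized objective is easy to write down. The sketch below evaluates a Poisson transmission log-likelihood plus a quadratic neighbor penalty on a tiny 1-D problem; the system matrix, counts, and penalty weight are invented, and the paper's cluster/mixture penalty is only indicated in a comment.

        import numpy as np

        rng = np.random.default_rng(3)
        npix, nray = 32, 64
        A = rng.random((nray, npix)) * 0.1         # stand-in path-length matrix
        mu_true = np.zeros(npix)
        mu_true[10:20] = 0.02                      # attenuation "object"
        b = 1e4                                    # blank-scan (flux) counts
        y = rng.poisson(b * np.exp(-A @ mu_true))  # noisy low-dose sinogram

        def objective(mu, beta_spatial=1e3):
            line = A @ mu
            # Poisson log-likelihood for transmission data (constants dropped).
            loglik = np.sum(-y * line - b * np.exp(-line))
            rough = np.sum(np.diff(mu) ** 2)       # quadratic spatial penalty
            return -loglik + beta_spatial * rough

        print(objective(mu_true))
        # dICD would minimize this one pixel at a time; the mixture penalty
        # would add a term pulling mu toward the four material clusters.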

  17. Exploring the Influence of Topographic Correction and SWIR Spectral Information Inclusion on Burnt Scars Detection From High Resolution EO Imagery: A Case Study Using ASTER imagery

    NASA Astrophysics Data System (ADS)

    Said, Yahia A.; Petropoulos, George; Srivastava, Prashant K.

    2014-05-01

    Information on burned area estimates is of key importance in environmental and ecological studies as well as in fire management, including damage assessment and planning of post-fire recovery of affected areas. Earth Observation (EO) provides today the most efficient way of obtaining such information in a rapid, consistent and cost-effective manner. The present study aimed at exploring the effect of topographic correction on burnt area delineation in conditions characteristic of a Mediterranean environment, using ASTER high resolution multispectral remotely sensed imagery. A further objective was to investigate the potential added value of including the shortwave infrared (SWIR) bands in improving the retrieval of burned area cartography from the ASTER data. In particular, the capability of the Maximum Likelihood (ML), Support Vector Machines (SVMs) and Object-based Image Analysis (OBIA) classification techniques was examined for the purposes of our study. The case study is a typical Mediterranean site in Greece, on which a fire event occurred during the summer of 2007 and for which post-fire ASTER imagery was acquired. Our results indicated that the combination of topographic correction (ortho-rectification) with the inclusion of the SWIR bands returned the most accurate results in terms of burnt area mapping. In terms of image processing methods, OBIA showed the best results and was found to be the most promising approach for burned area mapping, with the least absolute difference from the validation polygon, followed by SVM and ML. All in all, our study provides an important contribution to understanding the capability of high resolution imagery such as that from the ASTER sensor, and corroborates the usefulness of topographic correction in particular as an image processing step when delineating burnt areas from such data. It also provides further evidence that EO technology can offer an effective practical tool for assessing the extent of ecosystem destruction from wildfires, providing extremely useful information for co-ordinating efforts for the recovery of fire-affected ecosystems after wildfire. Keywords: Remote Sensing, ASTER, Burned area mapping, Maximum Likelihood, Support Vector Machines, Object-based image analysis, Greece

  18. Facial nerve mapping and monitoring in lymphatic malformation surgery.

    PubMed

    Chiara, Jospeh; Kinney, Greg; Slimp, Jefferson; Lee, Gi Soo; Oliaei, Sepehr; Perkins, Jonathan A

    2009-10-01

    Establish the efficacy of preoperative facial nerve mapping and continuous intraoperative EMG monitoring in protecting the facial nerve during resection of cervicofacial lymphatic malformations. Retrospective study in which patients were clinically followed for at least 6 months postoperatively, and long-term outcome was evaluated. Patient demographics, lesion characteristics (i.e., size, stage, location) were recorded. Operative notes revealed surgical techniques, findings, and complications. Preoperative, short-/long-term postoperative facial nerve function was standardized using the House-Brackmann Classification. Mapping was done prior to incision by percutaneously stimulating the facial nerve and its branches and recording the motor responses. Intraoperative monitoring and mapping were accomplished using a four-channel, free-running EMG. Neurophysiologists continuously monitored EMG responses and blindly analyzed intraoperative findings and final EMG interpretations for abnormalities. Seven patients collectively underwent 8 lymphatic malformation surgeries. Median age was 30 months (2-105 months). Lymphatic malformation diagnosis was recorded in 6/8 surgeries. Facial nerve function was House-Brackmann grade I in 8/8 cases preoperatively. Facial nerve was abnormally elongated in 1/8 cases. EMG monitoring recorded abnormal activity in 4/8 cases--two suggesting facial nerve irritation, and two with possible facial nerve damage. Transient or long-term facial nerve paresis occurred in 1/8 cases (House-Brackmann grade II). Preoperative facial nerve mapping combined with continuous intraoperative EMG and mapping is a successful method of identifying the facial nerve course and protecting it from injury during resection of cervicofacial lymphatic malformations involving the facial nerve.

  19. U.S. Geological Survey's ShakeCast: A cloud-based future

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-Wan; Turner, Loren; Bekiri, Nebi

    2014-01-01

    When an earthquake occurs, the U.S. Geological Survey (USGS) ShakeMap portrays the extent of potentially damaging shaking. In turn, the ShakeCast system, a freely available, post-earthquake situational awareness application, automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. ShakeCast is particularly suitable for earthquake planning and response purposes by Departments of Transportation (DOTs), critical facility and lifeline utilities, large businesses, engineering and financial services, and loss and risk modelers. Recent important developments to the ShakeCast system and its user base are described. The newly released Version 3 of the ShakeCast system encompasses advancements in seismology, earthquake engineering, and information technology applicable to the legacy ShakeCast installation (Version 2). In particular, this upgrade includes a full statistical fragility analysis framework for general assessment of structures as part of the near real-time system, direct access to additional earthquake-specific USGS products besides ShakeMap (PAGER, DYFI?, tectonic summary, etc.), significant improvements in the graphical user interface, including a console view for operations centers, and custom, user-defined hazard and loss modules. The release also introduces a new adaptation option to port ShakeCast to the "cloud". Employing Amazon Web Services (AWS), users now have a low-cost alternative to local hosting, fully offloading hardware, software, and communication obligations to the cloud. Other advantages of the "ShakeCast Cloud" strategy include (1) reliability and robustness of offsite operations, (2) scalability, naturally accommodated, (3) serviceability, with problems reduced by software and hardware uniformity, (4) testability, freely available for new users, (5) remote support, allowing expert-facilitated maintenance, (6) adoptability, simplified with disk images, and (7) security, built in at the very high level associated with AWS. The ShakeCast user base continues to expand and broaden. For example, Caltrans, the prototypical ShakeCast user and development supporter, has been providing guidance to other DOTs on the use of the National Bridge Inventory (NBI) database to implement fully functional ShakeCast systems in their states. A long-term goal underway is to further "connect the DOTs" via a Transportation Pooled Fund (TPF) with participating state DOTs. We also review some of the many other users and uses of ShakeCast. Lastly, on the hazard input front, we detail related ShakeMap improvements and ongoing advancements in estimating the likelihood of shaking-induced secondary hazards at structures, facilities, bridges, and along roadways due to landslides and liquefaction, implemented within the ShakeCast framework.

  20. Delay-dependent dynamical analysis of complex-valued memristive neural networks: Continuous-time and discrete-time cases.

    PubMed

    Wang, Jinling; Jiang, Haijun; Ma, Tianlong; Hu, Cheng

    2018-05-01

    This paper considers the delay-dependent stability of memristive complex-valued neural networks (MCVNNs). A novel linear mapping function is presented to transform the complex-valued system into a real-valued system. Under this mapping function, both continuous-time and discrete-time MCVNNs are analyzed. Firstly, when activation functions are continuous but not Lipschitz continuous, an extended matrix inequality is proved to ensure the stability of continuous-time MCVNNs. Furthermore, if activation functions are discontinuous, a discontinuous adaptive controller is designed to establish stability by applying Lyapunov-Krasovskii functionals. Secondly, in contrast with the techniques used for continuous-time MCVNNs, the Halanay-type inequality and comparison principle are used for the first time to explore the dynamical behaviors of discrete-time MCVNNs. Finally, the effectiveness of the theoretical results is illustrated through numerical examples. Copyright © 2018 Elsevier Ltd. All rights reserved.
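
    While the paper's particular mapping function may differ in detail, the standard way to turn a complex-valued linear system into a real-valued one is worth sketching, since it underlies all such transformations:

        import numpy as np

        # W z with W = A + iB and z = x + iy equals the real block system below.
        rng = np.random.default_rng(4)
        n = 3
        W = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        z = rng.normal(size=n) + 1j * rng.normal(size=n)

        A, B = W.real, W.imag
        x, y = z.real, z.imag
        W_real = np.block([[A, -B],
                           [B,  A]])    # 2n x 2n real-valued system matrix
        v = np.concatenate([x, y])      # stacked real state

        out = W_real @ v
        ref = W @ z
        assert np.allclose(out[:n], ref.real) and np.allclose(out[n:], ref.imag)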

  1. Fuzzification of continuous-value spatial evidence for mineral prospectivity mapping

    NASA Astrophysics Data System (ADS)

    Yousefi, Mahyar; Carranza, Emmanuel John M.

    2015-01-01

    Complexities of geological processes, portrayed as certain features in a map (e.g., faults), are natural sources of uncertainty in decision-making for exploration of mineral deposits. Besides natural sources of uncertainty, knowledge-driven (e.g., fuzzy logic) mineral prospectivity mapping (MPM) incurs further uncertainty through the subjective judgment of the analyst, because there is no reliable, proven value of the evidential scores corresponding to the relative importance of geological features that can be measured directly. In this regard, analysts apply expert opinion to assess the relative importance of spatial evidence as meaningful decision support. This paper aims at the fuzzification of continuous spatial data used as proxy evidence, to facilitate and support fuzzy MPM in generating exploration target areas for further examination of undiscovered deposits. In addition, this paper proposes to adapt the concept of expected value to further improve fuzzy logic MPM, because the analysis of uncertain variables can be presented in terms of their expected value. The proposed modified expected value approach to MPM is not only a multi-criteria approach but also treats the uncertainty of geological processes, as depicted by maps or spatial data, in terms of biased weighting more realistically than classified evidential maps, because fuzzy membership scores are defined continuously: there is, for example, no need to categorize distances from evidential features into proximity classes using arbitrary intervals. The proposed continuous weighting approach, followed by integration of the weighted evidence layers using the modified expected value function described in this paper, can be used efficiently in either greenfields or brownfields.
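
    As a sketch of what continuous fuzzification looks like in practice (the logistic form and its parameters below are illustrative choices, not the paper's function):

        import numpy as np

        # Continuous fuzzy membership for a distance-to-fault evidence layer:
        # scores vary smoothly with distance, with no arbitrary proximity classes.
        def fuzzy_membership(distance_km, inflection=2.0, slope=1.5):
            return 1.0 / (1.0 + np.exp(slope * (distance_km - inflection)))

        distances = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])
        print(fuzzy_membership(distances).round(3))  # near 1 at the fault, near 0 far away
        # The weighted evidence layers would then be integrated, e.g. with the
        # modified expected value function, to produce the prospectivity map.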

  2. Smoker Reactivity to Cues: Effects on Craving and on Smoking behavior

    PubMed Central

    Shiffman, Saul; Dunbar, Michael; Kirchner, Thomas; Li, Xiaoxue; Tindle, Hilary; Anderson, Stewart; Scholl, Sarah

    2013-01-01

    We assessed craving and smoking in response to smoking-relevant cues. 207 daily smokers viewed images related to one of six cue sets (cigarettes, positive and negative affect, alcohol, smoking prohibitions, and neutral cues) in separate sessions. Compared to neutral cues, cigarette cues significantly increased craving, and positive affect cues significantly decreased craving. When subjects were then allowed to smoke during continuing cue exposure, cues did not affect the likelihood of smoking or the amount smoked (number of cigarettes, number of puffs, puff time, or increased carbon monoxide). However, craving intensity predicted likelihood of smoking, latency to smoke, and amount smoked, with craving increases after cue exposure making significant independent contributions. Some craving effects were curvilinear, suggesting that they are subject to thresholds and might not be observed under some circumstances. PMID:22708884

  3. A Global Drought Observatory for Emergency Response

    NASA Astrophysics Data System (ADS)

    Vogt, Jürgen; de Jager, Alfred; Carrão, Hugo; Magni, Diego; Mazzeschi, Marco; Barbosa, Paulo

    2016-04-01

    Droughts occur on all continents and across all climates. While in developed countries they cause significant economic and environmental damage, in less developed countries they may cause major humanitarian catastrophes. The magnitude of the problem and the expected increase in drought frequency, extent and severity in many, often highly vulnerable regions of the world demand a change from the current reactive, crisis-management approach towards a more pro-active, risk-management approach. Such an approach needs adequate and timely information from global to local scales, as well as adequate drought management plans. Drought information systems are important for continuous monitoring and forecasting of the situation, in order to provide timely information on developing drought events and their potential impacts. Against this background, the Joint Research Centre (JRC) is developing a Global Drought Observatory (GDO) for the European Commission's humanitarian services, providing up-to-date information on droughts world-wide and their potential impacts. Drought monitoring is achieved by a combination of meteorological and biophysical indicators, while the societal vulnerability to droughts is assessed through the targeted analysis of a series of social, economic and infrastructural indicators. Combining the information on the occurrence and severity of a drought, on the assets at risk, and on the societal vulnerability in the drought-affected areas results in a likelihood of impact, expressed by a Likelihood of Drought Impact (LDI) indicator. The location, extent and magnitude of the LDI are then further analyzed against the number of people and the land use/land cover types affected, in order to provide decision bodies with information on the potential humanitarian and economic consequences in the affected countries or regions. All information is presented through web-mapping interfaces based on OGC standards, and customized reports can be generated by the user. The system will be further developed by increasing the number of sectoral impact indicators and will be validated against known and documented cases around the world. The poster will provide an overview of the system, the LDI and first analysis results.

  4. 27 CFR 9.192 - Wahluke Slope.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... boundary line of section 15, which forms a portion of the boundary line of the Hanford Site, T15N/R26E, Wahatis Peak map; then (4) Proceed generally southwest along the Hanford Site boundary in a series of 90... Bridge map, and continue onto the Priest Rapids NE map to the intersection of the Hanford Site boundary...

  5. 27 CFR 9.192 - Wahluke Slope.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... boundary line of section 15, which forms a portion of the boundary line of the Hanford Site, T15N/R26E, Wahatis Peak map; then (4) Proceed generally southwest along the Hanford Site boundary in a series of 90... Bridge map, and continue onto the Priest Rapids NE map to the intersection of the Hanford Site boundary...

  6. 27 CFR 9.192 - Wahluke Slope.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... boundary line of section 15, which forms a portion of the boundary line of the Hanford Site, T15N/R26E, Wahatis Peak map; then (4) Proceed generally southwest along the Hanford Site boundary in a series of 90... Bridge map, and continue onto the Priest Rapids NE map to the intersection of the Hanford Site boundary...

  7. 27 CFR 9.192 - Wahluke Slope.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... boundary line of section 15, which forms a portion of the boundary line of the Hanford Site, T15N/R26E, Wahatis Peak map; then (4) Proceed generally southwest along the Hanford Site boundary in a series of 90... Bridge map, and continue onto the Priest Rapids NE map to the intersection of the Hanford Site boundary...

  8. 27 CFR 9.192 - Wahluke Slope.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... boundary line of section 15, which forms a portion of the boundary line of the Hanford Site, T15N/R26E, Wahatis Peak map; then (4) Proceed generally southwest along the Hanford Site boundary in a series of 90... Bridge map, and continue onto the Priest Rapids NE map to the intersection of the Hanford Site boundary...

  9. Mapping Their Place: Preschoolers Explore Space, Place, and Literacy

    ERIC Educational Resources Information Center

    Fantozzi, Victoria B.; Cottino, Elizabeth; Gennarelli, Cindy

    2013-01-01

    While maps and globes continue to be an important part of the geography and social studies curricula, there has been some debate about the ability of young children to engage with maps in a meaningful way. Some researchers have argued that children younger than seven do not have the spatial-cognitive abilities to truly understand the perspective and…

  10. Student Progress to Graduation in New York City High Schools: A Metric Designed by New Visions for Public Schools. Part I: Core Components

    ERIC Educational Resources Information Center

    Fairchild, Susan; Gunton, Brad; Donohue, Beverly; Berry, Carolyn; Genn, Ruth; Knevals, Jessica

    2011-01-01

    Students who achieve critical academic benchmarks such as high attendance rates, continuous levels of credit accumulation, and high grades have a greater likelihood of success throughout high school and beyond. However, keeping students on track toward meeting graduation requirements and quickly identifying students who are at risk of falling off…

  11. 78 FR 72639 - Non-Malleable Cast Iron Pipe Fittings From the People's Republic of China: Final Results of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-03

    ... dumping margins likely to prevail is indicated in the ``Final Results of Sunset Review'' section of this... likelihood of continuation or recurrence of dumping and the magnitude of the margins likely to prevail if the... dumping margin likely to prevail [FR Doc. 2013-28952 Filed 12-2-13; 8:45 am] BILLING CODE 3510-DS-P ...

  12. "Your Pronunciation and Your Accent is Very Excellent": Orientations of Identity during Compliment Sequences in English as a Lingua Franca Encounters

    ERIC Educational Resources Information Center

    Jenks, Christopher

    2013-01-01

    The widespread use of English has--for better or worse--shaped the social and communicative norms and practices of many people the world over, and the likelihood of this continuing for the foreseeable future raises questions concerning English ownership, linguistic imperialism, language attrition, and mutual intelligibility, to name a few. These…

  13. Longitudinal Analysis of the Role of Perceived Self-Efficacy for Self-Regulated Learning in Academic Continuance and Achievement

    ERIC Educational Resources Information Center

    Caprara, Gian Vittorio; Fida, Roberta; Vecchione, Michele; Del Bove, Giannetta; Vecchio, Giovanni Maria; Barbaranelli, Claudio; Bandura, Albert

    2008-01-01

    The present study examined the developmental course of perceived efficacy for self-regulated learning and its contribution to academic achievement and likelihood of remaining in school in a sample of 412 Italian students (48% males and 52% females ranging in age from 12 to 22 years). Latent growth curve analysis revealed a progressive decline in…

  14. Polychromatic sparse image reconstruction and mass attenuation spectrum estimation via B-spline basis function expansion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gu, Renliang; Dogandžić, Aleksandar

    2015-03-31

    We develop a sparse image reconstruction method for polychromatic computed tomography (CT) measurements under the blind scenario where the material of the inspected object and the incident energy spectrum are unknown. To obtain a parsimonious measurement model parameterization, we first rewrite the measurement equation using our mass-attenuation parameterization, which has the Laplace integral form. The unknown mass-attenuation spectrum is expanded into basis functions using a B-spline basis of order one. We develop a block coordinate-descent algorithm for constrained minimization of a penalized negative log-likelihood function, where constraints and penalty terms ensure nonnegativity of the spline coefficients and sparsity of the density map image in the wavelet domain. This algorithm alternates between a Nesterov proximal-gradient step for estimating the density map image and an active-set step for estimating the incident spectrum parameters. Numerical simulations demonstrate the performance of the proposed scheme.
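
    As a rough sketch of the proximal-gradient ingredient (a FISTA-type loop for a plain linear-Gaussian surrogate, not the paper's polychromatic log-likelihood or its B-spline step; all sizes and the penalty weight are invented):

        import numpy as np

        # Minimize 0.5*||Ax - y||^2 + lam*||x||_1 subject to x >= 0.
        rng = np.random.default_rng(5)
        m, n = 80, 120
        A = rng.normal(size=(m, n)) / np.sqrt(m)
        x_true = np.maximum(rng.normal(size=n), 0) * (rng.random(n) < 0.1)
        y = A @ x_true + 0.01 * rng.normal(size=m)

        lam = 0.02
        L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
        x = z = np.zeros(n)
        t = 1.0
        for _ in range(200):
            grad = A.T @ (A @ z - y)
            x_new = np.maximum(z - grad / L - lam / L, 0.0)  # prox: shift, clip
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            z = x_new + ((t - 1.0) / t_new) * (x_new - x)    # Nesterov momentum
            x, t = x_new, t_new
        print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))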

  15. Coping zone construction and mapping: an exploratory study of contextual coping, PTSD, and childhood violence exposure in urban areas.

    PubMed

    Sloan-Power, Elizabeth M; Boxer, Paul; McGuirl, Colleen; Church, Ruslana

    2013-06-01

    This mixed-method study explored how urban children aged 11 to 14 cope with multicontextual violence exposures simultaneously, and analyzed the immediate action steps these children took when faced with such violence over time. Participants' (N = 12) narratives were initially analyzed using a grounded theory framework, with 68 violent incidents coded for perceived threat and coping levels. Coping strategies were examined from a Transactional Model of Stress and Coping perspective, taking into account the context and severity of each violent exposure. A comprehensive assessment map was developed to plot and visually reveal the participants' (N = 12) overall contextualized coping responses. Overall "coping zone" scores were generated to index the perceived threat and coping responses associated with each violent incident described. These scores were then correlated with indicators of post-traumatic stress disorder (PTSD). Results indicated that urban children with less optimal coping zone scores across contexts have a greater likelihood of PTSD than children with more optimal scores.

  16. Feature Statistics Modulate the Activation of Meaning During Spoken Word Processing.

    PubMed

    Devereux, Barry J; Taylor, Kirsten I; Randall, Billi; Geertzen, Jeroen; Tyler, Lorraine K

    2016-03-01

    Understanding spoken words involves a rapid mapping from speech to conceptual representations. One distributed feature-based conceptual account assumes that the statistical characteristics of concepts' features--the number of concepts they occur in (distinctiveness/sharedness) and likelihood of co-occurrence (correlational strength)--determine conceptual activation. To test these claims, we investigated the role of distinctiveness/sharedness and correlational strength in speech-to-meaning mapping, using a lexical decision task and computational simulations. Responses were faster for concepts with higher sharedness, suggesting that shared features are facilitatory in tasks like lexical decision that require access to them. Correlational strength facilitated responses for slower participants, suggesting a time-sensitive co-occurrence-driven settling mechanism. The computational simulation showed similar effects, with early effects of shared features and later effects of correlational strength. These results support a general-to-specific account of conceptual processing, whereby early activation of shared features is followed by the gradual emergence of a specific target representation. Copyright © 2015 The Authors. Cognitive Science published by Cognitive Science Society, Inc.
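
    The two feature statistics are straightforward to compute from a binary concept-by-feature matrix; the tiny matrix below and the exact operationalizations are illustrative assumptions, not the study's norms.

        import numpy as np

        # M[c, f] = 1 if concept c has feature f (3 concepts x 4 features).
        M = np.array([[1, 1, 0, 1],
                      [1, 0, 1, 0],
                      [0, 1, 1, 1]], dtype=float)

        # Distinctiveness: inverse of the number of concepts a feature occurs in
        # (a feature shared by many concepts has low distinctiveness).
        distinctiveness = 1.0 / M.sum(axis=0)

        # Correlational strength: mean correlation of a feature with the other
        # features across concepts (how predictably features co-occur).
        r = np.corrcoef(M.T)
        np.fill_diagonal(r, np.nan)
        correlational_strength = np.nanmean(r, axis=1)
        print(distinctiveness.round(2), correlational_strength.round(2))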

  17. Combining Radar and Optical Data for Forest Disturbance Studies

    NASA Technical Reports Server (NTRS)

    Ranson, K. Jon; Smith, David E. (Technical Monitor)

    2002-01-01

    Disturbance is an important factor in determining the carbon balance and succession of forests. Until the early 1990s, researchers focused on using optical or thermal sensors to detect and map forest disturbances from wildfires, logging or insect outbreaks. As part of a NASA Siberian mapping project, a study evaluated the capability of three different radar sensors (ERS, JERS and Radarsat) and an optical sensor (Landsat 7) to detect fire scars, logging and insect damage in the boreal forest. This paper describes the data sets and techniques used to evaluate the use of remote sensing to detect disturbance in central Siberian forests. Using images from each sensor individually and in combination, an assessment of the utility of these sensors was developed. Transformed Divergence analysis and maximum likelihood classification revealed that Landsat data was the single best data type for this purpose. However, the combined use of the three radar sensors and the optical sensor did improve the results of discriminating these disturbances.
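
    A minimal version of the maximum likelihood classification step reads as follows; the training arrays stand in for stacked radar and optical bands and are invented, as are all names.

        import numpy as np
        from scipy.stats import multivariate_normal

        def fit(classes):
            # classes: dict name -> (n_pixels x n_bands) training array.
            return {k: (v.mean(axis=0), np.cov(v.T)) for k, v in classes.items()}

        def classify(pixels, params):
            # Assign each pixel to the class with the highest log-likelihood.
            ll = np.column_stack([
                multivariate_normal.logpdf(pixels, mean=m, cov=c)
                for m, c in params.values()])
            names = list(params)
            return [names[i] for i in ll.argmax(axis=1)]

        rng = np.random.default_rng(7)
        train = {"burn": rng.normal(0, 1, (100, 5)),    # 5 stacked bands
                 "forest": rng.normal(3, 1, (100, 5))}
        print(classify(rng.normal(0, 1, (5, 5)), fit(train)))  # mostly "burn"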

  18. Gender differences in working memory networks: A BrainMap meta-analysis

    PubMed Central

    Hill, Ashley C.; Laird, Angela R.; Robinson, Jennifer L.

    2014-01-01

    Gender differences in psychological processes have been of great interest in a variety of fields. While the majority of research in this area has focused on specific differences in relation to test performance, this study sought to determine the underlying neurofunctional differences observed during working memory, a pivotal cognitive process shown to be predictive of academic achievement and intelligence. Using the BrainMap database, we performed a meta-analysis and applied activation likelihood estimation to our search set. Our results demonstrate consistent working memory networks across genders, but also provide evidence for gender-specific networks whereby females consistently activate more limbic (e.g., amygdala and hippocampus) and prefrontal structures (e.g., right inferior frontal gyrus), and males activate a distributed network inclusive of more parietal regions. These data provide a framework for future investigation using functional or effective connectivity methods to elucidate the underpinnings of gender differences in neural network recruitment during working memory tasks. PMID:25042764
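
    Activation likelihood estimation treats each reported activation focus as a center of spatial uncertainty: foci are blurred with a Gaussian kernel, each study yields a modeled-activation map, and the per-study maps are combined as a probabilistic union. A toy Python sketch of that core computation, with invented foci on a small grid (real ALE normalizes the kernel and works in brain-masked MNI space):

        # Toy sketch of the ALE combination step. Foci and grid are invented.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        shape = (20, 20, 20)                                # toy voxel grid
        studies = [[(5, 5, 5)], [(5, 6, 5), (14, 10, 8)]]   # foci per study

        union = np.ones(shape)
        for foci in studies:
            ma = np.zeros(shape)
            for x, y, z in foci:
                ma[x, y, z] = 1.0
            ma = gaussian_filter(ma, sigma=1.5)   # spatial uncertainty kernel
            ma /= ma.max()                        # treat as activation probability
            union *= 1.0 - ma                     # accumulate P(no activation)
        ale = 1.0 - union    # ALE: P(at least one study activates the voxel)
        print(ale.max())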

  19. A determination of the optimum time of year for remotely classifying marsh vegetation from LANDSAT multispectral scanner data. [Louisiana

    NASA Technical Reports Server (NTRS)

    Butera, M. K. (Principal Investigator)

    1978-01-01

    The author has identified the following significant results. A technique was used to determine the optimum time for classifying marsh vegetation from computer-processed LANDSAT MSS data. The technique depended on the analysis of data derived from supervised pattern recognition by maximum likelihood theory. A dispersion index, defined as the ratio of separability among the class spectral means to variability within the classes, determined the optimum classification time. Data compared from seven LANDSAT passes acquired over the same area of Louisiana marsh indicated that June and September were the optimum times to collectively classify Baccharis halimifolia, Spartina patens, Spartina alterniflora, Juncus roemerianus, and Distichlis spicata. The same technique was used to determine the optimum classification time for individual species. April appeared to be the best month to map Juncus roemerianus; May, Spartina alterniflora; June, Baccharis halimifolia; and September, Spartina patens and Distichlis spicata. This information is important, for instance, when a single species serves as an indicator of a particular environmental condition.
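
    The dispersion index described is a Fisher-style ratio: the spread of the class spectral means divided by the average within-class variability, computed per acquisition date. A minimal Python sketch with hypothetical single-band training samples for three of the species:

        # Minimal sketch of a dispersion index of the kind described:
        # between-class separability over within-class variability.
        # Sample values are hypothetical single-band radiances.
        import numpy as np

        classes = {
            "Spartina patens": np.array([41.0, 43.0, 40.0, 44.0]),
            "Juncus roemerianus": np.array([55.0, 57.0, 54.0]),
            "Distichlis spicata": np.array([48.0, 50.0, 47.0]),
        }

        means = np.array([c.mean() for c in classes.values()])
        between = means.var()                                  # spread of class means
        within = np.mean([c.var() for c in classes.values()])  # mean in-class variance
        print(f"dispersion index = {between / within:.2f}")    # higher = better date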

  20. Ground-water vulnerability to nitrate contamination in the mid-atlantic region

    USGS Publications Warehouse

    Greene, Earl A.; LaMotte, Andrew E.; Cullinan, Kerri-Ann; Smith, Elizabeth R.

    2005-01-01

    The U.S. Environmental Protection Agency's (USEPA) Regional Vulnerability Assessment (ReVA) Program has developed a set of statistical tools to support regional-scale, integrated ecological risk-assessment studies. One of these tools, developed by the U.S. Geological Survey (USGS), uses water-quality data from the USGS National Water-Quality Assessment (NAWQA) and other studies, together with land-cover, geology, soils, and other geographic data, to develop logistic-regression equations that predict the vulnerability of ground water to nitrate concentrations exceeding specified thresholds in the Mid-Atlantic Region. The models were developed and applied to produce spatial probability maps showing the likelihood of elevated nitrate concentrations in the region. These maps can be used to identify areas that currently are at risk and areas where ground water has already been affected by human activities. This information can help regional and local water managers protect water supplies and target land-use planning and monitoring programs in these vulnerable areas.
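
    The mapping step rests on ordinary logistic regression: fit P(nitrate > threshold) from geographic predictors at sampled wells, then evaluate the fitted model over map cells. A minimal Python sketch with hypothetical predictors and outcomes:

        # Minimal sketch of the logistic-regression step behind the probability
        # maps. Predictor names and values are hypothetical.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Each row: [fraction agricultural land, well depth (m), soil permeability]
        X = np.array([[0.8, 10, 3.2], [0.1, 60, 1.1], [0.6, 25, 2.8],
                      [0.2, 45, 1.5], [0.9, 12, 3.5], [0.3, 50, 1.2]])
        y = np.array([1, 0, 1, 0, 1, 0])   # 1 = nitrate exceeded the threshold

        model = LogisticRegression().fit(X, y)

        # Predicted exceedance probability for two unsampled map cells; these
        # are the values one would plot as a spatial probability map.
        cells = np.array([[0.7, 15, 3.0], [0.15, 55, 1.3]])
        print(model.predict_proba(cells)[:, 1])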
