Sample records for variation detection framework

  1. Thermodynamic framework to assess low abundance DNA mutation detection by hybridization.

    PubMed

    Willems, Hanny; Jacobs, An; Hadiwikarta, Wahyu Wijaya; Venken, Tom; Valkenborg, Dirk; Van Roy, Nadine; Vandesompele, Jo; Hooyberghs, Jef

    2017-01-01

    The knowledge of genomic DNA variations in patient samples has a high and increasing value for human diagnostics in its broadest sense. Although many methods and sensors to detect or quantify these variations are available or under development, the number of underlying physico-chemical detection principles is limited. One of these principles is the hybridization of sample target DNA versus nucleic acid probes. We introduce a novel thermodynamics approach and develop a framework to exploit the specific detection capabilities of nucleic acid hybridization, using generic principles applicable to any platform. As a case study, we detect point mutations in the KRAS oncogene on a microarray platform. For the given platform and hybridization conditions, we demonstrate the multiplex detection capability of hybridization and assess the detection limit using thermodynamic considerations; DNA containing point mutations in a background of wild type sequences can be identified down to at least 1% relative concentration. In order to show the clinical relevance, the detection capabilities are confirmed on challenging formalin-fixed paraffin-embedded clinical tumor samples. This enzyme-free detection framework contains the accuracy and efficiency to screen for hundreds of mutations in a single run with many potential applications in molecular diagnostics and the field of personalised medicine.
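The two-state hybridization logic behind this detection principle can be sketched numerically. This is a minimal illustrative model, not the paper's framework: the function names, the free-energy values, and the +3 kcal/mol mismatch penalty are all assumptions. The bound fraction at a probe follows a Langmuir isotherm with equilibrium constant exp(-ΔG/RT), so a low-abundance mutant remains visible because the perfect-match duplex binds orders of magnitude more strongly than the mismatched wild type.

```python
import math

R = 1.987e-3  # gas constant in kcal/(mol*K)

def binding_constant(dG, temp_k=318.0):
    """Equilibrium constant for duplex formation from a hybridization
    free energy dG (kcal/mol, negative = favorable)."""
    return math.exp(-dG / (R * temp_k))

def probe_signal(targets, probe_dG, temp_k=318.0):
    """Hybridization signal at one probe: sum over target species of
    concentration-weighted Langmuir binding (dilute-target limit).
    targets: list of (concentration_M, dG_penalty) pairs, where
    dG_penalty is the mismatch free-energy cost relative to a
    perfect match (0 for the perfectly matching target)."""
    signal = 0.0
    for conc, penalty in targets:
        K = binding_constant(probe_dG + penalty, temp_k)
        signal += conc * K / (1.0 + conc * K)
    return signal

# At a mutant-specific probe: wild type carries an assumed +3 kcal/mol
# mismatch penalty; 1% mutant binds as a perfect match.
wt_only = [(1.0e-9, 3.0)]
with_mutant = [(9.9e-10, 3.0), (1.0e-11, 0.0)]
```

Even at 1% relative concentration the mutant's perfect-match contribution dominates the mismatch background in this toy setting, which is the qualitative effect the thermodynamic framework quantifies.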

  2. Thermodynamic framework to assess low abundance DNA mutation detection by hybridization

    PubMed Central

    Willems, Hanny; Jacobs, An; Hadiwikarta, Wahyu Wijaya; Venken, Tom; Valkenborg, Dirk; Van Roy, Nadine; Vandesompele, Jo; Hooyberghs, Jef

    2017-01-01

    The knowledge of genomic DNA variations in patient samples has a high and increasing value for human diagnostics in its broadest sense. Although many methods and sensors to detect or quantify these variations are available or under development, the number of underlying physico-chemical detection principles is limited. One of these principles is the hybridization of sample target DNA versus nucleic acid probes. We introduce a novel thermodynamics approach and develop a framework to exploit the specific detection capabilities of nucleic acid hybridization, using generic principles applicable to any platform. As a case study, we detect point mutations in the KRAS oncogene on a microarray platform. For the given platform and hybridization conditions, we demonstrate the multiplex detection capability of hybridization and assess the detection limit using thermodynamic considerations; DNA containing point mutations in a background of wild type sequences can be identified down to at least 1% relative concentration. In order to show the clinical relevance, the detection capabilities are confirmed on challenging formalin-fixed paraffin-embedded clinical tumor samples. This enzyme-free detection framework contains the accuracy and efficiency to screen for hundreds of mutations in a single run with many potential applications in molecular diagnostics and the field of personalised medicine. PMID:28542229

  3. A Perfect Match Genomic Landscape Provides a Unified Framework for the Precise Detection of Variation in Natural and Synthetic Haploid Genomes

    PubMed Central

    Palacios-Flores, Kim; García-Sotelo, Jair; Castillo, Alejandra; Uribe, Carina; Aguilar, Luis; Morales, Lucía; Gómez-Romero, Laura; Reyes, José; Garciarubio, Alejandro; Boege, Margareta; Dávila, Guillermo

    2018-01-01

    We present a conceptually simple, sensitive, precise, and essentially nonstatistical solution for the analysis of genome variation in haploid organisms. The generation of a Perfect Match Genomic Landscape (PMGL), which computes intergenome identity with single nucleotide resolution, reveals signatures of variation wherever a query genome differs from a reference genome. Such signatures encode the precise location of different types of variants, including single nucleotide variants, deletions, insertions, and amplifications, effectively introducing the concept of a general signature of variation. The precise nature of variants is then resolved through the generation of targeted alignments between specific sets of sequence reads and known regions of the reference genome. Thus, the perfect match logic decouples the identification of the location of variants from the characterization of their nature, providing a unified framework for the detection of genome variation. We assessed the performance of the PMGL strategy via simulation experiments. We determined the variation profiles of natural genomes and of a synthetic chromosome, both in the context of haploid yeast strains. Our approach uncovered variants that have previously escaped detection. Moreover, our strategy is ideally suited for further refining high-quality reference genomes. The source codes for the automated PMGL pipeline have been deposited in a public repository. PMID:29367403
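The perfect-match landscape idea can be sketched in a few lines. This is a toy reconstruction from the abstract, not the authors' deposited pipeline: count, at every k-mer start position of the reference, how many query-read k-mers match it exactly; any variant produces a valley of zero counts spanning the k-mers that overlap the altered base(s), which is the "general signature of variation".

```python
from collections import Counter

def pmgl(reference, reads, k=8):
    """Simplified Perfect Match Genomic Landscape: for each k-mer start
    position in the reference, count exact-match occurrences of that
    k-mer among all read k-mers. Variants appear as runs of zeros."""
    kmer_counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer_counts[read[i:i + k]] += 1
    return [kmer_counts[reference[i:i + k]]
            for i in range(len(reference) - k + 1)]

# A single substitution at reference position 12 zeroes out every
# reference k-mer overlapping that position.
ref = "ACGTTGCATTAGCCGATAGGCTAA"
read = ref[:12] + "T" + ref[13:]
landscape = pmgl(ref, [read], k=8)
```

The location of the valley identifies where the query differs; resolving *what* the variant is would then be done separately, which is exactly the decoupling the paper describes.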

  4. A Perfect Match Genomic Landscape Provides a Unified Framework for the Precise Detection of Variation in Natural and Synthetic Haploid Genomes.

    PubMed

    Palacios-Flores, Kim; García-Sotelo, Jair; Castillo, Alejandra; Uribe, Carina; Aguilar, Luis; Morales, Lucía; Gómez-Romero, Laura; Reyes, José; Garciarubio, Alejandro; Boege, Margareta; Dávila, Guillermo

    2018-04-01

    We present a conceptually simple, sensitive, precise, and essentially nonstatistical solution for the analysis of genome variation in haploid organisms. The generation of a Perfect Match Genomic Landscape (PMGL), which computes intergenome identity with single nucleotide resolution, reveals signatures of variation wherever a query genome differs from a reference genome. Such signatures encode the precise location of different types of variants, including single nucleotide variants, deletions, insertions, and amplifications, effectively introducing the concept of a general signature of variation. The precise nature of variants is then resolved through the generation of targeted alignments between specific sets of sequence reads and known regions of the reference genome. Thus, the perfect match logic decouples the identification of the location of variants from the characterization of their nature, providing a unified framework for the detection of genome variation. We assessed the performance of the PMGL strategy via simulation experiments. We determined the variation profiles of natural genomes and of a synthetic chromosome, both in the context of haploid yeast strains. Our approach uncovered variants that have previously escaped detection. Moreover, our strategy is ideally suited for further refining high-quality reference genomes. The source codes for the automated PMGL pipeline have been deposited in a public repository. Copyright © 2018 by the Genetics Society of America.

  5. Modeling kinetic rate variation in third generation DNA sequencing data to detect putative modifications to DNA bases

    PubMed Central

    Schadt, Eric E.; Banerjee, Onureena; Fang, Gang; Feng, Zhixing; Wong, Wing H.; Zhang, Xuegong; Kislyuk, Andrey; Clark, Tyson A.; Luong, Khai; Keren-Paz, Alona; Chess, Andrew; Kumar, Vipin; Chen-Plotkin, Alice; Sondheimer, Neal; Korlach, Jonas; Kasarskis, Andrew

    2013-01-01

    Current generation DNA sequencing instruments are moving closer to seamlessly sequencing genomes of entire populations as a routine part of scientific investigation. However, while significant inroads have been made identifying small nucleotide variation and structural variations in DNA that impact phenotypes of interest, progress has not been as dramatic regarding epigenetic changes and base-level damage to DNA, largely due to technological limitations in assaying all known and unknown types of modifications at genome scale. Recently, single-molecule real time (SMRT) sequencing has been reported to identify kinetic variation (KV) events that have been demonstrated to reflect epigenetic changes of every known type, providing a path forward for detecting base modifications as a routine part of sequencing. However, to date no statistical framework has been proposed to enhance the power to detect these events while also controlling for false-positive events. By modeling enzyme kinetics in the neighborhood of an arbitrary location in a genomic region of interest as a conditional random field, we provide a statistical framework for incorporating kinetic information at a test position of interest as well as at neighboring sites that help enhance the power to detect KV events. The performance of this and related models is explored, with the best-performing model applied to plasmid DNA isolated from Escherichia coli and mitochondrial DNA isolated from human brain tissue. We highlight widespread kinetic variation events, some of which strongly associate with known modification events, while others represent putative chemically modified sites of unknown types. PMID:23093720

  6. Modeling kinetic rate variation in third generation DNA sequencing data to detect putative modifications to DNA bases.

    PubMed

    Schadt, Eric E; Banerjee, Onureena; Fang, Gang; Feng, Zhixing; Wong, Wing H; Zhang, Xuegong; Kislyuk, Andrey; Clark, Tyson A; Luong, Khai; Keren-Paz, Alona; Chess, Andrew; Kumar, Vipin; Chen-Plotkin, Alice; Sondheimer, Neal; Korlach, Jonas; Kasarskis, Andrew

    2013-01-01

    Current generation DNA sequencing instruments are moving closer to seamlessly sequencing genomes of entire populations as a routine part of scientific investigation. However, while significant inroads have been made identifying small nucleotide variation and structural variations in DNA that impact phenotypes of interest, progress has not been as dramatic regarding epigenetic changes and base-level damage to DNA, largely due to technological limitations in assaying all known and unknown types of modifications at genome scale. Recently, single-molecule real time (SMRT) sequencing has been reported to identify kinetic variation (KV) events that have been demonstrated to reflect epigenetic changes of every known type, providing a path forward for detecting base modifications as a routine part of sequencing. However, to date no statistical framework has been proposed to enhance the power to detect these events while also controlling for false-positive events. By modeling enzyme kinetics in the neighborhood of an arbitrary location in a genomic region of interest as a conditional random field, we provide a statistical framework for incorporating kinetic information at a test position of interest as well as at neighboring sites that help enhance the power to detect KV events. The performance of this and related models is explored, with the best-performing model applied to plasmid DNA isolated from Escherichia coli and mitochondrial DNA isolated from human brain tissue. We highlight widespread kinetic variation events, some of which strongly associate with known modification events, while others represent putative chemically modified sites of unknown types.
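The neighborhood idea, using kinetic information at the test position and at flanking sites, can be illustrated with a deliberately simplified stand-in. The paper's model is a conditional random field; the distance weighting and log-ratio z-scores below are assumptions chosen only to show the shape of such a statistic.

```python
import math

def kv_score(ipd_obs, ipd_expected, sigma, center, halfwidth=2):
    """Toy neighborhood statistic for a kinetic-variation (KV) event:
    combine per-position log-ratio z-scores of observed vs. expected
    interpulse durations (IPDs) over a window around the test position,
    down-weighting sites by distance from the center. NOT the paper's
    CRF -- an illustrative aggregation only."""
    score = 0.0
    for j in range(max(0, center - halfwidth),
                   min(len(ipd_obs), center + halfwidth + 1)):
        z = (math.log(ipd_obs[j]) - math.log(ipd_expected[j])) / sigma
        weight = 1.0 / (1 + abs(j - center))
        score += weight * z
    return score

# A modified base slows the polymerase at and around position 5.
observed = [1.0, 1.0, 1.0, 1.0, 1.5, 3.0, 1.5, 1.0, 1.0, 1.0]
expected = [1.0] * 10
```

Borrowing signal from neighboring sites is what raises power here: the flanking 1.5x slowdowns reinforce the central 3x slowdown rather than being tested in isolation.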

  7. Multilevel Contextual 3-D CNNs for False Positive Reduction in Pulmonary Nodule Detection.

    PubMed

    Dou, Qi; Chen, Hao; Yu, Lequan; Qin, Jing; Heng, Pheng-Ann

    2017-07-01

    False positive reduction is one of the most crucial components in an automated pulmonary nodule detection system, which plays an important role in lung cancer diagnosis and early treatment. The objective of this paper is to effectively address the challenges in this task and therefore to accurately discriminate the true nodules from a large number of candidates. We propose a novel method employing three-dimensional (3-D) convolutional neural networks (CNNs) for false positive reduction in automated pulmonary nodule detection from volumetric computed tomography (CT) scans. Compared with its 2-D counterparts, the 3-D CNNs can encode richer spatial information and extract more representative features via their hierarchical architecture trained with 3-D samples. More importantly, we further propose a simple yet effective strategy to encode multilevel contextual information to meet the challenges coming with the large variations and hard mimics of pulmonary nodules. The proposed framework has been extensively validated in the LUNA16 challenge held in conjunction with ISBI 2016, where we achieved the highest competition performance metric (CPM) score in the false positive reduction track. Experimental results demonstrated the importance and effectiveness of integrating multilevel contextual information into 3-D CNN framework for automated pulmonary nodule detection in volumetric CT data. While our method is tailored for pulmonary nodule detection, the proposed framework is general and can be easily extended to many other 3-D object detection tasks from volumetric medical images, where the targeting objects have large variations and are accompanied by a number of hard mimics.
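The multilevel-context input construction can be illustrated without the networks themselves. A sketch of extracting co-centered cubic patches at several receptive-field sizes around a candidate location; the patch sizes and the nested-list volume layout are assumptions, and the 3-D CNN streams that would consume these patches are omitted.

```python
def multiscale_patches(volume, center, sizes=(3, 5)):
    """Extract co-centered cubic patches at several scales around a
    candidate nodule location. Each patch would feed one stream of a
    multi-stream 3-D CNN, so the classifier sees both local texture
    and wider context. volume is a nested list indexed [z][y][x]."""
    cz, cy, cx = center
    patches = []
    for s in sizes:
        h = s // 2
        patch = [[[volume[z][y][x]
                   for x in range(cx - h, cx + h + 1)]
                  for y in range(cy - h, cy + h + 1)]
                 for z in range(cz - h, cz + h + 1)]
        patches.append(patch)
    return patches

# A tiny 7x7x7 volume whose voxel value encodes its coordinates.
volume = [[[z * 100 + y * 10 + x for x in range(7)]
           for y in range(7)] for z in range(7)]
patches = multiscale_patches(volume, (3, 3, 3), sizes=(3, 5))
```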

  8. Mapping morphological shape as a high-dimensional functional curve

    PubMed Central

    Fu, Guifang; Huang, Mian; Bo, Wenhao; Hao, Han; Wu, Rongling

    2018-01-01

    Detecting how genes regulate biological shape has become a multidisciplinary research interest because of its wide application in many disciplines. Despite its fundamental importance, the challenges of accurately extracting information from an image, statistically modeling the high-dimensional shape and meticulously locating shape quantitative trait loci (QTL) affect the progress of this research. In this article, we propose a novel integrated framework that incorporates shape analysis, statistical curve modeling and genetic mapping to detect significant QTLs regulating variation of biological shape traits. After quantifying morphological shape via a radius centroid contour approach, each shape, as a phenotype, was characterized as a high-dimensional curve, varying as angle θ runs clockwise with the first point starting from angle zero. We then modeled the dynamic trajectories of three mean curves and variation patterns as functions of θ. Our framework led to the detection of a few significant QTLs regulating the variation of leaf shape collected from a natural population of poplar, Populus szechuanica var tibetica. This population, distributed at altitudes 2000–4500 m above sea level, is an evolutionarily important plant species. This is the first work in the quantitative genetic shape mapping area that emphasizes a sense of ‘function’ instead of decomposing the shape into a few discrete principal components, as the majority of shape studies do. PMID:28062411
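The radius-centroid contour quantification can be sketched directly: each boundary point contributes its distance from the shape's centroid to an angular bin, turning the shape into a curve r(θ). The binning scheme below is an assumption for illustration; the paper's curve-modeling and QTL-mapping stages are not reproduced.

```python
import math

def radius_contour(points, n_angles=360):
    """Represent a closed 2-D shape as a radius-vs-angle curve around
    its centroid: for each contour point, bin its polar angle and
    average the radii falling in each bin. The resulting curve is the
    high-dimensional functional phenotype used for mapping."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    sums = [0.0] * n_angles
    counts = [0] * n_angles
    for x, y in points:
        theta = math.atan2(y - cy, x - cx) % (2 * math.pi)
        b = int(theta / (2 * math.pi) * n_angles) % n_angles
        sums[b] += math.hypot(x - cx, y - cy)
        counts[b] += 1
    return [s / n if n else 0.0 for s, n in zip(sums, counts)]

# Sanity check: a circle of radius 2 yields a flat curve r(theta) = 2.
circle = [(5 + 2 * math.cos(2 * math.pi * k / 64),
           5 + 2 * math.sin(2 * math.pi * k / 64)) for k in range(64)]
curve = radius_contour(circle, n_angles=8)
```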

  9. Peak tree: a new tool for multiscale hierarchical representation and peak detection of mass spectrometry data.

    PubMed

    Zhang, Peng; Li, Houqiang; Wang, Honghui; Wong, Stephen T C; Zhou, Xiaobo

    2011-01-01

    Peak detection is one of the most important steps in mass spectrometry (MS) analysis. However, the detection result is greatly affected by severe spectrum variations. Unfortunately, most current peak detection methods are neither flexible enough to revise false detection results nor robust enough to resist spectrum variations. To improve flexibility, we introduce peak tree to represent the peak information in MS spectra. Each tree node is a peak judgment on a range of scales, and each tree decomposition, as a set of nodes, is a candidate peak detection result. To improve robustness, we combine peak detection and common peak alignment into a closed-loop framework, which finds the optimal decomposition via both peak intensity and common peak information. The common peak information is derived and loopily refined from the density clustering of the latest peak detection result. Finally, we present an improved ant colony optimization biomarker selection method to build a whole MS analysis system. Experiment shows that our peak detection method can better resist spectrum variations and provide higher sensitivity and lower false detection rates than conventional methods. The benefits from our peak-tree-based system for MS disease analysis are also proved on real SELDI data.
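The core multiscale idea, a peak judgment at every scale rather than a single fixed detection, can be illustrated with a simplified stand-in. Moving-average smoothing replaces the paper's representation, and the tree structure, alignment loop, and ant colony selection are omitted; this only shows why scale persistence separates real peaks from noise spikes.

```python
def smooth(xs, w):
    """Moving-average smoothing with window half-width w (w=0 is the
    identity)."""
    n = len(xs)
    return [sum(xs[max(0, i - w):min(n, i + w + 1)]) /
            (min(n, i + w + 1) - max(0, i - w)) for i in range(n)]

def peaks_by_scale(spectrum, scales=(0, 1, 2)):
    """Multiscale peak judgments: strict local maxima of the smoothed
    spectrum at each scale. A genuine peak persists as the scale
    coarsens; a one-point noise spike drops out."""
    result = {}
    for w in scales:
        s = smooth(spectrum, w)
        result[w] = [i for i in range(1, len(s) - 1)
                     if s[i] > s[i - 1] and s[i] > s[i + 1]]
    return result

# A broad peak at index 3 and a single-point noise spike at index 8.
spectrum = [0, 1, 2, 5, 2, 1, 0, 0, 3, 0, 0]
judged = peaks_by_scale(spectrum)
```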

  10. Metabolic Model-Based Integration of Microbiome Taxonomic and Metabolomic Profiles Elucidates Mechanistic Links between Ecological and Metabolic Variation.

    PubMed

    Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M; Young, Vincent B; Jansson, Janet K; Fredricks, David N; Borenstein, Elhanan

    2016-01-01

    Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites' abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. 
Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism.
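The framework's core prediction step, scoring community-wide metabolic potential from taxon abundances and per-taxon capacities, can be sketched as an abundance-weighted sum. The data layout and names below are assumptions for illustration, not the published implementation, which additionally infers genomic content and models metabolic networks.

```python
def community_metabolic_potential(abundances, capacity):
    """For each sample, score each metabolite as the abundance-weighted
    sum of per-taxon net capacities (positive = biosynthesis,
    negative = degradation). Variation of these scores across samples
    is what gets compared against measured metabolite variation.
    abundances: {sample: {taxon: relative_abundance}}
    capacity:   {taxon: {metabolite: net_capacity}}"""
    scores = {}
    for sample, taxa in abundances.items():
        s = {}
        for taxon, ab in taxa.items():
            for met, cap in capacity.get(taxon, {}).items():
                s[met] = s.get(met, 0.0) + ab * cap
        scores[sample] = s
    return scores

# Hypothetical two-taxon community: taxon A produces lactate,
# taxon B degrades it; a compositional shift flips the prediction.
abundances = {"s1": {"A": 0.8, "B": 0.2}, "s2": {"A": 0.2, "B": 0.8}}
capacity = {"A": {"lactate": 1.0}, "B": {"lactate": -0.5}}
scores = community_metabolic_potential(abundances, capacity)
```

A measured metabolite whose variation tracks these predicted scores is "well-predicted" by composition; one that anti-correlates would flag an environmental control point, in the paper's terms.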

  11. Metabolic Model-Based Integration of Microbiome Taxonomic and Metabolomic Profiles Elucidates Mechanistic Links between Ecological and Metabolic Variation

    PubMed Central

    Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M.; Young, Vincent B.; Jansson, Janet K.; Fredricks, David N.

    2016-01-01

    ABSTRACT Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites’ abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. 
IMPORTANCE Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism. PMID:27239563

  12. Improving detection of copy-number variation by simultaneous bias correction and read-depth segmentation.

    PubMed

    Szatkiewicz, Jin P; Wang, WeiBo; Sullivan, Patrick F; Wang, Wei; Sun, Wei

    2013-02-01

    Structural variation is an important class of genetic variation in mammals. High-throughput sequencing (HTS) technologies promise to revolutionize copy-number variation (CNV) detection but present substantial analytic challenges. Converging evidence suggests that multiple types of CNV-informative data (e.g. read-depth, read-pair, split-read) need be considered, and that sophisticated methods are needed for more accurate CNV detection. We observed that various sources of experimental biases in HTS confound read-depth estimation, and note that bias correction has not been adequately addressed by existing methods. We present a novel read-depth-based method, GENSENG, which uses a hidden Markov model and negative binomial regression framework to identify regions of discrete copy-number changes while simultaneously accounting for the effects of multiple confounders. Based on extensive calibration using multiple HTS data sets, we conclude that our method outperforms existing read-depth-based CNV detection algorithms. The concept of simultaneous bias correction and CNV detection can serve as a basis for combining read-depth with other types of information such as read-pair or split-read in a single analysis. A user-friendly and computationally efficient implementation of our method is freely available.
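The bias-correction-before-segmentation idea can be illustrated with a deliberately simplified stand-in. GENSENG itself uses negative binomial regression inside a hidden Markov model; the GC-bin median correction and per-window rounding below are illustrative assumptions that only show why uncorrected depth confounds copy-number calls.

```python
from statistics import median

def gc_corrected_copy_number(depths, gc, bins=5):
    """Toy read-depth CNV caller: normalize each window's depth by the
    median depth of windows with similar GC content, then estimate an
    integer copy number from the corrected ratio (diploid baseline 2).
    A stand-in for simultaneous bias correction + HMM segmentation."""
    # Assign each window to a GC bin and compute per-bin median depth.
    idx = [min(int(g * bins), bins - 1) for g in gc]
    bin_depths = {}
    for i, d in zip(idx, depths):
        bin_depths.setdefault(i, []).append(d)
    bin_med = {i: median(ds) for i, ds in bin_depths.items()}
    # Copy number per window relative to the GC-matched baseline.
    return [round(2 * d / bin_med[i]) for d, i in zip(depths, idx)]

# A heterozygous deletion halves depth in the fourth window.
depths = [100, 100, 100, 50, 100, 100, 100, 100]
gc = [0.4] * 8
calls = gc_corrected_copy_number(depths, gc)
```

A real caller would smooth these per-window calls across neighbors (the HMM's job) rather than rounding each window independently.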

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha

    ABSTRACT Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites’ abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. 
This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. IMPORTANCE Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism.

  14. Modeling stream fish distributions using interval-censored detection times.

    PubMed

    Ferreira, Mário; Filipe, Ana Filipa; Bardos, David C; Magalhães, Maria Filomena; Beja, Pedro

    2016-08-01

    Controlling for imperfect detection is important for developing species distribution models (SDMs). Occupancy-detection models based on the time needed to detect a species can be used to address this problem, but this is hindered when times to detection are not known precisely. Here, we extend the time-to-detection model to deal with detections recorded in time intervals and illustrate the method using a case study on stream fish distribution modeling. We collected electrofishing samples of six fish species across a Mediterranean watershed in Northeast Portugal. Based on a Bayesian hierarchical framework, we modeled the probability of water presence in stream channels, and the probability of species occupancy conditional on water presence, in relation to environmental and spatial variables. We also modeled time-to-first detection conditional on occupancy in relation to local factors, using modified interval-censored exponential survival models. Posterior distributions of occupancy probabilities derived from the models were used to produce species distribution maps. Simulations indicated that the modified time-to-detection model provided unbiased parameter estimates despite interval-censoring. There was a tendency for spatial variation in detection rates to be primarily influenced by depth and, to a lesser extent, stream width. Species occupancies were consistently affected by stream order, elevation, and annual precipitation. Bayesian P-values and AUCs indicated that all models had adequate fit and high discrimination ability, respectively. Mapping of predicted occupancy probabilities showed widespread distribution by most species, but uncertainty was generally higher in tributaries and upper reaches. The interval-censored time-to-detection model provides a practical solution to model occupancy-detection when detections are recorded in time intervals. 
This modeling framework is useful for developing SDMs while controlling for variation in detection rates, as it uses simple data that can be readily collected by field ecologists.
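The interval-censored exponential likelihood at the heart of this extension is compact enough to write out. This is a sketch of that one component, not the authors' full Bayesian hierarchical model: a detection recorded in interval (a, b] contributes exp(-λa) − exp(-λb), and a survey of length T with no detection contributes the right-censored term exp(-λT).

```python
import math

def interval_censored_loglik(lam, intervals, censor_times):
    """Log-likelihood for exponential time-to-first-detection with
    detections recorded only to within time intervals.
    intervals:    list of (a, b) -- species first detected in (a, b]
    censor_times: list of T -- surveys of length T with no detection"""
    ll = 0.0
    for a, b in intervals:
        ll += math.log(math.exp(-lam * a) - math.exp(-lam * b))
    for t in censor_times:
        ll += -lam * t
    return ll

# Hypothetical data: one detection in the first minute, one in the
# second, and one three-minute survey with no detection.
data_intervals = [(0.0, 1.0), (1.0, 2.0)]
data_censored = [3.0]
```

The MLE for λ could then be found by a one-dimensional grid or root search; in the paper this rate is itself modeled as a function of local covariates such as depth and stream width.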

  15. The 4D hyperspherical diffusion wavelet: A new method for the detection of localized anatomical variation.

    PubMed

    Hosseinbor, Ameer Pasha; Kim, Won Hwa; Adluru, Nagesh; Acharya, Amit; Vorperian, Houri K; Chung, Moo K

    2014-01-01

    Recently, the HyperSPHARM algorithm was proposed to parameterize multiple disjoint objects in a holistic manner using the 4D hyperspherical harmonics. The HyperSPHARM coefficients are global; they cannot be used to directly infer localized variations in signal. In this paper, we present a unified wavelet framework that links Hyper-SPHARM to the diffusion wavelet transform. Specifically, we will show that the HyperSPHARM basis forms a subset of a wavelet-based multiscale representation of surface-based signals. This wavelet, termed the hyperspherical diffusion wavelet, is a consequence of the equivalence of isotropic heat diffusion smoothing and the diffusion wavelet transform on the hypersphere. Our framework allows for the statistical inference of highly localized anatomical changes, which we demonstrate in the first-ever developmental study on the hyoid bone investigating gender and age effects. We also show that the hyperspherical wavelet successfully picks up group-wise differences that are barely detectable using SPHARM.

  16. The 4D Hyperspherical Diffusion Wavelet: A New Method for the Detection of Localized Anatomical Variation

    PubMed Central

    Hosseinbor, A. Pasha; Kim, Won Hwa; Adluru, Nagesh; Acharya, Amit; Vorperian, Houri K.; Chung, Moo K.

    2014-01-01

    Recently, the HyperSPHARM algorithm was proposed to parameterize multiple disjoint objects in a holistic manner using the 4D hyperspherical harmonics. The HyperSPHARM coefficients are global; they cannot be used to directly infer localized variations in signal. In this paper, we present a unified wavelet framework that links HyperSPHARM to the diffusion wavelet transform. Specifically, we will show that the HyperSPHARM basis forms a subset of a wavelet-based multiscale representation of surface-based signals. This wavelet, termed the hyperspherical diffusion wavelet, is a consequence of the equivalence of isotropic heat diffusion smoothing and the diffusion wavelet transform on the hypersphere. Our framework allows for the statistical inference of highly localized anatomical changes, which we demonstrate in the first-ever developmental study on the hyoid bone investigating gender and age effects. We also show that the hyperspherical wavelet successfully picks up group-wise differences that are barely detectable using SPHARM. PMID:25320783
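    A one-dimensional analogue illustrates the diffusion wavelet construction: heat-kernel smoothing at increasing scales gives a multiscale family, and differences between consecutive scales act as band-pass wavelets that localize a signal change. Here a Gaussian kernel stands in for the hyperspherical heat kernel and the "anatomical" signal is synthetic.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# 1-D sketch: smooth a signal at a ladder of diffusion scales, then
# take differences of consecutive scales as band-pass wavelet bands.
rng = np.random.default_rng(11)
x = np.zeros(200)
x[90:110] = 1.0                        # a localized "anatomical" bump
x += rng.normal(0, 0.05, 200)          # measurement noise

scales = [1, 2, 4, 8, 16]
smooth = [gaussian_filter1d(x, s) for s in scales]
bands = [smooth[i] - smooth[i + 1] for i in range(len(scales) - 1)]

# The band whose scale matches the bump width localizes the bump,
# which global coefficients (the SPHARM analogue) cannot do directly.
peak = int(np.argmax(np.abs(bands[2])))
print(peak)
```

    The peak response of the matching band falls inside the bump, which is the sense in which the wavelet detects localized variation that a global harmonic expansion smears out.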

  17. VARiD: a variation detection framework for color-space and letter-space platforms.

    PubMed

    Dalca, Adrian V; Rumble, Stephen M; Levy, Samuel; Brudno, Michael

    2010-06-15

    High-throughput sequencing (HTS) technologies are transforming the study of genomic variation. The various HTS technologies have different sequencing biases and error rates, and while most HTS technologies sequence the residues of the genome directly, generating base calls for each position, the Applied Biosystems SOLiD platform generates dibase-coded (color space) sequences. While combining data from the various platforms should increase the accuracy of variation detection, to date there are only a few tools that can identify variants from color space data, and none that can analyze color space and regular (letter space) data together. We present VARiD, a probabilistic method for variation detection from both letter- and color-space reads simultaneously. VARiD is based on a hidden Markov model and uses the forward-backward algorithm to accurately identify heterozygous, homozygous and tri-allelic SNPs, as well as micro-indels. Our analysis shows that VARiD performs better than the AB SOLiD toolset at detecting variants from color-space data alone, and improves the calls dramatically when letter- and color-space reads are combined. The toolset is freely available at http://compbio.cs.utoronto.ca/varid.
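    The forward-backward machinery VARiD builds on can be sketched generically. The toy HMM below is not VARiD's actual state space (which couples genotypes with letter- and color-space read evidence); it only illustrates how the algorithm turns per-position observations into posterior state probabilities.

```python
import numpy as np

# Generic scaled forward-backward posterior decoding.
def forward_backward(pi, A, B, obs):
    """pi: initial probs (S,), A: transitions (S, S),
    B: emissions (S, O), obs: observation indices (T,).
    Returns per-position posterior state probabilities (T, S)."""
    T, S = len(obs), len(pi)
    alpha = np.zeros((T, S)); beta = np.zeros((T, S))
    alpha[0] = pi * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()              # scale to avoid underflow
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        alpha[t] /= alpha[t].sum()
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        beta[t] /= beta[t].sum()
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)

# Two toy states ("reference" vs "variant" position); observations are
# 0/1 for read agrees / disagrees with the reference.
pi = np.array([0.99, 0.01])
A = np.array([[0.999, 0.001], [0.2, 0.8]])
B = np.array([[0.95, 0.05], [0.1, 0.9]])
obs = np.array([0, 0, 1, 1, 1, 0, 0])
post = forward_backward(pi, A, B, obs)
print(post[3])    # posterior over states at the run of mismatches
```

    A run of mismatching observations pushes the posterior toward the variant state at those positions, which is the probabilistic analogue of a variant call.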

  18. Using multilevel spatial models to understand salamander site occupancy patterns after wildfire

    USGS Publications Warehouse

    Chelgren, Nathan; Adams, Michael J.; Bailey, Larissa L.; Bury, R. Bruce

    2011-01-01

    Studies of the distribution of elusive forest wildlife have suffered from the confounding of true presence with the uncertainty of detection. Occupancy modeling, which incorporates probabilities of species detection conditional on presence, is an emerging approach for reducing observation bias. However, the current likelihood modeling framework is restrictive for handling unexplained sources of variation in the response that may occur when there are dependence structures such as smaller sampling units that are nested within larger sampling units. We used multilevel Bayesian occupancy modeling to handle dependence structures and to partition sources of variation in occupancy of sites by terrestrial salamanders (family Plethodontidae) within and surrounding an earlier wildfire in western Oregon, USA. Comparison of model fit favored a spatial N-mixture model that accounted for variation in salamander abundance over models that were based on binary detection/non-detection data. Though catch per unit effort was higher in burned areas than unburned, there was strong support that this pattern was due to a higher probability of capture for individuals in burned plots. Within the burn, the odds of capturing an individual given it was present were 2.06 times the odds outside the burn, reflecting reduced complexity of ground cover in the burn. There was weak support that true occupancy was lower within the burned area. While the odds of occupancy in the burn were 0.49 times the odds outside the burn among the five species, the magnitude of variation attributed to the burn was small in comparison to variation attributed to other landscape variables and to unexplained, spatially autocorrelated random variation. 
While ordinary occupancy models may separate the biological pattern of interest from variation in detection probability when all sources of variation are known, the addition of random effects structures for unexplained sources of variation in occupancy and detection probability may often more appropriately represent levels of uncertainty. © 2011 by the Ecological Society of America.

  19. Identification of temporal variations in mental workload using locally-linear-embedding-based EEG feature reduction and support-vector-machine-based clustering and classification techniques.

    PubMed

    Yin, Zhong; Zhang, Jianhua

    2014-07-01

    Identifying abnormal changes of mental workload (MWL) over time is crucial for preventing accidents due to cognitive overload and inattention of human operators in safety-critical human-machine systems. It is known that various neuroimaging technologies can be used to identify MWL variations. In order to classify MWL into a few discrete levels using representative MWL indicators and small-sized training samples, a novel EEG-based approach combining locally linear embedding (LLE), support vector clustering (SVC) and support vector data description (SVDD) techniques is proposed and evaluated using experimentally measured data. The MWL indicators from different cortical regions are first elicited by using the LLE technique. Then, the SVC approach is used to find the clusters of these MWL indicators and thereby to detect MWL variations. It is shown that the clusters can be interpreted as the binary class MWL. Furthermore, a trained binary SVDD classifier is shown to be capable of detecting slight variations of those indicators. By combining the two schemes, an SVC-SVDD framework is proposed, where the clear-cut (smaller) cluster is detected by SVC first and then a subsequent SVDD model is utilized to divide the overlapped (larger) cluster into two classes. Finally, three-class MWL levels (low, normal and high) can be identified automatically. The experimental data analysis results are compared with those of several existing methods. It has been demonstrated that the proposed framework can lead to acceptable computational accuracy and has the advantages of both unsupervised and supervised training strategies. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. A two-step framework for the registration of HE stained and FTIR images

    NASA Astrophysics Data System (ADS)

    Peñaranda, Francisco; Naranjo, Valery; Verdú, Rafael; Lloyd, Gavin R.; Nallala, Jayakrupakar; Stone, Nick

    2016-03-01

    FTIR spectroscopy is an emerging technology with high potential for cancer diagnosis but with particular physical phenomena that require special processing. Little work has been done in the field with the aim of registering hyperspectral Fourier-Transform Infrared (FTIR) spectroscopic images and Hematoxylin and Eosin (HE) stained histological images of contiguous slices of tissue. This registration is necessary to transfer the location of relevant structures that the pathologist may identify in the gold standard HE images. A two-step registration framework is presented where a representative gray image extracted from the FTIR hypercube is used as an input. This representative image, which must have a spatial contrast as similar as possible to a gray image obtained from the HE image, is calculated through the spectrum variation in the fingerprint region. In the first step of the registration algorithm a similarity transformation is estimated from interest points, which are automatically detected by the popular SURF algorithm. In the second stage, a variational registration framework defined in the frequency domain compensates for local anatomical variations between both images. After a proper tuning of some parameters the proposed registration framework works in an automated way. The method was tested on 7 samples of colon tissue in different stages of cancer. Very promising qualitative and quantitative results were obtained (a mean correlation ratio of 92.16% with a standard deviation of 3.10%).
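    The first registration step, fitting a similarity transformation (rotation, uniform scale, translation) to matched interest points, has a standard closed-form least-squares solution (Umeyama/Procrustes alignment). In the sketch below the SURF correspondences are simulated; only the estimation step is shown.

```python
import numpy as np

def fit_similarity(src, dst):
    """src, dst: (N, 2) matched points. Returns scale s, rotation R,
    translation t with dst ~= s * src @ R.T + t (Umeyama alignment)."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    A, B = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(B.T @ A)
    d = np.sign(np.linalg.det(U @ Vt))      # guard against reflections
    D = np.diag([1.0, d])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / (A ** 2).sum()
    t = mu_d - s * (R @ mu_s)
    return s, R, t

# Simulated correspondences: rotate 30 deg, scale 1.2, shift (5, -3),
# with small keypoint localization noise.
rng = np.random.default_rng(2)
src = rng.uniform(0, 100, size=(50, 2))
th = np.deg2rad(30)
R_true = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
dst = 1.2 * src @ R_true.T + np.array([5.0, -3.0])
dst += rng.normal(0, 0.1, dst.shape)
s, R, t = fit_similarity(src, dst)
print(round(float(s), 3), np.round(t, 2))
```

    The recovered scale, rotation, and translation match the simulated transform; in the paper this global alignment is then refined by the variational second stage, which this sketch does not cover.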

  1. Robust Cell Detection of Histopathological Brain Tumor Images Using Sparse Reconstruction and Adaptive Dictionary Selection

    PubMed Central

    Su, Hai; Xing, Fuyong; Yang, Lin

    2016-01-01

    Successful diagnostic and prognostic stratification, treatment outcome prediction, and therapy planning depend on reproducible and accurate pathology analysis. Computer aided diagnosis (CAD) is a useful tool to help doctors make better decisions in cancer diagnosis and treatment. Accurate cell detection is often an essential prerequisite for subsequent cellular analysis. The major challenge of robust brain tumor nuclei/cell detection is to handle significant variations in cell appearance and to split touching cells. In this paper, we present an automatic cell detection framework using sparse reconstruction and adaptive dictionary learning. The main contributions of our method are: 1) A sparse reconstruction based approach to split touching cells; 2) An adaptive dictionary learning method used to handle cell appearance variations. The proposed method has been extensively tested on a data set with more than 2000 cells extracted from 32 whole slide scanned images. The automatic cell detection results are compared with the manually annotated ground truth and other state-of-the-art cell detection algorithms. The proposed method achieves the best cell detection accuracy with an F1 score of 0.96. PMID:26812706
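    The core sparse-reconstruction idea can be illustrated compactly: patches resembling the training cells are reconstructed well by a learned dictionary (low residual), while dissimilar patches are not. The paper's full pipeline (touching-cell splitting, adaptive dictionary selection) is substantially more involved; the patches below are synthetic Gaussian "nuclei".

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(3)

def blob(cx, cy, size=9, sigma=2.0):
    """Synthetic Gaussian 'nucleus' patch, flattened."""
    y, x = np.mgrid[0:size, 0:size]
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2)).ravel()

# Learn a small dictionary from jittered nucleus patches.
train = np.array([blob(4 + rng.uniform(-1, 1), 4 + rng.uniform(-1, 1))
                  for _ in range(60)])
dico = DictionaryLearning(n_components=8, transform_algorithm="omp",
                          transform_n_nonzero_coefs=3, random_state=0)
dico.fit(train)

def residual(patch):
    """Sparse-reconstruction error of a patch under the dictionary."""
    code = dico.transform(patch[None, :])
    return float(np.linalg.norm(patch - code @ dico.components_))

cell_patch = blob(4.2, 3.8)
noise_patch = rng.uniform(0, 1, 81)        # background clutter
print(residual(cell_patch) < residual(noise_patch))
```

    Thresholding this residual separates cell-like patches from background, the same signal the paper exploits (with an adaptive dictionary) to localize nuclei.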

  2. Automated segmentation and tracking of non-rigid objects in time-lapse microscopy videos of polymorphonuclear neutrophils.

    PubMed

    Brandes, Susanne; Mokhtari, Zeinab; Essig, Fabian; Hünniger, Kerstin; Kurzai, Oliver; Figge, Marc Thilo

    2015-02-01

    Time-lapse microscopy is an important technique to study the dynamics of various biological processes. The labor-intensive manual analysis of microscopy videos is increasingly replaced by automated segmentation and tracking methods. These methods are often limited to certain cell morphologies and/or cell stainings. In this paper, we present an automated segmentation and tracking framework that does not have these restrictions. In particular, our framework handles highly variable cell shapes and does not rely on any cell stainings. Our segmentation approach is based on a combination of spatial and temporal image variations to detect moving cells in microscopy videos. This method yields a sensitivity of 99% and a precision of 95% in object detection. The tracking of cells consists of different steps, starting from single-cell tracking based on a nearest-neighbor approach, detection of cell-cell interactions and splitting of cell clusters, and finally combining tracklets using methods from graph theory. The segmentation and tracking framework was applied to synthetic as well as experimental datasets with varying cell densities implying different numbers of cell-cell interactions. We established a validation framework to measure the performance of our tracking technique. The cell tracking accuracy was found to be >99% for all datasets indicating a high accuracy for connecting the detected cells between different time points. Copyright © 2014 Elsevier B.V. All rights reserved.
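    The single-cell tracking step can be sketched as greedy nearest-neighbor linking with a displacement gate; the paper layers cluster splitting and graph-based tracklet joining on top of this. Centroids and the gate value below are invented for illustration.

```python
import numpy as np

def link_frames(prev, curr, max_dist=10.0):
    """prev: (N, 2), curr: (M, 2) detected centroids in consecutive
    frames. Returns (i_prev, j_curr) links; detections left unmatched
    would start new tracklets."""
    links, taken = [], set()
    for i, p in enumerate(prev):
        d = np.linalg.norm(curr - p, axis=1)
        for j in np.argsort(d):              # closest candidates first
            if d[j] > max_dist:              # displacement gate
                break
            if int(j) not in taken:          # one detection per track
                links.append((i, int(j)))
                taken.add(int(j))
                break
    return links

frame1 = np.array([[10.0, 10.0], [50.0, 50.0], [90.0, 20.0]])
frame2 = np.array([[52.0, 49.0], [12.0, 11.0], [200.0, 200.0]])
print(link_frames(frame1, frame2))
```

    The third cell has no detection within the gate and its track ends; the far-away detection would seed a new tracklet, which the graph-theoretic stage may later join across gaps.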

  3. A framework to evaluate the effects of small area variations in healthcare infrastructure on diagnostics and patient outcomes of rare diseases based on administrative data.

    PubMed

    Stargardt, Tom; Schreyögg, Jonas

    2012-05-01

    Small area variations in healthcare infrastructure may result in differences in early detection and outcomes for patients with rare diseases. It is our aim to provide a framework for evaluating small area variations in healthcare infrastructure on the diagnostics and health outcomes of rare diseases. We focus on administrative data as it allows (a) for relatively large sample sizes even though the prevalence of rare diseases is very low, and (b) makes it possible to link information on healthcare infrastructure to morbidity, mortality, and utilization. To identify patients with a rare disease in a database, a combination of different classification systems has to be used, because multiple rare diseases usually share a single ICD code. Outcomes should be chosen that are (a) appropriate for the disease, (b) identifiable and reliably coded in the administrative database, and (c) observable during the limited time period of the follow-up. Risk adjustment using summary scores of disease-specific or comprehensive risk adjustment instruments might be preferable over empirical weights because of the lower number of variables needed. The proposed framework will help to identify differences in time to diagnosis and treatment outcomes across areas in the context of rare diseases. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  4. Unsupervised universal steganalyzer for high-dimensional steganalytic features

    NASA Astrophysics Data System (ADS)

    Hou, Xiaodan; Zhang, Tao

    2016-11-01

    The research in developing steganalytic features has been highly successful. These features are extremely powerful when applied to supervised binary classification problems. However, they are incompatible with unsupervised universal steganalysis because the unsupervised method cannot distinguish embedding distortion from varying levels of noises caused by cover variation. This study attempts to alleviate the problem by introducing similarity retrieval of image statistical properties (SRISP), with the specific aim of mitigating the effect of cover variation on the existing steganalytic features. First, cover images with some statistical properties similar to those of a given test image are searched from a retrieval cover database to establish an aided sample set. Then, unsupervised outlier detection is performed on a test set composed of the given test image and its aided sample set to determine the type (cover or stego) of the given test image. Our proposed framework, called SRISP-aided unsupervised outlier detection, requires no training. Thus, it does not suffer from model mismatch. Compared with prior unsupervised outlier detectors that do not consider SRISP, the proposed framework not only retains the universality but also exhibits superior performance when applied to high-dimensional steganalytic features.

  5. Pedestrian detection from thermal images: A sparse representation based approach

    NASA Astrophysics Data System (ADS)

    Qi, Bin; John, Vijay; Liu, Zheng; Mita, Seiichi

    2016-05-01

    Pedestrian detection, a key technology in computer vision, plays a paramount role in the applications of advanced driver assistant systems (ADASs) and autonomous vehicles. The objective of pedestrian detection is to identify and locate people in a dynamic environment so that accidents can be avoided. With significant variations introduced by illumination, occlusion, articulated pose, and complex background, pedestrian detection is a challenging task for visual perception. Different from visible images, thermal images are captured and presented with intensity maps based on objects' emissivity, and thus have an enhanced spectral range to make human beings perceptible from the cool background. In this study, a sparse representation based approach is proposed for pedestrian detection from thermal images. We first adopted the histogram of sparse code to represent image features and then detect pedestrians with the extracted features in an unimodal and a multimodal framework, respectively. In the unimodal framework, two types of dictionaries, i.e. joint dictionary and individual dictionary, are built by learning from prepared training samples. In the multimodal framework, a weighted fusion scheme is proposed to further highlight the contributions from features with higher separability. To validate the proposed approach, experiments were conducted to compare with three widely used features: Haar wavelets (HWs), histogram of oriented gradients (HOG), and histogram of phase congruency (HPC) as well as two classification methods, i.e. AdaBoost and support vector machine (SVM). Experimental results on a publicly available data set demonstrate the superiority of the proposed approach.
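    The histogram-of-sparse-codes feature can be sketched end to end: patches are sparse-coded against a learned dictionary, the absolute coefficients are pooled into a per-atom histogram, and a classifier operates on the pooled feature. Everything below is a synthetic stand-in (smooth "pedestrian" patches vs. noisy "background"), only to show the feature pipeline's shape.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.svm import LinearSVC

rng = np.random.default_rng(10)

def patches(kind, n=40):
    """8-D toy 'patches': pedestrians as smooth ramps, background noise."""
    if kind == "ped":
        return np.linspace(0, 1, 8) + rng.normal(0, 0.05, size=(n, 8))
    return rng.uniform(0, 1, size=(n, 8))

dico = DictionaryLearning(n_components=6, transform_algorithm="omp",
                          transform_n_nonzero_coefs=2, random_state=0)
dico.fit(np.vstack([patches("ped"), patches("bg")]))

def hist_feature(kind):
    """Pool absolute sparse codes of an image's patches per atom."""
    return np.abs(dico.transform(patches(kind))).sum(axis=0)

ped = np.array([hist_feature("ped") for _ in range(20)])
bg = np.array([hist_feature("bg") for _ in range(20)])
Xtr, ytr = np.vstack([ped[:15], bg[:15]]), np.array([1] * 15 + [0] * 15)
Xte, yte = np.vstack([ped[15:], bg[15:]]), np.array([1] * 5 + [0] * 5)
clf = LinearSVC(dual=False).fit(Xtr, ytr)
acc = float((clf.predict(Xte) == yte).mean())
print(acc)
```

    Pedestrian-like patches concentrate their code mass on a few atoms while background spreads across the dictionary, so even a linear SVM separates the pooled histograms.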

  6. Hierarchical models of animal abundance and occurrence

    USGS Publications Warehouse

    Royle, J. Andrew; Dorazio, R.M.

    2006-01-01

    Much of animal ecology is devoted to studies of abundance and occurrence of species, based on surveys of spatially referenced sample units. These surveys frequently yield sparse counts that are contaminated by imperfect detection, making direct inference about abundance or occurrence based on observational data infeasible. This article describes a flexible hierarchical modeling framework for estimation and inference about animal abundance and occurrence from survey data that are subject to imperfect detection. Within this framework, we specify models of abundance and detectability of animals at the level of the local populations defined by the sample units. Information at the level of the local population is aggregated by specifying models that describe variation in abundance and detection among sites. We describe likelihood-based and Bayesian methods for estimation and inference under the resulting hierarchical model. We provide two examples of the application of hierarchical models to animal survey data, the first based on removal counts of stream fish and the second based on avian quadrat counts. For both examples, we provide a Bayesian analysis of the models using the software WinBUGS.
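    The binomial-Poisson N-mixture model at the heart of this hierarchical framework can be sketched with a marginal likelihood: counts y[i, t] ~ Binomial(N_i, p) with site abundances N_i ~ Poisson(lam), and N_i marginalized by a finite sum. Covariates on abundance and detection, central to the paper, are omitted here for brevity.

```python
import numpy as np
from scipy.stats import binom, poisson
from scipy.optimize import minimize

def neg_log_lik(params, y, K=80):
    """Marginal likelihood of repeated counts y (sites x visits),
    summing the latent abundance N over 0..K."""
    lam = np.exp(params[0])
    p = 1.0 / (1.0 + np.exp(-params[1]))
    N = np.arange(K + 1)
    # P(y_i | N) for every candidate abundance, product over visits
    lik_N = binom.pmf(y[:, :, None], N, p).prod(axis=1)   # (sites, K+1)
    marg = (lik_N * poisson.pmf(N, lam)).sum(axis=1)      # marginalize N
    return -np.log(marg).sum()

# Simulate 150 sites x 3 visits with lam = 5, p = 0.4, then refit.
rng = np.random.default_rng(4)
Ns = rng.poisson(5, 150)
y = rng.binomial(Ns[:, None], 0.4, size=(150, 3))
fit = minimize(neg_log_lik, x0=[1.0, 0.0], args=(y,), method="Nelder-Mead")
lam_hat = float(np.exp(fit.x[0]))
p_hat = float(1 / (1 + np.exp(-fit.x[1])))
print(round(lam_hat, 1), round(p_hat, 2))
```

    Repeated visits let the model separate abundance from detectability; with a single visit per site, lam and p would not be jointly identifiable.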

  7. CNV-RF Is a Random Forest-Based Copy Number Variation Detection Method Using Next-Generation Sequencing.

    PubMed

    Onsongo, Getiria; Baughn, Linda B; Bower, Matthew; Henzler, Christine; Schomaker, Matthew; Silverstein, Kevin A T; Thyagarajan, Bharat

    2016-11-01

    Simultaneous detection of small copy number variations (CNVs) (<0.5 kb) and single-nucleotide variants in clinically significant genes is of great interest for clinical laboratories. The analytical variability in next-generation sequencing (NGS), artifacts in coverage data because of mappability issues, and the lack of robust bioinformatics tools for CNV detection have limited the utility of targeted NGS data to identify CNVs. We describe the development and implementation of a bioinformatics algorithm, copy number variation-random forest (CNV-RF), that incorporates a machine learning component to identify CNVs from targeted NGS data. Using CNV-RF, we identified 12 of 13 deletions in samples with known CNVs, two cases with duplications, and identified novel deletions in 22 additional cases. Furthermore, no CNVs were identified among 60 genes in 14 cases with normal copy number and no CNVs were identified in another 104 patients with clinical suspicion of CNVs. All positive deletions and duplications were confirmed using a quantitative PCR method. CNV-RF also detected heterozygous deletions and duplications with a specificity of 50% across 4813 genes. The ability of CNV-RF to detect clinically relevant CNVs with a high degree of sensitivity along with confirmation using a low-cost quantitative PCR method provides a framework for providing comprehensive NGS-based CNV/single-nucleotide variant detection in a clinical molecular diagnostics laboratory. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
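    The core idea, classifying regions as deleted, normal, or duplicated from normalized coverage features with a random forest, can be sketched as below. The features here are simulated per-region coverage ratios around 0.5x, 1.0x, and 1.5x; the real algorithm derives its features from patient-vs-reference amplicon coverage.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)

def simulate(n, ratio_mean):
    """Per-region features: noisy coverage ratios over a few bins."""
    return rng.normal(ratio_mean, 0.08, size=(n, 6))

X = np.vstack([simulate(200, 0.5),    # heterozygous deletion ~0.5x
               simulate(200, 1.0),    # normal copy number ~1.0x
               simulate(200, 1.5)])   # duplication ~1.5x
labels = np.array([0] * 200 + [1] * 200 + [2] * 200)
idx = rng.permutation(600)
X, labels = X[idx], labels[idx]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[:450], labels[:450])
acc = float((clf.predict(X[450:]) == labels[450:]).mean())
print(round(acc, 2))
```

    On cleanly separated simulated ratios the forest is near-perfect; the value of the machine-learning component in practice is robustness to the messier, mappability-distorted coverage of real targeted NGS panels.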

  8. The Beta Pictoris circumstellar disk. XV - Highly ionized species near Beta Pictoris

    NASA Technical Reports Server (NTRS)

    Deleuil, M.; Gry, C.; Lagrange-Henri, A.-M.; Vidal-Madjar, A.; Beust, H.; Ferlet, R.; Moos, H. W.; Livengood, T. A.; Ziskin, D.; Feldman, P. D.

    1993-01-01

    Temporal variations of the Fe II, Mg II, and Al III circumstellar lines towards Beta Pictoris have been detected and monitored since 1985. However, the unusual presence of Al III ions is still puzzling, since the UV stellar flux from an A5V star such as Beta Pic is insufficient to produce such an ion. In order to better define the origin of such a phenomenon, new observations have been carried out to detect faint signatures of other highly ionized species in the short UV wavelength range, where the stellar continuum flux is low. These observations reveal variations not only near the C IV doublet lines, but also in C I and Al II lines, two weakly ionized species, not clearly detectable until now. In the framework of an infalling body scenario, highly ionized species would be created in the tail, far from the comet head, by collisions with ambient gas surrounding the star, or a weak stellar wind. Spectral changes have also been detected near a CO molecular band location, which, if confirmed, would provide the first molecular signature around Beta Pictoris.

  9. Multiview human activity recognition system based on spatiotemporal template for video surveillance system

    NASA Astrophysics Data System (ADS)

    Kushwaha, Alok Kumar Singh; Srivastava, Rajeev

    2015-09-01

    An efficient view-invariant framework for the recognition of human activities from an input video sequence is presented. The proposed framework is composed of three consecutive modules: (i) detect and locate people by background subtraction, (ii) create view-invariant spatiotemporal templates for different activities, and (iii) perform template matching for view-invariant activity recognition. The foreground objects present in a scene are extracted using change detection and background modeling. The view-invariant templates are constructed using the motion history images and object shape information for different human activities in a video sequence. For matching the spatiotemporal templates for various activities, the moment invariants and Mahalanobis distance are used. The proposed approach is tested successfully on our own viewpoint dataset, KTH action recognition dataset, i3DPost multiview dataset, MSR viewpoint action dataset, VideoWeb multiview dataset, and WVU multiview human action recognition dataset. From the experimental results and analysis over the chosen datasets, it is observed that the proposed framework is robust, flexible, and efficient with respect to multi-view activity recognition, scale, and phase variations.
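    The matching stage can be sketched as follows: each activity is summarized by a feature distribution (the paper uses moment invariants of the spatiotemporal template), and a query is assigned to the activity nearest in Mahalanobis distance. The features below are simulated stand-ins for moment-invariant vectors.

```python
import numpy as np

def mahalanobis(x, mean, cov_inv):
    """Mahalanobis distance of x from a distribution (mean, cov^-1)."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(9)
# Per-activity feature distributions (e.g. 7 moment invariants).
activities = {}
for name, mu in [("walk", 0.0), ("wave", 2.0), ("bend", -2.0)]:
    samples = rng.normal(mu, 0.3, size=(50, 7))
    activities[name] = (samples.mean(0), np.linalg.inv(np.cov(samples.T)))

query = rng.normal(2.0, 0.3, 7)        # an unseen "wave" template
best = min(activities, key=lambda a: mahalanobis(query, *activities[a]))
print(best)
```

    Using the per-class covariance rather than plain Euclidean distance makes the match insensitive to differing scales and correlations among the moment features.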

  10. Steganalysis Techniques for Documents and Images

    DTIC Science & Technology

    2005-05-01

    steganography. We then illustrated the efficacy of our model using variations of LSB steganography. For binary images, we have made significant progress in... efforts have focused on two areas. The first area is LSB steganalysis for grayscale images. Here, as we had proposed (as a challenging task), we have... generalized our previous steganalysis technique of sample pair analysis to a theoretical framework for the detection of LSB steganography. The new

  11. Bayesian Community Detection in the Space of Group-Level Functional Differences

    PubMed Central

    Venkataraman, Archana; Yang, Daniel Y.-J.; Pelphrey, Kevin A.; Duncan, James S.

    2017-01-01

    We propose a unified Bayesian framework to detect both hyper- and hypo-active communities within whole-brain fMRI data. Specifically, our model identifies dense subgraphs that exhibit population-level differences in functional synchrony between a control and clinical group. We derive a variational EM algorithm to solve for the latent posterior distributions and parameter estimates, which subsequently inform us about the afflicted network topology. We demonstrate that our method provides valuable insights into the neural mechanisms underlying social dysfunction in autism, as verified by the Neurosynth meta-analytic database. In contrast, both univariate testing and community detection via recursive edge elimination fail to identify stable functional communities associated with the disorder. PMID:26955022

  12. Bayesian Community Detection in the Space of Group-Level Functional Differences.

    PubMed

    Venkataraman, Archana; Yang, Daniel Y-J; Pelphrey, Kevin A; Duncan, James S

    2016-08-01

    We propose a unified Bayesian framework to detect both hyper- and hypo-active communities within whole-brain fMRI data. Specifically, our model identifies dense subgraphs that exhibit population-level differences in functional synchrony between a control and clinical group. We derive a variational EM algorithm to solve for the latent posterior distributions and parameter estimates, which subsequently inform us about the afflicted network topology. We demonstrate that our method provides valuable insights into the neural mechanisms underlying social dysfunction in autism, as verified by the Neurosynth meta-analytic database. In contrast, both univariate testing and community detection via recursive edge elimination fail to identify stable functional communities associated with the disorder.

  13. A Unified Framework for Street-View Panorama Stitching

    PubMed Central

    Li, Li; Yao, Jian; Xie, Renping; Xia, Menghan; Zhang, Wei

    2016-01-01

    In this paper, we propose a unified framework to generate a pleasant and high-quality street-view panorama by stitching multiple panoramic images captured from the cameras mounted on the mobile platform. Our proposed framework comprises four major steps: image warping, color correction, optimal seam line detection and image blending. Since the input images are captured without a precisely common projection center, from scenes at varying depths relative to the cameras, they cannot be precisely aligned in geometry. Therefore, an efficient image warping method based on the dense optical flow field is proposed to greatly suppress the influence of large geometric misalignment at first. Then, to lessen the influence of photometric inconsistencies caused by the illumination variations and different exposure settings, we propose an efficient color correction algorithm via matching extreme points of histograms to greatly decrease color differences between warped images. After that, the optimal seam lines between adjacent input images are detected via the graph cut energy minimization framework. At last, the Laplacian pyramid blending algorithm is applied to further eliminate the stitching artifacts along the optimal seam lines. Experimental results on a large set of challenging street-view panoramic images captured from the real world illustrate that the proposed system is capable of creating high-quality panoramas. PMID:28025481
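    The color-correction step can be illustrated with classic cumulative-histogram matching, which serves the same purpose as the paper's extreme-point matching: pulling one image's channel distribution onto a reference's. The images below are synthetic single-channel arrays.

```python
import numpy as np

def match_histogram(src, ref):
    """Remap uint8 channel `src` so its histogram matches `ref`."""
    s_vals, s_counts = np.unique(src.ravel(), return_counts=True)
    r_vals, r_counts = np.unique(ref.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / src.size
    r_cdf = np.cumsum(r_counts) / ref.size
    # for each source intensity, pick the reference intensity with
    # the closest cumulative probability
    mapped = np.interp(s_cdf, r_cdf, r_vals)
    lut = dict(zip(s_vals, mapped))
    return np.vectorize(lut.get)(src)

rng = np.random.default_rng(6)
dark = rng.normal(80, 10, (64, 64)).clip(0, 255).astype(np.uint8)
bright = rng.normal(160, 10, (64, 64)).clip(0, 255).astype(np.uint8)
corrected = match_histogram(dark, bright)
print(round(float(corrected.mean()), 1))   # mean shifts toward the reference
```

    After remapping, the dark image's intensity distribution sits on the bright reference's, reducing the photometric seam that the subsequent seam-line and blending stages must hide.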

  14. Detection and attribution of climate change at regional scale: case study of Karkheh river basin in the west of Iran

    NASA Astrophysics Data System (ADS)

    Zohrabi, Narges; Goodarzi, Elahe; Massah Bavani, Alireza; Najafi, Husain

    2017-11-01

    This research aims to provide a statistical framework for detection and attribution of climate variability and change at regional scale when at least 30 years of observation data are available. While extensive research has been done on detecting significant observed trends in hydroclimate variables and attribution to anthropogenic greenhouse gas emissions in large continents, less attention has been paid to regional-scale analysis. The latter is mainly important for adaptation to climate change in different sectors including but not limited to energy, agriculture, and water resources planning and management, and it is still an open discussion in many countries including the West Asian ones. In the absence of regional climate models, an informative framework is suggested providing useful insights for policymakers. It benefits from general flexibility, not being computationally expensive, and applying several trend tests to analyze temporal variations in temperature and precipitation (gradual and step changes). The framework is implemented for a very important river basin in the west of Iran. In general, some increasing and decreasing trends of the interannual precipitation and temperature have been detected. For the annual precipitation series, a downward step change around 1996 was more evident than a gradual trend at most stations, which otherwise did not experience a dramatic change. The range of natural forcing is found to be ±76% for precipitation and ±1.4 °C for temperature considering a two-dimensional diagram of precipitation and temperature anomalies from a 1000-year control run of a global climate model (GCM). Findings from applying the proposed framework may offer central governments useful insights into structural and non-structural climate change adaptation strategies.
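    The framework applies several trend tests for gradual and step changes; the Mann-Kendall test, a standard nonparametric choice for monotonic trends in hydroclimate series, is sketched below (normal approximation, no tie correction). The two 30-year series are simulated.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test: returns (S statistic, two-sided p)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # no-ties variance
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)           # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, 2 * (1 - norm.cdf(abs(z)))

rng = np.random.default_rng(7)
years = np.arange(30)
warming = 14 + 0.05 * years + rng.normal(0, 0.3, 30)   # upward trend
stable = 14 + rng.normal(0, 0.3, 30)                   # no trend
s1, p1 = mann_kendall(warming)
s2, p2 = mann_kendall(stable)
print(p1 < 0.05, p2)
```

    A step change around a candidate year, the other pattern the paper tests for, would instead be probed with a change-point test such as Pettitt's, which compares rank sums before and after each possible break.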

  15. Plant trait detection with multi-scale spectrometry

    NASA Astrophysics Data System (ADS)

    Gamon, J. A.; Wang, R.

    2017-12-01

    Proximal and remote sensing using imaging spectrometry offers new opportunities for detecting plant traits, with benefits for phenotyping, productivity estimation, stress detection, and biodiversity studies. Using proximal and airborne spectrometry, we evaluated variation in plant optical properties at various spatial and spectral scales with the goal of identifying optimal scales for distinguishing plant traits related to photosynthetic function. Using directed approaches based on physiological vegetation indices, and statistical approaches based on spectral information content, we explored alternate ways of distinguishing plant traits with imaging spectrometry. With both leaf traits and canopy structure contributing to the signals, results exhibit a strong scale dependence. Our results demonstrate the benefits of multi-scale experimental approaches within a clear conceptual framework when applying remote sensing methods to plant trait detection for phenotyping, productivity, and biodiversity studies.
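    The "directed" approach mentioned above computes physiological vegetation indices directly from reflectance. Two common examples are sketched below: NDVI (greenness/structure) and PRI (xanthophyll-cycle activity linked to photosynthetic light-use efficiency). The band centers are nominal and the reflectance values illustrative; real spectrometers average over narrow wavelength windows.

```python
def ndvi(r_nir, r_red):
    """Normalized difference vegetation index from NIR and red bands."""
    return (r_nir - r_red) / (r_nir + r_red)

def pri(r_531, r_570):
    """Photochemical reflectance index from 531 nm and 570 nm bands."""
    return (r_531 - r_570) / (r_531 + r_570)

# Illustrative reflectances for a healthy leaf.
print(round(ndvi(0.45, 0.05), 2))   # -> 0.8
print(round(pri(0.10, 0.12), 3))
```

    Because both indices are normalized band ratios, leaf-level and canopy-level values diverge as structure starts to dominate the signal, which is exactly the scale dependence the study examines.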

  16. Hierarchical spatial models of abundance and occurrence from imperfect survey data

    USGS Publications Warehouse

    Royle, J. Andrew; Kery, M.; Gautier, R.; Schmid, Hans

    2007-01-01

    Many estimation and inference problems arising from large-scale animal surveys are focused on developing an understanding of patterns in abundance or occurrence of a species based on spatially referenced count data. One fundamental challenge, then, is that it is generally not feasible to completely enumerate ('census') all individuals present in each sample unit. This observation bias may consist of several components, including spatial coverage bias (not all individuals in the population are exposed to sampling) and detection bias (exposed individuals may go undetected). Thus, observations are biased for the state variable (abundance, occupancy) that is the object of inference. Moreover, data are often sparse for most observation locations, requiring consideration of methods for spatially aggregating or otherwise combining sparse data among sample units. The development of methods that unify spatial statistical models with models accommodating non-detection is necessary to resolve important spatial inference problems based on animal survey data. In this paper, we develop a novel hierarchical spatial model for estimation of abundance and occurrence from survey data wherein detection is imperfect. Our application is focused on spatial inference problems in the Swiss Survey of Common Breeding Birds. The observation model for the survey data is specified conditional on the unknown quadrat population size, N(s). We augment the observation model with a spatial process model for N(s), describing the spatial variation in abundance of the species. The model includes explicit sources of variation in habitat structure (forest, elevation) and latent variation in the form of a correlated spatial process. This provides a model-based framework for combining the spatially referenced samples while at the same time yielding a unified treatment of estimation problems involving both abundance and occurrence. 
We provide a Bayesian framework for analysis and prediction based on the integrated likelihood, and we use the model to obtain estimates of abundance and occurrence maps for the European Jay (Garrulus glandarius), a widespread, elusive, forest bird. The naive national abundance estimate ignoring imperfect detection and incomplete quadrat coverage was 77 766 territories. Accounting for imperfect detection added approximately 18 000 territories, and adjusting for coverage bias added another 131 000 territories to yield a fully corrected estimate of the national total of about 227 000 territories. This is approximately three times as high as previous estimates that assume every territory is detected in each quadrat.
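The correction arithmetic reported above can be sketched as a simple two-stage scaling. The detection probability and coverage fraction below are hypothetical values back-solved to roughly reproduce the published totals, not the paper's fitted hierarchical-model parameters:

```python
# Illustrative sketch (not the paper's hierarchical model): scaling a
# naive territory count up for imperfect detection and incomplete
# spatial coverage. The two rates below are invented for illustration.

def corrected_total(naive_count, detection_prob, coverage_fraction):
    """Correct a naive count for detection and coverage bias."""
    detected_adjusted = naive_count / detection_prob   # undo missed detections
    return detected_adjusted / coverage_fraction       # undo unsampled area

# With hypothetical p ~ 0.81 and coverage ~ 0.42, the naive estimate of
# 77,766 territories scales to roughly the corrected national total.
print(round(corrected_total(77766, 0.81, 0.42)))
```

In the actual model these corrections emerge from a spatially correlated abundance process rather than two fixed scalars.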

  17. Factors influencing variation in physician adenoma detection rates: a theory-based approach for performance improvement.

    PubMed

    Atkins, Louise; Hunkeler, Enid M; Jensen, Christopher D; Michie, Susan; Lee, Jeffrey K; Doubeni, Chyke A; Zauber, Ann G; Levin, Theodore R; Quinn, Virginia P; Corley, Douglas A

    2016-03-01

    Interventions to improve physician adenoma detection rates for colonoscopy have generally not been successful, and there are few data on the factors contributing to variation that may be appropriate targets for intervention. We sought to identify factors that may influence variation in detection rates by using theory-based tools for understanding behavior. We separately studied gastroenterologists and endoscopy nurses at 3 Kaiser Permanente Northern California medical centers to identify potentially modifiable factors relevant to physician adenoma detection rate variability by using structured group interviews (focus groups) and theory-based tools for understanding behavior and eliciting behavior change: the Capability, Opportunity, and Motivation behavior model; the Theoretical Domains Framework; and the Behavior Change Wheel. Nine factors potentially associated with adenoma detection rate variability were identified, comprising 6 related to capability (uncertainty about which types of polyps to remove, style of endoscopy team leadership, compromised ability to focus during an examination due to distractions, examination technique during withdrawal, difficulty detecting certain types of adenomas, and examiner fatigue and pain), 2 related to opportunity (perceived pressure due to the number of examinations expected per shift and social pressure to finish examinations before scheduled breaks or the end of a shift), and 1 related to motivation (valuing a meticulous examination as the top priority). Examples of potential intervention strategies are provided. By using theory-based tools, this study identified several novel and potentially modifiable factors relating to capability, opportunity, and motivation that may contribute to adenoma detection rate variability and be appropriate targets for future intervention trials. Copyright © 2016 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  18. A Bayesian framework to estimate diversification rates and their variation through time and space

    PubMed Central

    2011-01-01

    Background Patterns of species diversity are the result of speciation and extinction processes, and molecular phylogenetic data can provide valuable information to derive their variability through time and across clades. Bayesian Markov chain Monte Carlo methods offer a promising framework to incorporate phylogenetic uncertainty when estimating rates of diversification. Results We introduce a new approach to estimate diversification rates in a Bayesian framework over a distribution of trees under various constant and variable rate birth-death and pure-birth models, and test it on simulated phylogenies. Furthermore, speciation and extinction rates and their posterior credibility intervals can be estimated while accounting for non-random taxon sampling. The framework is particularly suitable for hypothesis testing using Bayes factors, as we demonstrate analyzing dated phylogenies of Chondrostoma (Cyprinidae) and Lupinus (Fabaceae). In addition, we develop a model that extends the rate estimation to a meta-analysis framework in which different data sets are combined in a single analysis to detect general temporal and spatial trends in diversification. Conclusions Our approach provides a flexible framework for the estimation of diversification parameters and hypothesis testing while simultaneously accounting for uncertainties in the divergence times and incomplete taxon sampling. PMID:22013891
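As a minimal illustration of rate estimation by MCMC (a toy sketch, not the authors' birth-death implementation), one can sample a single constant speciation rate with Metropolis-Hastings, treating speciation events as a Poisson process over the tree's total lineage-time and placing an Exponential(1) prior on the rate; the event count `k` and exposure `T` below are invented:

```python
import math
import random

def log_post(lam, k, T):
    # Log-posterior of rate lam: Poisson-process likelihood of k
    # speciation events over total lineage-time T, Exponential(1) prior.
    if lam <= 0:
        return float("-inf")
    return k * math.log(lam) - lam * T - lam

def mh_sample(k, T, n_iter=20000, step=0.1, seed=1):
    # Metropolis-Hastings with a symmetric Gaussian proposal.
    random.seed(seed)
    lam, samples = 1.0, []
    for _ in range(n_iter):
        prop = lam + random.gauss(0.0, step)
        if math.log(random.random()) < log_post(prop, k, T) - log_post(lam, k, T):
            lam = prop
        samples.append(lam)
    return samples[n_iter // 2:]          # discard burn-in

post = mh_sample(k=40, T=50.0)
print(sum(post) / len(post))   # posterior mean, near k / (T + 1)
```

The same accept/reject machinery generalizes to the joint speciation-extinction posteriors the paper samples over distributions of trees.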

  19. Integrating Online and Offline Three-Dimensional Deep Learning for Automated Polyp Detection in Colonoscopy Videos.

    PubMed

    Lequan Yu; Hao Chen; Qi Dou; Jing Qin; Pheng Ann Heng

    2017-01-01

    Automated polyp detection in colonoscopy videos has been demonstrated to be a promising way for colorectal cancer prevention and diagnosis. Traditional manual screening is time consuming, operator dependent, and error prone; hence, an automated detection approach is in high demand in clinical practice. However, automated polyp detection is very challenging due to high intraclass variations in polyp size, color, shape, and texture, and low interclass variations between polyps and hard mimics. In this paper, we propose a novel offline and online three-dimensional (3-D) deep learning integration framework by leveraging the 3-D fully convolutional network (3D-FCN) to tackle this challenging problem. Compared with previous methods employing hand-crafted features or 2-D convolutional neural networks, the 3D-FCN is capable of learning more representative spatio-temporal features from colonoscopy videos, and hence has more powerful discrimination capability. More importantly, we propose a novel online learning scheme to deal with the problem of limited training data by harnessing the specific information of an input video in the learning process. We integrate offline and online learning to effectively reduce the number of false positives generated by the offline network and further improve the detection performance. Extensive experiments on the dataset of the MICCAI 2015 Challenge on Polyp Detection demonstrated the superior performance of our method compared with other competitors.

  20. A novel semi-transductive learning framework for efficient atypicality detection in chest radiographs

    NASA Astrophysics Data System (ADS)

    Alzubaidi, Mohammad; Balasubramanian, Vineeth; Patel, Ameet; Panchanathan, Sethuraman; Black, John A., Jr.

    2012-03-01

    Inductive learning refers to machine learning algorithms that learn a model from a set of training data instances. Any test instance is then classified by comparing it to the learned model. When the set of training instances lends itself well to modeling, the use of a model substantially reduces the computational cost of classification. However, some training data sets are complex, and do not lend themselves well to modeling. Transductive learning refers to machine learning algorithms that classify test instances by comparing them to all of the training instances, without creating an explicit model. This can produce better classification performance, but at a much higher computational cost. Medical images vary greatly across human populations, constituting a data set that does not lend itself well to modeling. Our previous work showed that the wide variations seen across training sets of "normal" chest radiographs make it difficult to successfully classify test radiographs with an inductive (modeling) approach, and that a transductive approach leads to much better performance in detecting atypical regions. The problem with the transductive approach is its high computational cost. This paper develops and demonstrates a novel semi-transductive framework that can address the unique challenges of atypicality detection in chest radiographs. The proposed framework combines the superior performance of transductive methods with the reduced computational cost of inductive methods. Our results show that the proposed semi-transductive approach provides both effective and efficient detection of atypical regions within a set of chest radiographs previously labeled by Mayo Clinic expert thoracic radiologists.
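The inductive/transductive contrast can be made concrete with a toy anomaly score: a single-centroid model (inductive) versus nearest-training-instance distance (transductive). The two-cluster "normal" training set below is invented, but it shows why a single model fails on multimodal data while the instance-based score does not:

```python
import math

# Invented "normal" training data forming two well-separated clusters,
# mimicking a data set that does not lend itself to a single model.
train_normal = [(0.0, 0.0), (0.1, 0.2), (10.0, 10.0), (10.2, 9.9)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def inductive_score(x):
    # Model-based: distance to one learned centroid of "normal".
    cx = sum(p[0] for p in train_normal) / len(train_normal)
    cy = sum(p[1] for p in train_normal) / len(train_normal)
    return dist(x, (cx, cy))

def transductive_score(x):
    # Instance-based: distance to the nearest training instance.
    return min(dist(x, p) for p in train_normal)

query = (10.1, 10.0)  # clearly "normal" (second cluster)
print(inductive_score(query), transductive_score(query))
```

The centroid model flags the query as highly atypical even though it sits inside a normal cluster; the instance-based score does not. A semi-transductive scheme aims for the latter's accuracy at closer to the former's cost.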

  1. CASPER: computer-aided segmentation of imperceptible motion-a learning-based tracking of an invisible needle in ultrasound.

    PubMed

    Beigi, Parmida; Rohling, Robert; Salcudean, Septimiu E; Ng, Gary C

    2017-11-01

    This paper presents a new micro-motion-based approach to track a needle in ultrasound images captured by a handheld transducer. We propose a novel learning-based framework to track a handheld needle by detecting microscale variations of motion dynamics over time. The current state of the art in using motion analysis for needle detection uses absolute motion and hence works well only when the transducer is static. We have introduced and evaluated novel spatiotemporal and spectral features, obtained from the phase image, in a self-supervised tracking framework to improve the detection accuracy in the subsequent frames using incremental training. Our proposed tracking method involves volumetric feature selection and differential flow analysis to incorporate the neighboring pixels and mitigate the effects of the subtle tremor motion of a handheld transducer. To evaluate the detection accuracy, the method is tested on porcine tissue in-vivo, during needle insertion in the biceps femoris muscle. Experimental results show mean, standard deviation and root-mean-square errors of [Formula: see text], [Formula: see text] and [Formula: see text] in the insertion angle, and 0.82, 1.21, and 1.47 mm in the needle tip position, respectively. Compared to appearance-based detection approaches, the proposed method is especially suitable for needles with ultrasonic characteristics that are imperceptible in the static image and to the naked eye.

  2. Estimate the effective connectivity in multi-coupled neural mass model using particle swarm optimization

    NASA Astrophysics Data System (ADS)

    Shan, Bonan; Wang, Jiang; Deng, Bin; Zhang, Zhen; Wei, Xile

    2017-03-01

    Assessment of the effective connectivity among different brain regions during seizure is a crucial problem in neuroscience today. As a consequence, a new model inversion framework of brain function imaging is introduced in this manuscript. This framework is based on approximating brain networks using a multi-coupled neural mass model (NMM). The NMM describes the excitatory and inhibitory neural interactions, capturing the mechanisms involved in seizure initiation, evolution and termination. A particle swarm optimization method is used to estimate the effective connectivity variation (the parameters of the NMM) and the epileptiform dynamics (the states of the NMM) that cannot be directly measured using electrophysiological measurement alone. The estimated effective connectivity includes both the local connectivity parameters within a single-region NMM and the remote connectivity parameters between multi-coupled NMMs. When the epileptiform activities are estimated, a proportional-integral controller outputs a control signal so that the epileptiform spikes can be inhibited immediately. Numerical simulations are carried out to illustrate the effectiveness of the proposed framework. The framework and the results have a profound impact on the way we detect and treat epilepsy.
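Particle swarm optimization itself is generic: candidate parameter vectors ("particles") carry a velocity and are pulled toward both their personal best and the swarm's global best. A minimal sketch, with the NMM likelihood replaced by a hypothetical squared-error objective over three invented "connectivity" parameters:

```python
import random

def pso_minimize(f, dim, n_particles=30, iters=200, seed=0):
    # Minimal particle swarm optimizer with inertia w and the usual
    # cognitive (c1) and social (c2) pulls.
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical "connectivity" parameters to recover; in the paper the
# objective would compare NMM output against recorded signals.
true = [1.2, -0.4, 3.0]
err = lambda p: sum((a - b) ** 2 for a, b in zip(p, true))
best, best_val = pso_minimize(err, dim=3)
print(best_val)  # near 0
```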

  3. Quantifying Variation in Gait Features from Wearable Inertial Sensors Using Mixed Effects Models

    PubMed Central

    Cresswell, Kellen Garrison; Shin, Yongyun; Chen, Shanshan

    2017-01-01

    The emerging technology of wearable inertial sensors has shown its advantages in collecting continuous longitudinal gait data outside laboratories. This freedom also presents challenges in collecting high-fidelity gait data. In the free-living environment, without constant supervision from researchers, sensor-based gait features are susceptible to variation from confounding factors such as gait speed and mounting uncertainty, which are challenging to control or estimate. This paper is one of the first attempts in the field to tackle such challenges using statistical modeling. By accepting the uncertainties and variation associated with wearable sensor-based gait data, we shift our efforts from detecting and correcting those variations to modeling them statistically. From gait data collected on one healthy, non-elderly subject during 48 full-factorial trials, we identified four major sources of variation, and quantified their impact on one gait outcome—range per cycle—using a random effects model and a fixed effects model. The methodology developed in this paper lays the groundwork for a statistical framework to account for sources of variation in wearable gait data, thus facilitating informative statistical inference for free-living gait analysis. PMID:28245602
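A minimal sketch of the variance-component idea: with trials grouped by one factor (say, mounting condition), balanced one-way random-effects ANOVA splits outcome variance into within-group and between-group parts. The readings below are invented, not the paper's data:

```python
def variance_components(groups):
    # Balanced one-way random-effects ANOVA estimators: within-group
    # variance from the mean square within (MSW), between-group (random
    # effect) variance from (MSB - MSW) / n.
    k = len(groups)                          # number of groups
    n = len(groups[0])                       # assumes balanced groups
    grand = sum(sum(g) for g in groups) / (k * n)
    means = [sum(g) / n for g in groups]
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    msw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (k * (n - 1))
    sigma2_within = msw
    sigma2_between = max(0.0, (msb - msw) / n)
    return sigma2_between, sigma2_within

# Hypothetical range-per-cycle readings from three mounting conditions.
data = [[10.1, 10.3, 9.9], [12.0, 11.8, 12.2], [11.0, 10.8, 11.2]]
print(variance_components(data))
```

Here nearly all of the variance is attributed to the grouping factor, the kind of decomposition a mixed effects model formalizes with covariates and crossed factors.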

  4. Quantitative gene-gene and gene-environment mapping for leaf shape variation using tree-based models.

    PubMed

    Fu, Guifang; Dai, Xiaotian; Symanzik, Jürgen; Bushman, Shaun

    2017-01-01

    Leaf shape traits have long been a focus of many disciplines, but the complex genetic and environmental interactive mechanisms regulating leaf shape variation have not yet been investigated in detail. The question of the respective roles of genes and environment and how they interact to modulate leaf shape is a thorny evolutionary problem, and sophisticated methodology is needed to address it. In this study, we investigated a framework-level approach that inputs shape image photographs and genetic and environmental data, and then outputs the relative importance ranks of all variables after integrating shape feature extraction, dimension reduction, and tree-based statistical models. The power of the proposed framework was confirmed by simulation and a Populus szechuanica var. tibetica data set. This new methodology resulted in the detection of novel shape characteristics, and also confirmed some previous findings. The quantitative modeling of a combination of polygenetic, plastic, epistatic, and gene-environment interactive effects, as investigated in this study, will improve the discernment of quantitative leaf shape characteristics, and the methods are ready to be applied to other leaf morphology data sets. Unlike the majority of approaches in the quantitative leaf shape literature, this framework-level approach is data-driven, without assuming any pre-known shape attributes, landmarks, or model structures. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.

  5. Analysis of Idiom Variation in the Framework of Linguistic Subjectivity

    ERIC Educational Resources Information Center

    Liu, Zhengyuan

    2012-01-01

    Idiom variation is a ubiquitous linguistic phenomenon which has raised a lot of research questions. The past approach was either formal or functional. Both of them did not pay much attention to cognitive factors of language users. By putting idiom variation in the framework of linguistic subjectivity, we have offered a new perspective in the…

  6. Variation Theory: A Theory of Learning and a Useful Theoretical Framework for Chemical Education Research

    ERIC Educational Resources Information Center

    Bussey, Thomas J.; Orgill, MaryKay; Crippen, Kent J.

    2013-01-01

    Instructors are constantly baffled by the fact that two students who are sitting in the same class, who have access to the same materials, can come to understand a particular chemistry concept differently. Variation theory offers a theoretical framework from which to explore possible variations in experience and the resulting differences in…

  7. Toward Monitoring Parkinson's Through Analysis of Static Handwriting Samples: A Quantitative Analytical Framework.

    PubMed

    Zhi, Naiqian; Jaeger, Beverly Kris; Gouldstone, Andrew; Sipahi, Rifat; Frank, Samuel

    2017-03-01

    Detection of changes in micrographia as a manifestation of symptomatic progression or therapeutic response in Parkinson's disease (PD) is challenging as such changes can be subtle. A computerized toolkit based on quantitative analysis of handwriting samples would be valuable as it could supplement and support clinical assessments, help monitor micrographia, and link it to PD. Such a toolkit would be especially useful if it could detect subtle yet relevant changes in handwriting morphology, thus enhancing resolution of the detection procedure. This would be made possible by developing a set of metrics sensitive enough to detect and discern micrographia with specificity. Several metrics that are sensitive to the characteristics of micrographia were developed, with minimal sensitivity to confounding handwriting artifacts. These metrics capture character size-reduction, ink utilization, and pixel density within a writing sample from left to right. They are used here to "score" handwritten signatures of 12 different individuals corresponding to healthy and symptomatic PD conditions, and sample control signatures that had been artificially reduced in size for comparison purposes. Moreover, metric analyses of samples from ten of the 12 individuals for which clinical diagnosis time is known show considerable informative variations when applied to static signature samples obtained before and after diagnosis. In particular, a measure called pixel density variation showed statistically significant differences between two comparison groups of remote signature recordings: earlier versus recent, based on independent and paired t-test analyses on a total of 40 signature samples. The quantitative framework developed here has the potential to be used in future controlled experiments to study micrographia and links to PD from various aspects, including monitoring and assessment of applied interventions and treatments.
The inherent value in this methodology is further enhanced by its reliance solely on static signatures, not requiring dynamic sampling with specialized equipment.
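One plausible reading of a left-to-right pixel-density metric (the exact metric definitions are the authors'; the tiny binary grid below is invented for illustration):

```python
def column_densities(image):
    # Fraction of "ink" pixels in each column, scanned left to right.
    rows = len(image)
    cols = len(image[0])
    return [sum(image[r][c] for r in range(rows)) / rows for c in range(cols)]

def density_trend(image):
    # Simple left-to-right trend: mean density of the right half minus
    # the left half. Micrographia-like shrinkage across the sample
    # shows up as a negative trend.
    d = column_densities(image)
    half = len(d) // 2
    return sum(d[half:]) / (len(d) - half) - sum(d[:half]) / half

# Hypothetical 4x8 binary sample whose strokes thin out to the right.
sample = [
    [1, 1, 1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0, 1, 0],
    [1, 0, 1, 1, 0, 1, 0, 0],
    [1, 1, 1, 0, 1, 0, 0, 0],
]
print(density_trend(sample))  # negative: ink density falls off
```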

  8. Designing deep sequencing experiments: detecting structural variation and estimating transcript abundance.

    PubMed

    Bashir, Ali; Bansal, Vikas; Bafna, Vineet

    2010-06-18

    Massively parallel DNA sequencing technologies have enabled the sequencing of several individual human genomes. These technologies are also being used in novel ways for mRNA expression profiling, genome-wide discovery of transcription-factor binding sites, small RNA discovery, etc. The multitude of sequencing platforms, each with their unique characteristics, pose a number of design challenges, regarding the technology to be used and the depth of sequencing required for a particular sequencing application. Here we describe a number of analytical and empirical results to address design questions for two applications: detection of structural variations from paired-end sequencing and estimating mRNA transcript abundance. For structural variation, our results provide explicit trade-offs between the detection and resolution of rearrangement breakpoints, and the optimal mix of paired-read insert lengths. Specifically, we prove that optimal detection and resolution of breakpoints is achieved using a mix of exactly two insert library lengths. Furthermore, we derive explicit formulae to determine these insert length combinations, enabling a 15% improvement in breakpoint detection at the same experimental cost. On empirical short read data, these predictions show good concordance with Illumina 200 bp and 2 kbp insert length libraries. For transcriptome sequencing, we determine the sequencing depth needed to detect rare transcripts from a small pilot study. With only 1 million reads, we derive corrections that enable almost perfect prediction of the underlying expression probability distribution, and use this to predict the sequencing depth required to detect low expressed genes with greater than 95% probability. Together, our results form a generic framework for many design considerations related to high-throughput sequencing.
We provide software tools http://bix.ucsd.edu/projects/NGS-DesignTools to derive platform independent guidelines for designing sequencing experiments (amount of sequencing, choice of insert length, mix of libraries) for novel applications of next generation sequencing.
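The transcript-detection calculation rests on standard binomial arithmetic: if a rare transcript contributes a fraction p of reads, the chance of seeing it at least once in N reads is 1 - (1 - p)^N, which inverts to the depth achieving a target probability. A sketch with a hypothetical p (the paper's corrections to the estimated expression distribution are not reproduced here):

```python
import math

def detection_prob(p, n_reads):
    # Probability that a transcript sampled with per-read probability p
    # appears at least once among n_reads reads (binomial complement).
    return 1.0 - (1.0 - p) ** n_reads

def reads_needed(p, target=0.95):
    # Smallest read count giving at least the target detection probability.
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

p_rare = 1e-6          # hypothetical sampling probability of a rare transcript
n = reads_needed(p_rare)
print(n, detection_prob(p_rare, n))
```

For p = 1e-6 and a 95% target, the required depth is on the order of three million reads, illustrating why a small pilot plus extrapolation is attractive.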

  9. The Impact of Accelerating Faster than Exponential Population Growth on Genetic Variation

    PubMed Central

    Reppell, Mark; Boehnke, Michael; Zöllner, Sebastian

    2014-01-01

    Current human sequencing projects observe an abundance of extremely rare genetic variation, suggesting recent acceleration of population growth. To better understand the impact of such accelerating growth on the quantity and nature of genetic variation, we present a new class of models capable of incorporating faster than exponential growth in a coalescent framework. Our work shows that such accelerated growth affects only the population size in the recent past and thus large samples are required to detect the models’ effects on patterns of variation. When we compare models with fixed initial growth rate, models with accelerating growth achieve very large current population sizes and large samples from these populations contain more variation than samples from populations with constant growth. This increase is driven almost entirely by an increase in singleton variation. Moreover, linkage disequilibrium decays faster in populations with accelerating growth. When we instead condition on current population size, models with accelerating growth result in less overall variation and slower linkage disequilibrium decay compared to models with exponential growth. We also find that pairwise linkage disequilibrium of very rare variants contains information about growth rates in the recent past. Finally, we demonstrate that models of accelerating growth may substantially change estimates of present-day effective population sizes and growth times. PMID:24381333
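One convenient way to see "faster than exponential" growth (an illustrative parameterization, not necessarily the paper's exact model family) is to let the exponent grow as a power of time:

```python
import math

def pop_size(t, n0=1000.0, r=0.05, alpha=1.0):
    # Super-exponential growth sketch: alpha = 1 is ordinary exponential
    # growth; alpha > 1 grows faster than exponential, concentrating the
    # size increase in the recent past (illustrative, invented values).
    return n0 * math.exp(r * t ** alpha)

for t in (10, 20, 40):
    print(t, pop_size(t, alpha=1.0), pop_size(t, alpha=1.3))
```

With matched initial rates, the accelerating trajectory diverges from the exponential one only at later times, consistent with the observation that large samples are needed to detect its signature.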

  10. The impact of accelerating faster than exponential population growth on genetic variation.

    PubMed

    Reppell, Mark; Boehnke, Michael; Zöllner, Sebastian

    2014-03-01

    Current human sequencing projects observe an abundance of extremely rare genetic variation, suggesting recent acceleration of population growth. To better understand the impact of such accelerating growth on the quantity and nature of genetic variation, we present a new class of models capable of incorporating faster than exponential growth in a coalescent framework. Our work shows that such accelerated growth affects only the population size in the recent past and thus large samples are required to detect the models' effects on patterns of variation. When we compare models with fixed initial growth rate, models with accelerating growth achieve very large current population sizes and large samples from these populations contain more variation than samples from populations with constant growth. This increase is driven almost entirely by an increase in singleton variation. Moreover, linkage disequilibrium decays faster in populations with accelerating growth. When we instead condition on current population size, models with accelerating growth result in less overall variation and slower linkage disequilibrium decay compared to models with exponential growth. We also find that pairwise linkage disequilibrium of very rare variants contains information about growth rates in the recent past. Finally, we demonstrate that models of accelerating growth may substantially change estimates of present-day effective population sizes and growth times.

  11. Detection of large color variation in the potentially hazardous asteroid (297274) 1996 SK

    NASA Astrophysics Data System (ADS)

    Lin, Chien-Hsien; Ip, Wing-Huen; Lin, Zhong-Yi; Yoshida, Fumi; Cheng, Yu-Chi

    2014-03-01

    The low-inclination near-Earth asteroid (NEA) (297274) 1996 SK, which is also classified as a potentially hazardous asteroid, has a highly eccentric orbit. It was studied by multi-wavelength photometry within the framework of an NEA color survey at Lulin Observatory. Here, we report the finding of large color variation across the surface of (297274) 1996 SK within one asteroidal rotation period of 4.656 ± 0.122 hours and classify it as an S-type asteroid according to its average colors of B-V = 0.767 ± 0.033, V-R = 0.482 ± 0.021, V-I = 0.801 ± 0.025 and the corresponding relative reflectance spectrum. These results might be indicative of differential space weathering or compositional inhomogeneity in the surface materials.

  12. Detectability index of differential phase contrast CT compared with conventional CT: a preliminary channelized Hotelling observer study

    NASA Astrophysics Data System (ADS)

    Tang, Xiangyang; Yang, Yi; Tang, Shaojie

    2013-03-01

    Under the framework of model observer with signal and background exactly known (SKE/BKE), we investigate the detectability of differential phase contrast CT compared with that of the conventional attenuation-based CT. Using the channelized Hotelling observer and the radially symmetric difference-of-Gaussians channel template, we investigate the detectability index and its variation with the dimensions of the object and detector cells. The preliminary data show that the differential phase contrast CT outperforms the conventional attenuation-based CT significantly in the detectability index when both the object to be detected and the detector cell used for data acquisition are relatively small. However, the differential phase contrast CT's dominance in the detectability index diminishes with increasing dimension of either the object or the detector cell, and virtually disappears once either dimension approaches a threshold. It is hoped that the preliminary data reported in this paper may provide insightful understanding of the differential phase contrast CT's characteristic in the detectability index and its comparison with that of the conventional attenuation-based CT.
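For the SKE case, the channelized Hotelling detectability index reduces to a quadratic form in channel space: d'^2 = dv^T K^-1 dv, with dv the mean channel-output difference between signal-present and signal-absent and K the channel covariance. A two-channel sketch with invented numbers (not the paper's difference-of-Gaussians template outputs):

```python
# Channelized observer detectability sketch with two channels.
# All numbers below are hypothetical.

def detectability(dv, K):
    # d' = sqrt(dv^T K^-1 dv), using a hand-rolled 2x2 inverse.
    (a, b), (c, d) = K
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    tmp = [inv[0][0] * dv[0] + inv[0][1] * dv[1],
           inv[1][0] * dv[0] + inv[1][1] * dv[1]]
    return (dv[0] * tmp[0] + dv[1] * tmp[1]) ** 0.5

dv = [0.8, 0.3]                  # mean signal response per channel
K = [[1.0, 0.2], [0.2, 0.5]]     # channel noise covariance
print(detectability(dv, K))
```

Shrinking the object or the detector cell changes dv and K, which is how the dependence of the detectability index on those dimensions enters.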

  13. Accounting for Incomplete Species Detection in Fish Community Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McManamay, Ryan A; Orth, Dr. Donald J; Jager, Yetta

    2013-01-01

    Riverine fish assemblages are heterogeneous and very difficult to characterize with a one-size-fits-all approach to sampling. Furthermore, detecting changes in fish assemblages over time requires accounting for variation in sampling designs. We present a modeling approach that permits heterogeneous sampling by accounting for site and sampling covariates (including method) in a model-based framework for estimation (versus a sampling-based framework). We snorkeled during three surveys and electrofished during a single survey in a suite of delineated habitats stratified by reach types. We developed single-species occupancy models to determine covariates influencing patch occupancy and species detection probabilities, whereas community occupancy models estimated species richness in light of incomplete detections. For most species, information-theoretic criteria showed higher support for models that included patch size and reach as covariates of occupancy. In addition, models including patch size and sampling method as covariates of detection probabilities also had higher support. Detection probability estimates for snorkeling surveys were higher for larger non-benthic species whereas electrofishing was more effective at detecting smaller benthic species. The number of sites and sampling occasions required to accurately estimate occupancy varied among fish species. For rare benthic species, our results suggested that a higher number of occasions, and especially the addition of electrofishing, may be required to improve detection probabilities and obtain accurate occupancy estimates. Community models suggested that richness was 41% higher than the number of species actually observed and the addition of an electrofishing survey increased estimated richness by 13%. These results can be useful to future fish assemblage monitoring efforts by informing sampling designs, such as site selection (e.g. stratifying based on patch size) and determining effort required (e.g. number of sites versus occasions).
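The single-species model described above has a simple closed-form likelihood (MacKenzie-style single-season occupancy, shown here without the covariates the paper includes): a site with any detection must be occupied, while an all-zero history mixes "occupied but never detected" with "truly unoccupied". A grid-search MLE sketch on invented detection histories:

```python
import itertools
import math

def neg_log_lik(psi, p, histories):
    # Negative log-likelihood of occupancy probability psi and
    # per-occasion detection probability p given 0/1 histories.
    ll = 0.0
    for h in histories:
        J, d = len(h), sum(h)
        if d > 0:
            # Occupied for sure: psi * p^d * (1-p)^(J-d)
            ll += math.log(psi) + d * math.log(p) + (J - d) * math.log(1 - p)
        else:
            # All-zero: occupied-but-missed plus unoccupied.
            ll += math.log(psi * (1 - p) ** J + (1 - psi))
    return -ll

# Hypothetical detection histories: 6 sites x 3 survey occasions.
histories = [(1, 0, 1), (0, 0, 0), (1, 1, 1), (0, 1, 0), (0, 0, 0), (1, 0, 0)]
grid = [i / 100 for i in range(1, 100)]
psi_hat, p_hat = min(itertools.product(grid, grid),
                     key=lambda q: neg_log_lik(q[0], q[1], histories))
print(psi_hat, p_hat)
```

The occupancy estimate exceeds the naive fraction of sites with detections (4/6), because some all-zero sites are attributed to missed detections.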

  14. A stochastic inference of de novo CNV detection and association test in multiplex schizophrenia families.

    PubMed

    Wang, Shi-Heng; Chen, Wei J; Tsai, Yu-Chin; Huang, Yung-Hsiang; Hwu, Hai-Gwo; Hsiao, Chuhsing K

    2013-01-01

    Copy number variation (CNV) is a type of genetic variation in the genome. It is quantified from signal intensity measures and can be assessed repeatedly to reduce the uncertainty of PCR-based typing. Studies have shown that CNVs may lead to phenotypic variation and modification of disease expression. Various challenges exist, however, in the exploration of CNV-disease association. Here we construct latent variables to infer the discrete CNV values and to estimate the probability of mutations. In addition, we propose to pool rare variants to increase the statistical power, and we conduct family studies to mitigate the computational burden in determining the composition of CNVs on each chromosome. To explore in a stochastic sense the association between the collapsed CNV variants and disease status, we utilize a Bayesian hierarchical model incorporating the mutation parameters. This model assigns integers in a probabilistic sense to the quantitatively measured copy numbers, and is able to test simultaneously the association for all variants of interest in a regression framework. This integrative model can account for the uncertainty in copy number assignment and differentiate whether the variation was de novo or inherited on the basis of posterior probabilities. For family studies, this model can accommodate the dependence within family members and among repeated CNV data. Moreover, the Mendelian rule can be assumed under this model and yet the genetic variation, including de novo and inherited variation, can still be included and quantified directly for each individual. Finally, simulation studies show that this model has high true positive and low false positive rates in the detection of de novo mutation.

  15. Predicting behavioural responses to novel organisms: state-dependent detection theory

    PubMed Central

    Sih, Andrew

    2017-01-01

    Human activity alters natural habitats for many species. Understanding variation in animals' behavioural responses to these changing environments is critical. We show how signal detection theory can be used within a wider framework of state-dependent modelling to predict behavioural responses to a major environmental change: novel, exotic species. We allow thresholds for action to be a function of reserves, and demonstrate how optimal thresholds can be calculated. We term this framework ‘state-dependent detection theory’ (SDDT). We focus on behavioural and fitness outcomes when animals continue to use formerly adaptive thresholds following environmental change. In a simple example, we show that exposure to novel animals which appear dangerous—but are actually safe—(e.g. ecotourists) can have catastrophic consequences for ‘prey’ (organisms that respond as if the new organisms are predators), significantly increasing mortality even when the novel species is not predatory. SDDT also reveals that the effect on reproduction can be greater than the effect on lifespan. We investigate factors that influence the effect of novel organisms, and address the potential for behavioural adjustments (via evolution or learning) to recover otherwise reduced fitness. Although effects of environmental change are often difficult to predict, we suggest that SDDT provides a useful route ahead. PMID:28100814

  16. Predicting behavioural responses to novel organisms: state-dependent detection theory.

    PubMed

    Trimmer, Pete C; Ehlman, Sean M; Sih, Andrew

    2017-01-25

    Human activity alters natural habitats for many species. Understanding variation in animals' behavioural responses to these changing environments is critical. We show how signal detection theory can be used within a wider framework of state-dependent modelling to predict behavioural responses to a major environmental change: novel, exotic species. We allow thresholds for action to be a function of reserves, and demonstrate how optimal thresholds can be calculated. We term this framework 'state-dependent detection theory' (SDDT). We focus on behavioural and fitness outcomes when animals continue to use formerly adaptive thresholds following environmental change. In a simple example, we show that exposure to novel animals which appear dangerous-but are actually safe-(e.g. ecotourists) can have catastrophic consequences for 'prey' (organisms that respond as if the new organisms are predators), significantly increasing mortality even when the novel species is not predatory. SDDT also reveals that the effect on reproduction can be greater than the effect on lifespan. We investigate factors that influence the effect of novel organisms, and address the potential for behavioural adjustments (via evolution or learning) to recover otherwise reduced fitness. Although effects of environmental change are often difficult to predict, we suggest that SDDT provides a useful route ahead. © 2017 The Author(s).
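    The key SDDT idea, a threshold for action that shifts with energetic reserves, can be sketched with the classic signal-detection payoff comparison. All costs, the baseline predator probability, and the particular reserve-dependence below are hypothetical illustrations:

```python
def flee_threshold(p_predator, cost_flee, cost_eaten):
    """Likelihood-ratio threshold above which responding (fleeing) pays:
    flee iff LR > ((1 - p) * cost_flee) / (p * cost_eaten)."""
    return ((1 - p_predator) * cost_flee) / (p_predator * cost_eaten)

def should_flee(likelihood_ratio, reserves, p_predator=0.05,
                base_cost_flee=1.0, cost_eaten=100.0):
    # State dependence: with low reserves, the foraging lost by fleeing is
    # relatively more costly, so the flee threshold rises and the animal
    # accepts more predation risk.
    cost_flee = base_cost_flee / max(reserves, 1e-6)
    return likelihood_ratio > flee_threshold(p_predator, cost_flee, cost_eaten)
```

Under this sketch, the same cue (likelihood ratio) triggers fleeing in a well-fed animal but not in a starving one, which is the state-dependent behaviour the framework formalizes.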

  17. Detecting Adaptation in Protein-Coding Genes Using a Bayesian Site-Heterogeneous Mutation-Selection Codon Substitution Model.

    PubMed

    Rodrigue, Nicolas; Lartillot, Nicolas

    2017-01-01

    Codon substitution models have traditionally attempted to uncover signatures of adaptation within protein-coding genes by contrasting the rates of synonymous and non-synonymous substitutions. Another modeling approach, known as the mutation-selection framework, attempts to explicitly account for selective patterns at the amino acid level, with some approaches allowing for heterogeneity in these patterns across codon sites. Under such a model, substitutions at a given position occur at the neutral or nearly neutral rate when they are synonymous, or when they correspond to replacements between amino acids of similar fitness; substitutions from high to low (low to high) fitness amino acids have comparatively low (high) rates. Here, we study the use of such a mutation-selection framework as a null model for the detection of adaptation. Following previous works in this direction, we include a deviation parameter that has the effect of capturing the surplus, or deficit, in non-synonymous rates, relative to what would be expected under a mutation-selection modeling framework that includes a Dirichlet process approach to account for across-codon-site variation in amino acid fitness profiles. We use simulations, along with a few real data sets, to study the behavior of the approach, and find it to have good power with a low false-positive rate. Altogether, we emphasize the potential of recent mutation-selection models in the detection of adaptation, calling for further model refinements as well as large-scale applications. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  18. Compressed Genotyping

    PubMed Central

    Erlich, Yaniv; Gordon, Assaf; Brand, Michael; Hannon, Gregory J.; Mitra, Partha P.

    2011-01-01

    Over the past three decades we have steadily increased our knowledge of the genetic basis of many severe disorders. Nevertheless, there are still great challenges in applying this knowledge routinely in the clinic, mainly due to the relatively tedious and expensive process of genotyping. Since the genetic variations that underlie the disorders are relatively rare in the population, they can be thought of as a sparse signal. Using methods and ideas from compressed sensing and group testing, we have developed a cost-effective genotyping protocol to detect carriers for severe genetic disorders. In particular, we have adapted our scheme to a recently developed class of high throughput DNA sequencing technologies. The mathematical framework presented here has some important distinctions from the 'traditional' compressed sensing and group testing frameworks in order to address biological and technical constraints of our setting. PMID:21451737
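    The group-testing side of such a scheme can be illustrated with a naive non-adaptive decoder. The pooling matrix and single-carrier scenario below are toy assumptions; the paper's actual design is tuned to sequencing-specific constraints:

```python
def decode_carriers(pool_matrix, pool_results):
    """Naive non-adaptive group-testing decoder: individual j remains a
    candidate carrier only if every pool containing j tested positive."""
    n = len(pool_matrix[0])
    candidates = []
    for j in range(n):
        pools_with_j = [i for i, row in enumerate(pool_matrix) if row[j]]
        if pools_with_j and all(pool_results[i] for i in pools_with_j):
            candidates.append(j)
    return candidates

# 6 individuals pooled into 4 assays; individual 2 is the sole carrier,
# so exactly the pools containing individual 2 come back positive.
M = [
    [1, 1, 1, 0, 0, 0],
    [0, 0, 1, 1, 1, 0],
    [1, 0, 0, 1, 0, 1],
    [0, 1, 0, 0, 1, 1],
]
results = [bool(M[i][2]) for i in range(4)]
carriers = decode_carriers(M, results)
```

Four assays suffice here to pinpoint one carrier among six individuals; the sparsity of carriers in the population is what makes this sub-linear number of tests possible.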

  19. Control sample design using a geodemographic discriminator: An application of Super Profiles

    NASA Astrophysics Data System (ADS)

    Brown, Peter J. B.; McCulloch, Peter G.; Williams, Evelyn M. I.; Ashurst, Darren C.

    The development and application of an innovative sampling framework for use in a British study of the early detection of gastric cancer are described. The Super Profiles geodemographic discriminator is used in the identification of geographically distinct control and contrast areas from which samples of cancer registry case records may be drawn for comparison with the records of patients participating in the gastric cancer intervention project. Preliminary results of the application of the framework are presented and confirm its effectiveness in satisfactorily reflecting known patterns of variation in cancer occurrence by age, gender and social class. The method works well for cancers with a known and clear social gradient, such as lung and breast cancer, moderately well for gastric cancer and somewhat less well for oesophageal cancer, where the social class gradient is less clear.

  20. A magnetic flux leakage and magnetostrictive guided wave hybrid transducer for detecting bridge cables.

    PubMed

    Xu, Jiang; Wu, Xinjun; Cheng, Cheng; Ben, Anran

    2012-01-01

    Condition assessment of cables has gained considerable attention for bridge safety. A magnetic flux leakage and magnetostrictive guided wave hybrid transducer is proposed to inspect bridge cables. The similarities and differences between the two methods are investigated. The hybrid transducer for bridge cables consists of an aluminum framework, climbing modules, embedded magnetizers and a ribbon coil. The static axial magnetic field provided by the magnetizers meets the needs of both the magnetic flux leakage testing and the magnetostrictive guided wave testing. The magnetizers also provide the attraction for the climbing modules. In the magnetic flux leakage testing for the free length of cable, the coil induces the axial leakage magnetic field. In the magnetostrictive guided wave testing for the anchorage zone, the coil provides a pulsed high-power varying magnetic field for generating guided waves, and induces the magnetic field variation for receiving guided waves. The experimental results show that the transducer with the corresponding inspection system could be applied to detect broken wires in the free length and in the anchorage zone of bridge cables.

  1. A Magnetic Flux Leakage and Magnetostrictive Guided Wave Hybrid Transducer for Detecting Bridge Cables

    PubMed Central

    Xu, Jiang; Wu, Xinjun; Cheng, Cheng; Ben, Anran

    2012-01-01

    Condition assessment of cables has gained considerable attention for bridge safety. A magnetic flux leakage and magnetostrictive guided wave hybrid transducer is proposed to inspect bridge cables. The similarities and differences between the two methods are investigated. The hybrid transducer for bridge cables consists of an aluminum framework, climbing modules, embedded magnetizers and a ribbon coil. The static axial magnetic field provided by the magnetizers meets the needs of both the magnetic flux leakage testing and the magnetostrictive guided wave testing. The magnetizers also provide the attraction for the climbing modules. In the magnetic flux leakage testing for the free length of cable, the coil induces the axial leakage magnetic field. In the magnetostrictive guided wave testing for the anchorage zone, the coil provides a pulsed high-power varying magnetic field for generating guided waves, and induces the magnetic field variation for receiving guided waves. The experimental results show that the transducer with the corresponding inspection system could be applied to detect broken wires in the free length and in the anchorage zone of bridge cables. PMID:22368483

  2. iCopyDAV: Integrated platform for copy number variations—Detection, annotation and visualization

    PubMed Central

    Vogeti, Sriharsha

    2018-01-01

    Discovery of copy number variations (CNVs), a major category of structural variations, has dramatically changed our understanding of differences between individuals and provides an alternate paradigm for the genetic basis of human diseases. CNVs include both copy gain and copy loss events and their detection genome-wide is now possible using high-throughput, low-cost next generation sequencing (NGS) methods. However, accurate detection of CNVs from NGS data is not straightforward due to non-uniform coverage of reads resulting from various systemic biases. We have developed an integrated platform, iCopyDAV, to handle some of these issues in CNV detection in whole genome NGS data. It has a modular framework comprising five major modules: data pre-treatment, segmentation, variant calling, annotation and visualization. An important feature of iCopyDAV is the functional annotation module that enables the user to identify and prioritize CNVs encompassing various functional elements, genomic features and disease-associations. Parallelization of the segmentation algorithms makes the iCopyDAV platform accessible even on a desktop. Here we show the effect of sequencing coverage, read length, bin size, data pre-treatment and segmentation approaches on accurate detection of the complete spectrum of CNVs. Performance of iCopyDAV is evaluated on both simulated data and real data for different sequencing depths. It is an open-source integrated pipeline available at https://github.com/vogetihrsh/icopydav and as a Docker image at http://bioinf.iiit.ac.in/icopydav/. PMID:29621297
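    The read-depth principle behind such CNV callers can be sketched in a few lines: bin the genome, compare each bin's depth to the genome-wide median on a log2 scale, and threshold. The cutoffs and depths below are illustrative, and real pipelines such as iCopyDAV add pre-treatment (e.g. GC correction) and proper segmentation:

```python
import math
import statistics

def call_cnv_bins(depths, gain_cutoff=0.58, loss_cutoff=-1.0):
    """Toy read-depth CNV caller: log2 ratio of each bin's depth to the
    genome-wide median, thresholded into loss / neutral / gain.
    0.58 ~ log2(3/2), a single-copy gain in a diploid genome; -1.0 is
    log2(1/2), a single-copy loss."""
    med = statistics.median(depths)
    calls = []
    for d in depths:
        ratio = math.log2(max(d, 1e-9) / med)
        if ratio >= gain_cutoff:
            calls.append("gain")
        elif ratio <= loss_cutoff:
            calls.append("loss")
        else:
            calls.append("neutral")
    return calls

# Bins 4-5 have doubled depth (gain); bin 7 has halved depth (loss).
calls = call_cnv_bins([30, 31, 29, 60, 62, 30, 15, 30])
```

The non-uniform coverage the record mentions is exactly why per-bin thresholding alone is insufficient and segmentation across adjacent bins is needed.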

  3. Automatic detection and recognition of signs from natural scenes.

    PubMed

    Chen, Xilin; Yang, Jie; Zhang, Jing; Waibel, Alex

    2004-01-01

    In this paper, we present an approach to automatic detection and recognition of signs from natural scenes, and its application to a sign translation task. The proposed approach embeds multiresolution and multiscale edge detection, adaptive searching, color analysis, and affine rectification in a hierarchical framework for sign detection, with different emphases at each phase to handle the text in different sizes, orientations, color distributions and backgrounds. We use affine rectification to recover deformation of the text regions caused by an inappropriate camera view angle. The procedure can significantly improve text detection rate and optical character recognition (OCR) accuracy. Instead of using binary information for OCR, we extract features from an intensity image directly. We propose a local intensity normalization method to effectively handle lighting variations, followed by a Gabor transform to obtain local features, and finally a linear discriminant analysis (LDA) method for feature selection. We have applied the approach in developing a Chinese sign translation system, which can automatically detect and recognize Chinese signs as input from a camera, and translate the recognized text into English.

  4. Novel droplet platforms for the detection of disease biomarkers.

    PubMed

    Zec, Helena; Shin, Dong Jin; Wang, Tza-Huei

    2014-09-01

    Personalized medicine - healthcare based on individual genetic variation - has the potential to transform the way healthcare is delivered to patients. The promise of personalized medicine has been predicated on the predictive and diagnostic power of genomic and proteomic biomarkers. Biomarker screening may help improve health outcomes, for example, by identifying individuals' susceptibility to diseases and predicting how patients will respond to drugs. Microfluidic droplet technology offers an exciting opportunity to revolutionize the accessibility of personalized medicine. A framework for the role of droplet microfluidics in biomarker detection can be based on two main themes. Emulsion-based microdroplet platforms can provide new ways to measure and detect biomolecules. In addition, microdroplet platforms facilitate high-throughput screening of biomarkers. Meanwhile, surface-based droplet platforms provide an opportunity to develop miniaturized diagnostic systems. These platforms may function as portable benchtop environments that dramatically shorten the transition of a benchtop assay into a point-of-care format.

  5. Exploring Conceptual Frameworks of Models of Atomic Structures and Periodic Variations, Chemical Bonding, and Molecular Shape and Polarity: A Comparison of Undergraduate General Chemistry Students with High and Low Levels of Content Knowledge

    ERIC Educational Resources Information Center

    Wang, Chia-Yu; Barrow, Lloyd H.

    2013-01-01

    The purpose of the study was to explore students' conceptual frameworks of models of atomic structure and periodic variations, chemical bonding, and molecular shape and polarity, and how these conceptual frameworks influence their quality of explanations and ability to shift among chemical representations. This study employed a purposeful sampling…

  6. A framework based on 2-D Taylor expansion for quantifying the impacts of subpixel reflectance variance and covariance on cloud optical thickness and effective radius retrievals based on the bispectral method

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Werner, F.; Cho, H.-M.; Wind, G.; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, K.

    2016-06-01

    The bispectral method retrieves cloud optical thickness (τ) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near-infrared (VIS/NIR) band and the other in a shortwave infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring subpixel variations of cloud reflectances can lead to a significant bias in the retrieved τ and re. In the literature, the retrievals of τ and re are often assumed to be independent and considered separately when investigating the impact of subpixel cloud reflectance variations on the bispectral method. As a result, the impact on τ is contributed only by the subpixel variation of VIS/NIR band reflectance and the impact on re only by the subpixel variation of SWIR band reflectance. In our new framework, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of subpixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the τ and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how subpixel cloud reflectance variations impact the τ and re retrievals based on the bispectral method. In particular, our framework provides a mathematical explanation of how the subpixel variation in VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval. We test our framework using synthetic cloud fields from a large-eddy simulation and real observations from Moderate Resolution Imaging Spectroradiometer. The predicted results based on our framework agree very well with the numerical simulations. 
Our framework can be used to estimate the retrieval uncertainty from subpixel reflectance variations in operational satellite cloud products and to help understand the differences in τ and re retrievals between two instruments.
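    The second-order Taylor relation at the heart of this framework, bias ≈ ½f_xx·Var(x) + ½f_yy·Var(y) + f_xy·Cov(x, y), can be checked numerically. The sketch below uses the illustrative function f(x, y) = x·y (not the actual retrieval operator), for which the bias is exactly the covariance term:

```python
import random
import statistics

def taylor_bias(f_xx, f_yy, f_xy, var_x, var_y, cov_xy):
    """Second-order Taylor estimate of E[f(x, y)] - f(E[x], E[y]):
    0.5*f_xx*Var(x) + 0.5*f_yy*Var(y) + f_xy*Cov(x, y)."""
    return 0.5 * f_xx * var_x + 0.5 * f_yy * var_y + f_xy * cov_xy

# Check against f(x, y) = x*y, whose bias is exactly Cov(x, y)
# (f_xx = f_yy = 0, f_xy = 1).
random.seed(0)
xs = [random.gauss(1.0, 0.2) for _ in range(20000)]
ys = [x + random.gauss(0.0, 0.1) for x in xs]  # correlated "band" signals
mx, my = statistics.fmean(xs), statistics.fmean(ys)
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
bias_mc = statistics.fmean(x * y for x, y in zip(xs, ys)) - mx * my
bias_taylor = taylor_bias(0.0, 0.0, 1.0, statistics.variance(xs),
                          statistics.variance(ys), cov)
```

The covariance term is the part that earlier, independent treatments of the τ and re retrievals necessarily miss, which is why the joint two-variable expansion matters.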

  7. A Framework Based on 2-D Taylor Expansion for Quantifying the Impacts of Sub-Pixel Reflectance Variance and Covariance on Cloud Optical Thickness and Effective Radius Retrievals Based on the Bi-Spectral Method

    NASA Technical Reports Server (NTRS)

    Zhang, Z.; Werner, F.; Cho, H. -M.; Wind, G.; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, Kerry

    2016-01-01

    The bi-spectral method retrieves cloud optical thickness (τ) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near-infrared (VISNIR) band and the other in a shortwave infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring sub-pixel variations of cloud reflectances can lead to a significant bias in the retrieved τ and re. In the literature, the retrievals of τ and re are often assumed to be independent and considered separately when investigating the impact of sub-pixel cloud reflectance variations on the bi-spectral method. As a result, the impact on τ is contributed only by the sub-pixel variation of VISNIR band reflectance and the impact on re only by the sub-pixel variation of SWIR band reflectance. In our new framework, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of sub-pixel variances of VISNIR and SWIR cloud reflectances and their covariance on the τ and re retrievals. This framework takes into account the fact that the retrievals are determined by both VISNIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how sub-pixel cloud reflectance variations impact the τ and re retrievals based on the bi-spectral method. In particular, our framework provides a mathematical explanation of how the sub-pixel variation in VISNIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval. We test our framework using synthetic cloud fields from a large-eddy simulation and real observations from Moderate Resolution Imaging Spectroradiometer. The predicted results based on our framework agree very well with the numerical simulations. 
Our framework can be used to estimate the retrieval uncertainty from sub-pixel reflectance variations in operational satellite cloud products and to help understand the differences in τ and re retrievals between two instruments.

  8. A Framework Based on 2-D Taylor Expansion for Quantifying the Impacts of Subpixel Reflectance Variance and Covariance on Cloud Optical Thickness and Effective Radius Retrievals Based on the Bispectral Method

    NASA Technical Reports Server (NTRS)

    Zhang, Z.; Werner, F.; Cho, H.-M.; Wind, G.; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, K.

    2016-01-01

    The bispectral method retrieves cloud optical thickness (t) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near-infrared (VIS/NIR) band and the other in a shortwave infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring subpixel variations of cloud reflectances can lead to a significant bias in the retrieved t and re. In the literature, the retrievals of t and re are often assumed to be independent and considered separately when investigating the impact of subpixel cloud reflectance variations on the bispectral method. As a result, the impact on t is contributed only by the subpixel variation of VIS/NIR band reflectance and the impact on re only by the subpixel variation of SWIR band reflectance. In our new framework, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of subpixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the t and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how subpixel cloud reflectance variations impact the t and re retrievals based on the bispectral method. In particular, our framework provides a mathematical explanation of how the subpixel variation in VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval. We test our framework using synthetic cloud fields from a large-eddy simulation and real observations from Moderate Resolution Imaging Spectroradiometer. The predicted results based on our framework agree very well with the numerical simulations. 
Our framework can be used to estimate the retrieval uncertainty from subpixel reflectance variations in operational satellite cloud products and to help understand the differences in t and re retrievals between two instruments.

  9. An approach to analyze the breast tissues in infrared images using nonlinear adaptive level sets and Riesz transform features.

    PubMed

    Prabha, S; Suganthi, S S; Sujatha, C M

    2015-01-01

    Breast thermography is a potential imaging method for the early detection of breast cancer. The pathological conditions can be determined by measuring temperature variations in the abnormal breast regions. Accurate delineation of breast tissues is reported as a challenging task due to inherent limitations of infrared images such as low contrast, low signal to noise ratio and absence of clear edges. Segmentation technique is attempted to delineate the breast tissues by detecting proper lower breast boundaries and inframammary folds. Characteristic features are extracted to analyze the asymmetrical thermal variations in normal and abnormal breast tissues. An automated analysis of thermal variations of breast tissues is attempted using nonlinear adaptive level sets and Riesz transform. Breast thermal images are initially subjected to Stein's unbiased risk estimate based orthonormal wavelet denoising. These denoised images are enhanced using contrast-limited adaptive histogram equalization method. The breast tissues are then segmented using non-linear adaptive level set method. The phase map of enhanced image is integrated into the level set framework for final boundary estimation. The segmented results are validated against the corresponding ground truth images using overlap and regional similarity metrics. The segmented images are further processed with Riesz transform and structural texture features are derived from the transformed coefficients to analyze pathological conditions of breast tissues. Results show that the estimated average signal to noise ratio of denoised images and average sharpness of enhanced images are improved by 38% and 6% respectively. The interscale consideration adopted in the denoising algorithm is able to improve signal to noise ratio by preserving edges. The proposed segmentation framework could delineate the breast tissues with high degree of correlation (97%) between the segmented and ground truth areas. 
Also, the average segmentation accuracy and sensitivity are both found to be 98%. Similarly, the maximum regional overlap between segmented and ground truth images, obtained using the volume similarity measure, is observed to be 99%. Directionality as a feature showed a considerable difference (11%) between normal and abnormal tissues. The proposed framework for breast thermal image analysis, aided by the necessary preprocessing, is found to be useful in assisting the early diagnosis of breast abnormalities.

  10. Signature detection and matching for document image retrieval.

    PubMed

    Zhu, Guangyu; Zheng, Yefeng; Doermann, David; Jaeger, Stefan

    2009-11-01

    As one of the most pervasive methods of individual identification and document authentication, signatures present convincing evidence and provide an important form of indexing for effective document image processing and retrieval in a broad range of applications. However, detection and segmentation of free-form objects such as signatures from a cluttered background is currently an open document analysis problem. In this paper, we focus on two fundamental problems in signature-based document image retrieval. First, we propose a novel multiscale approach to jointly detecting and segmenting signatures from document images. Rather than focusing on local features that typically have large variations, our approach captures the structural saliency using a signature production model and computes the dynamic curvature of 2D contour fragments over multiple scales. This detection framework is general and computationally tractable. Second, we treat the problem of signature retrieval in the unconstrained setting of translation, scale, and rotation invariant nonrigid shape matching. We propose two novel measures of shape dissimilarity based on anisotropic scaling and registration residual error and present a supervised learning framework for combining complementary shape information from different dissimilarity metrics using LDA. We quantitatively study state-of-the-art shape representations, shape matching algorithms, measures of dissimilarity, and the use of multiple instances as query in document image retrieval. We further demonstrate our matching techniques in offline signature verification. Extensive experiments using large real-world collections of English and Arabic machine-printed and handwritten documents demonstrate the excellent performance of our approaches.

  11. Systematic evaluation of deep learning based detection frameworks for aerial imagery

    NASA Astrophysics Data System (ADS)

    Sommer, Lars; Steinmann, Lucas; Schumann, Arne; Beyerer, Jürgen

    2018-04-01

    Object detection in aerial imagery is crucial for many applications in the civil and military domain. In recent years, deep learning based object detection frameworks significantly outperformed conventional approaches based on hand-crafted features on several datasets. However, these detection frameworks are generally designed and optimized for common benchmark datasets, which considerably differ from aerial imagery especially in object sizes. As already demonstrated for Faster R-CNN, several adaptations are necessary to account for these differences. In this work, we adapt several state-of-the-art detection frameworks including Faster R-CNN, R-FCN, and Single Shot MultiBox Detector (SSD) to aerial imagery. We discuss adaptations that mainly improve the detection accuracy of all frameworks in detail. As the output of deeper convolutional layers comprises more semantic information, these layers are generally used in detection frameworks as feature maps to locate and classify objects. However, the resolution of these feature maps is insufficient for handling small object instances, which results in an inaccurate localization or incorrect classification of small objects. Furthermore, state-of-the-art detection frameworks perform bounding box regression to predict the exact object location. Therefore, so-called anchor or default boxes are used as a reference. We demonstrate how an appropriate choice of anchor box sizes can considerably improve detection performance. Furthermore, we evaluate the impact of the performed adaptations on two publicly available datasets to account for various ground sampling distances or differing backgrounds. The presented adaptations can be used as a guideline for further datasets or detection frameworks.
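    Adapting anchor sizes to a dataset's object-size statistics is commonly done by clustering the training boxes' dimensions. A minimal 1-D k-means sketch (a hypothetical helper with deterministic quantile initialization, not the authors' procedure) might look like:

```python
def kmeans_anchor_sizes(box_sizes, k=3, iters=50):
    """Choose k anchor box sizes by 1-D k-means over object side lengths,
    so anchors match the (typically small) objects in aerial imagery."""
    sizes = sorted(box_sizes)
    # Deterministic initialization: spread centers over the sorted sizes.
    centers = [sizes[(len(sizes) * (2 * i + 1)) // (2 * k)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for s in sizes:
            nearest = min(range(k), key=lambda c: abs(s - centers[c]))
            clusters[nearest].append(s)
        # Recompute each center as its cluster mean (keep old if empty).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Three natural object-size groups in the training boxes yield three anchors.
anchors = kmeans_anchor_sizes([9, 10, 11, 29, 30, 31, 88, 90, 92])
```

With anchors matched to the dataset's dominant object sizes, the bounding box regressor only has to predict small offsets, which is where the detection gains come from.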

  12. Framework flexibility of ZIF-8 under liquid intrusion: discovering time-dependent mechanical response and structural relaxation.

    PubMed

    Sun, Yueting; Li, Yibing; Tan, Jin-Chong

    2018-04-18

    The structural flexibility of a topical zeolitic imidazolate framework with sodalite topology, termed ZIF-8, has been elucidated through liquid intrusion under moderate pressures (i.e. tens of MPa). By tracking the evolution of water intrusion pressure under cyclic conditions, we interrogate the role of the gate-opening mechanism controlling the size variation of the pore channels of ZIF-8. Interestingly, we demonstrate that its channel deformation is recoverable through structural relaxation over time, hence revealing the viscoelastic mechanical response in ZIF-8. We propose a simple approach employing a glycerol-water solution mixture, which can significantly enhance the sensitivity of intrusion pressure for the detection of structural deformation in ZIF-8. By leveraging the time-dependent gate-opening phenomenon in ZIF-8, we achieved a notable improvement (50%) in energy dissipation during multicycle mechanical deformation experiments.

  13. Conceptual Variation in the Depiction of Gene Function in Upper Secondary School Textbooks

    ERIC Educational Resources Information Center

    Gericke, Niklas Markus; Hagberg, Mariana

    2010-01-01

    This paper explores conceptual variation in the depiction of gene function in upper secondary school textbooks. Historically, concepts in genetics have developed in various scientific frameworks, which has led to a level of incommensurability as concepts have changed over time within their respective frameworks. Since students may have…

  14. Variation tolerant SoC design

    NASA Astrophysics Data System (ADS)

    Kozhikkottu, Vivek J.

    The scaling of integrated circuits into the nanometer regime has led to variations emerging as a primary concern for designers of integrated circuits. Variations are an inevitable consequence of the semiconductor manufacturing process, and also arise due to the side-effects of operation of integrated circuits (voltage, temperature, and aging). Conventional design approaches, which are based on design corners or worst-case scenarios, leave designers with an undesirable choice between the considerable overheads associated with over-design and significantly reduced manufacturing yield. Techniques for variation-tolerant design at the logic, circuit and layout levels of the design process have been developed and are in commercial use. However, with the incessant increase in variations due to technology scaling and design trends such as near-threshold computing, these techniques are no longer sufficient to contain the effects of variations, and there is a need to address variations at all stages of design. This thesis addresses the problem of variation-tolerant design at the earliest stages of the design process, where the system-level design decisions that are made can have a very significant impact. There are two key aspects to making system-level design variation-aware. First, analysis techniques must be developed to project the impact of variations on system-level metrics such as application performance and energy. Second, variation-tolerant design techniques need to be developed to absorb the residual impact of variations (that cannot be contained through lower-level techniques). In this thesis, we address both these facets by developing robust and scalable variation-aware analysis and variation mitigation techniques at the system level. The first contribution of this thesis is a variation-aware system-level performance analysis framework. 
We address the key challenge of translating the per-component clock frequency distributions into a system-level application performance distribution. This task is particularly complex and challenging due to the inter-dependencies between components' execution, indirect effects of shared resources, and interactions between multiple system-level "execution paths". We argue that accurate variation-aware performance analysis requires Monte-Carlo based repeated system execution. Our proposed analysis framework leverages emulation to significantly speedup performance analysis without sacrificing the generality and accuracy achieved by Monte-Carlo based simulations. Our experiments show performance improvements of around 60x compared to state-of-the-art hardware-software co-simulation tools and also underscore the framework's potential to enable variation-aware design and exploration at the system level. Our second contribution addresses the problem of designing variation-tolerant SoCs using recovery based design, a popular circuit design paradigm that addresses variations by eliminating guard-bands and operating circuits at close to "zero margins" while detecting and recovering from timing errors. While previous efforts have demonstrated the potential benefits of recovery based design, we identify several challenges that need to be addressed in order to apply this technique to SoCs. We present a systematic design framework to apply recovery based design at the system level. We propose to partition SoCs into "recovery islands", wherein each recovery island consists of one or more SoC components that can recover independent of the rest of the SoC. We present a variation-aware design methodology that partitions a given SoC into recovery islands and computes the optimal operating points for each island, taking into account the various trade-offs involved. 
Our experiments demonstrate that the proposed design framework achieves an average of 32% energy savings over conventional worst-case designs, with negligible performance loss. The third contribution of this thesis introduces disproportionate allocation of shared system resources as a means to combat the adverse impact of within-die variations on multi-core platforms. For multi-threaded programs executing on variation-impacted multi-core platforms, we make the key observation that thread performance is not only a function of the frequency of the core on which it executes, but also depends on the amount of shared system resources allocated to it. We utilize this insight to design a variation-aware runtime scheme that allocates the ways of a last-level shared L2 cache among the different cores/threads of a multi-core platform, taking into account both application characteristics and chip-specific variation profiles. Our experiments on 100 quad-core chips, each with a distinct variation profile, show an average performance improvement of 15% for a suite of multi-threaded benchmarks. Our final contribution investigates the variation-tolerant design of domain-specific accelerators and demonstrates how the unique architectural properties of these accelerators can be leveraged to create highly effective variation tolerance mechanisms. We explore this concept through the variation-tolerant design of a vector processor that efficiently executes applications from the domains of recognition, mining and synthesis (RMS). We develop a novel design approach for variation tolerance, which leverages the unique nature of the vector reduction operations performed by this processor to effectively predict and preempt the occurrence of timing errors under variations and subsequently restore the correct output at the end of each vector reduction operation.
We implement the above predict, preempt and restore operations by suitably enhancing the processor hardware and the application software, and demonstrate considerable energy benefits (32% on average) across six applications from the RMS domains. In conclusion, our work provides system designers with powerful tools and mechanisms in their efforts to combat variations, resulting in improved designer productivity and variation-tolerant systems.

  15. Can the envisaged reductions of fossil fuel CO2 emissions be detected by atmospheric observations?

    PubMed

    Levin, Ingeborg; Rödenbeck, Christian

    2008-03-01

The lower troposphere is an excellent receptacle that integrates anthropogenic greenhouse gas emissions over large areas. Therefore, atmospheric concentration observations over populated regions would provide the ultimate proof of whether sustained emission changes have occurred. The most important anthropogenic greenhouse gas, carbon dioxide (CO₂), also shows large natural concentration variations, which need to be disentangled from anthropogenic signals to assess changes in associated emissions. This is in principle possible for the fossil fuel CO₂ component (FFCO₂) by high-precision radiocarbon (¹⁴C) analyses, because FFCO₂ is free of radiocarbon. Long-term observations of ¹⁴CO₂ conducted at two sites in south-western Germany do not yet reveal any significant trends in the regional fossil fuel CO₂ component. We rather observe strong inter-annual variations, which are largely imprinted by changes of atmospheric transport, as supported by dedicated transport model simulations of fossil fuel CO₂. In this paper, we show that, depending on the remoteness of the site, changes of about 7-26% in fossil fuel emissions in the respective catchment areas could be detected with confidence by high-precision atmospheric ¹⁴CO₂ measurements when comparing 5-year averages, provided these inter-annual variations are taken into account. This perspective constitutes the urgently needed tool for validation of fossil fuel CO₂ emission changes in the framework of the Kyoto protocol and successive climate initiatives.

  16. Development of a robust analytical framework for assessing landbird trends, dynamics and relationships with environmental covariates in the North Coast and Cascades Network

    USGS Publications Warehouse

    Ray, Chris; Saracco, James; Jenkins, Kurt J.; Huff, Mark; Happe, Patricia J.; Ransom, Jason I.

    2017-01-01

    During 2015-2016, we completed development of a new analytical framework for landbird population monitoring data from the National Park Service (NPS) North Coast and Cascades Inventory and Monitoring Network (NCCN). This new tool for analysis combines several recent advances in modeling population status and trends using point-count data and is designed to supersede the approach previously slated for analysis of trends in the NCCN and other networks, including the Sierra Nevada Network (SIEN). Advances supported by the new model-based approach include 1) the use of combined data on distance and time of detection to estimate detection probability without assuming perfect detection at zero distance, 2) seamless accommodation of variation in sampling effort and missing data, and 3) straightforward estimation of the effects of downscaled climate and other local habitat characteristics on spatial and temporal trends in landbird populations. No changes in the current field protocol are necessary to facilitate the new analyses. We applied several versions of the new model to data from each of 39 species recorded in the three mountain parks of the NCCN, estimating trends and climate relationships for each species during 2005-2014. Our methods and results are also reported in a manuscript in revision for the journal Ecosphere (hereafter, Ray et al.). Here, we summarize the methods and results outlined in depth by Ray et al., discuss benefits of the new analytical framework, and provide recommendations for its application to synthetic analyses of long-term data from the NCCN and SIEN. All code necessary for implementing the new analyses is provided within the Appendices to this report, in the form of fully annotated scripts written in the open-access programming languages R and JAGS.

  17. Bino variations: Effective field theory methods for dark matter direct detection

    NASA Astrophysics Data System (ADS)

    Berlin, Asher; Robertson, Denis S.; Solon, Mikhail P.; Zurek, Kathryn M.

    2016-05-01

    We apply effective field theory methods to compute bino-nucleon scattering, in the case where tree-level interactions are suppressed and the leading contribution is at loop order via heavy flavor squarks or sleptons. We find that leading log corrections to fixed-order calculations can increase the bino mass reach of direct detection experiments by a factor of 2 in some models. These effects are particularly large for the bino-sbottom coannihilation region, where bino dark matter as heavy as 5-10 TeV may be detected by near future experiments. For the case of stop- and selectron-loop mediated scattering, an experiment reaching the neutrino background will probe thermal binos as heavy as 500 and 300 GeV, respectively. We present three key examples that illustrate in detail the framework for determining weak scale coefficients, and for mapping onto a low-energy theory at hadronic scales, through a sequence of effective theories and renormalization group evolution. For the case of a squark degenerate with the bino, we extend the framework to include a squark degree of freedom at low energies using heavy particle effective theory, thus accounting for large logarithms through a "heavy-light current." Benchmark predictions for scattering cross sections are evaluated, including complete leading order matching onto quark and gluon operators, and a systematic treatment of perturbative and hadronic uncertainties.

  18. Bino variations: Effective field theory methods for dark matter direct detection

    DOE PAGES

    Berlin, Asher; Robertson, Denis S.; Solon, Mikhail P.; ...

    2016-05-10

We apply effective field theory methods to compute bino-nucleon scattering, in the case where tree-level interactions are suppressed and the leading contribution is at loop order via heavy flavor squarks or sleptons. We find that leading log corrections to fixed-order calculations can increase the bino mass reach of direct detection experiments by a factor of 2 in some models. These effects are particularly large for the bino-sbottom coannihilation region, where bino dark matter as heavy as 5–10 TeV may be detected by near future experiments. For the case of stop- and selectron-loop mediated scattering, an experiment reaching the neutrino background will probe thermal binos as heavy as 500 and 300 GeV, respectively. We present three key examples that illustrate in detail the framework for determining weak scale coefficients, and for mapping onto a low-energy theory at hadronic scales, through a sequence of effective theories and renormalization group evolution. For the case of a squark degenerate with the bino, we extend the framework to include a squark degree of freedom at low energies using heavy particle effective theory, thus accounting for large logarithms through a “heavy-light current.” Finally, benchmark predictions for scattering cross sections are evaluated, including complete leading order matching onto quark and gluon operators, and a systematic treatment of perturbative and hadronic uncertainties.

  19. A software framework for real-time multi-modal detection of microsleeps.

    PubMed

    Knopp, Simon J; Bones, Philip J; Weddell, Stephen J; Jones, Richard D

    2017-09-01

    A software framework is described which was designed to process EEG, video of one eye, and head movement in real time, towards achieving early detection of microsleeps for prevention of fatal accidents, particularly in transport sectors. The framework is based around a pipeline structure with user-replaceable signal processing modules. This structure can encapsulate a wide variety of feature extraction and classification techniques and can be applied to detecting a variety of aspects of cognitive state. Users of the framework can implement signal processing plugins in C++ or Python. The framework also provides a graphical user interface and the ability to save and load data to and from arbitrary file formats. Two small studies are reported which demonstrate the capabilities of the framework in typical applications: monitoring eye closure and detecting simulated microsleeps. While specifically designed for microsleep detection/prediction, the software framework can be just as appropriately applied to (i) other measures of cognitive state and (ii) development of biomedical instruments for multi-modal real-time physiological monitoring and event detection in intensive care, anaesthesiology, cardiology, neurosurgery, etc. The software framework has been made freely available for researchers to use and modify under an open source licence.

  20. Rare variation facilitates inferences of fine-scale population structure in humans.

    PubMed

    O'Connor, Timothy D; Fu, Wenqing; Mychaleckyj, Josyf C; Logsdon, Benjamin; Auer, Paul; Carlson, Christopher S; Leal, Suzanne M; Smith, Joshua D; Rieder, Mark J; Bamshad, Michael J; Nickerson, Deborah A; Akey, Joshua M

    2015-03-01

Understanding the genetic structure of human populations has important implications for the design and interpretation of disease mapping studies and reconstructing human evolutionary history. To date, inferences of human population structure have primarily been made with common variants. However, recent large-scale resequencing studies have shown an abundance of rare variation in humans, which may be particularly useful for making inferences of fine-scale population structure. To this end, we used an information theory framework and extensive coalescent simulations to rigorously quantify the informativeness of rare and common variation to detect signatures of fine-scale population structure. We show that rare variation affords unique insights into patterns of recent population structure. Furthermore, to empirically assess our theoretical findings, we analyzed high-coverage exome sequences in 6,515 European and African American individuals. As predicted, rare variants are more informative than common polymorphisms in revealing a distinct cluster of European-American individuals, and subsequent analyses demonstrate that these individuals are likely of Ashkenazi Jewish ancestry. Our results provide new insights into population structure based on rare variation, which will be an important factor to account for in rare-variant association studies. © The Author 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  1. Using genomics to characterize evolutionary potential for conservation of wild populations

    PubMed Central

    Harrisson, Katherine A; Pavlova, Alexandra; Telonis-Scott, Marina; Sunnucks, Paul

    2014-01-01

Genomics promises exciting advances towards the important conservation goal of maximizing evolutionary potential, notwithstanding associated challenges. Here, we explore some of the complexity of adaptation genetics and discuss the strengths and limitations of genomics as a tool for characterizing evolutionary potential in the context of conservation management. Many traits are polygenic and can be strongly influenced by minor differences in regulatory networks and by epigenetic variation not visible in DNA sequence. Much of this critical complexity is difficult to detect using methods commonly used to identify adaptive variation, and this needs appropriate consideration when planning genomic screens, and when basing management decisions on genomic data. When the genomic basis of adaptation and future threats are well understood, it may be appropriate to focus management on particular adaptive traits. For more typical conservation scenarios, we argue that screening genome-wide variation is a sensible approach that may provide a generalized measure of evolutionary potential that accounts for the contributions of small-effect loci and cryptic variation and is robust to uncertainty about future change and required adaptive response(s). The best conservation outcomes should be achieved when genomic estimates of evolutionary potential are used within an adaptive management framework. PMID:25553064

  2. Parenchymal texture analysis in digital mammography: robust texture feature identification and equivalence across devices.

    PubMed

    Keller, Brad M; Oustimov, Andrew; Wang, Yan; Chen, Jinbo; Acciavatti, Raymond J; Zheng, Yuanjie; Ray, Shonket; Gee, James C; Maidment, Andrew D A; Kontos, Despina

    2015-04-01

    An analytical framework is presented for evaluating the equivalence of parenchymal texture features across different full-field digital mammography (FFDM) systems using a physical breast phantom. Phantom images (FOR PROCESSING) are acquired from three FFDM systems using their automated exposure control setting. A panel of texture features, including gray-level histogram, co-occurrence, run length, and structural descriptors, are extracted. To identify features that are robust across imaging systems, a series of equivalence tests are performed on the feature distributions, in which the extent of their intersystem variation is compared to their intrasystem variation via the Hodges-Lehmann test statistic. Overall, histogram and structural features tend to be most robust across all systems, and certain features, such as edge enhancement, tend to be more robust to intergenerational differences between detectors of a single vendor than to intervendor differences. Texture features extracted from larger regions of interest (i.e., [Formula: see text]) and with a larger offset length (i.e., [Formula: see text]), when applicable, also appear to be more robust across imaging systems. This framework and observations from our experiments may benefit applications utilizing mammographic texture analysis on images acquired in multivendor settings, such as in multicenter studies of computer-aided detection and breast cancer risk assessment.

  3. Improved maximum average correlation height filter with adaptive log base selection for object recognition

    NASA Astrophysics Data System (ADS)

    Tehsin, Sara; Rehman, Saad; Awan, Ahmad B.; Chaudry, Qaiser; Abbas, Muhammad; Young, Rupert; Asif, Afia

    2016-04-01

    Sensitivity to the variations in the reference image is a major concern when recognizing target objects. A combinational framework of correlation filters and logarithmic transformation has been previously reported to resolve this issue alongside catering for scale and rotation changes of the object in the presence of distortion and noise. In this paper, we have extended the work to include the influence of different logarithmic bases on the resultant correlation plane. The meaningful changes in correlation parameters along with contraction/expansion in the correlation plane peak have been identified under different scenarios. Based on our research, we propose some specific log bases to be used in logarithmically transformed correlation filters for achieving suitable tolerance to different variations. The study is based upon testing a range of logarithmic bases for different situations and finding an optimal logarithmic base for each particular set of distortions. Our results show improved correlation and target detection accuracies.

  4. Digital PCR Modeling for Maximal Sensitivity, Dynamic Range and Measurement Precision

    PubMed Central

    Majumdar, Nivedita; Wessel, Thomas; Marks, Jeffrey

    2015-01-01

    The great promise of digital PCR is the potential for unparalleled precision enabling accurate measurements for genetic quantification. A challenge associated with digital PCR experiments, when testing unknown samples, is to perform experiments at dilutions allowing the detection of one or more targets of interest at a desired level of precision. While theory states that optimal precision (Po) is achieved by targeting ~1.59 mean copies per partition (λ), and that dynamic range (R) includes the space spanning one positive (λL) to one negative (λU) result from the total number of partitions (n), these results are tempered for the practitioner seeking to construct digital PCR experiments in the laboratory. A mathematical framework is presented elucidating the relationships between precision, dynamic range, number of partitions, interrogated volume, and sensitivity in digital PCR. The impact that false reaction calls and volumetric variation have on sensitivity and precision is next considered. The resultant effects on sensitivity and precision are established via Monte Carlo simulations reflecting the real-world likelihood of encountering such scenarios in the laboratory. The simulations provide insight to the practitioner on how to adapt experimental loading concentrations to counteract any one of these conditions. The framework is augmented with a method of extending the dynamic range of digital PCR, with and without increasing n, via the use of dilutions. An example experiment demonstrating the capabilities of the framework is presented enabling detection across 3.33 logs of starting copy concentration. PMID:25806524
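The Poisson relationships behind these statements can be sketched in a few lines (a minimal illustration of the standard digital PCR single-hit model, not the authors' framework; function names are my own):

```python
import math

def copies_per_partition(n_total, n_negative):
    """Poisson single-hit estimate of lambda: P(negative partition) = exp(-lambda)."""
    return -math.log(n_negative / n_total)

def cv_of_estimate(n_total, lam):
    """Delta-method coefficient of variation of the lambda estimate."""
    p_neg = math.exp(-lam)
    return math.sqrt((1.0 - p_neg) / (n_total * p_neg)) / lam

# Scanning lambda over a grid shows precision is best near ~1.59 copies
# per partition, the optimum cited in the abstract (here with 20,000 partitions).
best = min((cv_of_estimate(20000, x / 100.0), x / 100.0) for x in range(10, 500))
```

A practitioner loading an unknown sample would aim the expected dilution near this optimum; moving toward either end of the dynamic range (almost all partitions positive or almost all negative) inflates the coefficient of variation sharply.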

  5. Aberrant Gene Expression in Humans

    PubMed Central

    Yang, Ence; Ji, Guoli; Brinkmeyer-Langford, Candice L.; Cai, James J.

    2015-01-01

    Gene expression as an intermediate molecular phenotype has been a focus of research interest. In particular, studies of expression quantitative trait loci (eQTL) have offered promise for understanding gene regulation through the discovery of genetic variants that explain variation in gene expression levels. Existing eQTL methods are designed for assessing the effects of common variants, but not rare variants. Here, we address the problem by establishing a novel analytical framework for evaluating the effects of rare or private variants on gene expression. Our method starts from the identification of outlier individuals that show markedly different gene expression from the majority of a population, and then reveals the contributions of private SNPs to the aberrant gene expression in these outliers. Using population-scale mRNA sequencing data, we identify outlier individuals using a multivariate approach. We find that outlier individuals are more readily detected with respect to gene sets that include genes involved in cellular regulation and signal transduction, and less likely to be detected with respect to the gene sets with genes involved in metabolic pathways and other fundamental molecular functions. Analysis of polymorphic data suggests that private SNPs of outlier individuals are enriched in the enhancer and promoter regions of corresponding aberrantly-expressed genes, suggesting a specific regulatory role of private SNPs, while the commonly-occurring regulatory genetic variants (i.e., eQTL SNPs) show little evidence of involvement. Additional data suggest that non-genetic factors may also underlie aberrant gene expression. Taken together, our findings advance a novel viewpoint relevant to situations wherein common eQTLs fail to predict gene expression when heritable, rare inter-individual variation exists. 
The analytical framework we describe, taking into consideration the reality of differential phenotypic robustness, may be valuable for investigating complex traits and conditions. PMID:25617623

  6. Improved native UV laser induced fluorescence detection for single cell analysis in poly(dimethylsiloxane) microfluidic devices.

    PubMed

    Hellmich, Wibke; Greif, Dominik; Pelargus, Christoph; Anselmetti, Dario; Ros, Alexandra

    2006-10-20

Single cell analytics is a key method in the framework of proteome research, allowing analyses that are not subject to ensemble-averaging, cell-cycle, or heterogeneous cell-population effects. Our previous studies on single cell analysis in poly(dimethylsiloxane) microfluidic devices with native label-free laser induced fluorescence detection [W. Hellmich, C. Pelargus, K. Leffhalm, A. Ros, D. Anselmetti, Electrophoresis 26 (2005) 3689] were extended in order to improve separation efficiency and detection sensitivity. Here, we particularly focus on the influence of poly(oxyethylene) based coatings on the separation performance. In addition, the influence on background fluorescence is studied by variation of the incident laser power as well as adaptation of the confocal volume to the microfluidic channel dimensions. Last but not least, the use of carbon black particles further enhanced the detection limit to 25 nM, thereby reaching the concentration ranges relevant for the label-free detection of low abundance proteins in single cells. On the basis of these results, we demonstrate the first electropherogram from an individual Spodoptera frugiperda (Sf9) cell with native label-free UV-LIF detection in a microfluidic chip.

  7. Single cell digital polymerase chain reaction on self-priming compartmentalization chip

    PubMed Central

    Zhu, Qiangyuan; Qiu, Lin; Xu, Yanan; Li, Guang; Mu, Ying

    2017-01-01

Single cell analysis provides a new framework for understanding biology and disease; however, absolute quantification of single cell gene expression still faces many challenges. Microfluidic digital polymerase chain reaction (PCR) provides a unique method to absolutely quantify single cell gene expression, but few devices have been developed to analyze single cells, and detection variation remains an issue. This paper describes a self-priming compartmentalization (SPC) microfluidic digital PCR chip capable of performing single-molecule amplification from a single cell. The chip can detect four single cells simultaneously with 85% sample digitization. Using an optimized protocol for the SPC chip, we first tested the ability, precision, and sensitivity of the chip by assessing β-actin gene expression in 1, 10, 100, and 1000 cells. The reproducibility of the SPC chip was evaluated by testing 18S rRNA in single cells, yielding coefficients of variation of 1.6%–4.6%. Finally, by detecting expression of the lung cancer-related gene PLAU in A549 cells at the single cell level, single cell heterogeneity was demonstrated. With this power-free, valve-free SPC chip, the gene copy number of single cells can be quantified absolutely with higher sensitivity and reduced labor time and reagent consumption. We expect that this chip will enable new studies of biology and disease. PMID:28191267

  8. Single cell digital polymerase chain reaction on self-priming compartmentalization chip.

    PubMed

    Zhu, Qiangyuan; Qiu, Lin; Xu, Yanan; Li, Guang; Mu, Ying

    2017-01-01

Single cell analysis provides a new framework for understanding biology and disease; however, absolute quantification of single cell gene expression still faces many challenges. Microfluidic digital polymerase chain reaction (PCR) provides a unique method to absolutely quantify single cell gene expression, but few devices have been developed to analyze single cells, and detection variation remains an issue. This paper describes a self-priming compartmentalization (SPC) microfluidic digital PCR chip capable of performing single-molecule amplification from a single cell. The chip can detect four single cells simultaneously with 85% sample digitization. Using an optimized protocol for the SPC chip, we first tested the ability, precision, and sensitivity of the chip by assessing β-actin gene expression in 1, 10, 100, and 1000 cells. The reproducibility of the SPC chip was evaluated by testing 18S rRNA in single cells, yielding coefficients of variation of 1.6%-4.6%. Finally, by detecting expression of the lung cancer-related gene PLAU in A549 cells at the single cell level, single cell heterogeneity was demonstrated. With this power-free, valve-free SPC chip, the gene copy number of single cells can be quantified absolutely with higher sensitivity and reduced labor time and reagent consumption. We expect that this chip will enable new studies of biology and disease.

  9. Beyond Group: Multiple Person Tracking via Minimal Topology-Energy-Variation.

    PubMed

    Gao, Shan; Ye, Qixiang; Xing, Junliang; Kuijper, Arjan; Han, Zhenjun; Jiao, Jianbin; Ji, Xiangyang

    2017-12-01

Tracking multiple persons is a challenging task when persons move in groups and occlude each other. Existing group-based methods have extensively investigated how to make group division more accurate in a tracking-by-detection framework; however, few of them quantify the group dynamics from the perspective of targets' spatial topology or consider the group in a dynamic view. Inspired by the sociological properties of pedestrians, we propose a novel socio-topology model with a topology-energy function to factor the group dynamics of moving persons and groups. In this model, minimizing the topology-energy-variance in a two-level energy form is expected to produce smooth topology transitions, stable group tracking, and accurate target association. To search for the strong minimum in energy variation, we design discrete group-tracklet jump moves embedded in the gradient descent method, which ensures that the moves reduce the energy variation of group and trajectory alternately in the varying topology dimension. Experimental results on both RGB and RGB-D data sets show the superiority of our proposed model for multiple person tracking in crowd scenes.

  10. Distinguishing between Selective Sweeps from Standing Variation and from a De Novo Mutation

    PubMed Central

    Peter, Benjamin M.; Huerta-Sanchez, Emilia; Nielsen, Rasmus

    2012-01-01

    An outstanding question in human genetics has been the degree to which adaptation occurs from standing genetic variation or from de novo mutations. Here, we combine several common statistics used to detect selection in an Approximate Bayesian Computation (ABC) framework, with the goal of discriminating between models of selection and providing estimates of the age of selected alleles and the selection coefficients acting on them. We use simulations to assess the power and accuracy of our method and apply it to seven of the strongest sweeps currently known in humans. We identify two genes, ASPM and PSCA, that are most likely affected by selection on standing variation; and we find three genes, ADH1B, LCT, and EDAR, in which the adaptive alleles seem to have swept from a new mutation. We also confirm evidence of selection for one further gene, TRPV6. In one gene, G6PD, neither neutral models nor models of selective sweeps fit the data, presumably because this locus has been subject to balancing selection. PMID:23071458
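The rejection-sampling core of an ABC analysis of this kind is simple to sketch (a hypothetical toy model, inferring a normal mean from a summary statistic; not the authors' pipeline, which combines multiple selection statistics):

```python
import random
import statistics

def simulate_mean(theta, n=100):
    """Simulate data under the model and reduce it to a summary statistic."""
    return statistics.fmean(random.gauss(theta, 1.0) for _ in range(n))

def abc_rejection(observed, eps=0.2, n_draws=10000):
    """Keep prior draws whose simulated statistic lands within eps of the data."""
    accepted = []
    for _ in range(n_draws):
        theta = random.uniform(-5.0, 5.0)  # draw a parameter from the prior
        if abs(simulate_mean(theta) - observed) < eps:
            accepted.append(theta)
    return accepted  # samples approximating the posterior of theta

random.seed(0)
posterior = abc_rejection(observed=2.0)
```

The accepted draws approximate the posterior without ever evaluating a likelihood, which is what makes ABC attractive for selection models where the likelihood of the full data is intractable; model choice, as in the abstract, follows from comparing acceptance rates across competing simulators.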

  11. The behavioral ecology of cultural psychological variation.

    PubMed

    Sng, Oliver; Neuberg, Steven L; Varnum, Michael E W; Kenrick, Douglas T

    2018-04-23

Recent work has documented a wide range of important psychological differences across societies. Multiple explanations have been offered for why such differences exist, including historical philosophies, subsistence methods, social mobility, social class, climatic stresses, and religion. With the growing body of theory and data, there is an emerging need for an organizing framework. We propose here that a behavioral ecological perspective, particularly the idea of adaptive phenotypic plasticity, can provide an overarching framework for thinking about psychological variation across cultures and societies. We focus on how societies vary as a function of six important ecological dimensions: density, relatedness, sex ratio, mortality likelihood, resources, and disease. This framework can: (a) highlight new areas of research, (b) integrate and ground existing cultural psychological explanations, (c) integrate research on variation across human societies with research on parallel variations in other animal species, (d) provide a way for thinking about multiple levels of culture and cultural change, and (e) facilitate the creation of an ecological taxonomy of societies, from which one can derive specific predictions about cultural differences and similarities. Finally, we discuss the relationships between the current framework and existing perspectives. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  12. Modeling anuran detection and site occupancy on North American Amphibian Monitoring Program (NAAMP) routes in Maryland

    USGS Publications Warehouse

    Weir, L.A.; Royle, J. Andrew; Nanjappa, P.; Jung, R.E.

    2005-01-01

    One of the most fundamental problems in monitoring animal populations is that of imperfect detection. Although imperfect detection can be modeled, studies examining patterns in occurrence often ignore detection and thus fail to properly partition variation in detection from that of occurrence. In this study, we used anuran calling survey data collected on North American Amphibian Monitoring Program routes in eastern Maryland to investigate factors that influence detection probability and site occupancy for 10 anuran species. In 2002, 17 calling survey routes in eastern Maryland were surveyed to collect environmental and species data nine or more times. To analyze these data, we developed models incorporating detection probability and site occupancy. The results suggest that, for more than half of the 10 species, detection probabilities vary most with season (i.e., day-of-year), air temperature, time, and moon illumination, whereas site occupancy may vary by the amount of palustrine forested wetland habitat. Our results suggest anuran calling surveys should document air temperature, time of night, moon illumination, observer skill, and habitat change over time, as these factors can be important to model-adjusted estimates of site occupancy. Our study represents the first formal modeling effort aimed at developing an analytic assessment framework for NAAMP calling survey data.
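The single-season occupancy likelihood underlying this kind of analysis is compact enough to sketch (a minimal illustration with constant detection probability across visits; not the authors' model, which adds covariates on both detection and occupancy):

```python
def detection_history_prob(history, psi, p):
    """Probability of one site's detection history under an occupancy model.

    psi: probability the site is occupied.
    p:   per-visit detection probability, given occupancy.
    history: list of 0/1 outcomes, one per survey visit.
    """
    p_given_occupied = 1.0
    for y in history:
        p_given_occupied *= p if y else (1.0 - p)
    prob = psi * p_given_occupied
    if not any(history):
        # an all-zero history can also arise from a truly unoccupied site
        prob += 1.0 - psi
    return prob

# Three visits with no detections: the model weighs "occupied but missed"
# (0.6 * 0.5^3) against "unoccupied" (0.4), rather than declaring absence.
p_all_zero = detection_history_prob([0, 0, 0], psi=0.6, p=0.5)
```

Maximizing the product of such probabilities over all surveyed sites, with psi and p expressed as functions of the covariates named in the abstract (air temperature, time of night, moon illumination), is how detection is partitioned from occurrence.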

  13. Estimating spatial and temporal components of variation in count data using negative binomial mixed models

    USGS Publications Warehouse

    Irwin, Brian J.; Wagner, Tyler; Bence, James R.; Kepler, Megan V.; Liu, Weihai; Hayes, Daniel B.

    2013-01-01

    Partitioning total variability into its component temporal and spatial sources is a powerful way to better understand time series and elucidate trends. The data available for such analyses of fish and other populations are usually nonnegative integer counts of the number of organisms, often dominated by many low values with few observations of relatively high abundance. These characteristics are not well approximated by the Gaussian distribution. We present a detailed description of a negative binomial mixed-model framework that can be used to model count data and quantify temporal and spatial variability. We applied these models to data from four fishery-independent surveys of Walleyes Sander vitreus across the Great Lakes basin. Specifically, we fitted models to gill-net catches from Wisconsin waters of Lake Superior; Oneida Lake, New York; Saginaw Bay in Lake Huron, Michigan; and Ohio waters of Lake Erie. These long-term monitoring surveys varied in overall sampling intensity, the total catch of Walleyes, and the proportion of zero catches. Parameter estimation included the negative binomial scaling parameter, and we quantified the random effects as the variations among gill-net sampling sites, the variations among sampled years, and site × year interactions. This framework (i.e., the application of a mixed model appropriate for count data in a variance-partitioning context) represents a flexible approach that has implications for monitoring programs (e.g., trend detection) and for examining the potential of individual variance components to serve as response metrics to large-scale anthropogenic perturbations or ecological changes.
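    For reference, the NB2 parameterization implied by the abstract's "scaling parameter" can be written down directly. This is a hedged sketch of the count distribution only; it omits the random-effects machinery (site, year, and site × year terms on the log-mean) that does the variance partitioning.

```python
import math

def neg_binom_pmf(y, mu, k):
    """NB2 probability mass function with mean mu and scaling
    (dispersion) parameter k, so that Var(Y) = mu + mu**2 / k;
    small k means strong overdispersion relative to Poisson."""
    return (math.gamma(y + k) / (math.gamma(k) * math.factorial(y))
            * (k / (k + mu)) ** k * (mu / (k + mu)) ** y)

def nb_variance(mu, k):
    """Variance implied by the NB2 parameterization above."""
    return mu + mu ** 2 / k
```

    In the mixed-model setting, log(mu) would additionally carry the site, year, and site × year random effects whose estimated variances are the quantities of interest.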

  14. Vulnerability detection using data-flow graphs and SMT solvers

    DTIC Science & Technology

    2016-10-31

    concerns. The framework is modular and pipelined to allow scalable analysis on distributed systems. Our vulnerability detection framework employs machine...Design We designed the framework to be modular to enable flexible reuse and extendibility. In its current form, our framework performs the following

  15. Study on the influence of supplying compressed air channels and evicting channels on pneumatical oscillation systems for vibromooshing

    NASA Astrophysics Data System (ADS)

    Glăvan, D. O.; Radu, I.; Babanatsas, T.; Babanatis Merce, R. M.; Kiss, I.; Gaspar, M. C.

    2018-01-01

    The paper presents a pneumatic system with two oscillating masses. The system is composed of a cylinder (framework) with mass m1, inside which is a piston with mass m2. The cylinder has one compressed-air supply channel and one evicting channel for each working chamber (left and right of the piston). The piston's motion relative to the cylinder is driven by supplying or evicting compressed air. The variable force that sustains the motion depends on the pressure variation, which in turn depends on the piston's position relative to the cylinder and on the cross-sectional shape of the supply and evicting channels. The paper presents the physical model, the mathematical model (differential equations), and the numerical solution of the differential equations under the hypothesis that the cross-section of the supply and evicting channels is rectangular (linear variation) or circular (nonlinear variation).

  16. SmartMal: a service-oriented behavioral malware detection framework for mobile devices.

    PubMed

    Wang, Chao; Wu, Zhizhong; Li, Xi; Zhou, Xuehai; Wang, Aili; Hung, Patrick C K

    2014-01-01

    This paper presents SmartMal--a novel service-oriented behavioral malware detection framework for vehicular and mobile devices. The highlight of SmartMal is to introduce service-oriented architecture (SOA) concepts and behavior analysis into the malware detection paradigm. The proposed framework relies on a client-server architecture: the client continuously extracts various features and transfers them to the server, whose main task is to detect anomalies using state-of-the-art detection algorithms. Multiple distributed servers simultaneously analyze the feature vector using various detectors, and information fusion is used to combine the detectors' results. We also propose a cycle-based statistical approach for mobile device anomaly detection, accomplished by analyzing users' regular usage patterns. Empirical results suggest that the proposed framework and novel anomaly detection algorithm are highly effective in detecting malware on Android devices.

  17. SmartMal: A Service-Oriented Behavioral Malware Detection Framework for Mobile Devices

    PubMed Central

    Wu, Zhizhong; Li, Xi; Zhou, Xuehai; Wang, Aili; Hung, Patrick C. K.

    2014-01-01

    This paper presents SmartMal—a novel service-oriented behavioral malware detection framework for vehicular and mobile devices. The highlight of SmartMal is to introduce service-oriented architecture (SOA) concepts and behavior analysis into the malware detection paradigm. The proposed framework relies on a client-server architecture: the client continuously extracts various features and transfers them to the server, whose main task is to detect anomalies using state-of-the-art detection algorithms. Multiple distributed servers simultaneously analyze the feature vector using various detectors, and information fusion is used to combine the detectors' results. We also propose a cycle-based statistical approach for mobile device anomaly detection, accomplished by analyzing users' regular usage patterns. Empirical results suggest that the proposed framework and novel anomaly detection algorithm are highly effective in detecting malware on Android devices. PMID:25165729

  18. Epileptic seizure detection in EEG signal with GModPCA and support vector machine.

    PubMed

    Jaiswal, Abeg Kumar; Banka, Haider

    2017-01-01

    Epilepsy is one of the most common neurological disorders, characterized by recurrent seizures. Electroencephalograms (EEGs) record neural activity and can detect epilepsy. Visual inspection of an EEG signal for epileptic seizure detection is a time-consuming process and may lead to human error; therefore, a number of automated seizure detection frameworks have recently been proposed to replace these traditional methods. Feature extraction and classification are two important steps in these procedures. Feature extraction focuses on finding the informative features that could be used for classification and correct decision-making; proposing effective feature extraction techniques for seizure detection is therefore of great significance. Principal Component Analysis (PCA) is a dimensionality reduction technique used in different fields of pattern recognition, including EEG signal classification. Global modular PCA (GModPCA) is a variation of PCA. In this paper, an effective framework with GModPCA and a Support Vector Machine (SVM) is presented for epileptic seizure detection in EEG signals. Feature extraction is performed with GModPCA, whereas an SVM trained with a radial basis function kernel performs the classification between seizure and nonseizure EEG signals. Seven different experimental cases were conducted on the benchmark epilepsy EEG dataset, and system performance was evaluated using 10-fold cross-validation. In addition, we prove analytically that GModPCA has lower time and space complexity than PCA. The experimental results show that EEG signals have strong inter-sub-pattern correlations, and GModPCA and SVM achieved 100% accuracy for the classification between normal and epileptic signals. The classification results of the proposed approach were also better than those of several existing methods reported in the literature. This study suggests that GModPCA and SVM could be used for automated epileptic seizure detection in EEG signals.
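    The modular-PCA idea behind GModPCA can be sketched as follows: split each EEG feature vector into sub-patterns, reduce each block separately, and concatenate the projections. This is an illustrative simplification (uniform blocks, plain covariance PCA), not the paper's exact algorithm.

```python
import numpy as np

def modular_pca(X, n_blocks, n_components):
    """Split each row of X into n_blocks equal sub-patterns, run PCA
    independently on each block, and concatenate the per-block
    projections (sketch of the modular-PCA idea; the cost saving comes
    from eigendecomposing several small covariances instead of one
    large one)."""
    n, d = X.shape
    assert d % n_blocks == 0, "feature length must divide evenly"
    projections = []
    for block in np.split(X, n_blocks, axis=1):
        block = block - block.mean(axis=0)
        cov = block.T @ block / (len(block) - 1)
        w, v = np.linalg.eigh(cov)                    # ascending eigenvalues
        top = v[:, np.argsort(w)[::-1][:n_components]]  # top components
        projections.append(block @ top)
    return np.hstack(projections)
```

    The concatenated projections would then be fed to the RBF-kernel SVM for seizure/nonseizure classification.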

  19. On effectiveness of network sensor-based defense framework

    NASA Astrophysics Data System (ADS)

    Zhang, Difan; Zhang, Hanlin; Ge, Linqiang; Yu, Wei; Lu, Chao; Chen, Genshe; Pham, Khanh

    2012-06-01

    Cyber attacks are increasing in frequency, impact, and complexity, exposing extensive network vulnerabilities with the potential for serious damage. Defending against cyber attacks calls for distributed, collaborative monitoring, detection, and mitigation. To this end, we develop a network sensor-based defense framework aimed at network security awareness, mitigation, and prediction. We implement the prototypical system and show its effectiveness in detecting known attacks, such as port scanning and distributed denial-of-service (DDoS). Based on this framework, we also implement statistical-based and sequential testing-based detection techniques and compare their detection performance. Future defensive algorithms for combating cyber attacks can be provisioned within the proposed framework.
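    Sequential testing-based detection of the kind mentioned above is typically a Wald sequential probability ratio test (SPRT) over per-window traffic features. A minimal sketch follows, assuming known benign (pdf0) and attack (pdf1) feature densities, which a real deployment would have to estimate from traffic.

```python
from math import log

def sprt(samples, pdf0, pdf1, alpha=0.01, beta=0.01):
    """Wald SPRT: accumulate the log-likelihood ratio sample by sample
    and stop at the first boundary crossing. alpha/beta are the target
    false-positive and false-negative rates; pdf0/pdf1 are the benign
    and attack feature densities (assumed known here)."""
    upper = log((1 - beta) / alpha)
    lower = log(beta / (1 - alpha))
    llr = 0.0
    for i, x in enumerate(samples, 1):
        llr += log(pdf1(x) / pdf0(x))
        if llr >= upper:
            return "attack", i
        if llr <= lower:
            return "benign", i
    return "undecided", len(samples)
```

    The appeal of the sequential form is that it decides as soon as the evidence suffices, rather than after a fixed observation window.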

  20. Designing and Implementing a Retrospective Earthquake Detection Framework at the U.S. Geological Survey National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Patton, J.; Yeck, W.; Benz, H.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking, and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data, as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival time picking, social media-based event detection, and automatic association of different seismic detection data into seismic earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival time picking given a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. The same infrastructure also provides an improved and convenient structure for access to automatic detection data for both research and algorithmic development.

  1. DEVELOPING WATER QUALITY CRITERIA FOR SUSPENDED AND BEDDED SEDIMENTS

    EPA Science Inventory

    The U.S. EPA’s Framework for Developing Suspended and Bedded Sediments (SABS) Water Quality Criteria (SABS Framework) is a nationally-consistent process for developing ambient sediment quality criteria for surface waters. The SABS Framework accommodates natural variation among wa...

  2. DB2: a probabilistic approach for accurate detection of tandem duplication breakpoints using paired-end reads.

    PubMed

    Yavaş, Gökhan; Koyutürk, Mehmet; Gould, Meetha P; McMahon, Sarah; LaFramboise, Thomas

    2014-03-05

    With the advent of paired-end high-throughput sequencing, it is now possible to identify various types of structural variation on a genome-wide scale. Although many methods have been proposed for structural variation detection, most do not provide precise boundaries for identified variants. In this paper, we propose a new method, Distribution Based detection of Duplication Boundaries (DB2), for accurate detection of tandem duplication breakpoints, an important class of structural variation, with high precision and recall. Our computational experiments on simulated data show that DB2 outperforms state-of-the-art methods in terms of finding breakpoints of tandem duplications, with a higher positive predictive value (precision) in calling the duplications' presence. In particular, DB2's prediction of tandem duplications is correct 99% of the time even for very noisy data, while narrowing down the space of possible breakpoints to within a margin of 15 to 20 bps on average. Most of the existing methods provide boundaries in ranges that extend to hundreds of bases, with lower precision values. Our method is also highly robust to varying properties of the sequencing library and to the sizes of the tandem duplications, as shown by its stable precision, recall, and mean boundary mismatch performance. We demonstrate our method's efficacy using both simulated paired-end reads and reads generated from a melanoma sample and two ovarian cancer samples. Newly discovered tandem duplications were validated using PCR and Sanger sequencing. Our method, DB2, uses discordantly aligned reads, taking into account the distribution of fragment length to predict tandem duplications along with their breakpoints on a donor genome. The proposed method fine-tunes the breakpoint calls by applying a novel probabilistic framework that incorporates the empirical fragment length distribution to score each feasible breakpoint. 
DB2 is implemented in Java programming language and is freely available at http://mendel.gene.cwru.edu/laframboiselab/software.php.
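    The breakpoint-scoring idea can be illustrated with a toy version: each discordant read pair implies a fragment length for a candidate pair of duplication boundaries, and boundary candidates are scored by the likelihood of those implied lengths. The junction geometry and the Gaussian fragment model below are hypothetical simplifications; DB2 itself scores breakpoints against the empirical fragment-length distribution.

```python
from math import log
from statistics import NormalDist

def score_breakpoints(pairs, dup_start, dup_end, frag=NormalDist(400, 50)):
    """Log-likelihood of candidate tandem-duplication boundaries given
    discordant read pairs. Simplified geometry: a pair straddling the
    duplication junction implies a fragment of length
    (dup_end - left_end) + (right_start - dup_start).
    pairs: list of (left_end, right_start) mapped coordinates."""
    ll = 0.0
    for left_end, right_start in pairs:
        implied = (dup_end - left_end) + (right_start - dup_start)
        ll += log(frag.pdf(implied))
    return ll
```

    Sweeping candidate boundaries and keeping the maximizer yields fine-grained breakpoint calls; correct boundaries make the implied fragment lengths cluster near the library's typical insert size.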

  3. Automatic classification of animal vocalizations

    NASA Astrophysics Data System (ADS)

    Clemins, Patrick J.

    2005-11-01

    Bioacoustics, the study of animal vocalizations, has begun to use increasingly sophisticated analysis techniques in recent years. Some common tasks in bioacoustics are repertoire determination, call detection, individual identification, stress detection, and behavior correlation. Each research study, however, uses a wide variety of different measured variables, called features, and classification systems to accomplish these tasks. The well-established field of human speech processing has developed a number of different techniques to perform many of the aforementioned bioacoustics tasks. Mel-frequency cepstral coefficients (MFCCs) and perceptual linear prediction (PLP) coefficients are two popular feature sets. The hidden Markov model (HMM), a statistical model similar to a finite automaton, is the most commonly used supervised classification model and is capable of modeling both temporal and spectral variations. This research designs a framework that applies models from human speech processing to bioacoustic analysis tasks. The development of the generalized perceptual linear prediction (gPLP) feature extraction model is one of the more important novel contributions of the framework. Perceptual information from the species under study can be incorporated into the gPLP feature extraction model to represent the vocalizations as the animals might perceive them. By including this perceptual information and modifying parameters of the HMM classification system, this framework can be applied to a wide range of species. The effectiveness of the framework is shown by analyzing African elephant and beluga whale vocalizations. The features extracted from the African elephant data are used as input to a supervised classification system and compared to results from traditional statistical tests. The gPLP features extracted from the beluga whale data are used in an unsupervised classification system and the results are compared to labels assigned by experts. 
The development of a framework from which to build animal vocalization classifiers will provide bioacoustics researchers with a consistent platform to analyze and classify vocalizations. A common framework will also allow studies to compare results across species and institutions. In addition, the use of automated classification techniques can speed analysis and uncover behavioral correlations not readily apparent using traditional techniques.
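    The front end of MFCC-style features begins with triangular filters spaced evenly on a perceptual frequency scale. Below is a small sketch of that step using the standard human-centric mel warping; gPLP's contribution, per the abstract, is to replace this fixed warping with one tuned to the hearing of the study species.

```python
import math

def hz_to_mel(f):
    """Standard mel-scale warping used by MFCC front ends."""
    return 2595.0 * math.log10(1.0 + f / 700.0)

def mel_filter_edges(low_hz, high_hz, n_filters):
    """Edge/center frequencies (Hz) of n_filters triangular filters
    spaced evenly on the mel scale between low_hz and high_hz; returns
    n_filters + 2 points (each filter spans three consecutive points)."""
    def mel_to_hz(m):
        return 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    lo, hi = hz_to_mel(low_hz), hz_to_mel(high_hz)
    mels = [lo + i * (hi - lo) / (n_filters + 1) for i in range(n_filters + 2)]
    return [mel_to_hz(m) for m in mels]
```

    Swapping `hz_to_mel` for a species-specific audiogram-derived warping is the kind of substitution the gPLP model formalizes.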

  4. Multiple loci and epistases control genetic variation for seed dormancy in weedy rice (Oryza sativa).

    PubMed Central

    Gu, Xing-You; Kianian, Shahryar F; Foley, Michael E

    2004-01-01

    Weedy rice has much stronger seed dormancy than cultivated rice. A wild-like weedy strain SS18-2 was selected to investigate the genetic architecture underlying seed dormancy, a critical adaptive trait in plants. A framework genetic map covering the rice genome was constructed on the basis of 156 BC(1) [EM93-1 (nondormant breeding line)//EM93-1/SS18-2] individuals. The mapping population was replicated using a split-tiller technique to control and better estimate the environmental variation. Dormancy was determined by germination of seeds after 1, 11, and 21 days of after-ripening (DAR). Six dormancy QTL, designated as qSD(S)-4, -6, -7-1, -7-2, -8, and -12, were identified. The locus qSD(S)-7-1 was tightly linked to the red pericarp color gene Rc. A QTL x DAR interaction was detected for qSD(S)-12, the locus with the largest main effect at 1, 11, and 21 DAR (R(2) = 0.14, 0.24, and 0.20, respectively). Two, three, and four orders of epistases were detected with four, six, and six QTL, respectively. The higher-order epistases strongly suggest the presence of genetically complex networks in the regulation of variation for seed dormancy in natural populations and make it critical to select for a favorable combination of alleles at multiple loci in positional cloning of a target dormancy gene. PMID:15082564

  5. Influence of ocean tides on the diurnal and semidiurnal earth rotation variations from VLBI observations

    NASA Astrophysics Data System (ADS)

    Gubanov, V. S.; Kurdubov, S. L.

    2015-05-01

    The International astrogeodetic standard IERS Conventions (2010) contains a model of the diurnal and semidiurnal variations in Earth rotation parameters (ERPs), the pole coordinates and the Universal Time, arising from lunisolar tides in the world ocean. This model was constructed in the mid-1990s through a global analysis of Topex/Poseidon altimetry. The goal of this study is to try to estimate the parameters of this model by processing all the available VLBI observations on a global network of stations over the last 35 years performed within the framework of IVS (International VLBI Service) geodetic programs. The complexity of the problem lies in the fact that the sought-for corrections to the parameters of this model lie within 1 mm and, thus, are at the limit of their detectability by all currently available methods of ground-based positional measurements. This requires applying universal software packages with a high accuracy of reduction calculations and a well-developed system of controlling the simultaneous adjustment of observational data to analyze long series of VLBI observations. This study has been performed with the QUASAR software package developed at the Institute of Applied Astronomy of the Russian Academy of Sciences. Although the results obtained, on the whole, confirm a high accuracy of the basic model in the IERS Conventions (2010), statistically significant corrections that allow this model to be refined have been detected for some harmonics of the ERP variations.

  6. Estimating occupancy dynamics for large-scale monitoring networks: amphibian breeding occupancy across protected areas in the northeast United States

    USGS Publications Warehouse

    Miller, David A.W.; Grant, Evan H. Campbell

    2015-01-01

    Regional monitoring strategies frequently employ a nested sampling design where a finite set of study areas from throughout a region are selected within which intensive sub-sampling occurs. This sampling protocol naturally lends itself to a hierarchical analysis to account for dependence among sub-samples. Implementing such an analysis within a classic likelihood framework is computationally prohibitive with species occurrence data when accounting for detection probabilities. Bayesian methods offer an alternative framework to make this analysis feasible. We demonstrate a general approach for estimating occupancy when data come from a nested sampling design. Using data from a regional monitoring program of wood frogs (Lithobates sylvaticus) and spotted salamanders (Ambystoma maculatum) in vernal pools, we analyzed data using static and dynamic occupancy frameworks. We analyzed observations from 2004 to 2013 collected within 14 protected areas located throughout the northeast United States. We use the data set to estimate trends in occupancy at both the regional and individual protected area level. We show that occupancy at the regional level was relatively stable for both species. Much more variation occurred within individual study areas, with some populations declining and some increasing for both species. We found some evidence for a latitudinal gradient in trends among protected areas. However, support for this pattern is overestimated when the hierarchical nature of the data collection is not controlled for in the analysis. For both species, occupancy appeared to be declining in the most southern areas, while occupancy was stable or increasing in more northern areas. 
These results shed light on the range-level population status of these pond-breeding amphibians and our approach provides a framework that can be used to examine drivers of change including among-year and among-site variation in occurrence dynamics, while properly accounting for nested structure of data collection.

  7. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of reconstructed three-dimensional particle location that in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources are also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.
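    The final combination step rests on the standard first-order propagation formula. A minimal sketch follows, assuming independent error sources; correlated sources (as can arise between position and correlation uncertainty) would add covariance terms.

```python
import math

def propagate(partials, sigmas):
    """First-order Taylor uncertainty propagation for y = f(x1,...,xn)
    with independent inputs:
        sigma_y = sqrt( sum_i (df/dx_i * sigma_i)**2 )
    partials: sensitivities df/dx_i; sigmas: standard uncertainties."""
    return math.sqrt(sum((d * s) ** 2 for d, s in zip(partials, sigmas)))
```

    For example, for a velocity u = dx/dt the partials are 1/dt and -dx/dt**2, combining the displacement (cross-correlation) and timing uncertainties into a velocity uncertainty.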

  8. A scan statistic to extract causal gene clusters from case-control genome-wide rare CNV data.

    PubMed

    Nishiyama, Takeshi; Takahashi, Kunihiko; Tango, Toshiro; Pinto, Dalila; Scherer, Stephen W; Takami, Satoshi; Kishino, Hirohisa

    2011-05-26

    Several statistical tests have been developed for analyzing genome-wide association data by incorporating gene pathway information in terms of gene sets. Using these methods, hundreds of gene sets are typically tested, and the tested gene sets often overlap. This overlapping greatly increases the probability of generating false positives, and the results obtained are difficult to interpret, particularly when many gene sets show statistical significance. We propose a flexible statistical framework to circumvent these problems. Inspired by spatial scan statistics for detecting clustering of disease occurrence in the field of epidemiology, we developed a scan statistic to extract disease-associated gene clusters from a whole gene pathway. Extracting one or a few significant gene clusters from a global pathway limits the overall false positive probability, which results in increased statistical power, and facilitates the interpretation of test results. In the present study, we applied our method to genome-wide association data for rare copy-number variations, which have been strongly implicated in common diseases. Application of our method to a simulated dataset demonstrated the high accuracy of this method in detecting disease-associated gene clusters in a whole gene pathway. The scan statistic approach proposed here shows a high level of accuracy in detecting gene clusters in a whole gene pathway. This study has provided a sound statistical framework for analyzing genome-wide rare CNV data by incorporating topological information on the gene pathway.
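    The scan idea can be sketched for genes ordered along a pathway: slide every contiguous window, compare its case rate against the outside rate, and keep the window with the largest log-likelihood-ratio score. This is a generic Kulldorff-style sketch, not the paper's exact statistic, and it omits the permutation testing needed to assess significance.

```python
from math import log

def scan_statistic(cases, totals):
    """Scan over an ordered gene list: cases[i] and totals[i] are the
    case CNV count and total subject count attributed to gene i. Returns
    the best log-likelihood-ratio score and its window (i, j), where the
    window's case rate exceeds the outside rate the most."""
    n, C, N = len(cases), sum(cases), sum(totals)

    def llr(c, t):
        E = t * C / N                      # expected cases in the window
        if E <= 0 or E >= C or c <= E:
            return 0.0
        out, E_out = C - c, C - E
        inside = c * log(c / E)
        outside = out * log(out / E_out) if out > 0 else 0.0
        return inside + outside

    best_score, best_window = 0.0, None
    for i in range(n):
        c = t = 0
        for j in range(i, n):
            c += cases[j]
            t += totals[j]
            s = llr(c, t)
            if s > best_score:
                best_score, best_window = s, (i, j)
    return best_score, best_window
```

    Reporting only the maximizing window (or a few top windows) is what limits the overall false-positive probability relative to testing many overlapping gene sets.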

  9. Cognitive processing in bipolar disorder conceptualized using the Interactive Cognitive Subsystems (ICS) model.

    PubMed

    Lomax, C L; Barnard, P J; Lam, D

    2009-05-01

    There are few theoretical proposals that attempt to account for the variation in affective processing across different affective states of bipolar disorder (BD). The Interacting Cognitive Subsystems (ICS) framework has been recently extended to account for manic states. Within the framework, positive mood state is hypothesized to tap into an implicational level of processing, which is proposed to be more extreme in states of mania. Thirty individuals with BD and 30 individuals with no history of affective disorder were tested in euthymic mood state and then in induced positive mood state using the Question-Answer task to examine the mode of processing of schemas. The task was designed to test whether individuals would detect discrepancies within the prevailing schemas of the sentences. Although the present study did not support the hypothesis that the groups differ in their ability to detect discrepancies within schemas, we did find that the BD group was significantly more likely than the control group to answer questions that were consistent with the prevailing schemas, both before and after mood induction. These results may reflect a general cognitive bias, that individuals with BD have a tendency to operate at a more abstract level of representation. This may leave an individual prone to affective disturbance, although further research is required to replicate this finding.

  10. Variation and Defect Tolerance for Nano Crossbars

    NASA Astrophysics Data System (ADS)

    Tunc, Cihan

    With the extreme shrinking of CMOS technology, quantum effects and manufacturing issues are becoming more crucial. Hence, further shrinking of the CMOS feature size is becoming more challenging, difficult, and costly. On the other hand, emerging nanotechnology has attracted many researchers, since additional scaling down has been demonstrated by manufacturing nanowires, carbon nanotubes, and molecular switches using bottom-up manufacturing techniques. In addition to this progress in manufacturing, developments in architecture show that emerging nanoelectronic devices will be promising for future system designs. Using nano crossbars, which are composed of two sets of perpendicular nanowires with programmable intersections, it is possible to implement logic functions. Nano crossbars also present some important features: regularity, reprogrammability, and interchangeability. Combining these features, researchers have presented different effective architectures. Although bottom-up nanofabrication can greatly reduce manufacturing costs, its low controllability raises some critical issues. The bottom-up nanofabrication process results in high variation compared to the conventional top-down lithography used in CMOS technology, and an increased failure rate is expected. Variation and defect tolerance methods used for conventional CMOS technology are inadequate for emerging nanotechnology because the variation and defect rates are much higher than in current CMOS technology. Therefore, variation and defect tolerance methods for emerging nanotechnology are necessary for a successful transition. In this work, in order to tolerate variations in crossbars, we introduce a framework based on the reprogrammability and interchangeability features of nano crossbars. This framework is shown to be applicable to both FET-based and diode-based nano crossbars. 
    We present a characterization testing method which requires a minimal number of test vectors. We formulate the variation optimization problem using simulated annealing with different optimization goals. Furthermore, we extend the framework for defect tolerance. Experimental results and comparison of the proposed framework with exhaustive methods confirm its effectiveness for both variation and defect tolerance.
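    The variation-optimization step can be sketched as a generic simulated-annealing loop over crossbar assignments. The cost and neighbor functions below are placeholders: the real objective would score the delay variation of a logic function mapped to characterized nanowires, and the neighborhood would exploit row/column interchangeability.

```python
import math
import random

def anneal(cost, start, neighbor, t0=1.0, cooling=0.995, steps=2000, seed=1):
    """Generic simulated-annealing minimizer: always accept improving
    moves, accept worsening moves with probability exp(-delta / T),
    cool the temperature geometrically, and keep the best state seen."""
    rng = random.Random(seed)
    cur = best = start
    t = t0
    for _ in range(steps):
        cand = neighbor(cur, rng)
        delta = cost(cand) - cost(cur)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            cur = cand
            if cost(cur) < cost(best):
                best = cur
        t *= cooling
    return best
```

    For crossbars, a state would be a row/column permutation and a neighbor a swap of two rows or columns, mirroring the interchangeability the framework relies on.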

  11. The Many Hazards of Trend Evaluation

    NASA Astrophysics Data System (ADS)

    Henebry, G. M.; de Beurs, K.; Zhang, X.; Kimball, J. S.; Small, C.

    2014-12-01

    Given the awareness in the scientific community of global-scale drivers such as population growth, globalization, and climatic variation and change, many studies seek to identify temporal patterns in data that may be plausibly related to one or more aspects of global change. Here we explore two questions: "What constitutes a trend in a time series?" and "How can a trend be misinterpreted?" There are manifold hazards, both methodological and psychological, in detecting a trend, quantifying its magnitude, assessing its significance, identifying probable causes, and evaluating the implications of the trend. These hazards can combine to elevate the risk of misinterpreting the trend. In contrast, evaluation of multiple trends within a biogeophysical framework can attenuate the risk of misinterpretation. We review and illustrate these hazards and demonstrate the efficacy of an approach using multiple indicators detecting significant trends (MIDST) applied to time series of remote sensing data products.
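    One standard nonparametric building block for this kind of trend screening is the Mann-Kendall S statistic, shown here as a sketch; the abstract does not prescribe a specific test, and the variance normalization and significance assessment are omitted.

```python
def mann_kendall_s(x):
    """Mann-Kendall S statistic: the sum of signs of all pairwise
    differences x[j] - x[i] for i < j. Large |S| suggests a monotone
    trend; S near zero suggests none. Robust to outliers because only
    the ordering of values matters, not their magnitudes."""
    sign = lambda v: (v > 0) - (v < 0)
    return sum(sign(x[j] - x[i])
               for i in range(len(x))
               for j in range(i + 1, len(x)))
```

    Requiring agreement across several such indicators, rather than trusting any single test, is the spirit of the MIDST approach described above.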

  12. Unsupervised real-time speaker identification for daily movies

    NASA Astrophysics Data System (ADS)

    Li, Ying; Kuo, C.-C. Jay

    2002-07-01

    The problem of identifying speakers for movie content analysis is addressed in this paper. While most previous work on speaker identification was carried out in a supervised mode using pure audio data, more robust results can be obtained in real time by integrating knowledge from multiple media sources in an unsupervised mode. In this work, both audio and visual cues are employed and subsequently combined in a probabilistic framework to identify speakers. In particular, audio information is used to identify speakers with a maximum likelihood (ML)-based approach, while visual information is adopted to distinguish speakers by detecting and recognizing their talking faces based on face detection/recognition and mouth tracking techniques. Moreover, to accommodate speakers' acoustic variations over time, we update their models on the fly by adapting to their newly contributed speech data. Encouraging results have been achieved through extensive experiments, showing a promising future for the proposed audiovisual-based unsupervised speaker identification system.
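    The audio branch's maximum-likelihood decision rule reduces to picking the speaker model that assigns the highest likelihood to the observed features. A toy sketch with single-Gaussian scalar models follows; real systems score sequences of cepstral vectors against Gaussian mixture models, and the adaptation step would update each speaker's model with newly attributed speech.

```python
from statistics import NormalDist

def ml_speaker(feature, models):
    """Maximum-likelihood speaker decision: return the label of the
    model under which the observed feature has the highest density.
    models: dict mapping speaker label -> NormalDist feature model
    (a toy stand-in for a per-speaker GMM over cepstral vectors)."""
    return max(models, key=lambda speaker: models[speaker].pdf(feature))
```

    In the full system this audio score is fused with the face/mouth-tracking evidence inside the probabilistic framework rather than used alone.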

  13. Compensation for Lithography Induced Process Variations during Physical Design

    NASA Astrophysics Data System (ADS)

    Chin, Eric Yiow-Bing

    This dissertation addresses the challenge of designing robust integrated circuits in the deep sub-micron regime in the presence of lithography process variability. By extending and combining existing process and circuit analysis techniques, flexible software frameworks are developed to provide detailed studies of circuit performance in the presence of lithography variations such as focus and exposure. Applications of these software frameworks to select circuits demonstrate the electrical impact of these variations and provide insight into variability-aware compact models that capture the process-dependent circuit behavior. These variability-aware timing models abstract lithography variability from the process level to the circuit level and are used to estimate path-level circuit performance with high accuracy and very little runtime overhead. The Interconnect Variability Characterization (IVC) framework maps lithography-induced geometrical variations at the interconnect level to electrical delay variations. This framework is applied to one-dimensional repeater circuits patterned with both 90nm single patterning and 32nm double patterning technologies, in the presence of focus, exposure, and overlay variability. Studies indicate that single and double patterning layouts generally exhibit small variations in delay (between 1--3%) due to self-compensating RC effects associated with dense layouts, and overlay errors for layouts without self-compensating RC effects. The delay response of each double-patterned interconnect structure is fit with a second-order polynomial model in focus, exposure, and misalignment parameters with 12 coefficients and residuals of less than 0.1 ps. The IVC framework is also applied to a repeater circuit with cascaded interconnect structures to emulate more complex layout scenarios, and it is observed that the variations on each segment average out to reduce the overall delay variation. 
The Standard Cell Variability Characterization (SCVC) framework advances existing layout-level lithography-aware circuit analysis by extending it to cell-level applications, utilizing a physically accurate approach that integrates process simulation, compact transistor models, and circuit simulation to characterize electrical cell behavior. This framework is applied to combinational and sequential cells in the Nangate 45nm Open Cell Library, and the timing responses of these cells to lithography focus and exposure variations demonstrate Bossung-like behavior. This behavior permits the process-parameter-dependent response to be captured in a nine-term variability-aware compact model based on Bossung fitting equations. For a two-input NAND gate, the variability-aware compact model captures the simulated response to an accuracy of 0.3%. The SCVC framework is also applied to investigate advanced process effects including misalignment and layout proximity. The abstraction of process variability from the layout level to the cell level opens up an entirely new realm of circuit analysis and optimization and provides a foundation for path-level variability analysis without the computationally expensive costs associated with joint process and circuit simulation. The SCVC framework is used with slight modification to illustrate the speedup and accuracy tradeoffs of using compact models. With variability-aware compact models, the process-dependent performance of a three-stage logic circuit can be estimated to an accuracy of 0.7% with a speedup of over 50,000x. Path-level variability analysis also provides an accurate estimate (within 1%) of ring oscillator period in well under a second. Another significant advantage of variability-aware compact models is that they can be easily incorporated into existing design methodologies for design optimization. This is demonstrated by applying cell swapping on a logic circuit to reduce the overall delay variability along a circuit path. 
By including these variability-aware compact models in cell characterization libraries, design metrics such as circuit timing, power, area, and delay variability can be quickly assessed to optimize for the correct balance of all design metrics, including delay variability. Deterministic lithography variations can be easily captured using the variability-aware compact models described in this dissertation. However, another prominent source of variability is random dopant fluctuation, which affects transistor threshold voltage and in turn circuit performance. The SCVC framework is utilized to investigate the interactions between deterministic lithography variations and random dopant fluctuations. Monte Carlo studies show that the output delay distribution in the presence of random dopant fluctuations is dependent on lithography focus and exposure conditions, with a 3.6 ps change in standard deviation across the focus-exposure process window. This indicates that the electrical impact of random variations is dependent on systematic lithography variations, and this dependency should be included for precise analysis.

  14. Bioconjugation of luminescent Eu-BDC-NH2 MOFs for highly efficient sensing of BSA

    NASA Astrophysics Data System (ADS)

    Kukkar, Preeti; Sammi, Heena; Rawat, Mohit; Singh, Pritpal; Basu, Soumen; Kukkar, Deepak

    2018-05-01

    Luminescent metal organic frameworks (MOFs) have emerged as an exciting prospect for molecular sensing applications owing to their tunable porosity and optical properties. In this study, we report the synthesis of luminescent europium-amino terephthalic acid (Eu-BDC-NH2) MOFs through a solvothermal approach, followed by their bioconjugation with an anti-bovine serum albumin (BSA) antibody using standard carbodiimide linkage chemistry. Subsequently, a nanocomposite of the bioconjugate and zeolitic imidazolate framework-8 (ZIF-8) nanoparticles was prepared by adding varying volumes of ZIF-8 NPs to examine the variation in photoluminescence (PL) intensity. Finally, optimized nanocomposites with increased PL intensity were treated with different concentrations of BSA, which showed a turn-on effect on the PL intensity. The prepared nanocomposites were able to detect BSA at a concentration of 0.1 ppm, demonstrating their high efficiency as a molecular sensor. This fluorescent platform could be further utilized for sensitive detection of pesticides in solution.

  15. Diagnosis and Threat Detection Capabilities of the SERENITY Monitoring Framework

    NASA Astrophysics Data System (ADS)

    Tsigkritis, Theocharis; Spanoudakis, George; Kloukinas, Christos; Lorenzoli, Davide

    The SERENITY monitoring framework offers mechanisms for diagnosing the causes of violations of security and dependability (S&D) properties and detecting potential violations of such properties, called "threats". Diagnostic information and threat detection are often necessary for deciding what an appropriate reaction to a violation is and for taking pre-emptive actions against predicted violations, respectively. In this chapter, we describe the mechanisms of the SERENITY monitoring framework which generate diagnostic information for violations of S&D properties and detect threats.

  16. Quantifying Extrinsic Noise in Gene Expression Using the Maximum Entropy Framework

    PubMed Central

    Dixit, Purushottam D.

    2013-01-01

    We present a maximum entropy framework to separate intrinsic and extrinsic contributions to noisy gene expression solely from the profile of expression. We express the experimentally accessible probability distribution of the copy number of the gene product (mRNA or protein) by accounting for possible variations in extrinsic factors. The distribution of extrinsic factors is estimated using the maximum entropy principle. Our results show that extrinsic factors qualitatively and quantitatively affect the probability distribution of the gene product. We work out, in detail, the transcription of mRNA from a constitutively expressed promoter in Escherichia coli. We suggest that the variation in extrinsic factors may account for the observed wider-than-Poisson distribution of mRNA copy numbers. We successfully test our framework on a numerical simulation of a simple gene expression scheme that accounts for the variation in extrinsic factors. We also make falsifiable predictions, some of which are tested on previous experiments in E. coli whereas others need verification. Application of the presented framework to more complex situations is also discussed. PMID:23790383
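    The wider-than-Poisson effect of extrinsic variation can be illustrated with a small simulation: if the transcription rate itself varies from cell to cell, the resulting copy-number distribution has variance exceeding its mean. This is a didactic sketch of the mixture picture only, not the paper's maximum-entropy inference; the gamma choice for the rate distribution and all parameter values are assumptions:

```python
import math
import random

random.seed(0)

def poisson(lam):
    # Knuth's inversion-by-products algorithm; adequate for small rates
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def sample_mrna(n, mean_rate, extrinsic_cv):
    """Copy numbers with extrinsic noise: the rate is drawn per cell
    from a Gamma with the given coefficient of variation, then the
    copy number is Poisson given that rate."""
    shape = 1.0 / extrinsic_cv ** 2
    scale = mean_rate / shape
    return [poisson(random.gammavariate(shape, scale)) for _ in range(n)]

data = sample_mrna(2000, mean_rate=10.0, extrinsic_cv=0.5)
mean = sum(data) / len(data)
var = sum((x - mean) ** 2 for x in data) / len(data)
```

    The law of total variance predicts Var = E[rate] + Var[rate], here about 10 + 25 = 35, so the Fano factor exceeds 1 (i.e., the distribution is wider than Poisson) whenever the extrinsic variance is nonzero.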

  17. Quantifying extrinsic noise in gene expression using the maximum entropy framework.

    PubMed

    Dixit, Purushottam D

    2013-06-18

    We present a maximum entropy framework to separate intrinsic and extrinsic contributions to noisy gene expression solely from the profile of expression. We express the experimentally accessible probability distribution of the copy number of the gene product (mRNA or protein) by accounting for possible variations in extrinsic factors. The distribution of extrinsic factors is estimated using the maximum entropy principle. Our results show that extrinsic factors qualitatively and quantitatively affect the probability distribution of the gene product. We work out, in detail, the transcription of mRNA from a constitutively expressed promoter in Escherichia coli. We suggest that the variation in extrinsic factors may account for the observed wider-than-Poisson distribution of mRNA copy numbers. We successfully test our framework on a numerical simulation of a simple gene expression scheme that accounts for the variation in extrinsic factors. We also make falsifiable predictions, some of which are tested on previous experiments in E. coli whereas others need verification. Application of the presented framework to more complex situations is also discussed. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  18. Inferences about landbird abundance from count data: recent advances and future directions

    USGS Publications Warehouse

    Nichols, J.D.; Thomas, L.; Conn, P.B.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.

    2009-01-01

    We summarize results of a November 2006 workshop dealing with recent research on the estimation of landbird abundance from count data. Our conceptual framework includes a decomposition of the probability of detecting a bird potentially exposed to sampling efforts into four separate probabilities. Primary inference methods are described and include distance sampling, multiple observers, time of detection, and repeated counts. The detection parameters estimated by these different approaches differ, leading to different interpretations of resulting estimates of density and abundance. Simultaneous use of combinations of these different inference approaches can not only lead to increased precision but also provide the ability to decompose components of the detection process. Recent efforts to test the efficacy of these different approaches using natural systems and a new bird radio test system provide sobering conclusions about the ability of observers to detect and localize birds in auditory surveys. Recent research is reported on efforts to deal with such potential sources of error as bird misclassification, measurement error, and density gradients. Methods for inference about spatial and temporal variation in avian abundance are outlined. Discussion topics include opinions about the need to estimate detection probability when drawing inference about avian abundance, methodological recommendations based on the current state of knowledge, and suggestions for future research.
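    The decomposition idea reduces to a short calculation: if the overall detection probability is a product of component probabilities, a raw count is corrected by dividing through by that product. The component values below are illustrative, not from the workshop:

```python
def estimate_abundance(count, p_components):
    """Canonical count correction: N-hat = C / (p1 * p2 * ...), where the
    p_i are component detection probabilities (e.g. availability for
    detection, detection given availability). Values are illustrative."""
    p = 1.0
    for p_i in p_components:
        p *= p_i
    return count / p

# 30 birds counted; availability 0.8, perception 0.5 => overall p = 0.4
n_hat = estimate_abundance(30, [0.8, 0.5])
```

    The same arithmetic shows why ignoring detection probability biases abundance low: using the raw count of 30 instead of the corrected 75 understates abundance by the factor p.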

  19. Automatic telangiectasia analysis in dermoscopy images using adaptive critic design.

    PubMed

    Cheng, B; Stanley, R J; Stoecker, W V; Hinton, K

    2012-11-01

    Telangiectasia, tiny skin vessels, are important dermoscopy structures used to discriminate basal cell carcinoma (BCC) from benign skin lesions. This research builds on previously developed image analysis techniques to identify vessels automatically and discriminate benign lesions from BCCs. A biologically inspired reinforcement learning approach is investigated in an adaptive critic design framework, applying action-dependent heuristic dynamic programming (ADHDP) for discrimination based on computed features using different skin lesion contrast variations to promote the discrimination process. Lesion discrimination results for ADHDP are compared with multilayer perceptron backpropagation artificial neural networks. This study uses a data set of 498 dermoscopy skin lesion images of 263 BCCs and 226 competitive benign images as the input sets. This data set is extended from previous research [Cheng et al., Skin Research and Technology, 2011, 17: 278]. Experimental results yielded a diagnostic accuracy as high as 84.6% using the ADHDP approach, providing an 8.03% improvement over a standard multilayer perceptron method. We have chosen BCC detection rather than vessel detection as the endpoint. Although vessel detection is inherently easier, BCC detection has potential direct clinical applications. Small BCCs are detectable early by dermoscopy and potentially detectable by the automated methods described in this research. © 2011 John Wiley & Sons A/S.

  20. Identification of somatic mutations in cancer through Bayesian-based analysis of sequenced genome pairs

    PubMed Central

    2013-01-01

    Background: The field of cancer genomics has rapidly adopted next-generation sequencing (NGS) in order to study and characterize malignant tumors with unprecedented resolution. In particular for cancer, one is often trying to identify somatic mutations – changes specific to a tumor and not within an individual’s germline. However, false positive and false negative detections often result from lack of sufficient variant evidence, contamination of the biopsy by stromal tissue, sequencing errors, and the erroneous classification of germline variation as tumor-specific. Results: We have developed a generalized Bayesian analysis framework for matched tumor/normal samples with the purpose of identifying tumor-specific alterations such as single nucleotide mutations, small insertions/deletions, and structural variation. We describe our methodology, and discuss its application to other types of paired-tissue analysis such as the detection of loss of heterozygosity as well as allelic imbalance. We also demonstrate the high level of sensitivity and specificity in discovering simulated somatic mutations, for various combinations of a) genomic coverage and b) emulated heterogeneity. Conclusion: We present a Java-based implementation of our methods named Seurat, which is made available for free academic use. We have demonstrated and reported on the discovery of different types of somatic change by applying Seurat to an experimentally-derived cancer dataset using our methods; and have discussed considerations and practices regarding the accurate detection of somatic events in cancer genomes. Seurat is available at https://sites.google.com/site/seuratsomatic. PMID:23642077

  1. Identification of somatic mutations in cancer through Bayesian-based analysis of sequenced genome pairs.

    PubMed

    Christoforides, Alexis; Carpten, John D; Weiss, Glen J; Demeure, Michael J; Von Hoff, Daniel D; Craig, David W

    2013-05-04

    The field of cancer genomics has rapidly adopted next-generation sequencing (NGS) in order to study and characterize malignant tumors with unprecedented resolution. In particular for cancer, one is often trying to identify somatic mutations--changes specific to a tumor and not within an individual's germline. However, false positive and false negative detections often result from lack of sufficient variant evidence, contamination of the biopsy by stromal tissue, sequencing errors, and the erroneous classification of germline variation as tumor-specific. We have developed a generalized Bayesian analysis framework for matched tumor/normal samples with the purpose of identifying tumor-specific alterations such as single nucleotide mutations, small insertions/deletions, and structural variation. We describe our methodology, and discuss its application to other types of paired-tissue analysis such as the detection of loss of heterozygosity as well as allelic imbalance. We also demonstrate the high level of sensitivity and specificity in discovering simulated somatic mutations, for various combinations of a) genomic coverage and b) emulated heterogeneity. We present a Java-based implementation of our methods named Seurat, which is made available for free academic use. We have demonstrated and reported on the discovery of different types of somatic change by applying Seurat to an experimentally-derived cancer dataset using our methods; and have discussed considerations and practices regarding the accurate detection of somatic events in cancer genomes. Seurat is available at https://sites.google.com/site/seuratsomatic.
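    The matched tumor/normal comparison can be illustrated with a deliberately simplified two-model Bayes factor: a somatic model in which tumor reads carry the variant at some allele fraction while normal reads show only sequencing error, against a null model of error in both. This toy is an assumption-laden sketch, not Seurat's actual model (it ignores germline variants, stromal contamination, and base qualities, all of which the abstract names as confounders):

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def somatic_posterior(t_alt, t_depth, n_alt, n_depth,
                      vaf=0.5, err=0.01, prior=1e-6):
    """Toy Bayesian somatic caller comparing
      somatic: tumor alts ~ Binom(depth, vaf), normal alts ~ Binom(depth, err)
      null:    both ~ Binom(depth, err)
    and returning the posterior probability of the somatic model.
    All parameter values are illustrative assumptions."""
    lik_som = binom_pmf(t_alt, t_depth, vaf) * binom_pmf(n_alt, n_depth, err)
    lik_null = binom_pmf(t_alt, t_depth, err) * binom_pmf(n_alt, n_depth, err)
    num = prior * lik_som
    return num / (num + (1 - prior) * lik_null)

# Strong tumor evidence, clean normal: the tiny prior is overwhelmed.
p_somatic = somatic_posterior(t_alt=12, t_depth=40, n_alt=0, n_depth=35)
```

    With one alternate read in 40 the evidence is consistent with sequencing error and the posterior stays near the prior, which is why coverage and variant evidence dominate sensitivity in such callers.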

  2. Avian responses to an extreme ice storm are determined by a combination of functional traits, behavioural adaptations and habitat modifications

    PubMed Central

    Zhang, Qiang; Hong, Yongmi; Zou, Fasheng; Zhang, Min; Lee, Tien Ming; Song, Xiangjin; Rao, Jiteng

    2016-01-01

    The extent to which species’ traits, behavior and habitat synergistically determine their response to extreme weather events (EWE) remains poorly understood. By quantifying bird and vegetation assemblages before and after the 2008 ice storm in China, combined with interspecific interactions and foraging behaviours, we disentangled whether the storm influences avian reassembly directly via functional traits (i.e. behavioral adaptations), or indirectly via habitat variations. We found that overall species richness decreased, with 20 species detected exclusively before the storm and eight species detected exclusively after. These shifts in bird relative abundance were linked to habitat preferences, dietary guild and flocking behaviours. For instance, forest specialists at higher trophic levels (e.g. understory insectivores, woodpeckers and kingfishers) were especially vulnerable, whereas open-habitat generalists (e.g. bulbuls) were set to benefit from potential habitat homogenization. Alongside population fluctuations, we found that community reassembly can be rapidly adjusted via foraging plasticity (i.e. increased flocking propensity and reduced perching height), and changes in preferred habitat corresponded to a variation in bird assemblages and traits, as represented by intact canopy cover and a high density of large trees. Accurate prediction of community responses to EWE is crucial to understanding ecosystem disturbances and requires linking species-oriented traits to a coherent analytical framework. PMID:26929387

  3. Safe and simple detection of sparse hydrogen by Pd-Au alloy/air based 1D photonic crystal sensor

    NASA Astrophysics Data System (ADS)

    Mitra, S.; Biswas, T.; Chattopadhyay, R.; Ghosh, J.; Bysakh, S.; Bhadra, S. K.

    2016-11-01

    A simple integrated hydrogen sensor using a Pd-Au alloy/air based one-dimensional photonic crystal with an air defect layer is theoretically modeled. Structural parameters of the photonic crystal are delicately scaled to generate photonic band gap frequencies in the visible spectral regime. An optimized defect thickness permits a localized defect mode operating at a frequency within the photonic band gap region. Hydrogen absorption causes modification of the band gap characteristics due to variation of the refractive index and lattice parameters of the alloy. As a result, the transmission peak that appears due to the resonant defect state is shifted. This peak shift is utilized to detect sparse amounts of hydrogen present in the surrounding environment. A theoretical framework is built to calculate the refractive index profile of the hydrogen-loaded alloy using density functional theory and Bruggeman's effective medium approximation. The calculated refractive index variation of the Pd3Au alloy film due to hydrogen loading is verified experimentally by measuring the reflectance characteristics. Lattice expansion properties of the alloy are studied through X-ray diffraction analyses. The proposed structure shows about a 3 nm red shift of the transmission peak for a rise of 1% atomic hydrogen concentration in the alloy.
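    The Bruggeman effective-medium step mentioned in the abstract has a compact numerical form for a two-phase mixture. The sketch below solves the standard symmetric Bruggeman condition, f1·(ε1−ε)/(ε1+2ε) + (1−f1)·(ε2−ε)/(ε2+2ε) = 0, by bisection; the restriction to real positive permittivities and the sample values are simplifications (metal films have complex ε, and the paper's calculation additionally uses DFT inputs):

```python
def bruggeman_eps(eps1, eps2, f1, tol=1e-12):
    """Effective permittivity of a two-phase mixture from the symmetric
    Bruggeman effective-medium approximation, for real positive
    permittivities, solved by bisection. The mixing residual g is
    strictly decreasing in eps, and the root lies between eps1 and eps2."""
    def g(e):
        return (f1 * (eps1 - e) / (eps1 + 2 * e)
                + (1 - f1) * (eps2 - e) / (eps2 + 2 * e))
    lo, hi = min(eps1, eps2), max(eps1, eps2)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Equal parts of eps = 1 and eps = 4: the effective value lies between.
eps_eff = bruggeman_eps(1.0, 4.0, 0.5)
```

    Hydrogen loading changes the metal's permittivity, so a model like this maps the composition change to the refractive-index shift that moves the defect-mode transmission peak.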

  4. Genetic Structure of Bluefin Tuna in the Mediterranean Sea Correlates with Environmental Variables

    PubMed Central

    Riccioni, Giulia; Stagioni, Marco; Landi, Monica; Ferrara, Giorgia; Barbujani, Guido; Tinti, Fausto

    2013-01-01

    Background: Atlantic Bluefin Tuna (ABFT) shows complex demography and ecological variation in the Mediterranean Sea. Genetic surveys have detected significant, although weak, signals of population structuring; catch series analyses and tagging programs identified complex ABFT spatial dynamics and migration patterns. Here, we tested the hypothesis that the genetic structure of the ABFT in the Mediterranean is correlated with mean surface temperature and salinity. Methodology: We used six samples collected from the Western and Central Mediterranean, integrated with a new sample collected from the recently identified easternmost reproductive area of the Levantine Sea. To assess population structure in the Mediterranean we used a multidisciplinary framework combining classical population genetics, spatial and Bayesian clustering methods and a multivariate approach based on factor analysis. Conclusions: FST analysis and Bayesian clustering methods detected several subpopulations in the Mediterranean, a result also supported by multivariate analyses. In addition, we identified significant correlations of genetic diversity with mean salinity and surface temperature values, revealing that ABFT is genetically structured along two environmental gradients. These results suggest that a preference for some spawning habitat conditions could contribute to shaping ABFT genetic structuring in the Mediterranean. However, further studies should be performed to assess to what extent ABFT spawning behaviour in the Mediterranean Sea can be affected by environmental variation. PMID:24260341

  5. An Intrusion Detection System Based on Multi-Level Clustering for Hierarchical Wireless Sensor Networks

    PubMed Central

    Butun, Ismail; Ra, In-Ho; Sankar, Ravi

    2015-01-01

    In this work, an intrusion detection system (IDS) framework based on multi-level clustering for hierarchical wireless sensor networks is proposed. The framework employs two types of intrusion detection approaches: (1) “downward-IDS (D-IDS)” to detect the abnormal behavior (intrusion) of the subordinate (member) nodes; and (2) “upward-IDS (U-IDS)” to detect the abnormal behavior of the cluster heads. By using analytical calculations, the optimum parameters for the D-IDS (number of maximum hops) and U-IDS (monitoring group size) of the framework are evaluated and presented. PMID:26593915

  6. Thresholds of Principle and Preference: Exploring Procedural Variation in Postgraduate Surgical Education.

    PubMed

    Apramian, Tavis; Cristancho, Sayra; Watling, Chris; Ott, Michael; Lingard, Lorelei

    2015-11-01

    Expert physicians develop their own ways of doing things. The influence of such practice variation in clinical learning is insufficiently understood. Our grounded theory study explored how residents make sense of, and behave in relation to, the procedural variations of faculty surgeons. We sampled senior postgraduate surgical residents to construct a theoretical framework for how residents make sense of procedural variations. Using a constructivist grounded theory approach, we used marginal participant observation in the operating room across 56 surgical cases (146 hours), field interviews (38), and formal interviews (6) to develop a theoretical framework for residents' ways of dealing with procedural variations. Data analysis used constant comparison to iteratively refine the framework and data collection until theoretical saturation was reached. The core category of the constructed theory was called thresholds of principle and preference and it captured how faculty members position some procedural variations as negotiable and others not. The term thresholding was coined to describe residents' daily experiences of spotting, mapping, and negotiating their faculty members' thresholds and defending their own emerging thresholds. Thresholds of principle and preference play a key role in workplace-based medical education. Postgraduate medical learners are occupied on a day-to-day level with thresholding and attempting to make sense of the procedural variations of faculty. Workplace-based teaching and assessment should include an understanding of the integral role of thresholding in shaping learners' development. Future research should explore the nature and impact of thresholding in workplace-based learning beyond the surgical context.

  7. Command Disaggregation Attack and Mitigation in Industrial Internet of Things

    PubMed Central

    Zhu, Pei-Dong; Hu, Yi-Fan; Cui, Peng-Shuai; Zhang, Yan

    2017-01-01

    A cyber-physical attack in the industrial Internet of Things can cause severe damage to the physical system. In this paper, we focus on the command disaggregation attack, wherein attackers modify disaggregated commands by intruding command aggregators like programmable logic controllers, and then maliciously manipulate the physical process. It is necessary to investigate these attacks, analyze their impact on the physical process, and seek effective detection mechanisms. We depict two different types of command disaggregation attack modes: (1) the command sequence is disordered and (2) disaggregated sub-commands are allocated to wrong actuators. We describe three attack models that implement these modes while going undetected by existing detection methods. A novel and effective framework is provided to detect command disaggregation attacks. The framework utilizes the correlations among two-tier command sequences, including commands from the output of the central controller and sub-commands from the input of actuators, to detect attacks before disruptions occur. We have designed the components of the framework and explain how to mine and use these correlations to detect attacks. We present two case studies to validate the different levels of impact of the various attack models and the effectiveness of the detection framework. Finally, we discuss how to enhance the detection framework. PMID:29065461

  8. Command Disaggregation Attack and Mitigation in Industrial Internet of Things.

    PubMed

    Xun, Peng; Zhu, Pei-Dong; Hu, Yi-Fan; Cui, Peng-Shuai; Zhang, Yan

    2017-10-21

    A cyber-physical attack in the industrial Internet of Things can cause severe damage to the physical system. In this paper, we focus on the command disaggregation attack, wherein attackers modify disaggregated commands by intruding command aggregators like programmable logic controllers, and then maliciously manipulate the physical process. It is necessary to investigate these attacks, analyze their impact on the physical process, and seek effective detection mechanisms. We depict two different types of command disaggregation attack modes: (1) the command sequence is disordered and (2) disaggregated sub-commands are allocated to wrong actuators. We describe three attack models that implement these modes while going undetected by existing detection methods. A novel and effective framework is provided to detect command disaggregation attacks. The framework utilizes the correlations among two-tier command sequences, including commands from the output of the central controller and sub-commands from the input of actuators, to detect attacks before disruptions occur. We have designed the components of the framework and explain how to mine and use these correlations to detect attacks. We present two case studies to validate the different levels of impact of the various attack models and the effectiveness of the detection framework. Finally, we discuss how to enhance the detection framework.
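    The two attack modes can be made concrete with a toy consistency check between the controller's intended disaggregation and what the actuators actually receive. The data model (ordered lists of (actuator, sub-command) pairs) and the return labels are hypothetical illustrations of the two-tier correlation idea, not the paper's detector:

```python
def detect_disaggregation_attack(expected, observed):
    """Compare the central controller's intended sub-command stream
    against the stream observed at the actuators, flagging the two
    attack modes from the abstract: (1) reordered sub-commands and
    (2) sub-commands delivered to the wrong actuator."""
    if len(expected) != len(observed):
        return "length mismatch"
    # Same multiset of pairs but different order: mode (1).
    if sorted(expected) == sorted(observed) and expected != observed:
        return "reordered"
    for (act_e, cmd_e), (act_o, cmd_o) in zip(expected, observed):
        # Same sub-command routed to a different actuator: mode (2).
        if cmd_e == cmd_o and act_e != act_o:
            return "wrong actuator"
        if (act_e, cmd_e) != (act_o, cmd_o):
            return "mismatch"
    return "ok"

plan = [("valve1", "open"), ("pump1", "start")]
verdict = detect_disaggregation_attack(plan, plan)
```

    A real detector must work from mined correlations rather than a known plan, since the attacker sits on the aggregator and can forge the "expected" view; that is the part the paper's framework addresses.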

  9. Framework for evaluating public health surveillance systems for early detection of outbreaks: recommendations from the CDC Working Group.

    PubMed

    Buehler, James W; Hopkins, Richard S; Overhage, J Marc; Sosin, Daniel M; Tong, Van

    2004-05-07

    The threat of terrorism and high-profile disease outbreaks has drawn attention to public health surveillance systems for early detection of outbreaks. State and local health departments are enhancing existing surveillance systems and developing new systems to better detect outbreaks through public health surveillance. However, information is limited about the usefulness of surveillance systems for outbreak detection or the best ways to support this function. This report supplements previous guidelines for evaluating public health surveillance systems. Use of this framework is intended to improve decision-making regarding the implementation of surveillance for outbreak detection. Use of a standardized evaluation methodology, including description of system design and operation, also will enhance the exchange of information regarding methods to improve early detection of outbreaks. The framework directs particular attention to the measurement of timeliness and validity for outbreak detection. The evaluation framework is designed to support assessment and description of all surveillance approaches to early detection, whether through traditional disease reporting, specialized analytic routines for aberration detection, or surveillance using early indicators of disease outbreaks, such as syndromic surveillance.

  10. Multi-class geospatial object detection based on a position-sensitive balancing framework for high spatial resolution remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Zhong, Yanfei; Han, Xiaobing; Zhang, Liangpei

    2018-04-01

    Multi-class geospatial object detection from high spatial resolution (HSR) remote sensing imagery is attracting increasing attention in a wide range of object-related civil and engineering applications. However, the distribution of objects in HSR remote sensing imagery is location-variable and complicated, and how to accurately detect the objects in HSR remote sensing imagery is a critical problem. Due to the powerful feature extraction and representation capability of deep learning, the deep learning based region proposal generation and object detection integrated framework has greatly promoted the performance of multi-class geospatial object detection for HSR remote sensing imagery. However, due to the translation invariance introduced by the convolution operations in the convolutional neural network (CNN), the performance of the classification stage is seldom affected, but the localization accuracy of the predicted bounding boxes in the detection stage is easily degraded. The dilemma between translation-invariance in the classification stage and translation-variance in the object detection stage has not been addressed for HSR remote sensing imagery, and causes position accuracy problems for multi-class geospatial object detection with region proposal generation and object detection. In order to further improve the performance of the region proposal generation and object detection integrated framework for HSR remote sensing imagery object detection, a position-sensitive balancing (PSB) framework is proposed in this paper for multi-class geospatial object detection from HSR remote sensing imagery. The proposed PSB framework takes full advantage of the fully convolutional network (FCN), on the basis of a residual network, and adopts the PSB framework to solve the dilemma between translation-invariance in the classification stage and translation-variance in the object detection stage. 
In addition, a pre-training mechanism is utilized to accelerate the training procedure and increase the robustness of the proposed algorithm. The proposed algorithm is validated with a publicly available 10-class object detection dataset.

  11. Threshold Capabilities: Threshold Concepts and Knowledge Capability Linked through Variation Theory

    ERIC Educational Resources Information Center

    Baillie, Caroline; Bowden, John A.; Meyer, Jan H. F.

    2013-01-01

    The Threshold Capability Integrated Theoretical Framework (TCITF) is presented as a framework for the design of university curricula, aimed at developing graduates' capability to deal with previously unseen situations in their professional, social, and personal lives. The TCITF is a new theoretical framework derived from, and heavily dependent…

  12. Influence of geometry variations on the gravitational focusing of timelike geodesic congruences

    NASA Astrophysics Data System (ADS)

    Seriu, Masafumi

    2015-10-01

    We derive a set of equations describing the linear response of the convergence properties of a geodesic congruence to arbitrary geometry variations. It combines equations describing the deviations from the standard Raychaudhuri-type equations due to the geodesic shifts with an equation describing the geodesic shifts due to the geometry variations. In this framework, the geometry variations, which can be chosen arbitrarily, serve as probes to investigate the gravitational contraction processes from various angles. We apply the obtained framework to the case of conformal geometry variations, characterized by an arbitrary function f(x), and find that the formulas simplify to a great extent. We investigate the response of the convergence properties of geodesics in the latest phase of gravitational contraction by restricting the class of conformal geometry variations to those satisfying the strong energy condition. We then find that in the final stage, f and D·Df control the overall contraction behavior, and that the contraction rate becomes larger when f is negative and |f| is large enough to overwhelm |D·Df|. (Here D·D is the Laplacian operator on the spatial hypersurfaces orthogonal to the geodesic congruence in question.) To gain more concrete insight, we also apply the framework to the time-reversed Friedmann-Robertson-Walker model as the simplest case of singularity formation.
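
For reference, the standard Raychaudhuri equation that the "Raychaudhuri-type equations" above extend can be stated as follows (a textbook form, not taken from the paper): for a timelike geodesic congruence with tangent u^μ, expansion θ, shear σ_{μν} and vorticity ω_{μν},

```latex
\frac{d\theta}{d\tau} = -\frac{1}{3}\theta^{2} - \sigma_{\mu\nu}\sigma^{\mu\nu} + \omega_{\mu\nu}\omega^{\mu\nu} - R_{\mu\nu}u^{\mu}u^{\nu}
```

The geometry-variation framework described above tracks how each term on the right-hand side responds to a perturbation of the metric.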

  13. Feasibility Study of a Generalized Framework for Developing Computer-Aided Detection Systems-a New Paradigm.

    PubMed

    Nemoto, Mitsutaka; Hayashi, Naoto; Hanaoka, Shouhei; Nomura, Yukihiro; Miki, Soichiro; Yoshikawa, Takeharu

    2017-10-01

    We propose a generalized framework for developing computer-aided detection (CADe) systems whose characteristics depend only on those of the training dataset. The purpose of this study is to show the feasibility of the framework. Two different CADe systems were experimentally developed with a prototype of the framework, but with different training datasets. The CADe systems comprise four components: preprocessing, candidate area extraction, candidate detection, and candidate classification. Four pretrained algorithms with dedicated optimization/setting methods corresponding to the respective components were prepared in advance. The pretrained algorithms were sequentially trained in the order of processing of the components. In this study, two different datasets, brain MRA with cerebral aneurysms and chest CT with lung nodules, were collected to develop two different types of CADe systems in the framework. The performances of the developed CADe systems were evaluated by threefold cross-validation. The CADe systems for detecting cerebral aneurysms in brain MRAs and for detecting lung nodules in chest CTs were successfully developed using the respective datasets. The framework was shown to be feasible by the successful development of the two different types of CADe systems. The feasibility of this framework shows promise for a new paradigm in the development of CADe systems: development of CADe systems without any lesion-specific algorithm design.

  14. Diffusion of breast conserving surgery in medical communities.

    PubMed

    Jerome-D'Emilia, Bonnie; Begun, James W

    2005-01-01

    Excluding skin cancers, breast cancer is the most common form of cancer in women. Due to an increased focus on early detection, many more cases of breast cancer are now diagnosed at an early stage, which makes the use of breast conserving surgery (BCS) an efficacious and often more desirable treatment choice than mastectomy. An analysis of the variation in the use of BCS in the United States was performed using data from the years 1988 and 1994, and stratifying hospitals on the basis of teaching status. In both 1988 and 1994, the use of BCS was highest in academic teaching hospitals and lowest in community hospitals. This finding is interpreted within the framework of classical diffusion theory. Social and cultural norms in local medical communities have a strong effect on the degree to which innovations diffuse rapidly or not. This analysis is useful in the understanding of geographic and hospital-based variations in treatment for early stage breast cancer and other illnesses that have long and strongly held traditions of treatment.

  15. Sun, Ocean, Nuclear Bombs, and Fossil Fuels: Radiocarbon Variations and Implications for High-Resolution Dating

    NASA Astrophysics Data System (ADS)

    Dutta, Koushik

    2016-06-01

    Radiocarbon, or 14C, is a radiometric dating method ideally suited for providing a chronological framework in archaeology and geosciences for timescales spanning the last 50,000 years. 14C is easily detectable in most common natural organic materials and has a half-life (5,730±40 years) relevant to these timescales. 14C produced from large-scale detonations of nuclear bombs between the 1950s and the early 1960s can be used for dating modern organic materials formed after the 1950s. Often these studies demand high-resolution chronology to resolve ages within a few decades to less than a few years. Despite developments in modern, high-precision 14C analytical methods, the applicability of 14C in high-resolution chronology is limited by short-term variations in atmospheric 14C in the past. This article reviews the roles of the principal natural drivers (e.g., solar magnetic activity and ocean circulation) and the anthropogenic perturbations (e.g., fossil fuel CO2 and 14C from nuclear and thermonuclear bombs) that are responsible for short-term 14C variations in the environment. Methods and challenges of high-resolution 14C dating are discussed.

  16. Testing the predictions of coping styles theory in threespined sticklebacks

    PubMed Central

    Bensky, Miles K.; Paitz, Ryan; Pereira, Laura; Bell, Alison M.

    2017-01-01

    Coping styles theory provides a framework for understanding individual variation in how animals respond to environmental change, and predicts how individual differences in stress responsiveness and behavior might relate to cognitive differences. According to coping styles theory, proactive individuals are bolder, less reactive to stressors, and more routinized than their reactive counterparts. A key tenet of coping styles theory is that variation in coping styles is maintained by tradeoffs with behavioral flexibility: proactive individuals excel in stable environments while more flexible, reactive individuals perform better in variable environments. Here, we assess evidence for coping styles within a natural population of threespined sticklebacks (Gasterosteus aculeatus). We developed a criterion-based learning paradigm to evaluate individual variation in initial and reversal learning. We observed strong individual differences in boldness, cortisol production, and learning performance. Consistent with coping styles, fish that released more cortisol were more timid in response to a predator attack and slower to learn a color discrimination task. However, there was no evidence that reactive individuals performed better when the environment changed (when the rewarded color was reversed). The failure to detect trade-offs between behavioral routinization and flexibility prompts other explanations for the maintenance of differing coping styles. PMID:28017848

  17. AnRAD: A Neuromorphic Anomaly Detection Framework for Massive Concurrent Data Streams.

    PubMed

    Chen, Qiuwen; Luley, Ryan; Wu, Qing; Bishop, Morgan; Linderman, Richard W; Qiu, Qinru

    2018-05-01

    The evolution of high performance computing technologies has enabled the large-scale implementation of neuromorphic models and pushed the research in computational intelligence into a new era. Among the machine learning applications, unsupervised detection of anomalous streams is especially challenging due to the requirements of detection accuracy and real-time performance. Designing a computing framework that harnesses the growing computing power of the multicore systems while maintaining high sensitivity and specificity to the anomalies is an urgent research topic. In this paper, we propose anomaly recognition and detection (AnRAD), a bioinspired detection framework that performs probabilistic inferences. We analyze the feature dependency and develop a self-structuring method that learns an efficient confabulation network using unlabeled data. This network is capable of fast incremental learning, which continuously refines the knowledge base using streaming data. Compared with several existing anomaly detection approaches, our method provides competitive detection quality. Furthermore, we exploit the massive parallel structure of the AnRAD framework. Our implementations of the detection algorithm on the graphics processing unit and the Xeon Phi coprocessor both obtain substantial speedups over the sequential implementation on a general-purpose microprocessor. The framework provides real-time service to concurrent data streams within diversified knowledge contexts, and can be applied to large problems with multiple local patterns. Experimental results demonstrate high computing performance and memory efficiency. For vehicle behavior detection, the framework is able to monitor up to 16000 vehicles (data streams) and their interactions in real time with a single commodity coprocessor, and uses less than 0.2 ms for one testing subject.
Finally, the detection network is ported to our spiking neural network simulator to show the potential of adapting to the emerging neuromorphic architectures.

  18. The evolution of personality variation in humans and other animals.

    PubMed

    Nettle, Daniel

    2006-09-01

    A comprehensive evolutionary framework for understanding the maintenance of heritable behavioral variation in humans is yet to be developed. Some evolutionary psychologists have argued that heritable variation will not be found in important, fitness-relevant characteristics because of the winnowing effect of natural selection. This article propounds the opposite view. Heritable variation is ubiquitous in all species, and there are a number of frameworks for understanding its persistence. The author argues that each of the Big Five dimensions of human personality can be seen as the result of a trade-off between different fitness costs and benefits. As there is no unconditionally optimal value of these trade-offs, it is to be expected that genetic diversity will be retained in the population.

  19. Video copy protection and detection framework (VPD) for e-learning systems

    NASA Astrophysics Data System (ADS)

    ZandI, Babak; Doustarmoghaddam, Danial; Pour, Mahsa R.

    2013-03-01

    This article reviews and compares copyright-protection approaches for digital video files, which can be categorized as content-based and digital-watermarking copy detection. We then describe how to protect a digital video using a dedicated video data-hiding method and algorithm, and how to detect copying of the file. Building on the state of the art in video copy detection technology and on our own research results, we put forward a new video protection and copy detection approach for plagiarism prevention in e-learning systems based on video data hiding. Finally, we introduce a framework for video protection and detection in e-learning systems (the VPD framework).
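
The record does not specify the data-hiding algorithm itself; as a generic illustration of the video data-hiding idea it builds on, the sketch below embeds and recovers a bit string in the least-significant bits of a single frame. All function names are hypothetical, not the authors' method.

```python
import numpy as np

def embed_bits(frame, bits):
    """Embed a bit list into the least-significant bits of a uint8 frame (illustrative LSB hiding)."""
    flat = frame.flatten().copy()
    for i, b in enumerate(bits):
        flat[i] = (flat[i] & 0xFE) | b   # clear LSB, then set it to the payload bit
    return flat.reshape(frame.shape)

def extract_bits(frame, n):
    """Read back the first n payload bits from the frame's least-significant bits."""
    return [int(v & 1) for v in frame.flatten()[:n]]
```

Because only the least-significant bit of each pixel changes, the marked frame is visually indistinguishable from the original, which is what makes the hidden payload usable as a copy-detection watermark.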

  20. A framework for investigating geographical variation in diseases, based on a study of Legionnaires' disease.

    PubMed

    Bhopal, R S

    1991-11-01

    Demonstration of geographical variations in disease can yield powerful insight into the disease pathway, particularly for environmentally acquired conditions, but only if the many problems of data interpretation can be solved. This paper presents the framework, methods and principles guiding a study of the geographical epidemiology of Legionnaires' Disease in Scotland. A case-list was constructed and disease incidence rates were calculated by geographical area; these showed variation. Five categories of explanation for the variation were identified: short-term fluctuations of incidence in time masquerading as differences by place; artefact; and differences in host-susceptibility, agent virulence, or environment. The methods used to study these explanations, excepting agent virulence, are described, with an emphasis on the use of previously existing data to test hypotheses. Examples include the use of mortality, census and hospital morbidity data to assess the artefact and host-susceptibility explanations; and the use of ratios of serology tests to disease to examine the differential testing hypothesis. The reasoning and process by which the environmental focus of the study was narrowed and the technique for relating the geographical pattern of disease to the putative source are outlined. This framework allows the researcher to plan for the parallel collection of the data necessary both to demonstrate geographical variation and to point to the likely explanation.

  1. Identifying group discriminative and age regressive sub-networks from DTI-based connectivity via a unified framework of non-negative matrix factorization and graph embedding

    PubMed Central

    Ghanbari, Yasser; Smith, Alex R.; Schultz, Robert T.; Verma, Ragini

    2014-01-01

    Diffusion tensor imaging (DTI) offers rich insights into the physical characteristics of white matter (WM) fiber tracts and their development in the brain, facilitating a network representation of the brain's traffic pathways. Such a network representation of brain connectivity has provided a novel means of investigating brain changes arising from pathology, development or aging. The high dimensionality of these connectivity networks necessitates the development of methods that identify the connectivity building blocks or sub-network components that characterize the underlying variation in the population. In addition, the projection of the subject networks onto the basis set provides a low-dimensional representation that teases apart different sources of variation in the sample, facilitating variation-specific statistical analysis. We propose a unified framework of non-negative matrix factorization and graph embedding for learning sub-network patterns of connectivity by their projective non-negative decomposition into a reconstructive basis set, as well as additional basis sets representing variational sources in the population such as age and pathology. The proposed framework is applied to a study of diffusion-based connectivity in subjects with autism and reveals localized sparse sub-networks that mostly capture the changes related to pathology and developmental variations. PMID:25037933
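
The paper's projective NMF with graph embedding is not reproduced here; the sketch below shows only the basic non-negative matrix factorization step (Lee-Seung multiplicative updates) on which such decompositions build. Function names and parameters are illustrative.

```python
import numpy as np

def nmf(X, r, iters=500, seed=0):
    """Basic NMF by Lee-Seung multiplicative updates: X ≈ W @ H with W, H non-negative.
    Columns of W play the role of basis components (e.g. sub-network patterns)."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, r)) + 1e-3
    H = rng.random((r, m)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-9)   # update coefficients
        W *= (X @ H.T) / (W @ H @ H.T + 1e-9)   # update basis
    return W, H
```

The multiplicative form keeps every entry non-negative by construction, which is why the framework above can interpret each basis column as an additive sub-network rather than a signed contrast.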

  2. Vision Marker-Based In Situ Examination of Bacterial Growth in Liquid Culture Media.

    PubMed

    Kim, Kyukwang; Choi, Duckyu; Lim, Hwijoon; Kim, Hyeongkeun; Jeon, Jessie S

    2016-12-18

    The detection of bacterial growth in liquid media is an essential process in determining antibiotic susceptibility or the level of bacterial presence for clinical or research purposes. We have developed a system, which enables simplified and automated detection using a camera and a striped pattern marker. The quantification of bacterial growth is possible as the bacterial growth in the culturing vessel blurs the marker image, which is placed on the back of the vessel, and the blurring results in a decrease in the high-frequency spectrum region of the marker image. The experimental results show that the FFT (fast Fourier transform)-based growth detection method is robust to the variations in the type of bacterial carrier and vessels ranging from the culture tubes to the microfluidic devices. Moreover, the automated incubator and image acquisition system are developed to be used as a comprehensive in situ detection system. We expect that this result can be applied in the automation of biological experiments, such as antibiotic susceptibility testing or toxicity measurement. Furthermore, the simple framework of the proposed growth measurement method may be further utilized as an effective and convenient method for building point-of-care devices for developing countries.
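
A minimal sketch of the FFT-based idea described above: turbidity blurs the striped marker, which shifts spectral energy out of the high frequencies. The metric and the stand-in box blur below are illustrative, not the authors' implementation.

```python
import numpy as np

def high_freq_ratio(img, radius=8):
    """Fraction of spectral energy outside a low-frequency disc; blurring lowers this ratio."""
    F = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(F) ** 2
    h, w = img.shape
    y, x = np.ogrid[:h, :w]
    low = (y - h // 2) ** 2 + (x - w // 2) ** 2 <= radius ** 2
    return power[~low].sum() / power.sum()

def box_blur(img, k=5):
    """Simple wrap-around box blur, standing in for the optical blur caused by turbid media."""
    acc = np.zeros_like(img, dtype=float)
    for dy in range(-(k // 2), k // 2 + 1):
        for dx in range(-(k // 2), k // 2 + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / k ** 2
```

Tracking `high_freq_ratio` of the marker image over time would then give a growth curve: the ratio falls as the medium becomes more turbid.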

  3. Highly selective luminescent sensing of picric acid based on a water-stable europium metal-organic framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Tifeng; Zhu, Fengliang; Cui, Yuanjing, E-mail: cuiyj@zju.edu.cn

    A water-stable metal-organic framework (MOF) EuNDC has been synthesized for selective detection of the well-known contaminant and toxicant picric acid (PA) in aqueous solution. Due to the photo-induced electron transfer and self-absorption mechanism, EuNDC displayed rapid, selective and sensitive detection of PA with a detection limit of 37.6 ppb. Recyclability experiments revealed that EuNDC retains its initial luminescent intensity and same quenching efficiency in each cycle, suggesting high photostability and reusability for long-term sensing applications. The excellent detection performance of EuNDC makes it a promising PA sensing material for practical applications. - Graphical abstract: A water-stable europium-based metal-organic framework has been reported for highly selective sensing of picric acid (PA) with a detection limit of 37.6 ppb in aqueous solution. - Highlights: • A water-stable metal-organic framework (MOF) EuNDC was synthesized. • The highly selective detection of picric acid with a detection limit of 37.6 ppb was realized. • The detection mechanism was also presented and discussed.
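
A detection limit such as the 37.6 ppb figure is conventionally estimated as three times the blank standard deviation divided by the calibration slope; the sketch below illustrates that calculation with synthetic numbers (not the authors' data).

```python
import numpy as np

def detection_limit(conc, response, blank_sd):
    """LOD = 3 * sd(blank) / |slope|, with the slope taken from a linear calibration fit."""
    slope, _intercept = np.polyfit(conc, response, 1)   # polyfit returns highest degree first
    return 3.0 * blank_sd / abs(slope)
```

For luminescence quenching, `response` would be the quenched fraction of the emission intensity at each analyte concentration.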

  4. Detection Copy Number Variants from NGS with Sparse and Smooth Constraints.

    PubMed

    Zhang, Yue; Cheung, Yiu-Ming; Xu, Bo; Su, Weifeng

    2017-01-01

    It is known that copy number variations (CNVs) are associated with complex diseases and particular tumor types, thus reliable identification of CNVs is of great potential value. Recent advances in next generation sequencing (NGS) data analysis have revealed the richness of CNV information. However, the performances of these methods are not consistent. Reliably finding CNVs in NGS data in an efficient way remains a challenging topic, worthy of further investigation. Accordingly, we tackle the problem by formulating CNV identification as a quadratic optimization problem involving two constraints. By imposing the constraints of sparsity and smoothness, the reconstructed read depth signal from NGS is anticipated to fit the CNV patterns more accurately. An efficient numerical solution tailored from the alternating direction minimization (ADM) framework is elaborated. We demonstrate the advantages of the proposed method, namely ADM-CNV, by comparing it with six popular CNV detection methods using synthetic, simulated, and empirical sequencing data. It is shown that the proposed approach can successfully reconstruct CNV patterns from raw data, and achieve superior or comparable performance in detecting CNVs compared to existing methods.
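
ADM-CNV itself is not reproduced here; as a simplified illustration of how a smoothness constraint exposes CNV segments in a read-depth signal, the sketch below solves a quadratic-smoothing subproblem in closed form and then calls breakpoints where the smoothed signal jumps. All names and thresholds are illustrative.

```python
import numpy as np

def smooth_read_depth(y, lam=2.0):
    """Minimize ||x - y||^2 + lam * ||D x||^2, where D is the first-difference operator.
    Closed-form solution of the normal equations (I + lam * D^T D) x = y."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)   # (n-1) x n first-difference matrix
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

def call_segments(x, jump=0.5):
    """Candidate CNV breakpoints: positions where the smoothed depth jumps by more than `jump`."""
    return np.where(np.abs(np.diff(x)) > jump)[0] + 1
```

The full method replaces the quadratic difference penalty with sparsity-inducing (L1) terms solved by ADM, which yields piecewise-constant rather than merely smooth reconstructions.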

  5. Characterizing behavioural ‘characters’: an evolutionary framework

    PubMed Central

    Araya-Ajoy, Yimen G.; Dingemanse, Niels J.

    2014-01-01

    Biologists often study phenotypic evolution assuming that phenotypes consist of a set of quasi-independent units that have been shaped by selection to accomplish a particular function. In the evolutionary literature, such quasi-independent functional units are called ‘evolutionary characters’, and a framework based on evolutionary principles has been developed to characterize them. This framework mainly focuses on ‘fixed’ characters, i.e. those that vary exclusively between individuals. In this paper, we introduce multi-level variation and thereby expand the framework to labile characters, focusing on behaviour as a worked example. We first propose a concept of ‘behavioural characters’ based on the original evolutionary character concept. We then detail how integration of variation between individuals (cf. ‘personality’) and within individuals (cf. ‘individual plasticity’) into the framework gives rise to a whole suite of novel testable predictions about the evolutionary character concept. We further propose a corresponding statistical methodology to test whether observed behaviours should be considered expressions of a hypothesized evolutionary character. We illustrate the application of our framework by characterizing the behavioural character ‘aggressiveness’ in wild great tits, Parus major. PMID:24335984

  6. Variance components estimation for continuous and discrete data, with emphasis on cross-classified sampling designs

    USGS Publications Warehouse

    Gray, Brian R.; Gitzen, Robert A.; Millspaugh, Joshua J.; Cooper, Andrew B.; Licht, Daniel S.

    2012-01-01

    Variance components may play multiple roles (cf. Cox and Solomon 2003). First, magnitudes and relative magnitudes of the variances of random factors may have important scientific and management value in their own right. For example, variation in levels of invasive vegetation among and within lakes may suggest causal agents that operate at both spatial scales – a finding that may be important for scientific and management reasons. Second, variance components may also be of interest when they affect precision of means and covariate coefficients. For example, variation in the effect of water depth on the probability of aquatic plant presence in a study of multiple lakes may vary by lake. This variation will affect the precision of the average depth-presence association. Third, variance component estimates may be used when designing studies, including monitoring programs. For example, to estimate the numbers of years and of samples per year required to meet long-term monitoring goals, investigators need estimates of within and among-year variances. Other chapters in this volume (Chapters 7, 8, and 10) as well as extensive external literature outline a framework for applying estimates of variance components to the design of monitoring efforts. For example, a series of papers with an ecological monitoring theme examined the relative importance of multiple sources of variation, including variation in means among sites, years, and site-years, for the purposes of temporal trend detection and estimation (Larsen et al. 2004, and references therein).
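
The within- and among-group variances discussed above are classically estimated by method-of-moments (one-way random-effects ANOVA) in the balanced case; a minimal sketch, with group structure standing in for e.g. years of monitoring:

```python
import numpy as np

def variance_components(groups):
    """Method-of-moments estimates for a balanced one-way random-effects design.
    groups: k x n array (k groups, n samples per group).
    Returns (sigma2_within, sigma2_among); the among estimate can be negative
    in small samples and is often truncated at zero in practice."""
    groups = np.asarray(groups, dtype=float)
    k, n = groups.shape
    group_means = groups.mean(axis=1)
    grand_mean = groups.mean()
    msb = n * ((group_means - grand_mean) ** 2).sum() / (k - 1)          # among-group mean square
    msw = ((groups - group_means[:, None]) ** 2).sum() / (k * (n - 1))   # within-group mean square
    return msw, (msb - msw) / n
```

Plugging such estimates into a power calculation then answers the design questions raised above, e.g. how many years and samples per year a monitoring program needs for trend detection.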

  7. A deep learning framework for the automated inspection of complex dual-energy x-ray cargo imagery

    NASA Astrophysics Data System (ADS)

    Rogers, Thomas W.; Jaccard, Nicolas; Griffin, Lewis D.

    2017-05-01

    Previously, we investigated the use of Convolutional Neural Networks (CNNs) to detect so-called Small Metallic Threats (SMTs) hidden amongst legitimate goods inside a cargo container. We trained a CNN from scratch on data produced by a Threat Image Projection (TIP) framework that generates images with realistic variation to robustify performance. The system achieved 90% detection of containers that contained a single SMT, while raising 6% false positives on benign containers. The best CNN architecture used the raw high energy image (single-energy) and its logarithm as input channels. Use of the logarithm improved performance, thus echoing studies on human operator performance. However, it is an unexpected result with CNNs. In this work, we (i) investigate methods to exploit material information captured in dual-energy images, and (ii) introduce a new CNN training scheme that generates `spot-the-difference' benign and threat pairs on-the-fly. To the best of our knowledge, this is the first time that CNNs have been applied directly to raw dual-energy X-ray imagery, in any field. To exploit dual-energy, we experiment with adapting several physics-derived approaches to material discrimination from the cargo literature, and introduce three novel variants. We hypothesise that CNNs can implicitly learn about the material characteristics of objects from the raw dual-energy images, and use this to suppress false positives. The best performing method is able to detect 95% of containers containing a single SMT, while raising 0.4% false positives on benign containers. This is a step-change improvement in performance over our prior work.

  8. Measuring adverse events in helicopter emergency medical services: establishing content validity.

    PubMed

    Patterson, P Daniel; Lave, Judith R; Martin-Gill, Christian; Weaver, Matthew D; Wadas, Richard J; Arnold, Robert M; Roth, Ronald N; Mosesso, Vincent N; Guyette, Francis X; Rittenberger, Jon C; Yealy, Donald M

    2014-01-01

    We sought to create a valid framework for detecting adverse events (AEs) in the high-risk setting of helicopter emergency medical services (HEMS). We assembled a panel of 10 expert clinicians (n = 6 emergency medicine physicians and n = 4 prehospital nurses and flight paramedics) affiliated with a large multistate HEMS organization in the Northeast US. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the content validity index (CVI), to quantify the validity of the framework's content. The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: (1) a trigger tool, (2) a method for rating proximal cause, and (3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. We demonstrate a standardized process for the development of a content-valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS.
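
The item-level content validity index referenced above is conventionally the proportion of experts rating an item relevant (3 or 4 on a 4-point scale); a minimal sketch with made-up ratings:

```python
def item_cvi(ratings):
    """Item-level content validity index: share of experts rating the item 3 or 4
    on a 4-point relevance scale."""
    return sum(r >= 3 for r in ratings) / len(ratings)
```

With a 10-expert panel, an item-CVI of at least 0.78 is a commonly cited acceptability cut-off; the framework components retained would be those clearing such a threshold.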

  9. Thresholds of Principle and Preference: Exploring Procedural Variation in Postgraduate Surgical Education

    PubMed Central

    Apramian, Tavis; Cristancho, Sayra; Watling, Chris; Ott, Michael; Lingard, Lorelei

    2017-01-01

    Background: Expert physicians develop their own ways of doing things. The influence of such practice variation in clinical learning is insufficiently understood. Our grounded theory study explored how residents make sense of, and behave in relation to, the procedural variations of faculty surgeons.

    Method: We sampled senior postgraduate surgical residents to construct a theoretical framework for how residents make sense of procedural variations. Using a constructivist grounded theory approach, we used marginal participant observation in the operating room across 56 surgical cases (146 hours), field interviews (38), and formal interviews (6) to develop a theoretical framework for residents' ways of dealing with procedural variations. Data analysis used constant comparison to iteratively refine the framework and data collection until theoretical saturation was reached.

    Results: The core category of the constructed theory was called thresholds of principle and preference, and it captured how faculty members position some procedural variations as negotiable and others not. The term thresholding was coined to describe residents' daily experiences of spotting, mapping, and negotiating their faculty members' thresholds and defending their own emerging thresholds.

    Conclusions: Thresholds of principle and preference play a key role in workplace-based medical education. Postgraduate medical learners are occupied on a day-to-day level with thresholding and attempting to make sense of the procedural variations of faculty. Workplace-based teaching and assessment should include an understanding of the integral role of thresholding in shaping learners' development. Future research should explore the nature and impact of thresholding in workplace-based learning beyond the surgical context. PMID:26505105

  10. Automated Detection of Leakage in Fluorescein Angiography Images with Application to Malarial Retinopathy

    PubMed Central

    Zhao, Yitian; MacCormick, Ian J. C.; Parry, David G.; Leach, Sophie; Beare, Nicholas A. V.; Harding, Simon P.; Zheng, Yalin

    2015-01-01

    The detection and assessment of leakage in retinal fluorescein angiogram images is important for the management of a wide range of retinal diseases. We have developed a framework that can automatically detect three types of leakage (large focal, punctate focal, and vessel segment leakage) and validated it on images from patients with malarial retinopathy. This framework comprises three steps: vessel segmentation, saliency feature generation and leakage detection. We tested the effectiveness of this framework by applying it to images from 20 patients with large focal leak, 10 patients with punctate focal leak, and 5,846 vessel segments from 10 patients with vessel leakage. The sensitivity in detecting large focal, punctate focal and vessel segment leakage are 95%, 82% and 81%, respectively, when compared to manual annotation by expert human observers. Our framework has the potential to become a powerful new tool for studying malarial retinopathy, and other conditions involving retinal leakage. PMID:26030010

  11. Automated detection of leakage in fluorescein angiography images with application to malarial retinopathy.

    PubMed

    Zhao, Yitian; MacCormick, Ian J C; Parry, David G; Leach, Sophie; Beare, Nicholas A V; Harding, Simon P; Zheng, Yalin

    2015-06-01

    The detection and assessment of leakage in retinal fluorescein angiogram images is important for the management of a wide range of retinal diseases. We have developed a framework that can automatically detect three types of leakage (large focal, punctate focal, and vessel segment leakage) and validated it on images from patients with malarial retinopathy. This framework comprises three steps: vessel segmentation, saliency feature generation and leakage detection. We tested the effectiveness of this framework by applying it to images from 20 patients with large focal leak, 10 patients with punctate focal leak, and 5,846 vessel segments from 10 patients with vessel leakage. The sensitivity in detecting large focal, punctate focal and vessel segment leakage are 95%, 82% and 81%, respectively, when compared to manual annotation by expert human observers. Our framework has the potential to become a powerful new tool for studying malarial retinopathy, and other conditions involving retinal leakage.

  12. The Evolution of Personality Variation in Humans and Other Animals

    ERIC Educational Resources Information Center

    Nettle, Daniel

    2006-01-01

    A comprehensive evolutionary framework for understanding the maintenance of heritable behavioral variation in humans is yet to be developed. Some evolutionary psychologists have argued that heritable variation will not be found in important, fitness-relevant characteristics because of the winnowing effect of natural selection. This article…

  13. Measuring Blood Glucose Concentrations in Photometric Glucometers Requiring Very Small Sample Volumes.

    PubMed

    Demitri, Nevine; Zoubir, Abdelhak M

    2017-01-01

    Glucometers present an important self-monitoring tool for diabetes patients and, therefore, must exhibit high accuracy as well as good usability features. Based on an invasive photometric measurement principle that drastically reduces the volume of the blood sample needed from the patient, we present a framework that is capable of dealing with small blood samples, while maintaining the required accuracy. The framework consists of two major parts: 1) image segmentation; and 2) convergence detection. Step 1 is based on iterative mode-seeking methods to estimate the intensity value of the region of interest. We present several variations of these methods and give theoretical proofs of their convergence. Our approach is able to deal with changes in the number and position of clusters without any prior knowledge. Furthermore, we propose a method based on sparse approximation to decrease the computational load, while maintaining accuracy. Step 2 is achieved by employing temporal tracking and prediction, thereby decreasing the measurement time, and, thus, improving usability. Our framework is tested on several real datasets with different characteristics. We show that we are able to estimate the underlying glucose concentration from much smaller blood samples than is currently state of the art with sufficient accuracy according to the most recent ISO standards and reduce measurement time significantly compared to state-of-the-art methods.
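    The mode-seeking idea in Step 1 can be illustrated with a mean-shift iteration, one canonical mode-seeking method; this is a hedged sketch (the paper's exact variants and convergence proofs are not reproduced), and the bandwidth and intensity values below are illustrative:

```python
import numpy as np

def mean_shift_mode(samples, start, bandwidth=5.0, tol=1e-3, max_iter=100):
    """Iteratively shift an estimate toward the nearest density mode
    using a Gaussian kernel over 1-D intensity samples."""
    x = float(start)
    for _ in range(max_iter):
        w = np.exp(-0.5 * ((samples - x) / bandwidth) ** 2)
        x_new = np.sum(w * samples) / np.sum(w)
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x

# Two intensity clusters: dark background (~40) and a brighter ROI (~120)
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(40, 3, 500), rng.normal(120, 3, 200)])
roi_mode = mean_shift_mode(pixels, start=110.0)
```

Started near the bright region of interest, the iteration climbs to the intensity mode of that cluster regardless of how many other clusters the image contains, which is the property the abstract exploits.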

  14. Wide-band profile domain pulsar timing analysis

    NASA Astrophysics Data System (ADS)

    Lentati, L.; Kerr, M.; Dai, S.; Hobson, M. P.; Shannon, R. M.; Hobbs, G.; Bailes, M.; Bhat, N. D. Ramesh; Burke-Spolaor, S.; Coles, W.; Dempsey, J.; Lasky, P. D.; Levin, Y.; Manchester, R. N.; Osłowski, S.; Ravi, V.; Reardon, D. J.; Rosado, P. A.; Spiewak, R.; van Straten, W.; Toomey, L.; Wang, J.; Wen, L.; You, X.; Zhu, X.

    2017-04-01

    We extend profile domain pulsar timing to incorporate wide-band effects such as frequency-dependent profile evolution and broad-band shape variation in the pulse profile. We also incorporate models for temporal variations in both pulse width and in the separation in phase of the main pulse and interpulse. We perform the analysis with both nested sampling and Hamiltonian Monte Carlo methods. In the latter case, we introduce a new parametrization of the posterior that is extremely efficient in the low signal-to-noise regime and can be readily applied to a wide range of scientific problems. We apply this methodology to a series of simulations, and to between seven and nine years of observations for PSRs J1713+0747, J1744-1134 and J1909-3744 with frequency coverage that spans 700-3600 MHz. We use a smooth model for profile evolution across the full frequency range, and compare smooth and piecewise models for the temporal variations in dispersion measure (DM). We find that the profile domain framework consistently results in improved timing precision compared to the standard analysis paradigm by as much as 40 per cent for timing parameters. Incorporating smoothness in the DM variations into the model further improves timing precision by as much as 30 per cent. For PSR J1713+0747, we also detect pulse shape variation uncorrelated between epochs, which we attribute to variation intrinsic to the pulsar at a level consistent with previously published analyses. Not accounting for this shape variation biases the measured arrival times at the level of ˜30 ns, the same order of magnitude as the expected shift due to gravitational waves in the pulsar timing band.

  15. Cloud-Based Evaluation of Anatomical Structure Segmentation and Landmark Detection Algorithms: VISCERAL Anatomy Benchmarks.

    PubMed

    Jimenez-Del-Toro, Oscar; Muller, Henning; Krenn, Markus; Gruenberg, Katharina; Taha, Abdel Aziz; Winterstein, Marianne; Eggel, Ivan; Foncubierta-Rodriguez, Antonio; Goksel, Orcun; Jakab, Andras; Kontokotsios, Georgios; Langs, Georg; Menze, Bjoern H; Salas Fernandez, Tomas; Schaer, Roger; Walleyo, Anna; Weber, Marc-Andre; Dicente Cid, Yashin; Gass, Tobias; Heinrich, Mattias; Jia, Fucang; Kahl, Fredrik; Kechichian, Razmig; Mai, Dominic; Spanier, Assaf B; Vincent, Graham; Wang, Chunliang; Wyeth, Daniel; Hanbury, Allan

    2016-11-01

    Variations in the shape and appearance of anatomical structures in medical images are often relevant radiological signs of disease. Automatic tools can help automate parts of this manual process. A cloud-based evaluation framework is presented in this paper including results of benchmarking current state-of-the-art medical imaging algorithms for anatomical structure segmentation and landmark detection: the VISCERAL Anatomy benchmarks. The algorithms are implemented in virtual machines in the cloud where participants can only access the training data and can be run privately by the benchmark administrators to objectively compare their performance in an unseen common test set. Overall, 120 computed tomography and magnetic resonance patient volumes were manually annotated to create a standard Gold Corpus containing a total of 1295 structures and 1760 landmarks. Ten participants contributed with automatic algorithms for the organ segmentation task, and three for the landmark localization task. Different algorithms obtained the best scores in the four available imaging modalities and for subsets of anatomical structures. The annotation framework, resulting data set, evaluation setup, results and performance analysis from the three VISCERAL Anatomy benchmarks are presented in this article. Both the VISCERAL data set and Silver Corpus generated with the fusion of the participant algorithms on a larger set of non-manually-annotated medical images are available to the research community.
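    Benchmarks of this kind score each submitted segmentation against the manually annotated Gold Corpus with overlap metrics; the Dice coefficient is a standard choice in such evaluations. A minimal sketch with a toy 4x4 example (the masks are illustrative, not benchmark data):

```python
import numpy as np

def dice_coefficient(seg, ref):
    """Dice overlap between a binary segmentation and a reference mask."""
    seg = seg.astype(bool)
    ref = ref.astype(bool)
    denom = seg.sum() + ref.sum()
    if denom == 0:
        return 1.0  # both empty: perfect agreement by convention
    return 2.0 * np.logical_and(seg, ref).sum() / denom

a = np.zeros((4, 4), dtype=int); a[1:3, 1:3] = 1   # 4 voxels
b = np.zeros((4, 4), dtype=int); b[1:3, 1:4] = 1   # 6 voxels, 4 shared
print(dice_coefficient(a, b))  # 2*4 / (4+6) = 0.8
```

The same function applies unchanged to 3-D label volumes, one organ label at a time.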

  16. Cognitive processing in bipolar disorder conceptualized using the Interacting Cognitive Subsystems (ICS) model

    PubMed Central

    Lomax, C. L.; Barnard, P. J.; Lam, D.

    2009-01-01

    Background There are few theoretical proposals that attempt to account for the variation in affective processing across different affective states of bipolar disorder (BD). The Interacting Cognitive Subsystems (ICS) framework has been recently extended to account for manic states. Within the framework, positive mood state is hypothesized to tap into an implicational level of processing, which is proposed to be more extreme in states of mania. Method Thirty individuals with BD and 30 individuals with no history of affective disorder were tested in euthymic mood state and then in induced positive mood state using the Question–Answer task to examine the mode of processing of schemas. The task was designed to test whether individuals would detect discrepancies within the prevailing schemas of the sentences. Results Although the present study did not support the hypothesis that the groups differ in their ability to detect discrepancies within schemas, we did find that the BD group was significantly more likely than the control group to answer questions that were consistent with the prevailing schemas, both before and after mood induction. Conclusions These results may reflect a general cognitive bias, that individuals with BD have a tendency to operate at a more abstract level of representation. This may leave an individual prone to affective disturbance, although further research is required to replicate this finding. PMID:18796173

  17. Multi-scale Gaussian representation and outline-learning based cell image segmentation.

    PubMed

    Farhan, Muhammad; Ruusuvuori, Pekka; Emmenlauer, Mario; Rämö, Pauli; Dehio, Christoph; Yli-Harja, Olli

    2013-01-01

    High-throughput genome-wide screening to study gene-specific functions, e.g. for drug discovery, demands fast automated image analysis methods to assist in unraveling the full potential of such studies. Image segmentation is typically at the forefront of such analysis as the performance of the subsequent steps, for example, cell classification, cell tracking etc., often relies on the results of segmentation. We present a cell cytoplasm segmentation framework which first separates cell cytoplasm from image background using a novel approach of image enhancement and the coefficient of variation of a multi-scale Gaussian scale-space representation. A novel outline-learning based classification method is developed using regularized logistic regression with embedded feature selection which classifies image pixels as outline/non-outline to give cytoplasm outlines. Refinement of the detected outlines to separate cells from each other is performed in a post-processing step where the nuclei segmentation is used as contextual information. We evaluate the proposed segmentation methodology using two challenging test cases, presenting images with completely different characteristics, with cells of varying size, shape, texture and degrees of overlap. The feature selection and classification framework for outline detection produces very simple sparse models which use only a small subset of the large, generic feature set, that is, only 7 and 5 features for the two cases. Quantitative comparison of the results for the two test cases against state-of-the-art methods shows that our methodology outperforms them with an increase of 4-9% in segmentation accuracy with maximum accuracy of 93%. Finally, the results obtained for diverse datasets demonstrate that our framework not only produces accurate segmentation but also generalizes well to different segmentation tasks.
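    The first stage relies on the coefficient of variation of a multi-scale Gaussian scale-space representation. The exact enhancement step is not reproduced here; the following hedged sketch (the scale choices are illustrative) computes the per-pixel coefficient of variation across a stack of Gaussian-smoothed images, which is near zero in flat background and large where the scales disagree, e.g. around cell boundaries:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def scale_space_cv(image, sigmas=(1, 2, 4, 8), eps=1e-8):
    """Per-pixel coefficient of variation (std/mean) across a stack of
    Gaussian-smoothed versions of the image (a multi-scale scale-space)."""
    stack = np.stack([gaussian_filter(image.astype(float), s) for s in sigmas])
    return stack.std(axis=0) / (stack.mean(axis=0) + eps)
```

Thresholding this map is one simple way to separate structured foreground from homogeneous background before the outline-learning stage.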

  18. Multi-scale Gaussian representation and outline-learning based cell image segmentation

    PubMed Central

    2013-01-01

    Background High-throughput genome-wide screening to study gene-specific functions, e.g. for drug discovery, demands fast automated image analysis methods to assist in unraveling the full potential of such studies. Image segmentation is typically at the forefront of such analysis as the performance of the subsequent steps, for example, cell classification, cell tracking etc., often relies on the results of segmentation. Methods We present a cell cytoplasm segmentation framework which first separates cell cytoplasm from image background using a novel approach of image enhancement and the coefficient of variation of a multi-scale Gaussian scale-space representation. A novel outline-learning based classification method is developed using regularized logistic regression with embedded feature selection which classifies image pixels as outline/non-outline to give cytoplasm outlines. Refinement of the detected outlines to separate cells from each other is performed in a post-processing step where the nuclei segmentation is used as contextual information. Results and conclusions We evaluate the proposed segmentation methodology using two challenging test cases, presenting images with completely different characteristics, with cells of varying size, shape, texture and degrees of overlap. The feature selection and classification framework for outline detection produces very simple sparse models which use only a small subset of the large, generic feature set, that is, only 7 and 5 features for the two cases. Quantitative comparison of the results for the two test cases against state-of-the-art methods shows that our methodology outperforms them with an increase of 4-9% in segmentation accuracy with maximum accuracy of 93%. Finally, the results obtained for diverse datasets demonstrate that our framework not only produces accurate segmentation but also generalizes well to different segmentation tasks. PMID:24267488

  19. Parametric modelling of cardiac system multiple measurement signals: an open-source computer framework for performance evaluation of ECG, PCG and ABP event detectors.

    PubMed

    Homaeinezhad, M R; Sabetian, P; Feizollahi, A; Ghaffari, A; Rahmani, R

    2012-02-01

    The major focus of this study is to present a performance accuracy assessment framework based on mathematical modelling of cardiac system multiple measurement signals. Three mathematical algebraic subroutines with simple structural functions for synthetic generation of the synchronously triggered electrocardiogram (ECG), phonocardiogram (PCG) and arterial blood pressure (ABP) signals are described. In the case of ECG signals, normal and abnormal PQRST cycles in complicated conditions such as fascicular ventricular tachycardia, rate dependent conduction block and acute Q-wave infarctions of inferior and anterolateral walls can be simulated. Also, continuous ABP waveform with corresponding individual events such as systolic, diastolic and dicrotic pressures with normal or abnormal morphologies can be generated by another part of the model. In addition, the mathematical synthetic PCG framework is able to generate the S4-S1-S2-S3 cycles in normal and in cardiac disorder conditions such as stenosis, insufficiency, regurgitation and gallop. In the PCG model, the amplitude and frequency content (5-700 Hz) of each sound and variation patterns can be specified. The three proposed models were implemented to generate artificial signals with various abnormality types and signal-to-noise ratios (SNR), for quantitative detection-delineation performance assessment of several ECG, PCG and ABP individual event detectors designed based on the Hilbert transform, discrete wavelet transform, geometric features such as area curve length (ACLM), the multiple higher order moments (MHOM) metric, and the principal components analysed geometric index (PCAGI). For each method the detection-delineation operating characteristics were obtained automatically in terms of sensitivity, positive predictivity and delineation (segmentation) error rms and checked by the cardiologist. The Matlab m-file scripts of the synthetic ECG, ABP and PCG signal generators are available in the Appendix.
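    The paper's algebraic subroutines are not reproduced here, but the general idea of synthesizing a PQRST cycle can be sketched with a sum of Gaussian bumps (a simplification in the spirit of standard synthetic-ECG models); all wave centres, amplitudes and widths below are illustrative defaults, not the paper's parameters:

```python
import numpy as np

def synthetic_ecg_beat(fs=500, rr=1.0,
                       waves=(("P", -0.25, 0.10, 0.04),
                              ("Q", -0.03, -0.12, 0.01),
                              ("R", 0.00, 1.00, 0.02),
                              ("S", 0.03, -0.20, 0.01),
                              ("T", 0.30, 0.25, 0.07))):
    """One PQRST cycle built as a sum of Gaussian bumps.
    Each wave is (label, centre offset from R in s, amplitude, width in s)."""
    t = np.arange(-rr / 2, rr / 2, 1.0 / fs)
    ecg = np.zeros_like(t)
    for _, mu, amp, sigma in waves:
        ecg += amp * np.exp(-0.5 * ((t - mu) / sigma) ** 2)
    return t, ecg
```

Noise at a chosen SNR can then be added to the clean cycle to stress-test an event detector, mirroring the assessment procedure the abstract describes.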

  20. Improved flaw detection and characterization with difference thermography

    NASA Astrophysics Data System (ADS)

    Winfree, William P.; Zalameda, Joseph N.; Howell, Patricia A.

    2011-05-01

    Flaw detection and characterization with thermographic techniques in graphite polymer composites is often limited by localized variations in the thermographic response. Variations in properties such as acceptable porosity, fiber volume content and surface polymer thickness result in significant variations in the initial thermal response. These variations result in a noise floor that increases the difficulty of detecting and characterizing deeper flaws. The paper investigates comparing thermographic responses taken before and after a change in state in a composite to improve the detection of subsurface flaws. A method is presented for registration of the responses before finding the difference. A significant improvement in the detectability is achieved by comparing the differences in response. Examples of changes in state due to application of a load and impact are presented.
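    The registration-then-difference idea can be sketched as follows. The paper's registration method is not specified in the abstract; the sketch below assumes a pure integer-pixel translation between the two frames and estimates it by phase correlation, a common choice:

```python
import numpy as np

def register_shift(ref, moving):
    """Return the integer (row, col) roll that aligns 'moving' with 'ref',
    estimated as the peak of the inverse-FFT cross spectrum."""
    cross = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(moving)))
    peak = np.array(np.unravel_index(np.argmax(np.abs(cross)), cross.shape))
    size = np.array(ref.shape)
    peak[peak > size // 2] -= size[peak > size // 2]  # wrap to signed offsets
    return int(peak[0]), int(peak[1])

def difference_image(before, after):
    """Register 'after' onto 'before', then subtract, so that static
    material variations cancel and only the change in state remains."""
    dy, dx = register_shift(before, after)
    aligned = np.roll(after, (dy, dx), axis=(0, 1))
    return aligned - before
```

Because the static variations (porosity, fiber volume, surface polymer thickness) appear in both frames, they cancel in the difference, lowering the noise floor for deeper flaws.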

  1. Improved Flaw Detection and Characterization with Difference Thermography

    NASA Technical Reports Server (NTRS)

    Winfree, William P.; Zalameda, Joseph N.; Howell, Patricia A.

    2011-01-01

    Flaw detection and characterization with thermographic techniques in graphite polymer composites is often limited by localized variations in the thermographic response. Variations in properties such as acceptable porosity, fiber volume content and surface polymer thickness result in significant variations in the initial thermal response. These variations result in a noise floor that increases the difficulty of detecting and characterizing deeper flaws. The paper investigates comparing thermographic responses taken before and after a change in state in a composite to improve the detection of subsurface flaws. A method is presented for registration of the responses before finding the difference. A significant improvement in the detectability is achieved by comparing the differences in response. Examples of changes in state due to application of a load and impact are presented.

  2. Costs of fear: Behavioral and life-history responses to risk and their demographic consequences vary across species

    USGS Publications Warehouse

    LaManna, Joseph A.; Martin, Thomas E.

    2016-01-01

    Behavioural responses to reduce predation risk might cause demographic ‘costs of fear’. Costs differ among species, but a conceptual framework to understand this variation is lacking. We use a life-history framework to tie together diverse traits and life stages to better understand interspecific variation in responses and costs. We used natural and experimental variation in predation risk to test phenotypic responses and associated demographic costs for 10 songbird species. Responses such as increased parental attentiveness yielded reduced development time and created benefits such as reduced predation probability. Yet, responses to increased risk also created demographic costs by reducing offspring production in the absence of direct predation. This cost of fear varied widely across species, but predictably with the probability of repeat breeding. Use of a life-history framework can aid our understanding of potential demographic costs from predation, both from responses to perceived risk and from direct predation mortality.

  3. Estimation of completeness magnitude with a Bayesian modeling of daily and weekly variations in earthquake detectability

    NASA Astrophysics Data System (ADS)

    Iwata, T.

    2014-12-01

    In the analysis of seismic activity, assessment of earthquake detectability of a seismic network is a fundamental issue. For this assessment, the completeness magnitude Mc, the minimum magnitude above which all earthquakes are recorded, is frequently estimated. In most cases, Mc is estimated for an earthquake catalog of duration longer than several weeks. However, owing to human activity, noise level in seismic data is higher on weekdays than on weekends, so that earthquake detectability has a weekly variation [e.g., Atef et al., 2009, BSSA]; the consideration of such a variation makes a significant contribution to the precise assessment of earthquake detectability and Mc. For a quantitative evaluation of the weekly variation, we introduced the statistical model of a magnitude-frequency distribution of earthquakes covering an entire magnitude range [Ogata & Katsura, 1993, GJI]. The frequency distribution is represented as the product of the Gutenberg-Richter law and a detection rate function. Then, the weekly variation in one of the model parameters, which corresponds to the magnitude where the detection rate of earthquakes is 50%, was estimated. Because earthquake detectability also has a daily variation [e.g., Iwata, 2013, GJI], the weekly and daily variations were estimated simultaneously by adopting a modification of a Bayesian smoothing spline method for temporal change in earthquake detectability developed in Iwata [2014, Aust. N. Z. J. Stat.]. Based on the estimated variations in the parameter, the value of Mc was estimated. In this study, the Japan Meteorological Agency catalog from 2006 to 2010 was analyzed; this dataset is the same as analyzed in Iwata [2013] where only the daily variation in earthquake detectability was considered in the estimation of Mc. A rectangular grid with 0.1° intervals covering in and around Japan was deployed, and the value of Mc was estimated for each gridpoint.
Consequently, a clear weekly variation was revealed; the detectability is better on Sundays than on the other days. The estimated spatial variation in Mc was compared with that estimated in Iwata [2013]; the maximum difference between Mc values with and without considering the weekly variation is approximately 0.2, suggesting the importance of accounting for the weekly variation in the estimation of Mc.
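    The magnitude-frequency model referred to above is the Gutenberg-Richter law multiplied by a detection rate function, with the key parameter being the magnitude detected with 50% probability. A hedged maximum-likelihood sketch, assuming a cumulative-normal detection rate (a standard choice for the Ogata & Katsura form) and ignoring the study's Bayesian smoothing over days of the week:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_log_likelihood(params, mags):
    """Negative log-likelihood of a Gutenberg-Richter law thinned by a
    detection rate: f(M) proportional to 10**(-b*M) * Phi((M - mu)/sigma)."""
    b, mu, sigma = params
    if b <= 0 or sigma <= 0:
        return np.inf
    grid = np.linspace(mags.min() - 2.0, mags.max() + 2.0, 2000)
    dens = 10.0 ** (-b * grid) * norm.cdf(grid, mu, sigma)
    log_z = np.log(np.sum(dens) * (grid[1] - grid[0]))  # normalizing constant
    ll = np.sum(-b * np.log(10.0) * mags + norm.logcdf(mags, mu, sigma))
    return -(ll - mags.size * log_z)

def fit_detection_model(mags, start=(0.8, 0.5, 0.5)):
    """Fit (b, mu, sigma); mu is the magnitude detected with 50% probability."""
    res = minimize(neg_log_likelihood, start, args=(np.asarray(mags),),
                   method="Nelder-Mead")
    return res.x
```

Mc can then be read off the fitted detection curve, for example as mu + 2*sigma, the magnitude detected with roughly 98% probability.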

  4. Analysis of Infrared Signature Variation and Robust Filter-Based Supersonic Target Detection

    PubMed Central

    Sun, Sun-Gu; Kim, Kyung-Tae

    2014-01-01

    The difficulty of small infrared target detection originates from the variations of infrared signatures. This paper presents the fundamental physics of infrared target variations and reports the results of variation analysis of infrared images acquired using a long wave infrared camera over a 24-hour period for different types of backgrounds. The detection parameters, such as signal-to-clutter ratio, were compared according to the recording time, temperature and humidity. Through variation analysis, robust target detection methodologies are derived by controlling thresholds and designing a temporal contrast filter to achieve high detection rate and low false alarm rate. Experimental results validate the robustness of the proposed scheme by applying it to the synthetic and real infrared sequences. PMID:24672290
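    The signal-to-clutter ratio compared in the study admits several definitions; one common form for small-target infrared imagery is the target-background contrast normalized by the clutter standard deviation. A sketch with an illustrative toy frame and masks (not the paper's data):

```python
import numpy as np

def signal_to_clutter_ratio(frame, target_mask, background_mask):
    """SCR = |mean(target) - mean(background)| / std(background)."""
    t = frame[target_mask]
    bg = frame[background_mask]
    return np.abs(t.mean() - bg.mean()) / bg.std()

# Toy frame: alternating background columns of 9 and 11, one hot target pixel
frame = np.tile([9.0, 11.0], (4, 2))
target_mask = np.zeros((4, 4), dtype=bool)
target_mask[0, 0] = True
frame[0, 0] = 20.0
scr = signal_to_clutter_ratio(frame, target_mask, ~target_mask)
```

Tracking how this quantity changes with recording time, temperature and humidity is exactly the kind of variation analysis the abstract reports.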

  5. Review article: A systematic review of emergency department incident classification frameworks.

    PubMed

    Murray, Matthew; McCarthy, Sally

    2018-06-01

    As in any part of the hospital system, safety incidents can occur in the ED. These incidents arguably have a distinct character, as the ED involves unscheduled flows of urgent patients who require disparate services. To aid understanding of safety issues and support risk management of the ED, a comparison of published ED specific incident classification frameworks was performed. A review of emergency medicine, health management and general medical publications, using Ovid SP to interrogate Medline (1976-2016) was undertaken to identify any type of taxonomy or classification-like framework for ED related incidents. These frameworks were then analysed and compared. The review identified 17 publications containing an incident classification framework. Comparison of factors and themes making up the classification constituent elements revealed some commonality, but no overall consistency, nor evolution towards an ideal framework. Inconsistency arises from differences in the evidential basis and design methodology of classifications, with design itself being an inherently subjective process. It was not possible to identify an 'ideal' incident classification framework for ED risk management, and there is significant variation in the selection of categories used by frameworks. The variation in classification could risk an unbalanced emphasis in findings through application of a particular framework. Design of an ED specific, ideal incident classification framework should be informed by a much wider range of theories of how organisations and systems work, in addition to clinical and human factors. © 2017 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  6. Protein degradation rate is the dominant mechanism accounting for the differences in protein abundance of basal p53 in a human breast and colorectal cancer cell line.

    PubMed

    Lakatos, Eszter; Salehi-Reyhani, Ali; Barclay, Michael; Stumpf, Michael P H; Klug, David R

    2017-01-01

    We determine p53 protein abundances and cell to cell variation in two human cancer cell lines with single cell resolution, and show that the fractional width of the distributions is the same in both cases despite a large difference in average protein copy number. We developed a computational framework to identify dominant mechanisms controlling the variation of protein abundance in a simple model of gene expression from the summary statistics of single cell steady state protein expression distributions. Our results, based on single cell data analysed in a Bayesian framework, lend strong support to a model in which variation in the basal p53 protein abundance may be best explained by variations in the rate of p53 protein degradation. This is supported by measurements of the relative average levels of mRNA which are very similar despite large variation in the level of protein.

  7. A computational framework to detect normal and tuberculosis infected lung from H and E-stained whole slide images

    NASA Astrophysics Data System (ADS)

    Niazi, M. Khalid Khan; Beamer, Gillian; Gurcan, Metin N.

    2017-03-01

    Accurate detection and quantification of normal lung tissue in the context of Mycobacterium tuberculosis infection is of interest from a biological perspective. The automatic detection and quantification of normal lung will allow the biologists to focus more intensely on regions of interest within normal and infected tissues. We present a computational framework to extract individual tissue sections from whole slide images having multiple tissue sections. It automatically detects the background, red blood cells and handwritten digits to bring efficiency as well as accuracy in quantification of tissue sections. For efficiency, we model our framework with logical and morphological operations as they can be performed in linear time. We further divide these individual tissue sections into normal and infected areas using a deep neural network. The computational framework was trained on 60 whole slide images. The proposed computational framework resulted in an overall accuracy of 99.2% when extracting individual tissue sections from 120 whole slide images in the test dataset. The framework resulted in a relatively higher accuracy (99.7%) while classifying individual lung sections into normal and infected areas. Our preliminary findings suggest that the proposed framework has good agreement with biologists on how to define normal and infected lung areas.

  8. Biologically meaningful scents: a framework for understanding predator-prey research across disciplines.

    PubMed

    Parsons, Michael H; Apfelbach, Raimund; Banks, Peter B; Cameron, Elissa Z; Dickman, Chris R; Frank, Anke S K; Jones, Menna E; McGregor, Ian S; McLean, Stuart; Müller-Schwarze, Dietland; Sparrow, Elisa E; Blumstein, Daniel T

    2018-02-01

    Fear of predation is a universal motivator. Because predators hunt using stealth and surprise, there is a widespread ability among prey to assess risk from chemical information - scents - in their environment. Consequently, scents often act as particularly strong modulators of memory and emotions. Recent advances in ecological research and analytical technology are leading to novel ways to use this chemical information to create effective attractants, repellents and anti-anxiolytic compounds for wildlife managers, conservation biologists and health practitioners. However, there is extensive variation in the design, results, and interpretation of studies of olfactory-based risk discrimination. To understand the highly variable literature in this area, we adopt a multi-disciplinary approach and synthesize the latest findings from neurobiology, chemical ecology, and ethology to propose a contemporary framework that accounts for such disparate factors as the time-limited stability of chemicals, highly canalized mechanisms that influence prey responses, and the context within which these scents are detected (e.g. availability of alternative resources, perceived shelter, and ambient physical parameters). This framework helps to account for the wide range of reported responses by prey to predator scents, and explains, paradoxically, how the same individual predator scent can be interpreted as either safe or dangerous to a prey animal depending on how, when and where the cue was deposited. We provide a hypothetical example to illustrate the most common factors that influence how a predator scent (from dingoes, Canis dingo) may both attract and repel the same target organism (kangaroos, Macropus spp.). This framework identifies the catalysts that enable dynamic scents, odours or odorants to be used as attractants as well as deterrents. 
Because effective scent tools often relate to traumatic memories (fear and/or anxiety) that cause future avoidance, this information may also guide the development of appeasement, enrichment and anti-anxiolytic compounds, and help explain the observed variation in post-traumatic-related behaviours (including post-traumatic stress disorder, PTSD) among diverse terrestrial taxa, including humans. © 2017 Cambridge Philosophical Society.

  9. A framework to determine the locations of the environmental monitoring in an estuary of the Yellow Sea.

    PubMed

    Kim, Nam-Hoon; Hwang, Jin Hwan; Cho, Jaegab; Kim, Jae Seong

    2018-06-04

    The characteristics of an estuary are determined by various factors, such as tides, waves and river discharge, which also control the water quality of the estuary. Therefore, detecting changes in these characteristics is critical for managing environmental quality and pollution, so monitoring locations should be selected carefully. The present study proposes a framework to deploy the monitoring systems based on a graphical method for spatial and temporal optimization. With the well-validated numerical simulation results, the monitoring locations are determined to capture the changes of water qualities and pollutants depending on the variations of tide, current and freshwater discharge. The deployment strategy to find the appropriate monitoring locations is designed with the constrained optimization method, which finds solutions by constraining the objective function into the feasible regions. The objective and constrained functions are constructed with an interpolation technique such as objective analysis. Even with a smaller number of monitoring locations, the present method performs equivalently to an arbitrarily and evenly deployed monitoring system. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Detecting and Quantifying Forest Change: The Potential of Existing C- and X-Band Radar Datasets.

    PubMed

    Tanase, Mihai A; Ismail, Ismail; Lowell, Kim; Karyanto, Oka; Santoro, Maurizio

    2015-01-01

    This paper evaluates the opportunity provided by global interferometric radar datasets for monitoring deforestation, degradation and forest regrowth in tropical and semi-arid environments. The paper describes an easy to implement method for detecting forest spatial changes and estimating their magnitude. The datasets were acquired within space-borne high-spatial-resolution radar missions at near-global scales, and are thus significant for monitoring systems developed under the United Nations Framework Convention on Climate Change (UNFCCC). The approach presented in this paper was tested in two areas located in Indonesia and Australia. Forest change estimation was based on differences between a reference dataset acquired in February 2000 by the Shuttle Radar Topography Mission (SRTM) and TanDEM-X mission (TDM) datasets acquired in 2011 and 2013. The synergy between SRTM and TDM datasets allowed not only identifying changes in forest extent but also estimating their magnitude with respect to the reference through variations in forest height.
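    The change estimation reduces to differencing the two interferometric height datasets. A hedged sketch (the function names and the 5 m loss threshold are illustrative, not the paper's values):

```python
import numpy as np

def height_change_map(srtm_dsm, tdm_dsm, loss_threshold=-5.0):
    """Per-pixel height change between two interferometric surface models;
    pixels dropping by more than |loss_threshold| metres are flagged as
    candidate deforestation."""
    change = tdm_dsm - srtm_dsm          # positive: growth, negative: loss
    cleared = change <= loss_threshold   # boolean deforestation candidates
    return change, cleared

# Toy 2x2 example: one pixel loses 18 m of canopy between the acquisitions
srtm = np.array([[30.0, 30.0], [5.0, 5.0]])
tdm = np.array([[29.0, 12.0], [5.0, 6.0]])
change, cleared = height_change_map(srtm, tdm)
```

In practice the two height products would first need co-registration and correction for penetration-depth differences between C- and X-band, which this sketch omits.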

  11. Relative importance of precipitation frequency and intensity in inter-annual variation of precipitation in Singapore during 1980-2013

    NASA Astrophysics Data System (ADS)

    Li, Xin; Babovic, Vladan

    2017-04-01

    Observed studies on inter-annual variation of precipitation provide insight into the response of precipitation to anthropogenic climate change and natural climate variability. Inter-annual variation of precipitation results from the concurrent variations of precipitation frequency and intensity; understanding the relative importance of frequency and intensity in the variability of precipitation can therefore help fathom its changing properties. Investigation of the long-term changes of precipitation schemes has been extensively carried out in many regions across the world; however, detailed studies of the relative importance of precipitation frequency and intensity in inter-annual variation of precipitation are still limited, especially in the tropics. Therefore, this study presents a comprehensive framework to investigate the inter-annual variation of precipitation and the dominance of precipitation frequency and intensity in a tropical urban city-state, Singapore, based on long-term (1980-2013) daily precipitation series from 22 rain gauges. First, an iterative Mann-Kendall trend test method is applied to detect long-term trends in precipitation total, frequency and intensity at both annual and seasonal time scales. Then, the relative importance of precipitation frequency and intensity in inducing the inter-annual variation of wet-day precipitation total is analyzed using a dominance analysis method based on linear regression. The results show statistically significant upward trends in wet-day precipitation total, frequency and intensity at the annual time scale; however, these trends are not evident during the monsoon seasons. The inter-annual variation of wet-day precipitation is mainly dominated by precipitation intensity for most of the stations at the annual time scale and during the Northeast monsoon season. However, during the Southwest monsoon season, the inter-annual variation of wet-day precipitation is mainly dominated by precipitation frequency.
These results have implications for water resources management practices in Singapore.
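The trend-detection step rests on the Mann-Kendall statistic. A minimal sketch of the classical (non-iterative) test follows; the paper's iterative variant adds a correction loop that is omitted here, and the annual totals below are illustrative, not Singapore gauge data:

```python
import math

def mann_kendall(series):
    """Classical Mann-Kendall trend test (no tie correction).

    Returns (S, Z): S > 0 suggests an upward trend; |Z| > 1.96
    indicates significance at the 5% level under the normal
    approximation.
    """
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)  # sign of the pairwise difference
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Strictly increasing annual wet-day totals: every pair contributes +1.
s, z = mann_kendall([100, 110, 125, 130, 150, 160, 175, 190])
```

For a monotonically increasing series of length 8, S reaches its maximum of 28 and Z is well past the 5% critical value.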

  12. [Variations in the diagnostic confirmation process between breast cancer mass screening units].

    PubMed

    Natal, Carmen; Fernández-Somoano, Ana; Torá-Rocamora, Isabel; Tardón, Adonina; Castells, Xavier

    2016-01-01

    To analyse variations in the diagnostic confirmation process between screening units, variations in the outcome of each episode and the relationship between the use of the different diagnostic confirmation tests and the lesion detection rate. Observational study of variability of the standardised use of diagnostic and lesion detection tests in 34 breast cancer mass screening units participating in early-detection programmes in three Spanish regions from 2002-2011. The diagnostic test variation ratio in percentiles 25-75 ranged from 1.68 (further appointments) to 3.39 (fine-needle aspiration). The variation ratios in detection rates of benign lesions, ductal carcinoma in situ and invasive cancer were 2.79, 1.99 and 1.36, respectively. A positive relationship between rates of testing and detection rates was found for fine-needle aspiration-benign lesions (R(2): 0.53), fine-needle aspiration-invasive carcinoma (R(2): 0.28), core biopsy-benign lesions (R(2): 0.64), core biopsy-ductal carcinoma in situ (R(2): 0.61) and core biopsy-invasive carcinoma (R(2): 0.48). Variation in the use of invasive tests between the breast cancer screening units participating in early-detection programmes was found to be significantly higher than variation in lesion detection. Units that conducted more fine-needle aspiration tests had higher benign lesion detection rates, while units that conducted more core biopsies detected more benign lesions and cancer. Copyright © 2016 SESPAS. Published by Elsevier Espana. All rights reserved.
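The P25-P75 variation ratio reported above is simply the third quartile of the standardised rates divided by the first. A minimal sketch with hypothetical unit rates:

```python
import statistics

def variation_ratio(rates):
    """P75/P25 ratio of standardised test-use rates across units,
    the between-unit variation measure used in the abstract."""
    q1, _, q3 = statistics.quantiles(rates, n=4, method="inclusive")
    return q3 / q1

# Hypothetical standardised fine-needle aspiration rates for 8 units:
ratio = variation_ratio([1, 2, 3, 4, 5, 6, 7, 8])
```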

  13. An integrated framework for detecting suspicious behaviors in video surveillance

    NASA Astrophysics Data System (ADS)

    Zin, Thi Thi; Tin, Pyke; Hama, Hiromitsu; Toriu, Takashi

    2014-03-01

    In this paper, we propose an integrated framework for detecting suspicious behaviors in video surveillance systems installed in public places such as railway stations, airports and shopping malls. In particular, people loitering suspiciously, unattended objects left behind and suspicious objects exchanged between persons are common security concerns in airports and other transit scenarios. These involve understanding the scene/event, analyzing human movements, recognizing controllable objects and observing the effect of the human movement on those objects. In the proposed framework, a multiple background modeling technique, a high-level motion feature extraction method and embedded Markov chain models are integrated for detecting suspicious behaviors in real-time video surveillance systems. Specifically, the proposed framework employs a probability-based multiple background modeling technique to detect moving objects. Then velocity and distance measures are computed as the high-level motion features of interest. By combining the computed features with the first passage time probabilities of the embedded Markov chain, suspicious behaviors in video surveillance are analyzed for detecting loitering persons, objects left behind and human interactions such as fighting. The proposed framework has been tested on standard public datasets and our own video surveillance scenarios.
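The first-passage idea behind the loitering test can be illustrated with a toy two-state chain; the states, transition probabilities and step horizon below are hypothetical, not the paper's model:

```python
def first_passage_prob(P, start, target, steps):
    """Probability that a Markov chain starting in `start` reaches
    `target` within `steps` transitions (target made absorbing),
    a toy analogue of the embedded-chain first-passage test used
    to flag behaviors such as loitering."""
    n = len(P)
    # Make the target absorbing so "has visited" is remembered.
    Q = [row[:] for row in P]
    Q[target] = [1.0 if j == target else 0.0 for j in range(n)]
    dist = [1.0 if j == start else 0.0 for j in range(n)]
    for _ in range(steps):
        dist = [sum(dist[i] * Q[i][j] for i in range(n)) for j in range(n)]
    return dist[target]

# Toy chain over {0: "moving", 1: "stationary"}; staying put too
# often would trigger the loitering rule.
P = [[0.7, 0.3],
     [0.4, 0.6]]
p = first_passage_prob(P, start=0, target=1, steps=3)
```

Here the only way to avoid state 1 in three steps is to stay in state 0, so the result equals 1 - 0.7³.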

  14. Horvitz-Thompson survey sample methods for estimating large-scale animal abundance

    USGS Publications Warehouse

    Samuel, M.D.; Garton, E.O.

    1994-01-01

    Large-scale surveys to estimate animal abundance can be useful for monitoring population status and trends, for measuring responses to management or environmental alterations, and for testing ecological hypotheses about abundance. However, large-scale surveys may be expensive and logistically complex. To ensure resources are not wasted on unattainable targets, the goals and uses of each survey should be specified carefully, and alternative methods for addressing these objectives should always be considered. During survey design, the importance of each survey error component (spatial design, proportion of detected animals, precision in detection) should be considered carefully to produce a complete, statistically based survey. Failure to address these three survey components may produce population estimates that are inaccurate (biased low), have unrealistic precision (too precise) and do not satisfactorily meet the survey objectives. Optimum survey design requires trade-offs in these sources of error relative to the costs of sampling plots and detecting animals on plots, considerations that are specific to the spatial logistics and survey methods. The Horvitz-Thompson estimators provide a comprehensive framework for considering all three survey components during the design and analysis of large-scale wildlife surveys. Problems of spatial and temporal (especially survey-to-survey) heterogeneity in detection probabilities have received little consideration, but failure to account for heterogeneity produces biased population estimates. The goal of producing unbiased population estimates is in conflict with the increased variation that heterogeneous detection introduces into the population estimate. One solution to this conflict is to use an MSE-based approach to achieve a balance between bias reduction and increased variation. Further research is needed to develop methods that address spatial heterogeneity in detection, evaluate the effects of temporal heterogeneity on survey objectives and optimize decisions related to survey bias and variance. Finally, managers and researchers involved in the survey design process must realize that obtaining the best survey results requires an interactive and recursive process of survey design, execution, analysis and redesign. Survey refinements will be possible as further knowledge is gained on the actual abundance and distribution of the population and on the most efficient techniques for detecting animals.
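The core of the Horvitz-Thompson abundance estimator, dividing each plot count by its inclusion probability and (as one common extension) by a detection probability, can be sketched as follows; all numbers are hypothetical:

```python
def horvitz_thompson(counts, inclusion_probs, detection_probs):
    """Horvitz-Thompson-style abundance estimate: each plot count is
    divided by the probability the plot was sampled and by the
    probability an animal on the plot was detected."""
    return sum(y / (pi * p)
               for y, pi, p in zip(counts, inclusion_probs, detection_probs))

# Three surveyed plots, each included with probability 0.5,
# with a detection probability of 0.8 on every plot:
n_hat = horvitz_thompson([10, 6, 4], [0.5] * 3, [0.8] * 3)
```

With equal probabilities this reduces to (10 + 6 + 4) / (0.5 × 0.8) = 50 animals.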

  15. Digging into the corona: A modeling framework trained with Sun-grazing comet observations

    NASA Astrophysics Data System (ADS)

    Jia, Y. D.; Pesnell, W. D.; Bryans, P.; Downs, C.; Liu, W.; Schwartz, S. J.

    2017-12-01

    Images of comets diving into the low corona have been captured a few times in the past decade. Structures visible at various wavelengths during these encounters indicate a strong variation in the ambient conditions of the corona. We combine three numerical models: a global coronal model, a particle transport model and a cometary plasma interaction model into one framework to model the interaction of such Sun-grazing comets with plasma in the low corona. In our framework, cometary vapors are ionized via multiple channels and then captured by the coronal magnetic field. Within seconds, these ions are further ionized to their highest charge state, which is revealed by certain coronal emission lines. Constrained by observations, we apply our framework to infer the local conditions of the ambient corona and their spatial/temporal variation over a broad range of scales. Once trained on multiple stages of the comet's journey in the low corona, we illustrate how this framework can leverage these unique observations to probe the structure of the solar corona and solar wind.

  16. Scale and time dependence of serial correlations in word-length time series of written texts

    NASA Astrophysics Data System (ADS)

    Rodriguez, E.; Aguilar-Cornejo, M.; Femat, R.; Alvarez-Ramirez, J.

    2014-11-01

    This work considered the quantitative analysis of large written texts. To this end, each text was converted into a time series by taking the sequence of word lengths. Detrended fluctuation analysis (DFA) was used to characterize long-range serial correlations of the time series. The DFA was implemented within a rolling-window framework for estimating variations of the correlation strength, quantified in terms of the scaling exponent, along the text. Also, a filtering derivative was used to compute the dependence of the scaling exponent on the scale. The analysis was applied to three famous English-language literary narrations; namely, Alice in Wonderland (by Lewis Carroll), Dracula (by Bram Stoker) and Sense and Sensibility (by Jane Austen). The results showed that high correlations appear at scales of about 50-200 words, suggesting that at these scales the text exhibits its strongest coherence. The scaling exponent was not constant along the text, showing important variations with apparent cyclical behavior. An interesting coincidence between the scaling exponent variations and changes in narrative units (e.g., chapters) was found. This suggests that the scaling exponent obtained from the DFA is able to detect changes in narrative structure as expressed by the usage of words of different lengths.
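The pipeline described above (word-length conversion, cumulative profile, per-window linear detrending, log-log slope) can be sketched in pure Python; this is an illustrative minimal DFA, not the authors' implementation:

```python
import math
import random

def word_lengths(text):
    """Map a text to its sequence of word lengths."""
    return [len(w) for w in text.split()]

def dfa_exponent(x, scales=(8, 16, 32, 64)):
    """First-order DFA: slope of log F(s) vs log s, where F(s) is the
    RMS residual after fitting a line in each window of size s."""
    mean = sum(x) / len(x)
    # Profile: cumulative sum of the mean-centred series.
    y, c = [], 0.0
    for v in x:
        c += v - mean
        y.append(c)
    logs, logF = [], []
    for s in scales:
        resid2, nwin = 0.0, 0
        for start in range(0, len(y) - s + 1, s):
            w = y[start:start + s]
            # Closed-form least-squares line over t = 0..s-1.
            t_mean = (s - 1) / 2.0
            w_mean = sum(w) / s
            num = sum((t - t_mean) * (wv - w_mean) for t, wv in enumerate(w))
            den = sum((t - t_mean) ** 2 for t in range(s))
            b = num / den
            a = w_mean - b * t_mean
            resid2 += sum((wv - (a + b * t)) ** 2 for t, wv in enumerate(w))
            nwin += 1
        logs.append(math.log(s))
        logF.append(math.log(math.sqrt(resid2 / (nwin * s))))
    # Slope of the log-log fit is the scaling exponent alpha.
    m = len(logs)
    ls_m, lf_m = sum(logs) / m, sum(logF) / m
    num = sum((a - ls_m) * (b - lf_m) for a, b in zip(logs, logF))
    den = sum((a - ls_m) ** 2 for a in logs)
    return num / den

# Uncorrelated noise should give alpha near 0.5; a real text series
# from word_lengths() would be analysed the same way.
random.seed(1)
alpha = dfa_exponent([random.random() for _ in range(512)])
```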

  17. Exploring Pre-Service Teachers' Understanding of Statistical Variation: Implications for Teaching and Research

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2007-01-01

    Concerns about the importance of variation in statistics education and a lack of research in this topic led to a preliminary study which explored pre-service teachers' ideas in this area. The teachers completed a written questionnaire about variation in sampling and distribution contexts. Responses were categorised in relation to a framework that…

  18. A Framework of Simple Event Detection in Surveillance Video

    NASA Astrophysics Data System (ADS)

    Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao

    Video surveillance is playing a more and more important role in people's social life. Real-time alerting of threatening events and searching for interesting content in stored large-scale video footage require a human operator to pay full attention to a monitor for long periods. This labor-intensive mode has limited the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key-point matching approach is used to compensate for background motion in real time; frame differences are used to detect the foreground; HOG-based classifiers are used to classify foreground objects into people and cars; mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantees the robustness of the framework, and the improved approach and easily checked rules enable the framework to work in real time. Future work is also discussed.

  19. Geometric constrained variational calculus. II: The second variation (Part I)

    NASA Astrophysics Data System (ADS)

    Massa, Enrico; Bruno, Danilo; Luria, Gianvittorio; Pagani, Enrico

    2016-10-01

    Within the geometrical framework developed in [Geometric constrained variational calculus. I: Piecewise smooth extremals, Int. J. Geom. Methods Mod. Phys. 12 (2015) 1550061], the problem of minimality for constrained calculus of variations is analyzed among the class of differentiable curves. A fully covariant representation of the second variation of the action functional, based on a suitable gauge transformation of the Lagrangian, is explicitly worked out. Both necessary and sufficient conditions for minimality are proved, and reinterpreted in terms of Jacobi fields.

  20. Framework for a hydrologic climate-response network in New England

    USGS Publications Warehouse

    Lent, Robert M.; Hodgkins, Glenn A.; Dudley, Robert W.; Schalk, Luther F.

    2015-01-01

    Many climate-related hydrologic variables in New England have changed in the past century, and many are expected to change during the next century. It is important to understand and monitor these changes because they can affect human water supply, hydroelectric power generation, transportation infrastructure, and stream and riparian ecology. This report describes a framework for hydrologic monitoring in New England by means of a climate-response network. The framework identifies specific inland hydrologic variables that are sensitive to climate variation; identifies geographic regions with similar hydrologic responses; proposes a fixed-station monitoring network composed of existing streamflow, groundwater, lake ice, snowpack, and meteorological data-collection stations for evaluation of hydrologic response to climate variation; and identifies streamflow basins for intensive, process-based studies and for estimates of future hydrologic conditions.

  1. Aerial survey methodology for bison population estimation in Yellowstone National Park

    USGS Publications Warehouse

    Hess, Steven C.

    2002-01-01

    I developed aerial survey methods for statistically rigorous bison population estimation in Yellowstone National Park to support sound resource management decisions and to understand bison ecology. Survey protocols, data recording procedures, a geographic framework, and seasonal stratifications were based on field observations from February 1998-September 2000. The reliability of this framework and strata were tested with long-term data from 1970-1997. I simulated different sample survey designs and compared them to high-effort censuses of well-defined large areas to evaluate effort, precision, and bias. Sample survey designs require much effort and extensive information on the current spatial distribution of bison and therefore do not offer any substantial reduction in time and effort over censuses. I conducted concurrent ground surveys, or 'double sampling', to estimate detection probability during aerial surveys. Group size distribution and habitat strongly affected detection probability. In winter, 75% of the groups and 92% of individual bison were detected on average from aircraft, while in summer, 79% of groups and 97% of individual bison were detected. I also used photography to quantify the bias in counting large groups of bison and found that undercounting increased with group size and could reach 15%. I compared survey conditions between seasons and identified optimal time windows for conducting surveys in both winter and summer. These windows account for the habitats and total area bison occupy, and group size distribution. Bison became increasingly scattered over the Yellowstone region in smaller groups and occupied more unfavorable habitats as winter progressed. Therefore, the best conditions for winter surveys occur early in the season (Dec-Jan). In summer, bison were most spatially aggregated and occurred in the largest groups by early August.
Low variability between surveys and high detection probability provide population estimates with an overall coefficient of variation of approximately 8% and have high power for detecting trends in population change. I demonstrated how population estimates from winter and summer can be integrated into a comprehensive monitoring program to estimate annual growth rates, overall winter mortality, and an index of calf production, requiring about 30 hours of flight per year.
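The detection-probability correction and the coefficient of variation quoted above amount to two one-liners; the counts below are hypothetical, with the 0.92 winter detection probability taken from the abstract:

```python
import statistics

def corrected_estimate(count, detection_prob):
    """Scale a raw aerial count up by the estimated detection
    probability (e.g. 0.92 for individual bison in winter)."""
    return count / detection_prob

def coefficient_of_variation(estimates):
    """CV of repeated survey estimates, as a fraction of the mean."""
    return statistics.stdev(estimates) / statistics.mean(estimates)

n_hat = corrected_estimate(2300, 0.92)           # raw count of 2300 seen
cv = coefficient_of_variation([2400, 2500, 2600])  # repeated surveys
```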

  2. A Conceptual Framework for Detecting Cheating in Online and Take-Home Exams

    ERIC Educational Resources Information Center

    D'Souza, Kelwyn A.; Siegfeldt, Denise V.

    2017-01-01

    Selecting the right methodology to use for detecting cheating in online exams requires considerable time and effort due to a wide variety of scholarly publications on academic dishonesty in online education. This article offers a cheating detection framework that can serve as a guideline for conducting cheating studies. The necessary theories and…

  3. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224
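The expected payoff of concentrating survey effort in good years can be sketched with hypothetical per-visit detection probabilities; the 26% figure in the abstract comes from the authors' simulations, not from this toy:

```python
def expected_success(effort_by_year, detect_prob_by_year):
    """Expected number of successful site surveys given the effort
    (site-visits) allocated to each year and that year's per-visit
    detection probability."""
    return sum(e * p for e, p in zip(effort_by_year, detect_prob_by_year))

# Hypothetical alternating good/bad years with regionally
# correlated detection probabilities:
probs = [0.8, 0.2, 0.7, 0.3]
fixed = expected_success([25, 25, 25, 25], probs)     # uniform effort
adaptive = expected_success([40, 10, 40, 10], probs)  # effort shifted to good years
```

Shifting the same 100 visits toward the good years raises expected successes from 50 to 65 in this toy allocation.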

  4. An Advanced Deep Learning Approach for Ki-67 Stained Hotspot Detection and Proliferation Rate Scoring for Prognostic Evaluation of Breast Cancer.

    PubMed

    Saha, Monjoy; Chakraborty, Chandan; Arun, Indu; Ahmed, Rosina; Chatterjee, Sanjoy

    2017-06-12

    Being a non-histone protein, Ki-67 is one of the essential biomarkers for the immunohistochemical assessment of proliferation rate in breast cancer screening and grading. The Ki-67 signature is always sensitive to radiotherapy and chemotherapy. Due to random morphological, color and intensity variations of cell nuclei (immunopositive and immunonegative), manual/subjective assessment of Ki-67 scoring is error-prone and time-consuming. Hence, several machine learning approaches have been reported; nevertheless, none of them has addressed deep learning based hotspot detection and proliferation scoring. In this article, we suggest an advanced deep learning model for computerized recognition of candidate hotspots and subsequent proliferation rate scoring by quantifying Ki-67 appearance in breast cancer immunohistochemical images. Unlike existing Ki-67 scoring techniques, our methodology uses a gamma mixture model (GMM) with expectation-maximization for seed point detection and patch selection, and deep learning, with a decision layer, for hotspot detection and proliferation scoring. Experimental results provide a precision of 0.93, a recall of 0.88 and an F-score of 0.91. The model performance has also been compared with the pathologists' manual annotations and recently published articles. In the future, the proposed deep learning framework is expected to be highly reliable and beneficial to junior and senior pathologists for fast and efficient Ki-67 scoring.
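A precision/recall/F-score triple like the one reported follows directly from raw detection counts; the counts below are hypothetical, chosen only to land near the reported values:

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall and F1 score from true-positive,
    false-positive and false-negative detection counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return precision, recall, f1

# Hypothetical hotspot-detection counts:
p, r, f = precision_recall_f1(tp=88, fp=7, fn=12)
```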

  5. VarDetect: a nucleotide sequence variation exploratory tool

    PubMed Central

    Ngamphiw, Chumpol; Kulawonganunchai, Supasak; Assawamakin, Anunchai; Jenwitheesuk, Ekachai; Tongsima, Sissades

    2008-01-01

    Background Single nucleotide polymorphisms (SNPs) are the most commonly studied units of genetic variation. The discovery of such variation may help to identify causative gene mutations in monogenic diseases and SNPs associated with predisposing genes in complex diseases. Accurate detection of SNPs requires software that can correctly interpret chromatogram signals to nucleotides. Results We present VarDetect, a stand-alone nucleotide variation exploratory tool that automatically detects nucleotide variation from fluorescence based chromatogram traces. Accurate SNP base-calling is achieved using pre-calculated peak content ratios, and is enhanced by rules which account for common sequence reading artifacts. The proposed software tool is benchmarked against four other well-known SNP discovery software tools (PolyPhred, novoSNP, Genalys and Mutation Surveyor) using fluorescence based chromatograms from 15 human genes. These chromatograms were obtained from sequencing 16 two-pooled DNA samples; a total of 32 individual DNA samples. In this comparison of automatic SNP detection tools, VarDetect achieved the highest detection efficiency. Availability VarDetect is compatible with most major operating systems such as Microsoft Windows, Linux, and Mac OSX. The current version of VarDetect is freely available at . PMID:19091032

  6. GBOOST: a GPU-based tool for detecting gene-gene interactions in genome-wide case control studies.

    PubMed

    Yung, Ling Sing; Yang, Can; Wan, Xiang; Yu, Weichuan

    2011-05-01

    Collecting millions of genetic variations is feasible with the advanced genotyping technology. With a huge amount of genetic variations data in hand, developing efficient algorithms to carry out the gene-gene interaction analysis in a timely manner has become one of the key problems in genome-wide association studies (GWAS). Boolean operation-based screening and testing (BOOST), a recent work in GWAS, completes gene-gene interaction analysis in 2.5 days on a desktop computer. Compared with central processing units (CPUs), graphic processing units (GPUs) are highly parallel hardware and provide massive computing resources. We are, therefore, motivated to use GPUs to further speed up the analysis of gene-gene interactions. We implement the BOOST method based on a GPU framework and name it GBOOST. GBOOST achieves a 40-fold speedup compared with BOOST. It completes the analysis of Wellcome Trust Case Control Consortium Type 2 Diabetes (WTCCC T2D) genome data within 1.34 h on a desktop computer equipped with Nvidia GeForce GTX 285 display card. GBOOST code is available at http://bioinformatics.ust.hk/BOOST.html#GBOOST.
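The Boolean screening idea behind BOOST, which GBOOST parallelizes on the GPU, is that genotype groups stored as bitmasks turn contingency-table cells into AND plus popcount. A pure-Python toy sketch of that counting step (the encoding below is illustrative, not GBOOST's actual data layout):

```python
def popcount(x):
    """Number of set bits in a Python integer."""
    return bin(x).count("1")

def contingency_counts(geno_a, geno_b, cases_mask, n_samples):
    """Case/control counts per (genotype_a, genotype_b) cell via
    bitset intersection: the Boolean-operation screening idea.

    geno_a, geno_b: dicts mapping genotype code -> bitmask of the
    samples carrying it; cases_mask: bitmask of case samples.
    """
    universe = (1 << n_samples) - 1
    controls_mask = universe ^ cases_mask
    table = {}
    for ga, mask_a in geno_a.items():
        for gb, mask_b in geno_b.items():
            both = mask_a & mask_b
            table[(ga, gb)] = (popcount(both & cases_mask),
                               popcount(both & controls_mask))
    return table

# 4 samples (bits 0-3); samples 0 and 1 are cases.
geno_a = {0: 0b0011, 1: 0b1100}   # SNP A: who carries each genotype
geno_b = {0: 0b0101, 1: 0b1010}   # SNP B
table = contingency_counts(geno_a, geno_b, cases_mask=0b0011, n_samples=4)
```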

  7. Non-lambertian reflectance modeling and shape recovery of faces using tensor splines.

    PubMed

    Kumar, Ritwik; Barmpoutis, Angelos; Banerjee, Arunava; Vemuri, Baba C

    2011-03-01

    Modeling illumination effects and pose variations of a face is of fundamental importance in the field of facial image analysis. Most of the conventional techniques that simultaneously address both of these problems work with the Lambertian assumption and thus fall short of accurately capturing the complex intensity variation that the facial images exhibit or recovering their 3D shape in the presence of specularities and cast shadows. In this paper, we present a novel Tensor-Spline-based framework for facial image analysis. We show that, using this framework, the facial apparent BRDF field can be accurately estimated while seamlessly accounting for cast shadows and specularities. Further, using local neighborhood information, the same framework can be exploited to recover the 3D shape of the face (to handle pose variation). We quantitatively validate the accuracy of the Tensor Spline model using a more general model based on the mixture of single-lobed spherical functions. We demonstrate the effectiveness of our technique by presenting extensive experimental results for face relighting, 3D shape recovery, and face recognition using the Extended Yale B and CMU PIE benchmark data sets.

  8. Culture and Literacy: Frameworks for Understanding.

    ERIC Educational Resources Information Center

    Westby, Carol E.

    1995-01-01

    This article presents a framework for understanding cultural variations in beliefs, values, and communication styles and considers the role of culture in relation to children's response to formal education and literacy. Major dimensions of cultural variability discussed include individualism/collectivism and high-context/low-context. (Author/DB)

  9. Parametric estimation for reinforced concrete relief shelter for Aceh cases

    NASA Astrophysics Data System (ADS)

    Atthaillah; Saputra, Eri; Iqbal, Muhammad

    2018-05-01

    This paper is a work in progress (WIP) toward a rapid parametric framework for post-disaster permanent shelter material estimation. The intended shelters are of reinforced concrete construction with brick walls. Inevitably, in post-disaster cases, design variations are needed to suit the victims' conditions, and it is practically impossible to satisfy every beneficiary with a single design using the conventional method. This study offers a parametric framework to overcome the issue of slow construction material estimation under design variations. Further, this work integrates the parametric tool Grasshopper to establish algorithms that simultaneously model, visualize, calculate and write the calculated data to a spreadsheet in real time. Some customized Grasshopper components were created using GHPython scripting for a more optimized algorithm. The result of this study is a partial framework that successfully performs modeling, visualization, calculation and writing of the calculated data simultaneously. This means design alterations do not escalate the time needed for modeling, visualization, and material estimation. Future development of the parametric framework will be made open source.
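A parametric quantity take-off of this kind can be sketched in plain Python; every member size, brick coverage rate and waste factor below is an illustrative assumption, not a value from the paper (the actual framework computes such quantities inside Grasshopper/GHPython):

```python
def shelter_materials(columns, beams, wall_area_m2,
                      col_section=(0.15, 0.15), beam_section=(0.15, 0.20),
                      col_height=3.0, beam_length=3.0,
                      bricks_per_m2=60, waste_factor=1.05):
    """Hypothetical parametric take-off for a small reinforced
    concrete shelter: concrete volume from member counts and cross
    sections, brick count from wall area. All coefficients are
    illustrative placeholders."""
    concrete = (columns * col_section[0] * col_section[1] * col_height +
                beams * beam_section[0] * beam_section[1] * beam_length)
    bricks = wall_area_m2 * bricks_per_m2 * waste_factor
    return round(concrete * waste_factor, 3), round(bricks)

# One design variation: 6 columns, 8 beams, 48 m2 of brick wall.
concrete_m3, brick_count = shelter_materials(columns=6, beams=8, wall_area_m2=48)
```

Because the estimate is a pure function of the design parameters, changing a variation recomputes the bill of materials instantly, which is the point of the parametric approach.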

  10. Real-time classification of vehicles by type within infrared imagery

    NASA Astrophysics Data System (ADS)

    Kundegorski, Mikolaj E.; Akçay, Samet; Payen de La Garanderie, Grégoire; Breckon, Toby P.

    2016-10-01

    Real-time classification of vehicles into sub-category types poses a significant challenge within infra-red imagery due to the high levels of intra-class variation in thermal vehicle signatures caused by aspects of design, current operating duration and ambient thermal conditions. Despite these challenges, infra-red sensing offers significant generalized target object detection advantages in terms of all-weather operation and invariance to visual camouflage techniques. This work investigates the accuracy of a number of real-time object classification approaches for this task within the wider context of an existing initial object detection and tracking framework. Specifically we evaluate the use of traditional feature-driven bag of visual words and histogram of oriented gradient classification approaches against modern convolutional neural network architectures. Furthermore, we use classical photogrammetry, within the context of current target detection and classification techniques, as a means of approximating 3D target position within the scene based on this vehicle type classification. Based on photogrammetric estimation of target position, we then illustrate the use of regular Kalman filter based tracking operating on actual 3D vehicle trajectories. Results are presented using a conventional thermal-band infra-red (IR) sensor arrangement where targets are tracked over a range of evaluation scenarios.
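The Kalman-filter trajectory smoothing can be illustrated with an alpha-beta tracker, a fixed-gain, steady-state simplification of the Kalman filter; this sketch tracks one coordinate only, and the gains and measurements are illustrative:

```python
def alpha_beta_track(measurements, dt=1.0, alpha=0.85, beta=0.1):
    """Fixed-gain alpha-beta tracker, a steady-state simplification
    of the Kalman filter, applied to one axis of a trajectory."""
    x, v = measurements[0], 0.0   # initial position, zero velocity
    out = []
    for z in measurements[1:]:
        x_pred = x + v * dt       # predict position forward
        r = z - x_pred            # measurement residual
        x = x_pred + alpha * r    # correct position estimate
        v = v + beta * r / dt     # correct velocity estimate
        out.append(x)
    return out

# Vehicle moving at roughly 2 m/s with noisy position fixes:
est = alpha_beta_track([0.0, 2.1, 3.9, 6.0, 8.1, 9.9, 12.0])
```

A full Kalman filter replaces the fixed gains with covariance-weighted ones updated each step; the predict/correct structure is the same.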

  11. Overcoming complexities: Damage detection using dictionary learning framework

    NASA Astrophysics Data System (ADS)

    Alguri, K. Supreet; Melville, Joseph; Deemer, Chris; Harley, Joel B.

    2018-04-01

    For in situ damage detection, guided wave structural health monitoring systems have been widely researched due to their ability to evaluate large areas and their ability detect many types of damage. These systems often evaluate structural health by recording initial baseline measurements from a pristine (i.e., undamaged) test structure and then comparing later measurements with that baseline. Yet, it is not always feasible to have a pristine baseline. As an alternative, substituting the baseline with data from a surrogate (nearly identical and pristine) structure is a logical option. While effective in some circumstance, surrogate data is often still a poor substitute for pristine baseline measurements due to minor differences between the structures. To overcome this challenge, we present a dictionary learning framework to adapt surrogate baseline data to better represent an undamaged test structure. We compare the performance of our framework with two other surrogate-based damage detection strategies: (1) using raw surrogate data for comparison and (2) using sparse wavenumber analysis, a precursor to our framework for improving the surrogate data. We apply our framework to guided wave data from two 108 mm by 108 mm aluminum plates. With 20 measurements, we show that our dictionary learning framework achieves a 98% accuracy, raw surrogate data achieves a 92% accuracy, and sparse wavenumber analysis achieves a 57% accuracy.

  12. Measuring Adverse Events in Helicopter Emergency Medical Services: Establishing Content Validity

    PubMed Central

    Patterson, P. Daniel; Lave, Judith R.; Martin-Gill, Christian; Weaver, Matthew D.; Wadas, Richard J.; Arnold, Robert M.; Roth, Ronald N.; Mosesso, Vincent N.; Guyette, Francis X.; Rittenberger, Jon C.; Yealy, Donald M.

    2015-01-01

    Introduction We sought to create a valid framework for detecting Adverse Events (AEs) in the high-risk setting of Helicopter Emergency Medical Services (HEMS). Methods We assembled a panel of 10 expert clinicians (n=6 emergency medicine physicians and n=4 prehospital nurses and flight paramedics) affiliated with a large multi-state HEMS organization in the Northeast U.S. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the Content Validity Index (CVI), to quantify the validity of the framework’s content. Results The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: 1) a trigger tool, 2) a method for rating proximal cause, and 3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. Conclusions We demonstrate a standardized process for the development of a content valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS. PMID:24003951
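The Content Validity Index used to screen the framework's components is a simple proportion; a sketch for one item rated by a 10-expert panel, with hypothetical ratings:

```python
def item_cvi(ratings, relevant=(3, 4)):
    """Item-level Content Validity Index: the proportion of expert
    ratings falling in the 'relevant' categories (3 or 4 on the
    usual 4-point relevance scale)."""
    return sum(r in relevant for r in ratings) / len(ratings)

# Ratings of one trigger-tool item by a 10-expert panel (hypothetical):
cvi = item_cvi([4, 4, 3, 4, 3, 4, 4, 2, 3, 4])
```

For a 10-expert panel, an I-CVI of at least 0.78 is a commonly cited retention threshold, so this item (0.9) would be kept.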

  13. A General Framework for Power Analysis to Detect the Moderator Effects in Two- and Three-Level Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben

    2016-01-01

    The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…

  14. Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization.

    PubMed

    Adam, Asrul; Shapiai, Mohd Ibrahim; Tumari, Mohd Zaidi Mohd; Mohamad, Mohd Saberi; Mubin, Marizan

    2014-01-01

    Electroencephalogram (EEG) signal peak detection is widely used in clinical applications. The peak point can be detected using several approaches, including time, frequency, time-frequency, and nonlinear domains, depending on various peak features from several models. However, no study has established the importance of each peak feature in contributing to a good and generalized model. In this study, feature selection and classifier parameter estimation based on particle swarm optimization (PSO) are proposed as a framework for peak detection on EEG signals in time-domain analysis. Two versions of PSO are used in the study: (1) standard PSO and (2) random asynchronous particle swarm optimization (RA-PSO). The proposed framework tries to find the best combination of all the available features that offers good peak detection and a high classification rate in the conducted experiments. The evaluation results indicate that the accuracy of the peak detection can be improved up to 99.90% and 98.59% for training and testing, respectively, as compared to the framework without feature selection adaptation. Additionally, the proposed framework based on RA-PSO offers a better and more reliable classification rate as compared to standard PSO, as it produces a low-variance model.
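A minimal standard PSO of the kind the framework builds on can be sketched as follows; the paper encodes feature masks and classifier parameters in each particle, whereas this toy minimizes a plain continuous sphere function, and the swarm coefficients are conventional defaults rather than the paper's settings:

```python
import random

def pso(f, dim, n_particles=20, iters=60, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal standard PSO minimising f over a box; returns the
    best position and objective value found."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]            # personal best positions
    pval = [f(x) for x in X]
    g = min(range(n_particles), key=pval.__getitem__)
    gbest, gval = pbest[g][:], pval[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (pbest[i][d] - X[i][d])
                           + c2 * rng.random() * (gbest[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            val = f(X[i])
            if val < pval[i]:            # update personal best
                pval[i], pbest[i] = val, X[i][:]
                if val < gval:           # and the global best
                    gval, gbest = val, X[i][:]
    return gbest, gval

best, val = pso(lambda x: sum(v * v for v in x), dim=3)
```

The RA-PSO variant updates particles asynchronously in random order; the velocity and position rules are unchanged.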

  15. Community pharmacists' prescription intervention practices--exploring variations in practice in Norwegian pharmacies.

    PubMed

    Mandt, Ingunn; Horn, Anne Marie; Ekedahl, Anders; Granas, Anne Gerd

    2010-03-01

    Prescription intervention frequencies have been found to vary as much as 10-fold among Norwegian pharmacies and among pharmacists within the same pharmacy. The aim was to explore community pharmacists' perceptions of how their prescription intervention practices were influenced by their working environment, their technological resources, the physical and social structures of the pharmacies, their relations with colleagues, and the individual pharmacist's professional skills. Two focus groups, consisting of 14 community pharmacists in total from urban and rural areas in Norway, discussed their working procedures and professional judgments related to prescription interventions. Organizational theories were used as theoretical and analytical frameworks in the study. A framework based on Leavitt's organizational model was used to structure our interview guide. The study units were the statements of the individual pharmacists; recurrent themes were identified and condensed. Two processes describing variations in the dispensing workflow, including prescription interventions, were derived: an active dispensing process extracting information about the patient's medication from several sources, and a fast dispensing process focusing mainly on the information available on the prescription. Both workflow processes were used in the same pharmacies and by the same pharmacist, but on different occasions. A pharmacy layout allowing interactions between pharmacist and patients, and a convenient organization of technology, layout, and pharmacist-patient and pharmacist-coworker transactions at the workplace, were essential for detecting and solving prescription problems. Pharmacists limited their contact with general practitioners when they considered the problem a formality and/or when they knew the answers themselves. The combined use of dispensing software and the Internet was a driving force toward more independent and cognitively advanced prescription interventions. 
Implementation of a general organizational model made it easier to analyze and interpret the pharmacists' intervention practices. Working environment, technology, management and professional skills may all contribute to variations in pharmacists' prescription intervention practices in and between community pharmacies. Copyright 2010 Elsevier Inc. All rights reserved.

  16. A phylogenetic framework facilitates Y-STR variant discovery and classification via massively parallel sequencing.

    PubMed

    Huszar, Tunde I; Jobling, Mark A; Wetton, Jon H

    2018-04-12

    Short tandem repeats on the male-specific region of the Y chromosome (Y-STRs) are permanently linked as haplotypes, and therefore Y-STR sequence diversity can be considered within the robust framework of a phylogeny of haplogroups defined by single nucleotide polymorphisms (SNPs). Here we use massively parallel sequencing (MPS) to analyse the 23 Y-STRs in Promega's prototype PowerSeq™ Auto/Mito/Y System kit (containing the markers of the PowerPlex® Y23 [PPY23] System) in a set of 100 diverse Y chromosomes whose phylogenetic relationships are known from previous megabase-scale resequencing. Including allele duplications and alleles resulting from likely somatic mutation, we characterised 2311 alleles, demonstrating 99.83% concordance with capillary electrophoresis (CE) data on the same sample set. The set contains 267 distinct sequence-based alleles (an increase of 58% compared to the 169 detectable by CE), including 60 novel Y-STR variants phased with their flanking sequences which have not been reported previously to our knowledge. Variation includes 46 distinct alleles containing non-reference variants of SNPs/indels in both repeat and flanking regions, and 145 distinct alleles containing repeat pattern variants (RPV). For DYS385a,b, DYS481 and DYS390 we observed repeat count variation in short flanking segments previously considered invariable, and suggest new MPS-based structural designations based on these. We considered the observed variation in the context of the Y phylogeny: several specific haplogroup associations were observed for SNPs and indels, reflecting the low mutation rates of such variant types; however, RPVs showed less phylogenetic coherence and more recurrence, reflecting their relatively high mutation rates. 
In conclusion, our study reveals considerable additional diversity at the Y-STRs of the PPY23 set via MPS analysis, demonstrates high concordance with CE data, facilitates nomenclature standardisation, and places Y-STR sequence variants in their phylogenetic context. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Adaptive Framework for Classification and Novel Class Detection over Evolving Data Streams with Limited Labeled Data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haque, Ahsanul; Khan, Latifur; Baron, Michael

    2015-09-01

    Most approaches to classifying evolving data streams either divide the stream of data into fixed-size chunks or use gradual forgetting to address the problems of infinite length and concept drift. Finding the fixed size of the chunks or choosing a forgetting rate without prior knowledge about time-scale of change is not a trivial task. As a result, these approaches suffer from a trade-off between performance and sensitivity. To address this problem, we present a framework which uses change detection techniques on the classifier performance to determine chunk boundaries dynamically. Though this framework exhibits good performance, it is heavily dependent on the availability of true labels of data instances. However, labeled data instances are scarce in realistic settings and not readily available. Therefore, we present a second framework which is unsupervised in nature, and exploits change detection on classifier confidence values to determine chunk boundaries dynamically. In this way, it avoids the use of labeled data while still addressing the problems of infinite length and concept drift. Moreover, both of our proposed frameworks address the concept evolution problem by detecting outliers having similar values for the attributes. We provide theoretical proof that our change detection method works better than other state-of-the-art approaches in this particular scenario. Results from experiments on various benchmark and synthetic data sets also show the efficiency of our proposed frameworks.
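
    The unsupervised idea of triggering chunk boundaries from a drop in classifier confidence can be illustrated with a one-sided CUSUM detector. This is a generic stand-in, not the authors' change detection method; the drift allowance and threshold are arbitrary.

```python
def cusum_detect(stream, target_mean, drift=0.05, threshold=0.5):
    """One-sided CUSUM on a stream of confidence values: accumulate how
    far the stream falls below `target_mean` (minus a drift allowance)
    and flag the index where the cumulative sum exceeds `threshold`.
    Returns None if no change is detected."""
    s = 0.0
    for i, x in enumerate(stream):
        s = max(0.0, s + (target_mean - x) - drift)
        if s > threshold:
            return i
    return None
```

    A detected index would mark a chunk boundary, after which the classifier is updated and the statistic reset.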

  18. The Determinants of Low Fertility in India

    PubMed Central

    Dharmalingam, A.; Rajan, Sowmya; Morgan, S. Philip

    2015-01-01

    Using a conceptual framework focusing on factors that enhance or reduce fertility relative to desired family size (see Bongaarts 2001), we study fertility variation across time (1992–2006) and space (states) in India. Our empirical analyses use data from three waves of the Indian National Family Health Surveys. We find that this framework can account for a substantial portion of the variation in the TFR across the states and over time. Our estimates focus attention on the critical components of contemporary Indian fertility, especially desired family size, unwanted fertility, son preference, and fertility postponement. PMID:24993746

  19. A Framework of Change Detection Based on Combined Morphological Features and Multi-Index Classification

    NASA Astrophysics Data System (ADS)

    Li, S.; Zhang, S.; Yang, D.

    2017-09-01

    Remote sensing images are particularly well suited for analysis of land cover change. In this paper, we present a new framework for detecting land cover change using satellite imagery. Morphological features and a multi-index approach are used to extract typical objects from the imagery, including vegetation, water, bare land, buildings, and roads. Our method, based on connected domains, differs from traditional methods: image segmentation is used to extract morphological features, the enhanced vegetation index (EVI) and the normalized difference water index (NDWI) are used to extract vegetation and water, and a fragmentation index is used to correct the water extraction results. An HSV transformation and threshold segmentation are used to extract shadows and remove their effects on the extraction results. Change detection is then performed on these results. One advantage of the proposed framework is that semantic information is extracted automatically using low-level morphological features and indexes. Another is that the proposed method detects specific types of change without any training samples. A test on ZY-3 images demonstrates that our framework has a promising capability to detect change.
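
    As an illustration of the index computation step, a per-pixel NDWI in the McFeeters form, (Green − NIR) / (Green + NIR), can be written as below. The band values in the usage are invented, and a threshold of 0 is only a common starting point for a water mask, not the paper's calibrated rule.

```python
def ndwi(green, nir):
    """Normalized difference water index per pixel:
    (Green - NIR) / (Green + NIR). Water reflects green strongly and
    absorbs near-infrared, so NDWI > 0 is a common water-mask rule."""
    return [[(g - n) / (g + n) if (g + n) else 0.0
             for g, n in zip(g_row, n_row)]
            for g_row, n_row in zip(green, nir)]
```

    For example, a water pixel (high green, low NIR) yields a positive index, while a vegetation pixel (low green, high NIR) yields a negative one.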

  20. A Biological Signal-Based Stress Monitoring Framework for Children Using Wearable Devices.

    PubMed

    Choi, Yerim; Jeon, Yu-Mi; Wang, Lin; Kim, Kwanho

    2017-08-23

    The safety of children has always been an important issue, and several studies have been conducted to determine the stress state of a child in order to ensure their safety. Audio signals and biological signals, including heart rate, are known to be effective for stress state detection. However, collecting these data requires specialized equipment that is not appropriate for the constant monitoring of children, and advanced data analysis is required for accurate detection. In this regard, we propose a stress state detection framework which utilizes both the audio signal and the heart rate collected from wearable devices, and adopts machine learning methods for the detection. Experiments using real-world data were conducted to compare detection performance across various machine learning methods and noise levels of the audio signal. Adopting the proposed framework in the real world will contribute to the enhancement of child safety.

  1. Automatic temporal segment detection via bilateral long short-term memory recurrent neural networks

    NASA Astrophysics Data System (ADS)

    Sun, Bo; Cao, Siming; He, Jun; Yu, Lejun; Li, Liandong

    2017-03-01

    Constrained by the physiology, the temporal factors associated with human behavior, irrespective of facial movement or body gesture, are described by four phases: neutral, onset, apex, and offset. Although they may benefit related recognition tasks, it is not easy to accurately detect such temporal segments. An automatic temporal segment detection framework using bilateral long short-term memory recurrent neural networks (BLSTM-RNN) to learn high-level temporal-spatial features, which synthesizes the local and global temporal-spatial information more efficiently, is presented. The framework is evaluated in detail over the face and body database (FABO). The comparison shows that the proposed framework outperforms state-of-the-art methods for solving the problem of temporal segment detection.

  2. Very Large Graphs for Information Extraction (VLG) Detection and Inference in the Presence of Uncertainty

    DTIC Science & Technology

    2015-09-21

    this framework, MIT LL carried out a one-year proof-of-concept study to determine the capabilities and challenges in the detection of anomalies in...extremely large graphs [5]. Under this effort, two real datasets were considered, and algorithms for data modeling and anomaly detection were developed...is required in a well-defined experimental framework for the detection of anomalies in very large graphs. This study is intended to inform future

  3. Forward collision warning based on kernelized correlation filters

    NASA Astrophysics Data System (ADS)

    Pu, Jinchuan; Liu, Jun; Zhao, Yong

    2017-07-01

    A vehicle detection and tracking system is one of the indispensable methods to reduce the occurrence of traffic accidents. The nearest vehicle is the most likely to cause harm, so this paper focuses on the nearest vehicle in the region of interest (ROI). High accuracy, real-time operation and intelligence are the basic requirements for such a system. In this paper, we set up a system that combines the kernelized correlation filter (KCF) tracking algorithm with the Haar-AdaBoost detection algorithm. The KCF algorithm reduces computation time and increases speed through cyclic shifts and diagonalization, satisfying the real-time requirement. Haar features likewise offer simple operation and high detection speed. The combination of these two algorithms yields an obvious improvement in the system's running rate compared with previous works. The detection result of the Haar-AdaBoost classifier provides the initial value for the KCF algorithm; this removes the KCF algorithm's flaw of requiring manual vehicle marking in the initialization phase, making the system more scientific and more intelligent. Haar detection and KCF tracking with Histogram of Oriented Gradients (HOG) features ensure the accuracy of the system. We evaluate the performance of the framework on a self-collected dataset. The experimental results demonstrate that the proposed method is robust and real-time; it can effectively adapt to illumination variation and meets the detection and tracking requirements even at night, which is an improvement over previous work.

  4. Monitoring forest cover loss using multiple data streams, a case study of a tropical dry forest in Bolivia

    NASA Astrophysics Data System (ADS)

    Dutrieux, Loïc Paul; Verbesselt, Jan; Kooistra, Lammert; Herold, Martin

    2015-09-01

    Automatically detecting forest disturbances as they occur can be extremely challenging for certain types of environments, particularly those presenting strong natural variations. Here, we use a generic structural break detection framework (BFAST) to improve the monitoring of forest cover loss by combining multiple data streams. Forest change monitoring is performed using Landsat data in combination with MODIS or rainfall data to further improve the modelling and monitoring. We tested the use of the Normalized Difference Vegetation Index (NDVI) from the Moderate Resolution Imaging Spectroradiometer (MODIS), with varying spatial aggregation window sizes, as well as a rainfall-derived index, as external regressors. The method was evaluated on a dry tropical forest area in lowland Bolivia where forest cover loss is known to occur, and we validated the results against a set of ground truth samples manually interpreted using the TimeSync environment. We found that the addition of an external regressor makes it possible to exploit the difference in spatial extent between human-induced and naturally induced variations and to detect only the processes of interest. Of all configurations, we found the 13 by 13 km MODIS NDVI window to be the most successful, with an overall accuracy of 87%. Compared with a single-pixel approach, the proposed method produced better time-series model fits, resulting in an increase in overall accuracy (from 82% to 87%) and decreases in omission and commission errors (from 33% to 24% and from 3% to 0%, respectively). The presented approach seems particularly relevant for areas with high inter-annual natural variability, such as forests regularly experiencing exceptional drought events.
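
    The role of the external regressor can be sketched with a much-simplified stand-in for BFAST: fit a linear model of the monitored signal against the regressor over a stable history period, then flag a break when several consecutive observations fall far below the model's prediction. This illustrates the monitoring idea only; it is not the BFAST algorithm, and all constants are illustrative.

```python
def monitor_breaks(x, y, n_hist, k=3.0, run=3):
    """Fit y = a + b*x by least squares on the first n_hist stable
    observations, then return the index of the first of `run`
    consecutive residuals below -k history standard deviations
    (forest loss pulls the signal below what the external regressor
    predicts). Returns None if no break is found."""
    xs, ys = x[:n_hist], y[:n_hist]
    mx, my = sum(xs) / n_hist, sum(ys) / n_hist
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(xs, ys))
         / sum((xi - mx) ** 2 for xi in xs))
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(xs, ys)]
    sigma = (sum(r * r for r in resid) / n_hist) ** 0.5
    streak = 0
    for i in range(n_hist, len(y)):
        streak = streak + 1 if y[i] - (a + b * x[i]) < -k * sigma else 0
        if streak == run:
            return i - run + 1
    return None
```

    Requiring a run of consecutive low residuals, rather than a single one, is what lets locally noisy but spatially broad natural variation (captured by the regressor) pass while a persistent local drop is flagged.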

  5. Context-dependent logo matching and recognition.

    PubMed

    Sahbi, Hichem; Ballan, Lamberto; Serra, Giuseppe; Del Bimbo, Alberto

    2013-03-01

    In this paper we contribute to the design of a novel variational framework able to match and recognize multiple instances of multiple reference logos in image archives. Reference logos and test images are seen as constellations of local features (interest points, regions, etc.) and matched by minimizing an energy function mixing: 1) a fidelity term that measures the quality of feature matching, 2) a neighborhood criterion that captures feature co-occurrence/geometry, and 3) a regularization term that controls the smoothness of the matching solution. We also introduce a detection/recognition procedure and study its theoretical consistency. Finally, we show the validity of our method through extensive experiments on the challenging MICC-Logos dataset, where it outperforms baseline as well as state-of-the-art matching/recognition procedures by 20%.

  6. Tunable Impedance Spectroscopy Sensors via Selective Nanoporous Materials.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nenoff, Tina M.; Small, Leo J

    Impedance spectroscopy was leveraged to directly detect the sorption of I2 by selective adsorption into nanoporous metal-organic frameworks (MOFs). Films of three different types of MOF were drop cast onto platinum interdigitated electrodes, dried, and exposed to gaseous I2 at 25, 40, or 70 °C. The MOFs varied in topology from small-pore frameworks (pores equivalent to the I2 diameter) to large-pore frameworks. The combination of the chemistry of the framework and the pore size dictated the quantity and kinetics of I2 adsorption. Air, argon, methanol, and water were found to produce minimal changes in ZIF-8 impedance. Independent of MOF framework characteristics, all resultant sensors showed a high response to I2 in air. As an example of sensor output, I2 was readily detected at 25 °C in air within 720 s of exposure, using an un-optimized sensor geometry with a small-pored MOF. Further optimization of the sensor geometry, decreasing MOF film thicknesses and maximizing sensor capacitance, will enable faster detection of trace I2.

  7. An analysis of Australian graduate critical care nurse education.

    PubMed

    Gill, Fenella J; Leslie, Gavin D; Grech, Carol; Latour, Jos M

    2015-01-01

    Preparation of specialist critical care nurses in Australia is at graduate level, although there remains considerable variation in the courses offered in relation to qualification, content, assessment and outcomes. As higher education providers must now comply with the Australian Qualifications Framework (AQF), a study was conducted to examine existing critical care courses and graduate practice outcomes. Twenty-two critical care courses were reviewed. Data sources included the course provider, websites, course curricula and telephone interviews with course coordinators. A framework approach was used, consisting of five key stages: preliminary immersion in the raw data, conceptualising a thematic framework, indexing, charting, and mapping and interpretation of data. Analysis revealed considerable variation in course delivery and graduate practice outcomes. Most courses used professional competency standards as a framework for course curricula and clinical assessment, with inconsistency in their translation to graduate practice outcomes. Twenty-one courses included clinical assessment at graduate certificate level, with no clinical assessment conducted at master level. The expected practice outcome for fifteen courses was safe practice, with graduates not expected to practise at a specialist or team leadership level. Minimum graduate practice standards were not included as an expected outcome in three courses. The AQF requires graduate nurse education to be compliant with academic outcome standards. The findings of our study indicate variations between courses and subsequent graduate practice outcomes. It is therefore timely to establish national critical care education graduate practice standards.

  8. The Influence of Momentary Goal Structures

    ERIC Educational Resources Information Center

    Zaleski, Diana Janet

    2010-01-01

    Adolescents' cognition is influenced by a dynamic educational environment. Studies examining the influence of schools, classrooms, and teachers often overlook the momentary variation found in these environments and the effect this variation has on student cognition. Using an achievement goal theory framework, this study examined the momentary…

  9. Variation and Linguistic Theory.

    ERIC Educational Resources Information Center

    Bailey, Charles-James N.

    This volume presents principles and models for describing language variation, and introduces a time-based, dynamic framework for linguistic description. The book first summarizes some of the problems of grammatical description encountered from Saussure through the present and then outlines possibilities for new descriptions of language which take…

  10. ADM Analysis of gravity models within the framework of bimetric variational formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golovnev, Alexey; Karčiauskas, Mindaugas; Nyrhinen, Hannu J., E-mail: agolovnev@yandex.ru, E-mail: mindaugas.karciauskas@helsinki.fi, E-mail: hannu.nyrhinen@helsinki.fi

    2015-05-01

    Bimetric variational formalism was recently employed to construct novel bimetric gravity models. In these models an affine connection is generated by an additional tensor field which is independent of the physical metric. In this work we demonstrate how the ADM decomposition can be applied to study such models and provide some technical intermediate details. Using ADM decomposition we are able to prove that a linear model is unstable as has previously been indicated by perturbative analysis. Moreover, we show that it is also very difficult if not impossible to construct a non-linear model which is ghost-free within the framework of bimetric variational formalism. However, we demonstrate that viable models are possible along similar lines of thought. To this end, we consider a set up in which the affine connection is a variation of the Levi-Civita one. As a proof of principle we construct a gravity model with a massless scalar field obtained this way.

  11. State Policy Regimes and Charter School Performance

    ERIC Educational Resources Information Center

    Pelz, Mikael L.

    2015-01-01

    The policy diffusion framework is critical to understanding the spread of policy innovations such as charter schools in the United States. This framework, however, is less instructive in explaining the state-by-state configuration of these policies. What explains the wide variation in charter school policy among states? This study addresses this…

  12. Computer Technology, Large-Scale Social Integration, and the Local Community.

    ERIC Educational Resources Information Center

    Calhoun, Craig

    1986-01-01

    A conceptual framework is proposed for studying variations in kind and extent of social integration and relatedness, such as those new communication technology may foster. Emphasis is on the contrast between direct and indirect social relationships. The framework is illustrated by consideration of potential social impacts of widespread…

  13. Competency Mapping Framework for Regulating Professionally Oriented Degree Programmes in Higher Education

    ERIC Educational Resources Information Center

    Perera, Srinath; Babatunde, Solomon Olusola; Zhou, Lei; Pearson, John; Ekundayo, Damilola

    2017-01-01

    Recognition of the huge variation between professional graduate degree programmes and employer requirements, especially in the construction industry, necessitated a need for assessing and developing competencies that aligned with professionally oriented programmes. The purpose of this research is to develop a competency mapping framework (CMF) in…

  14. Evaluation of an Integrated Framework for Biodiversity with a New Metric for Functional Dispersion

    PubMed Central

    Presley, Steven J.; Scheiner, Samuel M.; Willig, Michael R.

    2014-01-01

    Growing interest in understanding ecological patterns from phylogenetic and functional perspectives has driven the development of metrics that capture variation in evolutionary histories or ecological functions of species. Recently, an integrated framework based on Hill numbers was developed that measures three dimensions of biodiversity based on abundance, phylogeny and function of species. This framework is highly flexible, allowing comparison of those diversity dimensions, including different aspects of a single dimension and their integration into a single measure. The behavior of those metrics with regard to variation in data structure has not been explored in detail, yet is critical for ensuring an appropriate match between the concept and its measurement. We evaluated how each metric responds to particular data structures and developed a new metric for functional biodiversity. The phylogenetic metric is sensitive to variation in the topology of phylogenetic trees, including variation in the relative lengths of basal, internal and terminal branches. In contrast, the functional metric exhibited multiple shortcomings: (1) species that are functionally redundant contribute nothing to functional diversity and (2) a single highly distinct species causes functional diversity to approach the minimum possible value. We introduced an alternative, improved metric based on functional dispersion that solves both of these problems. In addition, the new metric exhibited more desirable behavior when based on multiple traits. PMID:25148103

  15. Mining Recent Temporal Patterns for Event Detection in Multivariate Time Series Data

    PubMed Central

    Batal, Iyad; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    Improving the performance of classifiers using pattern mining techniques has been an active topic of data mining research. In this work we introduce the recent temporal pattern mining framework for finding predictive patterns for monitoring and event detection problems in complex multivariate time series data. This framework first converts time series into time-interval sequences of temporal abstractions. It then constructs more complex temporal patterns backwards in time using temporal operators. We apply our framework to health care data of 13,558 diabetic patients and show its benefits by efficiently finding useful patterns for detecting and diagnosing adverse medical conditions that are associated with diabetes. PMID:25937993
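
    The abstraction step, turning a raw numeric series into time-interval sequences, can be sketched as a simple trend segmentation. The three-state alphabet below is illustrative only, not the paper's exact abstraction scheme.

```python
def trend_abstractions(values, eps=0.0):
    """Convert a numeric series into (state, start, end) intervals,
    with state in {'rising', 'falling', 'steady'}, merging consecutive
    steps that share a trend direction (|delta| <= eps is 'steady')."""
    def step_state(a, b):
        d = b - a
        return "steady" if abs(d) <= eps else ("rising" if d > 0 else "falling")
    intervals = []
    for i in range(1, len(values)):
        s = step_state(values[i - 1], values[i])
        if intervals and intervals[-1][0] == s:
            intervals[-1] = (s, intervals[-1][1], i)  # extend the current interval
        else:
            intervals.append((s, i - 1, i))
    return intervals
```

    Temporal operators (e.g., "A before B", "A co-occurs with B") would then combine such intervals into candidate patterns, constructed backwards in time from the event of interest.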

  16. On-line early fault detection and diagnosis of municipal solid waste incinerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao Jinsong; Huang Jianchao; Sun Wei

    A fault detection and diagnosis framework is proposed in this paper for early fault detection and diagnosis (FDD) of municipal solid waste incinerators (MSWIs), in order to improve the safety and continuity of production. In this framework, principal component analysis (PCA), one of the multivariate statistical technologies, is used for detecting abnormal events, while rule-based reasoning performs the fault diagnosis and consequence prediction and generates recommendations for fault mitigation once an abnormal event is detected. A software package, SWIFT, is developed based on the proposed framework and has been applied in an actual industrial MSWI. The application shows that automated real-time abnormal situation management (ASM) of the MSWI can be achieved using SWIFT, with an industrially acceptable low rate of wrong diagnoses, resulting in improved process continuity and environmental performance of the MSWI.
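
    The rule-based reasoning stage can be sketched as a lookup over abnormal-event tags produced by the statistical detection stage. The tags, rules, and recommendations below are invented for illustration and are not taken from SWIFT.

```python
# Hypothetical diagnosis rules: each maps a set of abnormal tags to a
# diagnosis and a mitigation recommendation (illustrative only).
RULES = [
    ({"furnace_temp_low", "o2_high"}, "incomplete combustion",
     "reduce excess air; check waste feed moisture"),
    ({"steam_flow_low"}, "boiler fouling suspected",
     "schedule soot-blowing cycle"),
]

def diagnose(abnormal_tags):
    """Return (diagnosis, recommendation) pairs for every rule whose
    condition set is fully contained in the detected abnormal tags."""
    tags = set(abnormal_tags)
    return [(diag, rec) for cond, diag, rec in RULES if cond <= tags]
```

    In a PCA-based pipeline, the tags would be derived from which monitored variables contribute most to an out-of-limit statistic, and the matched recommendations would be surfaced to operators.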

  17. A multi-scale comparison of trait linkages to environmental and spatial variables in fish communities across a large freshwater lake.

    PubMed

    Strecker, Angela L; Casselman, John M; Fortin, Marie-Josée; Jackson, Donald A; Ridgway, Mark S; Abrams, Peter A; Shuter, Brian J

    2011-07-01

    Species present in communities are affected by the prevailing environmental conditions, and the traits that these species display may be sensitive indicators of community responses to environmental change. However, interpretation of community responses may be confounded by environmental variation at different spatial scales. Using a hierarchical approach, we assessed the spatial and temporal variation of traits in coastal fish communities in Lake Huron over a 5-year time period (2001-2005) in response to biotic and abiotic environmental factors. The association of environmental and spatial variables with trophic, life-history, and thermal traits at two spatial scales (regional basin-scale, local site-scale) was quantified using multivariate statistics and variation partitioning. We defined these two scales (regional, local) on which to measure variation and then applied this measurement framework identically in all 5 study years. With this framework, we found that there was no change in the spatial scales of fish community traits over the course of the study, although there were small inter-annual shifts in the importance of regional basin- and local site-scale variables in determining community trait composition (e.g., life-history, trophic, and thermal). The overriding effects of regional-scale variables may be related to inter-annual variation in average summer temperature. Additionally, drivers of fish community traits were highly variable among study years, with some years dominated by environmental variation and others dominated by spatially structured variation. The influence of spatial factors on trait composition was dynamic, which suggests that spatial patterns in fish communities over large landscapes are transient. Air temperature and vegetation were significant variables in most years, underscoring the importance of future climate change and shoreline development as drivers of fish community structure. 
Overall, a trait-based hierarchical framework may be a useful conservation tool, as it highlights the multi-scaled interactive effect of variables over a large landscape.

  18. Revisiting the Idea of "Critical Aspects"

    ERIC Educational Resources Information Center

    Pang, Ming Fai; Ki, Wing Wah

    2016-01-01

    Over the years, two new strands of research have evolved from the phenomenographic research tradition: the first concerns advancement of the variation theory of learning, whilst the second involves development of the learning study approach. In this paper, the conceptual frameworks of phenomenography, variation theory, and learning studies are…

  19. Principal Stratification: A Tool for Understanding Variation in Program Effects across Endogenous Subgroups

    ERIC Educational Resources Information Center

    Page, Lindsay C.; Feller, Avi; Grindal, Todd; Miratrix, Luke; Somers, Marie-Andree

    2015-01-01

    Increasingly, researchers are interested in questions regarding treatment-effect variation across partially or fully latent subgroups defined not by pretreatment characteristics but by postrandomization actions. One promising approach to address such questions is principal stratification. Under this framework, a researcher defines endogenous…

  20. Detection of a slow-flow component in contrast-enhanced ultrasound of the synovia for the differential diagnosis of arthritis

    NASA Astrophysics Data System (ADS)

    Rizzo, Gaia; Tonietto, Matteo; Castellaro, Marco; Raffeiner, Bernd; Coran, Alessandro; Fiocco, Ugo; Stramare, Roberto; Grisan, Enrico

    2017-03-01

    Contrast Enhanced Ultrasound (CEUS) is a sensitive imaging technique to assess tissue vascularity that can be useful in the quantification of different perfusion patterns. This can be particularly important in the early detection and differentiation of different types of arthritis. A Gamma-variate can accurately quantify synovial perfusion and is flexible enough to describe many heterogeneous patterns. However, in some cases the heterogeneity of the kinetics can be such that even the Gamma model does not properly describe the curve, especially in the presence of recirculation or of an additional slow-flow component. In this work we apply to CEUS data both the Gamma-variate and the single compartment recirculation model (SCR), which explicitly takes into account an additional slow-flow component. The models are solved within a Bayesian framework. We also employed the perfusion estimates obtained with the SCR model to train a support vector machine classifier to distinguish different types of arthritis. When dividing the patients into two groups (rheumatoid arthritis and polyarticular RA-like psoriatic arthritis vs. other arthritis types), the slow component amplitude was significantly different across groups: the mean value of a1 and its variability were statistically higher in RA and RA-like patients (131% increase in mean, p = 0.035, and 73% increase in standard deviation, p = 0.049, respectively). The SVM classifier achieved a balanced accuracy of 89%, with a sensitivity of 100% and a specificity of 78%.
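
    For reference, the Gamma-variate bolus model has the standard form C(t) = A·(t − t0)^α·exp(−(t − t0)/β) for t > t0 (zero before bolus arrival), with its peak at t = t0 + α·β. A minimal implementation follows; the parameter values used in the check are illustrative, not fitted to any CEUS data.

```python
import math

def gamma_variate(t, A, t0, alpha, beta):
    """Gamma-variate bolus model: A*(t - t0)**alpha * exp(-(t - t0)/beta)
    for t > t0, else 0.0. The curve peaks at t = t0 + alpha*beta."""
    if t <= t0:
        return 0.0
    dt = t - t0
    return A * dt ** alpha * math.exp(-dt / beta)
```

    Fitting A, t0, alpha and beta to a measured time-intensity curve (here done within a Bayesian framework) yields the perfusion estimates; the SCR model adds a slow-flow term on top of this kinetic.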

  1. Variation in detection of ductal carcinoma in situ (DCIS) during screening mammography: a survey within the International Cancer Screening Network (ICSN)

    PubMed Central

    Lynge, Elsebeth; Ponti, Antonio; James, Ted; Májek, Ondřej; von Euler-Chelpin, My; Anttila, Ahti; Fitzpatrick, Patricia; Frigerio, Alfonso; Kawai, Masaaki; Scharpantgen, Astrid; Broeders, Mireille; Hofvind, Solveig; Vidal, Carmen; Ederra, Maria; Salas, Dolores; Bulliard, Jean-Luc; Tomatis, Mariano; Kerlikowske, Karla; Taplin, Stephen

    2013-01-01

    Background There is concern about the detection of Ductal Carcinoma in Situ (DCIS) in screening mammography. DCIS accounts for a substantial proportion of screen-detected lesions, but its effect on breast cancer mortality is debated. The International Cancer Screening Network conducted a comparative analysis to determine variation in DCIS detection. Patients and Methods Data were collected during 2004–2008 on the number of screening examinations, detected breast cancers, and DCIS cases, together with Globocan 2008 breast cancer incidence rates derived from national or regional cancer registers. We calculated screen-detection rates for breast cancers and DCIS. Results Data were obtained from 15 screening settings in 12 countries, covering 7,176,050 screening examinations, 29,605 breast cancers, and 5,324 DCIS cases. The ratio between the highest and lowest rates was 2.88 (95% confidence interval (CI) 2.76–3.00) for breast cancer incidence, 2.97 (95% CI 2.51–3.51) for breast cancer detection, and 3.49 (95% CI 2.70–4.51) for DCIS detection. Conclusions Considerable international variation was found in DCIS detection. This variation could not be fully explained by variation in incidence or in breast cancer detection rates. It suggests the potential for wide discrepancies in the management of DCIS, resulting in overtreatment of indolent DCIS or undertreatment of potentially curable disease. Comprehensive cancer registration is needed to monitor DCIS detection. Efforts to understand discrepancies and standardize management may improve care. PMID:24041876

  2. Defect Detection and Segmentation Framework for Remote Field Eddy Current Sensor Data

    PubMed Central

    2017-01-01

    Remote-Field Eddy-Current (RFEC) technology is often used as a Non-Destructive Evaluation (NDE) method to prevent water pipe failures. By analyzing the RFEC data, it is possible to quantify the corrosion present in pipes. Quantifying the corrosion involves detecting defects and extracting their depth and shape. For large sections of pipelines, this can be extremely time-consuming if performed manually. Automated approaches are therefore well motivated. In this article, we propose an automated framework to locate and segment defects in individual pipe segments, starting from raw RFEC measurements taken over large pipelines. The framework relies on a novel feature to robustly detect these defects and a segmentation algorithm applied to the deconvolved RFEC signal. The framework is evaluated using both simulated and real datasets, demonstrating its ability to efficiently segment the shape of corrosion defects. PMID:28984823

  3. Defect Detection and Segmentation Framework for Remote Field Eddy Current Sensor Data.

    PubMed

    Falque, Raphael; Vidal-Calleja, Teresa; Miro, Jaime Valls

    2017-10-06

    Remote-Field Eddy-Current (RFEC) technology is often used as a Non-Destructive Evaluation (NDE) method to prevent water pipe failures. By analyzing the RFEC data, it is possible to quantify the corrosion present in pipes. Quantifying the corrosion involves detecting defects and extracting their depth and shape. For large sections of pipelines, this can be extremely time-consuming if performed manually. Automated approaches are therefore well motivated. In this article, we propose an automated framework to locate and segment defects in individual pipe segments, starting from raw RFEC measurements taken over large pipelines. The framework relies on a novel feature to robustly detect these defects and a segmentation algorithm applied to the deconvolved RFEC signal. The framework is evaluated using both simulated and real datasets, demonstrating its ability to efficiently segment the shape of corrosion defects.

  4. Toward automated face detection in thermal and polarimetric thermal imagery

    NASA Astrophysics Data System (ADS)

    Gordon, Christopher; Acosta, Mark; Short, Nathan; Hu, Shuowen; Chan, Alex L.

    2016-05-01

    Visible spectrum face detection algorithms perform reliably under controlled lighting conditions. However, variations in illumination and the application of cosmetics can distort the features used by common face detectors, thereby degrading their detection performance. Thermal and polarimetric thermal facial imaging are relatively invariant to illumination and robust to the application of makeup, because they measure emitted radiation instead of reflected light. The objective of this work is to evaluate a government off-the-shelf wavelet-based naïve-Bayes face detection algorithm and a commercial off-the-shelf Viola-Jones cascade face detection algorithm on face imagery acquired in different spectral bands. New classifiers were trained using the Viola-Jones cascade object detection framework with preprocessed facial imagery. Preprocessing with Difference of Gaussians (DoG) filtering reduces the modality gap between facial signatures across the different spectral bands, enabling more correlated histogram of oriented gradients (HOG) features to be extracted from the preprocessed thermal and visible face images. Since the availability of training data is much more limited in the thermal spectrum than in the visible spectrum, it is not feasible to train a robust multi-modal face detector using thermal imagery alone. A large training dataset was therefore constituted from DoG-filtered visible and thermal imagery and used to generate a custom-trained Viola-Jones detector. A 40% increase in face detection rate was achieved on a testing dataset compared to the performance of a pre-trained baseline face detector. Insights gained in this research are valuable for the development of more robust multi-modal face detectors.
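    The Difference-of-Gaussians preprocessing described above is a band-pass filter built from two blurs. A minimal 1-D sketch (a 2-D version applies the same kernels along rows and columns; the sigma values here are arbitrary, not the ones used in the paper):

```python
import math

def gaussian_kernel(sigma, radius):
    """Normalized 1-D Gaussian kernel of length 2*radius + 1."""
    k = [math.exp(-(x * x) / (2.0 * sigma * sigma)) for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def dog_kernel(sigma_narrow, sigma_wide, radius):
    """Difference of two normalized Gaussians: a zero-sum band-pass kernel."""
    g1 = gaussian_kernel(sigma_narrow, radius)
    g2 = gaussian_kernel(sigma_wide, radius)
    return [a - b for a, b in zip(g1, g2)]

def convolve1d(signal, kernel):
    """Direct convolution with border clamping (replicate edge values)."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)
            acc += w * signal[idx]
        out.append(acc)
    return out
```

Because the kernel sums to zero, a constant (illumination-offset) signal maps to zero, which is the property that shrinks the modality gap between spectral bands.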

  5. A methodological framework for the evaluation of syndromic surveillance systems: a case study of England.

    PubMed

    Colón-González, Felipe J; Lake, Iain R; Morbey, Roger A; Elliot, Alex J; Pebody, Richard; Smith, Gillian E

    2018-04-24

    Syndromic surveillance complements traditional public health surveillance by collecting and analysing health indicators in near real time. The rationale of syndromic surveillance is that it may detect health threats faster than traditional surveillance systems, permitting more timely, and hence potentially more effective, public health action. The effectiveness of syndromic surveillance largely relies on the methods used to detect aberrations. Very few studies have evaluated the performance of syndromic surveillance systems, and consequently little is known about the types of events that such systems can and cannot detect. We introduce a framework for the evaluation of syndromic surveillance systems, based on simulated scenarios, that can be used in any setting. For a range of scenarios, this allows the time to detection and the probability of detection to be determined, with uncertainty fully incorporated. In addition, we demonstrate how such a framework can model the benefits of increasing the number of centres reporting syndromic data and determine the minimum size of outbreak that can or cannot be detected. Here, we demonstrate its utility using simulations of national influenza outbreaks and localised outbreaks of cryptosporidiosis. Influenza outbreaks are consistently detected, with larger outbreaks detected in a more timely manner. Small cryptosporidiosis outbreaks (<1000 symptomatic individuals) are unlikely to be detected. We also demonstrate the advantages of having multiple syndromic data streams (e.g. emergency attendance data, telephone helpline data, general practice consultation data), as different streams detect different outbreak types with different efficacy (e.g. emergency attendance data are useful for the detection of pandemic influenza but not for outbreaks of cryptosporidiosis). We also highlight that for any one disease the utility of data streams may vary geographically, and that the detection ability of syndromic surveillance varies seasonally (e.g. an influenza outbreak starting in July is detected sooner than one starting later in the year). We argue that the proposed framework allows the exhaustive evaluation of any syndromic surveillance system and constitutes a useful tool for public health emergency preparedness and response in multiple settings.
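    The aberration detectors that such a framework evaluates can be as simple as an exceedance rule over a recent baseline. A deliberately minimal sketch (real syndromic systems use more elaborate regression-based detectors; the window length and threshold here are arbitrary):

```python
import statistics

def detect_aberrations(counts, baseline_window=7, threshold_sd=2.0):
    """Flag each day whose count exceeds mean + k*sd of the preceding window.

    Returns a list of booleans, one per day; the first `baseline_window`
    days are never flagged because no baseline exists yet.
    """
    flags = []
    for i, c in enumerate(counts):
        if i < baseline_window:
            flags.append(False)
            continue
        window = counts[i - baseline_window:i]
        mu = statistics.mean(window)
        sd = statistics.pstdev(window)
        # Small epsilon so a perfectly flat baseline is not flagged by noise.
        flags.append(c > mu + threshold_sd * sd + 1e-9)
    return flags
```

Simulated-scenario evaluation of the kind the paper proposes injects synthetic outbreaks into such count streams and records whether, and how quickly, the detector fires.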

  6. Addressing multi-label imbalance problem of surgical tool detection using CNN.

    PubMed

    Sahu, Manish; Mukhopadhyay, Anirban; Szengel, Angelika; Zachow, Stefan

    2017-06-01

    A fully automated surgical tool detection framework is proposed for endoscopic video streams. State-of-the-art surgical tool detection methods rely on supervised one-vs-all or multi-class classification techniques, completely ignoring the co-occurrence relationship of the tools and the associated class imbalance. In this paper, we formulate tool detection as a multi-label classification task where tool co-occurrences are treated as separate classes. In addition, imbalance on tool co-occurrences is analyzed and stratification techniques are employed to address the imbalance during convolutional neural network (CNN) training. Moreover, temporal smoothing is introduced as an online post-processing step to enhance runtime prediction. Quantitative analysis is performed on the M2CAI16 tool detection dataset to highlight the importance of stratification, temporal smoothing and the overall framework for tool detection. The analysis on tool imbalance, backed by the empirical results, indicates the need and superiority of the proposed framework over state-of-the-art techniques.
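    The online temporal smoothing mentioned above can be sketched as a streaming moving average over per-frame presence probabilities. The window size is illustrative and the paper's exact smoothing scheme may differ:

```python
from collections import deque

def temporal_smooth(prob_stream, window=5):
    """Online moving-average smoothing of per-frame tool-presence scores.

    Each output uses only the current frame and up to `window - 1`
    previous frames, so it can run during live prediction.
    """
    buf = deque(maxlen=window)
    out = []
    for p in prob_stream:
        buf.append(p)
        out.append(sum(buf) / len(buf))
    return out
```

Smoothing of this kind suppresses single-frame flicker in the CNN's per-tool outputs at negligible runtime cost.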

  7. An efficient semi-supervised community detection framework in social networks.

    PubMed

    Li, Zhen; Gong, Yong; Pan, Zhisong; Hu, Guyu

    2017-01-01

    Community detection is an important task across a number of research fields, including social science, biology, and physics. In the real world, topology information alone is often inadequate to accurately identify community structure because of its sparsity and noise. Potentially useful prior information, such as pairwise constraints containing must-link and cannot-link constraints, can be obtained from domain knowledge in many applications. Thus, combining network topology with prior information to improve community detection accuracy is promising. Previous methods mainly utilize the must-link constraints but cannot make full use of the cannot-link constraints. In this paper, we propose a semi-supervised community detection framework that can effectively incorporate both types of pairwise constraints into the detection process. In particular, must-link and cannot-link constraints are represented as positive and negative links, and we encode them by adding different graph regularization terms to penalize closeness of the nodes. Experiments on multiple real-world datasets show that the proposed framework significantly improves the accuracy of community detection.
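    The encoding of constraints as positive and negative links can be sketched as a constraint-augmented adjacency matrix. The weights below are illustrative, not the paper's regularization terms:

```python
def encode_constraints(adjacency, must_link, cannot_link, w_pos=1.0, w_neg=1.0):
    """Fold pairwise constraints into a copy of a dense adjacency matrix.

    Must-link pairs receive an added positive weight (encouraging the
    pair to share a community); cannot-link pairs receive a negative
    weight (penalizing closeness). The input matrix is not mutated.
    """
    a = [row[:] for row in adjacency]
    for i, j in must_link:
        a[i][j] += w_pos
        a[j][i] += w_pos
    for i, j in cannot_link:
        a[i][j] -= w_neg
        a[j][i] -= w_neg
    return a
```

A downstream community detector (e.g. a matrix-factorization or modularity method) then operates on the augmented matrix instead of the raw topology.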

  8. Feature Selection and Classifier Parameters Estimation for EEG Signals Peak Detection Using Particle Swarm Optimization

    PubMed Central

    Adam, Asrul; Mohd Tumari, Mohd Zaidi; Mohamad, Mohd Saberi

    2014-01-01

    Electroencephalogram (EEG) signal peak detection is widely used in clinical applications. The peak point can be detected using several approaches, including time, frequency, time-frequency, and nonlinear domains, depending on various peak features from several models. However, no study has assessed the importance of each peak feature in contributing to a good and generalized model. In this study, feature selection and classifier parameter estimation based on particle swarm optimization (PSO) are proposed as a framework for peak detection in EEG signals in time-domain analysis. Two versions of PSO are used in the study: (1) standard PSO and (2) random asynchronous particle swarm optimization (RA-PSO). The proposed framework seeks the best combination of all the available features that offers good peak detection and a high classification rate in the conducted experiments. The evaluation results indicate that peak detection accuracy can be improved up to 99.90% and 98.59% for training and testing, respectively, compared to the framework without feature selection adaptation. Additionally, the proposed framework based on RA-PSO offers a better and more reliable classification rate than standard PSO, as it produces a low-variance model. PMID:25243236
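    The standard synchronous PSO update underlying both variants can be sketched on a toy objective (the RA-PSO variant updates particles asynchronously in random order; here a simple sphere function stands in for the paper's feature-selection/classification fitness, and all hyperparameters are illustrative):

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal synchronous particle swarm optimizer.

    Velocity update: inertia + cognitive pull toward the particle's own
    best + social pull toward the swarm's best.
    """
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso_minimize(lambda x: sum(v * v for v in x), dim=3)
```

For feature selection, the position vector is typically thresholded into a binary mask over the candidate peak features, and the fitness is the classifier's validation accuracy.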

  9. Comparison of an adaptive local thresholding method on CBCT and µCT endodontic images

    NASA Astrophysics Data System (ADS)

    Michetti, Jérôme; Basarab, Adrian; Diemer, Franck; Kouame, Denis

    2018-01-01

    Root canal segmentation on cone beam computed tomography (CBCT) images is difficult because of the noise level, resolution limitations, beam hardening and dental morphological variations. An image processing framework based on an adaptive local threshold method was evaluated on CBCT images acquired from extracted teeth. A comparison with high-quality segmented endodontic images from micro computed tomography (µCT) acquisitions of the same teeth was carried out using a dedicated registration process. Each segmented tooth was evaluated with respect to volume and root canal sections through the area and the Feret diameter. The proposed method is shown to overcome the limitations of CBCT and to provide an automated and adaptive complete endodontic segmentation. Despite a slight underestimation (-4.08%), the edge-detection-based local threshold segmentation method was shown to be fast and accurate. Strong correlations between CBCT and µCT segmentations were found for both the root canal area and diameter (0.98 and 0.88, respectively). Our findings suggest that combining CBCT imaging with this image processing framework may benefit experimental endodontology and teaching, and could represent a first development step towards the clinical use of endodontic CBCT segmentation during pulp cavity treatment.
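    The generic adaptive local thresholding idea can be sketched as comparing each pixel to the mean of its neighborhood. This is only the basic building block; the paper's method additionally couples the threshold with edge detection, which is not reproduced here:

```python
def local_threshold(image, radius=1, bias=0.0):
    """Binarize a 2-D grayscale image (list of lists) adaptively.

    A pixel is foreground (1) if it exceeds the mean of its
    (2*radius+1)^2 neighborhood plus an optional bias; borders use the
    truncated neighborhood.
    """
    h, w = len(image), len(image[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += image[yy][xx]
                        n += 1
            row.append(1 if image[y][x] > acc / n + bias else 0)
        out.append(row)
    return out
```

Because the threshold adapts to local intensity, the rule tolerates the slowly varying brightness (e.g. from beam hardening) that defeats a single global threshold.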

  10. Fuzzy logic based sensor performance evaluation of vehicle mounted metal detector systems

    NASA Astrophysics Data System (ADS)

    Abeynayake, Canicious; Tran, Minh D.

    2015-05-01

    Vehicle Mounted Metal Detector (VMMD) systems are widely used for the detection of threat objects in humanitarian demining and military route clearance scenarios. Given the diverse nature of such operational conditions, operational use of VMMD without a proper understanding of its capability boundaries may lead to heavy casualties. Multi-criteria fitness evaluations are crucial for determining the capability boundaries of any sensor-based demining equipment. Evaluation of sensor-based military equipment is a multi-disciplinary topic combining the efforts of researchers, operators, managers and commanders with different professional backgrounds and knowledge profiles. Information acquired through field tests usually involves uncertainty, vagueness and imprecision due to variations in test and evaluation conditions during a single test or a series of tests. This report presents a fuzzy logic based methodology for experimental data analysis and performance evaluation of VMMD. The methodology has been developed to evaluate sensor performance by consolidating expert knowledge with experimental data. A case study is presented by applying the proposed data analysis framework in a VMMD evaluation scenario. The results of this analysis confirm the accuracy, practicability and reliability of the fuzzy logic based sensor performance evaluation framework.

  11. Real-time video analysis for retail stores

    NASA Astrophysics Data System (ADS)

    Hassan, Ehtesham; Maurya, Avinash K.

    2015-03-01

    With the advancement of video processing technologies, we can capture subtle human responses in a retail store environment that play a decisive role in store management. In this paper, we present a novel surveillance-video-based analytic system for retail stores targeting localized and global traffic estimates. Developing an intelligent system for human traffic estimation in real life poses a challenging problem because of the variation and noise involved. In this direction, we begin with a novel human tracking system based on an intelligent combination of motion-based and image-level object detection. We demonstrate an initial evaluation of this approach on an available standard dataset, yielding promising results. Accurate traffic estimation in a retail store requires correct separation of customers from service providers. We present a role-based human classification framework using a Gaussian mixture model for this task. A novel feature descriptor, named the graded colour histogram, is defined for object representation. Using our role-based human classification and tracking system, we define a novel, computationally efficient framework for generating two types of analytics: region-specific people counts and dwell-time estimates. This system has been extensively evaluated and tested on four hours of real-life video captured from a retail store.

  12. Using the knowledge-to-action framework to guide the timing of dialysis initiation.

    PubMed

    Sood, Manish M; Manns, Braden; Nesrallah, Gihad

    2014-05-01

    The optimal time at which to initiate chronic dialysis remains unknown. Using a contemporary knowledge translation approach (the knowledge-to-action framework), a pan-Canadian collaboration (CANN-NET) set out to study the scope of the problem, then develop and disseminate evidence-based guidelines addressing the timing of dialysis initiation. The purpose of this review is to summarize the key findings and describe the planned Canadian knowledge translation strategy for improving knowledge and practices pertaining to the timing of dialysis initiation. New research has provided considerable insight into the initiation of dialysis. A Canadian cohort study identified significant variation in the estimated glomerular filtration rate at dialysis initiation, and a survey of providers identified related knowledge gaps that might be amenable to knowledge translation interventions. A recent knowledge synthesis/guideline concluded that early dialysis initiation is costly and provides no measurable clinical benefit. A systematic knowledge translation intervention with a multifaceted approach may help reduce variation in practice and improve the quality of care. Using the knowledge-to-action framework, we identified practice variation and key barriers to optimal timing of dialysis initiation that may be amenable to knowledge translation strategies.

  13. Modeling Geomagnetic Variations using a Machine Learning Framework

    NASA Astrophysics Data System (ADS)

    Cheung, C. M. M.; Handmer, C.; Kosar, B.; Gerules, G.; Poduval, B.; Mackintosh, G.; Munoz-Jaramillo, A.; Bobra, M.; Hernandez, T.; McGranaghan, R. M.

    2017-12-01

    We present a framework for data-driven modeling of Heliophysics time series data. The Solar Terrestrial Interaction Neural net Generator (STING) is an open source Python module built on top of state-of-the-art statistical learning frameworks (traditional machine learning methods as well as deep learning). To showcase the capability of STING, we deploy it for the problem of predicting the temporal variation of geomagnetic fields. The data used include solar wind measurements from the OMNI database and geomagnetic field data taken by magnetometers at US Geological Survey observatories. We examine the predictive capability of different machine learning techniques (recurrent neural networks, support vector machines) for a range of forecasting times (minutes to 12 hours). STING is designed to be extensible to other types of data. We show how STING can be used on large sets of data from different sensors/observatories and adapted to tackle other problems in Heliophysics.
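    The supervised-learning formulation behind such forecasting can be sketched as windowing a time series into lagged feature vectors and lead targets. This is the generic preprocessing step, not STING's actual API:

```python
def make_lag_features(series, n_lags, horizon):
    """Turn a univariate series into (X, y) pairs for supervised learning.

    Each X row holds the n_lags most recent values; the matching y is
    the value `horizon` steps ahead of the window's end. Multivariate
    inputs (e.g. solar wind parameters) extend each row with their own
    lagged values.
    """
    X, y = [], []
    for i in range(n_lags, len(series) - horizon + 1):
        X.append(series[i - n_lags:i])
        y.append(series[i + horizon - 1])
    return X, y

# Toy example: forecast 2 steps ahead from the last 3 samples.
X, y = make_lag_features(list(range(10)), n_lags=3, horizon=2)
```

Any regressor, from a support vector machine to a recurrent network, can then be trained on the resulting (X, y) pairs, with the horizon swept from minutes to hours as in the paper.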

  14. Effective Vehicle-Based Kangaroo Detection for Collision Warning Systems Using Region-Based Convolutional Networks.

    PubMed

    Saleh, Khaled; Hossny, Mohammed; Nahavandi, Saeid

    2018-06-12

    Traffic collisions between kangaroos and motorists are on the rise on Australian roads. According to a recent report, more than 20,000 kangaroo-vehicle collisions were estimated to have occurred in Australia during 2015 alone. In this work, we propose a vehicle-based framework for kangaroo detection in urban and highway traffic environments that could be used for collision warning systems. Our proposed framework is based on region-based convolutional neural networks (RCNN). Given the scarcity of labeled data of kangaroos in traffic environments, we utilized our state-of-the-art data generation pipeline to generate 17,000 synthetic depth images of traffic scenes with kangaroo instances annotated in them. We trained our proposed RCNN-based framework on a subset of the generated synthetic depth image dataset. The proposed framework achieved an average precision (AP) score of 92% across all the testing synthetic depth image datasets. We compared our proposed framework against other baseline approaches and outperformed them by more than 37% in AP score across all the testing datasets. Additionally, we evaluated the generalization performance of the proposed framework on real live data and achieved resilient detection accuracy without any further fine-tuning of the RCNN-based framework.

  15. A compressive sensing based secure watermark detection and privacy preserving storage framework.

    PubMed

    Qia Wang; Wenjun Zeng; Jun Tian

    2014-03-01

    Privacy is a critical issue when the data owners outsource data storage or processing to a third party computing service, such as the cloud. In this paper, we identify a cloud computing application scenario that requires simultaneously performing secure watermark detection and privacy preserving multimedia data storage. We then propose a compressive sensing (CS)-based framework using secure multiparty computation (MPC) protocols to address such a requirement. In our framework, the multimedia data and secret watermark pattern are presented to the cloud for secure watermark detection in a CS domain to protect the privacy. During CS transformation, the privacy of the CS matrix and the watermark pattern is protected by the MPC protocols under the semi-honest security model. We derive the expected watermark detection performance in the CS domain, given the target image, watermark pattern, and the size of the CS matrix (but without the CS matrix itself). The correctness of the derived performance has been validated by our experiments. Our theoretical analysis and experimental results show that secure watermark detection in the CS domain is feasible. Our framework can also be extended to other collaborative secure signal processing and data-mining applications in the cloud.
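    Stripped of the secure multiparty computation protocols that protect the sensing matrix and watermark in the paper, the underlying signal-processing step is a random projection followed by a correlation test. A minimal sketch with illustrative sizes (the MPC layer is deliberately omitted):

```python
import random

def cs_project(signal, n_measurements, seed=0):
    """Project a signal with a seeded random +/-1 sensing matrix.

    Returns the measurements and the matrix; in the secure setting the
    matrix would be held privately and never revealed.
    """
    rng = random.Random(seed)
    phi = [[rng.choice((-1.0, 1.0)) for _ in signal] for _ in range(n_measurements)]
    return [sum(p * s for p, s in zip(row, signal)) for row in phi], phi

def correlation_detect(measurements, phi, watermark):
    """Correlate CS measurements against the projected watermark pattern."""
    wm_proj = [sum(p * w for p, w in zip(row, watermark)) for row in phi]
    return sum(m * v for m, v in zip(measurements, wm_proj))

# Toy demo: a +/-1 watermark embedded additively in a weak host signal.
rng = random.Random(1)
watermark = [rng.choice((-1.0, 1.0)) for _ in range(256)]
host = [rng.gauss(0.0, 0.1) for _ in range(256)]
marked = [h + w for h, w in zip(host, watermark)]
m_marked, phi = cs_project(marked, n_measurements=64, seed=2)
m_host, _ = cs_project(host, n_measurements=64, seed=2)
stat_marked = correlation_detect(m_marked, phi, watermark)
stat_host = correlation_detect(m_host, phi, watermark)
```

Because random projections approximately preserve inner products, the detection statistic remains large for the watermarked signal and near zero otherwise, which is what makes detection in the CS domain feasible.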

  16. Influence of gravel mining and other factors on detection probabilities of Coastal Plain fishes in the Mobile River Basin, Alabama

    USGS Publications Warehouse

    Hayer, C.-A.; Irwin, E.R.

    2008-01-01

    We used an information-theoretic approach to examine the variation in detection probabilities for 87 Piedmont and Coastal Plain fishes in relation to instream gravel mining in four Alabama streams of the Mobile River drainage. Biotic and abiotic variables were also included in candidate models. Detection probabilities were heterogeneous across species and varied with habitat type, stream, season, and water quality. Instream gravel mining influenced the variation in detection probabilities for 38% of the species collected, probably because it led to habitat loss and increased sedimentation. Higher detection probabilities were apparent at unmined sites than at mined sites for 78% of the species for which gravel mining was shown to influence detection probabilities, indicating potential negative impacts on these species. Physical and chemical attributes also explained the variation in detection probabilities for many species. These results indicate that anthropogenic impacts can affect detection probabilities for fishes, and such variation should be considered when developing monitoring programs or routine sampling protocols. © Copyright by the American Fisheries Society 2008.

  17. Variational Pragmatics and "Responding to Thanks"--Revisited

    ERIC Educational Resources Information Center

    Bieswanger, Markus

    2015-01-01

    In 2005, Klaus P. Schneider published a fascinating article with the title "'No problem, you're welcome, anytime': Responding to thanks in Ireland, England, and the U.S.A." Adopting the then emerging and now established framework of variational pragmatics, Schneider's pioneering paper presents the results of a study on differences…

  18. Combining Formal and Functional Approaches to Topic Structure

    ERIC Educational Resources Information Center

    Zellers, Margaret; Post, Brechtje

    2012-01-01

    Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this kind of variation. We argue that it is fruitful to…

  19. Neurodevelopmental Variation as a Framework for Thinking about the Twice Exceptional

    ERIC Educational Resources Information Center

    Gilger, Jeffrey W.; Hynd, George W.

    2008-01-01

    Developmental exceptionalities span the range of learning abilities and encompass children with both learning disorders and learning gifts. The purpose of this article is to stimulate thinking about these exceptionalities, particularly the complexities and variations within and across people. Investigators tend to view learning disabilities or…

  20. Regional variations in the distribution and colocalization of extracellular matrix proteins in the juvenile bovine meniscus

    PubMed Central

    Vanderploeg, Eric J; Wilson, Christopher G; Imler, Stacy M; Ling, Carrie Hang-Yin; Levenston, Marc E

    2012-01-01

    A deeper understanding of the composition and organization of extracellular matrix molecules in native, healthy meniscus tissue is required to fully appreciate the degeneration that occurs in joint disease and the intricate environment in which an engineered meniscal graft would need to function. In this study, regional variations in the tissue-level and pericellular distributions of collagen types I, II and VI and the proteoglycans aggrecan, biglycan and decorin were examined in the juvenile bovine meniscus. The collagen networks were extensively, but not completely, colocalized, with tissue-level organization that varied with radial position across the meniscus. Type VI collagen exhibited close association with large bundles composed of type I and II collagen and, in contrast to type I and II collagen, was further concentrated in the pericellular matrix. Aggrecan was detected throughout the inner region of the meniscus but was restricted to the pericellular matrix and sheaths of collagen bundles in the middle and outer regions. The small proteoglycans biglycan and decorin exhibited regional variations in staining intensity but were consistently localized in the intra- and/or peri-cellular compartments. These results provide insight into the complex hierarchy of extracellular matrix organization in the meniscus and provide a framework for better understanding meniscal degeneration and disease progression and evaluating potential repair and regeneration strategies. PMID:22703476

  1. 3D face recognition under expressions, occlusions, and pose variations.

    PubMed

    Drira, Hassen; Ben Amor, Boulbaba; Srivastava, Anuj; Daoudi, Mohamed; Slama, Rim

    2013-09-01

    We propose a novel geometric framework for analyzing 3D faces, with the specific goals of comparing, matching, and averaging their shapes. Here we represent facial surfaces by radial curves emanating from the nose tips and use elastic shape analysis of these curves to develop a Riemannian framework for analyzing shapes of full facial surfaces. This representation, along with the elastic Riemannian metric, seems natural for measuring facial deformations and is robust to challenges such as large facial expressions (especially those with open mouths), large pose variations, missing parts, and partial occlusions due to glasses, hair, and so on. The framework is shown to be promising from both empirical and theoretical perspectives. In terms of empirical evaluation, our results match or improve upon the state-of-the-art methods on three prominent databases: FRGCv2, GavabDB, and Bosphorus, each posing a different type of challenge. From a theoretical perspective, the framework allows for formal statistical inferences, such as the estimation of missing facial parts using PCA on tangent spaces and the computation of average shapes.

  2. Microbial community pattern detection in human body habitats via ensemble clustering framework.

    PubMed

    Yang, Peng; Su, Xiaoquan; Ou-Yang, Le; Chua, Hon-Nian; Li, Xiao-Li; Ning, Kang

    2014-01-01

    The human habitat is a host where microbial species evolve, function, and continue to evolve. Elucidating how microbial communities respond to human habitats is a fundamental and critical task, as establishing baselines of the human microbiome is essential to understanding its role in human disease and health. Recent studies of the healthy human microbiome focus on particular body habitats, assuming that microbiomes develop similar structural patterns to perform similar ecosystem functions under the same environmental conditions. However, such studies usually overlook the complex and interconnected landscape of the human microbiome, limiting themselves to particular body habitats and to learning models with specific criteria, and therefore cannot effectively capture the real-world underlying microbial patterns. To obtain a comprehensive view, we propose a novel ensemble clustering framework to mine the structure of microbial community patterns in large-scale metagenomic data. In particular, we first build a microbial similarity network by integrating 1920 metagenomic samples from three body habitats of healthy adults. A novel symmetric Nonnegative Matrix Factorization (NMF) based ensemble model is then proposed and applied to the network to detect clustering patterns. Extensive experiments are conducted to evaluate the effectiveness of our model in deriving microbial communities with respect to body habitat and host gender. From the clustering results, we observed that body habitat exhibits a strong but non-unique microbial structural pattern, while the human microbiome reveals different degrees of structural variation across body habitats and host genders. In summary, our ensemble clustering framework can efficiently explore integrated clustering results to accurately identify microbial communities and provide a comprehensive view of a set of microbial communities. The clustering results indicate that the structure of the human microbiome varies systematically across body habitats and host genders. Such trends depict an integrated biography of microbial communities, offering new insight towards uncovering the pathogenic model of the human microbiome.
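    A standard building block of ensemble clustering is the co-association (consensus) matrix: the fraction of base clusterings in which two samples share a cluster. The sketch below shows only this step; the paper goes on to factorize such a similarity structure with symmetric NMF, which is not reproduced here:

```python
def coassociation_matrix(partitions, n_items):
    """Build the consensus matrix from several base clusterings.

    `partitions` is a list of label lists, one per base clustering;
    entry (i, j) of the result is the fraction of clusterings in which
    items i and j received the same label.
    """
    m = [[0.0] * n_items for _ in range(n_items)]
    for labels in partitions:
        for i in range(n_items):
            for j in range(n_items):
                if labels[i] == labels[j]:
                    m[i][j] += 1.0
    k = float(len(partitions))
    return [[v / k for v in row] for row in m]
```

The resulting symmetric, nonnegative matrix is exactly the kind of input a symmetric NMF can decompose into soft community memberships.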

  3. Microbial community pattern detection in human body habitats via ensemble clustering framework

    PubMed Central

    2014-01-01

    Background The human habitat is a host in which microbial species evolve, function, and continue to evolve. Elucidating how microbial communities respond to human habitats is a fundamental and critical task, as establishing baselines of the human microbiome is essential for understanding its role in human disease and health. Recent studies of the healthy human microbiome focus on particular body habitats, assuming that microbiomes develop similar structural patterns to perform similar ecosystem functions under the same environmental conditions. However, current studies usually overlook the complex and interconnected landscape of the human microbiome and restrict their analyses to particular body habitats using learning models tailored to a specific criterion. Such methods therefore cannot effectively capture the underlying real-world microbial patterns. Results To obtain a comprehensive view, we propose a novel ensemble clustering framework to mine the structure of microbial community patterns in large-scale metagenomic data. Specifically, we first build a microbial similarity network by integrating 1920 metagenomic samples from three body habitats of healthy adults. A novel symmetric Nonnegative Matrix Factorization (NMF) based ensemble model is then proposed and applied to the network to detect clustering patterns. Extensive experiments are conducted to evaluate the effectiveness of our model at deriving microbial communities with respect to body habitat and host gender. From the clustering results, we observe that body habitat exhibits a strong but non-unique microbial structural pattern, and that the human microbiome reveals different degrees of structural variation across body habitat and host gender. Conclusions In summary, our ensemble clustering framework can efficiently explore integrated clustering results to accurately identify microbial communities and provide a comprehensive view of a set of microbial communities.
The clustering results indicate that the structure of the human microbiome varies systematically across body habitats and host genders. These trends depict an integrated biogeography of microbial communities, offering new insight towards uncovering the pathogenic model of the human microbiome. PMID:25521415
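
    The symmetric NMF step at the heart of the ensemble model can be illustrated in miniature: a nonnegative similarity (consensus) network A is factorized as A ≈ HH^T, and each sample is assigned to the community with the largest entry in its row of H. The toy network, the damped multiplicative update, and the argmax assignment below are a minimal sketch of this idea, not the authors' full ensemble pipeline.

```python
import numpy as np

def symmetric_nmf(A, k, n_iter=500, seed=0):
    """Factor a symmetric nonnegative similarity matrix as A ~ H @ H.T."""
    rng = np.random.default_rng(seed)
    H = rng.random((A.shape[0], k)) + 0.1
    for _ in range(n_iter):
        # damped multiplicative update; keeps H strictly nonnegative
        H *= 0.5 * (1.0 + (A @ H) / (H @ (H.T @ H) + 1e-9))
    return H

# toy consensus network: samples 0-2 and 3-5 form two communities
A = np.full((6, 6), 0.05)
A[:3, :3] = 0.9
A[3:, 3:] = 0.9
np.fill_diagonal(A, 1.0)

H = symmetric_nmf(A, k=2)
labels = H.argmax(axis=1)   # community assignment per sample
```

The block structure of A is recovered in the factor H, whose argmax per row gives the community labels.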

  4. Active edge maps for medical image registration

    NASA Astrophysics Data System (ADS)

    Kerwin, William; Yuan, Chun

    2001-07-01

    Applying edge detection prior to image registration yields several advantages over raw intensity-based registration, including the ability to register multicontrast or multimodality images, immunity to intensity variations, and the potential for computationally efficient algorithms. In this work, a common framework for edge-based image registration is formulated as an adaptation of the snakes used in boundary detection. Called active edge maps, the new formulation finds a one-to-one transformation T(x) that maps points in a source image to corresponding locations in a target image using an energy minimization approach. The energy consists of an image component that is small when edge features are well matched in the two images, and an internal term that restricts T(x) to allowable configurations. The active edge map formulation is illustrated here with a specific example developed for affine registration of carotid artery magnetic resonance images. In this example, edges are identified using a magnitude-of-gradient operator, the image energy is determined using a Gaussian-weighted distance function, and the internal energy includes separate, adjustable components that control volume preservation and rigidity.
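
    The energy formulation can be illustrated in miniature: edge points come from a magnitude-of-gradient operator, and the image energy is a Gaussian-weighted distance between the transformed source edges and the target edges, as described above. For brevity this sketch searches integer translations by brute force rather than minimizing over a full affine T(x) with internal-energy terms; the images and threshold are made up.

```python
import numpy as np

def edge_points(img, thresh=0.5):
    """Edge map via magnitude of gradient, returned as point coordinates."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return np.argwhere(mag > thresh * mag.max())

def energy(src_pts, tgt_pts, shift, sigma=2.0):
    """Gaussian-weighted match energy; lower is better."""
    moved = src_pts + shift
    d2 = ((moved[:, None, :] - tgt_pts[None, :, :]) ** 2).sum(-1)
    return -np.exp(-d2.min(axis=1) / (2 * sigma**2)).sum()

# toy images: a bright square, and the same square translated by (2, 3)
src = np.zeros((32, 32)); src[8:16, 8:16] = 1.0
tgt = np.zeros((32, 32)); tgt[10:18, 11:19] = 1.0

sp, tp = edge_points(src), edge_points(tgt)
shifts = [(dy, dx) for dy in range(-5, 6) for dx in range(-5, 6)]
best = min(shifts, key=lambda s: energy(sp, tp, np.array(s)))
```

The minimum-energy shift recovers the true (2, 3) translation between the two edge maps.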

  5. Koopman Operator Framework for Time Series Modeling and Analysis

    NASA Astrophysics Data System (ADS)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be identified directly from data using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which are essential for model comparison and clustering. We employ the space of Koopman model forms equipped with a distance, in conjunction with classical machine learning techniques, to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
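
    One standard way to identify Koopman spectral properties directly from data, as the abstract describes, is dynamic mode decomposition (DMD): the eigenvalues of the best-fit linear map between successive snapshots approximate Koopman eigenvalues. A minimal sketch on a linear toy system (a planar rotation, whose true eigenvalues are exp(±iθ)) follows; the paper's model forms and distances are more general than this.

```python
import numpy as np

def dmd(X, Y, r=2):
    """DMD eigenvalues of the best-fit linear map Y ~ A X, via rank-r SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    U, s, Vt = U[:, :r], s[:r], Vt[:r]
    A_tilde = U.conj().T @ Y @ Vt.conj().T / s  # projected Koopman approximation
    return np.linalg.eigvals(A_tilde)

# planar rotation by theta: true eigenvalues are exp(+/- i*theta)
theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
snaps = [np.array([1.0, 0.0])]
for _ in range(20):
    snaps.append(A @ snaps[-1])
S = np.array(snaps).T                 # states as columns
eigs = dmd(S[:, :-1], S[:, 1:], r=2)  # snapshot pairs (x_t, x_{t+1})
```

Because the toy dynamics are exactly linear and rank 2, DMD recovers the rotation's unit-modulus eigenvalue pair to machine precision.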

  6. Self-contained image mapping of placental vasculature in 3D ultrasound-guided fetoscopy.

    PubMed

    Yang, Liangjing; Wang, Junchen; Ando, Takehiro; Kubota, Akihiro; Yamashita, Hiromasa; Sakuma, Ichiro; Chiba, Toshio; Kobayashi, Etsuko

    2016-09-01

    Surgical navigation technology for fetoscopic procedures is relatively underdeveloped compared with other forms of endoscopy. The narrow fetoscopic field of view and the vast vascular network on the placenta make examination and photocoagulation treatment of twin-to-twin transfusion syndrome challenging. Though ultrasonography is used for intraoperative guidance, its navigational ability is not fully exploited. This work aims to integrate 3D ultrasound imaging and endoscopic vision seamlessly for placental vasculature mapping through a self-contained framework without external navigational devices. This is achieved through the development, integration, and experimental evaluation of novel navigational modules. First, a framework design that addresses the current limitations based on identified gaps is conceptualized. Second, navigational modules are integrated, including (1) ultrasound-based localization, (2) image alignment, and (3) vision-based tracking to update the scene texture map. The updated texture map is projected onto an ultrasound-constructed 3D model for photorealistic texturing of the 3D scene, creating a panoramic view from the moving fetoscope. In addition, a collaborative scheme for integrating the modular workflow system is proposed to schedule updates systematically. Finally, experiments are carried out to evaluate each modular variation and the integrated collaborative scheme of the framework. The modules and the collaborative scheme are evaluated through a series of phantom experiments with controlled trajectories for repeatability. The collaborative framework demonstrated the best accuracy (5.2% RMS error) compared with all three single-module variations. Validation on an ex vivo monkey placenta shows visual continuity of the freehand fetoscopic panorama.
The proposed collaborative framework and the evaluation of its variations provide analytical insights for the effective integration of ultrasonography and endoscopy. This contributes to the development of navigation techniques for fetoscopic procedures and can potentially be extended to other applications in intraoperative imaging.

  7. Magnitude and sources of bias in the detection of mixed strain M. tuberculosis infection.

    PubMed

    Plazzotta, Giacomo; Cohen, Ted; Colijn, Caroline

    2015-03-07

    High resolution tests for genetic variation reveal that individuals may simultaneously host more than one distinct strain of Mycobacterium tuberculosis. Previous studies find that this phenomenon, which we will refer to as "mixed infection", may affect the outcomes of treatment for infected individuals and may influence the impact of population-level interventions against tuberculosis. In areas where the incidence of TB is high, mixed infections have been found in nearly 20% of patients; these studies may underestimate the actual prevalence of mixed infection given that tests may not be sufficiently sensitive for detecting minority strains. Specific reasons for failing to detect mixed infections include low initial numbers of minority-strain cells in sputum, stochastic growth in culture, and the physical division of initial samples into parts (typically only one of which is genotyped). In this paper, we develop a mathematical framework that models the study designs aimed at detecting mixed infections. Using both a deterministic and a stochastic approach, we obtain posterior estimates of the prevalence of mixed infection. We find that the posterior estimate of the prevalence of mixed infection may be substantially higher than the fraction of cases in which it is detected. We characterize this bias in terms of the sensitivity of the genotyping method and the relative growth rates and initial population sizes of the different strains collected in sputum. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
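
    The core of the bias argument can be sketched numerically: if a truly mixed infection is only flagged with some sensitivity, the fraction of flagged cases underestimates the true prevalence, and a simple Bayesian correction recovers a higher posterior estimate. The patient counts and sensitivity below are made-up illustration numbers, and the model (perfect specificity, a single sensitivity parameter, flat prior on prevalence) is far simpler than the paper's framework.

```python
import numpy as np

# Hypothetical numbers: of n patients, k are flagged as mixed; the genotyping
# test detects a truly mixed infection with sensitivity `sens` and never
# mislabels a pure infection as mixed.
n, k, sens = 200, 30, 0.6

# Grid posterior over the true mixed-infection prevalence pi, flat prior:
# each patient is flagged with probability pi * sens.
pi = np.linspace(0, 1, 1001)
log_like = k * np.log(pi * sens + 1e-300) + (n - k) * np.log(1 - pi * sens)
post = np.exp(log_like - log_like.max())
post /= post.sum()

naive = k / n                        # fraction actually detected: 0.15
posterior_mean = (pi * post).sum()   # substantially higher than the naive rate
```

With these numbers the posterior mean prevalence is near (k/n)/sens = 0.25, illustrating how the detected fraction understates the true prevalence.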

  8. Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics

    NASA Astrophysics Data System (ADS)

    Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu

    2007-11-01

    In this paper we present a novel framework for automatic event recognition and abnormal behavior detection with attribute grammars by learning scene semantics. The framework combines learning scene semantics through trajectory analysis with constructing an attribute grammar-based event representation. The scene and event information is learned automatically, and behaviors that disobey scene semantics or event grammar rules are detected as abnormal. This method yields an approach to understanding video scenes; furthermore, with this prior knowledge, the accuracy of abnormal event detection is increased.

  9. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    USGS Publications Warehouse

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns exist about the quality of information produced by the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimating the detection probability of birds during counts, including distance sampling, double-observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating the probability that a bird is detected during a count into two components: (1) the probability that the bird vocalizes during the count and (2) the probability that this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend that any study employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
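
    The proposed decomposition and area adjustment amount to simple arithmetic: overall detection probability is the product of the vocalization and hearing probabilities, and dividing the raw count by detection probability and sampled area turns an index into a density estimate. All numbers below are hypothetical survey values for illustration.

```python
import math

# Decomposition from the abstract: Pr(detected) = Pr(vocalizes) * Pr(heard).
p_vocal = 0.7          # probability a bird vocalizes during the count (assumed)
p_heard = 0.8          # probability an observer detects that vocalization (assumed)
p_detect = p_vocal * p_heard

count = 14                                  # birds detected at one point
radius_m = 100.0                            # fixed-radius count
area_ha = math.pi * radius_m**2 / 10_000    # sampled area in hectares

# Correcting the raw count by detection probability and sampled area
# yields a density estimate (birds per hectare) rather than a raw index.
density = count / (p_detect * area_ha)
```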

  10. Exploiting Acoustic and Syntactic Features for Automatic Prosody Labeling in a Maximum Entropy Framework

    PubMed Central

    Sridhar, Vivek Kumar Rangarajan; Bangalore, Srinivas; Narayanan, Shrikanth S.

    2009-01-01

    In this paper, we describe a maximum entropy-based automatic prosody labeling framework that exploits both language and speech information. We apply the proposed framework to both prominence and phrase structure detection within the Tones and Break Indices (ToBI) annotation scheme. Our framework utilizes novel syntactic features in the form of supertags and a quantized acoustic–prosodic feature representation that is similar to linear parameterizations of the prosodic contour. The proposed model is trained discriminatively and is robust in the selection of appropriate features for the task of prosody detection. The proposed maximum entropy acoustic–syntactic model achieves pitch accent and boundary tone detection accuracies of 86.0% and 93.1% on the Boston University Radio News corpus, and 79.8% and 90.3% on the Boston Directions corpus. Phrase structure detection through prosodic break index labeling provides accuracies of 84% and 87% on the two corpora, respectively. The reported results are significantly better than previously reported results and demonstrate the strength of the maximum entropy model in jointly modeling simple lexical, syntactic, and acoustic features for automatic prosody labeling. PMID:19603083
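
    For binary labels such as accented/unaccented, a maximum entropy classifier over indicator features is equivalent to logistic regression trained by maximizing log-likelihood. The tiny feature vectors and labels below are invented stand-ins for the supertag/acoustic indicators, and the plain gradient ascent is a sketch, not the paper's training setup.

```python
import numpy as np

# Made-up binary feature vectors (columns might stand for indicators such as
# "supertag X present" or "quantized pitch rise") and pitch-accent labels.
X = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [0, 0, 1],
              [1, 1, 1], [0, 1, 0]], dtype=float)
y = np.array([1, 0, 1, 0, 1, 1])   # 1 = pitch-accented, 0 = not

W = np.zeros(3)
for _ in range(500):               # full-batch gradient ascent on log-likelihood
    p = 1 / (1 + np.exp(-X @ W))
    W += 0.5 * X.T @ (y - p)

train_acc = ((1 / (1 + np.exp(-X @ W)) > 0.5) == y).mean()
```

On this separable toy set the learned weights classify all training examples correctly, mirroring how the maxent model weights discriminative features.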

  11. Mass detection in digital breast tomosynthesis data using convolutional neural networks and multiple instance learning.

    PubMed

    Yousefi, Mina; Krzyżak, Adam; Suen, Ching Y

    2018-05-01

    Digital breast tomosynthesis (DBT) was developed in the field of breast cancer screening as a new tomographic technique to minimize the limitations of conventional digital mammography screening methods. A computer-aided detection (CAD) framework for mass detection in DBT has been developed and is described in this paper. The proposed framework operates on a set of two-dimensional (2D) slices. With plane-to-plane analysis of corresponding 2D slices from each DBT, it automatically learns complex patterns of 2D slices through a deep convolutional neural network (DCNN). It then applies multiple instance learning (MIL) with a randomized trees approach to classify DBT images based on information extracted from the 2D slices. This CAD framework was developed and evaluated using 5040 2D image slices derived from 87 DBT volumes. The empirical results demonstrate that the proposed CAD framework achieves much better performance than CAD systems that use hand-crafted features and deep cardinality-restricted Boltzmann machines to detect masses in DBT. Copyright © 2018 Elsevier Ltd. All rights reserved.
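
    The multiple-instance step can be sketched with the standard MIL max rule: a DBT volume (the bag) is called positive if any of its 2D slices (the instances) scores above threshold. The per-slice scores below are invented, standing in for the output of an upstream slice model; the paper's actual aggregation uses randomized trees rather than this bare max rule.

```python
import numpy as np

def classify_volume(slice_scores, threshold=0.5):
    """MIL max rule: a bag is positive iff its best instance crosses threshold."""
    return float(np.max(slice_scores)) > threshold

# hypothetical per-slice mass scores from an upstream slice classifier
healthy = classify_volume([0.05, 0.12, 0.08, 0.20])    # no suspicious slice
with_mass = classify_volume([0.10, 0.07, 0.93, 0.15])  # one strong slice
```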

  12. Geographic Variation in Diagnostic Ability and Quality of Care Metrics: A Case Study of Ankylosing Spondylitis and Low Back Pain

    PubMed Central

    Shafrin, Jason; Griffith, Jenny; Shim, Jin Joo; Huber, Caroline; Ganguli, Arijit; Aubry, Wade

    2017-01-01

    Studies examining geographic variation in care for low back pain often focus on process and outcome measures conditional on patient diagnosis but generally do not take into account a physician’s ability to diagnose the root cause of low back pain. In our case study, we used increased detection of ankylosing spondylitis—a relatively rare inflammatory back disease—as a proxy for diagnostic ability and measured the relationship between ankylosing spondylitis detection, potentially inappropriate low back pain care, and cost. Using 5 years of health insurance claims data, we found significant variation in ankylosing spondylitis detection across metropolitan statistical areas (MSAs), with 8.1% of the variation in detection explained by a region’s racial composition. Furthermore, low back pain patients in MSAs with higher ankylosing spondylitis detection had 7.9% lower use of corticosteroids, 9.0% lower use of opioids, and 8.2% lower pharmacy cost, compared with patients living in low-detection MSAs. PMID:28548005

  13. Epigenetic changes detected in micropropagated hop plants.

    PubMed

    Peredo, Elena L; Arroyo-García, Rosa; Revilla, M Angeles

    2009-07-01

    Micropropagation is a widely used technique in hops (Humulus lupulus L.). However, to the best of our knowledge, the genetic and epigenetic stability of the microplants has never been tested before. In the present study, two hop accessions were established in vitro and micropropagated for 2 years. The genetic and epigenetic stability of the in vitro plants was analyzed with several molecular techniques: random amplified DNA polymorphism (RAPD), retrotransposon microsatellite amplified polymorphism (REMAP), and methylation-sensitive amplification polymorphism (MSAP). No genetic variation between control and treated plants was found, even after 12 cycles of micropropagation. Epigenetic variation was detected, first, when field and in vitro samples were compared: nearly 30% of the detected fragments presented the same pattern of alterations in all the in vitro plants. Second, lower levels of epigenetic variation were detected among plants from the different subcultures. Part of this variation appeared to accumulate over the 12 sequential subcultures tested.

  14. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    NASA Astrophysics Data System (ADS)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessment and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal patterns or known anomalies). As such, it works for diverse data sets and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive querying, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
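
    The clustering-based anomaly detection idea can be sketched in a few lines: cluster the observations, then flag points whose distance to their own cluster centroid is far beyond the typical spread. The toy data, the plain k-means with deterministic initialization, and the mean-plus-3-sigma cutoff below are illustrative choices, not the paper's algorithm.

```python
import numpy as np

def kmeans(X, k=2, n_iter=50):
    C = X[[0, 50]].copy()          # deterministic init for the sketch: one seed per blob
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None] - C[None], axis=2)
        lab = d.argmin(axis=1)
        C = np.array([X[lab == j].mean(axis=0) for j in range(k)])
    return C, lab

# two dense "normal" blobs plus one far-away outlier (index 100)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)),
               rng.normal(5, 0.3, (50, 2)),
               [[20.0, 20.0]]])

C, lab = kmeans(X, k=2)
dist = np.linalg.norm(X - C[lab], axis=1)
cut = dist.mean() + 3 * dist.std()      # flag points far from every cluster
anomalies = np.where(dist > cut)[0]
```

The lone distant point is the only one whose centroid distance exceeds the cutoff, which is the sense in which unusual observations "fall out" of the clustering.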

  15. 3D Graphene Frameworks/Co3O4 Composites Electrode for High-Performance Supercapacitor and Enzymeless Glucose Detection.

    PubMed

    Bao, Lin; Li, Tao; Chen, Shu; Peng, Chang; Li, Ling; Xu, Qian; Chen, Yashao; Ou, Encai; Xu, Weijian

    2017-02-01

    3D graphene frameworks/Co3O4 composites are produced by the thermal explosion method, in which the generation of Co3O4 nanoparticles, the reduction of graphene oxide, and the creation of 3D frameworks are completed simultaneously. The process effectively prevents the agglomeration of Co3O4 particles, resulting in monodispersed Co3O4 nanoparticles scattered evenly on the 3D graphene frameworks. The prepared 3D graphene frameworks/Co3O4 composites used as supercapacitor electrodes display a definite improvement in electrochemical performance, with high specific capacitance (≈1765 F g⁻¹ at a current density of 1 A g⁻¹), good rate performance (≈1266 F g⁻¹ at a current density of 20 A g⁻¹), and excellent stability (≈93% retention of specific capacitance at a constant current density of 10 A g⁻¹ after 5000 cycles). In addition, the composites are also employed as nonenzymatic sensors for the electrochemical detection of glucose, exhibiting high sensitivity (122.16 µA mM⁻¹ cm⁻²) and a notably low detection limit (157 × 10⁻⁹ M, S/N = 3). Therefore, the authors expect that the 3D graphene frameworks/Co3O4 composites described here possess potential applications as electrode materials in supercapacitors and in the nonenzymatic detection of glucose. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Network Community Detection based on the Physarum-inspired Computational Framework.

    PubMed

    Gao, Chao; Liang, Mingxin; Li, Xianghua; Zhang, Zili; Wang, Zhen; Zhou, Zhili

    2016-12-13

    Community detection is a crucial and essential problem in the structural analytics of complex networks, which can help us understand and predict the characteristics and functions of complex networks. Many methods, ranging from optimization-based algorithms to heuristic-based algorithms, have been proposed for solving this problem. Due to the inherent complexity of identifying network structure, how to design an effective algorithm with higher accuracy and lower computational cost remains an open problem. Inspired by the computational capability and positive feedback mechanism exhibited during the foraging process of Physarum, a large amoeba-like cell consisting of a dendritic network of tube-like pseudopodia, a general Physarum-based computational framework for community detection is proposed in this paper. Based on the proposed framework, inter-community edges can be distinguished from intra-community edges in a network, and the positive feedback of the solving process in an algorithm can be further enhanced; these properties are used to improve the efficiency of optimization-based and heuristic-based community detection algorithms, respectively. Typical algorithms (e.g., a genetic algorithm, an ant colony optimization algorithm, and a Markov clustering algorithm) and real-world datasets are used to evaluate the efficiency of the proposed computational framework. Experiments show that the algorithms optimized by the Physarum-inspired computational framework perform better than the originals in terms of accuracy and computational cost. Moreover, a computational complexity analysis verifies the scalability of our framework.

  17. Parallel Molecular Distributed Detection With Brownian Motion.

    PubMed

    Rogers, Uri; Koh, Min-Sung

    2016-12-01

    This paper explores the in vivo distributed detection of an undesired biological agent's (BA's) biomarkers by a group of biologically sized nanomachines in an aqueous medium under drift. The term distributed indicates that the system information relative to the BA's presence is dispersed across the collection of nanomachines, where each nanomachine possesses limited communication, computation, and movement capabilities. Using Brownian motion with drift, a probabilistic detection and optimal data fusion framework, coined molecular distributed detection, is introduced that combines theory from both molecular communication and distributed detection. Using the optimal data fusion framework as a guide, simulation indicates that a sub-optimal fusion method exists, allowing a significant reduction in implementation complexity while retaining BA detection accuracy.
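
    The distributed-detection side of the framework can be sketched with the classical Chair–Varshney fusion rule for independent one-bit local detectors: each sensor's vote is weighted by a log-likelihood term derived from its detection and false-alarm probabilities, and the fused statistic is thresholded at zero. The sensor count, probabilities, and seed below are made-up, and this sketch omits the molecular-communication channel entirely.

```python
import numpy as np

# Hypothetical local-detector quality: each nanomachine declares "present"
# with probability pd when the agent is there, and pf when it is not.
pd, pf, n_sensors = 0.8, 0.1, 25
w1 = np.log(pd / pf)                # weight of a local "present" vote
w0 = np.log((1 - pd) / (1 - pf))    # weight of a local "absent" vote

def fuse(votes):
    """Chair-Varshney log-likelihood ratio; positive favours 'present'."""
    votes = np.asarray(votes)
    return (votes * w1 + (1 - votes) * w0).sum()

rng = np.random.default_rng(42)
votes_present = rng.random(n_sensors) < pd   # agent actually present
votes_absent = rng.random(n_sensors) < pf    # agent absent
decide_present = fuse(votes_present) > 0
decide_absent = fuse(votes_absent) > 0
```

Unanimous votes are always fused correctly, and with 25 sensors of this quality the noisy vote vectors are fused correctly with overwhelming probability.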

  18. Differential principal component analysis of ChIP-seq.

    PubMed

    Ji, Hongkai; Li, Xia; Wang, Qian-fei; Ning, Yang

    2013-04-23

    We propose differential principal component analysis (dPCA) for analyzing multiple ChIP-sequencing datasets to identify differential protein-DNA interactions between two biological conditions. dPCA integrates unsupervised pattern discovery, dimension reduction, and statistical inference into a single framework. It uses a small number of principal components to summarize concisely the major multiprotein synergistic differential patterns between the two conditions. For each pattern, it detects and prioritizes differential genomic loci by comparing the between-condition differences with the within-condition variation among replicate samples. dPCA provides a unique tool for efficiently analyzing large amounts of ChIP-sequencing data to study dynamic changes of gene regulation across different biological conditions. We demonstrate this approach through analyses of differential chromatin patterns at transcription factor binding sites and promoters as well as allele-specific protein-DNA interactions.
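
    The core mechanic of dPCA, principal components of the between-condition difference matrix summarizing multi-protein differential patterns, can be sketched on synthetic data. The locus counts, effect sizes, and median/std scoring rule below are invented for illustration; the real dPCA adds replicate-based statistical inference that this toy omits.

```python
import numpy as np

# Rows are genomic loci, columns are 3 hypothetical protein ChIP signals
# measured in two conditions; 50 loci share one differential pattern.
rng = np.random.default_rng(7)
n_loci = 500
base = rng.normal(5, 1, (n_loci, 3))
diff = np.zeros((n_loci, 3))
diff[:50] = [3, 3, -2]                  # shared multi-protein differential pattern

cond1 = base + diff + rng.normal(0, 0.3, (n_loci, 3))
cond2 = base + rng.normal(0, 0.3, (n_loci, 3))

D = cond1 - cond2                       # between-condition differences per locus
Dc = D - D.mean(axis=0)
U, s, Vt = np.linalg.svd(Dc, full_matrices=False)
scores = Dc @ Vt[0]                     # projection on the leading differential pattern

# loci with extreme scores relative to the bulk are candidate differential loci
z = (scores - np.median(scores)) / scores.std()
top = np.argsort(-np.abs(z))[:50]
```

The leading component recovers the planted [3, 3, -2] pattern, and the most extreme projections pick out the 50 differential loci.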

  19. Implicit Shape Models for Object Detection in 3d Point Clouds

    NASA Astrophysics Data System (ADS)

    Velizhev, A.; Shapovalov, R.; Schindler, K.

    2012-07-01

    We present a method for automatic object localization and recognition in 3D point clouds representing outdoor urban scenes. The method is based on the implicit shape models (ISM) framework, which recognizes objects by voting for their center locations. It requires only a few training examples per class, which is an important property for practical use. We also introduce and evaluate an improved version of the spin image descriptor that is more robust to point density variation and to uncertainty in normal direction estimation. Our experiments reveal a significant impact of these modifications on recognition performance. We compare our results against the state-of-the-art method and obtain significant improvement in both precision and recall on the Ohio dataset, consisting of combined aerial and terrestrial LiDAR scans of 150,000 m² of urban area in total.
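
    The ISM voting mechanism can be sketched in 2D for brevity: each training feature stores its offset to the object center, every matched feature at test time casts a center vote, and the densest vote cell wins. The offsets, object position, and integer-grid accumulator below are synthetic illustration choices, not the paper's 3D pipeline.

```python
import numpy as np

# learned feature-to-center offsets (one per codebook entry, made up)
offsets = np.array([[-2.0, 0.0], [2.0, 0.0], [0.0, -2.0], [0.0, 2.0]])

def detect_center(feature_positions):
    """Each matched feature votes for every stored center offset."""
    votes = (feature_positions[:, None, :] + offsets[None, :, :]).reshape(-1, 2)
    # accumulate votes on an integer grid and take the strongest cell
    acc = {}
    for v in np.round(votes).astype(int):
        acc[tuple(v)] = acc.get(tuple(v), 0) + 1
    return max(acc, key=acc.get)

# object at (10, 7): its feature points appear at center minus each offset
center = np.array([10.0, 7.0])
feats = center - offsets
found = detect_center(feats)
```

Only the true center receives a vote from every feature, so the vote maximum lands on (10, 7).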

  20. Cultural variation of perceptions of crew behaviour in multi-pilot aircraft.

    PubMed

    Hörmann, H J

    2001-09-01

    As the "last line of defence", pilots in commercial aviation often have to counteract the effects of unexpected system flaws that could endanger the safety of a flight. To detect and mitigate the consequences of latent or active failures in a timely manner, effective team behaviour among the crew members is an indispensable condition. While this is generally agreed upon in the aviation community, there seems to be a wide range of concepts of how crews should interact most effectively. Within the framework of the European project JARTEL, the cultural robustness of evaluations of crew behaviour was examined. 105 instructor pilots from 14 different airlines representing 12 European countries participated in this project. The instructors' evaluations of crew behaviours in eight video scenarios are compared in relation to cultural differences on Hofstede's dimensions of Power Distance and Individualism.

  1. Somatic Genetic Variation in Solid Pseudopapillary Tumor of the Pancreas by Whole Exome Sequencing

    PubMed Central

    Guo, Meng; Luo, Guopei; Jin, Kaizhou; Long, Jiang; Cheng, He; Lu, Yu; Wang, Zhengshi; Yang, Chao; Xu, Jin; Ni, Quanxing; Yu, Xianjun; Liu, Chen

    2017-01-01

    Solid pseudopapillary tumor of the pancreas (SPT) is a rare pancreatic disease with a unique clinical manifestation. Although CTNNB1 gene mutations have been universally reported, the genetic variation profiles of SPT are largely unidentified. We conducted whole exome sequencing in nine SPT patients to probe SPT-specific insertions and deletions (indels) and single nucleotide polymorphisms (SNPs). In total, 54 SNPs and 41 indels of prominent variations were demonstrated through parallel exome sequencing. We found that CTNNB1 mutations were present in all patients studied (100%), and a higher SNP count was detected particularly in patients with older age, larger tumors, and metastatic disease. By aggregating the 95 detected variation events and examining the interconnections among the genes with variations, CTNNB1 was identified as the core of the network, which might collaborate with other events such as variations of USP9X, EP400, HTT, MED12, and PKD1 to regulate tumorigenesis. Pathway analysis showed that events involved in other cancers had the potential to influence the progression of the SNP count. Our study provides insight into the variation of gene-encoding regions underlying solid-pseudopapillary neoplasm tumorigenesis. The detection of these variations might partly reflect the potential molecular mechanism. PMID:28054945

  2. Thoracic lymph node station recognition on CT images based on automatic anatomy recognition with an optimal parent strategy

    NASA Astrophysics Data System (ADS)

    Xu, Guoping; Udupa, Jayaram K.; Tong, Yubing; Cao, Hanqiang; Odhner, Dewey; Torigian, Drew A.; Wu, Xingyu

    2018-03-01

    Many papers have been published on the detection and segmentation of lymph nodes in medical images. However, this remains a challenging problem owing to low contrast with surrounding soft tissues and to variations in lymph node size and shape on computed tomography (CT) images; it is particularly difficult on the low-dose CT of PET/CT acquisitions. In this study, we utilize our previous automatic anatomy recognition (AAR) framework to recognize the thoracic lymph node stations defined by the International Association for the Study of Lung Cancer (IASLC) lymph node map. The lymph node stations themselves are viewed as anatomic objects and are localized using a one-shot method in the AAR framework. Two strategies are taken in this paper for integration into the AAR framework. The first is to combine some lymph node stations into composite stations according to their geometric nearness. The other is to find the optimal parent (an organ or union of organs) to serve as an anchor for each lymph node station based on recognition error, and thereby to find an overall optimal hierarchy arranging anchor organs and lymph node stations. Based on 28 contrast-enhanced thoracic CT image data sets for model building and 12 independent data sets for testing, our results show that thoracic lymph node stations can be localized to within 2-3 voxels of the ground truth.

  3. FSM-F: Finite State Machine Based Framework for Denial of Service and Intrusion Detection in MANET.

    PubMed

    N Ahmed, Malik; Abdullah, Abdul Hanan; Kaiwartya, Omprakash

    2016-01-01

    Owing to continuous advancements in wireless communication, in terms of both quality and affordability, the application areas of Mobile Adhoc Networks (MANETs) are growing significantly, particularly in military and disaster management settings. Given the sensitivity of these application areas, security, in terms of the detection of Denial of Service (DoS) attacks and intrusions, has become a prime concern for research and development in the field. Security systems suggested in the past have a state-recognition problem: the system cannot accurately identify the actual state of the network nodes because the node states are not clearly defined. In this context, this paper proposes a framework based on a Finite State Machine (FSM) for denial of service and intrusion detection in MANETs. In particular, an Intrusion Detection system for the Adhoc On-demand Distance Vector protocol (ID-AODV) is presented based on a finite state machine. Packet dropping and sequence number attacks are closely investigated, and detection systems for both types of attack are designed. The major functional modules of ID-AODV include a network monitoring system, a finite state machine, and an attack detection model. Simulations are carried out in the network simulator NS-2 to evaluate the performance of the proposed framework. A comparative evaluation of the performance is also performed against the state-of-the-art techniques RIDAN and AODV. The performance evaluations attest to the benefits of the proposed framework in providing better security against denial of service and intrusion attacks.
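
    The state-recognition idea can be sketched as an explicit transition table: a node moves from NORMAL to SUSPICIOUS on a dropped packet, back to NORMAL on a forward, and to an absorbing MALICIOUS state on repeated drops or a sequence-number anomaly. The event names and the two-strike rule are illustrative inventions, not the paper's exact FSM.

```python
# Explicit state definitions, which the abstract says earlier systems lacked.
TRANSITIONS = {
    ("NORMAL", "forward"): "NORMAL",
    ("NORMAL", "drop"): "SUSPICIOUS",
    ("SUSPICIOUS", "forward"): "NORMAL",
    ("SUSPICIOUS", "drop"): "MALICIOUS",       # repeated packet dropping
    ("NORMAL", "bad_seq"): "MALICIOUS",        # sequence-number attack
    ("SUSPICIOUS", "bad_seq"): "MALICIOUS",
}

def classify(events, state="NORMAL"):
    """Run the FSM over a node's event stream; MALICIOUS is absorbing."""
    for ev in events:
        if state == "MALICIOUS":
            break
        state = TRANSITIONS[(state, ev)]
    return state

honest = classify(["forward", "drop", "forward", "forward"])
dropper = classify(["drop", "drop", "forward"])
spoofer = classify(["forward", "bad_seq"])
```

An isolated drop is forgiven, while repeated drops or a spoofed sequence number lock the node into the MALICIOUS state.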

  4. Influence of gene flow on divergence dating - implications for the speciation history of Takydromus grass lizards.

    PubMed

    Tseng, Shu-Ping; Li, Shou-Hsien; Hsieh, Chia-Hung; Wang, Hurng-Yi; Lin, Si-Min

    2014-10-01

    Dating the time of divergence and understanding speciation processes are central to the study of the evolutionary history of organisms, but are notoriously difficult. The difficulty is largely rooted in variation in the ancestral population size or in genealogical variation across loci. To depict the speciation processes and divergence histories of three monophyletic Takydromus species endemic to Taiwan, we sequenced 20 nuclear loci and combined them with one mitochondrial locus published in GenBank. These were analysed by a multispecies coalescent approach within a Bayesian framework. Divergence dating based on the gene-tree approach showed high variation among loci, and the divergence was estimated at an earlier date than that derived by the species-tree approach. To test whether variation in the ancestral population size accounted for the majority of this variation, we conducted computational inference using isolation-with-migration (IM) and approximate Bayesian computation (ABC) frameworks. The results revealed that gene flow during the early stage of speciation was strongly favoured over the isolation model, and that the initiation of the speciation process was far earlier than the dates estimated by gene- and species-based divergence dating. Given their limited dispersal ability, geographical isolation may have played a major role in the divergence of these Takydromus species. Nevertheless, this study reveals a more complex situation and demonstrates that gene flow during the speciation process cannot be overlooked and may have a great impact on divergence dating. By using multilocus data and incorporating Bayesian coalescent approaches, we provide a more biologically realistic framework for delineating the divergence history of Takydromus. © 2014 John Wiley & Sons Ltd.

  5. Variational and perturbative formulations of quantum mechanical/molecular mechanical free energy with mean-field embedding and its analytical gradients.

    PubMed

    Yamamoto, Takeshi

    2008-12-28

    Conventional quantum chemical solvation theories are based on the mean-field embedding approximation. That is, the electronic wavefunction is calculated in the presence of the mean field of the environment. In this paper a direct quantum mechanical/molecular mechanical (QM/MM) analog of such a mean-field theory is formulated based on variational and perturbative frameworks. In the variational framework, an appropriate QM/MM free energy functional is defined and minimized in terms of the trial wavefunction that best approximates the true QM wavefunction in a statistically averaged sense. An analytical free energy gradient is obtained, which takes the form of the gradient of the effective QM energy calculated in the averaged MM potential. In the perturbative framework, the above variational procedure is shown to be equivalent to the first-order expansion of the QM energy (in the exact free energy expression) about the self-consistent reference field. This clarifies the relation between the variational procedure and the exact QM/MM free energy, as well as existing QM/MM theories. Based on this, several ways are discussed for evaluating non-mean-field effects (i.e., statistical fluctuations of the QM wavefunction) that are neglected in the mean-field calculation. As an illustration, the method is applied to an S(N)2 Menshutkin reaction in water, NH(3)+CH(3)Cl-->NH(3)CH(3)(+)+Cl(-), for which free energy profiles are obtained at the Hartree-Fock, MP2, B3LYP, and BHHLYP levels by integrating the free energy gradient. Non-mean-field effects are evaluated to be <0.5 kcal/mol using a Gaussian fluctuation model for the environment, which suggests that those effects are rather small for the present reaction in water.

  6. A framework for quantifying the impacts of sub-pixel reflectance variance and covariance on cloud optical thickness and effective radius retrievals based on the bi-spectral method

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Werner, F.; Cho, H.-M.; Wind, G.; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, Kerry

    2017-02-01

    The so-called bi-spectral method retrieves cloud optical thickness (τ) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near infrared (VIS/NIR) band and the other in a shortwave-infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring sub-pixel variations of cloud reflectances can lead to a significant bias in the retrieved τ and re. In this study, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of sub-pixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the τ and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how sub-pixel cloud reflectance variations impact the τ and re retrievals based on the bi-spectral method. In particular, our framework provides a mathematical explanation of how the sub-pixel variation in the VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, potentially contributing a positive bias to the re retrieval.
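
    The Taylor-expansion argument can be made concrete with a small numerical sketch. For a retrieval f(x, y) of the two band reflectances, the second-order expansion gives a plane-average bias of roughly ½·(f_xx·Var(x) + f_yy·Var(y)) + f_xy·Cov(x, y). The helper below estimates the second derivatives by finite differences; it is a generic illustration, not the paper's actual retrieval operator.

```python
def taylor_bias(f, x0, y0, var_x, var_y, cov_xy, h=1e-4):
    """Second-order Taylor estimate of E[f(x, y)] - f(E[x], E[y]) for a
    pixel whose sub-pixel reflectances (x, y) have the given (co)variances."""
    # Central finite differences for the second derivatives of f at (x0, y0).
    fxx = (f(x0 + h, y0) - 2 * f(x0, y0) + f(x0 - h, y0)) / h**2
    fyy = (f(x0, y0 + h) - 2 * f(x0, y0) + f(x0, y0 - h)) / h**2
    fxy = (f(x0 + h, y0 + h) - f(x0 + h, y0 - h)
           - f(x0 - h, y0 + h) + f(x0 - h, y0 - h)) / (4 * h**2)
    return 0.5 * (fxx * var_x + fyy * var_y) + fxy * cov_xy
```

    For f(x, y) = x·y the estimate reduces to the covariance term alone, illustrating the point that the covariance between the two bands, not only the individual band variances, enters the retrieval bias.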

  7. A Framework for Quantifying the Impacts of Sub-Pixel Reflectance Variance and Covariance on Cloud Optical Thickness and Effective Radius Retrievals Based on the Bi-Spectral Method.

    NASA Technical Reports Server (NTRS)

    Zhang, Z; Werner, F.; Cho, H. -M.; Wind, Galina; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, Kerry

    2017-01-01

    The so-called bi-spectral method retrieves cloud optical thickness (τ) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near infrared (VIS/NIR) band and the other in a shortwave-infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring sub-pixel variations of cloud reflectances can lead to a significant bias in the retrieved τ and re. In this study, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of sub-pixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the τ and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how sub-pixel cloud reflectance variations impact the τ and re retrievals based on the bi-spectral method. In particular, our framework provides a mathematical explanation of how the sub-pixel variation in the VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, potentially contributing a positive bias to the re retrieval.

  8. A variational method for automatic localization of the most pathological ROI in the knee cartilage

    NASA Astrophysics Data System (ADS)

    Qazi, Arish A.; Dam, Erik B.; Loog, Marco; Nielsen, Mads; Lauze, Francois; Christiansen, Claus

    2008-03-01

    Osteoarthritis (OA) is a degenerative joint disease characterized by degradation of the articular cartilage, and is a major cause of disability. At present there is no cure for OA, and currently available treatments are directed towards relief of symptoms. Recently it was shown that cartilage homogeneity, visualized by MRI and representing the biochemical changes occurring in the cartilage, is a potential marker for early detection of knee OA. In this paper we present an automatic technique based on homogeneity, embedded in a variational framework, for localization of the region of interest in the knee cartilage that best indicates where the pathology of the disease is dominant. The technique is evaluated on 283 knee MR scans. We show that OA affects certain areas of the cartilage more distinctly, and that these lie towards the peripheral region of the cartilage. We propose that this region of the cartilage corresponds anatomically to the area covered by the meniscus in healthy subjects. This finding may provide valuable clues to the pathology and etiology of OA and thereby improve treatment efficacy. Moreover, our method is generic and may be applied to other organs as well.

  9. Pinpointing the base of the AGN jets through general relativistic X-ray reverberation studies

    NASA Astrophysics Data System (ADS)

    Emmanoulopoulos, D.

    2015-03-01

    Many theoretical models of Active Galactic Nuclei (AGN) predict that the X-ray corona, lying above the black hole, constitutes the base of the X-ray jet. Thus, by studying the exact geometry of the close black hole environment, we can pinpoint the launching site of the jet. Detection of negative X-ray reverberation time delays (i.e. soft-band X-ray variations lagging behind the corresponding hard-band X-ray variations) can yield significant information about the geometrical properties of the AGN, such as the location of the X-ray source, as well as the physical properties of the black hole, such as its mass and spin. In the framework of the lamp-post geometry, I present the first systematic X-ray time-lag modelling results for an ensemble of 12 AGN, using a fully general relativistic (GR) ray tracing approach for the estimation of the systems' response functions. By combining these state-of-the-art GR response models with statistically innovative fitting routines, I derive the geometrical layout of the close BH environment for each source, unveiling the position of the AGN jet base.

  10. Splenomegaly Segmentation using Global Convolutional Kernels and Conditional Generative Adversarial Networks

    PubMed Central

    Huo, Yuankai; Xu, Zhoubing; Bao, Shunxing; Bermudez, Camilo; Plassard, Andrew J.; Liu, Jiaqi; Yao, Yuang; Assad, Albert; Abramson, Richard G.; Landman, Bennett A.

    2018-01-01

    Spleen volume estimation using automated image segmentation techniques may be used to detect splenomegaly (abnormally enlarged spleen) on Magnetic Resonance Imaging (MRI) scans. In recent years, Deep Convolutional Neural Network (DCNN) segmentation methods have demonstrated advantages for abdominal organ segmentation. However, variations in both size and shape of the spleen on MRI images may result in large false positive and false negative labeling when deploying DCNN-based methods. In this paper, we propose the Splenomegaly Segmentation Network (SSNet) to address spatial variations when segmenting extraordinarily large spleens. SSNet was designed based on the framework of image-to-image conditional generative adversarial networks (cGAN). Specifically, the Global Convolutional Network (GCN) was used as the generator to reduce false negatives, while the Markovian discriminator (PatchGAN) was used to alleviate false positives. A cohort of clinically acquired 3D MRI scans (both T1 weighted and T2 weighted) from patients with splenomegaly were used to train and test the networks. The experimental results demonstrated a mean Dice coefficient of 0.9260 and a median Dice coefficient of 0.9262 for SSNet on independently tested MRI volumes of patients with splenomegaly.
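
    For reference, the Dice coefficient reported above is the standard overlap measure 2|A∩B| / (|A| + |B|) between a predicted and a ground-truth binary mask. A minimal implementation (not the authors' evaluation code) is:

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice similarity: 2|A ∩ B| / (|A| + |B|) for binary segmentation masks.
    Two empty masks are defined as a perfect match (Dice = 1)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom else 1.0
```

    A Dice value of 0.9260 thus means the predicted spleen mask overlaps the manual segmentation almost completely, with the small residual split between false positives and false negatives.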

  11. Enriching the Theoretical Horizons of Phenomenography, Variation Theory and Learning Studies

    ERIC Educational Resources Information Center

    Dahlin, Bo

    2007-01-01

    The aim of this article is to introduce some theoretical frameworks which may develop the research going on within phenomenography and variation theory. Central concepts from the epistemological and cognitive theories of Charles S. Peirce, Niklas Luhmann and Margaret Boden are presented and their implications for phenomenography and variation…

  12. A Study of Korean EFL Learners' Apology Speech Acts: Strategy and Pragmatic Transfer Influenced by Sociolinguistic Variations.

    ERIC Educational Resources Information Center

    Yang, Tae-Kyoung

    2002-01-01

    Examines how apology speech act strategies frequently used in daily life are transferred in the framework of interlanguage pragmatics and sociolinguistics and how they are influenced by sociolinguistic variations such as social status, social distance, severity of offense, and formal or private relationships. (Author/VWL)

  13. Variations in Pedagogical Design of Massive Open Online Courses (MOOCs) across Disciplines

    ERIC Educational Resources Information Center

    Najafi, Hadieh; Rolheiser, Carol; Håklev, Stian; Harrison, Laurie

    2017-01-01

    Given that few studies have formally examined pedagogical design considerations of Massive Online Open Courses (MOOCs), this study explored variations in the pedagogical design of six MOOCs offered at the University of Toronto, while considering disciplinary characteristics and expectations of each MOOC. Using a framework (Neumann et al., 2002)…

  14. Female Students and Denominational Affiliation: Sources of Success and Variation among Nineteenth-Century Academies.

    ERIC Educational Resources Information Center

    Beadie, Nancy

    1999-01-01

    Studies the institutional characteristics and strategic choices of successful academies operating under the New York Regents system from 1838 to 1850. Identifies single-sex education and denominational affiliation as important for success. Suggests frameworks for investigating variations among the 19th-century academies and discusses implications…

  15. A framework for the use of agent based modeling to simulate inter- and intraindividual variation in human behaviors

    EPA Science Inventory

    Simulation of human behavior in exposure modeling is a complex task. Traditionally, inter-individual variation in human activity has been modeled by drawing from a pool of single day time-activity diaries such as the US EPA Consolidated Human Activity Database (CHAD). Here, an ag...

  16. Literacy Skills Gaps: A Cross-Level Analysis on International and Intergenerational Variations

    ERIC Educational Resources Information Center

    Kim, Suehye

    2018-01-01

    The global agenda for sustainable development has centred lifelong learning on UNESCO's Education 2030 Framework for Action. The study described in this article aimed to examine international and intergenerational variations in literacy skills gaps within the context of the United Nations Sustainable Development Goals (SDGs). For this purpose, the…

  17. The Impact on Individualizing Student Models on Necessary Practice Opportunities

    ERIC Educational Resources Information Center

    Lee, Jung In; Brunskill, Emma

    2012-01-01

    When modeling student learning, tutors that use the Knowledge Tracing framework often assume that all students have the same set of model parameters. We find that when fitting parameters to individual students, there is significant variation among the individual's parameters. We examine if this variation is important in terms of instructional…

  18. Detection of an explosive simulant via electrical impedance spectroscopy utilizing the UiO-66-NH2 metal-organic framework.

    PubMed

    Peterson, G W; McEntee, M; Harris, C R; Klevitch, A D; Fountain, A W; Soliz, J R; Balboa, A; Hauser, A J

    2016-11-01

    Electrical impedance spectroscopy, in conjunction with the metal-organic framework (MOF) UiO-66-NH2, is used to detect trace levels of the explosive simulant 2,6-dinitrotoluene. The combination of porosity and functionality of the MOF provides an effective dielectric structure, resulting in changes of impedance magnitude and phase angle. These promising data indicate that MOFs may be used in low-cost, robust explosive detection devices.

  19. Characterising dark matter searches at colliders and direct detection experiments: Vector mediators

    DOE PAGES

    Buchmueller, Oliver; Dolan, Matthew J.; Malik, Sarah A.; ...

    2015-01-09

    We introduce a Minimal Simplified Dark Matter (MSDM) framework to quantitatively characterise dark matter (DM) searches at the LHC. We study two MSDM models where the DM is a Dirac fermion which interacts with a vector or axial-vector mediator. The models are characterised by four parameters: mDM, Mmed, gDM and gq; the DM and mediator masses, and the mediator couplings to DM and quarks, respectively. The MSDM models accurately capture the full event kinematics, and the dependence on all masses and couplings can be systematically studied. The interpretation of mono-jet searches in this framework can be used to establish an equal-footing comparison with direct detection experiments. For theories with a vector mediator, LHC mono-jet searches possess better sensitivity than direct detection searches for light DM masses (≲5 GeV). For axial-vector mediators, LHC and direct detection searches generally probe orthogonal directions in the parameter space. We explore the projected limits of these searches from the ultimate reach of the LHC and multi-ton xenon direct detection experiments, and find that the complementarity of the searches remains. In conclusion, we provide a comparison of limits in the MSDM and effective field theory (EFT) frameworks to highlight the deficiencies of the EFT framework, particularly when exploring the complementarity of mono-jet and direct detection searches.

  20. Panoptes: web-based exploration of large scale genome variation data.

    PubMed

    Vauterin, Paul; Jeffery, Ben; Miles, Alistair; Amato, Roberto; Hart, Lee; Wright, Ian; Kwiatkowski, Dominic

    2017-10-15

    The size and complexity of modern large-scale genome variation studies demand novel approaches for exploring and sharing the data. In order to unlock the potential of these data for a broad audience of scientists with various areas of expertise, a unified exploration framework is required that is accessible, coherent and user-friendly. Panoptes is an open-source software framework for collaborative visual exploration of large-scale genome variation data and associated metadata in a web browser. It relies on technology choices that allow it to operate in near real-time on very large datasets. It can be used to browse rich, hybrid content in a coherent way, and offers interactive visual analytics approaches to assist the exploration. We illustrate its application using genome variation data of Anopheles gambiae, Plasmodium falciparum and Plasmodium vivax. Freely available at https://github.com/cggh/panoptes, under the GNU Affero General Public License. paul.vauterin@gmail.com. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  1. Adaptive Agent Modeling of Distributed Language: Investigations on the Effects of Cultural Variation and Internal Action Representations

    ERIC Educational Resources Information Center

    Cangelosi, Angelo

    2007-01-01

    In this paper we present the "grounded adaptive agent" computational framework for studying the emergence of communication and language. This modeling framework is based on simulations of population of cognitive agents that evolve linguistic capabilities by interacting with their social and physical environment (internal and external symbol…

  2. Developing Policy Instruments for Education in the EU: The European Qualifications Framework for Lifelong Learning

    ERIC Educational Resources Information Center

    Elken, Mari

    2015-01-01

    The European Qualifications Framework (EQF) for lifelong learning has been characterized as a policy instrument with a number of contested ideas, raising questions about the process through which such instruments are developed at European level. The introduction of the EQF is in this article examined through variations of neo-institutional theory:…

  3. A Technology Immune Technology Enabled Problem within an Action on Objects Framework: Stamping Functions

    ERIC Educational Resources Information Center

    Connell, Michael; Abramovich, Sergei

    2017-01-01

    This paper illustrates how the notion of Technology Immune Technology Enabled (TITE) problems (Abramovich, 2014), in this case an exploration of variations in surface area we refer to as Stamping Functions, might be incorporated into a K-6 mathematics methods class operating within an Action on Objects framework (Connell, 2001). TITE problems have…

  4. Detection and quantitation of single nucleotide polymorphisms, DNA sequence variations, DNA mutations, DNA damage and DNA mismatches

    DOEpatents

    McCutchen-Maloney, Sandra L.

    2002-01-01

    DNA mutation binding proteins, alone and as chimeric proteins with nucleases, are used with solid supports to detect DNA sequence variations, DNA mutations and single nucleotide polymorphisms. The solid supports may be flow cytometry beads, DNA chips, glass slides or DNA dipsticks. DNA molecules are coupled to solid supports to form DNA-support complexes. Labeled DNA is used with unlabeled DNA mutation binding proteins such as TthMutS to detect DNA sequence variations, DNA mutations and single nucleotide length polymorphisms by binding, which gives an increase in signal. Unlabeled DNA is utilized with labeled chimeras to detect DNA sequence variations, DNA mutations and single nucleotide length polymorphisms by nuclease activity of the chimera, which gives a decrease in signal.

  5. Fire, humans, and climate: modeling distribution dynamics of boreal forest waterbirds.

    PubMed

    Börger, Luca; Nudds, Thomas D

    2014-01-01

    Understanding the effects of landscape change and environmental variability on ecological processes is important for evaluating resource management policies, such as the emulation of natural forest disturbances. We analyzed time series of detection/nondetection data using hierarchical models in a Bayesian multi-model inference framework to decompose the dynamics of species distributions into responses to environmental variability, spatial variation in habitat conditions, and population dynamics and interspecific interactions, while correcting for observation errors and variation in sampling regimes. We modeled distribution dynamics of 14 waterbird species (broadly defined, including wetland and riparian species) using data from two different breeding bird surveys collected in the Boreal Shield ecozone within Ontario, Canada. Temporal variation in species occupancy (2000-2006) was primarily driven by climatic variability. Only two species showed evidence of consistent temporal trends in distribution: Ring-necked Duck (Aythya collaris) decreased, and Red-winged Blackbird (Agelaius phoeniceus) increased. The models had good predictive ability on independent data over time (1997-1999). Spatial variation in species occupancy was strongly related to the distribution of specific land cover types and habitat disturbance: Fire and forest harvesting influenced occupancy more than did roads, settlements, or mines. Bioclimatic and habitat heterogeneity indices and geographic coordinates exerted negligible influence on most species distributions. Estimated habitat suitability indices had good predictive ability on spatially independent data (Hudson Bay Lowlands ecozone). Additionally, we detected effects of interspecific interactions. Species responses to fire and forest harvesting were similar for 13 of 14 species; thus, forest-harvesting practices in Ontario generally appeared to emulate the effects of fire for waterbirds over timescales of 10-20 years. 
Extrapolating to all 84 waterbird species breeding on the Ontario Boreal Shield, however, suggested that up to 30 species may instead have altered (short-term) distribution dynamics due to forestry practices. Hence, natural disturbances are critical components of the ecology of the boreal forest and forest practices which aim to approximate them may succeed in allowing the maintenance of the associated species, but improved monitoring and modeling of large-scale boreal forest bird distribution dynamics will be necessary to resolve existing uncertainties, especially on less-common species.

  6. Velocity variations associated with the large 2010 eruption of Merapi volcano, Java, retrieved from seismic multiplets and ambient noise cross-correlation

    NASA Astrophysics Data System (ADS)

    Budi-Santoso, Agus; Lesage, Philippe

    2016-07-01

    We present a study of the seismic velocity variations that occurred in the structure before the large 2010 eruption of Merapi volcano. For the first time to our knowledge, the technique of coda wave interferometry is applied to both families of similar events (multiplets) and to correlation functions of seismic noise. About half of the seismic events recorded at the summit stations belong to one of the ten multiplets identified, including 120 similar events that occurred in the last 20 hr preceding the eruption onset. Daily noise cross-correlation functions (NCF) were calculated for the six pairs of short-period stations available. Using the stretching method, we estimate time-series of apparent velocity variation (AVV) for each multiplet and each pair of stations. No significant velocity change is detected until September 2010. From 10 October to the beginning of the eruption on 26 October, a complex pattern of AVV is observed with amplitude of up to ±1.5 per cent. Velocity decrease is first observed from families of deep events and then from shallow earthquakes. In the same period, AVV with different signs and chronologies are estimated from NCF calculated for various station pairs. The location in the horizontal plane of the velocity perturbations related with the AVV obtained from NCF is estimated by using an approach based on the radiative transfer approximation. Although their spatial resolution is limited, the resulting maps display velocity decrease in the upper part of the edifice in the period 12-25 October. After the eruption onset, the pattern of velocity perturbations is significantly modified with respect to the previous one. We interpret these velocity variations in the framework of a scenario of magmatic intrusion that integrates most observations. The perturbation of the stress field associated with the magma migration can induce both decrease and increase of the seismic velocity of rocks. 
Thus the detected AVVs can be considered as precursors of volcanic eruptions in andesitic volcanoes, without taking their sign into account.
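
    The stretching method used above can be sketched as a grid search over a stretch factor applied to the time axis of the current waveform: the factor whose stretched waveform best correlates with the reference gives the apparent velocity variation. The sign convention and grid below are illustrative assumptions, not the authors' processing parameters.

```python
import numpy as np

def stretching_dvv(ref, cur, t, eps_grid):
    """Coda-wave interferometry 'stretching' method: grid-search the stretch
    factor eps that best maps the current waveform onto the reference.
    Under the usual convention, the apparent velocity variation is dv/v = -eps."""
    best_eps, best_cc = 0.0, -np.inf
    for eps in eps_grid:
        # Resample the current waveform on a stretched time axis.
        stretched = np.interp(t * (1.0 + eps), t, cur)
        cc = np.corrcoef(ref, stretched)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc
```

    Applying this to successive daily noise cross-correlation functions (or to successive members of a multiplet) against a fixed reference yields a time series of apparent velocity variation such as the ±1.5 per cent changes reported above.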

  7. Deep Spatial-Temporal Joint Feature Representation for Video Object Detection.

    PubMed

    Zhao, Baojun; Zhao, Boya; Tang, Linbo; Han, Yuqi; Wang, Wenzheng

    2018-03-04

    With the development of deep neural networks, many object detection frameworks have shown great success in the fields of smart surveillance, self-driving cars, and facial recognition. However, the data sources are usually videos, while most object detection frameworks are built on still images and use only spatial information; feature consistency therefore cannot be ensured, because the training procedure discards temporal information. To address these problems, we propose a single, fully-convolutional neural network-based object detection framework that incorporates temporal information by using Siamese networks. In the training procedure, first, the prediction network combines multiscale feature maps to handle objects of various sizes. Second, we introduce a correlation loss computed with the Siamese network, which provides neighboring-frame features. This correlation loss represents object co-occurrences across time, aiding consistent feature generation. Since the correlation loss uses the information of the track ID and detection label, our video object detection network has been evaluated on the large-scale ImageNet VID dataset, where it achieves a 69.5% mean average precision (mAP).
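
    The correlation-loss idea, penalising dissimilar features for the same tracked object in neighbouring frames, can be illustrated with a toy normalised-cross-correlation loss. This is a hypothetical stand-in for intuition only, not the paper's exact formulation.

```python
import numpy as np

def correlation_loss(feat_t, feat_t1):
    """Toy correlation loss: 1 minus the normalised cross-correlation between
    flattened feature maps of the same object in neighbouring frames.
    Identical features give ~0; anti-correlated features approach 2."""
    a = np.asarray(feat_t, dtype=float).ravel()
    b = np.asarray(feat_t1, dtype=float).ravel()
    a -= a.mean()
    b -= b.mean()
    ncc = a.dot(b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)
    return 1.0 - ncc
```

    Minimising such a term alongside the usual detection losses pushes the network toward features that stay consistent for the same track ID across time, which is the stated purpose of the Siamese branch.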

  8. Generalized Birkhoffian representation of nonholonomic systems and its discrete variational algorithm

    NASA Astrophysics Data System (ADS)

    Liu, Shixing; Liu, Chang; Hua, Wei; Guo, Yongxin

    2016-11-01

    By using the discrete variational method, we study the numerical method of the general nonholonomic system in the generalized Birkhoffian framework, and construct a numerical method of generalized Birkhoffian equations called a self-adjoint-preserving algorithm. Numerical results show that it is reasonable to study the nonholonomic system by the structure-preserving algorithm in the generalized Birkhoffian framework. Project supported by the National Natural Science Foundation of China (Grant Nos. 11472124, 11572145, 11202090, and 11301350), the Doctor Research Start-up Fund of Liaoning Province, China (Grant No. 20141050), the China Postdoctoral Science Foundation (Grant No. 2014M560203), and the General Science and Technology Research Plans of Liaoning Educational Bureau, China (Grant No. L2013005).

  9. Time-dependent variational approach in terms of squeezed coherent states: Implication to semi-classical approximation

    NASA Technical Reports Server (NTRS)

    Tsue, Yasuhiko

    1994-01-01

    A general framework for time-dependent variational approach in terms of squeezed coherent states is constructed with the aim of describing quantal systems by means of classical mechanics including higher order quantal effects with the aid of canonicity conditions developed in the time-dependent Hartree-Fock theory. The Maslov phase occurring in a semi-classical quantization rule is investigated in this framework. In the limit of a semi-classical approximation in this approach, it is definitely shown that the Maslov phase has a geometric nature analogous to the Berry phase. It is also indicated that this squeezed coherent state approach is a possible way to go beyond the usual WKB approximation.

  10. Sensitive detection of KIT D816V in patients with mastocytosis.

    PubMed

    Tan, Angela; Westerman, David; McArthur, Grant A; Lynch, Kevin; Waring, Paul; Dobrovic, Alexander

    2006-12-01

    The 2447 A > T pathogenic variation at codon 816 of exon 17 (D816V) in the KIT gene, occurring in systemic mastocytosis (SM), leads to constitutive activation of tyrosine kinase activity and confers resistance to the tyrosine kinase inhibitor imatinib mesylate. Thus detection of this variation in SM patients is important for determining treatment strategy, but because the population of malignant cells carrying this variation is often small relative to the normal cell population, standard molecular detection methods can be unsuccessful. We developed 2 methods for detection of KIT D816V in SM patients. The first uses enriched sequencing of mutant alleles (ESMA) after BsmAI restriction enzyme digestion, and the second uses an allele-specific competitive blocker PCR (ACB-PCR) assay. We used these methods to assess 26 patients undergoing evaluation for SM, 13 of whom had SM meeting WHO classification criteria (before variation testing), and we compared the results with those obtained by direct sequencing. The sensitivities of the ESMA and the ACB-PCR assays were 1% and 0.1%, respectively. According to the ACB-PCR assay results, 65% (17/26) of patients were positive for D816V. Of the 17 positive cases, only 23.5% (4/17) were detected by direct sequencing. ESMA detected 2 additional exon 17 pathogenic variations, D816Y and D816N, but detected only 12 (70.5%) of the 17 D816V-positive cases. Overall, 100% (15/15) of the WHO-classified SM cases were codon 816 pathogenic variation positive. These findings demonstrate that the ACB-PCR assay combined with ESMA is a rapid and highly sensitive approach for detection of KIT D816V in SM patients.

  11. An Odds Ratio Approach for Detecting DDF under the Nested Logit Modeling Framework

    ERIC Educational Resources Information Center

    Terzi, Ragip; Suh, Youngsuk

    2015-01-01

    An odds ratio approach (ORA) under the framework of a nested logit model was proposed for evaluating differential distractor functioning (DDF) in multiple-choice items and was compared with an existing ORA developed under the nominal response model. The performances of the two ORAs for detecting DDF were investigated through an extensive…

  12. Rational synthesis of an exceptionally stable Zn(II) metal-organic framework for the highly selective and sensitive detection of picric acid.

    PubMed

    Hu, Yingli; Ding, Meili; Liu, Xiao-Qin; Sun, Lin-Bing; Jiang, Hai-Long

    2016-04-28

    Based on an organic ligand involving both carboxylate and tetrazole groups, a chemically stable Zn(II) metal-organic framework has been rationally synthesized and behaves as a fluorescence chemosensor for the highly selective and sensitive detection of picric acid, an extremely hazardous and strong explosive.

  13. A collaborative computing framework of cloud network and WBSN applied to fall detection and 3-D motion reconstruction.

    PubMed

    Lai, Chin-Feng; Chen, Min; Pan, Jeng-Shyang; Youn, Chan-Hyun; Chao, Han-Chieh

    2014-03-01

As cloud computing and wireless body sensor network technologies gradually mature, ubiquitous healthcare services can prevent accidents instantly and effectively, as well as provide relevant information to reduce related processing time and cost. This study proposes a co-processing intermediary framework that integrates cloud computing and wireless body sensor networks, applied mainly to fall detection and 3-D motion reconstruction. The main focuses of this study include distributed computing and resource allocation for processing sensing data over the computing architecture, network conditions, and performance evaluation. Through this framework, the transmission and computing time of sensing data are reduced, enhancing the overall performance of the fall detection and 3-D motion reconstruction services.

  14. Stepwise and stagewise approaches for spatial cluster detection

    PubMed Central

    Xu, Jiale

    2016-01-01

Spatial cluster detection is an important tool in many areas such as sociology, botany and public health. Previous work has mostly taken either a hypothesis-testing framework or a Bayesian framework. In this paper, we propose several approaches under a frequentist variable selection framework for spatial cluster detection. The forward stepwise methods search for multiple clusters by iteratively adding the currently most likely cluster while adjusting for the effects of previously identified clusters. The stagewise methods also consist of a series of steps, but with a tiny step size in each iteration. We study the features and performance of our proposed methods using simulations on idealized grids or real geographic areas. From the simulations, we compare the performance of the proposed methods in terms of estimation accuracy and detection power. These methods are applied to the well-known New York leukemia data as well as Indiana poverty data. PMID:27246273
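    The forward stepwise idea can be illustrated with a toy sketch. The candidate regions, counts, and the simple excess-count score below are illustrative assumptions, not the paper's likelihood-based criterion: each step selects the region with the largest remaining excess of observed over expected counts, then adjusts the expected counts so later steps account for clusters already found.

```python
import numpy as np

def forward_stepwise_clusters(observed, expected, candidates, max_steps=3, min_excess=0.0):
    """Greedy forward search: at each step, select the candidate region with
    the largest excess of observed over (currently adjusted) expected counts,
    then absorb that excess so later steps adjust for clusters already found."""
    observed = np.asarray(observed, dtype=float)
    expected = np.array(expected, dtype=float)  # copy; adjusted in place below
    selected = []
    for _ in range(max_steps):
        excess = [observed[list(c)].sum() - expected[list(c)].sum() for c in candidates]
        best = int(np.argmax(excess))
        if excess[best] <= min_excess:
            break
        selected.append(candidates[best])
        idx = list(candidates[best])
        expected[idx] = observed[idx]  # adjust for the identified cluster
    return selected

# Toy 1-D "map": 10 cells with baseline expectation 5 per cell and
# elevated counts in cells 2-3 and in cell 7.
obs = np.array([5, 4, 12, 11, 5, 6, 4, 15, 5, 5], dtype=float)
exp = np.full(10, 5.0)
cands = [(2, 3), (7,), (0, 1), (4, 5, 6)]
print(forward_stepwise_clusters(obs, exp, cands))
```

    The search stops once no candidate shows a positive residual excess, so the two planted clusters are recovered in order of strength.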

  15. Highly sensitive and selective fluoride detection in water through fluorophore release from a metal-organic framework

    PubMed Central

    Hinterholzinger, Florian M.; Rühle, Bastian; Wuttke, Stefan; Karaghiosoff, Konstantin; Bein, Thomas

    2013-01-01

    The detection, differentiation and visualization of compounds such as gases, liquids or ions are key challenges for the design of selective optical chemosensors. Optical chemical sensors employ a transduction mechanism that converts a specific analyte recognition event into an optical signal. Here we report a novel concept for fluoride ion sensing where a porous crystalline framework serves as a host for a fluorescent reporter molecule. The detection is based on the decomposition of the host scaffold which induces the release of the fluorescent dye molecule. Specifically, the hybrid composite of the metal-organic framework NH2-MIL-101(Al) and fluorescein acting as reporter shows an exceptional turn-on fluorescence in aqueous fluoride-containing solutions. Using this novel strategy, the optical detection of fluoride is extremely sensitive and highly selective in the presence of many other anions. PMID:24008779

  16. Improved biliary detection and diagnosis through intelligent machine analysis.

    PubMed

    Logeswaran, Rajasvaran

    2012-09-01

    This paper reports on work undertaken to improve automated detection of bile ducts in magnetic resonance cholangiopancreatography (MRCP) images, with the objective of conducting preliminary classification of the images for diagnosis. The proposed I-BDeDIMA (Improved Biliary Detection and Diagnosis through Intelligent Machine Analysis) scheme is a multi-stage framework consisting of successive phases of image normalization, denoising, structure identification, object labeling, feature selection and disease classification. A combination of multiresolution wavelet, dynamic intensity thresholding, segment-based region growing, region elimination, statistical analysis and neural networks, is used in this framework to achieve good structure detection and preliminary diagnosis. Tests conducted on over 200 clinical images with known diagnosis have shown promising results of over 90% accuracy. The scheme outperforms related work in the literature, making it a viable framework for computer-aided diagnosis of biliary diseases. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  17. Quantifying the performance of in vivo portal dosimetry in detecting four types of treatment parameter variations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bojechko, C.; Ford, E. C., E-mail: eford@uw.edu

    Purpose: To quantify the ability of electronic portal imaging device (EPID) dosimetry used during treatment (in vivo) in detecting variations that can occur in the course of patient treatment. Methods: Images of transmitted radiation from in vivo EPID measurements were converted to a 2D planar dose at isocenter and compared to the treatment planning dose using a prototype software system. Using the treatment planning system (TPS), four different types of variability were modeled: overall dose scaling, shifting the positions of the multileaf collimator (MLC) leaves, shifting of the patient position, and changes in the patient body contour. The gamma pass rate was calculated for the modified and unmodified plans and used to construct a receiver operator characteristic (ROC) curve to assess the detectability of the different parameter variations. The detectability is given by the area under the ROC curve (AUC). The TPS was also used to calculate the impact of the variations on the target dose-volume histogram. Results: Nine intensity modulated radiation therapy plans were measured for four different anatomical sites consisting of 70 separate fields. Results show that in vivo EPID dosimetry was most sensitive to variations in the machine output, AUC = 0.70-0.94, changes in patient body habitus, AUC = 0.67-0.88, and systematic shifts in the MLC bank positions, AUC = 0.59-0.82. These deviations are expected to have a relatively small clinical impact [planning target volume (PTV) D99 change <7%]. Larger variations have even higher detectability. Displacements in the patient's position and random variations in MLC leaf positions were not readily detectable, AUC < 0.64. The D99 of the PTV changed by up to 57% for the patient position shifts considered here. Conclusions: In vivo EPID dosimetry is able to detect relatively small variations in overall dose, systematic shifts of the MLCs, and changes in the patient habitus. Shifts in the patient's position, which can introduce large changes in the target dose coverage, were not readily detected.
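    The detectability analysis above can be sketched numerically: the AUC equals the Mann-Whitney probability that a plan with an introduced variation scores higher on the detection statistic (here, 100 minus the gamma pass rate) than an unmodified plan. The pass-rate values below are illustrative, not the paper's data.

```python
import numpy as np

def roc_auc(scores_negative, scores_positive):
    """AUC via the Mann-Whitney U statistic: the probability that a randomly
    chosen positive case (variation present) scores higher on the detection
    statistic than a randomly chosen negative case; ties count as half."""
    neg = np.asarray(scores_negative, dtype=float)
    pos = np.asarray(scores_positive, dtype=float)
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Detection statistic = 100 - gamma pass rate (a lower pass rate is more
# suspicious). Hypothetical pass rates for unmodified plans and for plans
# with a simulated MLC bank shift.
unmodified = 100 - np.array([99.1, 98.7, 99.5, 97.9, 98.8])
mlc_shift = 100 - np.array([95.2, 93.8, 98.5, 96.1, 92.4])
print(round(roc_auc(unmodified, mlc_shift), 3))
```

    An AUC of 0.5 means the statistic cannot distinguish modified from unmodified plans; values near 1 mean the variation is readily detectable.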

  18. On a variational approach to some parameter estimation problems

    NASA Technical Reports Server (NTRS)

    Banks, H. T.

    1985-01-01

Examples in which a variational setting can provide a convenient framework for convergence and stability arguments in parameter estimation problems are considered; these include 1-D seismic, large flexible structures, bioturbation, and nonlinear population dispersal. One aspect of the problem considered is arguments for convergence and stability, via a variational approach, of least-squares formulations of parameter estimation problems for partial differential equations.

  19. A Metal-Polydopamine Framework (MPDA) as an Effective Fluorescent Quencher for Highly Sensitive Detection of Hg(II) and Ag(I) Ions Through Exonuclease III Activity.

    PubMed

    Ravikumar, Ayyanu; Panneerselvam, Perumal; Morad, Norhashimah

    2018-05-24

In this paper, we propose a metal-polydopamine framework (MPDA) with a specific molecular probe, which appears to be a most promising approach to a strong fluorescence quencher. The quenching ability of the MPDA framework towards various organic fluorophores, such as aminoethylcoumarin acetate (AMCA), 6-carboxyfluorescein (FAM), carboxytetramethylrhodamine (TAMRA) and Cy5, is used to establish a fluorescent biosensor that can selectively recognize Hg2+ and Ag+ ions. The fluorescence quenching efficiency reached more than 96%. The MPDA framework also exhibits different affinities for ssDNA and dsDNA. In addition, FAM-labelled ssDNA was adsorbed onto the MPDA framework, and the resulting MPDA framework/ssDNA complex was taken as the sensing platform. By taking advantage of this sensor, highly sensitive and selective determination of Hg2+ and Ag+ ions is achieved through Exonuclease III signal amplification. The detection limits achieved for Hg2+ and Ag+, 1.2 pM and 34 pM respectively, were assessed against co-existing metal ions and compared with GO-based sensors. Furthermore, this study establishes highly sensitive fluorescence detection of targets with potential applications in environmental and biological fields.

  20. Genome-Wide Structural Variation Detection by Genome Mapping on Nanochannel Arrays.

    PubMed

    Mak, Angel C Y; Lai, Yvonne Y Y; Lam, Ernest T; Kwok, Tsz-Piu; Leung, Alden K Y; Poon, Annie; Mostovoy, Yulia; Hastie, Alex R; Stedman, William; Anantharaman, Thomas; Andrews, Warren; Zhou, Xiang; Pang, Andy W C; Dai, Heng; Chu, Catherine; Lin, Chin; Wu, Jacob J K; Li, Catherine M L; Li, Jing-Woei; Yim, Aldrin K Y; Chan, Saki; Sibert, Justin; Džakula, Željko; Cao, Han; Yiu, Siu-Ming; Chan, Ting-Fung; Yip, Kevin Y; Xiao, Ming; Kwok, Pui-Yan

    2016-01-01

    Comprehensive whole-genome structural variation detection is challenging with current approaches. With diploid cells as DNA source and the presence of numerous repetitive elements, short-read DNA sequencing cannot be used to detect structural variation efficiently. In this report, we show that genome mapping with long, fluorescently labeled DNA molecules imaged on nanochannel arrays can be used for whole-genome structural variation detection without sequencing. While whole-genome haplotyping is not achieved, local phasing (across >150-kb regions) is routine, as molecules from the parental chromosomes are examined separately. In one experiment, we generated genome maps from a trio from the 1000 Genomes Project, compared the maps against that derived from the reference human genome, and identified structural variations that are >5 kb in size. We find that these individuals have many more structural variants than those published, including some with the potential of disrupting gene function or regulation. Copyright © 2016 by the Genetics Society of America.

  1. A Unified Multiscale Framework for Planar, Surface, and Curve Skeletonization.

    PubMed

    Jalba, Andrei C; Sobiecki, Andre; Telea, Alexandru C

    2016-01-01

Computing skeletons of 2D shapes, and medial surface and curve skeletons of 3D shapes, is a challenging task. In particular, there is no unified framework that detects all types of skeletons using a single model and also produces a multiscale representation which allows one to progressively simplify, or regularize, all skeleton types. In this paper, we present such a framework. We model skeleton detection and regularization by a conservative mass transport process from a shape's boundary to its surface skeleton, next to its curve skeleton, and finally to the shape center. The resulting density field can be thresholded to obtain a multiscale representation of progressively simplified surface, or curve, skeletons. We detail a numerical implementation of our framework which is demonstrably stable and has high computational efficiency. We demonstrate our framework on several complex 2D and 3D shapes.

  2. Vascular Variations Associated with Intracranial Aneurysms.

    PubMed

    Orakdogen, Metin; Emon, Selin Tural; Somay, Hakan; Engin, Taner; Is, Merih; Hakan, Tayfun

    2017-01-01

To investigate the vascular variations of the circle of Willis in patients with intracranial aneurysm, we used data on 128 consecutive intracranial aneurysm cases. Cerebral angiography images were analyzed retrospectively. For grouping vascular variations, arteries were grouped into the anterior cerebral arterial system (ACS), posterior cerebral arterial system (PCS) and middle cerebral arterial system (MCS). Lateralization, single or multiple occurrence, and gender were inspected, as well as any connection with the accompanying aneurysms' number, localization and dimension, and whether the aneurysm was bleeding or incidental. Variations were demonstrated in 57.8% of the cases. The most common variation was A1 variation (34.4%). The rates of variations were 36.7%, 24.2% and 10.2% in the ACS, PCS and MCS, respectively. MCS variations were significantly more common in males. Compared with cases in which no ACS variation was detected, anterior communicating artery (ACoA) aneurysm observance rates were significantly higher, while posterior communicating artery (PCoA) aneurysm and middle cerebral artery (MCA) aneurysm observance rates were significantly lower. In cases with a detected PCS variation, PCoA aneurysm observance rates and the coexistence of multiple variations were significantly higher. The rate of vascular variations in patients with aneurysms was 57.8%. Arterial hypoplasia and aplasia were the most common variations. The ACS was the most common region in which variations were located; they were mostly detected on the right side. Coexistence of ACoA aneurysms was higher than that of PCoA and MCA aneurysms. In the PCS variation group, PCoA aneurysms were the most common aneurysms accompanying the variation, and multiple variations were more common than in the other two groups. Variations in the MCS were most common in males.

  3. Latest Results From the QuakeFinder Statistical Analysis Framework

    NASA Astrophysics Data System (ADS)

    Kappler, K. N.; MacLean, L. S.; Schneider, D.; Bleier, T.

    2017-12-01

    Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of the earth's magnetic field along several active fault systems. The QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. There have been many reports of anomalous variations in the earth's magnetic field preceding earthquakes; in particular, several authors have drawn attention to apparent anomalous pulsations seen preceding earthquakes. Studies in long-term monitoring of seismic activity are often limited by the availability of event data, and it is particularly difficult to acquire a large dataset for rigorous statistical analyses of the magnetic field near earthquake epicenters because large events are relatively rare. Since QF has recorded hundreds of earthquakes in more than 70 TB of data, we developed an automated, algorithmic framework for assessing the statistical significance of precursory behavior. Previously QF reported on the development of this Algorithmic Framework for data processing and hypothesis testing. The particular algorithm instance we discuss identifies and counts magnetic variations in time series data and ranks each station-day according to the aggregate number of pulses in a time window preceding the day in question. If the hypothesis is true that magnetic field activity increases over some time interval preceding earthquakes, this should reveal itself by the station-days on which earthquakes occur receiving higher ranks than they would if the ranking scheme were random. This can be analysed using the Receiver Operating Characteristic test. In this presentation we give a status report of our latest results, largely focussed on reproducibility of results, robust statistics in the presence of missing data, and exploration of optimization landscapes in our parameter space.

  4. GAGA: a new algorithm for genomic inference of geographic ancestry reveals fine level population substructure in Europeans.

    PubMed

    Lao, Oscar; Liu, Fan; Wollstein, Andreas; Kayser, Manfred

    2014-02-01

    Attempts to detect genetic population substructure in humans are troubled by the fact that the vast majority of the total amount of observed genetic variation is present within populations rather than between populations. Here we introduce a new algorithm for transforming a genetic distance matrix that reduces the within-population variation considerably. Extensive computer simulations revealed that the transformed matrix captured the genetic population differentiation better than the original one which was based on the T1 statistic. In an empirical genomic data set comprising 2,457 individuals from 23 different European subpopulations, the proportion of individuals that were determined as a genetic neighbour to another individual from the same sampling location increased from 25% with the original matrix to 52% with the transformed matrix. Similarly, the percentage of genetic variation explained between populations by means of Analysis of Molecular Variance (AMOVA) increased from 1.62% to 7.98%. Furthermore, the first two dimensions of a classical multidimensional scaling (MDS) using the transformed matrix explained 15% of the variance, compared to 0.7% obtained with the original matrix. Application of MDS with Mclust, SPA with Mclust, and GemTools algorithms to the same dataset also showed that the transformed matrix gave a better association of the genetic clusters with the sampling locations, and particularly so when it was used in the AMOVA framework with a genetic algorithm. Overall, the new matrix transformation introduced here substantially reduces the within population genetic differentiation, and can be broadly applied to methods such as AMOVA to enhance their sensitivity to reveal population substructure. 
We herewith provide a publicly available (http://www.erasmusmc.nl/fmb/resources/GAGA) model-free method for improved genetic population substructure detection that can be applied to data from humans as well as any other species in future studies relevant to evolutionary biology, behavioural ecology, medicine, and forensics.

  5. Colour based fire detection method with temporal intensity variation filtration

    NASA Astrophysics Data System (ADS)

    Trambitckii, K.; Anding, K.; Musalimov, V.; Linß, G.

    2015-02-01

The development of video and computing technologies and of computer vision makes automatic fire detection from video possible. Within this project, different algorithms were implemented to find a more efficient way of detecting fire. This article describes a colour-based fire detection algorithm. However, colour information alone is not sufficient to detect fire reliably, mainly because the scene may contain many objects whose colour is similar to that of fire. The temporal intensity variation of pixels, averaged over a series of several frames, is therefore used to separate such objects from fire. The algorithm works robustly and was implemented as a computer program using the OpenCV library.
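    The two-stage idea, a colour rule followed by a temporal intensity variation filter, can be sketched as follows. The abstract's implementation uses OpenCV; this illustrative version uses only NumPy, and the colour and variance thresholds are assumptions rather than the paper's values.

```python
import numpy as np

def fire_mask(frames, rgb_thresh=(180, 100, 60), var_thresh=50.0):
    """Sketch of the two-stage filter: (1) a colour rule keeps pixels of the
    latest frame whose RGB values look flame-like (bright red, R > G > B),
    (2) a temporal filter keeps only pixels whose intensity fluctuates across
    the frame series, since static fire-coloured objects show little variation."""
    frames = np.asarray(frames, dtype=float)  # shape (T, H, W, 3), RGB
    last = frames[-1]
    r, g, b = last[..., 0], last[..., 1], last[..., 2]
    colour_ok = ((r > rgb_thresh[0]) & (g > rgb_thresh[1]) &
                 (b < rgb_thresh[2]) & (r > g) & (g > b))
    # Temporal intensity variation, computed over the series of frames.
    intensity = frames.mean(axis=-1)   # (T, H, W) per-pixel intensity
    variation = intensity.var(axis=0)  # per-pixel variance over time
    return colour_ok & (variation > var_thresh)
```

    A flickering fire-coloured pixel passes both stages; a static fire-coloured object (e.g. a red wall) passes the colour rule but is rejected by the variance filter.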

  6. Deaf Pupils' Reasoning about Scientific Phenomena: School Science as a Framework for Understanding or as Fragments of Factual Knowledge.

    ERIC Educational Resources Information Center

    Molander, B. O.; Pedersen, Svend; Norell, Kia

    2001-01-01

    A Swedish interview study of how deaf pupils reason about phenomena in a science context revealed significant variation in the extent to which pupils used scientific principles for reasoning about science phenomena, which suggests that for some pupils, school science offers little as a framework for reasoning. (Contains references.) (DB)

  7. Exonic duplication CNV of NDRG1 associated with autosomal-recessive HMSN-Lom/CMT4D.

    PubMed

    Okamoto, Yuji; Goksungur, Meryem Tuba; Pehlivan, Davut; Beck, Christine R; Gonzaga-Jauregui, Claudia; Muzny, Donna M; Atik, Mehmed M; Carvalho, Claudia M B; Matur, Zeliha; Bayraktar, Serife; Boone, Philip M; Akyuz, Kaya; Gibbs, Richard A; Battaloglu, Esra; Parman, Yesim; Lupski, James R

    2014-05-01

    Copy-number variations as a mutational mechanism contribute significantly to human disease. Approximately one-half of the patients with Charcot-Marie-Tooth (CMT) disease have a 1.4 Mb duplication copy-number variation as the cause of their neuropathy. However, non-CMT1A neuropathy patients rarely have causative copy-number variations, and to date, autosomal-recessive disease has not been associated with copy-number variation as a mutational mechanism. We performed Agilent 8 × 60 K array comparative genomic hybridization on DNA from 12 recessive Turkish families with CMT disease. Additional molecular studies were conducted to detect breakpoint junctions and to evaluate gene expression levels in a family in which we detected an intragenic duplication copy-number variation. We detected an ~6.25 kb homozygous intragenic duplication in NDRG1, a gene known to be causative for recessive HMSNL/CMT4D, in three individuals from a Turkish family with CMT neuropathy. Further studies showed that this intragenic copy-number variation resulted in a homozygous duplication of exons 6-8 that caused decreased mRNA expression of NDRG1. Exon-focused high-resolution array comparative genomic hybridization enables the detection of copy-number variation carrier states in recessive genes, particularly small copy-number variations encompassing or disrupting single genes. In families for whom a molecular diagnosis has not been elucidated by conventional clinical assays, an assessment for copy-number variations in known CMT genes might be considered.

  8. Linking the serotonin transporter gene, family environments, hippocampal volume and depression onset: A prospective imaging gene × environment analysis.

    PubMed

    Little, Keriann; Olsson, Craig A; Youssef, George J; Whittle, Sarah; Simmons, Julian G; Yücel, Murat; Sheeber, Lisa B; Foley, Debra L; Allen, Nicholas B

    2015-11-01

A single imaging gene-environment (IGxE) framework that is able to simultaneously model genetic, neurobiological, and environmental influences on psychopathology outcomes is needed to improve understanding of how complex interrelationships between allelic variation, differences in neuroanatomy or neuroactivity, and environmental experience affect risk for psychiatric disorder. In a longitudinal study of adolescent development we demonstrate the utility of such an IGxE framework by testing whether variation in parental behavior at age 12 altered the strength of an imaging genetics pathway, involving an indirect association between allelic variation in the serotonin transporter gene to variation in hippocampal volume and consequent onset of major depressive disorder by age 18. Results were consistent with the presence of an indirect effect of the serotonin transporter S-allele on depression onset via smaller left and right hippocampal volumes that was significant only in family environments involving either higher levels of parental aggression or lower levels of positive parenting. The previously reported finding of S-allele carriers' increased risk of depression in adverse environments may, therefore, be partly because of the effects of these environments on a neurobiological pathway from the serotonin transporter gene to depression onset that proceeds through variation in hippocampal volume. (c) 2015 APA, all rights reserved.

  9. Estimating abundance while accounting for rarity, correlated behavior, and other sources of variation in counts

    USGS Publications Warehouse

Dorazio, Robert M.; Martin, Julien; Edwards, Holly H.

    2013-01-01

    The class of N-mixture models allows abundance to be estimated from repeated, point count surveys while adjusting for imperfect detection of individuals. We developed an extension of N-mixture models to account for two commonly observed phenomena in point count surveys: rarity and lack of independence induced by unmeasurable sources of variation in the detectability of individuals. Rarity increases the number of locations with zero detections in excess of those expected under simple models of abundance (e.g., Poisson or negative binomial). Correlated behavior of individuals and other phenomena, though difficult to measure, increases the variation in detection probabilities among surveys. Our extension of N-mixture models includes a hurdle model of abundance and a beta-binomial model of detectability that accounts for additional (extra-binomial) sources of variation in detections among surveys. As an illustration, we fit this model to repeated point counts of the West Indian manatee, which was observed in a pilot study using aerial surveys. Our extension of N-mixture models provides increased flexibility. The effects of different sets of covariates may be estimated for the probability of occurrence of a species, for its mean abundance at occupied locations, and for its detectability.

  10. Estimating abundance while accounting for rarity, correlated behavior, and other sources of variation in counts.

    PubMed

    Dorazio, Robert M; Martin, Julien; Edwards, Holly H

    2013-07-01

    The class of N-mixture models allows abundance to be estimated from repeated, point count surveys while adjusting for imperfect detection of individuals. We developed an extension of N-mixture models to account for two commonly observed phenomena in point count surveys: rarity and lack of independence induced by unmeasurable sources of variation in the detectability of individuals. Rarity increases the number of locations with zero detections in excess of those expected under simple models of abundance (e.g., Poisson or negative binomial). Correlated behavior of individuals and other phenomena, though difficult to measure, increases the variation in detection probabilities among surveys. Our extension of N-mixture models includes a hurdle model of abundance and a beta-binomial model of detectability that accounts for additional (extra-binomial) sources of variation in detections among surveys. As an illustration, we fit this model to repeated point counts of the West Indian manatee, which was observed in a pilot study using aerial surveys. Our extension of N-mixture models provides increased flexibility. The effects of different sets of covariates may be estimated for the probability of occurrence of a species, for its mean abundance at occupied locations, and for its detectability.
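    A minimal simulation sketch may clarify the extended model's ingredients: a hurdle abundance component (occupancy plus zero-truncated Poisson) and beta-binomial detection that allows extra-binomial variation in detectability among surveys. The parameter names and values below are illustrative assumptions, not estimates from the manatee data.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_counts(n_sites, n_surveys, psi, lam, p_mean, rho):
    """Simulate repeated point counts under a hurdle N-mixture model with
    beta-binomial detection: a site is occupied with probability psi; occupied
    sites get a zero-truncated Poisson(lam) abundance; each survey's detection
    probability is drawn from a Beta with mean p_mean and overdispersion rho."""
    occupied = rng.random(n_sites) < psi
    # Zero-truncated Poisson abundance for occupied sites (simple rejection).
    N = np.zeros(n_sites, dtype=int)
    for i in np.flatnonzero(occupied):
        n = 0
        while n == 0:
            n = rng.poisson(lam)
        N[i] = n
    # Beta-binomial detection: survey-level p varies around p_mean, which
    # induces extra-binomial variation in the counts.
    a = p_mean * (1 - rho) / rho
    b = (1 - p_mean) * (1 - rho) / rho
    p = rng.beta(a, b, size=(n_sites, n_surveys))
    counts = rng.binomial(N[:, None], p)
    return N, counts

N, counts = simulate_counts(200, 4, psi=0.4, lam=6.0, p_mean=0.5, rho=0.2)
```

    Data generated this way exhibit both the excess zeros and the among-survey count variation that the authors' model is designed to absorb; fitting would proceed by maximizing the corresponding marginal likelihood.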

  11. Fast–slow continuum and reproductive strategies structure plant life-history variation worldwide

    PubMed Central

    Salguero-Gómez, Roberto; Jones, Owen R.; Jongejans, Eelke; Blomberg, Simon P.; Hodgson, David J.; Mbeau-Ache, Cyril; Zuidema, Pieter A.; de Kroon, Hans; Buckley, Yvonne M.

    2016-01-01

    The identification of patterns in life-history strategies across the tree of life is essential to our prediction of population persistence, extinction, and diversification. Plants exhibit a wide range of patterns of longevity, growth, and reproduction, but the general determinants of this enormous variation in life history are poorly understood. We use demographic data from 418 plant species in the wild, from annual herbs to supercentennial trees, to examine how growth form, habitat, and phylogenetic relationships structure plant life histories and to develop a framework to predict population performance. We show that 55% of the variation in plant life-history strategies is adequately characterized using two independent axes: the fast–slow continuum, including fast-growing, short-lived plant species at one end and slow-growing, long-lived species at the other, and a reproductive strategy axis, with highly reproductive, iteroparous species at one extreme and poorly reproductive, semelparous plants with frequent shrinkage at the other. Our findings remain consistent across major habitats and are minimally affected by plant growth form and phylogenetic ancestry, suggesting that the relative independence of the fast–slow and reproduction strategy axes is general in the plant kingdom. Our findings have similarities with how life-history strategies are structured in mammals, birds, and reptiles. The position of plant species populations in the 2D space produced by both axes predicts their rate of recovery from disturbances and population growth rate. This life-history framework may complement trait-based frameworks on leaf and wood economics; together these frameworks may allow prediction of responses of plants to anthropogenic disturbances and changing environments. PMID:26699477

  12. Fast-slow continuum and reproductive strategies structure plant life-history variation worldwide.

    PubMed

    Salguero-Gómez, Roberto; Jones, Owen R; Jongejans, Eelke; Blomberg, Simon P; Hodgson, David J; Mbeau-Ache, Cyril; Zuidema, Pieter A; de Kroon, Hans; Buckley, Yvonne M

    2016-01-05

    The identification of patterns in life-history strategies across the tree of life is essential to our prediction of population persistence, extinction, and diversification. Plants exhibit a wide range of patterns of longevity, growth, and reproduction, but the general determinants of this enormous variation in life history are poorly understood. We use demographic data from 418 plant species in the wild, from annual herbs to supercentennial trees, to examine how growth form, habitat, and phylogenetic relationships structure plant life histories and to develop a framework to predict population performance. We show that 55% of the variation in plant life-history strategies is adequately characterized using two independent axes: the fast-slow continuum, including fast-growing, short-lived plant species at one end and slow-growing, long-lived species at the other, and a reproductive strategy axis, with highly reproductive, iteroparous species at one extreme and poorly reproductive, semelparous plants with frequent shrinkage at the other. Our findings remain consistent across major habitats and are minimally affected by plant growth form and phylogenetic ancestry, suggesting that the relative independence of the fast-slow and reproduction strategy axes is general in the plant kingdom. Our findings have similarities with how life-history strategies are structured in mammals, birds, and reptiles. The position of plant species populations in the 2D space produced by both axes predicts their rate of recovery from disturbances and population growth rate. This life-history framework may complement trait-based frameworks on leaf and wood economics; together these frameworks may allow prediction of responses of plants to anthropogenic disturbances and changing environments.

  13. A Framework for Policies and Practices to Improve Test Security Programs: Prevention, Detection, Investigation, and Resolution (PDIR)

    ERIC Educational Resources Information Center

    Ferrara, Steve

    2017-01-01

    Test security is not an end in itself; it is important because we want to be able to make valid interpretations from test scores. In this article, I propose a framework for comprehensive test security systems: prevention, detection, investigation, and resolution. The article discusses threats to test security, roles and responsibilities, rigorous…

  14. Two luminescent Zn(II) metal-organic frameworks for exceptionally selective detection of picric acid explosives.

    PubMed

    Shi, Zhi-Qiang; Guo, Zi-Jian; Zheng, He-Gen

    2015-05-14

    Two luminescent Zn(II) metal-organic frameworks were prepared from a π-conjugated thiophene-containing carboxylic acid ligand. These two MOFs show strong luminescence that can be quenched by a series of nitroaromatic explosives. Importantly, they exhibit highly sensitive and selective detection of picric acid relative to other nitroaromatic explosives.

  15. FSM-F: Finite State Machine Based Framework for Denial of Service and Intrusion Detection in MANET

    PubMed Central

    N. Ahmed, Malik; Abdullah, Abdul Hanan; Kaiwartya, Omprakash

    2016-01-01

    Due to continuous advancements in wireless communication, in terms of both quality of communication and affordability of the technology, the application area of Mobile Ad hoc Networks (MANETs) is growing significantly, particularly in military and disaster management. Given the sensitivity of these application areas, security, in terms of detecting Denial of Service (DoS) attacks and intrusions, has become a prime concern for research and development in the field. Security systems suggested in the past suffer from a state-recognition problem: they cannot accurately identify the actual state of network nodes because the states of the nodes are not clearly defined. In this context, this paper proposes a framework based on a Finite State Machine (FSM) for denial-of-service and intrusion detection in MANETs. In particular, an Intrusion Detection system for the Ad hoc On-demand Distance Vector protocol (ID-AODV) is presented based on a finite state machine. Packet-dropping and sequence-number attacks are closely investigated, and detection systems for both types of attack are designed. The major functional modules of ID-AODV include a network monitoring system, the finite state machine, and an attack detection model. Simulations are carried out in the network simulator NS-2 to evaluate the performance of the proposed framework. A comparative evaluation against the state-of-the-art techniques RIDAN and AODV is also performed. The performance evaluations attest to the benefits of the proposed framework in providing better security against denial-of-service and intrusion attacks. PMID:27285146
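
    The node-state idea at the heart of such a framework can be sketched as a small transition table. A minimal sketch in Python, where the states, events, and thresholds are illustrative assumptions rather than the paper's exact ID-AODV design:

```python
# Minimal sketch of an FSM-based node model for intrusion detection.
# States ("NORMAL"/"SUSPECT"/"MALICIOUS") and events are illustrative
# assumptions, not the exact ID-AODV design.

# Transition table: (state, event) -> next state
TRANSITIONS = {
    ("NORMAL", "packet_dropped"): "SUSPECT",
    ("SUSPECT", "packet_dropped"): "MALICIOUS",
    ("SUSPECT", "packet_forwarded"): "NORMAL",
}

class NodeFSM:
    """Tracks the inferred state of one network node."""

    def __init__(self):
        self.state = "NORMAL"

    def on_event(self, event):
        # Unknown (state, event) pairs leave the state unchanged
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

fsm = NodeFSM()
for ev in ["packet_forwarded", "packet_dropped", "packet_dropped"]:
    fsm.on_event(ev)
print(fsm.state)  # a node that keeps dropping packets ends up MALICIOUS
```

    In the full framework the events would come from the network monitoring module, and the attack detection model would act on nodes that reach a malicious state.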

  16. Vibration-based damage detection in a concrete beam under temperature variations using AR models and state-space approaches

    NASA Astrophysics Data System (ADS)

    Clément, A.; Laurens, S.

    2011-07-01

    The Structural Health Monitoring of civil structures subjected to ambient vibrations is very challenging. Variations in environmental conditions and the difficulty of characterizing the excitation make damage detection a hard task. Auto-regressive (AR) model coefficients are often used as a damage-sensitive feature. The present work compares the AR approach with a state-space feature formed by the Jacobian matrix of the dynamical process. Since damage detection can be formulated as a novelty-detection problem, the Mahalanobis distance is applied to flag new points relative to an undamaged reference collection of feature vectors. Data from a concrete beam subjected to temperature variations and damaged by several static loadings are analyzed. Both damage-sensitive features turn out to be sensitive to temperature variations; nevertheless, the Mahalanobis distance makes it possible to detect cracking with either of them. Early damage (before cracking) is revealed only by the AR coefficients, with good sensitivity.
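
    The feature-extraction-plus-novelty-detection pipeline can be sketched in a few lines of NumPy. The AR order, synthetic signals, and thresholding below are illustrative assumptions standing in for the beam measurements, not the paper's exact procedure:

```python
import numpy as np

def ar_coeffs(x, p=4):
    # Least-squares fit of an AR(p) model; the coefficient vector
    # serves as the damage-sensitive feature.
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a

def mahalanobis(f, ref):
    # Distance of a feature vector from the undamaged reference cloud.
    mu = ref.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(ref.T))
    d = f - mu
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(0)
# Reference set: AR features extracted from healthy-state signals
healthy = np.array([ar_coeffs(rng.standard_normal(500)) for _ in range(50)])
# A signal with shifted dynamics (novelty) yields a far larger distance
t = np.arange(500)
novel = ar_coeffs(np.sin(0.3 * t) + 0.1 * rng.standard_normal(500))
print(mahalanobis(novel, healthy) > mahalanobis(healthy[0], healthy))
```

    A detection threshold on the distance (e.g. a high quantile of the reference distances) would then flag potentially damaged states.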

  17. CNV-seq, a new method to detect copy number variation using high-throughput sequencing.

    PubMed

    Xie, Chao; Tammi, Martti T

    2009-03-06

    DNA copy number variation (CNV) has been recognized as an important source of genetic variation. Array comparative genomic hybridization (aCGH) is commonly used for CNV detection, but the microarray platform has a number of inherent limitations. Here, we describe CNV-seq, a method to detect copy number variation using shotgun sequencing. The method is based on a robust statistical model that describes the complete analysis procedure and allows the computation of essential confidence values for CNV detection. Our results show that the number of reads, not their length, is the key factor determining the resolution of detection. This favors next-generation sequencing methods, which rapidly produce large amounts of short reads. Simulations of various sequencing methods with coverage between 0.1x and 8x show overall specificity between 91.7% and 99.9%, and sensitivity between 72.2% and 96.5%. We also present results for the assessment of CNV between two individual human genomes.
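
    The read-depth idea can be illustrated with a toy computation: count reads per genomic window in a case and a control sample, then form log2 copy-number ratios. The window size, the simulated duplication, and the simple library-size normalization below are illustrative assumptions, not CNV-seq's full statistical model:

```python
import numpy as np

def cnv_log2_ratios(case_pos, ctrl_pos, genome_len, window):
    # Count reads per window in both samples and form the log2 ratio.
    bins = np.arange(0, genome_len + window, window)
    case_counts, _ = np.histogram(case_pos, bins=bins)
    ctrl_counts, _ = np.histogram(ctrl_pos, bins=bins)
    # Normalise by total read count so library size drops out
    case_rate = case_counts / case_counts.sum()
    ctrl_rate = np.maximum(ctrl_counts / ctrl_counts.sum(), 1e-12)
    return np.log2(np.maximum(case_rate / ctrl_rate, 1e-12))

rng = np.random.default_rng(1)
G, W, N = 1_000_000, 20_000, 200_000
ctrl = rng.integers(0, G, N)
case = rng.integers(0, G, N)
# Simulate a duplication: extra reads in the 400-500 kb region
case = np.concatenate([case, rng.integers(400_000, 500_000, N // 10)])
log2r = cnv_log2_ratios(case, ctrl, G, W)
# Windows inside the duplicated region stand out with log2 ratio ~ +1
print(log2r.argmax())
```

    In the actual method, a statistical model over such window ratios supplies the confidence values described in the abstract.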

  18. A parallel spatiotemporal saliency and discriminative online learning method for visual target tracking in aerial videos.

    PubMed

    Aghamohammadi, Amirhossein; Ang, Mei Choo; A Sundararajan, Elankovan; Weng, Ng Kok; Mogharrebi, Marzieh; Banihashem, Seyed Yashar

    2018-01-01

    Visual tracking in aerial videos is a challenging task in computer vision and remote sensing due to appearance variation. Appearance variations are caused by camera and target motion, low-resolution noisy images, scale changes, and pose variations. Various approaches have been proposed to deal with appearance variation in aerial videos; amongst these, the spatiotemporal saliency detection approach has reported promising results in the context of moving target detection. However, it is not accurate for moving target detection when visual tracking is performed under appearance variations. In this study, a visual tracking method based on spatiotemporal saliency and discriminative online learning is proposed to deal with appearance variation. Temporal saliency, which represents moving target regions, is extracted from the frame difference using Sauvola's local adaptive thresholding algorithm. Spatial saliency, which represents the target appearance details in candidate moving regions, is detected by combining SLIC superpixel segmentation with color and moment features to compute the feature uniqueness and spatial compactness of the saliency measurements. Because this is a time-consuming process, a parallel algorithm was developed to optimize and distribute the saliency detection workload across multiple processors. Spatiotemporal saliency is then obtained by combining the temporal and spatial saliencies to represent moving targets. Finally, a discriminative online learning algorithm is applied to generate a sample model based on the spatiotemporal saliency; this sample model is incrementally updated to detect the target under appearance variation. Experiments conducted on the VIVID dataset demonstrate that the proposed visual tracking method is effective and computationally efficient compared to state-of-the-art methods.
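
    The temporal-saliency step (frame difference followed by Sauvola local thresholding) can be sketched as follows. The window size, Sauvola parameters, and synthetic frames are illustrative assumptions, not the paper's exact settings:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sauvola_threshold(img, win=15, k=0.2, R=128.0):
    # Sauvola local threshold: T = m * (1 + k * (s / R - 1)),
    # with m, s the local mean and standard deviation.
    x = img.astype(float)
    m = uniform_filter(x, win)
    s = np.sqrt(np.maximum(uniform_filter(x ** 2, win) - m ** 2, 0.0))
    return m * (1 + k * (s / R - 1))

def temporal_saliency(prev_frame, frame):
    # Frame difference followed by Sauvola thresholding marks
    # candidate moving-target regions.
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    return diff > sauvola_threshold(diff)

rng = np.random.default_rng(2)
f0 = rng.integers(0, 20, (64, 64)).astype(np.uint8)
f1 = f0.copy()
f1[30:38, 30:38] = 200            # a bright "target" appears
mask = temporal_saliency(f0, f1)
print(mask[32, 32], mask[5, 5])   # target pixel flagged, background not
```

    The spatial-saliency and online-learning stages would then refine these candidate regions into a tracked target model.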

  19. A parallel spatiotemporal saliency and discriminative online learning method for visual target tracking in aerial videos

    PubMed Central

    2018-01-01

    Visual tracking in aerial videos is a challenging task in computer vision and remote sensing due to appearance variation. Appearance variations are caused by camera and target motion, low-resolution noisy images, scale changes, and pose variations. Various approaches have been proposed to deal with appearance variation in aerial videos; amongst these, the spatiotemporal saliency detection approach has reported promising results in the context of moving target detection. However, it is not accurate for moving target detection when visual tracking is performed under appearance variations. In this study, a visual tracking method based on spatiotemporal saliency and discriminative online learning is proposed to deal with appearance variation. Temporal saliency, which represents moving target regions, is extracted from the frame difference using Sauvola's local adaptive thresholding algorithm. Spatial saliency, which represents the target appearance details in candidate moving regions, is detected by combining SLIC superpixel segmentation with color and moment features to compute the feature uniqueness and spatial compactness of the saliency measurements. Because this is a time-consuming process, a parallel algorithm was developed to optimize and distribute the saliency detection workload across multiple processors. Spatiotemporal saliency is then obtained by combining the temporal and spatial saliencies to represent moving targets. Finally, a discriminative online learning algorithm is applied to generate a sample model based on the spatiotemporal saliency; this sample model is incrementally updated to detect the target under appearance variation. Experiments conducted on the VIVID dataset demonstrate that the proposed visual tracking method is effective and computationally efficient compared to state-of-the-art methods. PMID:29438421

  20. From Pixels to Response Maps: Discriminative Image Filtering for Face Alignment in the Wild.

    PubMed

    Asthana, Akshay; Zafeiriou, Stefanos; Tzimiropoulos, Georgios; Cheng, Shiyang; Pantic, Maja

    2015-06-01

    We propose a face alignment framework that relies on the texture model generated by the responses of discriminatively trained part-based filters. Unlike standard texture models built from pixel intensities or responses generated by generic filters (e.g. Gabor), our framework has two important advantages. First, by virtue of discriminative training, invariance to external variations (like identity, pose, illumination and expression) is achieved. Second, we show that the responses generated by discriminatively trained filters (or patch-experts) are sparse and can be modeled using a very small number of parameters. As a result, the optimization methods based on the proposed texture model can better cope with unseen variations. We illustrate this point by formulating both part-based and holistic approaches for generic face alignment and show that our framework outperforms the state-of-the-art on multiple "wild" databases. The code and dataset annotations are available for research purposes from http://ibug.doc.ic.ac.uk/resources.

  1. The MHOST finite element program: 3-D inelastic analysis methods for hot section components. Volume 1: Theoretical manual

    NASA Technical Reports Server (NTRS)

    Nakazawa, Shohei

    1991-01-01

    Formulations and algorithms implemented in the MHOST finite element program are discussed. The code uses a novel mixed iterative solution technique for efficient 3-D computations of turbine engine hot section components. The general framework of the variational formulation and the solution algorithms, derived from the mixed three-field Hu-Washizu principle, is discussed. This formulation enables the use of nodal interpolation for coordinates, displacements, strains, and stresses. The algorithmic description of the mixed iterative method includes variants for quasi-static, transient dynamic, and buckling analyses. The global-local analysis procedure, referred to as subelement refinement, is developed within the mixed iterative solution framework and presented in detail. The numerically integrated isoparametric elements implemented in the framework are discussed. Methods to filter certain parts of the strain and to project element-discontinuous quantities to the nodes are developed for a family of linear elements. Integration algorithms are described for the linear and nonlinear equations included in the MHOST program.

  2. A framework for the interpretation of de novo mutation in human disease

    PubMed Central

    Samocha, Kaitlin E.; Robinson, Elise B.; Sanders, Stephan J.; Stevens, Christine; Sabo, Aniko; McGrath, Lauren M.; Kosmicki, Jack A.; Rehnström, Karola; Mallick, Swapan; Kirby, Andrew; Wall, Dennis P.; MacArthur, Daniel G.; Gabriel, Stacey B.; dePristo, Mark; Purcell, Shaun M.; Palotie, Aarno; Boerwinkle, Eric; Buxbaum, Joseph D.; Cook, Edwin H.; Gibbs, Richard A.; Schellenberg, Gerard D.; Sutcliffe, James S.; Devlin, Bernie; Roeder, Kathryn; Neale, Benjamin M.; Daly, Mark J.

    2014-01-01

    Spontaneously arising (‘de novo’) mutations play an important role in medical genetics. For diseases with extensive locus heterogeneity – such as autism spectrum disorders (ASDs) – the signal from de novo mutations (DNMs) is distributed across many genes, making it difficult to distinguish disease-relevant mutations from background variation. We provide a statistical framework for the analysis of DNM excesses per gene and gene set by calibrating a model of de novo mutation. We applied this framework to DNMs collected from 1,078 ASD trios and – while affirming a significant role for loss-of-function (LoF) mutations – found no excess of de novo LoF mutations in cases with IQ above 100, suggesting that the role of DNMs in ASD may reside in fundamental neurodevelopmental processes. We also used our model to identify ~1,000 genes that are significantly lacking functional coding variation in non-ASD samples and are enriched for de novo LoF mutations identified in ASD cases. PMID:25086666

  3. Geographical Variation in Diabetes Prevalence and Detection in China: Multilevel Spatial Analysis of 98,058 Adults

    PubMed Central

    Zhou, Maigeng; Astell-Burt, Thomas; Bi, Yufang; Feng, Xiaoqi; Jiang, Yong; Li, Yichong; Page, Andrew; Wang, Limin; Xu, Yu

    2015-01-01

    OBJECTIVE To investigate the geographic variation in diabetes prevalence and detection in China. RESEARCH DESIGN AND METHODS Self-report and biomedical data were collected from 98,058 adults aged ≥18 years (90.5% response) from 162 areas spanning mainland China. Diabetes status was assessed using American Diabetes Association criteria. Among those with diabetes, detection was defined by prior diagnosis. Choropleth maps were used to visually assess geographical variation in each outcome at the provincial level. The odds of each outcome were assessed using multilevel logistic regression, with adjustment for person- and area-level characteristics. RESULTS Geographic visualization at the provincial level indicated widespread variation in diabetes prevalence and detection across China. Regional prevalence adjusted for age, sex, and urban/rural socioeconomic circumstances (SECs) ranged from 8.3% (95% CI 7.2%, 9.7%) in the northeast to 12.7% (11.1%, 14.6%) in the north. A clear negative gradient in diabetes prevalence was observed from 13.1% (12.0%, 14.4%) in the urban high-SEC to 8.7% (7.8%, 9.6%) in rural low-SEC counties/districts. Adjusting for health literacy and other person-level characteristics only partially attenuated these geographic variations. Only one-third of participants living with diabetes had been previously diagnosed, but this also varied substantively by geography. Regional detection adjusted for age, sex, and urban/rural SEC, for example, spanned from 40.4% (34.9%, 46.3%) in the north to 15.6% (11.7%, 20.5%) in the southwest. Compared with detection of 40.8% (37.3%, 44.4%) in urban high-SEC counties, detection was poorest among rural low-SEC counties at just 20.5% (17.7%, 23.7%). Person-level characteristics did not fully account for these geographic variations in diabetes detection. CONCLUSIONS Strategies for addressing diabetes risk and improving detection require geographical targeting. PMID:25352654

  4. Long Term Safety Area Tracking (LT-SAT) with online failure detection and recovery for robotic minimally invasive surgery.

    PubMed

    Penza, Veronica; Du, Xiaofei; Stoyanov, Danail; Forgione, Antonello; Mattos, Leonardo S; De Momi, Elena

    2018-04-01

    Despite the benefits introduced by robotic systems in abdominal Minimally Invasive Surgery (MIS), major complications such as intra-operative bleeding can still affect the outcome of the procedure. One of the causes is accidental damage to arteries or veins by the surgical tools, and some of the possible risk factors are related to the lack of sub-surface visibility. Assistive tools that guide the surgical gestures to prevent these kinds of injuries would represent a relevant step towards safer clinical procedures. However, it is still challenging to develop computer vision systems able to fulfill the main requirements: (i) long-term robustness, (ii) adaptation to environment/object variation and (iii) real-time processing. The purpose of this paper is to develop computer vision algorithms that robustly track soft tissue areas (Safety Areas, SA), defined intra-operatively by the surgeon based on the real-time endoscopic images, or registered from a pre-operative surgical plan. We propose a framework that combines an optical flow algorithm with a tracking-by-detection approach in order to be robust against failures caused by: (i) partial occlusion, (ii) total occlusion, (iii) the SA leaving the field of view, (iv) deformation, (v) illumination changes, (vi) abrupt camera motion, (vii) blur and (viii) smoke. A Bayesian inference-based approach is used to detect failure of the tracker from online context information. A Model Update Strategy (MUpS) is also proposed to improve SA re-detection after failures, taking into account the changes in appearance of the SA model due to contact with instruments or image noise. The performance of the algorithm was assessed on two datasets representing ex-vivo organs and in-vivo surgical scenarios. Results show that the proposed framework, enhanced with MUpS, is capable of maintaining high tracking performance for extended periods of time (≃4 min, containing the aforementioned events) with high precision (0.7) and recall (0.8), and with a recovery time after failure of between 1 and 8 frames in the worst case.

  5. Regularization by Functions of Bounded Variation and Applications to Image Enhancement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casas, E.; Kunisch, K.; Pola, C.

    1999-09-15

    Optimization problems regularized by bounded variation seminorms are analyzed. The optimality system is obtained and finite-dimensional approximations of bounded variation function spaces as well as of the optimization problems are studied. It is demonstrated that the choice of the vector norm in the definition of the bounded variation seminorm is of special importance for approximating subspaces consisting of piecewise constant functions. Algorithms based on a primal-dual framework that exploit the structure of these nondifferentiable optimization problems are proposed. Numerical examples are given for denoising of blocky images with very high noise.
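
    As a concrete illustration of bounded-variation regularization for denoising, the classic Rudin-Osher-Fatemi functional 0.5·||u − f||² + λ·TV(u) can be minimized by gradient descent on a smoothed total variation seminorm. This is a generic sketch under assumed parameters, not the primal-dual algorithm proposed in the paper:

```python
import numpy as np

def tv_denoise(f, lam=0.1, step=0.1, iters=200, eps=1e-6):
    # Gradient descent on the smoothed ROF functional
    #   min_u  0.5 * ||u - f||^2 + lam * TV(u)
    u = f.copy()
    for _ in range(iters):
        ux = np.roll(u, -1, 1) - u        # forward differences
        uy = np.roll(u, -1, 0) - u
        mag = np.sqrt(ux ** 2 + uy ** 2 + eps)
        px, py = ux / mag, uy / mag
        # Divergence of the normalized gradient field
        div = (px - np.roll(px, 1, 1)) + (py - np.roll(py, 1, 0))
        u -= step * ((u - f) - lam * div)
    return u

# A blocky image with heavy noise, as in the paper's examples
rng = np.random.default_rng(3)
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 1.0
noisy = clean + 0.5 * rng.standard_normal(clean.shape)
den = tv_denoise(noisy)
print(np.abs(den - clean).mean() < np.abs(noisy - clean).mean())
```

    TV regularization favors piecewise constant solutions, which is why the choice of vector norm for piecewise constant approximating subspaces matters in the analysis above.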

  6. Abnormal Circulation Changes in the Winter Stratosphere, Detected Through Variations of D Region Ionospheric Absorption

    NASA Technical Reports Server (NTRS)

    Delamorena, B. A.

    1984-01-01

    A method to detect stratospheric warmings using ionospheric absorption records obtained with an absorption meter (method A3) is introduced. During the winter anomaly, the activity of the stratospheric circulation, the D-region ionospheric absorption, and other atmospheric parameters undergo an abnormal variation. The onsets of the abnormal variations in these parameters were found to be simultaneous, so the absorption records can be used to detect the initiation of a stratospheric warming. Results of this forecasting experiment at the El Arenosillo Range are presented.

  7. A partial differential equation-based general framework adapted to Rayleigh's, Rician's and Gaussian's distributed noise for restoration and enhancement of magnetic resonance image.

    PubMed

    Yadav, Ram Bharos; Srivastava, Subodh; Srivastava, Rajeev

    2016-01-01

    The proposed framework is obtained by casting the noise removal problem into a variational framework. The framework automatically identifies the type of noise present in the magnetic resonance image and filters it by choosing an appropriate filter. This filter comprises two terms: a data likelihood term and a prior function. The first term is obtained by minimizing the negative log-likelihood of the corresponding probability density function: Gaussian, Rayleigh, or Rician. Because the likelihood term alone is ill-posed, a prior function is needed. This paper examines three partial differential equation-based priors: a total variation prior, an anisotropic diffusion prior, and a complex diffusion (CD) prior. A regularization parameter balances the trade-off between the data fidelity term and the prior, and a finite difference scheme is used for discretization. A performance analysis and comparative study of the proposed method against other standard methods is presented for the BrainWeb dataset at varying noise levels, in terms of peak signal-to-noise ratio, mean square error, structural similarity index map, and correlation parameter. The simulation results show that the proposed framework with the CD prior performs better than the other priors considered.

  8. Complications of laryngeal framework surgery (phonosurgery).

    PubMed

    Tucker, H M; Wanamaker, J; Trott, M; Hicks, D

    1993-05-01

    The rising popularity of surgery involving the laryngeal framework (surgical medialization of immobile vocal folds, vocal fold tightening, pitch variation, etc.) has resulted in increasing case experience. Little has appeared in the literature regarding complications or long-term results of this type of surgery. Several years' experience in a major referral center with various types of laryngeal framework surgery has led to a small number of complications. These have included late extrusion of the prosthesis and delayed hemorrhage. A review of these complications and recommendations for modification of technique to minimize them in the future are discussed.

  9. An automated and integrated framework for dust storm detection based on ogc web processing services

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climate modelling, as it has a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advances in scientific computation, ongoing computing platform development, and the growth of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines geo-processing frameworks, scientific models and EO data to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) standard initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detection and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detection and tracking component combines three Earth-science models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented with the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distributions of dust particles as well as their transport paths, were represented in KML/XML and displayed in Google Earth. A severe dust storm that occurred over East Asia from 26 to 28 April 2012 is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and scientific workflow approach together. The experimental results show that this automated and integrated framework can give advance, near-real-time warning of dust storms to both environmental authorities and the public. The methods presented in this paper may also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.

  10. A framework for small infrared target real-time visual enhancement

    NASA Astrophysics Data System (ADS)

    Sun, Xiaoliang; Long, Gucan; Shang, Yang; Liu, Xiaolin

    2015-03-01

    This paper proposes a framework for real-time visual enhancement of small infrared targets. The framework consists of three parts: energy accumulation for target enhancement, noise suppression, and weighted fusion. A dynamic-programming-based track-before-detect algorithm is adopted in the energy accumulation step to detect the target accurately and enhance its intensity notably. In the noise suppression step, the target region is weighted by a Gaussian mask, matching the target's Gaussian shape. To fuse the processed target region smoothly with the unprocessed background, the intensity in the target region is used as the weight in the fusion. Experiments on real small-infrared-target images indicate that the proposed framework enhances the small infrared target markedly and improves the image's visual quality notably. The proposed framework outperforms traditional algorithms in enhancing small infrared targets, especially in images where the target is barely visible.
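
    The energy-accumulation step can be illustrated with a simplified dynamic-programming track-before-detect: each pixel accumulates the best path score over frames, allowing a small per-frame motion. The motion bound, target strength, and synthetic data below are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def tbd_energy(frames, vmax=1):
    # Dynamic-programming track-before-detect: at each frame, every
    # pixel inherits the best accumulated energy among neighbours
    # within vmax pixels, then adds its own intensity.
    E = frames[0].astype(float)
    for f in frames[1:]:
        best = E.copy()
        for dy in range(-vmax, vmax + 1):
            for dx in range(-vmax, vmax + 1):
                best = np.maximum(best, np.roll(E, (dy, dx), (0, 1)))
        E = best + f
    return E

rng = np.random.default_rng(4)
T, H, W = 10, 32, 32
frames = rng.normal(0, 1, (T, H, W))
# A dim target moving one pixel right per frame, weak in any one frame
for t in range(T):
    frames[t, 16, 5 + t] += 3.0
E = tbd_energy(frames)
y, x = np.unravel_index(E.argmax(), E.shape)
print((y, x))  # accumulated energy peaks near the target's final pixel
```

    Accumulating energy along feasible trajectories is what lets a target that is barely visible in any single frame stand out after several frames.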

  11. The influence of habitats on female mobility in Central and Western Africa inferred from human mitochondrial variation

    PubMed Central

    2013-01-01

    Background When studying the genetic structure of human populations, the role of cultural factors may be difficult to ascertain due to a lack of formal models. Linguistic diversity is a typical example of such a situation. Patrilocality, on the other hand, can be integrated into a biological framework, allowing the formulation of explicit working hypotheses. The present study is based on the assumption that patrilocal traditions make the hypervariable region I of the mtDNA a valuable tool for the exploration of migratory dynamics, offering the opportunity to explore the relationships between genetic and linguistic diversity. We studied 85 Niger-Congo-speaking patrilocal populations that cover regions from Senegal to Central African Republic. A total of 4175 individuals were included in the study. Results By combining a multivariate analysis aimed at investigating the population genetic structure, with a Bayesian approach used to test models and extent of migration, we were able to detect a stepping-stone migration model as the best descriptor of gene flow across the region, with the main discontinuities corresponding to forested areas. Conclusions Our analyses highlight an aspect of the influence of habitat variation on human genetic diversity that has yet to be understood. Rather than depending simply on geographic linear distances, patterns of female genetic variation vary substantially between savannah and rainforest environments. Our findings may be explained by the effects of recent gene flow constrained by environmental factors, which superimposes on a background shaped by pre-agricultural peopling. PMID:23360301

  12. Population Genetics of the Eastern Hellbender (Cryptobranchus alleganiensis alleganiensis) across Multiple Spatial Scales

    PubMed Central

    Unger, Shem D.; Rhodes, Olin E.; Sutton, Trent M.; Williams, Rod N.

    2013-01-01

    Conservation genetics is a powerful tool to assess the population structure of species and provides a framework for informing management of freshwater ecosystems. As lotic habitats become fragmented, the need to assess gene flow for species of conservation concern becomes a priority. The eastern hellbender (Cryptobranchus alleganiensis alleganiensis) is a large, fully aquatic paedomorphic salamander. Many populations are experiencing declines throughout their geographic range, yet the genetic ramifications of these declines are currently unknown. To this end, we examined levels of genetic variation and genetic structure at both range-wide and drainage (hierarchical) scales. We collected 1,203 individuals from 77 rivers throughout nine states from June 2007 to August 2011. Levels of genetic diversity were relatively high among all sampling locations. We detected significant genetic structure across populations (Fst values ranged from 0.001 between rivers within a single watershed to 0.218 between states). We identified two genetically differentiated groups at the range-wide scale: 1) the Ohio River drainage and 2) the Tennessee River drainage. An analysis of molecular variance (AMOVA) based on landscape-scale sampling of basins within the Tennessee River drainage revealed the majority of genetic variation (∼94–98%) occurs within rivers. Eastern hellbenders show a strong pattern of isolation by stream distance (IBSD) at the drainage level. Understanding levels of genetic variation and differentiation at multiple spatial and biological scales will enable natural resource managers to make more informed decisions and plan effective conservation strategies for cryptic, lotic species. PMID:24204565

  13. Fluorine Variations in the Globular Cluster NGC 6656 (M22): Implications for Internal Enrichment Timescales

    NASA Astrophysics Data System (ADS)

    D'Orazi, Valentina; Lucatello, Sara; Lugaro, Maria; Gratton, Raffaele G.; Angelou, George; Bragaglia, Angela; Carretta, Eugenio; Alves-Brito, Alan; Ivans, Inese I.; Masseron, Thomas; Mucciarelli, Alessio

    2013-01-01

    Observed chemical (anti)correlations in proton-capture elements among globular cluster stars are presently recognized as the signature of self-enrichment from now extinct, previous generations of stars. This defines the multiple population scenario. Since fluorine is also affected by proton captures, determining its abundance in globular clusters provides new and complementary clues regarding the nature of these previous generations and supplies strong observational constraints to the chemical enrichment timescales. In this paper, we present our results on near-infrared CRIRES spectroscopic observations of six cool giant stars in NGC 6656 (M22): the main objective is to derive the F content and its internal variation in this peculiar cluster, which exhibits significant changes in both light- and heavy-element abundances. Across our sample, we detected F variations beyond the measurement uncertainties and found that the F abundances are positively correlated with O and anticorrelated with Na, as expected according to the multiple population framework. Furthermore, our observations reveal an increase in the F content between the two different sub-groups, s-process rich and s-process poor, hosted within M22. The comparison with theoretical models suggests that asymptotic giant stars with masses between 4 and 5 M ⊙ are responsible for the observed chemical pattern, confirming evidence from previous works: the difference in age between the two sub-components in M22 must be not larger than a few hundred Myr. Based on observations taken with ESO telescopes under program 087.0319(A).

  14. High frequency variations of the main magnetic field: convergence of observations and theory (Petrus Peregrinus Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Jault, Dominique

    2013-04-01

    Understanding the main magnetic field variations has been hindered by the discrepancy between the periods (from months to years) of the simplest linear wave phenomena and the relatively long time intervals (10 to 100 years) over which magnetic field changes can be confidently monitored. A theoretical description of short-period waves within the Earth's fluid core is at hand. Quasi-geostrophic inertial waves (akin to Rossby waves in the atmosphere) are slightly modified in the presence of magnetic fields, and torsional oscillations consist of differential motion between coaxial rigid cylindrical annuli. Torsional oscillations are sensitive to the whole magnetic field that they shear in the course of their propagation. From their modelling, we have thus gained an estimate for the magnetic field strength in the core interior. There is now ongoing work to extend the theoretical framework to longer times. Furthermore, data collected from the Swarm constellation of three satellites to be launched this year by ESA will permit better separation of the internal and external magnetic signals. We may thus hope to detect quasi-geostrophic inertial waves. As the spectral ranges of theoretical models and observations begin to overlap, we can now go beyond an understanding of magnetic field variations as the juxtaposition of partial models, arranged as a set of nested Matryoshka dolls. This talk will give illustrations of this statement, among which the question of induction in the lower mantle.

  15. Highly selective luminescent sensing of picric acid based on a water-stable europium metal-organic framework

    NASA Astrophysics Data System (ADS)

    Xia, Tifeng; Zhu, Fengliang; Cui, Yuanjing; Yang, Yu; Wang, Zhiyu; Qian, Guodong

    2017-01-01

    A water-stable metal-organic framework (MOF) EuNDC has been synthesized for selective detection of the well-known contaminant and toxicant picric acid (PA) in aqueous solution. Due to the photo-induced electron transfer and self-absorption mechanism, EuNDC displayed rapid, selective and sensitive detection of PA with a detection limit of 37.6 ppb. Recyclability experiments revealed that EuNDC retains its initial luminescent intensity and same quenching efficiency in each cycle, suggesting high photostability and reusability for long-term sensing applications. The excellent detection performance of EuNDC makes it a promising PA sensing material for practical applications.

  16. An analytical framework for whole-genome sequence association studies and its implications for autism spectrum disorder.

    PubMed

    Werling, Donna M; Brand, Harrison; An, Joon-Yong; Stone, Matthew R; Zhu, Lingxue; Glessner, Joseph T; Collins, Ryan L; Dong, Shan; Layer, Ryan M; Markenscoff-Papadimitriou, Eirene; Farrell, Andrew; Schwartz, Grace B; Wang, Harold Z; Currall, Benjamin B; Zhao, Xuefang; Dea, Jeanselle; Duhn, Clif; Erdman, Carolyn A; Gilson, Michael C; Yadav, Rachita; Handsaker, Robert E; Kashin, Seva; Klei, Lambertus; Mandell, Jeffrey D; Nowakowski, Tomasz J; Liu, Yuwen; Pochareddy, Sirisha; Smith, Louw; Walker, Michael F; Waterman, Matthew J; He, Xin; Kriegstein, Arnold R; Rubenstein, John L; Sestan, Nenad; McCarroll, Steven A; Neale, Benjamin M; Coon, Hilary; Willsey, A Jeremy; Buxbaum, Joseph D; Daly, Mark J; State, Matthew W; Quinlan, Aaron R; Marth, Gabor T; Roeder, Kathryn; Devlin, Bernie; Talkowski, Michael E; Sanders, Stephan J

    2018-05-01

    Genomic association studies of common or rare protein-coding variation have established robust statistical approaches to account for multiple testing. Here we present a comparable framework to evaluate rare and de novo noncoding single-nucleotide variants, insertions/deletions, and all classes of structural variation from whole-genome sequencing (WGS). Integrating genomic annotations at the level of nucleotides, genes, and regulatory regions, we define 51,801 annotation categories. Analyses of 519 autism spectrum disorder families did not identify association with any categories after correction for 4,123 effective tests. Without appropriate correction, biologically plausible associations are observed in both cases and controls. Despite excluding previously identified gene-disrupting mutations, coding regions still exhibited the strongest associations. Thus, in autism, the contribution of de novo noncoding variation is probably modest in comparison to that of de novo coding variants. Robust results from future WGS studies will require large cohorts and comprehensive analytical strategies that consider the substantial multiple-testing burden.

  17. An automated image analysis framework for segmentation and division plane detection of single live Staphylococcus aureus cells which can operate at millisecond sampling time scales using bespoke Slimfield microscopy

    NASA Astrophysics Data System (ADS)

    Wollman, Adam J. M.; Miller, Helen; Foster, Simon; Leake, Mark C.

    2016-10-01

    Staphylococcus aureus is an important pathogen, with antimicrobial resistance arising in strains such as methicillin-resistant S. aureus (MRSA). Here we report an image analysis framework for automated detection and image segmentation of cells in S. aureus cell clusters, and explicit identification of their cell division planes. We use a new combination of several existing analytical tools of image analysis to detect cellular and subcellular morphological features relevant to cell division from millisecond time scale sampled images of live pathogens at single-molecule detection precision. We demonstrate this approach using a fluorescent reporter GFP fused to the protein EzrA that localises to a mid-cell plane during division and is involved in regulation of cell size and division. This image analysis framework presents a valuable platform from which to study candidate new antimicrobials which target the cell division machinery, but may also have more general application in detecting morphologically complex structures of fluorescently labelled proteins present in clusters of other types of cells.

  18. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.

    PubMed

    Elgendi, Mohamed

    2016-11-02

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. We introduce a new method, two event-related moving averages ("TERMA"), which uses event-related moving averages to detect events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve highly accurate detection of biomedical events. Our results recommend that the window sizes of the two moving averages (W1 and W2) satisfy the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
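    The two-moving-average idea can be sketched in a few lines. This is a minimal illustration of a TERMA-style detector, not the authors' implementation; the window sizes, the offset parameter beta, and the synthetic pulse signal are assumptions:

```python
import numpy as np

def moving_average(x, w):
    # centered moving average via convolution
    return np.convolve(x, np.ones(w) / w, mode="same")

def terma_detect(signal, w1=11, w2=55, beta=0.1):
    """TERMA-style detector sketch: blocks where the short event-related
    moving average rises above the longer cycle-related one (plus a small
    offset) are candidate events; each block yields one peak."""
    assert 2 * w1 <= w2 <= 8 * w1        # recommended window inequality
    ma_event = moving_average(signal, w1)
    ma_cycle = moving_average(signal, w2)
    blocks = ma_event > ma_cycle + beta * np.mean(signal)
    peaks, start = [], None
    for i, inside in enumerate(blocks):
        if inside and start is None:
            start = i                    # a block of interest opens
        elif not inside and start is not None:
            peaks.append(start + int(np.argmax(signal[start:i])))
            start = None                 # block closes: keep its maximum
    return peaks

# three synthetic pulses at known positions
t = np.arange(600)
sig = sum(np.exp(-0.5 * ((t - c) / 3.0) ** 2) for c in (100, 300, 500))
peaks = terma_detect(sig, w1=11, w2=55, beta=0.1)
```

    The offset beta * mean(signal) plays the role of the event threshold; in practice it, like W1 and W2, would be tuned to the physiological event of interest.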

  19. A multi-GPU real-time dose simulation software framework for lung radiotherapy.

    PubMed

    Santhanam, A P; Min, Y; Neelakkantan, H; Papp, N; Meeks, S L; Kupelian, P A

    2012-09-01

    Medical simulation frameworks facilitate both the preoperative and postoperative analysis of the patient's pathophysical condition. Of particular importance is the simulation of radiation dose delivery for real-time radiotherapy monitoring and retrospective analyses of the patient's treatment. In this paper, a software framework tailored for the development of simulation-based real-time radiation dose monitoring medical applications is discussed. A multi-GPU-based computational framework coupled with inter-process communication methods is introduced for simulating the radiation dose delivery on a deformable 3D volumetric lung model and its real-time visualization. The model deformation and the corresponding dose calculation are allocated among the GPUs in a task-specific manner and performed in a pipelined fashion. Radiation dose calculations are computed on two different GPU hardware architectures. The integration of this computational framework with a front-end software layer and back-end patient database repository is also discussed. Real-time simulation of the delivered dose is achieved once every 120 ms using the proposed framework. With a linear increase in the number of GPU cores, the computational time of the simulation decreased linearly. The inter-process communication time also improved with an increase in the hardware memory. Variations in the delivered dose and computational speedup for variations in the data dimensions are investigated using D70, D90, and gEUD as metrics for a set of 14 patients. Computational speed-up increased with an increase in the beam dimensions when compared with CPU-based commercial software, while the error in the dose calculation was <1%. Our analyses show that the framework applied to deformable lung model-based radiotherapy is an effective tool for performing both real-time and retrospective analyses.

  20. Highly sensitive detection of dipicolinic acid with a water-dispersible terbium-metal organic framework.

    PubMed

    Bhardwaj, Neha; Bhardwaj, Sanjeev; Mehta, Jyotsana; Kim, Ki-Hyun; Deep, Akash

    2016-12-15

    The sensitive detection of dipicolinic acid (DPA) is strongly associated with the sensing of bacterial organisms in food and many types of environmental samples. To date, the demand for a sensitive detection method for bacterial toxicity has increased remarkably. Herein, we investigated the DPA detection potential of a water-dispersible terbium-metal organic framework (Tb-MOF) based on the fluorescence quenching mechanism. The Tb-MOF showed highly sensitive detection of DPA with a limit of detection of 0.04 nM (linear range of detection: 1 nM to 5 µM) and also offered enhanced selectivity over other commonly associated organic molecules. The present study provides a basis for the application of the Tb-MOF for direct, convenient, highly sensitive, and specific detection of DPA in real samples. Copyright © 2016 Elsevier B.V. All rights reserved.
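    Limits of detection like the one quoted above are commonly estimated from a luminescence calibration curve as 3σ(blank)/slope. The sketch below illustrates that calculation only; the calibration points and blank standard deviation are invented for illustration and are not the paper's data:

```python
import numpy as np

# invented calibration data: quenching response vs analyte concentration (nM)
conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0])
resp = np.array([0.002, 0.051, 0.103, 0.198, 0.401, 0.799])
blank_sd = 0.005                      # std. dev. of repeated blank readings

slope, intercept = np.polyfit(conc, resp, 1)   # linear calibration fit
lod = 3 * blank_sd / slope                     # 3-sigma detection limit (nM)
```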

  1. Dynamic Shade and Irradiance Simulation of Aquatic ...

    EPA Pesticide Factsheets

    Penumbra is a landscape shade and irradiance simulation model that simulates how solar energy spatially and temporally interacts within dynamic ecosystems such as riparian zones, forests, and other terrain that cast topological shadows. Direct and indirect solar energy accumulates across landscapes and is the main energy driver for increasing aquatic and landscape temperatures at both local and holistic scales. Landscape disturbances such as landuse change, clear cutting, and fire can cause significant variations in the resulting irradiance reaching particular locations. Penumbra can simulate solar angles and irradiance at definable temporal grains as low as one minute while simulating landscape shadowing up to an entire year. Landscapes can be represented at sub-meter resolutions with appropriate spatial data inputs, such as field data or elevation and surface object heights derived from light detection and ranging (LiDAR) data. This work describes Penumbra’s framework and methodology, external model integration capability, and appropriate model application for a variety of watershed restoration project types. First, an overview of Penumbra’s framework reveals what this model adds to the existing ecological modeling domain. Second, Penumbra’s stand-alone and integration modes are explained and demonstrated. Stand-alone modeling results are showcased within the 3-D visualization tool VISTAS (VISualizing Terrestrial-Aquatic Systems), which fluently summariz

  2. A probabilistic approach to joint cell tracking and segmentation in high-throughput microscopy videos.

    PubMed

    Arbelle, Assaf; Reyes, Jose; Chen, Jia-Yun; Lahav, Galit; Riklin Raviv, Tammy

    2018-04-22

    We present a novel computational framework for the analysis of high-throughput microscopy videos of living cells. The proposed framework is generally useful and can be applied to different datasets acquired in a variety of laboratory settings. This is accomplished by tying together two fundamental aspects of cell lineage construction, namely cell segmentation and tracking, via a Bayesian inference of dynamic models. In contrast to most existing approaches, no assumption about cell shape is made, which keeps the framework general. Spatial, temporal, and cross-sectional variation of the analysed data are accommodated by two key contributions. First, time series analysis is exploited to estimate the temporal cell shape uncertainty in addition to the cell trajectory. Second, a fast marching (FM) algorithm is used to integrate the inferred cell properties with the observed image measurements in order to obtain the image likelihood for cell segmentation and association. The proposed approach has been tested on eight different time-lapse microscopy data sets, some of which are high-throughput, demonstrating promising results for the detection, segmentation and association of planar cells. Our results surpass the state of the art for the Fluo-C2DL-MSC data set of the Cell Tracking Challenge (Maška et al., 2014). Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Generalized estimators of avian abundance from count survey data

    USGS Publications Warehouse

    Royle, J. Andrew

    2004-01-01

    I consider modeling avian abundance from spatially referenced bird count data collected according to common protocols such as capture–recapture, multiple observer, removal sampling and simple point counts. Small sample sizes and large numbers of parameters have motivated many analyses that disregard the spatial indexing of the data, and thus do not provide an adequate treatment of spatial structure. I describe a general framework for modeling spatially replicated data that regards local abundance as a random process, motivated by the view that the set of spatially referenced local populations (at the sample locations) constitutes a metapopulation. Under this view, attention can be focused on developing a model for the variation in local abundance independent of the sampling protocol being considered. The metapopulation model structure, when combined with the data-generating model, defines a simple hierarchical model that can be analyzed using conventional methods. The proposed modeling framework is completely general in the sense that broad classes of metapopulation models may be considered, site-level covariates on detection and abundance may be included, and estimates of abundance and related quantities may be obtained for sample locations, groups of locations, and unsampled locations. Two brief examples are given, the first involving simple point counts, and the second based on temporary removal counts. Extension of these models to open systems is briefly discussed.
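    A hierarchical model of this kind, with a Poisson metapopulation process for local abundance and a binomial detection process for the counts, can be fitted by maximizing the marginal likelihood. The sketch below is a minimal illustration with simulated point-count data, not the paper's analysis; the parameter values, the number of sites and visits, and the truncation bound K are assumptions:

```python
import numpy as np
from scipy import stats, optimize

def nmix_negloglik(params, Y, K=80):
    """Negative log-likelihood of the binomial-Poisson N-mixture model:
    N_i ~ Poisson(lam) at each site, y_it | N_i ~ Binomial(N_i, p),
    with the latent abundance N_i marginalized over 0..K."""
    lam = np.exp(params[0])                 # log-link keeps lam > 0
    p = 1.0 / (1.0 + np.exp(-params[1]))    # logit-link keeps 0 < p < 1
    N = np.arange(K + 1)
    prior = stats.poisson.pmf(N, lam)       # metapopulation (abundance) process
    # detection likelihood for each site, product over repeat visits
    det = np.prod(stats.binom.pmf(Y[:, :, None], N[None, None, :], p), axis=1)
    return -np.log(det @ prior).sum()

# simulate 150 sites x 3 visits with lam = 5, p = 0.5, then refit
rng = np.random.default_rng(1)
N_true = rng.poisson(5.0, 150)
Y = rng.binomial(N_true[:, None], 0.5, (150, 3))
fit = optimize.minimize(nmix_negloglik, x0=[np.log(3.0), 0.0],
                        args=(Y,), method="Nelder-Mead")
lam_hat = np.exp(fit.x[0])
p_hat = 1.0 / (1.0 + np.exp(-fit.x[1]))
```

    The key point of the framework survives in miniature here: the abundance model (the Poisson prior) is specified independently of the sampling protocol (the binomial observation model), so either component can be swapped out.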

  4. Diversity Arrays Technology (DArT) for whole-genome profiling of barley

    PubMed Central

    Wenzl, Peter; Carling, Jason; Kudrna, David; Jaccoud, Damian; Huttner, Eric; Kleinhofs, Andris; Kilian, Andrzej

    2004-01-01

    Diversity Arrays Technology (DArT) can detect and type DNA variation at several hundred genomic loci in parallel without relying on sequence information. Here we show that it can be effectively applied to genetic mapping and diversity analyses of barley, a species with a 5,000-Mbp genome. We tested several complexity reduction methods and selected two that generated the most polymorphic genomic representations. Arrays containing individual fragments from these representations generated DArT fingerprints with a genotype call rate of 98.0% and a scoring reproducibility of at least 99.8%. The fingerprints grouped barley lines according to known genetic relationships. To validate the Mendelian behavior of DArT markers, we constructed a genetic map for a cross between cultivars Steptoe and Morex. Nearly all polymorphic array features could be incorporated into one of seven linkage groups (98.8%). The resulting map comprised ≈385 unique DArT markers and spanned 1,137 centimorgans. A comparison with the restriction fragment length polymorphism-based framework map indicated that the quality of the DArT map was equivalent, if not superior, to that of the framework map. These results highlight the potential of DArT as a generic technique for genome profiling in the context of molecular breeding and genomics. PMID:15192146

  5. A robust multi-kernel change detection framework for detecting leaf beetle defoliation using Landsat 7 ETM+ data

    NASA Astrophysics Data System (ADS)

    Anees, Asim; Aryal, Jagannath; O'Reilly, Małgorzata M.; Gale, Timothy J.; Wardlaw, Tim

    2016-12-01

    A robust non-parametric framework, based on multiple Radial Basis Function (RBF) kernels, is proposed in this study for detecting land/forest cover changes using Landsat 7 ETM+ images. One widely used approach is to compute change vectors (a difference image) and use a supervised classifier to differentiate between change and no-change. Bayesian classifiers, e.g. the Maximum Likelihood Classifier (MLC) and Naive Bayes (NB), are widely used probabilistic classifiers that assume parametric models, e.g. Gaussian functions, for the class conditional distributions. However, their performance can be limited if the data set deviates from the assumed model. The proposed framework exploits the useful properties of the Least Squares Probabilistic Classifier (LSPC) formulation, i.e. its non-parametric and probabilistic nature, to model the class posterior probabilities of the difference image using a linear combination of a large number of Gaussian kernels. To this end, a simple technique based on 10-fold cross-validation is also proposed for tuning model parameters automatically instead of selecting a (possibly) suboptimal combination from pre-specified lists of values. The proposed framework has been tested and compared with a Support Vector Machine (SVM) and NB for detection of defoliation caused by leaf beetles (Paropsisterna spp.) in Eucalyptus nitens and Eucalyptus globulus plantations of two test areas in Tasmania, Australia, using raw bands and band combination indices of Landsat 7 ETM+. Due to its multi-kernel non-parametric formulation and probabilistic nature, the LSPC outperforms the parametric NB with Gaussian assumption in the change detection framework, with Overall Accuracy (OA) ranging from 93.6% (κ = 0.87) to 97.4% (κ = 0.94) against 85.3% (κ = 0.69) to 93.4% (κ = 0.85), and is more robust to changing data distributions. Its performance was comparable to SVM, with the added advantages of being probabilistic and handling multi-class problems naturally in its original formulation.
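    The core of the LSPC formulation, class posteriors expressed as a regularized least-squares fit over Gaussian kernels, can be sketched compactly. This is a generic illustration of the technique on toy data; the kernel width, regularizer, and blob data are assumptions, not values from the study:

```python
import numpy as np

def rbf_kernel(X, C, sigma):
    # Gaussian kernel matrix between rows of X and kernel centres C
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

class LSPC:
    """Minimal least-squares probabilistic classifier sketch: class
    posteriors modelled as a linear combination of Gaussian kernels,
    fitted to one-hot labels by regularized least squares."""
    def __init__(self, sigma=1.0, lam=0.1):
        self.sigma, self.lam = sigma, lam
    def fit(self, X, y):
        self.C = X                                 # centres = training points
        self.classes = np.unique(y)
        K = rbf_kernel(X, self.C, self.sigma)
        Y = (y[:, None] == self.classes[None, :]).astype(float)
        A = K.T @ K + self.lam * np.eye(K.shape[1])
        self.alpha = np.linalg.solve(A, K.T @ Y)   # one weight vector per class
        return self
    def predict_proba(self, X):
        p = np.maximum(rbf_kernel(X, self.C, self.sigma) @ self.alpha, 0)
        return p / p.sum(1, keepdims=True)         # clip negatives, renormalize
    def predict(self, X):
        return self.classes[self.predict_proba(X).argmax(1)]

# two well-separated Gaussian blobs standing in for change / no-change pixels
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
acc = (LSPC(sigma=1.0, lam=0.1).fit(X, y).predict(X) == y).mean()
```

    In the study, sigma and the regularizer are tuned automatically by 10-fold cross-validation rather than fixed as here.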

  6. VizieR Online Data Catalog: RR Lyrae in SDSS Stripe 82 (Suveges+, 2012)

    NASA Astrophysics Data System (ADS)

    Suveges, M.; Sesar, B.; Varadi, M.; Mowlavi, N.; Becker, A. C.; Ivezic, Z.; Beck, M.; Nienartowicz, K.; Rimoldini, L.; Dubath, P.; Bartholdi, P.; Eyer, L.

    2013-05-01

    We propose a robust principal component analysis framework for the exploitation of multiband photometric measurements in large surveys. Period search results are improved using the time-series of the first principal component due to its optimized signal-to-noise ratio. The presence of correlated excess variations in the multivariate time-series enables the detection of weaker variability. Furthermore, the direction of the largest variance differs for certain types of variable stars. This can be used as an efficient attribute for classification. The application of the method to a subsample of Sloan Digital Sky Survey Stripe 82 data yielded 132 high-amplitude delta Scuti variables. We also found 129 new RR Lyrae variables, complementary to the catalogue of Sesar et al., extending the halo area mapped by Stripe 82 RR Lyrae stars towards the Galactic bulge. The sample also comprises 25 multiperiodic or Blazhko RR Lyrae stars. (8 data files).
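    The central idea, projecting correlated multiband measurements onto their first principal component to boost the signal-to-noise ratio before period search, can be illustrated briefly. The band amplitudes, noise level, and sinusoidal test signal below are assumptions for the sketch, not survey values:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 400)
truth = np.sin(2 * np.pi * t / 1.7)            # shared variability signal
amps = np.array([1.0, 0.8, 0.6, 0.5, 0.4])     # per-band amplitudes
bands = amps[:, None] * truth + rng.normal(0, 0.7, (5, t.size))

# first principal component of the (epochs x bands) matrix via SVD
X = bands.T - bands.T.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = X @ Vt[0]

corr = lambda a, b: abs(np.corrcoef(a, b)[0, 1])
pc1_corr = corr(pc1, truth)
band_corrs = [corr(b, truth) for b in bands]
```

    Because the excess variations are correlated across bands while the noise is not, the first principal component tracks the underlying signal more closely than any single band, which is why period search on it performs better.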

  7. diffloop: a computational framework for identifying and analyzing differential DNA loops from sequencing data.

    PubMed

    Lareau, Caleb A; Aryee, Martin J; Berger, Bonnie

    2018-02-15

    The 3D architecture of DNA within the nucleus is a key determinant of interactions between genes, regulatory elements, and transcriptional machinery. As a result, differences in DNA looping structure are associated with variation in gene expression and cell state. To systematically assess changes in DNA looping architecture between samples, we introduce diffloop, an R/Bioconductor package that provides a suite of functions for the quality control, statistical testing, annotation, and visualization of DNA loops. We demonstrate this functionality by detecting differences between ENCODE ChIA-PET samples and relate looping to variability in epigenetic state. Diffloop is implemented as an R/Bioconductor package available at https://bioconductor.org/packages/release/bioc/html/diffloop.html. aryee.martin@mgh.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  8. Treatment of Infants Identified by Newborn Screening for Severe Combined Immunodeficiency

    PubMed Central

    Dorsey, Morna J.; Dvorak, Christopher C.; Cowan, Morton J.; Puck, Jennifer M.

    2017-01-01

    Background Severe combined immunodeficiency (SCID) is characterized by severely impaired T cell development and is fatal without treatment. Newborn screening (NBS) for SCID permits identification of affected infants before development of opportunistic infections and other complications. Substantial variation exists between treatment centers with regard to pre-transplant care and transplant protocols for NBS identified SCID infants, as well as for infants with other T lymphopenic disorders detected by NBS. Methods We developed approaches to management based on the study of infants identified by SCID NBS who received care at UCSF. Results From August 2010 through October 2016, 32 NBS SCID and leaky SCID cases from California and other states were treated and 42 NBS identified non-SCID T cell lymphopenia (TCL) cases were followed. Conclusions Our center’s approach supports successful outcomes; systematic review of our practice provides a framework for diagnosis and management, recognizing that more data will continue to shape best practices. PMID:28270365

  9. Coupling Functions Enable Secure Communications

    NASA Astrophysics Data System (ADS)

    Stankovski, Tomislav; McClintock, Peter V. E.; Stefanovska, Aneta

    2014-01-01

    Secure encryption is an essential feature of modern communications, but rapid progress in illicit decryption brings a continuing need for new schemes that are harder and harder to break. Inspired by the time-varying nature of the cardiorespiratory interaction, here we introduce a new class of secure communications that is highly resistant to conventional attacks. Unlike all earlier encryption procedures, this cipher makes use of the coupling functions between interacting dynamical systems. It results in an unbounded number of encryption key possibilities, allows the transmission or reception of more than one signal simultaneously, and is robust against external noise. Thus, the information signals are encrypted as the time variations of linearly independent coupling functions. Using predetermined forms of coupling function, we apply Bayesian inference on the receiver side to detect and separate the information signals while simultaneously eliminating the effect of external noise. The scheme is highly modular and is readily extendable to support different communications applications within the same general framework.

  10. Cost-effective sampling of (137)Cs-derived net soil redistribution: part 2 - estimating the spatial mean change over time.

    PubMed

    Chappell, A; Li, Y; Yu, H Q; Zhang, Y Z; Li, X Y

    2015-06-01

    The caesium-137 ((137)Cs) technique for estimating net, time-integrated soil redistribution by the processes of wind, water and tillage is increasingly being used with repeated sampling to form a baseline to evaluate change over small (years to decades) timeframes. This interest stems from knowledge that since the 1950s soil redistribution has responded dynamically to different phases of land use change and management. Currently, there is no standard approach to detect change in (137)Cs-derived net soil redistribution and thereby identify the driving forces responsible for change. We outline recent advances in space-time sampling in the soil monitoring literature which provide a rigorous statistical and pragmatic approach to estimating the change over time in the spatial mean of environmental properties. We apply the space-time sampling framework, estimate the minimum detectable change of net soil redistribution and consider the information content and cost implications of different sampling designs for a study area in the Chinese Loess Plateau. Three phases (1954-1996, 1954-2012 and 1996-2012) of net soil erosion were detectable and attributed to well-documented historical change in land use and management practices in the study area and across the region. We recommend that the design for space-time sampling is considered carefully alongside cost-effective use of the spatial mean to detect and correctly attribute cause of change over time particularly across spatial scales of variation. Copyright © 2015 Elsevier Ltd. All rights reserved.
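    The notion of a minimum detectable change in the spatial mean between two sampling campaigns follows from the standard errors of the two means, MDC = z(1−α/2) · √(s₁²/n₁ + s₂²/n₂). The sketch below illustrates that two-sample formula generically; the sample sizes and standard deviations are invented, not taken from the study:

```python
import numpy as np
from scipy import stats

def minimum_detectable_change(s1, n1, s2, n2, alpha=0.05):
    """Smallest difference between two campaign means that a two-sided
    z-test at level alpha would flag, given sample SDs and sizes."""
    se = np.sqrt(s1**2 / n1 + s2**2 / n2)      # SE of the difference in means
    return stats.norm.ppf(1 - alpha / 2) * se

# e.g. 30 cores per campaign with SDs of 4 and 5 (arbitrary units)
mdc = minimum_detectable_change(s1=4.0, n1=30, s2=5.0, n2=30)
```

    Designs that reduce the standard errors (larger n, or stratification across the spatial scales of variation) shrink the MDC and make subtler phases of soil redistribution detectable.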

  11. Detection and measurement of the intracellular calcium variation in follicular cells.

    PubMed

    Herrera-Navarro, Ana M; Terol-Villalobos, Iván R; Jiménez-Hernández, Hugo; Peregrina-Barreto, Hayde; Gonzalez-Barboza, José-Joel

    2014-01-01

    This work presents a new method for measuring the variation of intracellular calcium in follicular cells. The proposal consists of two stages: (i) the detection of the cells' nuclei and (ii) the analysis of the fluorescence variations. The first stage is performed via a modified watershed transformation in which the labeling process is controlled. The detection process uses the contours of the cells as descriptors, enhanced with a morphological filter that homogenizes the luminance variation of the image. In the second stage, the fluorescence variations are modeled as an exponentially decreasing function, since these variations are highly correlated with changes in intracellular free Ca(2+). Additionally, a new morphological filter, called the medium reconstruction process, is introduced to enhance the data for the modeling process. This filter exploits the undermodeling and overmodeling properties of reconstruction operators, such that it preserves the structure of the original signal. Finally, an experimental process shows evidence of the capabilities of the proposal.
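    Fitting a decreasing exponential to a fluorescence trace is a standard curve-fit. The sketch below uses synthetic data; the model F(t) = A·exp(−kt) + C and all parameter values are illustrative assumptions, not the paper's:

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, A, k, C):
    # decreasing-exponential model of the fluorescence variation
    return A * np.exp(-k * t) + C

# synthetic noisy trace with known ground-truth parameters
rng = np.random.default_rng(3)
t = np.linspace(0.0, 20.0, 200)
F = decay(t, A=5.0, k=0.3, C=1.0) + rng.normal(0, 0.05, t.size)

popt, _ = curve_fit(decay, t, F, p0=(1.0, 0.1, 0.0))
A_hat, k_hat, C_hat = popt
```

    The fitted decay rate k is the quantity that would then be correlated with the change in intracellular free Ca(2+).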

  12. Detection and Measurement of the Intracellular Calcium Variation in Follicular Cells

    PubMed Central

    Herrera-Navarro, Ana M.; Terol-Villalobos, Iván R.; Jiménez-Hernández, Hugo; Peregrina-Barreto, Hayde; Gonzalez-Barboza, José-Joel

    2014-01-01

    This work presents a new method for measuring the variation of intracellular calcium in follicular cells. The proposal consists of two stages: (i) the detection of the cells' nuclei and (ii) the analysis of the fluorescence variations. The first stage is performed via a modified watershed transformation in which the labeling process is controlled. The detection process uses the contours of the cells as descriptors, enhanced with a morphological filter that homogenizes the luminance variation of the image. In the second stage, the fluorescence variations are modeled as an exponentially decreasing function, since these variations are highly correlated with changes in intracellular free Ca2+. Additionally, a new morphological filter, called the medium reconstruction process, is introduced to enhance the data for the modeling process. This filter exploits the undermodeling and overmodeling properties of reconstruction operators, such that it preserves the structure of the original signal. Finally, an experimental process shows evidence of the capabilities of the proposal. PMID:25342958

  13. Variance Function Regression in Hierarchical Age-Period-Cohort Models: Applications to the Study of Self-Reported Health

    PubMed Central

    Zheng, Hui; Yang, Yang; Land, Kenneth C.

    2012-01-01

    Two long-standing research problems of interest to sociologists are sources of variations in social inequalities and differential contributions of the temporal dimensions of age, time period, and cohort to variations in social phenomena. Recently, scholars have introduced a model called Variance Function Regression for the study of the former problem, and a model called Hierarchical Age-Period-Cohort regression has been developed for the study of the latter. This article presents an integration of these two models as a means to study the evolution of social inequalities along distinct temporal dimensions. We apply the integrated model to survey data on subjective health status. We find substantial age, period, and cohort effects, as well as gender differences, not only for the conditional mean of self-rated health (i.e., between-group disparities), but also for the variance in this mean (i.e., within-group disparities)—and it is detection of age, period, and cohort variations in the latter disparities that application of the integrated model permits. Net of effects of age and individual-level covariates, in recent decades, cohort differences in conditional means of self-rated health have been less important than period differences that cut across all cohorts. By contrast, cohort differences of variances in these conditional means have dominated period differences. In particular, post-baby boom birth cohorts show significant and increasing levels of within-group disparities. These findings illustrate how the integrated model provides a powerful framework through which to identify and study the evolution of variations in social inequalities across age, period, and cohort temporal dimensions. Accordingly, this model should be broadly applicable to the study of social inequality in many different substantive contexts. PMID:22904570

  14. The Principle of the Fermionic Projector: An Approach for Quantum Gravity?

    NASA Astrophysics Data System (ADS)

    Finster, Felix

    In this short article we introduce the mathematical framework of the principle of the fermionic projector and set up a variational principle in discrete space-time. The underlying physical principles are discussed. We outline the connection to the continuum theory and state recent results. In the last two sections, we speculate on how it might be possible to describe quantum gravity within this framework.

  15. "Hey! Today I Will Tell You about the Water Cycle!": Variations of Language and Organizational Features in Third-Grade Science Explanation Writing

    ERIC Educational Resources Information Center

    Avalos, Mary A.; Secada, Walter G.; Zisselsberger, Margarita Gómez; Gort, Mileidis

    2017-01-01

    This study investigated third graders' use and variation of linguistic resources when writing a science explanation. Using systemic functional linguistics as a framework, we purposefully selected and analyzed writing samples of students with high and low scores to explore how the students' use of language features (i.e., lexicogrammatical…

  16. Robust Optical Recognition of Cursive Pashto Script Using Scale, Rotation and Location Invariant Approach

    PubMed Central

    Ahmad, Riaz; Naz, Saeeda; Afzal, Muhammad Zeshan; Amin, Sayed Hassan; Breuel, Thomas

    2015-01-01

The presence of a large number of unique shapes, called ligatures, in cursive languages, along with variations due to scaling, orientation and location, poses one of the most challenging pattern recognition problems. Recognition of the large number of ligatures is often a complicated task in oriental languages such as Pashto, Urdu, Persian and Arabic. Research on cursive script recognition often ignores the fact that scaling, orientation, location and font variations are common in printed cursive text; therefore, these variations are not included in image databases or experimental evaluations. This research uncovers the challenges faced by Arabic cursive script recognition in a holistic framework, considering Pashto as a test case because the Pashto language has a larger alphabet set than Arabic, Persian and Urdu. A database containing 8000 images of 1000 unique ligatures with scaling, orientation and location variations is introduced. In this article, a feature space based on the scale invariant feature transform (SIFT), along with a segmentation framework, is proposed to overcome the above-mentioned challenges. The experimental results show a significantly improved performance of the proposed scheme over traditional feature extraction techniques such as principal component analysis (PCA). PMID:26368566

  17. Personality assessment and model comparison with behavioral data: A statistical framework and empirical demonstration with bonobos (Pan paniscus).

    PubMed

    Martin, Jordan S; Suarez, Scott A

    2017-08-01

Interest in quantifying consistent among-individual variation in primate behavior, also known as personality, has grown rapidly in recent decades. Although behavioral coding is the most frequently utilized method for assessing primate personality, limitations in current statistical practice prevent researchers from utilizing the full potential of their coding datasets. These limitations include the use of extensive data aggregation, not modeling biologically relevant sources of individual variance during repeatability estimation, not partitioning between-individual (co)variance prior to modeling personality structure, the misuse of principal component analysis, and an over-reliance upon exploratory statistical techniques to compare personality models across populations, species, and data collection methods. In this paper, we propose a statistical framework for primate personality research designed to address these limitations. Our framework synthesizes recently developed mixed-effects modeling approaches for quantifying behavioral variation with an information-theoretic model selection paradigm for confirmatory personality research. After detailing a multi-step analytic procedure for personality assessment and model comparison, we employ this framework to evaluate seven models of personality structure in zoo-housed bonobos (Pan paniscus). We find that differences between sexes, ages, zoos, time of observation, and social group composition contributed to significant behavioral variance. Independently of these factors, however, personality nonetheless accounted for a moderate to high proportion of variance in average behavior across observational periods. A personality structure derived from past rating research receives the strongest support relative to our model set.
This model suggests that personality variation across the measured behavioral traits is best described by two correlated but distinct dimensions reflecting individual differences in affiliation and sociability (Agreeableness) as well as activity level, social play, and neophilia toward non-threatening stimuli (Openness). These results underscore the utility of our framework for quantifying personality in primates and facilitating greater integration between the behavioral ecological and comparative psychological approaches to personality research. © 2017 Wiley Periodicals, Inc.

  18. Towards a Framework for Change Detection in Data Sets

    NASA Astrophysics Data System (ADS)

    Böttcher, Mirko; Nauck, Detlef; Ruta, Dymitr; Spott, Martin

Since the world with its markets, innovations and customers is changing faster than ever before, the key to survival for businesses is the ability to detect, assess and respond to changing conditions rapidly and intelligently. Discovering changes and reacting to or acting upon them before others do has therefore become a strategic issue for many companies. However, existing data analysis techniques are insufficient for this task since they typically assume that the domain under consideration is stable over time. This paper presents a framework that detects changes within a data set at virtually any level of granularity. The underlying idea is to derive a rule-based description of the data set at different points in time and to subsequently analyse how these rules change. Nevertheless, further techniques are required to assist the data analyst in interpreting and assessing these changes. Therefore the framework also contains methods to discard rules that are non-drivers for change and to assess the interestingness of detected changes.
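The rule-change idea can be illustrated by computing a rule's support in two snapshots of a data set and flagging rules whose support shifts markedly. A minimal sketch with hypothetical transactions; the paper's framework derives full association rules and assesses interestingness, which this does not attempt:

```python
def rule_support(transactions, rules):
    """Support of each rule (here, simply an item set) in a
    list of transactions."""
    n = len(transactions)
    return {r: sum(1 for t in transactions if r <= t) / n for r in rules}

def detect_changes(snap_a, snap_b, rules, threshold=0.2):
    """Flag rules whose support shifted by more than `threshold`
    between two snapshots of the data set."""
    sa, sb = rule_support(snap_a, rules), rule_support(snap_b, rules)
    return {r: (sa[r], sb[r]) for r in rules if abs(sa[r] - sb[r]) > threshold}

# Hypothetical customer baskets at two points in time.
t1 = [{"dsl"}, {"dsl", "tv"}, {"dsl"}, {"mobile"}]
t2 = [{"mobile"}, {"mobile", "tv"}, {"mobile"}, {"dsl"}]
rules = [frozenset({"dsl"}), frozenset({"mobile"})]
changed = detect_changes(t1, t2, rules)
# Both rules are flagged: "dsl" support drops while "mobile" rises.
```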

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ciuca, Razvan; Hernández, Oscar F., E-mail: razvan.ciuca@mail.mcgill.ca, E-mail: oscarh@physics.mcgill.ca

There exist various proposals to detect cosmic strings from Cosmic Microwave Background (CMB) or 21 cm temperature maps. Current proposals do not aim to find the location of strings on sky maps; all of these approaches can be thought of as a statistic on a sky map. We propose a Bayesian interpretation of cosmic string detection and, within that framework, derive a connection between estimates of cosmic string locations and the cosmic string tension Gμ. We use this Bayesian framework to develop a machine learning framework for detecting strings from sky maps and outline how to implement this framework with neural networks. The neural network we trained was able to detect and locate cosmic strings on a noiseless CMB temperature map down to a string tension of Gμ = 5×10⁻⁹, and when analyzing a CMB temperature map that does not contain strings, the neural network gives a 0.95 probability that Gμ ≤ 2.3×10⁻⁹.
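The Bayesian interpretation can be sketched as a grid posterior over candidate string tensions given a sky-map statistic, from which a probability like P(Gμ ≤ 2.3×10⁻⁹) falls out directly. The Gaussian likelihood and its response function below are hypothetical stand-ins, purely for illustration:

```python
import math

def posterior_over_tension(observed_stat, tensions, noise_sigma=1.0):
    """Grid Bayesian update: flat prior over candidate string tensions,
    Gaussian likelihood for a sky-map statistic whose mean is assumed
    (hypothetically) proportional to the tension."""
    def likelihood(stat, gmu):
        mean = 1e9 * gmu          # hypothetical response of the statistic
        z = (stat - mean) / noise_sigma
        return math.exp(-0.5 * z * z)
    weights = [likelihood(observed_stat, g) for g in tensions]
    total = sum(weights)
    return [w / total for w in weights]

# Candidate tensions and a simulated "no strings seen" statistic.
grid = [0.0, 1e-9, 2e-9, 5e-9, 1e-8]
post = posterior_over_tension(0.1, grid)
# Posterior probability that the tension is at or below a bound.
prob_low = sum(p for g, p in zip(grid, post) if g <= 2.3e-9)
```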

  20. Sources of Variation in a Two-Step Monitoring Protocol for Species Clustered in Conspicuous Points: Dolichotis patagonum as a Case Study.

    PubMed

    Alonso Roldán, Virginia; Bossio, Luisina; Galván, David E

    2015-01-01

In species showing distributions attached to particular features of the landscape or conspicuous signs, counts are commonly made by focal observations where animals concentrate. However, to obtain density estimates for a given area, independent searching for signs and occupancy rates of suitable sites is needed. In both cases, it is important to estimate detection probability and other possible sources of variation to avoid confounding effects on measurements of abundance variation. Our objective was to assess possible bias and sources of variation in a two-step protocol in which random designs were applied to search for signs while continuously recording video cameras were used to perform abundance counts where animals concentrate, using mara (Dolichotis patagonum) as a case study. The protocol was successfully applied to maras within the Península Valdés protected area: it was logistically feasible and allowed warrens to be found, the associated adults to be counted, and detection probability to be estimated. Variability was documented in both components of the two-step protocol, and these sources of variation should be taken into account when applying it. Warren detectability was approximately 80% with little variation, and factors related to false positive detection were more important than imperfect detection. The detectability of individuals was approximately 90% when using the entire day of observations. The shortest sampling period with a detection capacity similar to that of a full day was approximately 10 hours, and during this period the visiting dynamic showed no trends. For individual maras, the detection capacity of the camera was not significantly different from that of an observer during fieldwork. The presence of the camera did not affect the visiting behavior of adults at the warren. Application of this protocol will allow monitoring of the near-threatened mara, providing a minimum local population size and a baseline for measuring long-term trends.
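The warren-level detectability estimate can be illustrated by pooling detection histories from repeated surveys of sites known to be occupied. This naive estimator is a sketch with made-up histories; a field study would fit an occupancy model instead:

```python
def detection_probability(histories):
    """Naive detectability estimate from repeated surveys of sites
    known to be occupied: detections divided by survey occasions.
    (A real analysis would use an occupancy model that separates
    detection from occupancy.)"""
    detections = sum(sum(h) for h in histories)
    occasions = sum(len(h) for h in histories)
    return detections / occasions

# Hypothetical detection histories (1 = warren seen) over 5 visits
# to three warrens known to be active.
histories = [[1, 1, 0, 1, 1], [1, 1, 1, 1, 0], [1, 0, 1, 1, 1]]
p_hat = detection_probability(histories)
# 0.8, comparable to the ~80% warren detectability reported.
```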

  1. A Comprehensive Texture Segmentation Framework for Segmentation of Capillary Non-Perfusion Regions in Fundus Fluorescein Angiograms

    PubMed Central

    Zheng, Yalin; Kwong, Man Ting; MacCormick, Ian J. C.; Beare, Nicholas A. V.; Harding, Simon P.

    2014-01-01

Capillary non-perfusion (CNP) in the retina is a characteristic feature used in the management of a wide range of retinal diseases. There is no well-established computational tool for assessing the extent of CNP. We propose a novel texture segmentation framework to address this problem. This framework comprises three major steps: pre-processing, unsupervised total variation texture segmentation, and supervised segmentation. It employs a state-of-the-art multiphase total variation texture segmentation model which is enhanced by new kernel-based region terms. The model can be applied to texture and intensity-based multiphase problems. A supervised segmentation step allows the framework to take expert knowledge into account; an AdaBoost classifier with weighted cost coefficients is chosen to tackle the imbalanced-data classification problem. To demonstrate its effectiveness, we applied this framework to 48 images from malarial retinopathy and 10 images from ischemic diabetic maculopathy. The performance of segmentation is satisfactory when compared to a reference standard of manual delineations: accuracy, sensitivity and specificity are 89.0%, 73.0%, and 90.8% respectively for the malarial retinopathy dataset and 80.8%, 70.6%, and 82.1% respectively for the diabetic maculopathy dataset. In terms of region-wise analysis, this method achieved an accuracy of 76.3% (45 out of 59 regions) for the malarial retinopathy dataset and 73.9% (17 out of 26 regions) for the diabetic maculopathy dataset. This comprehensive segmentation framework can quantify capillary non-perfusion in retinopathy from two distinct etiologies, and has the potential to be adopted for wider applications. PMID:24747681
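The reported accuracy, sensitivity and specificity follow directly from a pixel-level confusion matrix. A short sketch with hypothetical pixel counts:

```python
def segmentation_metrics(tp, fp, tn, fn):
    """Pixel-level accuracy, sensitivity and specificity from
    confusion-matrix counts (true/false positives and negatives),
    the kind of figures the CNP framework reports."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # fraction of CNP pixels found
    specificity = tn / (tn + fp)   # fraction of healthy pixels kept
    return accuracy, sensitivity, specificity

# Hypothetical pixel counts for one segmented angiogram.
acc, sens, spec = segmentation_metrics(tp=730, fp=92, tn=908, fn=270)
# acc = 0.819, sens = 0.73, spec = 0.908
```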

  2. Tank-Binding Kinase 1 (TBK1) Gene and Open-Angle Glaucomas (An American Ophthalmological Society Thesis)

    PubMed Central

    Fingert, John H.; Robin, Alan L.; Scheetz, Todd E.; Kwon, Young H.; Liebmann, Jeffrey M.; Ritch, Robert; Alward, Wallace L.M.

    2016-01-01

    Purpose To investigate the role of TANK-binding kinase 1 (TBK1) gene copy-number variations (ie, gene duplications and triplications) in the pathophysiology of various open-angle glaucomas. Methods In previous studies, we discovered that copy-number variations in the TBK1 gene are associated with normal-tension glaucoma. Here, we investigated the prevalence of copy-number variations in cohorts of patients with other open-angle glaucomas—juvenile-onset open-angle glaucoma (n=30), pigmentary glaucoma (n=209), exfoliation glaucoma (n=225), and steroid-induced glaucoma (n=79)—using a quantitative polymerase chain reaction assay. Results No TBK1 gene copy-number variations were detected in patients with juvenile-onset open-angle glaucoma, pigmentary glaucoma, or steroid-induced glaucoma. A TBK1 gene duplication was detected in one (0.44%) of the 225 exfoliation glaucoma patients. Conclusions TBK1 gene copy-number variations (gene duplications and triplications) have been previously associated with normal-tension glaucoma. An exploration of other open-angle glaucomas detected a TBK1 copy-number variation in a patient with exfoliation glaucoma, which is the first example of a TBK1 mutation in a glaucoma patient with a diagnosis other than normal-tension glaucoma. A broader phenotypic range may be associated with TBK1 copy-number variations, although mutations in this gene are most often detected in patients with normal-tension glaucoma. PMID:27881886

  3. Tank-Binding Kinase 1 (TBK1) Gene and Open-Angle Glaucomas (An American Ophthalmological Society Thesis).

    PubMed

    Fingert, John H; Robin, Alan L; Scheetz, Todd E; Kwon, Young H; Liebmann, Jeffrey M; Ritch, Robert; Alward, Wallace L M

    2016-08-01

To investigate the role of TANK-binding kinase 1 (TBK1) gene copy-number variations (ie, gene duplications and triplications) in the pathophysiology of various open-angle glaucomas. In previous studies, we discovered that copy-number variations in the TBK1 gene are associated with normal-tension glaucoma. Here, we investigated the prevalence of copy-number variations in cohorts of patients with other open-angle glaucomas: juvenile-onset open-angle glaucoma (n=30), pigmentary glaucoma (n=209), exfoliation glaucoma (n=225), and steroid-induced glaucoma (n=79), using a quantitative polymerase chain reaction assay. No TBK1 gene copy-number variations were detected in patients with juvenile-onset open-angle glaucoma, pigmentary glaucoma, or steroid-induced glaucoma. A TBK1 gene duplication was detected in one (0.44%) of the 225 exfoliation glaucoma patients. TBK1 gene copy-number variations (gene duplications and triplications) have been previously associated with normal-tension glaucoma. An exploration of other open-angle glaucomas detected a TBK1 copy-number variation in a patient with exfoliation glaucoma, which is the first example of a TBK1 mutation in a glaucoma patient with a diagnosis other than normal-tension glaucoma. A broader phenotypic range may be associated with TBK1 copy-number variations, although mutations in this gene are most often detected in patients with normal-tension glaucoma.
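Quantitative PCR copy-number calls of this kind are commonly derived with the comparative 2^-ΔΔCt method; the study's exact quantification pipeline is not specified here, so the sketch below is illustrative only:

```python
def copy_number_estimate(ct_target, ct_reference, ct_target_cal, ct_ref_cal):
    """Copy number via the standard comparative 2^-ddCt method:
    the target assay (e.g. TBK1) is normalized to a reference gene,
    then to a calibrator sample with two known copies.
    (Illustrative; the study's exact quantification may differ.)"""
    ddct = (ct_target - ct_reference) - (ct_target_cal - ct_ref_cal)
    return 2 * 2 ** (-ddct)    # calibrator carries 2 copies

# A sample whose target assay crosses threshold ~0.6 cycles earlier
# than the two-copy calibrator, consistent with a duplication
# (3 copies shift Ct by about log2(3/2) ~ 0.58 cycles).
cn = copy_number_estimate(ct_target=24.4, ct_reference=25.0,
                          ct_target_cal=25.0, ct_ref_cal=25.0)
# cn is approximately 3, suggesting one duplicated allele.
```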

  4. Lessons Learned From Developing A Streaming Data Framework for Scientific Analysis

    NASA Technical Reports Server (NTRS)

Wheeler, Kevin R.; Allan, Mark; Curry, Charles

    2003-01-01

    We describe the development and usage of a streaming data analysis software framework. The framework is used for three different applications: Earth science hyper-spectral imaging analysis, Electromyograph pattern detection, and Electroencephalogram state determination. In each application the framework was used to answer a series of science questions which evolved with each subsequent answer. This evolution is summarized in the form of lessons learned.

  5. A robust real-time abnormal region detection framework from capsule endoscopy images

    NASA Astrophysics Data System (ADS)

    Cheng, Yanfen; Liu, Xu; Li, Huiping

    2009-02-01

In this paper we present a novel method to detect abnormal regions in capsule endoscopy images. Wireless Capsule Endoscopy (WCE) is a recent technology in which a capsule with an embedded camera is swallowed by the patient to visualize the gastrointestinal tract. One challenge is that a single diagnostic procedure produces over 50,000 images, making the physicians' reviewing process expensive: it involves identifying images containing abnormal regions (tumor, bleeding, etc.) within this large image sequence. We therefore construct a framework for robust, real-time abnormal region detection across large numbers of capsule endoscopy images. Detected potential abnormal regions can be labeled automatically for physicians to review further, reducing the overall reviewing time. The framework has the following advantages: 1) Trainable. Users can define and label any type of abnormal region they want to find; abnormal regions, such as tumor and bleeding, can be pre-defined and labeled using the graphical user interface tool we provide. 2) Efficient. Given the large volume of image data, detection speed is critical; our system detects efficiently at different scales due to the integral image features we use. 3) Robust. After feature selection, we use a cascade of classifiers to further improve detection accuracy.

  6. Inferring causal genomic alterations in breast cancer using gene expression data

    PubMed Central

    2011-01-01

Background One of the primary objectives in cancer research is to identify causal genomic alterations, such as somatic copy number variation (CNV) and somatic mutations, during tumor development. Many valuable studies lack genomic data to detect CNV; therefore, methods that are able to infer CNVs from gene expression data would help maximize the value of these studies. Results We developed a framework for identifying recurrent regions of CNV and distinguishing the cancer driver genes from the passenger genes in the regions. By inferring CNV regions across many datasets we were able to identify 109 recurrent amplified/deleted CNV regions. Many of these regions are enriched for genes involved in many important processes associated with tumorigenesis and cancer progression. Genes in these recurrent CNV regions were then examined in the context of gene regulatory networks to prioritize putative cancer driver genes. The cancer driver genes uncovered by the framework include not only well-known oncogenes but also a number of novel cancer susceptibility genes validated via siRNA experiments. Conclusions To our knowledge, this is the first effort to systematically identify and validate drivers for expression based CNV regions in breast cancer. The framework, in which wavelet analysis of expression-based copy number alteration is coupled with gene regulatory network analysis, provides a blueprint for leveraging genomic data to identify key regulatory components and gene targets. This integrative approach can be applied to many other large-scale gene expression studies and other novel types of cancer data such as next-generation sequencing based expression (RNA-Seq) as well as CNV data. PMID:21806811
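The idea of inferring CNV regions from expression can be sketched by smoothing log-expression along chromosomal order and flagging coordinated deviations. The paper uses wavelet analysis; a moving average stands in below, with hypothetical data:

```python
import statistics

def flag_cnv_regions(expression, window=3, z_cut=1.0):
    """Sketch of inferring CNV from expression: smooth log-expression
    along chromosomal gene order with a moving average, then flag
    positions whose smoothed level deviates from the overall mean.
    (The paper uses wavelet analysis; a moving average stands in.)"""
    n = len(expression)
    half = window // 2
    smoothed = []
    for i in range(n):
        win = expression[max(0, i - half):i + half + 1]
        smoothed.append(sum(win) / len(win))
    mu = statistics.mean(smoothed)
    sd = statistics.stdev(smoothed)
    return [i for i, s in enumerate(smoothed) if abs(s - mu) > z_cut * sd]

# Hypothetical log2 expression for genes in chromosomal order, with
# a coordinated rise (a putative amplification) in the middle.
expr = [0.1, -0.2, 0.0, 0.1, 1.2, 1.4, 1.3, 0.0, -0.1, 0.2]
amplified = flag_cnv_regions(expr)
# flags the amplified stretch around indices 4-6
```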

  7. Cell-free DNA fragment-size distribution analysis for non-invasive prenatal CNV prediction.

    PubMed

    Arbabi, Aryan; Rampášek, Ladislav; Brudno, Michael

    2016-06-01

Non-invasive detection of aneuploidies in a fetal genome through analysis of cell-free DNA circulating in the maternal plasma is becoming a routine clinical test. Such tests, which rely on analyzing the read coverage or the allelic ratios at single-nucleotide polymorphism (SNP) loci, are not sensitive enough for smaller sub-chromosomal abnormalities due to sequencing biases and paucity of SNPs in a genome. We have developed an alternative framework for identifying sub-chromosomal copy number variations in a fetal genome. This framework relies on the size distribution of fragments in a sample, as fetal-origin fragments tend to be smaller than those of maternal origin. By analyzing the local distribution of the cell-free DNA fragment sizes in each region, our method allows for the identification of sub-megabase CNVs, even in the absence of SNP positions. To evaluate the accuracy of our method, we used a plasma sample with a fetal fraction of 13%, down-sampled it to samples with coverage of 10X-40X, and simulated samples with CNVs based on it. Our method had perfect accuracy (both specificity and sensitivity) for detecting 5 Mb CNVs, and after reducing the fetal fraction (to 11%, 9% and 7%), it could correctly identify 98.82-100% of the 5 Mb CNVs and had a true-negative rate of 95.29-99.76%. Our source code is available on GitHub at https://github.com/compbio-UofT/FSDA. Contact: brudno@cs.toronto.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
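The fragment-size signal behind the method can be sketched by scoring each genomic bin on its proportion of short, fetal-enriched fragments: a bin whose fetal-origin coverage is doubled by a duplication shifts toward short fragments. A toy illustration with hypothetical fragment lengths, not the published FSDA implementation:

```python
import statistics

def short_fragment_scores(bins, cutoff=150):
    """Per-bin proportion of short (fetal-enriched) cfDNA fragments,
    expressed as a z-score against all bins. Minimal sketch of the
    fragment-size idea, not the published tool."""
    props = [sum(1 for f in b if f < cutoff) / len(b) for b in bins]
    mu, sd = statistics.mean(props), statistics.stdev(props)
    return [(p - mu) / sd for p in props]

# Hypothetical fragment lengths (bp) per genomic bin; the last bin
# carries an excess of short fragments, mimicking a fetal duplication.
normal = [120, 160, 170, 165, 155, 168, 172, 140]
dup    = [120, 130, 135, 166, 125, 140, 128, 170]
z = short_fragment_scores([normal, normal[:], normal[:], dup])
# The duplicated bin stands out with a clearly positive z-score.
```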

  8. A statistical approach to detection of copy number variations in PCR-enriched targeted sequencing data.

    PubMed

    Demidov, German; Simakova, Tamara; Vnuchkova, Julia; Bragin, Anton

    2016-10-22

Multiplex polymerase chain reaction (PCR) is a common enrichment technique for targeted massive parallel sequencing (MPS) protocols. MPS is widely used in biomedical research and clinical diagnostics as a fast and accurate tool for the detection of short genetic variations. However, identification of larger variations such as structural variants and copy number variations (CNV) remains a challenge for targeted MPS. Some approaches and tools for structural variant detection have been proposed, but they have limitations and often require datasets of a certain type and size, with an expected number of amplicons affected by CNVs. In this paper, we describe a novel algorithm for high-resolution germline CNV detection in PCR-enriched targeted sequencing data and present an accompanying tool. We have developed a machine learning algorithm for the detection of large duplications and deletions in targeted sequencing data generated with a PCR-based enrichment step. We have performed verification studies and established the algorithm's sensitivity and specificity. We have compared the developed tool with other available methods applicable to the described data and revealed its higher performance. We showed that our method has high specificity and sensitivity for high-resolution copy number detection in targeted sequencing data, using a large cohort of samples.
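The core ratio computation behind amplicon-based CNV calling can be sketched as follows: normalize per-amplicon read counts within each sample, then compare each amplicon's log2 ratio against a control baseline. The published tool adds a trained model on top; this toy version keeps only the ratio idea:

```python
import math
import statistics

def flag_cnv_amplicons(sample, controls, cutoff=0.5):
    """Sketch of CNV calling in PCR-enriched data: normalize each
    amplicon's read count by the sample total, compare the log2 ratio
    against the median of control samples, and flag large deviations.
    (The published tool uses a trained model; only the ratio idea
    is kept here.)"""
    def norm(counts):
        total = sum(counts)
        return [c / total for c in counts]
    s = norm(sample)
    ctrl = [norm(c) for c in controls]
    flags = []
    for i in range(len(sample)):
        baseline = statistics.median(c[i] for c in ctrl)
        ratio = math.log2(s[i] / baseline)
        if abs(ratio) > cutoff:
            flags.append((i, round(ratio, 2)))
    return flags

# Hypothetical read counts per amplicon; amplicon 2 is deleted.
sample   = [1000, 980, 490, 1020]
controls = [[1000, 1000, 1000, 1000], [990, 1010, 995, 1005]]
dels = flag_cnv_amplicons(sample, controls)
# Only the deleted amplicon is flagged, with a log2 ratio near -1.
```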

  9. Solution of Radiative Transfer Equation with a Continuous and Stochastic Varying Refractive Index by Legendre Transform Method

    PubMed Central

    Gantri, M.

    2014-01-01

    The present paper gives a new computational framework within which radiative transfer in a varying refractive index biological tissue can be studied. In our previous works, Legendre transform was used as an innovative view to handle the angular derivative terms in the case of uniform refractive index spherical medium. In biomedical optics, our analysis can be considered as a forward problem solution in a diffuse optical tomography imaging scheme. We consider a rectangular biological tissue-like domain with spatially varying refractive index submitted to a near infrared continuous light source. Interaction of radiation with the biological material into the medium is handled by a radiative transfer model. In the studied situation, the model displays two angular redistribution terms that are treated with Legendre integral transform. The model is used to study a possible detection of abnormalities in a general biological tissue. The effect of the embedded nonhomogeneous objects on the transmitted signal is studied. Particularly, detection of targets of localized heterogeneous inclusions within the tissue is discussed. Results show that models accounting for variation of refractive index can yield useful predictions about the target and the location of abnormal inclusions within the tissue. PMID:25013454

  10. How is water-use efficiency of terrestrial ecosystems distributed and changing on Earth?

    PubMed

    Tang, Xuguang; Li, Hengpeng; Desai, Ankur R; Nagy, Zoltan; Luo, Juhua; Kolb, Thomas E; Olioso, Albert; Xu, Xibao; Yao, Li; Kutsch, Werner; Pilegaard, Kim; Köstner, Barbara; Ammann, Christof

    2014-12-15

A better understanding of ecosystem water-use efficiency (WUE) will help us improve ecosystem management for mitigation as well as adaptation to global hydrological change. Here, long-term flux tower observations of productivity and evapotranspiration allow us to detect a consistent latitudinal trend in WUE, rising from the subtropics to the northern high-latitudes. The trend peaks at approximately 51°N, and then declines toward higher latitudes. These ground-based observations are consistent with global-scale estimates of WUE. Global analysis of WUE reveals the existence of strong regional variations that correspond to global climate patterns. The latitudinal trends of global WUE for Earth's major plant functional types reveal two peaks in the Northern Hemisphere not detected by ground-based measurements. One peak is located at 20°-30°N and the other extends a little farther north than 51°N. Finally, long-term spatiotemporal trend analysis using satellite-based remote sensing data reveals that land-cover and land-use change in recent years has led to a decline in global WUE. Our study provides a new framework for global research on the interactions between carbon and water cycles as well as responses to natural and human impacts.

  11. Fluctuations of hi-hat timing and dynamics in a virtuoso drum track of a popular music recording.

    PubMed

    Räsänen, Esa; Pulkkinen, Otto; Virtanen, Tuomas; Zollner, Manfred; Hennig, Holger

    2015-01-01

Long-range correlated temporal fluctuations in the beats of musical rhythms are an inevitable consequence of human action. According to recent studies, such fluctuations also lead to a favored listening experience. The scaling laws of amplitude variations in rhythms, however, are widely unknown. Here we use highly sensitive onset detection and time series analysis to study the amplitude and temporal fluctuations of Jeff Porcaro's one-handed hi-hat pattern in "I Keep Forgettin'"-one of the most renowned 16th note patterns in modern drumming. We show that fluctuations of hi-hat amplitudes and interbeat intervals (times between hits) have clear long-range correlations and short-range anticorrelations separated by a characteristic time scale. In addition, we detect subtle features in Porcaro's drumming such as small drifts in the 16th note pulse and non-trivial periodic two-bar patterns in both hi-hat amplitudes and intervals. Through this investigation we introduce a step towards statistical studies of 20th and 21st century music recordings in the framework of complex systems. Our analysis has direct applications to the development of drum machines and to drumming pedagogy.
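Long-range correlation analyses of this kind typically use detrended fluctuation analysis; as a simpler proxy, the sample autocorrelation of interbeat intervals already exposes slow drifts around the nominal pulse. A sketch with synthetic intervals:

```python
def autocorrelation(series, lag):
    """Sample autocorrelation at a given lag. Persistent positive
    values are the signature of correlated timing fluctuations;
    the study itself uses detrended fluctuation analysis, for which
    this is only a simple proxy."""
    n = len(series)
    mu = sum(series) / n
    var = sum((x - mu) ** 2 for x in series) / n
    cov = sum((series[i] - mu) * (series[i + lag] - mu)
              for i in range(n - lag)) / (n - lag)
    return cov / var

# Hypothetical interbeat intervals (seconds) drifting slowly around
# a nominal 16th-note pulse in blocks of eight hits.
ibis = [0.125 + 0.002 * ((i // 8) % 2) for i in range(64)]
r1 = autocorrelation(ibis, 1)
# Positive lag-1 autocorrelation: successive intervals are not
# independent, as expected for human timing.
```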

  12. Dynamic graphs, community detection, and Riemannian geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bakker, Craig; Halappanavar, Mahantesh; Visweswara Sathanur, Arun

A community is a subset of a wider network where the members of that subset are more strongly connected to each other than they are to the rest of the network. In this paper, we consider the problem of identifying and tracking communities in graphs that change over time (dynamic community detection) and present a framework based on Riemannian geometry to aid in this task. Our framework currently supports several important operations such as interpolating between and averaging over graph snapshots. We compare these Riemannian methods with entry-wise linear interpolation and find that the Riemannian methods are generally better suited to dynamic community detection. Next steps with the Riemannian framework include developing higher-order interpolation methods (e.g. the analogues of polynomial and spline interpolation) and a Riemannian least-squares regression method for working with noisy data.
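The entry-wise linear interpolation used as the paper's comparison baseline is straightforward to state: each adjacency-matrix entry is interpolated independently between snapshots. A minimal sketch:

```python
def interpolate_graphs(adj_a, adj_b, t):
    """Entry-wise linear interpolation between two adjacency-matrix
    snapshots (the baseline the Riemannian operations are compared
    against): A(t) = (1 - t) * A0 + t * A1, applied per entry."""
    return [[(1 - t) * a + t * b for a, b in zip(ra, rb)]
            for ra, rb in zip(adj_a, adj_b)]

# Two snapshots of a 3-node graph: one edge weakens while another
# strengthens as the community structure shifts.
g0 = [[0.0, 1.0, 0.0],
      [1.0, 0.0, 0.2],
      [0.0, 0.2, 0.0]]
g1 = [[0.0, 0.2, 0.0],
      [0.2, 0.0, 1.0],
      [0.0, 1.0, 0.0]]
mid = interpolate_graphs(g0, g1, 0.5)
# At the midpoint both contested edges carry weight 0.6.
```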

  13. A bifunctional luminescent Tb(III)-metal-organic framework by a tetracarboxylate ligand for highly selective detection of Fe3+ cation and Cr2O72- anion

    NASA Astrophysics Data System (ADS)

    Yu, Li; Wang, Chao; Hu, Chang-Jiang; Dong, Wen-Wen; Wu, Ya-Pan; Li, Dong-Sheng; Zhao, Jun

    2018-06-01

Reaction of Tb3+ ions with p-terphenyl-3,3″,5,5″-tetracarboxylic acid (H4ptptc) in a mixed solvent system has afforded a new metal-organic framework formulated as [Tb2(ptptc)1.5(H2O)2]n (1). Compound 1 displays a 3D (5,6,8)-connected framework with fascinating one-dimensional triangular open channels. Luminescence explorations demonstrated that 1 exhibits a highly selective and sensitive response to Fe3+ in DMF solution and in biological systems through luminescence quenching effects. In addition, 1 also shows highly selective detection of Cr2O72-, making it a promising dual-functional material for detecting the Fe3+ cation and Cr2O72- anion with high sensitivity and selectivity.

  14. Advances in Micromechanics Modeling of Composites Structures for Structural Health Monitoring

    NASA Astrophysics Data System (ADS)

    Moncada, Albert

    Although high performance, light-weight composites are increasingly being used in applications ranging from aircraft, rotorcraft, weapon systems and ground vehicles, the assurance of structural reliability remains a critical issue. In composites, damage is absorbed through various fracture processes, including fiber failure, matrix cracking and delamination. An important element in achieving reliable composite systems is a strong capability of assessing and inspecting physical damage of critical structural components. Installation of a robust Structural Health Monitoring (SHM) system would be very valuable in detecting the onset of composite failure. A number of major issues still require serious attention in connection with the research and development aspects of sensor-integrated reliable SHM systems for composite structures. In particular, the sensitivity of currently available sensor systems does not allow detection of micro level damage; this limits the capability of data driven SHM systems. As a fundamental layer in SHM, modeling can provide in-depth information on material and structural behavior for sensing and detection, as well as data for learning algorithms. This dissertation focuses on the development of a multiscale analysis framework, which is used to detect various forms of damage in complex composite structures. A generalized method of cells based micromechanics analysis, as implemented in NASA's MAC/GMC code, is used for the micro-level analysis. First, a baseline study of MAC/GMC is performed to determine the governing failure theories that best capture the damage progression. The deficiencies associated with various layups and loading conditions are addressed. In most micromechanics analysis, a representative unit cell (RUC) with a common fiber packing arrangement is used. 
The effect of variation in this arrangement within the RUC has been studied and results indicate this variation influences the macro-scale effective material properties and failure stresses. The developed model has been used to simulate impact damage in a composite beam and an airfoil structure. The model data was verified through active interrogation using piezoelectric sensors. The multiscale model was further extended to develop a coupled damage and wave attenuation model, which was used to study different damage states such as fiber-matrix debonding in composite structures with surface bonded piezoelectric sensors.

  15. Techniques A: continuous waves

    NASA Astrophysics Data System (ADS)

    Beuthan, J.

    1993-08-01

For a wide range of medical diseases, the biochemical and physiological changes of soft tissues are hardly detectable by conventional techniques of diagnostic imaging (x-ray, ultrasound, computed tomography, and MRI): detectability is low and the technical effort is tremendous. On the other hand, these pathologic variations induce significant changes in the optical tissue parameters, which can be detected. The corresponding variations of the scattered light can most easily be detected and evaluated by infrared diaphanoscopy, even in optically thick tissue slices.

  16. Framework for Detection and Localization of Extreme Climate Event with Pixel Recursive Super Resolution

    NASA Astrophysics Data System (ADS)

    Kim, S. K.; Lee, J.; Zhang, C.; Ames, S.; Williams, D. N.

    2017-12-01

    Deep learning techniques have been successfully applied to solve many problems in climate and geoscience using massive-scale observed and modeled data. For extreme climate event detection, several models based on deep neural networks have recently been proposed and attain superior performance that overshadows all previous handcrafted, expert-based methods. The issue arising, though, is that accurate localization of events requires high-quality climate data. In this work, we propose a framework capable of detecting and localizing extreme climate events in very coarse climate data. Our framework is based on two models using deep neural networks: (1) convolutional neural networks (CNNs) to detect and localize extreme climate events, and (2) a pixel recursive super resolution model to reconstruct high-resolution climate data from low-resolution climate data. Based on our preliminary work, we present two CNNs in our framework for different purposes, detection and localization. Our results using CNNs for extreme climate event detection show that simple neural nets can capture the pattern of extreme climate events with high accuracy from very coarse reanalysis data. However, localization accuracy is relatively low due to the coarse resolution. To resolve this issue, the pixel recursive super resolution model reconstructs the resolution of the input to the localization CNN. We present the best network using the pixel recursive super resolution model, which synthesizes details of tropical cyclones in ground truth data while enhancing their resolution. Therefore, this approach not only dramatically reduces the human effort, but also suggests the possibility of reducing the computing cost required for the downscaling process to increase the resolution of data.
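    As an illustrative sketch only (the abstract's actual detectors are trained deep CNNs), the detection step can be reduced to its core operation: 2D cross-correlation of a coarse field with a filter, followed by an argmax for localization. The hand-crafted filter and synthetic field below are assumptions standing in for the trained model and reanalysis data:

```python
import numpy as np

def cross_correlate2d(field, kernel):
    """Valid-mode 2D cross-correlation (the core operation of a conv layer)."""
    kh, kw = kernel.shape
    H, W = field.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(field[r:r + kh, c:c + kw] * kernel)
    return out

def detect_event(field, kernel):
    """Return the top-left corner of the best-matching patch."""
    score = cross_correlate2d(field, kernel)
    return np.unravel_index(np.argmax(score), score.shape)

# synthetic coarse "climate field" with one cyclone-like hot spot
rng = np.random.default_rng(0)
field = rng.normal(0.0, 0.1, size=(32, 32))
blob = np.array([[0.5, 1.0, 0.5],
                 [1.0, 2.0, 1.0],
                 [0.5, 1.0, 0.5]])
field[10:13, 20:23] += blob

i, j = detect_event(field, blob)
print(i, j)
```

In the paper's pipeline a super-resolution model would first upsample the coarse field so that the localization step operates on a finer grid; here detection runs directly on the coarse grid.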

  17. Genetic analysis of CHST6 and TGFBI in Turkish patients with corneal dystrophies: Five novel variations in CHST6

    PubMed Central

    Yaylacioglu Tuncay, Fulya; Kayman Kurekci, Gülsüm; Guntekin Ergun, Sezen; Pasaoglu, Ozge Tugce; Akata, Rustu Fikret; Dincer, Pervin Rukiye

    2016-01-01

    Purpose To identify pathogenic variations in carbohydrate sulfotransferase 6 (CHST6) and transforming growth factor, beta-induced (TGFBI) genes in Turkish patients with corneal dystrophy (CD). Methods In this study, patients with macular corneal dystrophy (MCD; n = 18), granular corneal dystrophy type 1 (GCD1; n = 12), and lattice corneal dystrophy type 1 (LCD1; n = 4), as well as 50 healthy controls, were subjected to clinical and genetic examinations. The level of antigenic keratan sulfate (AgKS) in the serum samples of patients with MCD was determined with enzyme-linked immunosorbent assay (ELISA) to immunophenotypically subtype the patients as MCD type I and MCD type II. DNA was isolated from venous blood samples from the patients and controls. Variations were analyzed with DNA sequencing in the coding region of CHST6 in patients with MCD and exons 4 and 12 in TGFBI in patients with LCD1 and GCD1. Clinical characteristics and the detected variations were evaluated to determine any existing genotype–phenotype correlations. Results The previously reported R555W mutation in TGFBI was detected in 12 patients with GCD1, and the R124C mutation in TGFBI was detected in four patients with LCD1. Serum AgKS levels indicated that 12 patients with MCD were in subgroup I, and five patients with MCD were in subgroup II. No genetic variation was detected in the coding region of CHST6 for three patients with MCD type II. In other patients with MCD, three previously reported missense variations (c.1A>T, c.738C>G, and c.631C>T), three novel missense variations (c.164T>C, c.526G>A, c.610C>T), and two novel frameshift variations (c.894_895insG and c.462_463delGC) were detected. These variations did not exist in the control chromosomes, 1000 Genomes, and dbSNP. Conclusions This is the first molecular analysis of TGFBI and CHST6 in Turkish patients with different types of CD. We detected previously reported, well-known hot spot mutations in TGFBI in the patients with GCD1 and LCD1. Eight likely pathogenic variations in CHST6, five of them novel, were reported in patients with MCD, which enlarges the mutational spectrum of MCD. PMID:27829782

  18. Genetic analysis of CHST6 and TGFBI in Turkish patients with corneal dystrophies: Five novel variations in CHST6.

    PubMed

    Yaylacioglu Tuncay, Fulya; Kayman Kurekci, Gülsüm; Guntekin Ergun, Sezen; Pasaoglu, Ozge Tugce; Akata, Rustu Fikret; Dincer, Pervin Rukiye

    2016-01-01

    To identify pathogenic variations in carbohydrate sulfotransferase 6 (CHST6) and transforming growth factor, beta-induced (TGFBI) genes in Turkish patients with corneal dystrophy (CD). In this study, patients with macular corneal dystrophy (MCD; n = 18), granular corneal dystrophy type 1 (GCD1; n = 12), and lattice corneal dystrophy type 1 (LCD1; n = 4), as well as 50 healthy controls, were subjected to clinical and genetic examinations. The level of antigenic keratan sulfate (AgKS) in the serum samples of patients with MCD was determined with enzyme-linked immunosorbent assay (ELISA) to immunophenotypically subtype the patients as MCD type I and MCD type II. DNA was isolated from venous blood samples from the patients and controls. Variations were analyzed with DNA sequencing in the coding region of CHST6 in patients with MCD and exons 4 and 12 in TGFBI in patients with LCD1 and GCD1. Clinical characteristics and the detected variations were evaluated to determine any existing genotype-phenotype correlations. The previously reported R555W mutation in TGFBI was detected in 12 patients with GCD1, and the R124C mutation in TGFBI was detected in four patients with LCD1. Serum AgKS levels indicated that 12 patients with MCD were in subgroup I, and five patients with MCD were in subgroup II. No genetic variation was detected in the coding region of CHST6 for three patients with MCD type II. In other patients with MCD, three previously reported missense variations (c.1A>T, c.738C>G, and c.631C>T), three novel missense variations (c.164T>C, c.526G>A, c.610C>T), and two novel frameshift variations (c.894_895insG and c.462_463delGC) were detected. These variations did not exist in the control chromosomes, 1000 Genomes, and dbSNP. This is the first molecular analysis of TGFBI and CHST6 in Turkish patients with different types of CD. We detected previously reported, well-known hot spot mutations in TGFBI in the patients with GCD1 and LCD1. Eight likely pathogenic variations in CHST6, five of them novel, were reported in patients with MCD, which enlarges the mutational spectrum of MCD.

  19. Device for detecting imminent failure of high-dielectric stress capacitors. [Patent application]

    DOEpatents

    McDuff, G.G.

    1980-11-05

    A device is described for detecting imminent failure of a high-dielectric stress capacitor utilizing circuitry for detecting pulse width variations and pulse magnitude variations. Inexpensive microprocessor circuitry is utilized to make numerical calculations of digital data supplied by detection circuitry for comparison of pulse width data and magnitude data to determine if preselected ranges have been exceeded, thereby indicating imminent failure of a capacitor. Detection circuitry may be incorporated in transmission lines, pulse power circuitry, including laser pulse circuitry or any circuitry where capacitors or capacitor banks are utilized.

  20. Device for detecting imminent failure of high-dielectric stress capacitors

    DOEpatents

    McDuff, George G.

    1982-01-01

    A device is described for detecting imminent failure of a high-dielectric stress capacitor utilizing circuitry for detecting pulse width variations and pulse magnitude variations. Inexpensive microprocessor circuitry is utilized to make numerical calculations of digital data supplied by detection circuitry for comparison of pulse width data and magnitude data to determine if preselected ranges have been exceeded, thereby indicating imminent failure of a capacitor. Detection circuitry may be incorporated in transmission lines, pulse power circuitry, including laser pulse circuitry, or any circuitry where capacitors or capacitor banks are utilized.
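    A minimal sketch of the comparison logic the patent attributes to the microprocessor circuitry: a pulse is flagged when its width or magnitude leaves a preselected range. The range limits below are hypothetical placeholders, not values from the patent:

```python
# Hypothetical preselected ranges; a real monitor would calibrate these
# against the capacitor's healthy baseline pulses.
PULSE_WIDTH_RANGE = (9.0, 11.0)       # microseconds
PULSE_MAGNITUDE_RANGE = (0.95, 1.05)  # normalized volts

def imminent_failure(width_us, magnitude):
    """Flag a pulse whose width or magnitude falls outside its preselected range."""
    w_ok = PULSE_WIDTH_RANGE[0] <= width_us <= PULSE_WIDTH_RANGE[1]
    m_ok = PULSE_MAGNITUDE_RANGE[0] <= magnitude <= PULSE_MAGNITUDE_RANGE[1]
    return not (w_ok and m_ok)

print(imminent_failure(10.0, 1.00))  # healthy pulse
print(imminent_failure(12.5, 1.00))  # width out of range
```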

  1. The effect of using different competence frameworks to audit the content of a masters program in public health.

    PubMed

    Harrison, Roger A; Gemmell, Isla; Reed, Katie

    2015-01-01

    (1) To quantify the effect of using different public health competence frameworks to audit the curriculum of an online distance learning MPH program, and (2) to measure variation in the outcomes of the audit depending on which competence framework is used. Retrospective audit. We compared the teaching content of an online distance learning MPH program against each competence listed in different public health competence frameworks relevant to an MPH. We then compared the number of competences covered in each module in the program's teaching curriculum and in the program overall, for each of the competence frameworks used in this audit. A comprehensive search of the literature identified two competence frameworks specific to MPH programs and two for public health professional/specialty training. The number of individual competences in each framework was 32 for the taught aspects of the UK Faculty of Public Health Specialist Training Program, 117 for the American Association of Public Health, 282 for the exam curriculum of the UK Faculty of Public Health Part A exam, and 393 for the European Core Competencies for MPH Education. This gave a total of 824 competences included in the audit. Overall, the online MPH program covered 88-96% of the competences depending on the specific framework used. This coverage fell when the audit focused on just the three mandatory modules in the program, and the variation between the different competence frameworks was much larger. Using different competence frameworks to audit the curriculum of an MPH program can give different indications of its quality, especially as an audit fails to capture teaching that is relevant yet not included in the chosen competence framework. The strengths and weaknesses of using competence frameworks to audit the content of an MPH program have largely been ignored. These debates are vital given that external organizations responsible for accreditation specify a particular competence framework to be used. 
    Our study found that each of four different competence frameworks suggested different levels of quality in our teaching program, at least in terms of the competences included in the curriculum. Relying on just one established framework missed some aspects of the curriculum included in other frameworks used in this study. Conversely, each framework included items not covered by the others. Thus, levels of agreement between the content of our MPH and established areas of competence were, in part, dependent on the competence framework used to compare its content. While not an entirely surprising finding, this study makes an important point by making explicit the challenges of selecting an appropriate competence framework to inform MPH programs, especially one which recruits students from around the world.

  2. Framework for a space shuttle main engine health monitoring system

    NASA Technical Reports Server (NTRS)

    Hawman, Michael W.; Galinaitis, William S.; Tulpule, Sharayu; Mattedi, Anita K.; Kamenetz, Jeffrey

    1990-01-01

    A framework developed for a health management system (HMS) which is directed at improving the safety of operation of the Space Shuttle Main Engine (SSME) is summarized. An emphasis was placed on near term technology through requirements to use existing SSME instrumentation and to demonstrate the HMS during SSME ground tests within five years. The HMS framework was developed through an analysis of SSME failure modes, fault detection algorithms, sensor technologies, and hardware architectures. A key feature of the HMS framework design is that a clear path from the ground test system to a flight HMS was maintained. Fault detection techniques based on time series, nonlinear regression, and clustering algorithms were developed and demonstrated on data from SSME ground test failures. The fault detection algorithms exhibited 100 percent detection of faults, had an extremely low false alarm rate, and were robust to sensor loss. These algorithms were incorporated into a hierarchical decision making strategy for overall assessment of SSME health. A preliminary design for a hardware architecture capable of supporting real time operation of the HMS functions was developed. Utilizing modular, commercial off-the-shelf components produced a reliable low cost design with the flexibility to incorporate advances in algorithm and sensor technology as they become available.
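    The time-series fault detection behavior described above (high detection rate with a very low false alarm rate, robust decision-making) can be sketched as a trailing-window detector with a robust baseline and a persistence requirement. The window length, threshold, and step-fault magnitude below are illustrative assumptions, not SSME values:

```python
import numpy as np

def alarm(signal, window=20, k=5.0, persist=3):
    """Declare a fault when `persist` consecutive samples deviate more than
    k robust standard deviations from a trailing-window baseline.
    Median/MAD keep the baseline from being contaminated by the fault itself;
    the persistence requirement keeps the false alarm rate low."""
    run = 0
    for t in range(window, len(signal)):
        base = signal[t - window:t]
        mu = np.median(base)
        sd = 1.4826 * np.median(np.abs(base - mu)) + 1e-12  # robust sigma
        run = run + 1 if abs(signal[t] - mu) > k * sd else 0
        if run >= persist:
            return t  # sample index at which the fault is declared
    return None

rng = np.random.default_rng(1)
healthy = rng.normal(100.0, 1.0, 200)  # hypothetical sensor channel
faulty = healthy.copy()
faulty[150:] += 25.0                   # hypothetical step fault at sample 150
print(alarm(healthy), alarm(faulty))
```

The persistence-of-`persist`-samples rule is one simple way to trade a short detection delay for the extremely low false alarm rate the abstract emphasizes.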

  3. Benchmarking global land surface models in CMIP5: analysis of ecosystem water use efficiency (WUE) and Budyko framework

    NASA Astrophysics Data System (ADS)

    Li, Longhui

    2015-04-01

    Twelve Earth System Models (ESMs) from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are evaluated in terms of ecosystem water use efficiency (WUE) and the Budyko framework. Simulated values of GPP and ET from the ESMs were validated against FLUXNET measurements, and the slope of the linear regression between measurement and model ranged from 0.24 in CanESM2 to 0.8 in GISS-E2 for GPP, and from 0.51 to 0.86 for ET. The performance of the 12 ESMs in simulating ET is generally better than for GPP. Compared with flux-tower-based estimates by Jung et al. [Journal of Geophysical Research 116 (2011) G00J07] (JU11), all ESMs could capture the latitudinal variations of GPP and ET, but the majority of models strongly overestimated GPP and ET, particularly around the equator. The 12 ESMs showed much larger variations in latitudinal WUE. Four of the 12 ESMs predicted global annual GPP higher than 150 Pg C year-1, while the other eight predicted global GPP within ±15% of the JU11 estimate. In contrast, all ESMs showed only moderate bias for global ET. The coefficient of variation (CV) of ET (0.11) is significantly less than that of GPP (0.25). More than half of the 12 ESMs generally comply with the Budyko framework, but some models deviated considerably. Spatial analysis of error in GPP and ET indicated that results differ substantially among models and regions. This study suggests that ET is estimated much better than GPP. Incorporating the convergence of WUE and the Budyko framework into ESMs as constraints in the next round of the CMIP scheme is expected to decrease the uncertainties of carbon and water flux estimates.
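    The coefficient-of-variation comparison above (CV of ET smaller than CV of GPP across models) takes a few lines to reproduce. The per-model GPP and ET values below are hypothetical stand-ins, not the actual CMIP5 outputs:

```python
import numpy as np

# Hypothetical global annual GPP (Pg C / yr) and ET values for 12 models,
# standing in for the CMIP5 ESMs discussed above.
gpp = np.array([120., 135., 150., 155., 160., 118., 125., 170., 128., 132., 140., 165.])
et  = np.array([65., 70., 68., 72., 75., 63., 66., 74., 64., 69., 71., 73.])

wue = gpp / et                       # ecosystem water use efficiency, per model
cv = lambda x: x.std() / x.mean()    # coefficient of variation across models

print(f"CV(GPP) = {cv(gpp):.2f}, CV(ET) = {cv(et):.2f}")
```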

  4. A Metacommunity Framework for Enhancing the Effectiveness of Biological Monitoring Strategies

    PubMed Central

    Roque, Fabio O.; Cottenie, Karl

    2012-01-01

    Because of inadequate knowledge and funding, the use of biodiversity indicators is often suggested as a way to support management decisions. Consequently, many studies have analyzed the performance of certain groups as indicator taxa. However, in addition to knowing whether certain groups can adequately represent the biodiversity as a whole, we must also know whether they show similar responses to the main structuring processes affecting biodiversity. Here we present an application of the metacommunity framework for evaluating the effectiveness of biodiversity indicators. Although the metacommunity framework has contributed to a better understanding of biodiversity patterns, there is still limited discussion about its implications for conservation and biomonitoring. We evaluated the effectiveness of indicator taxa in representing spatial variation in macroinvertebrate community composition in Atlantic Forest streams, and the processes that drive this variation. We focused on analyzing whether some groups conform to environmental processes and other groups are more influenced by spatial processes, and on how this can help in deciding which indicator group or groups should be used. We showed that a relatively small subset of taxa from the metacommunity would represent 80% of the variation in community composition shown by the entire metacommunity. Moreover, this subset does not have to be composed of predetermined taxonomic groups, but rather can be defined based on random subsets. We also found that some random subsets composed of a small number of genera performed better in responding to major environmental gradients. There were also random subsets that seemed to be affected by spatial processes, which could indicate important historical processes. 
    We were able to integrate, within the same theoretical and practical framework, the selection of biodiversity surrogates, indicators of environmental conditions, and, more importantly, an explicit treatment of environmental and spatial processes in the selection approach. PMID:22937068

  5. Farm-specific economic value of automatic lameness detection systems in dairy cattle: From concepts to operational simulations.

    PubMed

    Van De Gucht, Tim; Saeys, Wouter; Van Meensel, Jef; Van Nuffel, Annelies; Vangeyte, Jurgen; Lauwers, Ludwig

    2018-01-01

    Although prototypes of automatic lameness detection systems for dairy cattle exist, information about their economic value is lacking. In this paper, a conceptual and operational framework for simulating the farm-specific economic value of automatic lameness detection systems was developed and tested on 4 system types: walkover pressure plates, walkover pressure mats, camera systems, and accelerometers. The conceptual framework maps essential factors that determine economic value (e.g., lameness prevalence, incidence and duration, lameness costs, detection performance, and their relationships). The operational simulation model links treatment costs and avoided losses with detection results and farm-specific information, such as herd size and lameness status. Results show that detection performance, herd size, discount rate, and system lifespan have a large influence on economic value. In addition, lameness prevalence influences the economic value, stressing the importance of an adequate prior estimation of the on-farm prevalence. The simulations provide first estimates for the upper limits for purchase prices of automatic detection systems. The framework allowed for identification of knowledge gaps obstructing more accurate economic value estimation. These include insights into cost reductions due to early detection and treatment, and into the links between specific lameness causes and their related losses. Because this model provides insight into the trade-offs between automatic detection systems' performance and investment price, it is a valuable tool to guide future research and developments. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
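    The core of the operational simulation, discounting net benefits (avoided losses minus treatment costs) over the system lifespan to bound the purchase price, can be sketched as follows. All input values are hypothetical farm-specific estimates, not figures from the paper:

```python
def detection_system_value(annual_avoided_loss, annual_treatment_cost,
                           purchase_price, lifespan_years, discount_rate):
    """Net present value of an automatic lameness detection system:
    discounted (avoided losses - extra treatment costs) minus purchase price."""
    npv = -purchase_price
    for year in range(1, lifespan_years + 1):
        net = annual_avoided_loss - annual_treatment_cost
        npv += net / (1 + discount_rate) ** year
    return npv

# With purchase_price=0, the NPV equals the break-even purchase price,
# i.e., an upper limit on what the system may cost for this (hypothetical) farm.
value = detection_system_value(annual_avoided_loss=3000.0,
                               annual_treatment_cost=800.0,
                               purchase_price=0.0,
                               lifespan_years=10,
                               discount_rate=0.05)
print(round(value, 2))
```

This mirrors the abstract's observation that discount rate and lifespan strongly influence the economic value: both enter directly in the discounting loop.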

  6. A KPI-based process monitoring and fault detection framework for large-scale processes.

    PubMed

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed within a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrumental variable approach is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
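    For the static case, the least-squares-based monitoring idea can be sketched as: fit an ordinary least-squares model from process variables to the KPI on normal operating data, then flag samples whose KPI residual exceeds a threshold. The synthetic data and the 3-sigma threshold below are illustrative assumptions, not the paper's benchmark setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical training data: process variables X and a KPI that is
# (approximately) a linear function of them under normal operation.
X = rng.normal(size=(500, 3))
beta_true = np.array([1.0, -2.0, 0.5])
kpi = X @ beta_true + rng.normal(0.0, 0.05, 500)

# Static KPI model: ordinary least squares on normal data
beta, *_ = np.linalg.lstsq(X, kpi, rcond=None)
resid = kpi - X @ beta
threshold = 3.0 * resid.std()        # simple 3-sigma detection threshold

def kpi_fault(x_new, kpi_new):
    """Flag a fault when the KPI residual exceeds the training threshold."""
    return abs(kpi_new - x_new @ beta) > threshold

x = np.array([0.5, 0.5, 0.5])
print(kpi_fault(x, x @ beta_true))         # normal operation
print(kpi_fault(x, x @ beta_true + 1.0))   # KPI-relevant fault
```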

  7. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery.

    PubMed

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W; Chen, Zhuo Georgia; Fei, Baowei

    2015-01-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery. However, HSI data often need to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification, which comprises image preprocessing, glare removal, feature extraction, and ultimately image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from 450- to 900-nm wavelength. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to the modest number of features present, but the potential for fast image classification can be high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.
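    Two of the feature types named above, normalized reflectance and spectral derivatives, feeding a simple classifier can be sketched as follows. The spectra, the tumor-vs-normal spectral model, and the nearest-mean classifier are hypothetical simplifications, not the paper's actual pipeline:

```python
import numpy as np

def features(spectrum):
    """Normalized reflectance plus first spectral derivative
    (a minimal version of two feature types used in the framework)."""
    norm = spectrum / spectrum.sum()
    deriv = np.gradient(norm)
    return np.concatenate([norm, deriv])

rng = np.random.default_rng(3)
bands = np.linspace(450, 900, 50)  # nm, matching the range in the abstract

# Hypothetical spectra: "tumor" tissue reflects more toward the red/NIR end
def sample(tumor):
    base = 1.0 + (0.5 if tumor else 0.1) * (bands - 450) / 450
    return base + rng.normal(0, 0.02, bands.size)

# Nearest-mean classifier trained on 20 synthetic samples per class
mean_tumor = np.mean([features(sample(True)) for _ in range(20)], axis=0)
mean_normal = np.mean([features(sample(False)) for _ in range(20)], axis=0)

def classify(spectrum):
    f = features(spectrum)
    return ("tumor" if np.linalg.norm(f - mean_tumor) < np.linalg.norm(f - mean_normal)
            else "normal")

print(classify(sample(True)), classify(sample(False)))
```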

  8. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery

    NASA Astrophysics Data System (ADS)

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W.; Chen, Zhuo Georgia; Fei, Baowei

    2015-12-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery. However, HSI data often need to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification, which comprises image preprocessing, glare removal, feature extraction, and ultimately image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from 450- to 900-nm wavelength. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to the modest number of features present, but the potential for fast image classification can be high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.

  9. CNV-TV: a robust method to discover copy number variation from short sequencing reads.

    PubMed

    Duan, Junbo; Zhang, Ji-Gang; Deng, Hong-Wen; Wang, Yu-Ping

    2013-05-02

    Copy number variation (CNV) is an important structural variation (SV) in the human genome. Various studies have shown that CNVs are associated with complex diseases. Traditional CNV detection methods such as fluorescence in situ hybridization (FISH) and array comparative genomic hybridization (aCGH) suffer from low resolution. The next generation sequencing (NGS) technique promises higher resolution detection of CNVs, and several methods were recently proposed for realizing such a promise. However, the performance of these methods is not robust under some conditions; e.g., some of them may fail to detect CNVs of short sizes. There has been a strong demand for reliable detection of CNVs from high resolution NGS data. A novel and robust method to detect CNVs from short sequencing reads is proposed in this study. The detection of CNVs is modeled as change-point detection from the read depth (RD) signal derived from the NGS data, which is fitted with a total variation (TV) penalized least squares model. The performance (e.g., sensitivity and specificity) of the proposed approach is evaluated by comparison with several recently published methods on both simulated and real data from the 1000 Genomes Project. The experimental results showed that both the true positive rate and the false positive rate of the proposed detection method do not change significantly for CNVs with different copy numbers and lengths, when compared with several existing methods. Therefore, our proposed approach results in a more reliable detection of CNVs than the existing methods.
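    The TV-penalized least-squares model at the heart of this approach can be sketched with a generic ADMM solver; this is a minimal re-implementation of the idea under stated assumptions, not the authors' code, and the read-depth signal is synthetic:

```python
import numpy as np

def tv_denoise(y, lam=1.0, rho=1.0, n_iter=300):
    """Total-variation-penalized least squares via ADMM:
        minimize 0.5*||x - y||^2 + lam*||Dx||_1,
    where D is the first-difference operator. Change points appear as the
    (sparse) nonzero entries of the auxiliary variable z ~ Dx."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)     # (n-1) x n first-difference matrix
    A = np.eye(n) + rho * D.T @ D      # constant x-update system matrix
    z = np.zeros(n - 1)
    u = np.zeros(n - 1)                # scaled dual variable
    for _ in range(n_iter):
        x = np.linalg.solve(A, y + rho * D.T @ (z - u))
        Dx = D @ x
        # soft-thresholding keeps only significant jumps in z
        z = np.sign(Dx + u) * np.maximum(np.abs(Dx + u) - lam / rho, 0.0)
        u = u + Dx - z
    return x, z

# synthetic read-depth signal: a copy-number gain starting at position 50
rd = np.concatenate([np.full(50, 2.0), np.full(50, 4.0)])
x, z = tv_denoise(rd, lam=1.0)
changepoints = np.where(np.abs(z) > 0.5)[0] + 1
print(changepoints)
```

The TV (L1-on-differences) penalty forces the fitted signal to be piecewise constant, which is why a single copy-number gain yields exactly one significant jump rather than many spurious ones.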

  10. Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.

    PubMed

    Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda

    2014-05-01

    We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is here applied to the problem of anomaly detection for a video annotation system.

  11. Large scale variation in DNA copy number in chicken breeds

    USDA-ARS?s Scientific Manuscript database

    Background Detecting genetic variation is a critical step in elucidating the molecular mechanisms underlying phenotypic diversity. Until recently, such detection has mostly focused on single nucleotide polymorphisms (SNPs) because of the ease in screening complete genomes. Another type of variant, c...

  12. Real time validation of GPS TEC precursor mask for Greece

    NASA Astrophysics Data System (ADS)

    Pulinets, Sergey; Davidenko, Dmitry

    2013-04-01

    Earlier studies of pre-earthquake ionospheric variations have established that, for every specific site, these variations show definite stability in their temporal behavior within a time interval of a few days before the seismic shock. This self-similarity (characteristic of phenomena observed close to the critical point of a system) permits us to consider these variations as a good candidate for a short-term precursor. The physical mechanism of GPS TEC variations before earthquakes is developed within the framework of the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model. Taking into account the different tectonic structure and different source mechanisms of earthquakes in different regions of the globe, every site has its individual pre-earthquake behavior, which creates an individual "imprint" on the ionosphere at every given point. It is this so-called "mask" of ionospheric variability before an earthquake at a given point that creates the opportunity to detect anomalous behavior of the electron concentration in the ionosphere, based not only on statistical processing procedures but also on pattern recognition techniques, which facilitates the automatic recognition of short-term ionospheric precursors of earthquakes. Such a precursor mask was created using the GPS TEC variations around the times of 9 earthquakes with magnitudes from M6.0 to M6.9 that took place in Greece within the interval 2006-2011. The major anomaly revealed in the relative deviation of the vertical TEC was a positive anomaly appearing at ~04 PM UT one day before the seismic shock and lasting nearly 12 hours, until ~04 AM UT. To validate this approach, it was decided to check the mask in real-time monitoring of earthquakes in Greece, starting from 1 December 2012, for earthquakes with magnitude greater than 4.5. During this period (until 9 January 2013), 4 seismic shocks were registered, including the largest one, M5.7, on 8 January. For all of them the mask confirmed its validity, and the 6 December event was predicted in advance.
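    The mask's detection criterion, a sustained positive relative deviation of vertical TEC from its background, can be sketched as follows. The diurnal background model, the 25% threshold, and the anomaly window are illustrative assumptions, not the calibrated Greek mask:

```python
import numpy as np

def tec_anomaly_hours(tec, background, threshold=0.25):
    """Sample indices where the relative TEC deviation
    dTEC = (TEC - background) / background exceeds `threshold`."""
    rel = (tec - background) / background
    return np.where(rel > threshold)[0]

hours = np.arange(48)                                  # two days, hourly samples
background = 20 + 5 * np.sin(2 * np.pi * hours / 24)   # hypothetical diurnal TEC (TECU)

tec = background.copy()
tec[16:28] *= 1.4   # sustained positive anomaly, ~12 h, as in the mask above

anomalous = tec_anomaly_hours(tec, background)
print(anomalous)
```

A real implementation would estimate the background from quiet-time statistics (e.g., a running multi-day median) rather than from a fixed analytic curve.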

  13. Multi-Parameter Observation and Detection of Pre-Earthquake Signals in Seismically Active Areas

    NASA Technical Reports Server (NTRS)

    Ouzounov, D.; Pulinets, S.; Parrot, M.; Liu, J. Y.; Hattori, K.; Kafatos, M.; Taylor, P.

    2012-01-01

    The recent large earthquakes (M9.0 Tohoku, 03/2011; M7.0 Haiti, 01/2010; M6.3 L'Aquila, 04/2009; and M7.9 Wenchuan, 05/2008) have renewed interest in anomalous pre-seismic signals associated with them. Recent workshops (DEMETER 2006, 2011 and VESTO 2009) have shown that there were precursory atmospheric/ionospheric signals observed in space prior to these events. Our initial results indicate that no single pre-earthquake observation (seismic, magnetic field, electric field, thermal infrared [TIR], or GPS/TEC) can provide a consistent and successful global-scale early warning. This is most likely due to the complexity and chaotic nature of earthquakes and the limitations of existing ground (temporal/spatial) and global satellite observations. In this study we analyze pre-seismic temporal and spatial variations (gas/radon counting rate, atmospheric temperature and humidity change, long-wave radiation transitions, and ionospheric electron density/plasma variations) which we propose occur before the onset of major earthquakes. We propose an Integrated Space-Terrestrial Framework (ISTF) as a different approach for revealing pre-earthquake phenomena in seismically active areas. ISTF is a sensor web of a coordinated observation infrastructure employing multiple sensors that are distributed on one or more platforms; it draws data from satellite sensors (Terra, Aqua, POES, DEMETER and others) and ground observations, e.g., Global Positioning System Total Electron Content (GPS/TEC). As a theoretical guide we use the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model to explain the generation of multiple earthquake precursors. Using our methodology, we retrospectively evaluated the signals preceding the most devastating earthquakes during 2005-2011. We observed a correlation between both atmospheric and ionospheric anomalies preceding most of these earthquakes. 
The second phase of our validation includes systematic retrospective analysis for more than 100 major earthquakes (M>5.9) in Taiwan and Japan. We have found anomalous behavior before all of these events, with no false negatives. The calculated false alarm ratio for the same month over the entire period of analysis (2003-2009) is less than 10%. The commonalities in detecting atmospheric/ionospheric anomalies show that they may exist over both land and sea in regions of maximum stress (i.e., along plate boundaries). Our results indicate that the ISTF model could provide a capability to observe pre-earthquake atmospheric/ionospheric signals by combining this information into a common framework.

  14. The nature of nurture and the future of evodevo: toward a theory of developmental evolution.

    PubMed

    Moczek, Armin P

    2012-07-01

    This essay has three parts. First, I posit that much research in contemporary evodevo remains steeped in a traditional framework that views traits and trait differences as being caused by genes and genetic variation, and the environment as providing an external context in which development and evolution unfold. Second, I discuss three attributes of organismal development and evolution, broadly applicable to all organisms and traits, that call into question the usefulness of gene- and genome-centric views of development and evolution. I then focus on the third and main aim of this essay and ask: what conceptual and empirical opportunities exist that would permit evodevo research to transcend the traditional boundaries inherited from its parent disciplines and to move toward the development of a more comprehensive and realistic theory of developmental evolution? Here, I focus on three conceptual frameworks: the theory of facilitated variation, the theory of evolution by genetic accommodation, and the theory of niche construction. I conclude that, combined, they provide a rich, interlocking framework within which to revise existing and develop novel empirical approaches toward a better understanding of the nature of developmental evolution. Examples of such approaches are highlighted, and the consequences of expanding existing frameworks are discussed.

  15. The United States Army’s Full-Spectrum Training Strategy Challenge

    DTIC Science & Technology

    2012-05-17

    the high operational tempo for deployed and deploying units. Soldiers and units conducted repeat deployments to both Iraq and Afghanistan with little...prevalent in Strategic Operational Design and associated variations of design. Further complicating Israeli ideas of warfare was the notion that...Army to develop an overarching operational framework that finally replaces AirLand Battle, like EBO or some variation of operational design. But one

  16. Bayesian segmentation of atrium wall using globally-optimal graph cuts on 3D meshes.

    PubMed

    Veni, Gopalkrishna; Fu, Zhisong; Awate, Suyash P; Whitaker, Ross T

    2013-01-01

    Efficient segmentation of the left atrium (LA) wall from delayed enhancement MRI is challenging due to inconsistent contrast combined with noise and high variation in atrial shape and size. We present a surface-detection method that is capable of extracting the atrial wall by computing an optimal a posteriori estimate. This estimation is done on a set of nested meshes, constructed from an ensemble of segmented training images, and graph cuts on an associated multi-column, proper-ordered graph. The graph/mesh is a part of a template/model that has an associated set of learned intensity features. When this mesh is overlaid onto a test image, it produces a set of costs which lead to an optimal segmentation. The 3D mesh has an associated weighted, directed multi-column graph with edges that encode smoothness and inter-surface penalties. Unlike previous graph-cut methods that impose hard constraints on the surface properties, the proposed method follows from a Bayesian formulation resulting in soft penalties on spatial variation of the cuts through the mesh. The novelty of this method also lies in the construction of proper-ordered graphs on complex shapes for choosing among distinct classes of base shapes for automatic LA segmentation. We evaluate the proposed segmentation framework on simulated and clinical cardiac MRI.

  17. Automatic detection of key innovations, rate shifts, and diversity-dependence on phylogenetic trees.

    PubMed

    Rabosky, Daniel L

    2014-01-01

    A number of methods have been developed to infer differential rates of species diversification through time and among clades using time-calibrated phylogenetic trees. However, we lack a general framework that can delineate and quantify heterogeneous mixtures of dynamic processes within single phylogenies. I developed a method that can identify arbitrary numbers of time-varying diversification processes on phylogenies without specifying their locations in advance. The method uses reversible-jump Markov Chain Monte Carlo to move between model subspaces that vary in the number of distinct diversification regimes. The model assumes that changes in evolutionary regimes occur across the branches of phylogenetic trees under a compound Poisson process and explicitly accounts for rate variation through time and among lineages. Using simulated datasets, I demonstrate that the method can be used to quantify complex mixtures of time-dependent, diversity-dependent, and constant-rate diversification processes. I compared the performance of the method to the MEDUSA model of rate variation among lineages. As an empirical example, I analyzed the history of speciation and extinction during the radiation of modern whales. The method described here will greatly facilitate the exploration of macroevolutionary dynamics across large phylogenetic trees, which may have been shaped by heterogeneous mixtures of distinct evolutionary processes.

  18. Automatic Detection of Key Innovations, Rate Shifts, and Diversity-Dependence on Phylogenetic Trees

    PubMed Central

    Rabosky, Daniel L.

    2014-01-01

    A number of methods have been developed to infer differential rates of species diversification through time and among clades using time-calibrated phylogenetic trees. However, we lack a general framework that can delineate and quantify heterogeneous mixtures of dynamic processes within single phylogenies. I developed a method that can identify arbitrary numbers of time-varying diversification processes on phylogenies without specifying their locations in advance. The method uses reversible-jump Markov Chain Monte Carlo to move between model subspaces that vary in the number of distinct diversification regimes. The model assumes that changes in evolutionary regimes occur across the branches of phylogenetic trees under a compound Poisson process and explicitly accounts for rate variation through time and among lineages. Using simulated datasets, I demonstrate that the method can be used to quantify complex mixtures of time-dependent, diversity-dependent, and constant-rate diversification processes. I compared the performance of the method to the MEDUSA model of rate variation among lineages. As an empirical example, I analyzed the history of speciation and extinction during the radiation of modern whales. The method described here will greatly facilitate the exploration of macroevolutionary dynamics across large phylogenetic trees, which may have been shaped by heterogeneous mixtures of distinct evolutionary processes. PMID:24586858
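The compound Poisson process underlying the regime-shift model can be illustrated with a minimal simulation along a single time axis (a sketch only, with assumed parameter choices and an assumed lognormal rate kernel; it is not the author's implementation, which operates on the branches of a phylogeny via reversible-jump MCMC):

```python
import random

def simulate_rate_regimes(total_time, shift_rate, base_lambda, seed=0):
    """Simulate diversification-rate regimes along one time axis: shift
    events arrive as a Poisson process (exponential waiting times), and
    each shift draws a new speciation rate around the base rate."""
    rng = random.Random(seed)
    t = 0.0
    regimes = [(0.0, base_lambda)]          # (regime start time, rate)
    while True:
        t += rng.expovariate(shift_rate)    # waiting time to next shift
        if t >= total_time:
            break
        # new rate drawn from a lognormal kernel (an assumed choice here)
        regimes.append((t, base_lambda * rng.lognormvariate(0.0, 0.5)))
    return regimes
```

Each returned tuple marks when a new diversification regime begins and what rate applies until the next shift; the number of shifts itself is Poisson-distributed, which is the quantity the reversible-jump sampler explores.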

  19. The genetic diversity of Epstein-Barr virus in the setting of transplantation relative to non-transplant settings: A feasibility study.

    PubMed

    Allen, Upton D; Hu, Pingzhao; Pereira, Sergio L; Robinson, Joan L; Paton, Tara A; Beyene, Joseph; Khodai-Booran, Nasser; Dipchand, Anne; Hébert, Diane; Ng, Vicky; Nalpathamkalam, Thomas; Read, Stanley

    2016-02-01

    This study examines EBV strains from transplant patients and patients with infectious mononucleosis (IM) by sequencing major EBV genes. We also used NGS to detect EBV DNA within total genomic DNA, and to evaluate its genetic variation. Sanger sequencing of major EBV genes was used to compare SNVs from samples taken from transplant patients vs. patients with IM. We sequenced EBV DNA from a healthy EBV-seropositive individual on a HiSeq 2000 instrument. Data were mapped to the EBV reference genomes (AG876 and B95-8). The number of EBNA2 SNVs was higher than for EBNA1 and the other genes sequenced within comparable reference coordinates. For EBNA2, there was a median of 15 SNVs among transplant samples compared with 10 among IM samples (p = 0.036). EBNA1 showed little variation between samples. For NGS, we identified 640 and 892 variants at an unadjusted p value of 5 × 10⁻⁸ for the AG876 and B95-8 genomes, respectively. We used complementary sequence strategies to examine EBV genetic diversity and its application to transplantation. The results provide the framework for further characterization of EBV strains and related outcomes after organ transplantation. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Onsager's variational principle in soft matter.

    PubMed

    Doi, Masao

    2011-07-20

    In the celebrated paper on the reciprocal relation for the kinetic coefficients in irreversible processes, Onsager (1931 Phys. Rev. 37 405) extended Rayleigh's principle of the least energy dissipation to general irreversible processes. In this paper, I shall show that this variational principle gives us a very convenient framework for deriving many established equations which describe the nonlinear and non-equilibrium phenomena in soft matter, such as phase separation kinetics in solutions, gel dynamics, molecular modeling for viscoelasticity nemato-hydrodynamics, etc. Onsager's variational principle can therefore be regarded as a solid general basis for soft matter physics.

  1. Nonperturbative calculations in the framework of variational perturbation theory in QCD

    NASA Astrophysics Data System (ADS)

    Solovtsova, O. P.

    2017-07-01

    We discuss applications of the method based on variational perturbation theory to perform calculations down to the lowest energy scale. The variational series is different from the conventional perturbative expansion and can be used to go beyond the weak-coupling regime. We apply this method to investigate the Borel representation of the light Adler function constructed from the τ data and to determine the residual condensates. It is shown that, within the suggested method, the optimal values of these lower-dimension condensates are close to zero.

  2. Heterothermy in two mole-rat species subjected to interacting thermoregulatory challenges.

    PubMed

    Boyles, Justin G; Verburgt, Luke; McKechnie, Andrew E; Bennett, Nigel C

    2012-02-01

    Maintaining a high and constant body temperature (Tb) is often viewed as a fundamental benefit of endothermy, but variation in Tb is likely the norm rather than an exception among endotherms. Thus, attempts to elucidate which factors cause Tb of endotherms to deviate away from the Tb that maximizes performance are becoming more common. One approach relies on an adaptive framework of thermoregulation, used for a long time to predict variation in Tb of ectotherms, as a starting point to make predictions about the factors that should lead to thermoregulatory variation in endotherms. Here we test the predictions that when confronted with thermoregulatory challenges endotherms should (1) become more heterothermic, (2) lower their Tb setpoint, and/or (3) increase behavioral thermoregulation (e.g., activity levels or social thermoregulation). We exposed two species of relatively homeothermic mole-rats to two such challenges: (a) ambient temperatures (Ta) well below the thermoneutral zone and (b) increased heat loss caused by the removal of dorsal fur. In general, our results support the adaptive framework of endothermic thermoregulation with each species conforming to some of the predictions. For example, Mashona mole-rats (Fukomys darlingi) increased heterothermy as Ta decreased, highveld mole-rats (Cryptomys hottentotus pretoriae) displayed lower Tb's after shaving, and both species increased behavioral thermoregulation as Ta decreased. This suggests that there is some merit in extending the adaptive framework to endotherms. However, none of the three predictions we tested was supported under all experimental conditions, reiterating that attempts to determine universal factors causing variation in Tb of endotherms may prove challenging. © 2011 WILEY PERIODICALS, INC.

  3. Marginal Shape Deep Learning: Applications to Pediatric Lung Field Segmentation.

    PubMed

    Mansoor, Awais; Cerrolaza, Juan J; Perez, Geovanny; Biggs, Elijah; Nino, Gustavo; Linguraru, Marius George

    2017-02-11

    Representation learning through deep learning (DL) architecture has shown tremendous potential for identification, localization, and texture classification in various medical imaging modalities. However, DL applications to segmentation of objects especially to deformable objects are rather limited and mostly restricted to pixel classification. In this work, we propose marginal shape deep learning (MaShDL), a framework that extends the application of DL to deformable shape segmentation by using deep classifiers to estimate the shape parameters. MaShDL combines the strength of statistical shape models with the automated feature learning architecture of DL. Unlike the iterative shape parameters estimation approach of classical shape models that often leads to a local minima, the proposed framework is robust to local minima optimization and illumination changes. Furthermore, since the direct application of DL framework to a multi-parameter estimation problem results in a very high complexity, our framework provides an excellent run-time performance solution by independently learning shape parameter classifiers in marginal eigenspaces in the decreasing order of variation. We evaluated MaShDL for segmenting the lung field from 314 normal and abnormal pediatric chest radiographs and obtained a mean Dice similarity coefficient of 0.927 using only the four highest modes of variation (compared to 0.888 with classical ASM 1 (p-value=0.01) using same configuration). To the best of our knowledge this is the first demonstration of using DL framework for parametrized shape learning for the delineation of deformable objects.

  4. Marginal shape deep learning: applications to pediatric lung field segmentation

    NASA Astrophysics Data System (ADS)

    Mansoor, Awais; Cerrolaza, Juan J.; Perez, Geovany; Biggs, Elijah; Nino, Gustavo; Linguraru, Marius George

    2017-02-01

    Representation learning through deep learning (DL) architecture has shown tremendous potential for identification, localization, and texture classification in various medical imaging modalities. However, DL applications to segmentation of objects especially to deformable objects are rather limited and mostly restricted to pixel classification. In this work, we propose marginal shape deep learning (MaShDL), a framework that extends the application of DL to deformable shape segmentation by using deep classifiers to estimate the shape parameters. MaShDL combines the strength of statistical shape models with the automated feature learning architecture of DL. Unlike the iterative shape parameters estimation approach of classical shape models that often leads to a local minima, the proposed framework is robust to local minima optimization and illumination changes. Furthermore, since the direct application of DL framework to a multi-parameter estimation problem results in a very high complexity, our framework provides an excellent run-time performance solution by independently learning shape parameter classifiers in marginal eigenspaces in the decreasing order of variation. We evaluated MaShDL for segmenting the lung field from 314 normal and abnormal pediatric chest radiographs and obtained a mean Dice similarity coefficient of 0.927 using only the four highest modes of variation (compared to 0.888 with classical ASM1 (p-value=0.01) using same configuration). To the best of our knowledge this is the first demonstration of using DL framework for parametrized shape learning for the delineation of deformable objects.

  5. Marginal Shape Deep Learning: Applications to Pediatric Lung Field Segmentation

    PubMed Central

    Mansoor, Awais; Cerrolaza, Juan J.; Perez, Geovanny; Biggs, Elijah; Nino, Gustavo; Linguraru, Marius George

    2017-01-01

    Representation learning through deep learning (DL) architecture has shown tremendous potential for identification, localization, and texture classification in various medical imaging modalities. However, DL applications to segmentation of objects especially to deformable objects are rather limited and mostly restricted to pixel classification. In this work, we propose marginal shape deep learning (MaShDL), a framework that extends the application of DL to deformable shape segmentation by using deep classifiers to estimate the shape parameters. MaShDL combines the strength of statistical shape models with the automated feature learning architecture of DL. Unlike the iterative shape parameters estimation approach of classical shape models that often leads to a local minima, the proposed framework is robust to local minima optimization and illumination changes. Furthermore, since the direct application of DL framework to a multi-parameter estimation problem results in a very high complexity, our framework provides an excellent run-time performance solution by independently learning shape parameter classifiers in marginal eigenspaces in the decreasing order of variation. We evaluated MaShDL for segmenting the lung field from 314 normal and abnormal pediatric chest radiographs and obtained a mean Dice similarity coefficient of 0.927 using only the four highest modes of variation (compared to 0.888 with classical ASM1 (p-value=0.01) using same configuration). To the best of our knowledge this is the first demonstration of using DL framework for parametrized shape learning for the delineation of deformable objects. PMID:28592911

  6. Parameterization models for pesticide exposure via crop consumption.

    PubMed

    Fantke, Peter; Wieland, Peter; Juraske, Ronnie; Shaddick, Gavin; Itoiz, Eva Sevigné; Friedrich, Rainer; Jolliet, Olivier

    2012-12-04

    An approach for estimating human exposure to pesticides via consumption of six important food crops is presented that can be used to extend multimedia models applied in health risk and life cycle impact assessment. We first assessed the variation of model output (pesticide residues per kg applied) as a function of model input variables (substance, crop, and environmental properties) including their possible correlations using matrix algebra. We identified five key parameters responsible for between 80% and 93% of the variation in pesticide residues, namely time between substance application and crop harvest, degradation half-lives in crops and on crop surfaces, overall residence times in soil, and substance molecular weight. Partition coefficients also play an important role for fruit trees and tomato (Kow), potato (Koc), and lettuce (Kaw, Kow). Focusing on these parameters, we develop crop-specific models by parametrizing a complex fate and exposure assessment framework. The parametric models thereby reflect the framework's physical and chemical mechanisms and predict pesticide residues in harvest using linear combinations of crop, crop surface, and soil compartments. Parametric model results correspond well with results from the complex framework for 1540 substance-crop combinations, with total deviations between a factor of 4 (potato) and a factor of 66 (lettuce). Predicted residues also correspond well with experimental data previously used to evaluate the complex framework. Pesticide mass in harvest can finally be combined with reduction factors accounting for food processing to estimate human exposure from crop consumption. All parametric models can be easily implemented into existing assessment frameworks.

  7. Improving the Efficiency and Effectiveness of Community Detection via Prior-Induced Equivalent Super-Network.

    PubMed

    Yang, Liang; Jin, Di; He, Dongxiao; Fu, Huazhu; Cao, Xiaochun; Fogelman-Soulie, Francoise

    2017-03-29

    Due to the importance of community structure in understanding networks, and a surge of interest in community detectability, how to improve community identification performance using pairwise prior information has become a hot topic. However, most existing semi-supervised community detection algorithms focus only on improving accuracy and ignore the impact of priors on speeding up detection. Moreover, they typically require tuning additional parameters and cannot guarantee the pairwise constraints. To address these drawbacks, we propose a general, fast, effective, and parameter-free semi-supervised community detection framework. By constructing indivisible super-nodes from the connected subgraphs of the must-link constraints, and by forming weighted super-edges based on network topology and the cannot-link constraints, our framework transforms the original network into an equivalent but much smaller super-network. The super-network perfectly preserves the must-link constraints and effectively encodes the cannot-link constraints. Furthermore, the time complexity of the super-network construction is linear in the original network size, which makes it efficient. Meanwhile, since the constructed super-network is much smaller than the original one, any existing community detection algorithm runs much faster when using our framework. In addition, the overall process introduces no additional parameters, making it more practical.
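The super-node construction described above amounts to collapsing the connected components of the must-link graph. A minimal sketch using union-find (hypothetical function and variable names, not the authors' code; super-edge weighting by edge count is an assumed simplification):

```python
def build_super_network(nodes, edges, must_link, cannot_link):
    """Collapse the connected components of the must-link graph into
    super-nodes (union-find), aggregate original edges into weighted
    super-edges, and record cannot-link pairs at the super-node level."""
    parent = {v: v for v in nodes}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    for a, b in must_link:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb                 # merge must-link components

    component = {v: find(v) for v in nodes}

    # super-edge weight = number of original edges between two super-nodes
    super_edges = {}
    for a, b in edges:
        ra, rb = component[a], component[b]
        if ra == rb:
            continue                        # edge absorbed into a super-node
        key = (min(ra, rb), max(ra, rb))
        super_edges[key] = super_edges.get(key, 0) + 1

    # cannot-link constraints become forbidden super-node pairs
    forbidden = {(min(component[a], component[b]),
                  max(component[a], component[b])) for a, b in cannot_link}
    return component, super_edges, forbidden
```

Because union-find with path compression is near-linear in the number of nodes and constraints, this construction is consistent with the linear-time claim in the abstract; any community detection algorithm can then be run on the (smaller) super-network.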

  8. Face liveness detection using shearlet-based feature descriptors

    NASA Astrophysics Data System (ADS)

    Feng, Litong; Po, Lai-Man; Li, Yuming; Yuan, Fang

    2016-07-01

    Face recognition is a widely used biometric technology due to its convenience, but it is vulnerable to spoofing attacks made with non-real faces such as photographs or videos of valid users. The anti-spoofing problem must be resolved before face recognition can be widely applied in daily life. Face liveness detection is a core technology for ensuring that the input face comes from a live person. However, this remains very challenging for conventional liveness detection approaches based on texture analysis and motion detection. The aim of this paper is to propose a feature descriptor and an efficient framework that can effectively address the face liveness detection problem. In this framework, new feature descriptors are defined using a multiscale directional transform (the shearlet transform). Then, stacked autoencoders and a softmax classifier are concatenated to detect face liveness. We evaluated this approach on the CASIA face anti-spoofing database and the Replay-Attack database. The experimental results show that our approach performs better than state-of-the-art techniques under the protocols provided with these databases, and that it can significantly enhance the security of face recognition biometric systems. In addition, the experimental results also demonstrate that this framework can easily be extended to classify different spoofing attacks.

  9. Price responsiveness of demand for cigarettes: does rationality matter?

    PubMed

    Laporte, Audrey

    2006-01-01

    Meta-analysis is applied to aggregate-level studies that model the demand for cigarettes using static, myopic, or rational addiction frameworks in an attempt to synthesize key findings in the literature and to identify determinants of the variation in reported price elasticity estimates across studies. The results suggest that the rational addiction framework produces statistically similar estimates to the static framework but that studies that use the myopic framework tend to report more elastic price effects. Studies that applied panel data techniques or controlled for cross-border smuggling reported more elastic price elasticity estimates, whereas the use of instrumental variable techniques and time trends or time dummy variables produced less elastic estimates. The finding that myopic models produce different estimates than either of the other two model frameworks underscores that careful attention must be given to time series properties of the data.

  10. Molecular spectrum of somaclonal variation in regenerated rice revealed by whole-genome sequencing.

    PubMed

    Miyao, Akio; Nakagome, Mariko; Ohnuma, Takako; Yamagata, Harumi; Kanamori, Hiroyuki; Katayose, Yuichi; Takahashi, Akira; Matsumoto, Takashi; Hirochika, Hirohiko

    2012-01-01

    Somaclonal variation is a phenomenon that results in the phenotypic variation of plants regenerated from cell culture. One of the causes of somaclonal variation in rice is the transposition of retrotransposons. However, many aspects of the mechanisms that result in somaclonal variation remain undefined. To detect genome-wide changes in regenerated rice, we analyzed the whole-genome sequences of three plants independently regenerated from cultured cells originating from a single seed stock. Many single-nucleotide polymorphisms (SNPs) and insertions and deletions (indels) were detected in the genomes of the regenerated plants. The transposition of only Tos17 among 43 transposons examined was detected in the regenerated plants. Therefore, the SNPs and indels contribute to the somaclonal variation in regenerated rice in addition to the transposition of Tos17. The observed molecular spectrum was similar to that of the spontaneous mutations in Arabidopsis thaliana. However, the base change ratio was estimated to be 1.74 × 10⁻⁶ base substitutions per site per regeneration, which is 248-fold greater than the spontaneous mutation rate of A. thaliana.
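A quick sanity check of the reported fold-increase, using only the figures quoted above (a hypothetical script; variable names are mine):

```python
# the reported regenerant mutation rate and fold-increase together imply
# the spontaneous reference rate used for the comparison
regenerant_rate = 1.74e-6       # base substitutions per site per regeneration
fold_increase = 248
implied_spontaneous = regenerant_rate / fold_increase
print(f"{implied_spontaneous:.2e}")   # prints 7.02e-09
```

The implied reference rate of roughly 7 × 10⁻⁹ per site per generation is of the order commonly reported for spontaneous mutation in A. thaliana, so the two figures in the abstract are mutually consistent.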

  11. Detecting and characterizing high-frequency oscillations in epilepsy: a case study of big data analysis

    NASA Astrophysics Data System (ADS)

    Huang, Liang; Ni, Xuan; Ditto, William L.; Spano, Mark; Carney, Paul R.; Lai, Ying-Cheng

    2017-01-01

    We develop a framework to uncover and analyse dynamical anomalies from massive, nonlinear and non-stationary time series data. The framework consists of three steps: preprocessing of massive datasets to eliminate erroneous data segments, application of the empirical mode decomposition and Hilbert transform paradigm to obtain the fundamental components embedded in the time series at distinct time scales, and statistical/scaling analysis of the components. As a case study, we apply our framework to detecting and characterizing high-frequency oscillations (HFOs) from a big database of rat electroencephalogram recordings. We find a striking phenomenon: HFOs exhibit on-off intermittency that can be quantified by algebraic scaling laws. Our framework can be generalized to big data-related problems in other fields such as large-scale sensor data and seismic data analysis.
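The Hilbert-transform step of the paradigm can be sketched with an FFT-based analytic signal (a standard construction; `analytic_signal` is a hypothetical name, the EMD step is omitted, and NumPy is assumed to be available):

```python
import numpy as np

def analytic_signal(x):
    """FFT-based Hilbert transform: keep DC (and Nyquist for even n),
    double the positive frequencies, zero the negative ones."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h)

# the instantaneous amplitude (envelope) of a pure tone is constant
t = np.linspace(0.0, 1.0, 1024, endpoint=False)
env = np.abs(analytic_signal(np.cos(2 * np.pi * 50 * t)))
```

In the HFO setting, the envelope and instantaneous phase of each intrinsic mode function (obtained from EMD, not shown) are what feed the statistical and scaling analysis in the third step.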

  12. Pneumothorax detection in chest radiographs using local and global texture signatures

    NASA Astrophysics Data System (ADS)

    Geva, Ofer; Zimmerman-Moreno, Gali; Lieberman, Sivan; Konen, Eli; Greenspan, Hayit

    2015-03-01

    A novel framework for automatic detection of pneumothorax abnormality in chest radiographs is presented. The suggested method is based on a texture analysis approach combined with supervised learning techniques. The proposed framework consists of two main steps: first, a texture analysis process is performed for detection of local abnormalities. Labeled image patches are extracted in the texture analysis procedure, following which local analysis values are incorporated into a novel global image representation. The global representation is used for training and detection of the abnormality at the image level. The presented global representation is designed based on the distinctive shape of the lung, taking into account the characteristics of typical pneumothorax abnormalities. A supervised learning process was performed on both the local and global data, leading to a trained detection system. The system was tested on a dataset of 108 upright chest radiographs. Several state-of-the-art texture feature sets were evaluated (Local Binary Patterns, Maximum Response filters). The optimal configuration yielded sensitivity of 81% with specificity of 87%. The results of the evaluation are promising, establishing the current framework as a basis for additional improvements and extensions.
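One of the texture features named above, the Local Binary Pattern, can be sketched for a single 3×3-neighbourhood pass (an illustrative implementation, not the authors' pipeline; patch-level histograms of these codes would then serve as the local features):

```python
def lbp_3x3(img):
    """Basic 8-neighbour local binary pattern over a 2-D grayscale image
    given as a list of lists; border pixels are left as code 0."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]   # clockwise neighbours
    h, w = len(img), len(img[0])
    codes = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            center, code = img[y][x], 0
            for bit, (dy, dx) in enumerate(offsets):
                if img[y + dy][x + dx] >= center:
                    code |= 1 << bit        # neighbour at least as bright
            codes[y][x] = code
    return codes
```

Each pixel's code summarizes which neighbours are at least as bright as the center, so histograms of codes over a patch are insensitive to monotonic illumination changes, which is what makes LBP attractive for radiograph texture analysis.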

  13. Detecting glaucomatous change in visual fields: Analysis with an optimization framework.

    PubMed

    Yousefi, Siamak; Goldbaum, Michael H; Varnousfaderani, Ehsan S; Belghith, Akram; Jung, Tzyy-Ping; Medeiros, Felipe A; Zangwill, Linda M; Weinreb, Robert N; Liebmann, Jeffrey M; Girkin, Christopher A; Bowd, Christopher

    2015-12-01

    Detecting glaucomatous progression is an important aspect of glaucoma management. The assessment of longitudinal series of visual fields, measured using Standard Automated Perimetry (SAP), is considered the reference standard for this effort. We seek efficient techniques for determining progression from longitudinal visual fields by formulating the problem as an optimization framework, learned from a population of glaucoma data. The longitudinal data from each patient's eye were used in a convex optimization framework to find a vector that is representative of the progression direction of the sample population, as a whole. Post-hoc analysis of longitudinal visual fields across the derived vector led to optimal progression (change) detection. The proposed method was compared to recently described progression detection methods and to linear regression of instrument-defined global indices, and showed slightly higher sensitivities at the highest specificities than other methods (a clinically desirable result). The proposed approach is simpler, faster, and more efficient for detecting glaucomatous changes, compared to our previously proposed machine learning-based methods, although it provides somewhat less information. This approach has potential application in glaucoma clinics for patient monitoring and in research centers for classification of study participants. Copyright © 2015 Elsevier Inc. All rights reserved.
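The post-hoc analysis along a learned progression direction can be sketched as a projection followed by a linear fit over time (hypothetical names and threshold; the paper's convex optimization for learning the direction itself is not shown):

```python
def progression_slope(fields, times, direction):
    """Project each visual-field measurement onto a learned progression
    direction, then fit an ordinary least-squares line of projection
    versus time; the slope is the rate of change along that direction."""
    proj = [sum(v * w for v, w in zip(f, direction)) for f in fields]
    n = len(times)
    mt, mp = sum(times) / n, sum(proj) / n
    num = sum((t - mt) * (p - mp) for t, p in zip(times, proj))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

def is_progressing(fields, times, direction, threshold=-1.0):
    """Flag progression when the slope along the direction is more
    negative than a (hypothetical) clinical threshold."""
    return progression_slope(fields, times, direction) < threshold
```

Reducing each multi-dimensional visual field to a single scalar along the population-level direction is what makes the per-eye decision a simple one-dimensional regression, hence the speed advantage over the machine-learning alternatives mentioned in the abstract.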

  14. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach

    PubMed Central

    Elgendi, Mohamed

    2016-01-01

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to here for the first time as two event-related moving averages (“TERMA”), uses event-related moving averages to detect events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high-accuracy detection of biomedical events. The results suggest that the window sizes of the two moving averages (W1 and W2) should satisfy the inequality (8×W1)≥W2≥(2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions. PMID:27827852
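The two-moving-averages idea, with window sizes respecting the recommended inequality, can be sketched as follows (a simplified illustration, not Elgendi's published algorithm; function names, the causal averaging, and the zero offset between the averages are assumptions):

```python
def moving_average(x, w):
    """Causal moving average with shortened windows at the left edge."""
    out = []
    for i in range(len(x)):
        lo = max(0, i - w + 1)
        out.append(sum(x[lo:i + 1]) / (i + 1 - lo))
    return out

def terma_blocks(signal, w1=3, w2=9):
    """Return (start, end) index blocks where the short event-related
    moving average (window W1) exceeds the longer cycle-related one
    (window W2); W1 and W2 must satisfy 8*W1 >= W2 >= 2*W1."""
    assert 2 * w1 <= w2 <= 8 * w1
    ma_event = moving_average(signal, w1)
    ma_cycle = moving_average(signal, w2)
    blocks, start = [], None
    for i, (e, c) in enumerate(zip(ma_event, ma_cycle)):
        if e > c and start is None:
            start = i                       # block of interest opens
        elif e <= c and start is not None:
            blocks.append((start, i - 1))   # block closes
            start = None
    if start is not None:
        blocks.append((start, len(signal) - 1))
    return blocks
```

The short average tracks the event (e.g., a QRS complex), the long average tracks the surrounding cycle, and the crossings delimit candidate event blocks within which the peak is then located.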

  15. Coarse-graining errors and numerical optimization using a relative entropy framework

    NASA Astrophysics Data System (ADS)

    Chaimovich, Aviel; Shell, M. Scott

    2011-03-01

    The ability to generate accurate coarse-grained models from reference fully atomic (or otherwise "first-principles") ones has become an important component in modeling the behavior of complex molecular systems with large length and time scales. We recently proposed a novel coarse-graining approach based upon variational minimization of a configuration-space functional called the relative entropy, Srel, that measures the information lost upon coarse-graining. Here, we develop a broad theoretical framework for this methodology and numerical strategies for its use in practical coarse-graining settings. In particular, we show that the relative entropy offers tight control over the errors due to coarse-graining in arbitrary microscopic properties, and suggests a systematic approach to reducing them. We also describe fundamental connections between this optimization methodology and other coarse-graining strategies like inverse Monte Carlo, force matching, energy matching, and variational mean-field theory. We suggest several new numerical approaches to its minimization that provide new coarse-graining strategies. Finally, we demonstrate the application of these theoretical considerations and algorithms to a simple, instructive system and characterize convergence and errors within the relative entropy framework.
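For reference, the relative entropy functional minimized in this approach takes the following standard form in the coarse-graining literature (a sketch; the notation may differ from the authors'). With target (atomistic) configuration probabilities p_T, model (coarse-grained) probabilities p_M, and mapping operator M,

```latex
S_{\mathrm{rel}} \;=\; \sum_{i} p_{T}(\mathbf{r}_{i})\,
\ln\frac{p_{T}(\mathbf{r}_{i})}{p_{M}\!\left(M(\mathbf{r}_{i})\right)}
\;+\; \left\langle S_{\mathrm{map}} \right\rangle_{T}
```

where the mapping-entropy term accounts for the degeneracy of atomistic configurations that map to the same coarse-grained one. S_rel is non-negative and vanishes only when the model reproduces the mapped target distribution, which is what makes it a variational objective.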

  16. Intentional Voice Command Detection for Trigger-Free Speech Interface

    NASA Astrophysics Data System (ADS)

    Obuchi, Yasunari; Sumiyoshi, Takashi

    In this paper we introduce a new framework of audio processing, which is essential to achieve a trigger-free speech interface for home appliances. If the speech interface works continually in real environments, it must extract occasional voice commands and reject everything else. It is extremely important to reduce the number of false alarms because the number of irrelevant inputs is much larger than the number of voice commands even for heavy users of appliances. The framework, called Intentional Voice Command Detection, is based on voice activity detection, but enhanced by various speech/audio processing techniques such as emotion recognition. The effectiveness of the proposed framework is evaluated using a newly-collected large-scale corpus. The advantages of combining various features were tested and confirmed, and the simple LDA-based classifier demonstrated acceptable performance. The effectiveness of various methods of user adaptation is also discussed.

  17. (Quickly) Testing the Tester via Path Coverage

    NASA Technical Reports Server (NTRS)

    Groce, Alex

    2009-01-01

    The configuration complexity and code size of an automated testing framework may grow to a point that the tester itself becomes a significant software artifact, prone to poor configuration and implementation errors. Unfortunately, testing the tester by using old versions of the software under test (SUT) may be impractical or impossible: test framework changes may have been motivated by interface changes in the tested system, or fault detection may become too expensive in terms of computing time to justify running until errors are detected on older versions of the software. We propose the use of path coverage measures as a "quick and dirty" method for detecting many faults in complex test frameworks. We also note the possibility of using techniques developed to diversify state-space searches in model checking to diversify test focus, and an associated classification of tester changes into focus-changing and non-focus-changing modifications.

  18. Stepwise and stagewise approaches for spatial cluster detection.

    PubMed

    Xu, Jiale; Gangnon, Ronald E

    2016-05-01

    Spatial cluster detection is an important tool in many areas such as sociology, botany and public health. Previous work has mostly taken either a hypothesis testing framework or a Bayesian framework. In this paper, we propose a few approaches under a frequentist variable selection framework for spatial cluster detection. The forward stepwise methods search for multiple clusters by iteratively adding the currently most likely cluster while adjusting for the effects of previously identified clusters. The stagewise methods also consist of a series of steps, but with a tiny step size in each iteration. We study the features and performances of our proposed methods using simulations on idealized grids or real geographic areas. From the simulations, we compare the performance of the proposed methods in terms of estimation accuracy and power. These methods are applied to the well-known New York leukemia data as well as Indiana poverty data. Copyright © 2016 Elsevier Ltd. All rights reserved.
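
    The forward stepwise idea can be illustrated with a toy one-dimensional scan (a sketch of the general approach, not the authors' implementation; the Poisson likelihood-ratio score and the rescaling adjustment are assumptions):

```python
import numpy as np

def forward_stepwise_clusters(obs, exp, widths, n_steps=2):
    # Toy forward stepwise search: candidate clusters are contiguous
    # windows; each step adds the window with the highest Poisson
    # log-likelihood-ratio score, then "adjusts" for it by rescaling its
    # expected counts so later steps no longer see that excess.
    obs = np.asarray(obs, dtype=float)
    exp = np.asarray(exp, dtype=float).copy()
    found = []
    for _ in range(n_steps):
        best = None
        for w in widths:
            for s in range(len(obs) - w + 1):
                o, e = obs[s:s + w].sum(), exp[s:s + w].sum()
                if o > e:
                    llr = o * np.log(o / e) - (o - e)
                    if best is None or llr > best[0]:
                        best = (llr, s, w)
        if best is None:
            break
        _, s, w = best
        found.append((s, s + w))
        exp[s:s + w] *= obs[s:s + w].sum() / exp[s:s + w].sum()
    return found

# Two injected clusters on a flat baseline are recovered, one per step.
obs = np.full(30, 5); obs[3:6] = 15; obs[20:22] = 12
print(forward_stepwise_clusters(obs, np.full(30, 5.0), widths=[2, 3]))
```

    The adjustment step is what distinguishes stepwise search from repeatedly re-running a single-cluster scan: once a cluster is selected, its excess no longer inflates the scores of overlapping candidates.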

  19. A New Framework of Removing Salt and Pepper Impulse Noise for the Noisy Image Including Many Noise-Free White and Black Pixels

    NASA Astrophysics Data System (ADS)

    Li, Song; Wang, Caizhu; Li, Yeqiu; Wang, Ling; Sakata, Shiro; Sekiya, Hiroo; Kuroiwa, Shingo

    In this paper, we propose a new framework of removing salt and pepper impulse noise. In our proposed framework, the most important point is that the number of noise-free white and black pixels in a noisy image can be determined by using the noise rates estimated by the Fuzzy Impulse Noise Detection and Reduction Method (FINDRM) and the Efficient Detail-Preserving Approach (EDPA). When the noisy image includes many noise-free white and black pixels, the noisy pixels detected by the FINDRM are re-checked using the alpha-trimmed mean. Finally, the impulse noise filtering phase of the FINDRM is used to restore the image. Simulation results show that for noisy images including many noise-free white and black pixels, the proposed framework can decrease the False Hit Rate (FHR) efficiently compared with the FINDRM. Therefore, the proposed framework can be used more widely than the FINDRM.
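
    The alpha-trimmed mean used in the re-checking step is a standard robust estimator; a minimal sketch (the window contents and alpha value below are illustrative):

```python
import numpy as np

def alpha_trimmed_mean(window, alpha):
    # Sort the window values, discard the lowest and highest alpha*N of
    # them, and average the remainder: alpha = 0 gives the plain mean,
    # alpha -> 0.5 approaches the median.  Trimming suppresses the
    # extreme 0/255 values typical of salt-and-pepper impulses.
    v = np.sort(np.asarray(window, dtype=float).ravel())
    t = int(alpha * v.size)
    return float(v[t:v.size - t].mean())

# 3x3 window with two salt (255) and two pepper (0) impulses.
print(alpha_trimmed_mean([0, 255, 10, 12, 11, 9, 255, 0, 10], alpha=0.25))  # prints 10.4
```

    Comparing a detected pixel against this trimmed local estimate lets the framework distinguish genuine white/black pixels from impulses.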

  20. The international experience of bacterial screen testing of platelet components with an automated microbial detection system: a need for consensus testing and reporting guidelines.

    PubMed

    Benjamin, Richard J; McDonald, Carl P

    2014-04-01

    The BacT/ALERT microbial detection system (bioMerieux, Inc, Durham, NC) is in routine use in many blood centers as a prerelease test for platelet collections. Published reports document wide variation in practices and outcomes. A systematic review of the English literature was performed to describe publications assessing the use of the BacT/ALERT culture system on platelet collections as a routine screen test of more than 10000 platelet components. Sixteen publications report the use of confirmatory testing to substantiate initial positive culture results but use varying nomenclature to classify the results. Preanalytical and analytical variables that may affect the outcomes differ widely between centers. Incomplete description of protocol details complicates comparison between sites. Initial positive culture results range from 539 to 10606 per million (0.054%-1.061%) and confirmed positive from 127 to 1035 per million (0.013%-0.104%) donations. False-negative results determined by outdate culture range from 662 to 2173 per million (0.066%-0.217%) and by septic reactions from 0 to 66 per million (0%-0.007%) collections. Current culture protocols represent pragmatic compromises between optimizing analytical sensitivity and ensuring the timely availability of platelets for clinical needs. Insights into the effect of protocol variations on outcomes are generally restricted to individual sites that implement limited changes to their protocols over time. Platelet manufacturers should reassess the adequacy of their BacT/ALERT screening protocols in light of the growing international experience and provide detailed documentation of all variables that may affect culture outcomes when reporting results. We propose a framework for a standardized nomenclature for reporting of the results of BacT/ALERT screening. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Real-time source deformation modeling through GNSS permanent stations at Merapi volcano (Indonesia)

    NASA Astrophysics Data System (ADS)

    Beauducel, F.; Nurnaning, A.; Iguchi, M.; Fahmi, A. A.; Nandaka, M. A.; Sumarti, S.; Subandriyo, S.; Metaxian, J. P.

    2014-12-01

    Mt. Merapi (Java, Indonesia) is one of the most active and dangerous volcanoes in the world. A first GPS repetition network was set up and periodically measured from 1993 onward, allowing detection of a deep magma reservoir, quantification of the magma flux in the conduit, and identification of shallow discontinuities around the former crater (Beauducel and Cornet, 1999; Beauducel et al., 2000, 2006). After the 2010 centennial eruption, when this network was almost completely destroyed, Indonesian and Japanese teams installed a new continuous GPS network for monitoring purposes (Iguchi et al., 2011), consisting of 3 stations located on the volcano flanks, plus a reference station at the Yogyakarta Observatory (BPPTKG). In the framework of the DOMERAPI project (2013-2016) we have completed this network with 5 additional stations, located in the summit area and around the volcano. The new stations are 1-Hz-sampling GNSS (GPS + GLONASS) receivers with near real-time data streaming to the Observatory. An automatic processing chain has been developed and included in the WEBOBS system (Beauducel et al., 2010), based on the GIPSY software, computing precise daily moving solutions every hour, together with time series and velocity vectors for different time scales (2 months, 1 and 5 years). A real-time source modeling estimation has also been implemented. It uses the depth-varying point source solution (Mogi, 1958; Williams and Wadge, 1998) in a systematic inverse-problem model exploration that displays location, volume variation and a 3-D probability map. The operational system should be able to better detect and estimate the location and volume variations of possible magma sources, and to follow magma transfer towards the surface. This should help monitoring and contribute to decision making during future unrest or eruptions.
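
    The point-source forward model referenced above is the classical Mogi (1958) half-space solution; a minimal sketch (parameter values are illustrative, and the real-time system's parametrization may differ):

```python
import numpy as np

def mogi_surface_displacement(r, depth, dV, nu=0.25):
    # Classical Mogi point-source solution for an elastic half-space:
    # radial (ur) and vertical (uz) surface displacement at horizontal
    # distance r from a source of volume change dV at the given depth,
    # with Poisson's ratio nu.
    C = (1.0 - nu) * dV / np.pi
    R3 = (r ** 2 + depth ** 2) ** 1.5
    return C * r / R3, C * depth / R3

# 1e6 m^3 of inflation at 3 km depth: uplift peaks directly above the
# source and decays with radial distance.
ur, uz = mogi_surface_displacement(r=0.0, depth=3000.0, dV=1e6)
```

    Sweeping source location, depth and dV over a grid and comparing predicted with observed GNSS displacements yields the kind of 3-D probability map the abstract describes.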

  2. Fuzzy boundaries: color and gene flow patterns among parapatric lineages of the western shovel-nosed snake and taxonomic implication

    USGS Publications Warehouse

    Wood, Dustin A.; Fisher, Robert N.; Vandergast, Amy G.

    2014-01-01

    Accurate delineation of lineage diversity is increasingly important, as species distributions are becoming more reduced and threatened. During the last century, the subspecies category was often used to denote phenotypic variation within a species range and to provide a framework for understanding lineage differentiation, often considered incipient speciation. While this category has largely fallen into disuse, previously recognized subspecies often serve as important units for conservation policy and management when other information is lacking. In this study, we evaluated phenotypic subspecies hypotheses within shovel-nosed snakes on the basis of genetic data and considered how evolutionary processes such as gene flow influenced possible incongruence between phenotypic and genetic patterns. We used both traditional phylogenetic and Bayesian clustering analyses to infer range-wide genetic structure and spatially explicit analyses to detect possible boundary locations of lineage contact. Multilocus analyses supported three historically isolated groups with low to moderate levels of contemporary gene exchange. Genetic data did not support phenotypic subspecies as exclusive groups, and we detected patterns of discordance in areas where three subspecies are presumed to be in contact. Based on genetic and phenotypic evidence, we suggested that species-level diversity is underestimated in this group and we proposed that two species be recognized, Chionactis occipitalis and C. annulata. In addition, we recommend retention of two subspecific designations within C. annulata (C. a. annulata and C. a. klauberi) that reflect regional shifts in both genetic and phenotypic variation within the species. Our results highlight the difficulty in validating taxonomic boundaries within lineages that are evolving under a time-dependent, continuous process.

  3. Detecting Mechanisms of Karyotype Evolution in Heterotaxis (Orchidaceae)

    PubMed Central

    Olmos Simões, André; Ojeda Alayon, Dario Isidro; de Barros, Fábio; Forni-Martins, Eliana Regina

    2016-01-01

    The karyotype is shaped by different chromosome rearrangements during species evolution. However, determining which rearrangements are responsible for karyotype changes is a challenging task and the combination of a robust phylogeny with refined karyotype characterization, genome size (GS) measurements and bioinformatic modelling is necessary. Here, this approach was applied in Heterotaxis to determine which chromosome rearrangements were responsible for the dysploidy variation. We used two datasets (nrDNA and cpDNA, both under MP and BI) to infer the phylogenetic relationships among Heterotaxis species and the closely related genera Nitidobulbon and Ornithidium. These phylogenies were used as a framework to infer how karyotype evolution occurred, using statistical methods. The nrDNA recovered Ornithidium, Nitidobulbon and Heterotaxis as monophyletic under both MP and BI, while cpDNA could not completely separate the three genera under either method. Based on the GS, we recovered two groups within Heterotaxis: (1) "small GS", corresponding to the Sessilis grade, composed of plants with smaller genomes and smaller morphological structure, and (2) "large GS", corresponding to the Discolor clade, composed of plants with large genomes and robust morphological structures. The robust karyotype modeling, using both nrDNA phylogenies, allowed us to infer that the ancestral Heterotaxis karyotype presented 2n = 40, probably with a proximal 45S rDNA on a metacentric chromosome pair. The chromosome number variation was caused by ascending dysploidy (chromosome fission involving the proximal 45S rDNA site resulting in two acrocentric chromosome pairs holding a terminal 45S rDNA), with subsequent descending dysploidy (fusion) in two species, H. maleolens and H. sessilis. However, besides dysploidy, our analysis detected another important chromosome rearrangement in the Orchidaceae: chromosome inversion, which promoted 5S rDNA site duplication and relocation. PMID:27832130

  4. Fuzzy boundaries: color and gene flow patterns among parapatric lineages of the western shovel-nosed snake and taxonomic implication.

    PubMed

    Wood, Dustin A; Fisher, Robert N; Vandergast, Amy G

    2014-01-01

    Accurate delineation of lineage diversity is increasingly important, as species distributions are becoming more reduced and threatened. During the last century, the subspecies category was often used to denote phenotypic variation within a species range and to provide a framework for understanding lineage differentiation, often considered incipient speciation. While this category has largely fallen into disuse, previously recognized subspecies often serve as important units for conservation policy and management when other information is lacking. In this study, we evaluated phenotypic subspecies hypotheses within shovel-nosed snakes on the basis of genetic data and considered how evolutionary processes such as gene flow influenced possible incongruence between phenotypic and genetic patterns. We used both traditional phylogenetic and Bayesian clustering analyses to infer range-wide genetic structure and spatially explicit analyses to detect possible boundary locations of lineage contact. Multilocus analyses supported three historically isolated groups with low to moderate levels of contemporary gene exchange. Genetic data did not support phenotypic subspecies as exclusive groups, and we detected patterns of discordance in areas where three subspecies are presumed to be in contact. Based on genetic and phenotypic evidence, we suggested that species-level diversity is underestimated in this group and we proposed that two species be recognized, Chionactis occipitalis and C. annulata. In addition, we recommend retention of two subspecific designations within C. annulata (C. a. annulata and C. a. klauberi) that reflect regional shifts in both genetic and phenotypic variation within the species. Our results highlight the difficulty in validating taxonomic boundaries within lineages that are evolving under a time-dependent, continuous process.

  5. Fuzzy Boundaries: Color and Gene Flow Patterns among Parapatric Lineages of the Western Shovel-Nosed Snake and Taxonomic Implication

    PubMed Central

    Wood, Dustin A.; Fisher, Robert N.; Vandergast, Amy G.

    2014-01-01

    Accurate delineation of lineage diversity is increasingly important, as species distributions are becoming more reduced and threatened. During the last century, the subspecies category was often used to denote phenotypic variation within a species range and to provide a framework for understanding lineage differentiation, often considered incipient speciation. While this category has largely fallen into disuse, previously recognized subspecies often serve as important units for conservation policy and management when other information is lacking. In this study, we evaluated phenotypic subspecies hypotheses within shovel-nosed snakes on the basis of genetic data and considered how evolutionary processes such as gene flow influenced possible incongruence between phenotypic and genetic patterns. We used both traditional phylogenetic and Bayesian clustering analyses to infer range-wide genetic structure and spatially explicit analyses to detect possible boundary locations of lineage contact. Multilocus analyses supported three historically isolated groups with low to moderate levels of contemporary gene exchange. Genetic data did not support phenotypic subspecies as exclusive groups, and we detected patterns of discordance in areas where three subspecies are presumed to be in contact. Based on genetic and phenotypic evidence, we suggested that species-level diversity is underestimated in this group and we proposed that two species be recognized, Chionactis occipitalis and C. annulata. In addition, we recommend retention of two subspecific designations within C. annulata (C. a. annulata and C. a. klauberi) that reflect regional shifts in both genetic and phenotypic variation within the species. Our results highlight the difficulty in validating taxonomic boundaries within lineages that are evolving under a time-dependent, continuous process. PMID:24848638

  6. A conceptual framework for implementation fidelity

    PubMed Central

    Carroll, Christopher; Patterson, Malcolm; Wood, Stephen; Booth, Andrew; Rick, Jo; Balain, Shashi

    2007-01-01

    Background Implementation fidelity refers to the degree to which an intervention or programme is delivered as intended. Only by understanding and measuring whether an intervention has been implemented with fidelity can researchers and practitioners gain a better understanding of how and why an intervention works, and the extent to which outcomes can be improved. Discussion The authors undertook a critical review of existing conceptualisations of implementation fidelity and developed a new conceptual framework for understanding and measuring the process. The resulting theoretical framework requires testing by empirical research. Summary Implementation fidelity is an important source of variation affecting the credibility and utility of research. The conceptual framework presented here offers a means for measuring this variable and understanding its place in the process of intervention implementation. PMID:18053122

  7. Machine Learning Methods for Attack Detection in the Smart Grid.

    PubMed

    Ozay, Mete; Esnaola, Inaki; Yarman Vural, Fatos Tunay; Kulkarni, Sanjeev R; Poor, H Vincent

    2016-08-01

    Attack detection problems in the smart grid are posed as statistical learning problems for different attack scenarios in which the measurements are observed in batch or online settings. In this approach, machine learning algorithms are used to classify measurements as being either secure or attacked. An attack detection framework is provided to exploit any available prior knowledge about the system and surmount constraints arising from the sparse structure of the problem in the proposed approach. Well-known batch and online learning algorithms (supervised and semisupervised) are employed with decision- and feature-level fusion to model the attack detection problem. The relationships between statistical and geometric properties of attack vectors employed in the attack scenarios and learning algorithms are analyzed to detect unobservable attacks using statistical learning methods. The proposed algorithms are examined on various IEEE test systems. Experimental analyses show that machine learning algorithms can detect attacks with performances higher than attack detection algorithms that employ state vector estimation methods in the proposed attack detection framework.
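
    The batch supervised setting can be sketched on a toy DC state-estimation model (entirely illustrative: the random measurement matrix, the attack construction, and the least-squares classifier below are stand-ins for the paper's IEEE test systems and SVM/perceptron learners):

```python
import numpy as np

rng = np.random.default_rng(0)
n_meas, n_state, n_samples = 20, 5, 400
H = rng.normal(size=(n_meas, n_state))   # toy measurement matrix
c0 = np.full(n_state, 1.0)               # attacker's state perturbation

def measurement(attacked):
    # Measurements z = H x + noise; the injection H @ c0 lies in the
    # column space of H, so residual-based (state-estimation) detectors
    # cannot see it -- the classic "unobservable" attack.
    x = rng.normal(size=n_state)
    z = H @ x + 0.1 * rng.normal(size=n_meas)
    return z + H @ c0 if attacked else z

y = np.array([i % 2 for i in range(n_samples)])   # 0 = secure, 1 = attacked
X = np.vstack([measurement(bool(t)) for t in y])

# Least-squares linear classifier as a stand-in for the paper's learners.
A = np.c_[X, np.ones(n_samples)]
w = np.linalg.lstsq(A, 2.0 * y - 1.0, rcond=None)[0]
accuracy = float(((A @ w > 0).astype(int) == y).mean())
print(accuracy)
```

    Even though the injection is invisible to residual tests, the labeled examples shift the attacked class in measurement space, which is exactly what a learned decision rule can exploit.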

  8. Single-cell copy number variation detection

    PubMed Central

    2011-01-01

    Detection of chromosomal aberrations from a single cell by array comparative genomic hybridization (single-cell array CGH), instead of from a population of cells, is an emerging technique. However, such detection is challenging because of the genome artifacts and the DNA amplification process inherent to the single cell approach. Current normalization algorithms result in inaccurate aberration detection for single-cell data. We propose a normalization method based on channel, genome composition and recurrent genome artifact corrections. We demonstrate that the proposed channel clone normalization significantly improves the copy number variation detection in both simulated and real single-cell array CGH data. PMID:21854607

  9. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction.

    PubMed

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    The Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical properties of natural images, so its detection performance is not satisfying. PCA has been extended into kernel PCA in order to capture higher-order statistics. However, to date no one has explicitly proposed a kernel FKT (KFKT) or investigated its detection performance. To accurately detect potential small targets in infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Results of experiments show that KFKT outperforms FKT and the proposed framework is competent to automatically detect and track infrared point targets.

  10. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction

    NASA Astrophysics Data System (ADS)

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    The Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical properties of natural images, so its detection performance is not satisfying. PCA has been extended into kernel PCA in order to capture higher-order statistics. However, to date no one has explicitly proposed a kernel FKT (KFKT) or investigated its detection performance. To accurately detect potential small targets in infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Results of experiments show that KFKT outperforms FKT and the proposed framework is competent to automatically detect and track infrared point targets.
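
    The Kalman prediction used to steer the detector can be sketched with a constant-velocity model (an assumption; the abstracts do not specify the state model):

```python
import numpy as np

dt = 1.0
# State [x, y, vx, vy]: constant-velocity transition and process noise.
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
Q = 0.01 * np.eye(4)

def predict(x, P):
    # Predicted state and covariance; the predicted (x, y) gives the
    # centre of the detector's search window in the next frame.
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, R=np.eye(2)):
    # Standard Kalman update with a position-only measurement z = [x, y]
    # (e.g. the target location reported by the KFKT detector).
    Hm = np.array([[1., 0., 0., 0.],
                   [0., 1., 0., 0.]])
    S = Hm @ P @ Hm.T + R
    K = P @ Hm.T @ np.linalg.inv(S)
    return x + K @ (z - Hm @ x), (np.eye(4) - K @ Hm) @ P
```

    Alternating predict and update in this way confines the (comparatively expensive) KFKT detection to a small window around the predicted target position.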

  11. Data-Driven Information Extraction from Chinese Electronic Medical Records

    PubMed Central

    Zhao, Tianwan; Ge, Chen; Gao, Weiguo; Wei, Jia; Zhu, Kenny Q.

    2015-01-01

    Objective This study aims to propose a data-driven framework that takes unstructured free text narratives in Chinese Electronic Medical Records (EMRs) as input and converts them into structured time-event-description triples, where the description is either an elaboration or an outcome of the medical event. Materials and Methods Our framework uses a hybrid approach. It consists of constructing cross-domain core medical lexica, an unsupervised, iterative algorithm to accrue more accurate terms into the lexica, rules to address Chinese writing conventions and temporal descriptors, and a Support Vector Machine (SVM) algorithm that innovatively utilizes Normalized Google Distance (NGD) to estimate the correlation between medical events and their descriptions. Results The effectiveness of the framework was demonstrated with a dataset of 24,817 de-identified Chinese EMRs. The cross-domain medical lexica were capable of recognizing terms with an F1-score of 0.896. 98.5% of recorded medical events were linked to temporal descriptors. The NGD SVM description-event matching achieved an F1-score of 0.874. The end-to-end time-event-description extraction of our framework achieved an F1-score of 0.846. Discussion In terms of named entity recognition, the proposed framework outperforms state-of-the-art supervised learning algorithms (F1-score: 0.896 vs. 0.886). In event-description association, the NGD SVM is superior to SVM using only local context and semantic features (F1-score: 0.874 vs. 0.838). Conclusions The framework is data-driven, weakly supervised, and robust against the variations and noises that tend to occur in a large corpus. It addresses Chinese medical writing conventions and variations in writing styles through patterns used for discovering new terms and rules for updating the lexica. PMID:26295801

  12. Data-Driven Information Extraction from Chinese Electronic Medical Records.

    PubMed

    Xu, Dong; Zhang, Meizhuo; Zhao, Tianwan; Ge, Chen; Gao, Weiguo; Wei, Jia; Zhu, Kenny Q

    2015-01-01

    This study aims to propose a data-driven framework that takes unstructured free text narratives in Chinese Electronic Medical Records (EMRs) as input and converts them into structured time-event-description triples, where the description is either an elaboration or an outcome of the medical event. Our framework uses a hybrid approach. It consists of constructing cross-domain core medical lexica, an unsupervised, iterative algorithm to accrue more accurate terms into the lexica, rules to address Chinese writing conventions and temporal descriptors, and a Support Vector Machine (SVM) algorithm that innovatively utilizes Normalized Google Distance (NGD) to estimate the correlation between medical events and their descriptions. The effectiveness of the framework was demonstrated with a dataset of 24,817 de-identified Chinese EMRs. The cross-domain medical lexica were capable of recognizing terms with an F1-score of 0.896. 98.5% of recorded medical events were linked to temporal descriptors. The NGD SVM description-event matching achieved an F1-score of 0.874. The end-to-end time-event-description extraction of our framework achieved an F1-score of 0.846. In terms of named entity recognition, the proposed framework outperforms state-of-the-art supervised learning algorithms (F1-score: 0.896 vs. 0.886). In event-description association, the NGD SVM is superior to SVM using only local context and semantic features (F1-score: 0.874 vs. 0.838). The framework is data-driven, weakly supervised, and robust against the variations and noises that tend to occur in a large corpus. It addresses Chinese medical writing conventions and variations in writing styles through patterns used for discovering new terms and rules for updating the lexica.
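
    The Normalized Google Distance used for event-description matching is the standard Cilibrasi-Vitányi measure; a minimal sketch (the counts below are illustrative):

```python
import math

def ngd(f_x, f_y, f_xy, n):
    # Normalized Google Distance: f_x and f_y are the document counts of
    # terms x and y, f_xy their co-occurrence count, and n the corpus
    # size.  0 means the terms always co-occur; larger values mean a
    # weaker association.
    lx, ly, lxy = math.log(f_x), math.log(f_y), math.log(f_xy)
    return (max(lx, ly) - lxy) / (math.log(n) - min(lx, ly))

# Terms that always co-occur are at distance 0; rare co-occurrence
# yields a larger distance.
print(ngd(1000, 1000, 1000, 10**6))  # prints 0.0
print(ngd(1000, 1000, 10, 10**6))
```

    In the framework above, such corpus-level co-occurrence statistics serve as features for the SVM that decides whether a description belongs to a given medical event.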

  13. Application of heteroduplex analysis for detecting variation within the growth hormone 2 gene in Salmo trutta L. (brown trout).

    PubMed

    Gross, R; Nilsson, J

    1995-03-01

    A new method to detect variation at a single copy nuclear gene in brown trout, Salmo trutta L., is provided. The technique entails (i) selective gene amplification by the polymerase chain reaction (PCR), (ii) digestion of amplification products by restriction endonucleases to obtain fragments of suitable size, (iii) hybridization with heterologous DNA followed by denaturation and reannealing to obtain heteroduplex molecules, and (iv) screening for variation in polyacrylamide gels. Variation was studied within a growth hormone 2 gene 1489 bp segment and polymorphism was detected in two HinfI-digested fragments. Formation of different heteroduplex patterns in experimental mixtures of digested amplification products from brown trout and Atlantic salmon, Salmo salar L., allowed us to determine the genotype of the brown trout. Polymorphism was observed in four out of six studied populations.

  14. Investigating variations in implementation fidelity of an organizational-level occupational health intervention.

    PubMed

    Augustsson, Hanna; von Thiele Schwarz, Ulrica; Stenfors-Hayes, Terese; Hasson, Henna

    2015-06-01

    The workplace has been suggested as an important arena for health promotion, but little is known about how the organizational setting influences the implementation of interventions. The aims of this study are to evaluate implementation fidelity in an organizational-level occupational health intervention and to investigate possible explanations for variations in fidelity between intervention units. The intervention consisted of an integration of health promotion, occupational health and safety, and a system for continuous improvements (Kaizen) and was conducted in a quasi-experimental design at a Swedish hospital. Implementation fidelity was evaluated with the Conceptual Framework for Implementation Fidelity and implementation factors used to investigate variations in fidelity with the Framework for Evaluating Organizational-level Interventions. A multi-method approach including interviews, Kaizen notes, and questionnaires was applied. Implementation fidelity differed between units even though the intervention was introduced and supported in the same way. Important differences in all elements proposed in the model for evaluating organizational-level interventions, i.e., context, intervention, and mental models, were found to explain the differences in fidelity. Implementation strategies may need to be adapted depending on the local context. Implementation fidelity, as well as pre-intervention implementation elements, is likely to affect the implementation success and needs to be assessed in intervention research. The high variation in fidelity across the units indicates the need for adjustments to the type of designs used to assess the effects of interventions. Thus, rather than using designs that aim to control variation, it may be necessary to use those that aim at exploring and explaining variation, such as adapted study designs.

  15. A general framework for the solvatochromism of pyridinium phenolate betaine dyes

    NASA Astrophysics Data System (ADS)

    Rezende, Marcos Caroli; Aracena, Andrés

    2013-02-01

    A general framework for the solvatochromic behavior of pyridinium phenolate betaine dyes is presented, based on the variations with the medium of the electrophilic Fukui functions of their electron-pair donor and acceptor moieties. The model explains the 'anomalous' solvatochromic behavior of large betaines, which change their behavior from negative to inverted, when electron-pair donor and acceptor groups are separated by a conjugated chain of variable size.

  16. Metal-organic framework tethering PNIPAM for ON-OFF controlled release in solution.

    PubMed

    Nagata, Shunjiro; Kokado, Kenta; Sada, Kazuki

    2015-05-21

    A smart metal-organic framework (MOF) exhibiting controlled release was achieved by modification with a thermoresponsive polymer (PNIPAM) via a surface-selective post-synthetic modification technique. Simple temperature variation readily switches "open" (lower temperature) and "closed" (higher temperature) states of the polymer-modified MOF through conformational change of PNIPAM grafted onto the MOF, resulting in controlled release of the included guest molecules such as resorufin, caffeine, and procainamide.

  17. Effect of assessment scale on spatial and temporal variations in CH4, CO2, and N2O fluxes in a forested wetland

    Treesearch

    Zhaohua Dai; Carl Trettin; Changsheng Li; Harbin Li; Ge Sun; Devendra Amatya

    2011-01-01

    Emissions of methane (CH4), carbon dioxide (CO2), and nitrous oxide (N2O) from a forested watershed (160 ha) in South Carolina, USA, were estimated with a spatially explicit watershed-scale modeling framework that utilizes the spatial variations in physical and biogeochemical characteristics across watersheds. The target watershed (WS80) consisting of wetland (23%) and...

  18. Certified Reduced Basis Model Characterization: a Frequentistic Uncertainty Framework

    DTIC Science & Technology

    2011-01-11

    14) It then follows that the Legendre coefficient random vector, (Z[0], Z[1], ..., Z[I])(ω), is (I+1)-variate normally distributed with mean (δ...I. Note each two-sided inequality represents two constraints. 3. PDE-Based Statistical Inference We now proceed to the parametrized partial...appearance of defects or geometric variations relative to an initial baseline, or perhaps manufacturing departures from nominal specifications; if our

  19. Developing Theory to Guide Building Practitioners’ Capacity to Implement Evidence-Based Interventions

    PubMed Central

    Leeman, Jennifer; Calancie, Larissa; Kegler, Michelle C.; Escoffery, Cam T.; Herrmann, Alison K.; Thatcher, Esther; Hartman, Marieke A.; Fernandez, Maria

    2017-01-01

    Public health and other community-based practitioners have access to a growing number of evidence-based interventions (EBIs), and yet EBIs continue to be underused. One reason for this underuse is that practitioners often lack the capacity (knowledge, skills, and motivation) to select, adapt, and implement EBIs. Training, technical assistance, and other capacity-building strategies can be effective at increasing EBI adoption and implementation. However, little is known about how to design capacity-building strategies or tailor them to differences in capacity required across varying EBIs and practice contexts. To address this need, we conducted a scoping study of frameworks and theories detailing variations in EBIs or practice contexts and how to tailor capacity-building to address those variations. Using an iterative process, we consolidated constructs and propositions across 24 frameworks and developed a beginning theory to describe salient variations in EBIs (complexity and uncertainty) and practice contexts (decision-making structure, general capacity to innovate, resource and values fit with EBI, and unity vs. polarization of stakeholder support). The theory also includes propositions for tailoring capacity-building strategies to address salient variations. To have wide-reaching and lasting impact, the dissemination of EBIs needs to be coupled with strategies that build practitioners’ capacity to adopt and implement a variety of EBIs across diverse practice contexts. PMID:26500080

  20. Combining formal and functional approaches to topic structure.

    PubMed

    Zellers, Margaret; Post, Brechtje

    2012-03-01

    Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this kind of variation. We argue that it is fruitful to adopt the insights of both PTI's qualitative analysis and EP's quantitative analysis and combine them into a multiple-methods approach. One realm in which it is possible to combine these frameworks is in the analysis of discourse topic structure and the prosodic cues relevant to it. By combining a quantitative and a qualitative approach to discourse topic structure, it is possible to give a better account of the observed variation in prosody, for example in the case of fundamental frequency (F0) peak timing, which can be explained in terms of pitch accent distribution over different topic structure categories. Similarly, local and global patterns in speech rate variation can be better explained and motivated by adopting insights from both PTI and EP in the study of topic structure. Combining PTI and EP can provide better accounts of speech data as well as opening up new avenues of investigation which would not have been possible in either approach alone.

  1. Toward an Integrative Understanding of Social Behavior: New Models and New Opportunities

    PubMed Central

    Blumstein, Daniel T.; Ebensperger, Luis A.; Hayes, Loren D.; Vásquez, Rodrigo A.; Ahern, Todd H.; Burger, Joseph Robert; Dolezal, Adam G.; Dosmann, Andy; González-Mariscal, Gabriela; Harris, Breanna N.; Herrera, Emilio A.; Lacey, Eileen A.; Mateo, Jill; McGraw, Lisa A.; Olazábal, Daniel; Ramenofsky, Marilyn; Rubenstein, Dustin R.; Sakhai, Samuel A.; Saltzman, Wendy; Sainz-Borgo, Cristina; Soto-Gamboa, Mauricio; Stewart, Monica L.; Wey, Tina W.; Wingfield, John C.; Young, Larry J.

    2010-01-01

    Social interactions among conspecifics are a fundamental and adaptively significant component of the biology of numerous species. Such interactions give rise to group living as well as many of the complex forms of cooperation and conflict that occur within animal groups. Although previous conceptual models have focused on the ecological causes and fitness consequences of variation in social interactions, recent developments in endocrinology, neuroscience, and molecular genetics offer exciting opportunities to develop more integrated research programs that will facilitate new insights into the physiological causes and consequences of social variation. Here, we propose an integrative framework of social behavior that emphasizes relationships between ultimate-level function and proximate-level mechanism, thereby providing a foundation for exploring the full diversity of factors that underlie variation in social interactions, and ultimately sociality. In addition to identifying new model systems for the study of human psychopathologies, this framework provides a mechanistic basis for predicting how social behavior will change in response to environmental variation. We argue that the study of non-model organisms is essential for implementing this integrative model of social behavior because such species can be studied simultaneously in the lab and field, thereby allowing integration of rigorously controlled experimental manipulations with detailed observations of the ecological contexts in which interactions among conspecifics occur. PMID:20661457

  2. Developing Theory to Guide Building Practitioners' Capacity to Implement Evidence-Based Interventions.

    PubMed

    Leeman, Jennifer; Calancie, Larissa; Kegler, Michelle C; Escoffery, Cam T; Herrmann, Alison K; Thatcher, Esther; Hartman, Marieke A; Fernandez, Maria E

    2017-02-01

    Public health and other community-based practitioners have access to a growing number of evidence-based interventions (EBIs), and yet EBIs continue to be underused. One reason for this underuse is that practitioners often lack the capacity (knowledge, skills, and motivation) to select, adapt, and implement EBIs. Training, technical assistance, and other capacity-building strategies can be effective at increasing EBI adoption and implementation. However, little is known about how to design capacity-building strategies or tailor them to differences in capacity required across varying EBIs and practice contexts. To address this need, we conducted a scoping study of frameworks and theories detailing variations in EBIs or practice contexts and how to tailor capacity-building to address those variations. Using an iterative process, we consolidated constructs and propositions across 24 frameworks and developed a beginning theory to describe salient variations in EBIs (complexity and uncertainty) and practice contexts (decision-making structure, general capacity to innovate, resource and values fit with EBI, and unity vs. polarization of stakeholder support). The theory also includes propositions for tailoring capacity-building strategies to address salient variations. To have wide-reaching and lasting impact, the dissemination of EBIs needs to be coupled with strategies that build practitioners' capacity to adopt and implement a variety of EBIs across diverse practice contexts.

  3. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
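    The inlier/outlier temporal segmentation above can be illustrated with a much-simplified sketch: in place of the paper's eigenvector analysis of an affinity matrix built from subsequence models, this toy version flags fixed-length windows whose mean feature value deviates strongly from the background of window means (a plain z-score stand-in; `win` and `z_thresh` are illustrative choices, not the paper's parameters).

```python
from statistics import mean, stdev

def outlier_windows(series, win=4, z_thresh=2.0):
    """Split a 1-D feature series into fixed windows and flag windows whose
    mean deviates from the distribution of window means. A simplified
    stand-in for affinity-matrix eigen-analysis of subsequence models."""
    means = [mean(series[i:i + win]) for i in range(0, len(series) - win + 1, win)]
    mu, sd = mean(means), stdev(means)
    return [i for i, m in enumerate(means) if sd > 0 and abs(m - mu) / sd > z_thresh]

# Mostly quiet background with one loud burst (a sparse "highlight" event)
feats = [0.1, 0.2, 0.1, 0.2] * 5 + [3.0, 3.2, 2.9, 3.1] + [0.1, 0.2, 0.1, 0.2] * 5
print(outlier_windows(feats))  # the burst occupies window index 5
```

    The rarity assumption is what makes this work: because "interesting" events are sparse, the background statistics are dominated by the usual process, so the outlying subsequence stands out.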

  4. No special K! A signal detection framework for the strategic regulation of memory accuracy.

    PubMed

    Higham, Philip A

    2007-02-01

    Two experiments investigated criterion setting and metacognitive processes underlying the strategic regulation of accuracy on the Scholastic Aptitude Test (SAT) using Type-2 signal detection theory (SDT). In Experiment 1, report bias was manipulated by penalizing participants either 0.25 (low incentive) or 4 (high incentive) points for each error. Best guesses to unanswered items were obtained so that Type-2 signal detection indices of discrimination and bias could be calculated. The same incentive manipulation was used in Experiment 2, only the test was computerized, confidence ratings were taken so that receiver operating characteristic (ROC) curves could be generated, and feedback was manipulated. The results of both experiments demonstrated that SDT provides a viable alternative to A. Koriat and M. Goldsmith's (1996c) framework of monitoring and control and reveals information about the regulation of accuracy that their framework does not. For example, ROC analysis indicated that the threshold model implied by formula scoring is inadequate. Instead, performance on the SAT should be modeled with an equal-variance Gaussian, Type-2 signal detection model. ((c) 2007 APA, all rights reserved).
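    For reference, the equal-variance Gaussian indices underlying such Type-2 analyses can be computed directly; the hit and false-alarm rates below are hypothetical, not taken from the experiments.

```python
from statistics import NormalDist

def sdt_indices(hit_rate, fa_rate):
    """Equal-variance Gaussian signal detection indices:
    d' (discrimination) and c (report criterion)."""
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -(z(hit_rate) + z(fa_rate)) / 2
    return d_prime, criterion

# Type-2 example: of the answers a participant chose to report, 80% of
# correct answers were volunteered vs. 30% of errors (hypothetical rates)
d, c = sdt_indices(0.80, 0.30)
```

    Here d' measures how well volunteered answers discriminate correct from incorrect responses, while c captures report bias (negative values indicate liberal reporting, as under the low-incentive condition).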

  5. Automated Melanoma Recognition in Dermoscopy Images via Very Deep Residual Networks.

    PubMed

    Yu, Lequan; Chen, Hao; Dou, Qi; Qin, Jing; Heng, Pheng-Ann

    2017-04-01

    Automated melanoma recognition in dermoscopy images is a very challenging task due to the low contrast of skin lesions, the huge intraclass variation of melanomas, the high degree of visual similarity between melanoma and non-melanoma lesions, and the existence of many artifacts in the image. In order to meet these challenges, we propose a novel method for melanoma recognition by leveraging very deep convolutional neural networks (CNNs). Compared with existing methods employing either low-level hand-crafted features or CNNs with shallower architectures, our substantially deeper networks (more than 50 layers) can acquire richer and more discriminative features for more accurate recognition. To take full advantage of very deep networks, we propose a set of schemes to ensure effective training and learning under limited training data. First, we apply the residual learning to cope with the degradation and overfitting problems when a network goes deeper. This technique can ensure that our networks benefit from the performance gains achieved by increasing network depth. Then, we construct a fully convolutional residual network (FCRN) for accurate skin lesion segmentation, and further enhance its capability by incorporating a multi-scale contextual information integration scheme. Finally, we seamlessly integrate the proposed FCRN (for segmentation) and other very deep residual networks (for classification) to form a two-stage framework. This framework enables the classification network to extract more representative and specific features based on segmented results instead of the whole dermoscopy images, further alleviating the insufficiency of training data. The proposed framework is extensively evaluated on ISBI 2016 Skin Lesion Analysis Towards Melanoma Detection Challenge dataset. 
Experimental results demonstrate the significant performance gains of the proposed framework, ranking first in classification among 25 teams and second in segmentation among 28 teams. This study corroborates that very deep CNNs with effective training mechanisms can be employed to solve complicated medical image analysis tasks, even with limited training data.

  6. A trait-based framework for stream algal communities.

    PubMed

    Lange, Katharina; Townsend, Colin Richard; Matthaei, Christoph David

    2016-01-01

    The use of trait-based approaches to detect effects of land use and climate change on terrestrial plant and aquatic phytoplankton communities is increasing, but such a framework is still needed for benthic stream algae. Here we present a conceptual framework of morphological, physiological, behavioural and life-history traits relating to resource acquisition and resistance to disturbance. We tested this approach by assessing the relationships between multiple anthropogenic stressors and algal traits at 43 stream sites. Our "natural experiment" was conducted along gradients of agricultural land-use intensity (0-95% of the catchment in high-producing pasture) and hydrological alteration (0-92% streamflow reduction resulting from water abstraction for irrigation) as well as related physicochemical variables (total nitrogen concentration and deposited fine sediment). Strategic choice of study sites meant that agricultural intensity and hydrological alteration were uncorrelated. We studied the relationships of seven traits (with 23 trait categories) to our environmental predictor variables using general linear models and an information-theoretic model-selection approach. Life form, nitrogen fixation and spore formation were key traits that showed the strongest relationships with environmental stressors. Overall, FI (farming intensity) exerted stronger effects on algal communities than hydrological alteration. The large-bodied, non-attached, filamentous algae that dominated under high farming intensities have limited dispersal abilities but may cope with unfavourable conditions through the formation of spores. Antagonistic interactions between FI and flow reduction were observed for some trait variables, whereas no interactions occurred for nitrogen concentration and fine sediment. Our conceptual framework was well supported by tests of ten specific hypotheses predicting effects of resource supply and disturbance on algal traits. 
Our study also shows that investigating a fairly comprehensive set of traits can help shed light on the drivers of algal community composition in situations where multiple stressors are operating. Further, to understand non-linear and non-additive effects of such drivers, communities need to be studied along multiple gradients of natural variation or anthropogenic stressors.

  7. A New Object-Based Framework to Detect Shadows in High-Resolution Satellite Imagery Over Urban Areas

    NASA Astrophysics Data System (ADS)

    Tatar, N.; Saadatseresht, M.; Arefi, H.; Hadavand, A.

    2015-12-01

    In this paper a new object-based framework to detect shadow areas in high-resolution satellite images is proposed. To produce a pixel-level shadow map, state-of-the-art supervised machine learning algorithms are employed, with automatic ground-truth generation based on Otsu thresholding of shadow and non-shadow indices used to train the classifiers. The image scene is then segmented to create image objects, and shadow objects are detected by majority voting over the pixel-based shadow detection result. A GeoEye-1 multi-spectral image over an urban area in Qom city, Iran, is used in the experiments. Results show the superiority of the proposed method over traditional pixel-based approaches, both visually and quantitatively.
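    Otsu thresholding, used above for automatic ground-truth generation, picks the gray level that maximizes between-class variance; a self-contained sketch on a toy bimodal intensity list (the shadow and non-shadow indices themselves are not reproduced here).

```python
def otsu_threshold(pixels, levels=256):
    """Return the gray level that maximizes between-class variance (Otsu)."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w_b = sum_b = 0.0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]          # background (dark-class) weight
        if w_b == 0:
            continue
        w_f = total - w_b       # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy image: dark "shadow" pixels near 20, bright pixels near 200
pixels = [18, 20, 22, 19, 21] * 10 + [198, 200, 202, 199, 201] * 10
t = otsu_threshold(pixels)
shadow_mask = [p <= t for p in pixels]
```

    In the paper's pipeline the resulting pixel mask would train the classifiers; the object-level decision is then a majority vote of the mask pixels inside each segment.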

  8. Variational mode decomposition based approach for accurate classification of color fundus images with hemorrhages

    NASA Astrophysics Data System (ADS)

    Lahmiri, Salim; Shmuel, Amir

    2017-11-01

    Diabetic retinopathy is a disease that can cause a loss of vision. An early and accurate diagnosis helps to improve treatment of the disease and prognosis. One of the earliest characteristics of diabetic retinopathy is the appearance of retinal hemorrhages. The purpose of this study is to design a fully automated system for the detection of hemorrhages in a retinal image. In the first stage of our proposed system, a retinal image is processed with variational mode decomposition (VMD) to obtain the first variational mode, which captures the high frequency components of the original image. In the second stage, four texture descriptors are extracted from the first variational mode. Finally, a classifier trained with all computed texture descriptors is used to distinguish between images of healthy and unhealthy retinas with hemorrhages. Experimental results showed evidence of the effectiveness of the proposed system for detection of hemorrhages in the retina, since a perfect detection rate was achieved. Our proposed system for detecting diabetic retinopathy is simple and easy to implement. It requires only short processing time, and it yields higher accuracy in comparison with previously proposed methods for detecting diabetic retinopathy.
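    The four texture descriptors used in the second stage are not enumerated in this summary; as a generic illustration, first-order texture statistics of an image patch (mean, variance, skewness, and histogram entropy) can be computed as follows.

```python
import math

def texture_descriptors(pixels):
    """First-order texture statistics of an image patch:
    mean, variance, skewness, and gray-level histogram entropy."""
    n = len(pixels)
    mu = sum(pixels) / n
    var = sum((p - mu) ** 2 for p in pixels) / n
    sd = math.sqrt(var)
    skew = 0.0 if sd == 0 else sum(((p - mu) / sd) ** 3 for p in pixels) / n
    # Shannon entropy of the gray-level histogram
    hist = {}
    for p in pixels:
        hist[p] = hist.get(p, 0) + 1
    entropy = -sum((c / n) * math.log2(c / n) for c in hist.values())
    return {"mean": mu, "variance": var, "skewness": skew, "entropy": entropy}

patch = [10, 10, 12, 200, 11, 10, 12, 11]  # mostly dark with one bright outlier
desc = texture_descriptors(patch)
```

    A bright hemorrhage-like outlier in an otherwise dark patch shows up as positive skewness and increased entropy, which is the kind of signal a classifier can exploit.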

  9. Estimating rates of local extinction and colonization in colonial species and an extension to the metapopulation and community levels

    USGS Publications Warehouse

    Barbraud, C.; Nichols, J.D.; Hines, J.E.; Hafner, H.

    2003-01-01

    Coloniality has mainly been studied from an evolutionary perspective, but relatively few studies have developed methods for modelling colony dynamics. Changes in number of colonies over time provide a useful tool for predicting and evaluating the responses of colonial species to management and to environmental disturbance. Probabilistic Markov process models have been recently used to estimate colony site dynamics using presence-absence data when all colonies are detected in sampling efforts. Here, we define and develop two general approaches for the modelling and analysis of colony dynamics for sampling situations in which all colonies are, and are not, detected. For both approaches, we develop a general probabilistic model for the data and then constrain model parameters based on various hypotheses about colony dynamics. We use Akaike's Information Criterion (AIC) to assess the adequacy of the constrained models. The models are parameterised with conditional probabilities of local colony site extinction and colonization. Presence-absence data arising from Pollock's robust capture-recapture design provide the basis for obtaining unbiased estimates of extinction, colonization, and detection probabilities when not all colonies are detected. This second approach should be particularly useful in situations where detection probabilities are heterogeneous among colony sites. The general methodology is illustrated using presence-absence data on two species of herons (Purple Heron, Ardea purpurea and Grey Heron, Ardea cinerea). Estimates of the extinction and colonization rates showed interspecific differences and strong temporal and spatial variations. We were also able to test specific predictions about colony dynamics based on ideas about habitat change and metapopulation dynamics. We recommend estimators based on probabilistic modelling for future work on colony dynamics. 
We also believe that this methodological framework has wide application to problems in animal ecology concerning metapopulation and community dynamics.
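    In the simplest sampling situation described above, where every active colony is detected, the extinction and colonization probabilities reduce to transition frequencies in the presence-absence histories; a minimal sketch (the robust-design estimators in the study additionally model detection probability, which this sketch assumes equals 1).

```python
def turnover_rates(history):
    """Estimate local extinction and colonization probabilities from
    presence-absence histories (rows = colony sites, columns = years),
    assuming every occupied colony site is detected."""
    ext_num = ext_den = col_num = col_den = 0
    for site in history:
        for a, b in zip(site, site[1:]):
            if a == 1:               # site occupied at time t
                ext_den += 1
                ext_num += b == 0    # ... and empty at t+1: extinction
            else:                    # site empty at time t
                col_den += 1
                col_num += b == 1    # ... and occupied at t+1: colonization
    extinction = ext_num / ext_den if ext_den else float("nan")
    colonization = col_num / col_den if col_den else float("nan")
    return extinction, colonization

sites = [
    [1, 1, 0, 0, 1],   # occupied, goes locally extinct, later recolonized
    [0, 1, 1, 1, 1],
    [1, 1, 1, 0, 0],
]
e, c = turnover_rates(sites)
```

    With imperfect detection these raw frequencies are biased (an undetected colony looks like an extinction), which is why the capture-recapture machinery of Pollock's robust design is needed in practice.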

  10. An Ultrastable Europium(III)-Organic Framework with the Capacity of Discriminating Fe2+/Fe3+ Ions in Various Solutions.

    PubMed

    Wen, Guo-Xuan; Wu, Ya-Pan; Dong, Wen-Wen; Zhao, Jun; Li, Dong-Sheng; Zhang, Jian

    2016-10-05

    An ultrastable luminescent europium-organic framework, {[Eu(L)(H2O)2]·NMP·H2O}n (CTGU-2; NMP = N-methyl-2-pyrrolidone), is the first to detect Fe2+/Fe3+ cations in different medium systems with high selectivity and sensitivity, and it also exhibits high sensitivity for the Cr2O7^2- anion and acetone, with a wide linear range and a low detection limit.

  11. An anionic Na(I)-organic framework platform: separation of organic dyes and post-modification for highly sensitive detection of picric acid.

    PubMed

    Chen, Di-Ming; Tian, Jia-Yue; Wang, Zhuo-Wei; Liu, Chun-Sen; Chen, Min; Du, Miao

    2017-09-26

    A cage-based anionic Na(I)-organic framework with a unique Na9 cluster-based secondary building unit and a cage-in-cage structure was constructed. The selective separation of dyes with different charges and sizes was investigated. Furthermore, the Rh6G@MOF composite could be applied as a recyclable fluorescent sensor for detecting picric acid (PA) with high sensitivity and selectivity.

  12. Standardized evaluation framework for evaluating coronary artery stenosis detection, stenosis quantification and lumen segmentation algorithms in computed tomography angiography.

    PubMed

    Kirişli, H A; Schaap, M; Metz, C T; Dharampal, A S; Meijboom, W B; Papadopoulou, S L; Dedic, A; Nieman, K; de Graaf, M A; Meijs, M F L; Cramer, M J; Broersen, A; Cetin, S; Eslami, A; Flórez-Valencia, L; Lor, K L; Matuszewski, B; Melki, I; Mohr, B; Oksüz, I; Shahzad, R; Wang, C; Kitslaar, P H; Unal, G; Katouzian, A; Örkisz, M; Chen, C M; Precioso, F; Najman, L; Masood, S; Ünay, D; van Vliet, L; Moreno, R; Goldenberg, R; Vuçini, E; Krestin, G P; Niessen, W J; van Walsum, T

    2013-12-01

    Though conventional coronary angiography (CCA) has been the standard of reference for diagnosing coronary artery disease in the past decades, computed tomography angiography (CTA) has rapidly emerged, and is nowadays widely used in clinical practice. Here, we introduce a standardized evaluation framework to reliably evaluate and compare the performance of the algorithms devised to detect and quantify the coronary artery stenoses, and to segment the coronary artery lumen in CTA data. The objective of this evaluation framework is to demonstrate the feasibility of dedicated algorithms to: (1) (semi-)automatically detect and quantify stenosis on CTA, in comparison with quantitative coronary angiography (QCA) and CTA consensus reading, and (2) (semi-)automatically segment the coronary lumen on CTA, in comparison with expert's manual annotation. A database consisting of 48 multicenter multivendor cardiac CTA datasets with corresponding reference standards are described and made available. The algorithms from 11 research groups were quantitatively evaluated and compared. The results show that (1) some of the current stenosis detection/quantification algorithms may be used for triage or as a second-reader in clinical practice, and that (2) automatic lumen segmentation is possible with a precision similar to that obtained by experts. The framework is open for new submissions through the website, at http://coronary.bigr.nl/stenoses/. Copyright © 2013 Elsevier B.V. All rights reserved.
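    The stenosis quantification being benchmarked rests on the standard QCA definition of percent diameter stenosis; a minimal sketch (the 50% cut-off for a "significant" lesion is a common reading convention, shown here for illustration only).

```python
def percent_diameter_stenosis(mld_mm, ref_mm):
    """Percent diameter stenosis as defined in quantitative coronary
    angiography: (1 - minimal lumen diameter / reference diameter) * 100."""
    return (1.0 - mld_mm / ref_mm) * 100.0

def grade(ds):
    """Illustrative binary reading using a common 50% cut-off."""
    return "significant" if ds >= 50.0 else "non-significant"

# A lumen narrowed from a 3.0 mm reference to a 1.2 mm minimal diameter
ds = percent_diameter_stenosis(mld_mm=1.2, ref_mm=3.0)
```

    Evaluating an automatic algorithm then amounts to comparing its per-lesion `ds` and grade against the QCA and consensus-reading reference values in the database.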

  13. Contact line motion over substrates with spatially non-uniform properties

    NASA Astrophysics Data System (ADS)

    Ajaev, Vladimir; Gatapova, Elizaveta; Kabov, Oleg

    2017-11-01

    We develop mathematical models of moving contact lines over flat solid surfaces with spatial variation of temperature and wetting properties under the conditions when evaporation is significant. The gas phase is assumed to be pure vapor and a lubrication-type framework is employed for describing viscous flow in the liquid. Marangoni stresses at the liquid surface arise as a result of temperature variation in the vapor phase, non-equilibrium effects during evaporation at the interface, and Kelvin effect. The relative importance of these three factors is determined. Variation of wetting properties is modeled through a two-component disjoining pressure, with the main focus on spatially periodic patterns leading to time-periodic variation of the contact line speed.

  14. Atlas of Variations in Medical Practice in Spain: the Spanish National Health Service under scrutiny.

    PubMed

    Bernal-Delgado, Enrique; García-Armesto, Sandra; Peiró, Salvador

    2014-01-01

    Early in the 2000s, a countrywide health services research initiative was launched under the acronym of Atlas VPM: Atlas of Variations in Medical Practice in the Spanish National Health System. This initiative aimed at describing systematic and unwarranted variations in medical practice at the geographic level, building upon the seminal experience of the Dartmouth Atlas of Health Care. The paper aims at explaining the Spanish Atlas experience, built upon the pioneer Dartmouth inspiration. A few selected examples are used in the following sections to illustrate the outlined conceptual framework, the different factors that may affect variation, and some methodological challenges. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  15. Glucose Metabolism during Resting State Reveals Abnormal Brain Networks Organization in the Alzheimer’s Disease and Mild Cognitive Impairment

    PubMed Central

    Martínez-Montes, Eduardo

    2013-01-01

    This paper aims to study the abnormal patterns of brain glucose metabolism co-variations in Alzheimer disease (AD) and Mild Cognitive Impairment (MCI) patients compared to normal healthy controls (NC) using the Alzheimer Disease Neuroimaging Initiative (ADNI) database. The local cerebral metabolic rate for glucose (CMRgl) in a set of 90 structures belonging to the AAL atlas was obtained from Fluorodeoxyglucose Positron Emission Tomography data in resting state. It is assumed that brain regions whose CMRgl values are significantly correlated are functionally associated; therefore, when metabolism is altered in a single region, the alteration will affect the metabolism of other brain areas with which it interrelates. The glucose metabolism network (represented by the matrix of the CMRgl co-variations among all pairs of structures) was studied using the graph theory framework. The highest concurrent fluctuations in CMRgl were basically identified between homologous cortical regions in all groups. Significant differences in CMRgl co-variations in AD and MCI groups as compared to NC were found. The AD and MCI patients showed aberrant patterns in comparison to NC subjects, as detected by global and local network properties (global and local efficiency, clustering index, and others). MCI network’s attributes showed an intermediate position between NC and AD, corroborating MCI as a transitional stage from normal aging to Alzheimer disease. Our study is an attempt at exploring the complex association between glucose metabolism, CMRgl co-variations and the attributes of the brain network organization in AD and MCI. PMID:23894356
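    The global and local network properties mentioned can be computed directly from a binarized co-variation matrix; below is a small pure-Python sketch of two of them, global efficiency and the local clustering coefficient (the toy adjacency matrix is illustrative, not CMRgl data).

```python
def global_efficiency(adj):
    """Mean inverse shortest-path length over all node pairs (unweighted)."""
    n = len(adj)
    INF = float("inf")
    d = [[0 if i == j else (1 if adj[i][j] else INF) for j in range(n)]
         for i in range(n)]
    for k in range(n):  # Floyd-Warshall all-pairs shortest paths
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    pairs = [(i, j) for i in range(n) for j in range(n) if i != j]
    return sum(1.0 / d[i][j] for i, j in pairs if d[i][j] < INF) / len(pairs)

def clustering(adj, i):
    """Local clustering coefficient: realized / possible links among neighbours."""
    nb = [j for j in range(len(adj)) if adj[i][j]]
    k = len(nb)
    if k < 2:
        return 0.0
    links = sum(adj[a][b] for a in nb for b in nb if a < b)
    return 2.0 * links / (k * (k - 1))

# Toy co-variation network: a triangle (nodes 0-1-2) with a pendant node 3
adj = [[0, 1, 1, 1],
       [1, 0, 1, 0],
       [1, 1, 0, 0],
       [1, 0, 0, 0]]
```

    Reduced global efficiency and altered clustering in patient networks relative to controls are exactly the kinds of group differences the study reports.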

  16. Improving the quantification of contrast enhanced ultrasound using a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Rizzo, Gaia; Tonietto, Matteo; Castellaro, Marco; Raffeiner, Bernd; Coran, Alessandro; Fiocco, Ugo; Stramare, Roberto; Grisan, Enrico

    2017-03-01

    Contrast Enhanced Ultrasound (CEUS) is a sensitive imaging technique for assessing tissue vascularity that can be useful in the quantification of different perfusion patterns. This can be particularly important in the early detection and staging of arthritis. In a recent study we have shown that a Gamma-variate can accurately quantify synovial perfusion and is flexible enough to describe many heterogeneous patterns. Moreover, we have shown that through a pixel-by-pixel analysis the quantitative information gathered characterizes the perfusion more effectively. However, the signal-to-noise ratio (SNR) of the data and the nonlinearity of the model make parameter estimation difficult. Using the classical nonlinear least-squares (NLLS) approach, the number of unreliable estimates (those with an asymptotic coefficient of variation greater than a user-defined threshold) is significant, thus affecting the overall description of the perfusion kinetics and of its heterogeneity. In this work we propose to solve the parameter estimation at the pixel level within a Bayesian framework using Variational Bayes (VB), with an automatic and data-driven prior initialization. When evaluating the pixels for which both VB and NLLS provided reliable estimates, we demonstrated that the parameter values provided by the two methods are well correlated (Pearson's correlation between 0.85 and 0.99). Moreover, the mean number of unreliable pixels drastically reduces from 54% (NLLS) to 26% (VB), without increasing the computational time (0.05 s/pixel for NLLS and 0.07 s/pixel for VB). When considering the efficiency of the algorithms as computational time per reliable estimate, VB outperforms NLLS (0.11 versus 0.25 seconds per reliable estimate, respectively).
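    The Gamma-variate model at the heart of both estimators can, in the noise-free case, be fitted even with a coarse grid search standing in for NLLS or VB; a sketch under that simplification (the parameter grids and sampling times are illustrative).

```python
import math

def gamma_variate(t, A, alpha, beta):
    """Gamma-variate bolus model: A * t**alpha * exp(-t/beta), t >= 0."""
    return A * t ** alpha * math.exp(-t / beta)

def fit_gamma_variate(ts, ys, alphas, betas):
    """Coarse grid search over the shape parameters (alpha, beta); the
    amplitude A has a closed-form least-squares solution for each candidate
    shape. A simple stand-in for the paper's NLLS / VB estimators."""
    best = None
    for alpha in alphas:
        for beta in betas:
            basis = [t ** alpha * math.exp(-t / beta) for t in ts]
            denom = sum(b * b for b in basis)
            if denom == 0:
                continue
            A = sum(y * b for y, b in zip(ys, basis)) / denom
            sse = sum((y - A * b) ** 2 for y, b in zip(ys, basis))
            if best is None or sse < best[0]:
                best = (sse, A, alpha, beta)
    return best  # (sse, A, alpha, beta)

ts = [i * 0.5 for i in range(1, 21)]
ys = [gamma_variate(t, A=2.0, alpha=1.5, beta=2.0) for t in ts]  # noise-free
sse, A, alpha, beta = fit_gamma_variate(ts, ys,
                                        alphas=[1.0, 1.5, 2.0],
                                        betas=[1.0, 2.0, 3.0])
```

    On real low-SNR CEUS data this naive least-squares criterion is precisely what produces unreliable pixel estimates, which motivates the Bayesian treatment with data-driven priors.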

  17. Airplane detection based on fusion framework by combining saliency model with Deep Convolutional Neural Networks

    NASA Astrophysics Data System (ADS)

    Dou, Hao; Sun, Xiao; Li, Bin; Deng, Qianqian; Yang, Xubo; Liu, Di; Tian, Jinwen

    2018-03-01

    Aircraft detection from very high resolution remote sensing images has gained increasing interest in recent years due to successful civil and military applications. However, several problems still exist: 1) extracting high-level features of aircraft is difficult; 2) locating objects within such large images is time-consuming; 3) satellite images commonly come at multiple resolutions. In this paper, inspired by the biological visual mechanism, a fusion detection framework is proposed that fuses a top-down visual mechanism (a deep CNN model) and a bottom-up visual mechanism (GBVS) to detect aircraft. In addition, we use a multi-scale training method for the deep CNN model to address the problem of multiple resolutions. Experimental results demonstrate that our method achieves better detection results than the other methods.
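    Late fusion of a top-down score map and a bottom-up saliency map can be as simple as a pixel-wise weighted combination followed by thresholding; a toy sketch (the paper's actual fusion rule is not specified in this summary, and `w`/`thresh` are illustrative).

```python
def fuse_maps(cnn_scores, saliency, w=0.6, thresh=0.5):
    """Element-wise late fusion of a top-down CNN score map and a bottom-up
    saliency map (both assumed normalized to [0, 1]); w weights the CNN
    term. Returns a binary detection mask."""
    fused = [[w * c + (1 - w) * s for c, s in zip(crow, srow)]
             for crow, srow in zip(cnn_scores, saliency)]
    return [[v >= thresh for v in row] for row in fused]

# 2x2 toy maps: two locations where both cues agree on an aircraft
cnn = [[0.9, 0.2],
       [0.1, 0.8]]
sal = [[0.7, 0.3],
       [0.2, 0.9]]
mask = fuse_maps(cnn, sal)
```

    The saliency term cheaply narrows the search over a large scene, while the CNN term supplies the high-level evidence; the fusion keeps only locations where the combined evidence clears the threshold.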

  18. Natural epigenetic variation within and among six subspecies of the house sparrow, Passer domesticus.

    PubMed

    Riyahi, Sepand; Vilatersana, Roser; Schrey, Aaron W; Ghorbani Node, Hassan; Aliabadian, Mansour; Senar, Juan Carlos

    2017-11-01

    Epigenetic modifications can respond rapidly to environmental changes and can shape phenotypic variation in accordance with environmental stimuli. One of the most studied epigenetic marks is DNA methylation. In the present study, we used the methylation-sensitive amplified polymorphism (MSAP) technique to investigate the natural variation in DNA methylation within and among subspecies of the house sparrow, Passer domesticus. We focused on five subspecies from the Middle East because they show great variation in many ecological traits and because this region is the probable origin for the house sparrow's commensal relationship with humans. We analysed house sparrows from Spain as an outgroup. The level of variation in DNA methylation was similar among the five house sparrow subspecies from the Middle East despite high phenotypic and environmental variation, but the non-commensal subspecies was differentiated from the other four (commensal) Middle Eastern subspecies. Further, the European subspecies was differentiated from all other subspecies in DNA methylation. Our results indicate that variation in DNA methylation does not strictly follow subspecies designations. We detected a correlation between methylation level and some morphological traits, such as standardized bill length, and we suggest that part of the high morphological variation in the native populations of the house sparrow is influenced by differentially methylated regions in specific loci throughout the genome. We also detected 10 differentially methylated loci among subspecies and three loci that differentiated between commensal or non-commensal status. Therefore, the MSAP technique detected larger scale differences among the European and non-commensal subspecies, but did not detect finer scale differences among the other Middle Eastern subspecies. © 2017. Published by The Company of Biologists Ltd.

  19. Global assessment of genomic variation in cattle by genome resequencing and high-throughput genotyping

    PubMed Central

    2011-01-01

Background: Integration of genomic variation with phenotypic information is an effective approach for uncovering genotype-phenotype associations. This requires an accurate identification of the different types of variation in individual genomes. Results: We report the integration of the whole genome sequence of a single Holstein Friesian bull with data from single nucleotide polymorphism (SNP) and comparative genomic hybridization (CGH) array technologies to determine a comprehensive spectrum of genomic variation. The performance of resequencing SNP detection was assessed by combining SNPs that were identified to be either in identity by descent (IBD) or in copy number variation (CNV) with results from SNP array genotyping. Coding insertions and deletions (indels) were found to be enriched for size in multiples of 3 and were located near the N- and C-termini of proteins. For larger indels, a combination of split-read and read-pair approaches proved to be complementary in finding different signatures. CNVs were identified on the basis of the depth of sequenced reads, and by using SNP and CGH arrays. Conclusions: Our results provide high resolution mapping of diverse classes of genomic variation in an individual bovine genome and demonstrate that structural variation surpasses sequence variation as the main component of genomic variability. Better accuracy of SNP detection was achieved with little loss of sensitivity when algorithms that implemented mapping quality were used. IBD regions were found to be instrumental for calculating resequencing SNP accuracy, while SNP detection within CNVs tended to be less reliable. CNV discovery was affected dramatically by platform resolution and coverage biases. The combined data for this study showed that at a moderate level of sequencing coverage, an ensemble of platforms and tools can be applied together to maximize the accurate detection of sequence and structural variants. PMID:22082336

  20. An Empirical Study of Design Parameters for Assessing Differential Impacts for Students in Group Randomized Trials.

    PubMed

    Jaciw, Andrew P; Lin, Li; Ma, Boya

    2016-10-18

Prior research has investigated design parameters for assessing average program impacts on achievement outcomes with cluster randomized trials (CRTs). Less is known about parameters important for assessing differential impacts. This article develops a statistical framework for designing CRTs to assess differences in impact among student subgroups and presents initial estimates of critical parameters. Effect sizes and minimum detectable effect sizes for average and differential impacts are calculated before and after conditioning on effects of covariates using results from several CRTs. Relative sensitivities to detect average and differential impacts are also examined. Student outcomes from six CRTs are analyzed: achievement in math, science, reading, and writing. The ratio of between-cluster variation in the slope of the moderator to total variance, the "moderator gap variance ratio", is important for designing studies to detect differences in impact between student subgroups. This quantity is the analogue of the intraclass correlation coefficient. Typical values were .02 for gender and .04 for socioeconomic status. For the studies considered, estimates of differential impact were in many cases larger than those of average impact, and after conditioning on effects of covariates, similar power was achieved for detecting average and differential impacts of the same size. Measuring differential impacts is important for addressing questions of equity and generalizability and for guiding interpretation of subgroup impact findings. Adequate power for doing this is in some cases reachable with CRTs designed to measure average impacts. Continuing collection of parameters for assessing differential impacts is the next step. © The Author(s) 2016.

  1. A Neutrality Test for Detecting Selection on DNA Methylation Using Single Methylation Polymorphism Frequency Spectrum

    PubMed Central

    Wang, Jun; Fan, Chuanzhu

    2015-01-01

Inheritable epigenetic mutations (epimutations) can contribute to transmittable phenotypic variation. Thus, epimutations can be subject to natural selection and impact the fitness and evolution of organisms. Based on the framework of the modified Tajima’s D test for DNA mutations, we developed a neutrality test with the statistic “Dm” to detect selection forces on DNA methylation mutations using single methylation polymorphisms. With computer simulation and empirical data analysis, we compared the Dm test with the original and modified Tajima’s D tests and demonstrated that the Dm test is suitable for detecting selection on epimutations and outperforms the original/modified Tajima’s D tests. Due to the higher resetting rate of epimutations, the interpretation of Dm on epimutations and of Tajima’s D on DNA mutations could differ when inferring natural selection. Analyses using simulated and empirical genome-wide polymorphism data suggested that genes under genetic and epigenetic selection behaved differently. We applied the Dm test to recently originated Arabidopsis and human genes, and showed that newly evolved genes contain a higher level of rare epialleles, suggesting that epimutation may play a role in the origination and evolution of genes and genomes. Overall, we demonstrate the utility of the Dm test to detect whether loci are under selection with respect to DNA methylation. Our analytical metrics and methodology could contribute to our understanding of evolutionary processes of genes and genomes in the field of epigenetics. The Perl script for the “Dm” test is available at http://fanlab.wayne.edu/ (last accessed December 18, 2014). PMID:25539727
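The Dm statistic builds on Tajima's D; the Dm implementation itself is distributed as a Perl script at the URL above. As an illustration of the underlying computation only, here is a minimal Python sketch of the classical Tajima's D calculated from a site frequency spectrum (function and variable names are ours, not the authors'):

```python
import math

def tajimas_d(n, derived_counts):
    """Classical Tajima's D (Tajima 1989): n sampled sequences,
    derived_counts holds the derived-allele count at each segregating site."""
    S = len(derived_counts)                     # number of segregating sites
    a1 = sum(1.0 / i for i in range(1, n))
    a2 = sum(1.0 / i ** 2 for i in range(1, n))
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n ** 2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1 = c1 / a1
    e2 = c2 / (a1 ** 2 + a2)
    # Mean pairwise diversity (pi) versus Watterson's estimator (S / a1)
    pi = sum(2.0 * k * (n - k) for k in derived_counts) / (n * (n - 1))
    return (pi - S / a1) / math.sqrt(e1 * S + e2 * S * (S - 1))
```

Substituting methylated/unmethylated epiallele counts per locus gives the flavour of a single-methylation-polymorphism analysis, although the Dm test additionally accounts for the higher resetting rate of epimutations.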

  2. Improving removal-based estimates of abundance by sampling a population of spatially distinct subpopulations

    USGS Publications Warehouse

    Dorazio, R.M.; Jelks, H.L.; Jordan, F.

    2005-01-01

     A statistical modeling framework is described for estimating the abundances of spatially distinct subpopulations of animals surveyed using removal sampling. To illustrate this framework, hierarchical models are developed using the Poisson and negative-binomial distributions to model variation in abundance among subpopulations and using the beta distribution to model variation in capture probabilities. These models are fitted to the removal counts observed in a survey of a federally endangered fish species. The resulting estimates of abundance have similar or better precision than those computed using the conventional approach of analyzing the removal counts of each subpopulation separately. Extension of the hierarchical models to include spatial covariates of abundance is straightforward and may be used to identify important features of an animal's habitat or to predict the abundance of animals at unsampled locations.
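As context for the hierarchical approach, the per-subpopulation building block it improves upon is the classical removal estimator. A minimal two-pass sketch (the closed-form Zippin estimator; function name and example counts are ours, not from the paper):

```python
def two_pass_removal_estimate(c1, c2):
    """Classical two-pass removal (Zippin) estimator for one closed
    subpopulation: capture probability p = 1 - c2/c1 and abundance
    N = c1^2 / (c1 - c2). Requires declining counts (c1 > c2)."""
    if c1 <= c2:
        raise ValueError("removal counts must decline (c1 > c2)")
    p_hat = 1.0 - c2 / c1
    n_hat = c1 ** 2 / (c1 - c2)
    return n_hat, p_hat

# Hypothetical example: 30 fish removed on pass one, 12 on pass two
n_hat, p_hat = two_pass_removal_estimate(30, 12)
print(n_hat, p_hat)  # 50.0 0.6
```

The hierarchical models in the paper replace these independent per-site estimates with shared Poisson/negative-binomial and beta distributions, which is what yields the reported gain in precision.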

  3. Luminescent Li-based metal-organic framework tailored for the selective detection of explosive nitroaromatic compounds: direct observation of interaction sites.

    PubMed

    Kim, Tae Kyung; Lee, Jae Hwa; Moon, Dohyun; Moon, Hoi Ri

    2013-01-18

    A luminescent lithium metal-organic framework (MOF) is constructed from the solvothermal reaction of Li(+) and a well-designed organic ligand, bis(4-carboxyphenyl)-N-methylamine (H(2)CPMA). A Li-based MOF can detect an explosive aromatic compound containing nitro groups as an explosophore, by showing a dramatic color change with concurrent luminescence quenching in the solid state. The detection sites are proven directly through single-crystal-to-single-crystal transformations, which show strong interactions between the aromatic rings of the electron-rich CPMA(2-) molecules and the electron-deficient nitrobenzene.

  4. Replicating human expertise of mechanical ventilation waveform analysis in detecting patient-ventilator cycling asynchrony using machine learning.

    PubMed

    Gholami, Behnood; Phan, Timothy S; Haddad, Wassim M; Cason, Andrew; Mullis, Jerry; Price, Levi; Bailey, James M

    2018-06-01

Acute respiratory failure is one of the most common problems encountered in intensive care units (ICU), and mechanical ventilation is the mainstay of supportive therapy for such patients. A mismatch between ventilator delivery and patient demand is referred to as patient-ventilator asynchrony (PVA). An important hurdle in addressing PVA is the lack of a reliable framework for continuously and automatically monitoring the patient and detecting various types of PVA. The problem of replicating human expertise of waveform analysis for detecting cycling asynchrony (i.e., delayed termination, premature termination, or none) was investigated in a pilot study involving 11 patients in the ICU under invasive mechanical ventilation. A machine learning framework is used to detect cycling asynchrony based on waveform analysis. A panel of five experts with experience in PVA evaluated a total of 1377 breath cycles from 11 mechanically ventilated critical care patients. The majority vote was used to label each breath cycle according to cycling asynchrony type. The proposed framework accurately detected the presence or absence of cycling asynchrony with sensitivity (specificity) of 89% (99%), 94% (98%), and 97% (93%) for delayed termination, premature termination, and no cycling asynchrony, respectively. The system showed strong agreement with human experts as reflected by the kappa coefficients of 0.90, 0.91, and 0.90 for delayed termination, premature termination, and no cycling asynchrony, respectively. The pilot study establishes the feasibility of using a machine learning framework to provide waveform analysis equivalent to that of an expert human. Copyright © 2018 Elsevier Ltd. All rights reserved.
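The kappa coefficients quoted above measure chance-corrected agreement between the classifier and the expert majority vote. A minimal sketch of Cohen's kappa for two label lists (illustrative only, not the authors' code):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance from each rater's marginal label frequencies.
    Inputs are equal-length lists of labels."""
    n = len(rater_a)
    classes = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                for c in classes)
    return (p_obs - p_exp) / (1 - p_exp)
```

Perfect agreement yields kappa = 1, while agreement at the chance level yields kappa = 0; values around 0.90, as reported, indicate near-perfect agreement on common interpretation scales.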

  5. Using areas of known occupancy to identify sources of variation in detection probability of raptors: taking time lowers replication effort for surveys.

    PubMed

    Murn, Campbell; Holloway, Graham J

    2016-10-01

Species occurring at low density can be difficult to detect and, if not properly accounted for, imperfect detection will lead to inaccurate estimates of occupancy. Understanding sources of variation in detection probability and how they can be managed is a key part of monitoring. We used sightings data of a low-density and elusive raptor (white-headed vulture Trigonoceps occipitalis ) in areas of known occupancy (breeding territories) in a likelihood-based modelling approach to calculate detection probability and the factors affecting it. Because occupancy was known a priori to be 100%, we fixed the model occupancy parameter to 1.0 and focused on identifying sources of variation in detection probability. Using detection histories from 359 territory visits, we assessed nine covariates in 29 candidate models. The model with the highest support indicated that observer speed during a survey, combined with temporal covariates such as time of year and length of time within a territory, had the highest influence on the detection probability. Averaged detection probability was 0.207 (s.e. 0.033) and based on this the mean number of visits required to determine within 95% confidence that white-headed vultures are absent from a breeding area is 13 (95% CI: 9-20). Topographical and habitat covariates contributed little to the best models and had little effect on detection probability. We highlight that the low detection probabilities of some species mean that emphasizing habitat covariates could lead to spurious results in occupancy models that do not also incorporate temporal components. While variation in detection probability is complex and influenced by effects at both temporal and spatial scales, temporal covariates can and should be controlled as part of robust survey methods. Our results emphasize the importance of accounting for detection probability in occupancy studies, particularly during presence/absence studies for species such as raptors that are widespread and occur at low densities.
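The reported 13 visits follows from requiring the cumulative detection probability 1 - (1 - p)^n to reach 95% with p = 0.207. A small sketch (function name is ours):

```python
import math

def visits_for_absence(p_detect, confidence=0.95):
    """Minimum number of independent visits n such that the chance of
    at least one detection, 1 - (1 - p)^n, reaches `confidence`."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_detect))

print(visits_for_absence(0.207))  # 13, matching the abstract
```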

  6. Skeletal Strength and Skeletogenetic Mechanisms Over Phanerozoic Time

    NASA Astrophysics Data System (ADS)

    Constantz, B. R.

    2004-12-01

Mineralized skeletons have a remarkable range of mechanical properties with respect to strength and durability. Measurements of skeletal mechanical properties show that taxonomic groups with relatively simple, 'physiochemically-dominated' modes of mineralization possess skeletal strengths and durabilities that are among the lowest of any known mineralized skeletons. Organisms with relatively sophisticated, 'biologically-dominated' modes of mineralization have mechanical properties among the highest values known for any materials. These extraordinarily strong and durable skeletal materials are found in mollusks, echinoderms, vertebrates, and arthropods, which are groups with primarily mobile ecological habits. These skeletons are frequently lightweight, non-massive skeletons with little phenotypic variation. By contrast, dominant reef framework builders and reef sediment formers, with physiochemically-dominated modes of mineralization, have non-mobile ecological habits and construct massive, phenotypically plastic skeletons possessing extremely poor mechanical properties. Endolithic organisms that further degrade the mechanical properties of the mineralized skeletons of reef builders frequently ravage their massive skeletons. As a result, the skeletons of these groups commonly fragment, and play a central role in reef establishment and maintenance, as they are incorporated in reefal, wave-resistant carbonate buildups. Scleractinian corals have a physiochemically-dominated mode of mineralization and are the dominant modern reef framework builders. Mechanical properties of modern aragonitic scleractinian coral skeletons, tested alive, demonstrate skeletal strengths that are orders of magnitude lower than those seen in mollusks, echinoderms, vertebrates, and arthropods. Rudist bivalves, the dominant reef framework-building group of the Cretaceous, show prolific, massive, highly variable, calcitic skeletal elements with structures similar to some reef-forming modern, non-mobile mollusks and to the skeletons of other organisms with physiochemically-dominated modes of mineralization. Many aspects of the ecological habits of reef-framework building scleractinians and rudists are similar, including relatively high skeletal growth rates, which produce massive skeletons and wave-resistant structures with entrapped bioclastic sediments. The principal adaptive role of mineralization in reef framework building groups appears to be the rapid production of massive, brittle, wave-resistant mineralized skeletons. The physiochemically-dominated mode of mineralization of these reef framework builders appears to have made them susceptible to secular variations in Phanerozoic seawater during 'calcite' and 'aragonite' seas, favoring scleractinians in aragonite seas and rudists during the Cretaceous calcite episode. By contrast, most mobile mollusks, echinoderms, vertebrates, and arthropods appear relatively unaffected by secular variations in seawater chemistry over the Phanerozoic.

  7. A tunable azine covalent organic framework platform for visible light-induced hydrogen generation

    PubMed Central

    Vyas, Vijay S.; Haase, Frederik; Stegbauer, Linus; Savasci, Gökcen; Podjaski, Filip; Ochsenfeld, Christian; Lotsch, Bettina V.

    2015-01-01

    Hydrogen evolution from photocatalytic reduction of water holds promise as a sustainable source of carbon-free energy. Covalent organic frameworks (COFs) present an interesting new class of photoactive materials, which combine three key features relevant to the photocatalytic process, namely crystallinity, porosity and tunability. Here we synthesize a series of water- and photostable 2D azine-linked COFs from hydrazine and triphenylarene aldehydes with varying number of nitrogen atoms. The electronic and steric variations in the precursors are transferred to the resulting frameworks, thus leading to a progressively enhanced light-induced hydrogen evolution with increasing nitrogen content in the frameworks. Our results demonstrate that by the rational design of COFs on a molecular level, it is possible to precisely adjust their structural and optoelectronic properties, thus resulting in enhanced photocatalytic activities. This is expected to spur further interest in these photofunctional frameworks where rational supramolecular engineering may lead to new material applications. PMID:26419805

  8. A SVM framework for fault detection of the braking system in a high speed train

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Li, Yan-Fu; Zio, Enrico

    2017-03-01

In April 2015, the number of operating High Speed Trains (HSTs) in the world had reached 3603. An efficient, effective and very reliable braking system is evidently critical for trains running at speeds around 300 km/h. Failure of a highly reliable braking system is a rare event and, consequently, informative recorded data on fault conditions are scarce. This renders fault detection a classification problem with highly unbalanced data. In this paper, a Support Vector Machine (SVM) framework, including feature selection, feature vector selection, model construction and decision boundary optimization, is proposed for tackling this problem. Feature vector selection can largely reduce the data size and, thus, the computational burden. The constructed model is a modified version of the least-squares SVM, in which a higher cost is assigned to errors in classifying faulty conditions than to errors in classifying normal conditions. The proposed framework is successfully validated on a number of public unbalanced datasets. Then, it is applied to the fault detection of braking systems in HSTs: in comparison with several SVM approaches for unbalanced datasets, the proposed framework gives better results.
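A cost-asymmetric least-squares SVM reduces to a single linear system in which errors on the scarce faulty class carry a larger penalty. A generic linear-kernel sketch under our own parameter names (C, minority_weight); it illustrates the idea, not the authors' implementation:

```python
import numpy as np

def weighted_lssvm_fit(X, y, C=10.0, minority_weight=5.0):
    """Least-squares SVM with asymmetric error costs: errors on the
    minority class (y = +1, e.g. faulty conditions) are penalised
    minority_weight times more than errors on the majority class."""
    gamma = np.where(y == 1, C * minority_weight, C)
    K = X @ X.T                                  # linear kernel
    Omega = (y[:, None] * y[None, :]) * K
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y                                 # equality-constraint row
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.diag(1.0 / gamma)     # per-sample regularisation
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]                       # alpha, bias

def lssvm_predict(X_train, y_train, alpha, b, X_new):
    # Decision function: sum_j alpha_j y_j K(x, x_j) + b
    return np.sign(X_new @ X_train.T @ (alpha * y_train) + b)
```

Raising minority_weight shifts the decision boundary away from the faulty class, trading a few extra false alarms for fewer missed faults.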

  9. Spatio-temporal variation in click production rates of beaked whales: Implications for passive acoustic density estimation.

    PubMed

    Warren, Victoria E; Marques, Tiago A; Harris, Danielle; Thomas, Len; Tyack, Peter L; Aguilar de Soto, Natacha; Hickmott, Leigh S; Johnson, Mark P

    2017-03-01

    Passive acoustic monitoring has become an increasingly prevalent tool for estimating density of marine mammals, such as beaked whales, which vocalize often but are difficult to survey visually. Counts of acoustic cues (e.g., vocalizations), when corrected for detection probability, can be translated into animal density estimates by applying an individual cue production rate multiplier. It is essential to understand variation in these rates to avoid biased estimates. The most direct way to measure cue production rate is with animal-mounted acoustic recorders. This study utilized data from sound recording tags deployed on Blainville's (Mesoplodon densirostris, 19 deployments) and Cuvier's (Ziphius cavirostris, 16 deployments) beaked whales, in two locations per species, to explore spatial and temporal variation in click production rates. No spatial or temporal variation was detected within the average click production rate of Blainville's beaked whales when calculated over dive cycles (including silent periods between dives); however, spatial variation was detected when averaged only over vocal periods. Cuvier's beaked whales exhibited significant spatial and temporal variation in click production rates within vocal periods and when silent periods were included. This evidence of variation emphasizes the need to utilize appropriate cue production rates when estimating density from passive acoustic data.
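The density estimate described above is, schematically, a corrected cue count divided by monitored area-time, cue detection probability, and the per-animal cue production rate. A sketch with hypothetical variable names and a deliberately simplified form:

```python
def cue_density(n_cues, area_km2, t_hours, p_detect, cue_rate_per_hour,
                false_pos_rate=0.0):
    """Animals per km^2 from passive acoustic cue counts: cues corrected
    for false positives, divided by monitored area-time, the cue
    detection probability, and the per-animal cue production rate."""
    corrected = n_cues * (1.0 - false_pos_rate)
    return corrected / (area_km2 * t_hours * p_detect * cue_rate_per_hour)
```

The abstract's point is that cue_rate_per_hour is not a universal constant: using a rate measured in the wrong place, season, or averaging window (vocal periods only versus whole dive cycles) biases the density estimate directly.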

  10. Using the ECD Framework to Support Evidentiary Reasoning in the Context of a Simulation Study for Detecting Learner Differences in Epistemic Games

    ERIC Educational Resources Information Center

    Sweet, Shauna J.; Rupp, Andre A.

    2012-01-01

    The "evidence-centered design" (ECD) framework is a powerful tool that supports careful and critical thinking about the identification and accumulation of evidence in assessment contexts. In this paper, we demonstrate how the ECD framework provides critical support for designing simulation studies to investigate statistical methods…

  11. Models of Recognition, Repetition Priming, and Fluency : Exploring a New Framework

    ERIC Educational Resources Information Center

    Berry, Christopher J.; Shanks, David R.; Speekenbrink, Maarten; Henson, Richard N. A.

    2012-01-01

    We present a new modeling framework for recognition memory and repetition priming based on signal detection theory. We use this framework to specify and test the predictions of 4 models: (a) a single-system (SS) model, in which one continuous memory signal drives recognition and priming; (b) a multiple-systems-1 (MS1) model, in which completely…

  12. Early warning by near-real time disturbance monitoring (Invited)

    NASA Astrophysics Data System (ADS)

    Verbesselt, J.; Zeileis, A.; Herold, M.

    2013-12-01

Near real-time monitoring of ecosystem disturbances is critical for rapidly assessing and addressing impacts on carbon dynamics, biodiversity, and socio-ecological processes. Satellite remote sensing enables cost-effective and accurate monitoring at frequent time steps over large areas. Yet, generic methods to detect disturbances within newly captured satellite images are lacking. We propose a multi-purpose time-series-based disturbance detection approach that identifies and models stable historical variation to enable change detection within newly acquired data. Satellite image time series of vegetation greenness provide a global record of terrestrial vegetation productivity over the past decades. Here, we assess and demonstrate the method by applying it to (1) real-world satellite greenness image time series between February 2000 and July 2011 covering Somalia, to detect drought-related vegetation disturbances, and (2) Landsat image time series, to detect forest disturbances. First, results illustrate that disturbances are successfully detected in near real-time while being robust to seasonality and noise. Second, major drought-related disturbances corresponding to the most drought-stressed regions of Somalia are detected from mid-2010 onwards. Third, the method can be applied to Landsat image time series, which have a lower temporal data density. Furthermore, the method can analyze in-situ or satellite time series of biophysical indicators from local to global scales, since it is fast, does not depend on thresholds, and does not require time-series gap filling. While the data and methods used are appropriate for proof-of-concept development of global-scale disturbance monitoring, specific applications (e.g., drought or deforestation monitoring) mandate integration within an operational monitoring framework. Furthermore, the real-time monitoring method is implemented in an open-source environment and is freely available in the BFAST package for the R software. Information illustrating how to apply the method to satellite image time series is available at http://bfast.R-Forge.R-project.org/ and in the example section of the bfastmonitor() function within the BFAST package.
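The actual MOSUM-based test lives in the BFAST R package linked above; as a simplified Python analogue of the idea, one can fit a harmonic-plus-trend model to the stable history and flag newly acquired observations with extreme residuals:

```python
import numpy as np

def monitor_disturbance(history, new_obs, period=23, k=3.0):
    """Simplified analogue of BFAST-style monitoring (not the actual
    MOSUM test): fit trend + one harmonic to the stable history by
    least squares, then flag new observations whose residuals exceed
    k standard deviations of the history residuals. period=23 mimics
    a 16-day compositing interval (23 observations per year)."""
    t = np.arange(len(history))
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t / period),
                         np.cos(2 * np.pi * t / period)])
    beta, *_ = np.linalg.lstsq(X, history, rcond=None)
    sigma = np.std(history - X @ beta)
    t_new = np.arange(len(history), len(history) + len(new_obs))
    X_new = np.column_stack([np.ones_like(t_new), t_new,
                             np.sin(2 * np.pi * t_new / period),
                             np.cos(2 * np.pi * t_new / period)])
    resid = new_obs - X_new @ beta
    return np.abs(resid) > k * sigma
```

The MOSUM boundary used by bfastmonitor() controls the false-alarm rate over the whole monitoring period, which a pointwise k-sigma rule does not; this sketch only conveys the stable-history-then-monitor structure.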

  13. Cone-beam CT image contrast and attenuation-map linearity improvement (CALI) for brain stereotactic radiosurgery procedures

    NASA Astrophysics Data System (ADS)

    Hashemi, Sayed Masoud; Lee, Young; Eriksson, Markus; Nordström, Håkan; Mainprize, James; Grouza, Vladimir; Huynh, Christopher; Sahgal, Arjun; Song, William Y.; Ruschin, Mark

    2017-03-01

A Contrast and Attenuation-map (CT-number) Linearity Improvement (CALI) framework is proposed for cone-beam CT (CBCT) images used for brain stereotactic radiosurgery (SRS). The proposed framework is used together with our high-spatial-resolution iterative reconstruction algorithm and is tailored for the Leksell Gamma Knife ICON (Elekta, Stockholm, Sweden). The incorporated CBCT system in the ICON facilitates frameless SRS planning and treatment delivery. The ICON employs a half-cone geometry to accommodate the existing treatment couch. This geometry increases the amount of artifacts and, together with other physical imperfections, causes image inhomogeneity and contrast reduction. Our proposed framework includes a preprocessing step, involving a shading and beam-hardening artifact correction, and a post-processing step to correct the dome/capping artifact caused by the spatial variations in x-ray energy generated by the bowtie filter. Our shading correction algorithm relies solely on the acquired projection images (i.e. no prior information is required) and utilizes filtered-back-projection (FBP) reconstructed images to generate a segmented bone and soft-tissue map. Ideal projections are estimated from the segmented images, and a smoothed version of the difference between the ideal and measured projections is used in the correction. The proposed beam-hardening and dome artifact corrections are segmentation-free. The CALI was tested on CatPhan images as well as patient images acquired on the ICON system. The resulting clinical brain images show substantial improvements in soft-tissue contrast visibility, revealing structures such as ventricles and lesions that were otherwise undetectable in FBP-reconstructed images. The linearity of the reconstructed attenuation map was also improved, resulting in more accurate CT-numbers.

  14. Adhesion mechanisms at the interface between Y-TZP and veneering ceramic with and without modifier.

    PubMed

    Monaco, Carlo; Tucci, Antonella; Esposito, Leonardo; Scotti, Roberto

    2014-11-01

This study investigated the mechanism of action at the interface between a commercially available Y-TZP and its veneering ceramic after final firing. Particular attention was paid, from a microstructural point of view, to evaluating the effects of different surface treatments carried out on the zirconia. In total, 32 specimens of presintered zirconia Y-TZP (LavaFrame, 3M ESPE, Germany) were cut with a low-speed diamond blade. The specimens were divided into two major groups, for testing after fracture or after mirror finishing, and were sintered following the manufacturer's instructions. Each major group was then randomly divided into four subgroups, according to whether the dedicated framework modifier was used, with or without a preliminary silica coating (CoJet, 3M ESPE). A suitable veneering ceramic was used for each group (Lava Ceram Overlay Porcelain, 3M ESPE). A detailed microstructural study of the zirconia-veneering ceramic interfaces was performed using a scanning electron microscope equipped with an energy-dispersive X-ray spectrometer to evaluate chemical variation at the interfaces. When the framework modifier was not applied on the Y-TZP surface, microdetachments, porosities, and openings in the ceramic layer were observed at the interlayers. A degree of diffusion of different elements through the interfaces from both the zirconia and veneering layers was detected. Application of the framework modifier can increase the wettability of the zirconia surfaces, allowing continuous contact with the veneering layer. The microanalysis performed showed the presence of a reaction area at the interface between the different materials. The increased wettability of the zirconia surface could improve adhesion at the interface with the veneering ceramic and reduce clinical failures such as chipping or delamination. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Diffusion Filters for Variational Data Assimilation of Sea Surface Temperature in an Intermediate Climate Model

    DTIC Science & Technology

    2015-01-01

over data-dense regions. After that, a perfect twin data assimilation experiment framework is designed to study the effect of the GDF on the state estimation based on an intermediate coupled model. In this framework, the assimilation model...observation.

  16. DoD Lead System Integrator (LSI) Transformation - Creating a Model Based Acquisition Framework (MBAF)

    DTIC Science & Technology

    2014-04-30

cost to acquire systems, as design maturity could be verified incrementally as the system was developed vice waiting for a specific large "big bang"... Framework (MBAF) be applied to simulate or optimize process variations on programs? LSI Roles and Responsibilities: A review of the roles and... the model/process optimization process. It is the current intent that NAVAIR will use the model to run simulations on process changes in an attempt to...

  17. Application of HF Doppler measurements for the investigation of internal atmospheric waves in the ionosphere

    NASA Astrophysics Data System (ADS)

    Petrova, I. R.; Bochkarev, V. V.; Latipov, R. R.

    2009-09-01

We present results of the spectral analysis of data series of Doppler frequency shifts of signals reflected from the ionosphere, using experimental data received at Kazan University, Russia. Spectra of variations with periods from 1 min to 60 days have been calculated and analyzed for different scales of periods. The power spectral density for spring and winter differs by a factor of 3-4. Local maxima of variation amplitude are detected, which are statistically significant. The periods of these amplitude increases range from 6 to 12 min for winter, and from 24 to 48 min for autumn. Properties of spectra for variations with periods of 1-72 h have been analyzed. The maximum of variation intensity for all seasons and frequencies corresponds to the period of 24 h. Spectra of variations with periods from 3 to 60 days have been calculated. The periods of the power spectral density maxima were detected with the MUSIC method, which provides high spectral resolution; the detected periods correspond to planetary wave periods. Analysis of spectra for days with different levels of geomagnetic activity shows that the intensity of variations is higher for days with a high level of geomagnetic activity.

  18. Building Extraction from Remote Sensing Data Using Fully Convolutional Networks

    NASA Astrophysics Data System (ADS)

    Bittner, K.; Cui, S.; Reinartz, P.

    2017-05-01

Building detection and footprint extraction are highly demanded for many remote sensing applications. Though most previous works have shown promising results, the automatic extraction of building footprints still remains a nontrivial topic, especially in complex urban areas. Recently developed extensions of the CNN framework made it possible to perform dense pixel-wise classification of input images. Based on these abilities we propose a methodology which automatically generates a full-resolution binary building mask out of a Digital Surface Model (DSM) using a Fully Convolutional Network (FCN) architecture. The advantage of using depth information is that it provides geometrical silhouettes and allows a better separation of buildings from the background, as well as invariance to illumination and color variations. The proposed framework has two main steps. Firstly, the FCN is trained on a large set of patches consisting of normalized DSM (nDSM) inputs and available ground-truth building masks as target outputs. Secondly, the generated predictions from the FCN are used as unary terms for a fully connected Conditional Random Field (FCRF), which enables us to create the final binary building mask. A series of experiments demonstrate that our methodology is able to extract accurate building footprints which closely match the buildings' original shapes. The quantitative and qualitative analyses show significant improvements of the results in contrast to the multi-layer fully connected network from our previous work.

  19. Statistical analysis of hydrological response in urbanising catchments based on adaptive sampling using inter-amount times

    NASA Astrophysics Data System (ADS)

    ten Veldhuis, Marie-Claire; Schleiss, Marc

    2017-04-01

    Urban catchments are typically characterised by a more flashy nature of the hydrological response compared to natural catchments. Predicting flow changes associated with urbanisation is not straightforward, as they are influenced by interactions between impervious cover, basin size, drainage connectivity and stormwater management infrastructure. In this study, we present an alternative approach to statistical analysis of hydrological response variability and basin flashiness, based on the distribution of inter-amount times. We analyse inter-amount time distributions of high-resolution streamflow time series for 17 (semi-)urbanised basins in North Carolina, USA, ranging from 13 to 238 km2 in size. We show that in the inter-amount-time framework, sampling frequency is tuned to the local variability of the flow pattern, resulting in a different representation and weighting of high and low flow periods in the statistical distribution. This leads to important differences in the way the distribution quantiles, mean, coefficient of variation and skewness vary across scales and results in lower mean intermittency and improved scaling. Moreover, we show that inter-amount-time distributions can be used to detect regulation effects on flow patterns, identify critical sampling scales and characterise flashiness of hydrological response. The possibility to use both the classical approach and the inter-amount-time framework to identify minimum observable scales and analyse flow data opens up interesting areas for future research.
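
    The inter-amount-time idea can be sketched directly: instead of sampling flow at fixed time steps, record the time needed to accumulate each successive fixed amount of volume. A minimal illustration (function name and synthetic records are assumptions, not from the paper):

```python
def inter_amount_times(flow, dt, amount):
    # Elapsed time to accumulate each successive fixed `amount` of volume:
    # high flows yield short inter-amount times, low flows long ones, so the
    # sampling frequency adapts to the local variability of the flow.
    times, acc, elapsed = [], 0.0, 0.0
    for q in flow:
        acc += q * dt
        elapsed += dt
        while acc >= amount:
            times.append(elapsed)
            acc -= amount
            elapsed = 0.0
    return times

steady = inter_amount_times([1.0] * 10, dt=1.0, amount=1.0)   # ten equal times
flashy = inter_amount_times([0.2] * 5 + [9.0] + [0.0] * 4,
                            dt=1.0, amount=1.0)               # same total volume
```

    The steady record yields identical inter-amount times, while the flashy one mixes one long waiting time with a burst of near-zero times; the spread of this distribution is the flashiness signal the framework exploits.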

  20. Toward Failure Modeling In Complex Dynamic Systems: Impact of Design and Manufacturing Variations

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; McAdams, Daniel A.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    When designing vehicle vibration monitoring systems for aerospace devices, it is common to use well-established models of vibration features to determine whether failures or defects exist. Most of the algorithms used for failure detection rely on these models to detect significant changes during a flight environment. In actual practice, however, most vehicle vibration monitoring systems are corrupted by high rates of false alarms and missed detections. Research conducted at the NASA Ames Research Center has determined that a major reason for the high rates of false alarms and missed detections is the numerous sources of statistical variations that are not taken into account in the modeling assumptions. In this paper, we address one such source of variations, namely, those caused during the design and manufacturing of rotating machinery components that make up aerospace systems. We present a novel way of modeling the vibration response by including design variations via probabilistic methods. The results demonstrate initial feasibility of the method, showing great promise in developing a general methodology for designing more accurate aerospace vehicle vibration monitoring systems.
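
    One way to fold design and manufacturing variation into a vibration-feature model is to propagate component tolerances through the response by Monte Carlo sampling. This is a hedged illustration; the nominal stiffness, mass, and tolerance values are assumptions, not from the paper.

```python
import math
import random

def resonance_samples(n=5000, k_nom=1.0e4, m_nom=2.0, tol=0.05, seed=7):
    # Stiffness and mass vary within a ±tol manufacturing band, so the
    # predicted resonance frequency becomes a distribution rather than the
    # single nominal value an alarm threshold would otherwise assume.
    rng = random.Random(seed)
    freqs = []
    for _ in range(n):
        k = k_nom * rng.uniform(1 - tol, 1 + tol)
        m = m_nom * rng.uniform(1 - tol, 1 + tol)
        freqs.append(math.sqrt(k / m) / (2 * math.pi))
    return freqs

freqs = resonance_samples()
nominal = math.sqrt(1.0e4 / 2.0) / (2 * math.pi)
spread = (max(freqs) - min(freqs)) / nominal  # roughly a 10% band around nominal
```

    An alarm threshold set from the nominal frequency alone would flag many healthy units in the tails of this distribution, which is the false-alarm mechanism the abstract describes.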

  1. Multimedia content management in MPEG-21 framework

    NASA Astrophysics Data System (ADS)

    Smith, John R.

    2002-07-01

    MPEG-21 is an emerging standard from MPEG that specifies a framework for transactions of multimedia content. MPEG-21 defines the fundamental concept known as a digital item, which is the unit of transaction in the multimedia framework. A digital item can be used to package content such as a digital photograph, a video clip or movie, a musical recording with graphics and liner notes, a photo album, and so on. The packaging of the media resources, corresponding identifiers, and associated metadata is provided in the declaration of the digital item. The digital item declaration allows for more effective transaction, distribution, and management of multimedia content and its corresponding metadata, rights expressions, and variations of media resources. In this paper, we describe various challenges for multimedia content management in the MPEG-21 framework.

  2. Zeolitic imidazolate framework-coated acoustic sensors for room temperature detection of carbon dioxide and methane

    DOE PAGES

    Devkota, Jagannath; Kim, Ki-Joong; Ohodnicki, Paul R.; ...

    2018-01-01

    The integration of nanoporous materials such as metal organic frameworks (MOFs) with sensitive transducers can result in robust sensing platforms for monitoring gases and chemical vapors for a range of applications.

  3. A Security Framework for Online Distance Learning and Training.

    ERIC Educational Resources Information Center

    Furnell, S. M.; Onions, P. D.; Bleimann, U.; Gojny, U.; Knahl, M.; Roder, H. F.; Sanders, P. W.

    1998-01-01

    Presents a generic reference model for online distance learning and discusses security issues for each stage (enrollment, study, completion, termination, suspension). Discusses a security framework (authentication and accountability, access control, intrusion detection, network communications, nonrepudiation, learning resources provider…

  4. Framework for performance evaluation of face, text, and vehicle detection and tracking in video: data, metrics, and protocol.

    PubMed

    Kasturi, Rangachar; Goldgof, Dmitry; Soundararajan, Padmanabhan; Manohar, Vasant; Garofolo, John; Bowers, Rachel; Boonstra, Matthew; Korzhova, Valentina; Zhang, Jing

    2009-02-01

    Common benchmark data sets, standardized performance metrics, and baseline algorithms have demonstrated considerable impact on research and development in a variety of application domains. These resources provide both consumers and developers of technology with a common framework to objectively compare the performance of different algorithms and algorithmic improvements. In this paper, we present such a framework for evaluating object detection and tracking in video: specifically for face, text, and vehicle objects. This framework includes the source video data, ground-truth annotations (along with guidelines for annotation), performance metrics, evaluation protocols, and tools including scoring software and baseline algorithms. For each detection and tracking task and supported domain, we developed a 50-clip training set and a 50-clip test set. Each data clip is approximately 2.5 minutes long and has been completely spatially/temporally annotated at the I-frame level. Each task/domain, therefore, has an associated annotated corpus of approximately 450,000 frames. The scope of such annotation is unprecedented and was designed to begin to support the necessary quantities of data for robust machine learning approaches, as well as a statistically significant comparison of the performance of algorithms. The goal of this work was to systematically address the challenges of object detection and tracking through a common evaluation framework that permits a meaningful objective comparison of techniques, provides the research community with sufficient data for the exploration of automatic modeling techniques, encourages the incorporation of objective evaluation into the development process, and contributes useful lasting resources of a scale and magnitude that will prove to be extremely useful to the computer vision research community for years to come.

  5. Symbol Synchronization for Diffusion-Based Molecular Communications.

    PubMed

    Jamali, Vahid; Ahmadzadeh, Arman; Schober, Robert

    2017-12-01

    Symbol synchronization refers to the estimation of the start of a symbol interval and is needed for reliable detection. In this paper, we develop several symbol synchronization schemes for molecular communication (MC) systems where we consider some practical challenges, which have not been addressed in the literature yet. In particular, we take into account that in MC systems, the transmitter may not be equipped with an internal clock and may not be able to emit molecules with a fixed release frequency. Such restrictions hold for practical nanotransmitters, e.g., modified cells, where the lengths of the symbol intervals may vary due to the inherent randomness in the availability of food and energy for molecule generation, the process for molecule production, and the release process. To address this issue, we develop two synchronization-detection frameworks which both employ two types of molecule. In the first framework, one type of molecule is used for symbol synchronization and the other one is used for data detection, whereas in the second framework, both types of molecule are used for joint symbol synchronization and data detection. For both frameworks, we first derive the optimal maximum likelihood (ML) symbol synchronization schemes as performance upper bounds. Since ML synchronization entails high complexity, for each framework, we also propose three low-complexity suboptimal schemes, namely a linear filter-based scheme, a peak observation-based scheme, and a threshold-trigger scheme, which are suitable for MC systems with limited computational capabilities. Furthermore, we study the relative complexity and the constraints associated with the proposed schemes and the impact of the insertion and deletion errors that arise due to imperfect synchronization. Our simulation results reveal the effectiveness of the proposed synchronization schemes and suggest that the end-to-end performance of MC systems significantly depends on the accuracy of the symbol synchronization.
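
    Of the suboptimal schemes listed, the threshold-trigger idea is the simplest to sketch. Under an assumed discrete-time receiver that counts synchronization molecules per sample (the counts and threshold below are illustrative, not the paper's model), a new symbol interval is declared each time the count crosses the threshold from below:

```python
def threshold_trigger_sync(counts, threshold):
    # Declare a symbol start at each upward threshold crossing; stay "disarmed"
    # until the count falls back below the threshold, so one release burst
    # triggers exactly one detected symbol boundary.
    starts, armed = [], True
    for i, c in enumerate(counts):
        if armed and c >= threshold:
            starts.append(i)
            armed = False
        elif c < threshold:
            armed = True
    return starts

# Two release bursts of synchronization molecules in a noise-free count series.
counts = [0, 1, 8, 9, 3, 0, 0, 7, 8, 2, 0]
starts = threshold_trigger_sync(counts, threshold=5)  # → [2, 7]
```

    The appeal is that this needs only a comparator, matching the limited computational capabilities the abstract assumes; its weakness is sensitivity to the threshold choice under noisy counts.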

  6. A model for seasonal changes in GPS positions and seismic wave speeds due to thermoelastic and hydrologic variations

    USGS Publications Warehouse

    Tsai, V.C.

    2011-01-01

    It is known that GPS time series contain a seasonal variation that is not due to tectonic motions, and it has recently been shown that crustal seismic velocities may also vary seasonally. In order to explain these changes, a number of hypotheses have been given, among which thermoelastic and hydrology-induced stresses and strains are leading candidates. Unfortunately, though, since a general framework does not exist for understanding such seasonal variations, it is currently not possible to quickly evaluate the plausibility of these hypotheses. To fill this gap in the literature, I generalize a two-dimensional thermoelastic strain model to provide an analytic solution for the displacements and wave speed changes due to either thermoelastic stresses or hydrologic loading, which consists of poroelastic stresses and purely elastic stresses. The thermoelastic model assumes a periodic surface temperature, and the hydrologic models similarly assume a periodic near-surface water load. Since all three models are two-dimensional and periodic, they are expected to only approximate any realistic scenario; but the models nonetheless provide a quantitative framework for estimating the effects of thermoelastic and hydrologic variations. Quantitative comparison between the models and observations is further complicated by the large uncertainty in some of the relevant parameters. Despite this uncertainty, though, I find that maximum realistic thermoelastic effects are unlikely to explain a large fraction of the observed annual variation in a typical GPS displacement time series or of the observed annual variations in seismic wave speeds in southern California. Hydrologic loading, on the other hand, may be able to explain a larger fraction of both the annual variations in displacements and seismic wave speeds. Neither model is likely to explain all of the seismic wave speed variations inferred from observations. However, more definitive conclusions cannot be made until the model parameters are better constrained. Copyright © 2011 by the American Geophysical Union.
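
    The textbook building block behind such thermoelastic models is the one-dimensional response of a half-space to a periodic surface temperature; the paper's own solution is two-dimensional, so this is context rather than its exact model:

```latex
T(z,t) = T_0\, e^{-z/\delta} \cos\!\left(\omega t - \frac{z}{\delta}\right),
\qquad \delta = \sqrt{\frac{2\kappa}{\omega}}
```

    where $\kappa$ is the thermal diffusivity and $\delta$ the thermal skin depth. For annual forcing ($\omega = 2\pi\,\mathrm{yr}^{-1}$) and $\kappa \approx 10^{-6}\,\mathrm{m^2\,s^{-1}}$, $\delta \approx 3$ m, so the thermoelastic strain is driven from a thin near-surface layer.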

  7. Fitness consequences of maternal and embryonic responses to environmental variation: using reptiles as models for studies of developmental plasticity.

    PubMed

    Warner, Daniel A

    2014-11-01

    Environmental factors strongly influence phenotypic variation within populations. The environment contributes to this variation in two ways: (1) by acting as a determinant of phenotypic variation (i.e., plastic responses) and (2) as an agent of selection that "chooses" among existing phenotypes. Understanding how these two environmental forces contribute to phenotypic variation is a major goal in the field of evolutionary biology and a primary objective of my research program. The objective of this article is to provide a framework to guide studies of environmental sources of phenotypic variation (specifically, developmental plasticity and maternal effects, and their adaptive significance). Two case studies from my research on reptiles are used to illustrate the general approaches I have taken to address these conceptual topics. Some key points for advancing our understanding of environmental influences on phenotypic variation include (1) merging laboratory-based research that identifies specific environmental effects with field studies to validate ecological relevance; (2) using controlled experimental approaches that mimic complex environments found in nature; (3) integrating data across biological fields (e.g., genetics, morphology, physiology, behavior, and ecology) under an evolutionary framework to provide novel insights into the underlying mechanisms that generate phenotypic variation; (4) assessing fitness consequences using measurements of survival and/or reproductive success across ontogeny (from embryos to adults) and under multiple ecologically-meaningful contexts; and (5) quantifying the strength and form of natural selection in multiple populations over multiple periods of time to understand the spatial and temporal consistency of phenotypic selection. Research programs that focus on organisms that are amenable to these approaches will provide the most promise for advancing our understanding of the environmental factors that generate the remarkable phenotypic diversity observed within populations. © The Author 2014. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.

  8. Diel variation in detection and vocalization rates of king (Rallus elegans) and clapper (Rallus crepitans) rails in intracoastal waterways

    USGS Publications Warehouse

    Stiffler, Lydia L.; Anderson, James T.; Welsh, Amy B.; Harding, Sergio R.; Costanzo, Gary R.; Katzner, Todd

    2017-01-01

    Surveys for secretive marsh birds could be improved with refinements to address regional and species-specific variation in detection probabilities and optimal times of day to survey. Diel variation in relation to naïve occupancy, detection rates, and vocalization rates of King (Rallus elegans) and Clapper (R. crepitans) rails was studied in intracoastal waterways in Virginia, USA. Autonomous acoustic devices recorded vocalizations of King and Clapper rails at 75 locations for 48-hr periods within a marsh complex. Naïve King and Clapper rail occupancy did not vary hourly at either the marsh or the study area level. Combined King and Clapper rail detections and vocalizations varied across marshes, decreased as the sampling season progressed, and, for detections, were greatest during low rising tides (P < 0.01). Hourly variation in vocalization and detection rates showed no consistent pattern, but occurred in 7.8% of pairwise comparisons for detections and in 10.5% of pairwise comparisons for vocalizations (P < 0.01). Higher rates of detections and vocalizations occurred during the hours of 00:00–00:59, 05:00–05:59, and 14:00–15:59, and lower rates during the hours of 07:00–09:59. Although statistically significant, these hourly differences showed no pattern, may not be biologically relevant, and are of little use to management. In fact, these findings demonstrate that surveys for King and Clapper rails in Virginia intracoastal waterways may be effectively conducted throughout the day.

  9. Conditional Variational Autoencoder for Prediction and Feature Recovery Applied to Intrusion Detection in IoT.

    PubMed

    Lopez-Martin, Manuel; Carro, Belen; Sanchez-Esguevillas, Antonio; Lloret, Jaime

    2017-08-26

    The purpose of a Network Intrusion Detection System is to detect intrusive, malicious activities or policy violations in a host or host's network. In current networks, such systems are becoming more important as the number and variety of attacks increase along with the volume and sensitivity of the information exchanged. This is of particular interest to Internet of Things networks, where an intrusion detection system will be critical as its economic importance continues to grow, making it the focus of future intrusion attacks. In this work, we propose a new network intrusion detection method that is appropriate for an Internet of Things network. The proposed method is based on a conditional variational autoencoder with a specific architecture that integrates the intrusion labels inside the decoder layers. The proposed method is less complex than other unsupervised methods based on a variational autoencoder and it provides better classification results than other familiar classifiers. More importantly, the method can perform feature reconstruction, that is, it is able to recover missing features from incomplete training datasets. We demonstrate that the reconstruction accuracy is very high, even for categorical features with a high number of distinct values. This work is unique in the network intrusion detection field, presenting the first application of a conditional variational autoencoder and providing the first algorithm to perform feature recovery.

  10. Conditional Variational Autoencoder for Prediction and Feature Recovery Applied to Intrusion Detection in IoT

    PubMed Central

    Carro, Belen; Sanchez-Esguevillas, Antonio

    2017-01-01

    The purpose of a Network Intrusion Detection System is to detect intrusive, malicious activities or policy violations in a host or host’s network. In current networks, such systems are becoming more important as the number and variety of attacks increase along with the volume and sensitivity of the information exchanged. This is of particular interest to Internet of Things networks, where an intrusion detection system will be critical as its economic importance continues to grow, making it the focus of future intrusion attacks. In this work, we propose a new network intrusion detection method that is appropriate for an Internet of Things network. The proposed method is based on a conditional variational autoencoder with a specific architecture that integrates the intrusion labels inside the decoder layers. The proposed method is less complex than other unsupervised methods based on a variational autoencoder and it provides better classification results than other familiar classifiers. More importantly, the method can perform feature reconstruction, that is, it is able to recover missing features from incomplete training datasets. We demonstrate that the reconstruction accuracy is very high, even for categorical features with a high number of distinct values. This work is unique in the network intrusion detection field, presenting the first application of a conditional variational autoencoder and providing the first algorithm to perform feature recovery. PMID:28846608

  11. A Decision Theoretic Approach to Evaluate Radiation Detection Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nobles, Mallory A.; Sego, Landon H.; Cooley, Scott K.

    2013-07-01

    There are a variety of sensor systems deployed at U.S. border crossings and ports of entry that scan for illicit nuclear material. In this work, we develop a framework for comparing the performance of detection algorithms that interpret the output of these scans and determine when secondary screening is needed. We optimize each algorithm to minimize its risk, or expected loss. We measure an algorithm’s risk by considering its performance over a sample, the probability distribution of threat sources, and the consequence of detection errors. While it is common to optimize algorithms by fixing one error rate and minimizing another, our framework allows one to simultaneously consider multiple types of detection errors. Our framework is flexible and easily adapted to many different assumptions regarding the probability of a vehicle containing illicit material, and the relative consequences of false positive and false negative errors. Our methods can therefore inform decision makers of the algorithm family and parameter values which best reduce the threat from illicit nuclear material, given their understanding of the environment at any point in time. To illustrate the applicability of our methods, in this paper, we compare the risk from two families of detection algorithms and discuss the policy implications of our results.
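
    The risk-minimization step can be sketched with a toy score model. The Gaussian alarm scores, prior threat probability, and costs below are all illustrative assumptions, not the report's data: weight each error rate by its prior probability and its consequence, then sweep the screening threshold.

```python
import math

def norm_cdf(x, mu, sigma):
    # Standard normal CDF shifted/scaled to N(mu, sigma^2).
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def expected_loss(threshold, p_threat, cost_fp, cost_fn,
                  benign=(0.0, 1.0), threat=(3.0, 1.0)):
    # Risk of "send to secondary screening if score > threshold": both error
    # types enter simultaneously, instead of fixing one rate and minimizing
    # the other.
    p_fp = 1.0 - norm_cdf(threshold, *benign)  # benign vehicle flagged
    p_fn = norm_cdf(threshold, *threat)        # threat vehicle missed
    return (1.0 - p_threat) * cost_fp * p_fp + p_threat * cost_fn * p_fn

grid = [i / 100.0 for i in range(-200, 600)]
best = min(grid, key=lambda t: expected_loss(t, p_threat=1e-3,
                                             cost_fp=1.0, cost_fn=1e4))
```

    With rare threats but very costly misses, the optimal threshold lands well below the midpoint of the two score distributions, which is exactly the kind of trade-off the framework is built to make explicit.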

  12. Diff-seq: A high throughput sequencing-based mismatch detection assay for DNA variant enrichment and discovery

    PubMed Central

    Karas, Vlad O; Sinnott-Armstrong, Nicholas A; Varghese, Vici; Shafer, Robert W; Greenleaf, William J; Sherlock, Gavin

    2018-01-01

    Much of the within-species genetic variation is in the form of single nucleotide polymorphisms (SNPs), typically detected by whole genome sequencing (WGS) or microarray-based technologies. However, WGS produces mostly uninformative reads that perfectly match the reference, while microarrays require genome-specific reagents. We have developed Diff-seq, a sequencing-based mismatch detection assay for SNP discovery without the requirement for specialized nucleic-acid reagents. Diff-seq leverages the Surveyor endonuclease to cleave mismatched DNA molecules that are generated after cross-annealing of a complex pool of DNA fragments. Sequencing libraries enriched for Surveyor-cleaved molecules result in increased coverage at the variant sites. Diff-seq detected all mismatches present in an initial test substrate, with specific enrichment dependent on the identity and context of the variation. Application to viral sequences resulted in increased observation of variant alleles in a biologically relevant context. Diff-seq has the potential to increase the sensitivity and efficiency of high-throughput sequencing in the detection of variation. PMID:29361139

  13. Influences of Availability on Parameter Estimates from Site Occupancy Models with Application to Submersed Aquatic Vegetation

    USGS Publications Warehouse

    Gray, Brian R.; Holland, Mark D.; Yi, Feng; Starcevich, Leigh Ann Harrod

    2013-01-01

    Site occupancy models are commonly used by ecologists to estimate the probabilities of species site occupancy and of species detection. This study addresses the influence on site occupancy and detection estimates of variation in species availability among surveys within sites. Such variation in availability may result from temporary emigration, nonavailability of the species for detection, and sampling sites spatially when species presence is not uniform within sites. We demonstrate, using Monte Carlo simulations and aquatic vegetation data, that variation in availability and heterogeneity in the probability of availability may yield biases in the expected values of the site occupancy and detection estimates that have traditionally been associated with low-detection probabilities and heterogeneity in those probabilities. These findings confirm that the effects of availability may be important for ecologists and managers, and that where such effects are expected, modification of sampling designs and/or analytical methods should be considered. Failure to limit the effects of availability may preclude reliable estimation of the probability of site occupancy.
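
    The availability effect can be reproduced with a small Monte Carlo sketch (site counts, probabilities, and the naive estimator are illustrative assumptions, not the paper's simulation design): even with high detection probability given availability, intermittent availability biases the naive occupancy estimate low.

```python
import random

def naive_occupancy(n_sites=2000, n_surveys=3, psi=0.6,
                    p_avail=0.5, p_det=0.8, seed=1):
    # Fraction of sites with at least one detection: a species at an occupied
    # site must be available (p_avail) AND detected (p_det) on some survey.
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sites):
        occupied = rng.random() < psi
        hits += any(occupied and rng.random() < p_avail and rng.random() < p_det
                    for _ in range(n_surveys))
    return hits / n_sites

estimate = naive_occupancy()  # falls well below the true psi = 0.6
```

    The expected value is psi · (1 − (1 − p_avail·p_det)^n_surveys) ≈ 0.47 here, illustrating why availability must be modeled or designed around rather than absorbed into the detection probability.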

  14. An automatic fall detection framework using data fusion of Doppler radar and motion sensor network.

    PubMed

    Liu, Liang; Popescu, Mihail; Skubic, Marjorie; Rantz, Marilyn

    2014-01-01

    This paper describes ongoing work on detecting falls in independent-living senior apartments. We have developed a fall detection system based on a Doppler radar sensor and installed ceiling-mounted radar in real senior apartments. However, the detection accuracy on real-world data is affected by false alarms inherent in the real living environment, such as motions from visitors. To solve this issue, this paper proposes an improved framework that fuses the Doppler radar sensor result with a motion sensor network. As a result, performance is significantly improved after the data fusion by discarding the false alarms generated by visitors. The improvement of this new method is tested on one week of continuous data from an actual elderly person who frequently falls while living in her senior home.
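
    The fusion step can be illustrated with a deliberately simple rule; the timestamps and the single-occupant criterion are assumptions for illustration, not the paper's exact fusion logic, which uses the motion sensor network to rule out visitor-generated alarms:

```python
def fuse_alarms(radar_alarm_times, occupant_count_at):
    # Keep a radar fall alarm only when the motion sensor network suggests a
    # single occupant at that time; alarms coinciding with several moving
    # persons (e.g. visitors) are discarded as likely false alarms.
    return [t for t in radar_alarm_times
            if occupant_count_at.get(t, 0) <= 1]

kept = fuse_alarms([10, 20, 30], {10: 1, 20: 3, 30: 1})  # → [10, 30]
```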

  15. Statistical process control and verifying positional accuracy of a cobra motion couch using step-wedge quality assurance tool.

    PubMed

    Binny, Diana; Lancaster, Craig M; Trapp, Jamie V; Crowe, Scott B

    2017-09-01

    This study utilizes process control techniques to identify action limits for TomoTherapy couch positioning quality assurance tests. A test was introduced to monitor the accuracy of the applied couch offset detection in the TomoTherapy Hi-Art treatment system using the TQA "Step-Wedge Helical" module and MVCT detector. Individuals X-charts and process capability (cp), probability (P), and acceptability (cpk) indices were used to monitor 4 years of couch IEC offset data to detect systematic and random errors in couch positional accuracy at different action levels. Process capability tests were also performed on the retrospective data to define tolerances based on user-specified levels. A second study was carried out whereby physical couch offsets were applied using the TQA module and the MVCT detector was used to detect the observed variations. Random and systematic variations were observed relative to the SPC-based upper and lower control limits, and investigations were carried out to maintain the ongoing stability of the process over 4-year and three-monthly periods. Local trend analysis showed mean variations up to ±0.5 mm in the three-monthly analysis period for all IEC offset measurements. Variations were also observed in the detected versus applied offsets using the MVCT detector in the second study, largely in the vertical direction, and actions were taken to remediate this error. Based on the results, it was recommended that imaging shifts in each coordinate direction be applied only after assessing the machine for applied-versus-detected test results using the step helical module. User-specified tolerance levels of at least ±2 mm were recommended for a test frequency of once every 3 months to improve couch positional accuracy. SPC enables detection of systematic variations before machine tolerance levels are reached. Couch encoding system recalibrations reduced variations to user-specified levels, and a three-month monitoring period using SPC facilitated the detection of systematic and random variations. SPC analysis of couch positional accuracy enabled greater control in the identification of errors, thereby increasing confidence in daily treatment setups. © 2017 Royal Brisbane and Women's Hospital, Metro North Hospital and Health Service. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
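
    The individuals (X) chart used here can be sketched in a few lines; the 2.66 factor converting the average moving range into approximate 3-sigma limits is standard SPC practice, while the offset readings below are invented for illustration:

```python
def xchart_limits(values):
    # Individuals-chart limits from the average moving range:
    # UCL/LCL = mean ± 2.66 * mean(|x_i - x_{i-1}|).
    n = len(values)
    mean = sum(values) / n
    mr_bar = sum(abs(values[i] - values[i - 1]) for i in range(1, n)) / (n - 1)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

# Simulated couch-offset QA readings (mm); the final point drifts.
offsets = [0.1, -0.2, 0.0, 0.15, -0.1, 0.05, -0.05, 0.1, 0.0, 1.5]
lcl, ucl = xchart_limits(offsets)
flagged = [x for x in offsets if not lcl <= x <= ucl]  # → [1.5]
```

    In routine use the limits would be computed from an in-control baseline period rather than from data containing the drift itself, so that a systematic shift is flagged before the machine tolerance is reached.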

  16. Incorporating imperfect detection into joint models of communities: A response to Warton et al.

    USGS Publications Warehouse

    Beissinger, Steven R.; Iknayan, Kelly J.; Guillera-Arroita, Gurutzeta; Zipkin, Elise; Dorazio, Robert; Royle, Andy; Kery, Marc

    2016-01-01

    Warton et al. [1] advance community ecology by describing a statistical framework that can jointly model abundances (or distributions) across many taxa to quantify how community properties respond to environmental variables. This framework specifies the effects of both measured and unmeasured (latent) variables on the abundance (or occurrence) of each species. Latent variables are random effects that capture the effects of both missing environmental predictors and correlations in parameter values among different species. As presented in Warton et al., however, the joint modeling framework fails to account for the common problem of detection or measurement errors that always accompany field sampling of abundance or occupancy, and are well known to obscure species- and community-level inferences.

  17. A vision framework for the localization of soccer players and ball on the pitch using Handycams

    NASA Astrophysics Data System (ADS)

    Vilas, Tiago; Rodrigues, J. M. F.; Cardoso, P. J. S.; Silva, Bruno

    2015-03-01

    The current performance requirements in soccer make imperative the use of new technologies for game observation and analysis, such that detailed information about the teams' actions is provided. This paper summarizes a framework to collect the soccer players' and ball's positions using one or more Full HD Handycams, placed no more than 20 cm apart in the stands, as well as how this framework connects to the FootData project. The system is based on four main modules: detection and delimitation of the soccer pitch; detection of the ball and players, with assignment of players to their teams; tracking of players and ball; and computation of their localization (in meters) on the pitch.
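
    The final localization step, mapping image pixels to pitch coordinates in meters, is conventionally done with a planar homography. The sketch below assumes a pre-calibrated matrix; the values are illustrative, not from the paper, and the calibration itself would use known pitch landmarks:

```python
def apply_homography(H, x, y):
    # Projective mapping of an image pixel (x, y) to pitch coordinates:
    # multiply by the 3x3 matrix H and de-homogenize by the third component.
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return u / w, v / w

# Toy calibration: 8 pixels per meter, axis-aligned, no perspective.
H = [[0.125, 0.0, 0.0], [0.0, 0.125, 0.0], [0.0, 0.0, 1.0]]
pos = apply_homography(H, 240, 400)  # → (30.0, 50.0) meters
```

    A real camera in the stands would have a nontrivial bottom row of H, which is what makes the perspective de-homogenization step necessary.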

  18. Highly selective and sensitive trimethylamine gas sensor based on cobalt imidazolate framework material.

    PubMed

    Chen, Er-Xia; Fu, Hong-Ru; Lin, Rui; Tan, Yan-Xi; Zhang, Jian

    2014-12-24

    A cobalt imidazolate (im) framework material, [Co(im)2]n, was employed as a trimethylamine (TMA) gas sensor; the sensor can be easily fabricated using Ag-Pd interdigitated electrodes. Gas sensing measurements indicated that the [Co(im)2]n sensor shows excellent selectivity, high gas response, and a low detection limit of 2 ppm for TMA at 75 °C. The good selectivity and high response to TMA may be attributed to the weak interaction between the TMA molecules and the [Co(im)2]n framework. This makes the material an ideal candidate for detecting the freshness of fish and seafood.

  19. Integration of a Self-Coherence Algorithm into DISAT for Forced Oscillation Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Follum, James D.; Tuffner, Francis K.; Amidan, Brett G.

    2015-03-03

    With the increasing number of phasor measurement units on the power system, behaviors typically not observable on the power system are becoming more apparent. Oscillatory behavior on the power system, notably forced oscillations, is one such behavior. However, the large amount of data coming from the PMUs makes manually detecting and locating these oscillations difficult. To automate portions of the process, an oscillation detection routine was coded into the Data Integrity and Situational Awareness Tool (DISAT) framework. Integration into the DISAT framework allows forced oscillations to be detected and information about the event provided to operational engineers. The oscillation detection algorithm integrates with the data handling and atypical data detection capabilities of DISAT, building off of a standard library of functions. This report details that integration, with information on the algorithm, some implementation issues, and some sample results from the western United States' power grid.

  20. Text Detection, Tracking and Recognition in Video: A Comprehensive Survey.

    PubMed

    Yin, Xu-Cheng; Zuo, Ze-Yu; Tian, Shu; Liu, Cheng-Lin

    2016-04-14

    Intelligent analysis of video data is currently in wide demand because video is a major source of sensory data in our lives. Text is a prominent and direct source of information in video, while recent surveys of text detection and recognition in imagery [1], [2] focus mainly on text extraction from scene images. Here, this paper presents a comprehensive survey of text detection, tracking and recognition in video with three major contributions. First, a generic framework is proposed for video text extraction that uniformly describes detection, tracking, recognition, and their relations and interactions. Second, within this framework, a variety of methods, systems and evaluation protocols of video text extraction are summarized, compared, and analyzed. Existing text tracking techniques, tracking based detection and recognition techniques are specifically highlighted. Third, related applications, prominent challenges, and future directions for video text extraction (especially from scene videos and web videos) are also thoroughly discussed.
