Identification of subsurface structures using electromagnetic data and shape priors
NASA Astrophysics Data System (ADS)
Tveit, Svenn; Bakr, Shaaban A.; Lien, Martha; Mannseth, Trond
2015-03-01
We consider the inverse problem of identifying large-scale subsurface structures using the controlled source electromagnetic method. To identify structures in the subsurface where the contrast in electric conductivity can be small, regularization is needed to bias the solution towards preserving structural information. We propose to combine two approaches for regularization of the inverse problem. In the first approach we utilize a model-based, reduced, composite representation of the electric conductivity that is highly flexible, even for a moderate number of degrees of freedom. With a low number of parameters, the inverse problem is efficiently solved using a standard, second-order gradient-based optimization algorithm. Further regularization is obtained using structural prior information, available, e.g., from interpreted seismic data. The reduced conductivity representation is suitable for incorporation of structural prior information. Such prior information cannot, however, be accurately modeled with a Gaussian distribution. To alleviate this, we incorporate the structural information using shape priors. The shape prior technique requires the choice of kernel function, which is application dependent. We argue for using the conditionally positive definite kernel, which is shown to have computational advantages over the commonly applied Gaussian kernel for our problem. Numerical experiments on various test cases show that the methodology is able to identify fairly complex subsurface electric conductivity distributions while preserving structural prior information during the inversion.
Kurrant, Douglas; Fear, Elise; Baran, Anastasia; LoVetri, Joe
2017-12-01
The authors have developed a method to combine a patient-specific map of tissue structure and average dielectric properties with microwave tomography (MWT). The patient-specific map is acquired with radar-based techniques and serves as prior information for microwave tomography. The impact that the degree of structural detail included in this prior information has on image quality was reported in a previous investigation. The aim of the present study is to extend this previous work by identifying and quantifying the impact that errors in the prior information have on image quality, including the reconstruction of internal structures and lesions embedded in fibroglandular tissue. This study also extends the work of others reported in the literature by emulating a clinical setting with a set of experiments that incorporate heterogeneity into both the breast interior and glandular region, as well as prior information related to both fat and glandular structures. Patient-specific structural information is acquired using radar-based methods that form a regional map of the breast. Errors are introduced to create a discrepancy in the geometry and electrical properties between the regional map and the model used to generate the data. This permits the impact that errors in the prior information have on image quality to be evaluated. Image quality is quantitatively assessed by measuring the ability of the algorithm to reconstruct both internal structures and lesions embedded in fibroglandular tissue. The study is conducted using both 2D and 3D numerical breast models constructed from MRI scans. The reconstruction results demonstrate the robustness of the method relative to errors in the dielectric properties of the background regional map, and to misalignment errors. These errors do not significantly influence the reconstruction accuracy of the underlying structures, or the ability of the algorithm to reconstruct malignant tissue. Although misalignment errors do not significantly impact the quality of the reconstructed fat and glandular structures for the 3D scenarios, the dielectric properties are reconstructed less accurately within the glandular structure for these cases relative to the 2D cases. However, general agreement between the 2D and 3D results was found. A key contribution of this paper is the detailed analysis of the impact of prior information errors on the reconstruction accuracy and ability to detect tumors. The results support the utility of acquiring patient-specific information with radar-based techniques and incorporating this information into MWT. The method is robust to errors in the dielectric properties of the background regional map, and to misalignment errors. Completion of this analysis is an important step toward developing the method into a practical diagnostic tool. © 2017 American Association of Physicists in Medicine.
Cooley, Richard L.
1982-01-01
Prior information on the parameters of a groundwater flow model can be used to improve parameter estimates obtained from nonlinear regression solution of a modeling problem. Two scales of prior information can be available: (1) prior information having known reliability (that is, bias and random error structure) and (2) prior information consisting of best available estimates of unknown reliability. A regression method that incorporates the second scale of prior information assumes the prior information to be fixed for any particular analysis to produce improved, although biased, parameter estimates. Approximate optimization of two auxiliary parameters of the formulation is used to help minimize the bias, which is almost always much smaller than that resulting from standard ridge regression. It is shown that if both scales of prior information are available, then a combined regression analysis may be made.
Torres, Craig; Jones, Rachael; Boelter, Fred; Poole, James; Dell, Linda; Harper, Paul
2014-01-01
Bayesian Decision Analysis (BDA) uses Bayesian statistics to integrate multiple types of exposure information and classify exposures within the exposure rating categorization scheme promoted in American Industrial Hygiene Association (AIHA) publications. Prior distributions for BDA may be developed from existing monitoring data, mathematical models, or professional judgment. Professional judgments may misclassify exposures. We suggest that a structured qualitative risk assessment (QLRA) method can provide consistency and transparency in professional judgments. In this analysis, we use a structured QLRA method to define prior distributions (priors) for BDA. We applied this approach at three semiconductor facilities in South Korea, and present an evaluation of the performance of structured QLRA for determination of priors, and an evaluation of occupational exposures using BDA. Specifically, the structured QLRA was applied to chemical agents in similar exposure groups (SEGs) to identify provisional risk ratings. Standard priors were developed for each risk rating before review of historical monitoring data. Newly collected monitoring data were used to update priors informed by QLRA or historical monitoring data, and determine the posterior distribution. Exposure ratings were defined by the rating category with the highest probability--i.e., the most likely. We found the most likely exposure rating in the QLRA-informed priors to be consistent with historical and newly collected monitoring data, and the posterior exposure ratings developed with QLRA-informed priors to be equal to or greater than those developed with data-informed priors in 94% of comparisons. Overall, exposures at these facilities are consistent with well-controlled work environments. That is, the 95th percentile of exposure distributions is ≤50% of the occupational exposure limit (OEL) for all chemical-SEG combinations evaluated; and is ≤10% of the limit for 94% of chemical-SEG combinations evaluated.
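As a worked illustration of the BDA update described above, the sketch below performs a discrete Bayesian update over five AIHA-style exposure-rating bands defined by the 95th percentile as a fraction of the OEL. The band cut-points, the lognormal parameter grid, the band-averaged likelihood, and all numbers are illustrative assumptions, not the study's implementation.

```python
import numpy as np

# Illustrative AIHA-style rating bands: 95th percentile as a fraction of the OEL.
BANDS = [(0.0, 0.01), (0.01, 0.10), (0.10, 0.50), (0.50, 1.0), (1.0, np.inf)]

def rating_posterior(samples, oel, prior_weights):
    """Posterior probability of each exposure-rating band.

    samples: measured air concentrations; oel: occupational exposure limit;
    prior_weights: QLRA-informed prior mass on the five bands.
    """
    x = np.log(np.asarray(samples, dtype=float))
    # Grid over lognormal exposure-model parameters (log GM, log GSD).
    mu = np.linspace(np.log(oel) - 9.0, np.log(oel) + 2.0, 200)
    sigma = np.linspace(0.05, 1.5, 120)
    M, S = np.meshgrid(mu, sigma, indexing="ij")
    # Log-likelihood of the data at every grid point.
    ll = -len(x) * np.log(S) - ((x[:, None, None] - M) ** 2 / (2 * S ** 2)).sum(0)
    like = np.exp(ll - ll.max())
    # 95th percentile of each candidate lognormal, as a fraction of the OEL.
    p95 = np.exp(M + 1.645 * S) / oel
    post = np.zeros(len(BANDS))
    for i, (lo, hi) in enumerate(BANDS):
        mask = (p95 > lo) & (p95 <= hi)
        # The QLRA prior enters as a band-level weight on the grid average.
        post[i] = prior_weights[i] * like[mask].sum() / max(mask.sum(), 1)
    return post / post.sum()

# Usage: three measurements well below an OEL of 1.0, with a QLRA-informed
# prior favouring the "10-50% of OEL" band; all values hypothetical.
print(rating_posterior([0.02, 0.05, 0.03], 1.0, [0.10, 0.30, 0.40, 0.15, 0.05]))
```

The highest-probability band of the returned vector plays the role of the "most likely" exposure rating in the record above.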
NASA Astrophysics Data System (ADS)
Park, Gilsoon; Hong, Jinwoo; Lee, Jong-Min
2018-03-01
In the human brain, the corpus callosum (CC) is the largest white matter structure, connecting the right and left hemispheres. Structural features such as the shape and size of the CC in the midsagittal plane are of great significance for analyzing various neurological diseases, for example Alzheimer's disease, autism, and epilepsy. For quantitative and qualitative studies of the CC in brain MR images, robust segmentation of the CC is important. In this paper, we present a novel method for CC segmentation. Our approach is based on deep neural networks and prior information generated from multi-atlas images. Deep neural networks have recently shown good performance in various image processing fields, and convolutional neural networks (CNNs) in particular have shown outstanding performance for classification and segmentation in medical imaging. We used a convolutional neural network for CC segmentation. Multi-atlas-based segmentation models have been widely used in medical image segmentation because an atlas, consisting of MR images and the corresponding manual segmentations of the target structure, carries powerful information about the structure to be segmented. We incorporated prior information derived from multi-atlas images, such as the location and intensity distribution of the target structure (i.e., the CC), into the CNN training process to improve training. The CNN with prior information showed better segmentation performance than the CNN without it.
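One common way to feed atlas-derived prior information into a CNN is to stack a registered prior probability map as an extra input channel. The abstract does not specify the exact fusion scheme used, so the PyTorch sketch below (the class name and architecture are hypothetical) is a minimal illustration of the idea rather than the authors' network.

```python
import torch
import torch.nn as nn

class PriorCNN(nn.Module):
    """Toy 2-D segmentation network taking the midsagittal MR slice plus a
    multi-atlas prior probability map of the CC as a second input channel."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # per-pixel CC logit
        )

    def forward(self, mr, prior_map):
        # Fuse image and prior by channel concatenation before convolution.
        return self.net(torch.cat([mr, prior_map], dim=1))

# Usage: one 64x64 slice and its registered atlas prior (random stand-ins).
mr = torch.randn(1, 1, 64, 64)
prior = torch.rand(1, 1, 64, 64)
print(PriorCNN()(mr, prior).shape)  # torch.Size([1, 1, 64, 64])
```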
Gradient-based reliability maps for ACM-based segmentation of hippocampus.
Zarpalas, Dimitrios; Gkontra, Polyxeni; Daras, Petros; Maglaveras, Nicos
2014-04-01
Automatic segmentation of deep brain structures, such as the hippocampus (HC), in MR images has attracted considerable scientific attention due to the widespread use of MRI and to the principal role of some structures in various mental disorders. In the literature, there exists a substantial amount of work relying on deformable models incorporating prior knowledge about structures' anatomy and shape information. However, shape priors capture global shape characteristics and thus fail to model boundaries of varying properties; HC boundaries present rich, poor, and missing gradient regions. On top of that, shape prior knowledge is blended with image information in the evolution process, through global weighting of the two terms, again neglecting the spatially varying boundary properties, causing segmentation faults. An innovative method is hereby presented that aims to achieve highly accurate HC segmentation in MR images, based on the modeling of boundary properties at each anatomical location and the inclusion of appropriate image information for each of those, within an active contour model framework. Hence, blending of image information and prior knowledge is based on a local weighting map, which mixes gradient information, regional and whole brain statistical information with a multi-atlas-based spatial distribution map of the structure's labels. Experimental results on three different datasets demonstrate the efficacy and accuracy of the proposed method.
Knowledge Structures of Entering Computer Networking Students and Their Instructors
ERIC Educational Resources Information Center
DiCerbo, Kristen E.
2007-01-01
Students bring prior knowledge to their learning experiences. This prior knowledge is known to affect how students encode and later retrieve new information learned. Teachers and content developers can use information about students' prior knowledge to create more effective lessons and materials. In many content areas, particularly the sciences,…
Exploiting Genome Structure in Association Analysis
Kim, Seyoung
2014-01-01
A genome-wide association study involves examining a large number of single-nucleotide polymorphisms (SNPs) to identify SNPs that are significantly associated with the given phenotype, while trying to reduce the false positive rate. Although haplotype-based association methods have been proposed to accommodate correlation information across nearby SNPs that are in linkage disequilibrium, none of these methods directly incorporates structural information such as recombination events along the chromosome. In this paper, we propose a new approach called stochastic block lasso for association mapping that exploits prior knowledge on linkage disequilibrium structure in the genome such as recombination rates and distances between adjacent SNPs in order to increase the power of detecting true associations while reducing false positives. Following a typical linear regression framework with the genotypes as inputs and the phenotype as output, our proposed method employs a sparsity-enforcing Laplacian prior for the regression coefficients, augmented by a first-order Markov process along the sequence of SNPs that incorporates the prior information on the linkage disequilibrium structure. The Markov-chain prior models the structural dependencies between a pair of adjacent SNPs, and allows us to look for association SNPs in a coupled manner, combining strength from multiple nearby SNPs. Our results on HapMap-simulated datasets and mouse datasets show that there is a significant advantage in incorporating the prior knowledge on linkage disequilibrium structure for marker identification under whole-genome association. PMID:21548809
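A simplified optimization stand-in for the idea: if LD structure is summarized as a per-SNP penalty weight (lighter penalties inside a block of tightly linked SNPs), a weighted lasso solved by iterative soft-thresholding couples nearby SNPs in roughly the spirit described. The paper's actual model is a Markov-chain prior over Laplacian scales rather than fixed weights, and all data and weights below are synthetic.

```python
import numpy as np

def weighted_lasso_ista(X, y, penalties, n_iter=500):
    """Lasso with per-coefficient penalties via iterative soft-thresholding."""
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        z = beta - X.T @ (X @ beta - y) / L             # gradient step
        beta = np.sign(z) * np.maximum(np.abs(z) - penalties / L, 0.0)
    return beta

rng = np.random.default_rng(6)
X = rng.binomial(2, 0.3, size=(200, 50)).astype(float)  # toy genotype matrix
beta_true = np.zeros(50); beta_true[[10, 11]] = 0.8     # two causal SNPs
y = X @ beta_true + rng.normal(0, 1, 200)
pen = np.full(50, 40.0)
pen[9:13] = 10.0   # hypothetical LD block around the signal: lighter penalty
print("selected SNPs:", np.nonzero(weighted_lasso_ista(X, y, pen))[0])
```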
Integrating prior information into microwave tomography Part 1: Impact of detail on image quality.
Kurrant, Douglas; Baran, Anastasia; LoVetri, Joe; Fear, Elise
2017-12-01
The authors investigate the impact that incremental increases in the level of detail of patient-specific prior information have on image quality and the convergence behavior of an inversion algorithm in the context of near-field microwave breast imaging. A methodology is presented that uses image quality measures to characterize the ability of the algorithm to reconstruct both internal structures and lesions embedded in fibroglandular tissue. The approach permits key aspects that impact the quality of reconstruction of these structures to be identified and quantified. This provides insight into opportunities to improve image reconstruction performance. Patient-specific information is acquired using radar-based methods that form a regional map of the breast. This map is then incorporated into a microwave tomography algorithm. Previous investigations have demonstrated the effectiveness of this approach to improve image quality when applied to data generated with two-dimensional (2D) numerical models. The present study extends this work by generating prior information that is customized to vary the degree of structural detail to facilitate the investigation of the role of prior information in image formation. Numerical 2D breast models constructed from magnetic resonance (MR) scans, and reconstructions formed with a three-dimensional (3D) numerical breast model are used to assess if trends observed for the 2D results can be extended to 3D scenarios. For the blind reconstruction scenario (i.e., no prior information), the breast surface is not accurately identified and internal structures are not clearly resolved. A substantial improvement in image quality is achieved by incorporating the skin surface map and constraining the imaging domain to the breast. Internal features within the breast appear in the reconstructed image. However, it is challenging to discriminate between adipose and glandular regions and there are inaccuracies in both the structural properties of the glandular region and the dielectric properties reconstructed within this structure. Using a regional map with a skin layer only marginally improves this situation. Increasing the structural detail in the prior information to include internal features leads to reconstructions for which the interface that delineates the fat and gland regions can be inferred. Different features within the glandular region corresponding to tissues with varying relative permittivity values, such as a lesion embedded within glandular structure, emerge in the reconstructed images. Including knowledge of the breast surface and skin layer leads to a substantial improvement in image quality compared to the blind case, but the images have limited diagnostic utility for applications such as tumor response tracking. The diagnostic utility of the reconstruction technique is improved considerably when patient-specific structural information is used. This qualitative observation is supported quantitatively with image metrics. © 2017 American Association of Physicists in Medicine.
Self-prior strategy for organ reconstruction in fluorescence molecular tomography
Zhou, Yuan; Chen, Maomao; Su, Han; Luo, Jianwen
2017-01-01
The purpose of this study is to propose a strategy for organ reconstruction in fluorescence molecular tomography (FMT) without prior information from other imaging modalities, and to overcome the high cost and ionizing radiation caused by the traditional structural prior strategy. The proposed strategy is designed as an iterative architecture to solve the inverse problem of FMT. In each iteration, a short-time Fourier transform (STFT)-based algorithm is used to extract the self-prior information in the space-frequency energy spectrum with the assumption that the regions with higher fluorescence concentration have larger energy intensity, then the cost function of the inverse problem is modified by the self-prior information, and lastly an iterative Laplacian regularization algorithm is conducted to solve the updated inverse problem and obtain the reconstruction results. Simulations and in vivo experiments on liver reconstruction are carried out to test the performance of the self-prior strategy on organ reconstruction. The organ reconstruction results obtained by the proposed self-prior strategy are closer to the ground truth than those obtained by the iterative Tikhonov regularization (ITKR) method (traditional non-prior strategy). Significant improvements are shown in the evaluation indexes of relative locational error (RLE), relative error (RE) and contrast-to-noise ratio (CNR). The self-prior strategy improves the organ reconstruction results compared with the non-prior strategy and also overcomes the shortcomings of the traditional structural prior strategy. Various applications such as metabolic imaging and pharmacokinetic study can be aided by this strategy. PMID:29082094
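The sketch below illustrates the overall shape of the strategy on a toy 1-D problem: solve a regularized least-squares inverse problem, derive a weight map from the current estimate, and re-solve with a spatially reweighted Laplacian penalty. The STFT energy-spectrum extraction is replaced here by a simple estimate-derived weight, so this is an assumption-laden analogue, not the paper's algorithm.

```python
import numpy as np

def solve_regularized(A, b, R, lam):
    """Closed-form minimizer of ||Ax - b||^2 + lam * ||R x||^2."""
    return np.linalg.solve(A.T @ A + lam * R.T @ R, A.T @ b)

rng = np.random.default_rng(0)
n = 60
A = rng.normal(size=(40, n))                 # toy ill-posed forward model
x_true = np.zeros(n); x_true[20:30] = 1.0    # a compact "organ"
b = A @ x_true + 0.01 * rng.normal(size=40)

L = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)   # 1-D Laplacian

# Baseline analogous to ITKR: plain Tikhonov (R = identity).
x0 = solve_regularized(A, b, np.eye(n), lam=1.0)

# "Self-prior" step: relax smoothing where the current estimate suggests
# high fluorescence concentration (a stand-in for the STFT energy map).
w = 1.0 / (1.0 + 5.0 * np.clip(x0, 0.0, None))
x1 = solve_regularized(A, b, np.diag(w) @ L, lam=1.0)

print("Tikhonov error  :", round(np.linalg.norm(x0 - x_true), 3))
print("self-prior error:", round(np.linalg.norm(x1 - x_true), 3))
```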
Hippocampus segmentation using locally weighted prior based level set
NASA Astrophysics Data System (ADS)
Achuthan, Anusha; Rajeswari, Mandava
2015-12-01
Segmentation of the hippocampus in the brain is one of the major challenges in medical image segmentation due to its imaging characteristics: its intensity is almost identical to that of adjacent gray matter structures, such as the amygdala. This intensity similarity causes the hippocampus to have weak or fuzzy boundaries. Given this challenge, a segmentation method that relies on image information alone may not produce accurate segmentation results. Therefore, prior information such as shape and spatial information needs to be assimilated into existing segmentation methods to produce the expected segmentation. Previous studies have widely integrated prior information into segmentation methods. However, the prior information has typically been integrated in a global manner, which does not reflect the real scenario during clinical delineation. Therefore, in this paper, a local integration of prior information into a level set model is presented. This work utilizes a mean shape model to provide automatic initialization for the level set evolution, and the model has been integrated as prior information into the level set model. The local integration of edge-based information and prior information is implemented through an edge weighting map that decides, at the voxel level, which information should be observed during the level set evolution. The edge weighting map indicates which voxels have sufficient edge information. Experiments show that the proposed local integration of prior information into a conventional edge-based level set model, known as the geodesic active contour, yields an improvement of 9% in the average Dice coefficient.
Network inference using informative priors
Mukherjee, Sach; Speed, Terence P.
2008-01-01
Recent years have seen much interest in the study of systems characterized by multiple interacting components. A class of statistical models called graphical models, in which graphs are used to represent probabilistic relationships between variables, provides a framework for formal inference regarding such systems. In many settings, the object of inference is the network structure itself. This problem of “network inference” is well known to be a challenging one. However, in scientific settings there is very often existing information regarding network connectivity. A natural idea then is to take account of such information during inference. This article addresses the question of incorporating prior information into network inference. We focus on directed models called Bayesian networks, and use Markov chain Monte Carlo to draw samples from posterior distributions over network structures. We introduce prior distributions on graphs capable of capturing information regarding network features including edges, classes of edges, degree distributions, and sparsity. We illustrate our approach in the context of systems biology, applying our methods to network inference in cancer signaling. PMID:18799736
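A minimal sketch of one way such prior knowledge can enter MCMC over network structures: an edge-wise concordance prior whose log-density factorises over edges, so a single-edge proposal only changes one term. The functional form and the 0.9 belief value are assumptions for illustration; the paper's priors also cover edge classes, degree distributions, and sparsity.

```python
import numpy as np

def log_graph_prior(adj, edge_belief, kappa=1.0):
    """Edge-wise concordance log-prior over directed graphs.

    edge_belief[i, j] in (0, 1): prior confidence that edge i -> j is present
    (0.5 = no information). The prior factorises over edges, so a
    Metropolis-Hastings move that flips one edge updates a single term.
    """
    adj = np.asarray(adj, dtype=float)
    return kappa * np.sum(adj * np.log(edge_belief)
                          + (1 - adj) * np.log(1 - edge_belief))

# Usage: prior knowledge favours edge 0 -> 1 (say, a known signalling link).
belief = np.full((3, 3), 0.5); belief[0, 1] = 0.9
g_with = np.zeros((3, 3)); g_with[0, 1] = 1
g_without = np.zeros((3, 3))
print(log_graph_prior(g_with, belief), ">", log_graph_prior(g_without, belief))
```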
Ali, Anjum A; Dale, Anders M; Badea, Alexandra; Johnson, G Allan
2005-08-15
We present the automated segmentation of magnetic resonance microscopy (MRM) images of the C57BL/6J mouse brain into 21 neuroanatomical structures, including the ventricular system, corpus callosum, hippocampus, caudate putamen, inferior colliculus, internal capsule, globus pallidus, and substantia nigra. The segmentation algorithm operates on multispectral, three-dimensional (3D) MR data acquired at 90-microm isotropic resolution. Probabilistic information used in the segmentation is extracted from training datasets of T2-weighted, proton density-weighted, and diffusion-weighted acquisitions. Spatial information is employed in the form of prior probabilities of occurrence of a structure at a location (location priors) and the pairwise probabilities between structures (contextual priors). Validation using standard morphometry indices shows good consistency between automatically segmented and manually traced data. Results achieved in the mouse brain are comparable with those achieved in human brain studies using similar techniques. The segmentation algorithm shows excellent potential for routine morphological phenotyping of mouse models.
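A compact sketch of the location-prior part of such a classifier: per-voxel MAP labelling that multiplies an intensity likelihood by an atlas-derived prior probability of each structure at that voxel. The pairwise contextual priors are omitted here, and all arrays are random stand-ins.

```python
import numpy as np

def classify_voxels(intensity_ll, location_prior):
    """MAP labels from multispectral log-likelihoods and location priors.

    intensity_ll: (n_voxels, n_structures) log-likelihood of each voxel's
    multispectral MR signature under each structure's intensity model.
    location_prior: (n_voxels, n_structures) prior probability of each
    structure occurring at that voxel, from registered training data.
    """
    log_post = intensity_ll + np.log(location_prior + 1e-12)
    return log_post.argmax(axis=1)

rng = np.random.default_rng(7)
ll = rng.normal(size=(5, 21))                  # 5 voxels, 21 candidate structures
prior = np.full((5, 21), 1 / 21.0)
prior[0, 3] = 0.9; prior[0] /= prior[0].sum()  # atlas strongly favours label 3
print(classify_voxels(ll, prior))
```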
Dunham, Kylee; Grand, James B.
2016-01-01
We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state-space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
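A minimal numerical sketch of SISR with kernel smoothing on a toy stochastic growth model with Poisson count observations; the Liu-West shrinkage constant, the N(0, 0.2) "informative" prior on the growth rate, and the process noise are all illustrative assumptions rather than the study's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate the "true" population and the count observations.
T, r_true = 25, 0.05
N = np.empty(T); N[0] = 100.0
for t in range(1, T):
    N[t] = N[t - 1] * np.exp(r_true + rng.normal(0, 0.05))
y = rng.poisson(N)

# SISR with kernel-smoothed parameter particles (Liu-West shrinkage).
P = 5000
n_part = rng.uniform(50, 150, P)       # state particles (population size)
r_part = rng.normal(0.0, 0.2, P)       # informative prior on growth rate
a = 0.98                               # shrinkage factor
for t in range(T):
    lw = y[t] * np.log(n_part) - n_part          # Poisson log-weight (no y! term)
    w = np.exp(lw - lw.max()); w /= w.sum()
    idx = rng.choice(P, P, p=w)                  # resample
    n_part, r_part = n_part[idx], r_part[idx]
    # Kernel smoothing: shrink parameter particles toward their mean and
    # jitter, fighting the particle depletion noted in the abstract.
    r_bar, r_sd = r_part.mean(), r_part.std()
    r_part = a * r_part + (1 - a) * r_bar + np.sqrt(1 - a**2) * r_sd * rng.normal(size=P)
    n_part = n_part * np.exp(r_part + rng.normal(0, 0.05, P))  # propagate

print("true r:", r_true, "| estimated r:", round(float(r_part.mean()), 3))
```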
Dissecting effects of complex mixtures: who's afraid of informative priors?
Thomas, Duncan C; Witte, John S; Greenland, Sander
2007-03-01
Epidemiologic studies commonly investigate multiple correlated exposures, which are difficult to analyze appropriately. Hierarchical modeling provides a promising approach for analyzing such data by adding a higher-level structure or prior model for the exposure effects. This prior model can incorporate additional information on similarities among the correlated exposures and can be parametric, semiparametric, or nonparametric. We discuss the implications of applying these models and argue for their expanded use in epidemiology. While a prior model adds assumptions to the conventional (first-stage) model, all statistical methods (including conventional methods) make strong intrinsic assumptions about the processes that generated the data. One should thus balance prior modeling assumptions against assumptions of validity, and use sensitivity analyses to understand their implications. In doing so - and by directly incorporating into our analyses information from other studies or allied fields - we can improve our ability to distinguish true causes of disease from noise and bias.
Impact of Information Technology Governance Structures on Strategic Alignment
ERIC Educational Resources Information Center
Gordon, Fitzroy R.
2013-01-01
This dissertation is a study of the relationship between Information Technology (IT) strategic alignment and IT governance structure within the organization. This dissertation replicates Asante (2010) in a different population, where the prior results continue to hold; the non-experimental approach explored two research questions but included two…
Bayes factors for testing inequality constrained hypotheses: Issues with prior specification.
Mulder, Joris
2014-02-01
Several issues are discussed when testing inequality constrained hypotheses using a Bayesian approach. First, the complexity (or size) of the inequality constrained parameter spaces can be ignored. This is the case when using the posterior probability that the inequality constraints of a hypothesis hold, Bayes factors based on non-informative improper priors, and partial Bayes factors based on posterior priors. Second, the Bayes factor may not be invariant for linear one-to-one transformations of the data. This can be observed when using balanced priors which are centred on the boundary of the constrained parameter space with a diagonal covariance structure. Third, the information paradox can be observed. When testing inequality constrained hypotheses, the information paradox occurs when the Bayes factor of an inequality constrained hypothesis against its complement converges to a constant as the evidence for the first hypothesis accumulates while keeping the sample size fixed. This paradox occurs when using Zellner's g prior as a result of too much prior shrinkage. Therefore, two new methods are proposed that avoid these issues. First, partial Bayes factors are proposed based on transformed minimal training samples. These training samples result in posterior priors that are centred on the boundary of the constrained parameter space with the same covariance structure as in the sample. Second, a g prior approach is proposed by letting g go to infinity. This is possible because the Jeffreys-Lindley paradox is not an issue when testing inequality constrained hypotheses. A simulation study indicated that the Bayes factor based on this g prior approach converges fastest to the true inequality constrained hypothesis. © 2013 The British Psychological Society.
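For concreteness, the encompassing-prior construction behind such Bayes factors can be sketched in a few lines: the Bayes factor of an ordering constraint against the unconstrained model is the posterior probability of the constraint divided by its prior probability (here 1/3! = 1/6 under a symmetric prior). The normal approximation to the posterior and all data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Three groups; H1: mu1 < mu2 < mu3 against the unconstrained Hu.
y = [rng.normal(m, 1.0, 20) for m in (0.0, 0.3, 0.6)]
means = np.array([g.mean() for g in y])
sems = np.array([g.std(ddof=1) / np.sqrt(len(g)) for g in y])

# BF(1u) = P(constraint | data) / P(constraint | prior). Under a symmetric
# prior the prior probability of one full ordering of three means is 1/6.
draws = rng.normal(means, sems, size=(200_000, 3))   # approximate posterior
post_prob = np.mean((draws[:, 0] < draws[:, 1]) & (draws[:, 1] < draws[:, 2]))
print(f"P(order | data) = {post_prob:.3f}, BF vs unconstrained = {post_prob / (1/6):.2f}")
```

The 1/6 term is precisely the complexity correction the abstract warns can be ignored; without it, an inequality-constrained hypothesis would never be credited for its parsimony.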
Lateral orbitofrontal cortex anticipates choices and integrates prior with current information
Nogueira, Ramon; Abolafia, Juan M.; Drugowitsch, Jan; Balaguer-Ballester, Emili; Sanchez-Vives, Maria V.; Moreno-Bote, Rubén
2017-01-01
Adaptive behavior requires integrating prior with current information to anticipate upcoming events. Brain structures related to this computation should bring relevant signals from the recent past into the present. Here we report that rats can integrate the most recent prior information with sensory information, thereby improving behavior on a perceptual decision-making task with outcome-dependent past trial history. We find that anticipatory signals in the orbitofrontal cortex about upcoming choice increase over time and are even present before stimulus onset. These neuronal signals also represent the stimulus and relevant second-order combinations of past state variables. The encoding of choice, stimulus and second-order past state variables resides, up to movement onset, in overlapping populations. The neuronal representation of choice before stimulus onset and its build-up once the stimulus is presented suggest that orbitofrontal cortex plays a role in transforming immediate prior and stimulus information into choices using a compact state-space representation. PMID:28337990
Thorlund, Kristian; Thabane, Lehana; Mills, Edward J
2013-01-11
Multiple treatment comparison (MTC) meta-analyses are commonly modeled in a Bayesian framework, and weakly informative priors are typically preferred to mirror familiar data-driven frequentist approaches. Random-effects MTCs have commonly modeled heterogeneity under the assumption that the between-trial variances for all involved treatment comparisons are equal (i.e., the 'common variance' assumption). This approach 'borrows strength' for heterogeneity estimation across treatment comparisons, and thus, adds valuable precision when data is sparse. The homogeneous variance assumption, however, is unrealistic and can severely bias variance estimates. Consequently 95% credible intervals may not retain nominal coverage, and treatment rank probabilities may become distorted. Relaxing the homogeneous variance assumption may be equally problematic due to reduced precision. To regain good precision, moderately informative variance priors or additional mathematical assumptions may be necessary. In this paper we describe four novel approaches to modeling heterogeneity variance - two novel model structures, and two approaches for use of moderately informative variance priors. We examine the relative performance of all approaches in two illustrative MTC data sets. We particularly compare between-study heterogeneity estimates and model fits, treatment effect estimates and 95% credible intervals, and treatment rank probabilities. In both data sets, use of moderately informative variance priors constructed from the pairwise meta-analysis data yielded the best model fit and narrower credible intervals. Imposing consistency equations on variance estimates, assuming variances to be exchangeable, or using empirically informed variance priors also yielded good model fits and narrow credible intervals. The homogeneous variance model yielded high precision at all times, but overall inadequate estimates of between-trial variances. Lastly, treatment rankings were similar among the novel approaches, but considerably different when compared with the homogeneous variance approach. MTC models using a homogeneous variance structure appear to perform sub-optimally when between-trial variances vary between comparisons. Using informative variance priors, assuming exchangeability or imposing consistency between heterogeneity variances can all ensure sufficiently reliable and realistic heterogeneity estimation, and thus more reliable MTC inferences. All four approaches should be viable candidates for replacing or supplementing the conventional homogeneous variance MTC model, which is currently the most widely used in practice. PMID:23311298
Praveen, Paurush; Fröhlich, Holger
2013-01-01
Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available. PMID:23826291
Super resolution reconstruction of μ-CT image of rock sample using neighbour embedding algorithm
NASA Astrophysics Data System (ADS)
Wang, Yuzhu; Rahman, Sheik S.; Arns, Christoph H.
2018-03-01
X-ray micro-computed tomography (μ-CT) is considered to be the most effective way to obtain the inner structure of a rock sample without destroying it. However, its limited resolution hampers its ability to probe sub-micron structures, which are critical for flow transport in rock samples. In this study, we propose an innovative methodology to improve the resolution of μ-CT images using a neighbour embedding algorithm, where low-frequency information is provided by the μ-CT image itself while high-frequency information is supplemented by a high-resolution scanning electron microscopy (SEM) image. In order to obtain a prior for reconstruction, a large number of image patch pairs containing high- and low-resolution image patches are extracted from the Gaussian image pyramid generated from the SEM image. These image patch pairs contain abundant information about the tomographic evolution of local porous structures across different resolution spaces. Relying on the assumption of self-similarity of the porous structure, this prior information can be used to supervise the reconstruction of the high-resolution μ-CT image effectively. The experimental results show that the proposed method is able to achieve state-of-the-art performance.
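The core of neighbour embedding is the locally linear embedding step: reconstruct each low-resolution patch from its k nearest dictionary patches and reuse the same weights on the paired high-resolution patches. A minimal sketch with random stand-in dictionaries (in the paper these patch pairs come from the Gaussian pyramid built on the SEM image):

```python
import numpy as np

def ne_weights(q, neighbours, eps=1e-6):
    """LLE-style reconstruction weights of query patch q from its neighbours."""
    D = neighbours - q                       # (k, d) local differences
    G = D @ D.T                              # local Gram matrix
    G += eps * np.trace(G) * np.eye(len(G))  # regularise for stability
    w = np.linalg.solve(G, np.ones(len(G)))
    return w / w.sum()                       # weights sum to one

def super_resolve_patch(q_lr, lr_dict, hr_dict, k=5):
    """Map one low-res patch to high-res via neighbour embedding.

    lr_dict/hr_dict: co-registered low/high-resolution patch pairs.
    """
    d2 = ((lr_dict - q_lr) ** 2).sum(axis=1)
    idx = np.argsort(d2)[:k]                 # k nearest LR patches
    w = ne_weights(q_lr, lr_dict[idx])
    return w @ hr_dict[idx]                  # same weights in HR space

# Usage with random stand-in dictionaries.
rng = np.random.default_rng(3)
lr_dict = rng.normal(size=(1000, 9))    # 3x3 low-res patches, flattened
hr_dict = rng.normal(size=(1000, 81))   # 9x9 high-res counterparts
print(super_resolve_patch(rng.normal(size=9), lr_dict, hr_dict).shape)
```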
SU-E-J-71: Spatially Preserving Prior Knowledge-Based Treatment Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, H; Xing, L
2015-06-15
Purpose: Prior knowledge-based treatment planning is impeded by the use of a single dose volume histogram (DVH) curve. Critical spatial information is lost from collapsing the dose distribution into a histogram. Even similar patients possess geometric variations that become inaccessible in the form of a single DVH. We propose a simple prior knowledge-based planning scheme that extracts features from a prior dose distribution while still preserving the spatial information. Methods: A prior patient plan is not used as a mere starting point for a new patient; rather, stopping criteria are constructed. Each structure from the prior patient is partitioned into multiple shells. For instance, the PTV is partitioned into an inner, middle, and outer shell. Prior dose statistics are then extracted for each shell and translated into the appropriate Dmin and Dmax parameters for the new patient. Results: The partitioned dose information from a prior case has been applied to 14 2-D prostate cases. Using the prior case yielded final DVHs that were comparable to manual planning, even though the DVH for the prior case was different from the DVHs for the 14 cases. Using a single DVH for the entire organ was also tested for comparison but showed much poorer performance. Different ways of translating the prior dose statistics into parameters for the new patient were also tested. Conclusion: Prior knowledge-based treatment planning needs to salvage the spatial information without transforming the patients on a voxel-to-voxel basis. An efficient balance between the anatomy and dose domains is gained through partitioning the organs into multiple shells. The use of prior knowledge not only serves as a starting point for a new case; the information extracted from the partitioned shells is also translated into stopping criteria for the optimization problem at hand.
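A small sketch of the shell-partitioning step, assuming shells of equal depth measured by a Euclidean distance transform (the abstract does not state how shells are defined, so this is one plausible construction); per-shell statistics from a prior dose distribution then become Dmin/Dmax stopping criteria for the new plan.

```python
import numpy as np
from scipy import ndimage

def partition_shells(mask, n_shells=3):
    """Split a binary structure mask into concentric shells of equal depth.

    Returns an integer map: 0 outside, 1 = outer shell, ..., n_shells = core.
    """
    depth = ndimage.distance_transform_edt(mask)  # distance to structure surface
    edges = np.linspace(0, depth.max(), n_shells + 1)
    shells = np.digitize(depth, edges[1:-1], right=True) + 1
    return np.where(mask, shells, 0)

# Usage: a toy circular "PTV" on a 2-D dose grid with a hypothetical prior dose.
yy, xx = np.mgrid[:101, :101]
ptv = (yy - 50) ** 2 + (xx - 50) ** 2 <= 30 ** 2
shells = partition_shells(ptv, n_shells=3)
dose = 60 - 0.1 * np.hypot(yy - 50, xx - 50)
for s in (1, 2, 3):
    d = dose[shells == s]
    print(f"shell {s}: Dmin={d.min():.1f} Gy, Dmax={d.max():.1f} Gy")
```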
Bayesian bivariate meta-analysis of diagnostic test studies with interpretable priors.
Guo, Jingyi; Riebler, Andrea; Rue, Håvard
2017-08-30
In a bivariate meta-analysis, the number of diagnostic studies involved is often very low so that frequentist methods may result in problems. Using Bayesian inference is particularly attractive as informative priors that add a small amount of information can stabilise the analysis without overwhelming the data. However, Bayesian analysis is often computationally demanding and the selection of the prior for the covariance matrix of the bivariate structure is crucial with little data. The integrated nested Laplace approximations method provides an efficient solution to the computational issues by avoiding any sampling, but the important question of priors remains. We explore the penalised complexity (PC) prior framework for specifying informative priors for the variance parameters and the correlation parameter. PC priors facilitate model interpretation and hyperparameter specification as expert knowledge can be incorporated intuitively. We conduct a simulation study to compare the properties and behaviour of differently defined PC priors to currently used priors in the field. The simulation study shows that the PC prior seems beneficial for the variance parameters. The use of PC priors for the correlation parameter results in more precise estimates when specified in a sensible neighbourhood around the truth. To investigate the usage of PC priors in practice, we reanalyse a meta-analysis using the telomerase marker for the diagnosis of bladder cancer and compare the results with those obtained by other commonly used modelling approaches. Copyright © 2017 John Wiley & Sons, Ltd.
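The PC prior for a standard deviation has a closed form that makes the "interpretable" claim concrete: an exponential density on the distance from the base model, scaled so that a user-specified tail statement P(sigma > u) = alpha holds. A minimal sketch (the u and alpha values are illustrative):

```python
import numpy as np

def pc_prior_sd(sigma, u=1.0, alpha=0.01):
    """Penalised-complexity prior density for a standard deviation.

    Exponential in the distance from the base model (sigma = 0), with the
    rate chosen so that P(sigma > u) = alpha, i.e. a tail statement an
    expert can state directly ("the sd exceeds 1 with probability 0.01").
    """
    lam = -np.log(alpha) / u
    return lam * np.exp(-lam * np.asarray(sigma))

print(pc_prior_sd([0.1, 0.5, 1.0]))
```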
Bayesian Semiparametric Structural Equation Models with Latent Variables
ERIC Educational Resources Information Center
Yang, Mingan; Dunson, David B.
2010-01-01
Structural equation models (SEMs) with latent variables are widely useful for sparse covariance structure modeling and for inferring relationships among latent variables. Bayesian SEMs are appealing in allowing for the incorporation of prior information and in providing exact posterior distributions of unknowns, including the latent variables. In…
ERIC Educational Resources Information Center
Cooper, Melanie M.; Underwood, Sonia M.; Hilley, Caleb Z.
2012-01-01
Lewis structures are a simplified two dimensional "cartoon" of molecular structure that allow a knowledgeable user to predict the types of properties a particular substance may exhibit. However, prior research shows that many students fail to recognize these structure-property connections and are unable to decode the information…
Automatic Bayes Factors for Testing Equality- and Inequality-Constrained Hypotheses on Variances.
Böing-Messing, Florian; Mulder, Joris
2018-05-03
In comparing characteristics of independent populations, researchers frequently expect a certain structure of the population variances. These expectations can be formulated as hypotheses with equality and/or inequality constraints on the variances. In this article, we consider the Bayes factor for testing such (in)equality-constrained hypotheses on variances. Application of Bayes factors requires specification of a prior under every hypothesis to be tested. However, specifying subjective priors for variances based on prior information is a difficult task. We therefore consider so-called automatic or default Bayes factors. These methods avoid the need for the user to specify priors by using information from the sample data. We present three automatic Bayes factors for testing variances. The first is a Bayes factor with equal priors on all variances, where the priors are specified automatically using a small share of the information in the sample data. The second is the fractional Bayes factor, where a fraction of the likelihood is used for automatic prior specification. The third is an adjustment of the fractional Bayes factor such that the parsimony of inequality-constrained hypotheses is properly taken into account. The Bayes factors are evaluated by investigating different properties such as information consistency and large sample consistency. Based on this evaluation, it is concluded that the adjusted fractional Bayes factor is generally recommendable for testing equality- and inequality-constrained hypotheses on variances.
Big Data in the Information Age: Exploring the Intellectual Foundation of Communication Theory
ERIC Educational Resources Information Center
Borkovich, Debra J.; Noah, Philip D.
2014-01-01
Big Data are structured, semi-structured, unstructured, and raw data that are revolutionizing how we think about and use information in the 21st century. Big Data represents a paradigm shift from our prior use of traditional data assets over the past 30+ years, such as numeric and textual data, to generating and accessing petabytes and beyond of…
Heuristics as Bayesian inference under extreme priors.
Parpart, Paula; Jones, Matt; Love, Bradley C
2018-05-01
Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
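The continuum the authors describe can be reproduced in a few lines with ridge regression: at lambda = 0 the estimator is ordinary regression, and as lambda grows the coefficient vector becomes proportional to X^T y, so each standardised cue is weighted by its own correlation with the criterion while inter-cue covariation is ignored, the strong-prior, tallying-like end of the spectrum. The data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 40, 5
X = rng.normal(size=(n, p))
X = (X - X.mean(0)) / X.std(0)          # standardised cues
y = X @ np.array([1.0, 0.8, 0.5, 0.2, 0.1]) + rng.normal(0, 1, n)

def ridge(X, y, lam):
    """Ridge estimator: (X'X + lam I)^-1 X'y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Varying the prior strength traces the continuum described in the paper:
# lam = 0 is ordinary regression; as lam grows, the solution approaches
# (1/lam) * X.T @ y, i.e. correlation weighting up to a common scale.
for lam in (0.0, 10.0, 1e6):
    b = ridge(X, y, lam)
    print(f"lam={lam:>9}: normalised weights = {np.round(b / np.abs(b).max(), 2)}")
c = X.T @ y
print("direction of X.T y  :", np.round(c / np.abs(c).max(), 2))
```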
Order priors for Bayesian network discovery with an application to malware phylogeny
Oyen, Diane; Anderson, Blake; Sentz, Kari; ...
2017-09-15
Here, Bayesian networks have been used extensively to model and discover dependency relationships among sets of random variables. We learn Bayesian network structure with a combination of human knowledge about the partial ordering of variables and statistical inference of conditional dependencies from observed data. Our approach leverages complementary information from human knowledge and inference from observed data to produce networks that reflect human beliefs about the system as well as to fit the observed data. Applying prior beliefs about partial orderings of variables is an approach distinctly different from existing methods that incorporate prior beliefs about direct dependencies (or edges) in a Bayesian network. We provide an efficient implementation of the partial-order prior in a Bayesian structure discovery learning algorithm, as well as an edge prior, showing that both priors meet the local modularity requirement necessary for an efficient Bayesian discovery algorithm. In benchmark studies, the partial-order prior improves the accuracy of Bayesian network structure learning as well as the edge prior, even though order priors are more general. Our primary motivation is in characterizing the evolution of families of malware to aid cyber security analysts. For the problem of malware phylogeny discovery, we find that our algorithm, compared to existing malware phylogeny algorithms, more accurately discovers true dependencies that are missed by other algorithms.
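A toy sketch of a partial-order prior that is locally modular in the sense used above: its log-score decomposes over edges, penalising only edges that run against the believed ordering (the penalty value and grouping are illustrative assumptions).

```python
def order_prior_logscore(edges, order_groups, penalty=2.0):
    """Log-prior for a candidate DAG given a known partial ordering.

    order_groups: list of sets; variables in an earlier group are believed
    to precede those in a later group (e.g. earlier malware generations).
    Edges running backwards against the partial order are penalised; edges
    within a group or respecting the order are not. The score decomposes
    over edges, which is the local modularity property mentioned above.
    """
    rank = {v: i for i, g in enumerate(order_groups) for v in g}
    violations = sum(1 for (u, v) in edges if rank[u] > rank[v])
    return -penalty * violations

# Usage: three variables; prior knowledge says A precedes {B, C}.
groups = [{"A"}, {"B", "C"}]
for edges in ([("A", "B"), ("B", "C")], [("B", "A"), ("C", "A")]):
    print(edges, "log-prior =", order_prior_logscore(edges, groups))
```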
NASA Astrophysics Data System (ADS)
Wang, He; Zhang, Wen-Hao; Wong, K. Y. Michael; Wu, Si
Extensive studies suggest that the brain integrates multisensory signals in a Bayesian optimal way. However, it remains largely unknown how the sensory reliability and the prior information shape the neural architecture. In this work, we propose a biologically plausible neural field model, which can perform optimal multisensory integration and encode the whole profile of the posterior. Our model is composed of two modules, each for one modality. The crosstalks between the two modules can be carried out through feedforwad cross-links and reciprocal connections. We found that the reciprocal couplings are crucial to optimal multisensory integration in that the reciprocal coupling pattern is shaped by the correlation in the joint prior distribution of the sensory stimuli. A perturbative approach is developed to illustrate the relation between the prior information and features in coupling patterns quantitatively. Our results show that a decentralized architecture based on reciprocal connections is able to accommodate complex correlation structures across modalities and utilize this prior information in optimal multisensory integration. This work is supported by the Research Grants Council of Hong Kong (N_HKUST606/12 and 605813) and National Basic Research Program of China (2014CB846101) and the Natural Science Foundation of China (31261160495).
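The Bayes-optimal computation such a network is meant to implement is, for two independent Gaussian cues, plain precision weighting; a minimal sketch with made-up heading estimates:

```python
def integrate(mu1, var1, mu2, var2):
    """Reliability-weighted fusion of two independent Gaussian cues.

    The posterior mean weights each modality by its inverse variance, and
    the posterior variance is smaller than either cue's, which is the
    Bayes-optimal behaviour the network model is built to reproduce.
    """
    w1, w2 = 1 / var1, 1 / var2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    return mu, 1 / (w1 + w2)

# Usage: a reliable visual estimate at 10 deg and a noisy vestibular
# estimate at 20 deg of heading direction.
print(integrate(10.0, 1.0, 20.0, 4.0))   # -> (12.0, 0.8)
```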
24 CFR 3285.903 - Permits, alterations, and on-site structures.
Code of Federal Regulations, 2010 CFR
2010-04-01
... HOUSING AND URBAN DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS Optional Information for... from property lines and public roads are met. (b) Alterations. Prior to making any alteration to a home...) Installation of on-site structures. Each accessory building and structure is designed to support all of its own...
MRAC Control with Prior Model Knowledge for Asymmetric Damaged Aircraft
Zhang, Jing
2015-01-01
This paper develops a novel state-tracking multivariable model reference adaptive control (MRAC) technique that utilizes prior knowledge of plant models to recover the control performance of an aircraft with asymmetric structural damage. A modification of the linear model representation is given. With prior knowledge of the structural damage, a polytope linear parameter varying (LPV) model is derived to cover all damage conditions of concern. An MRAC method is developed for the polytope model, whose stability and asymptotic error convergence are theoretically proved. The proposed technique reduces the number of parameters to be adapted, thus decreasing computational cost, and requires less input information. The method is validated by simulations on the NASA generic transport model (GTM) with damage. PMID:26180839
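For intuition only, the following sketch simulates the basic MRAC mechanism on a scalar first-order plant; it is not the paper's polytope-LPV multivariable design, and all gains and parameters are invented.

```python
import numpy as np

# Minimal scalar MRAC illustration (Lyapunov-based adaptive law), not the
# paper's polytope-LPV multivariable design. Plant: dx = a*x + b*u with
# unknown a; reference model: dxm = am*xm + bm*r, with am < 0.
dt, T = 0.001, 10.0
a, b = 1.0, 1.0            # true (unknown to controller) plant parameters
am, bm = -2.0, 2.0         # stable reference model
gamma = 5.0                # adaptation gain
x = xm = 0.0
kx, kr = 0.0, 0.0          # adaptive feedback/feedforward gains
for t in np.arange(0.0, T, dt):
    r = 1.0 if int(t) % 2 == 0 else -1.0   # square-wave reference
    u = kx * x + kr * r
    e = x - xm                              # tracking error
    # Adaptive laws drive e -> 0 (gradient of a Lyapunov function)
    kx -= gamma * e * x * dt
    kr -= gamma * e * r * dt
    x += (a * x + b * u) * dt               # plant (Euler step)
    xm += (am * xm + bm * r) * dt           # reference model
print(f"final tracking error: {x - xm:.4f}")
```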
Limited-angle multi-energy CT using joint clustering prior and sparsity regularization
NASA Astrophysics Data System (ADS)
Zhang, Huayu; Xing, Yuxiang
2016-03-01
In this article, we present an easy-to-implement multi-energy CT scanning strategy and a corresponding reconstruction method, which facilitate spectral CT imaging by improving data efficiency by a factor of the number of energy channels without introducing the visible limited-angle artifacts caused by reducing projection views. Leveraging the structural coherence across energies, we first pre-reconstruct a prior structure information image using projection data from all energy channels. Then, we perform k-means clustering on the prior image to generate a sparse dictionary representation for the image, which serves as a structure information constraint. We combine this constraint with a conventional compressed sensing method and propose a new model, which we refer to as Joint Clustering Prior and Sparsity Regularization (CPSR). CPSR is a convex problem, and we solve it by the alternating direction method of multipliers (ADMM). We verify our CPSR reconstruction method with a numerical simulation experiment. A dental phantom with complicated structures of teeth and soft tissues is used. X-ray beams from three spectra of different peak energies (120 kVp, 90 kVp, 60 kVp) irradiate the phantom to form tri-energy projections. Projection data covering only 75° from each energy spectrum are collected for reconstruction. Independent reconstruction for each energy causes severe limited-angle artifacts even with the help of compressed sensing approaches. Our CPSR provides images free of limited-angle artifacts, and all edge details are well preserved in our experimental study.
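A rough sketch of the clustering-prior step described above, assuming scikit-learn is available; the toy image, cluster count, and the suggested penalty are illustrative, and the full CPSR/ADMM solver is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans

# Sketch of the clustering-prior step (not the full CPSR/ADMM solver):
# cluster a pre-reconstructed prior image so each pixel is represented by
# its cluster mean, giving a piecewise-constant structural constraint.
rng = np.random.default_rng(0)
prior_image = rng.normal(0, 0.02, (64, 64))
prior_image[16:48, 16:48] += 1.0           # toy "tooth" structure

K = 3                                      # number of tissue classes
pix = prior_image.reshape(-1, 1)
labels = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(pix)
means = np.array([pix[labels == k].mean() for k in range(K)])
structure_prior = means[labels].reshape(prior_image.shape)

# The reconstruction would then penalize deviation from this prior,
# e.g. ||x - structure_prior||_1, alongside the data-fidelity term.
print(np.unique(labels), means.round(2))
```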
Informative priors based on transcription factor structural class improve de novo motif discovery.
Narlikar, Leelavati; Gordân, Raluca; Ohler, Uwe; Hartemink, Alexander J
2006-07-15
An important problem in molecular biology is to identify the locations at which a transcription factor (TF) binds to DNA, given a set of DNA sequences believed to be bound by that TF. In previous work, we showed that information in the DNA sequence of a binding site is sufficient to predict the structural class of the TF that binds it. In particular, this suggests that we can predict which locations in any DNA sequence are more likely to be bound by certain classes of TFs than others. Here, we argue that traditional methods for de novo motif finding can be significantly improved by adopting an informative prior probability that a TF binding site occurs at each sequence location. To demonstrate the utility of such an approach, we present priority, a powerful new de novo motif finding algorithm. Using data from TRANSFAC, we train three classifiers to recognize binding sites of basic leucine zipper, forkhead, and basic helix loop helix TFs. These classifiers are used to equip priority with three class-specific priors, in addition to a default prior to handle TFs of other classes. We apply priority and a number of popular motif finding programs to sets of yeast intergenic regions that are reported by ChIP-chip to be bound by particular TFs. priority identifies motifs the other methods fail to identify, and correctly predicts the structural class of the TF recognizing the identified binding sites. Supplementary material and code can be found at http://www.cs.duke.edu/~amink/.
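To illustrate the idea of an informative positional prior, here is a toy example that adds a hypothetical classifier-derived prior to a PWM scan; the PWM, sequence, and prior values are invented and do not reproduce the priority algorithm.

```python
import numpy as np

# Sketch: combining a per-position informative prior with a PWM scan.
# classifier_prior[i] is a hypothetical probability (e.g., from a
# structural-class classifier) that a binding site starts at position i;
# the scan adds its log to the PWM log-odds, MAP-style.
base_idx = {"A": 0, "C": 1, "G": 2, "T": 3}
pwm = np.log(np.array([                    # toy 3-column PWM (probabilities)
    [0.7, 0.1, 0.1, 0.1],
    [0.1, 0.1, 0.7, 0.1],
    [0.1, 0.7, 0.1, 0.1],
]) / 0.25)                                 # log-odds vs uniform background

seq = "TTAGCAGCTT"
W = pwm.shape[0]
n = len(seq) - W + 1
classifier_prior = np.full(n, 1.0 / n)     # default: uninformative
classifier_prior[2] = 0.5                  # hypothetical class-specific boost
classifier_prior /= classifier_prior.sum()

scores = [
    pwm[np.arange(W), [base_idx[c] for c in seq[i:i + W]]].sum()
    + np.log(classifier_prior[i])
    for i in range(n)
]
print(int(np.argmax(scores)))              # most likely site start
```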
Contribution of prior semantic knowledge to new episodic learning in amnesia.
Kan, Irene P; Alexander, Michael P; Verfaellie, Mieke
2009-05-01
We evaluated whether prior semantic knowledge would enhance episodic learning in amnesia. Subjects studied prices that are either congruent or incongruent with prior price knowledge for grocery and household items and then performed a forced-choice recognition test for the studied prices. Consistent with a previous report, healthy controls' performance was enhanced by price knowledge congruency; however, only a subset of amnesic patients experienced the same benefit. Whereas patients with relatively intact semantic systems, as measured by an anatomical measure (i.e., lesion involvement of anterior and lateral temporal lobes), experienced a significant congruency benefit, patients with compromised semantic systems did not experience a congruency benefit. Our findings suggest that when prior knowledge structures are intact, they can support acquisition of new episodic information by providing frameworks into which such information can be incorporated.
The Role of Structure in Learning Non-Euclidean Geometry
ERIC Educational Resources Information Center
Asmuth, Jennifer A.
2009-01-01
How do people learn novel mathematical information that contradicts prior knowledge? The focus of this thesis is the role of structure in the acquisition of knowledge about hyperbolic geometry, a non-Euclidean geometry. In a series of three experiments, I contrast a more holistic structure--training based on closed figures--with a mathematically…
Gundlapalli, Adi V; Redd, Andrew; Carter, Marjorie E; Palmer, Miland; Peterson, Rachel; Samore, Matthew H
2014-01-01
There are limited data on resources utilized by US Veterans prior to their identification as being homeless. We performed visual analytics on longitudinal medical encounter data prior to the official recognition of homelessness in a large cohort of OEF/OIF Veterans. A statistically significant increase in numbers of several categories of visits in the immediate 30 days prior to the recognition of homelessness was noted as compared to an earlier period. This finding has the potential to inform prediction algorithms based on structured data with a view to intervention and mitigation of homelessness among Veterans.
Damage of composite structures: Detection technique, dynamic response and residual strength
NASA Astrophysics Data System (ADS)
Lestari, Wahyu
2001-10-01
Reliable and accurate health monitoring techniques can prevent catastrophic failures of structures. Conventional damage detection methods are based on visual or localized experimental methods and very often require prior information about the vicinity of the damage or defect. The structure must also be readily accessible for inspection, and the techniques are labor intensive. In comparison, health-monitoring techniques based on the structural dynamic response offer unique information on the failure of structures. However, systematic relations between the experimental data and the defect are not available, and frequently the number of vibration modes needed for an accurate identification of defects is much higher than the number of modes that can be readily identified in experiments. This motivated us to develop an experimental-data-based detection method with systematic relationships between the experimentally identified information and the analytical or mathematical model representing the defective structure. The developed technique uses changes in vibrational curvature modes and natural frequencies. To avoid misinterpretation of the identified information, we also need to understand the effects of defects on the structural dynamic response prior to developing health-monitoring techniques. In this thesis we focus on two types of defects in composite structures, namely delamination and edge-notch-like defects. Effects of nonlinearity due to the presence of a defect and due to axial stretching are studied for beams with delamination. Once defects are detected in a structure, the next concern is determining their effects on the strength of the structure and its residual stiffness under dynamic loading. In this thesis, the energy release rate due to dynamic loading in a delaminated structure is studied, as a foundation toward determining the residual strength of the structure.
40 CFR 745.326 - Renovation: State and Tribal program requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) TOXIC SUBSTANCES CONTROL ACT LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES... distribution of lead hazard information to owners and occupants of target housing and child-occupied facilities... distributing the lead hazard information to owners and occupants of housing and child-occupied facilities prior...
Prior Consent: Not-So-Strange Bedfellows Plan Library/Computing Partnerships.
ERIC Educational Resources Information Center
McDonough, Kristin
The increasing sophistication of information technologies and the nearly universal access to computing have blurred distinctions among information delivery units on college campuses, forcing institutions to rethink the separate organizational structures that evolved when computing in academe was more localized and less prevalent. Experiences in…
Ale, Angelique; Schulz, Ralf B; Sarantopoulos, Athanasios; Ntziachristos, Vasilis
2010-05-01
We study the performance of two newly introduced and previously suggested methods that incorporate priors into inversion schemes, using data from a recently developed hybrid x-ray computed tomography and fluorescence molecular tomography system, the latter based on CCD camera photon detection. The unique data set studied contains accurately registered data of highly spatially sampled photon fields propagating through tissue along 360-degree projections. Approaches that incorporate structural prior information were included in the inverse problem by adding a penalty term to the minimization function used for image reconstruction. The methods were compared on simulated and experimental data from a lung inflammation animal model, and against the inversions achieved without priors. The importance of using priors over stand-alone inversions is also showcased with highly spatially sampled simulated and experimental data. The approach of optimal performance in resolving fluorescent biodistribution in small animals is also discussed. Inclusion of prior information from x-ray CT data in the reconstruction of the fluorescence biodistribution leads to improved agreement between the reconstruction and validation images for both simulated and experimental data.
Inferring metabolic networks using the Bayesian adaptive graphical lasso with informative priors.
Peterson, Christine; Vannucci, Marina; Karakas, Cemal; Choi, William; Ma, Lihua; Maletić-Savatić, Mirjana
2013-10-01
Metabolic processes are essential for cellular function and survival. We are interested in inferring a metabolic network in activated microglia, a major neuroimmune cell in the brain responsible for the neuroinflammation associated with neurological diseases, based on a set of quantified metabolites. To achieve this, we apply the Bayesian adaptive graphical lasso with informative priors that incorporate known relationships between covariates. To encourage sparsity, the Bayesian graphical lasso places double exponential priors on the off-diagonal entries of the precision matrix. The Bayesian adaptive graphical lasso allows each double exponential prior to have a unique shrinkage parameter. These shrinkage parameters share a common gamma hyperprior. We extend this model to create an informative prior structure by formulating tailored hyperpriors on the shrinkage parameters. By choosing parameter values for each hyperprior that shift probability mass toward zero for nodes that are close together in a reference network, we encourage edges between covariates with known relationships. This approach can improve the reliability of network inference when the sample size is small relative to the number of parameters to be estimated. When applied to the data on activated microglia, the inferred network includes both known relationships and associations of potential interest for further investigation.
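One way to picture the tailored hyperpriors is sketched below; the mapping from reference-network distance to gamma hyperparameters is an invented stand-in, not the authors' parameterization.

```python
import numpy as np

# Illustration (not the authors' exact parameterization): give each
# off-diagonal shrinkage parameter lambda_ij a gamma hyperprior whose
# mean grows with distance in a reference network, so pairs that are
# close (known relationships) receive weaker shrinkage and their edges
# are encouraged.
rng = np.random.default_rng(1)

def gamma_hyperparams(graph_dist, shape=2.0, scale0=0.05):
    """Map reference-network distance to Gamma(shape, scale) values.

    Mean shrinkage = shape * scale0 * graph_dist: small for close
    pairs, larger (more shrinkage) for distant pairs.
    """
    return shape, scale0 * graph_dist

for d in (1, 2, 5):                        # reference-network distances
    sh, sc = gamma_hyperparams(d)
    lam = rng.gamma(sh, sc, size=3)        # sampled shrinkage parameters
    print(d, lam.round(3))
```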
Cross-sensory reference frame transfer in spatial memory: the case of proprioceptive learning.
Avraamides, Marios N; Sarrou, Mikaella; Kelly, Jonathan W
2014-04-01
In three experiments, we investigated whether the information available to visual perception prior to encoding the locations of objects in a path through proprioception would influence the reference direction from which the spatial memory was formed. Participants walked a path whose orientation was misaligned to the walls of the enclosing room and to the square sheet that covered the path prior to learning (Exp. 1) and, in addition, to the intrinsic structure of a layout studied visually prior to walking the path and to the orientation of stripes drawn on the floor (Exps. 2 and 3). Despite the availability of prior visual information, participants constructed spatial memories that were aligned with the canonical axes of the path, as opposed to the reference directions primed by visual experience. The results are discussed in the context of previous studies documenting transfer of reference frames within and across perceptual modalities.
Using comprehension strategies with authentic text in a college chemistry course
NASA Astrophysics Data System (ADS)
Cain, Stephen Daniel
College science students learn important topics by reading textbooks, which contain dense technical prose. Comprehension strategies are known to increase learning from reading. One class of comprehension strategies, called elaboration strategies, is intended to link new information with prior knowledge. Elaboration strategies have an appeal in science courses, where new information frequently depends on previously learned information. The purpose of this study was to determine the effectiveness of an elaboration strategy in an authentic college environment. General chemistry students read text about Lewis structures, figures drawn by chemists to depict molecules, while assigned to use either an elaboration strategy, namely elaborative interrogation, or another strategy, rereading, which served as a placebo control. Two texts of equal length were employed in this pretest-posttest experimental design. One was composed by the researcher. The other was an excerpt from a college textbook and contained a procedure for constructing Lewis structures. Students (N = 252) attending a large community college were randomly assigned to one of the two texts and assigned one of the two strategies. The elaborative interrogation strategy was implemented with instructions to answer why-questions posed throughout the reading. Answering why-questions has been hypothesized to activate prior knowledge of a topic, and thus to aid in cognitively connecting new material with prior knowledge. The rereading strategy was implemented with instructions to read the text twice. The use of authentic text was one of only a few instances of applying elaborative interrogation with a textbook. In addition, previous studies have generally focused on the learning of facts contained in prose; the application of elaborative interrogation to procedural text has not been previously reported. Results did not identify either strategy as more effective for reading authentic text in this setting. However, prior knowledge level was identified as a statistically significant factor for learning from authentic text: students with high prior knowledge learned more, regardless of assigned strategy. Another descriptive study was conducted with a separate student sample (N = 34), replicating previously reported Lewis structure research. The trend of difficulty for 50 structures in the earlier work was supported.
Low Resolution Refinement of Atomic Models Against Crystallographic Data.
Nicholls, Robert A; Kovalevskiy, Oleg; Murshudov, Garib N
2017-01-01
This review describes some of the problems encountered during low-resolution refinement and map calculation. Refinement is considered as an application of Bayes' theorem, allowing combination of information from various sources including crystallographic experimental data and prior chemical and structural knowledge. The sources of prior knowledge relevant to macromolecules include basic chemical information such as bonds and angles, structural information from reference models of known homologs, knowledge about secondary structures, hydrogen bonding patterns, and similarity of non-crystallographically related copies of a molecule. Additionally, prior information encapsulating local conformational conservation is exploited, keeping local interatomic distances similar to those in the starting atomic model. The importance of designing an accurate likelihood function, the only link between model parameters and observed data, is emphasized. The review also reemphasizes the importance of phases, and describes how the use of raw observed amplitudes could give a better correlation between the calculated and "true" maps. It is shown that very noisy or absent observations can be replaced by calculated structure factors, weighted according to the accuracy of the atomic model. This approach helps to smooth the map. However, such replacement should be used sparingly, as the bias toward errors in the model could otherwise become too strong. It is in general recommended that, whenever a new map is calculated, map quality should be judged by inspection of the parts of the map where there is no atomic model. It is also noted that it is advisable to work with multiple blurred and sharpened maps, as different parts of a crystal may exhibit different degrees of mobility. Doing so can allow accurate building of atomic models, accounting for overall shape as well as finer structural details. Some of the results described in this review have been implemented in the programs REFMAC5, ProSMART and LORESTR, which are available as part of the CCP4 software suite.
Freyer, Marcus; Ale, Angelique; Schulz, Ralf B; Zientkowska, Marta; Ntziachristos, Vasilis; Englmeier, Karl-Hans
2010-01-01
The recent development of hybrid imaging scanners that integrate fluorescence molecular tomography (FMT) and x-ray computed tomography (XCT) allows the utilization of x-ray information as image priors for improving optical tomography reconstruction. To fully capitalize on this capacity, we consider a framework for the automatic and fast detection of different anatomic structures in murine XCT images. To accurately differentiate between different structures such as bone, lung, and heart, a combination of image processing steps including thresholding, seed growing, and signal detection are found to offer optimal segmentation performance. The algorithm and its utilization in an inverse FMT scheme that uses priors is demonstrated on mouse images.
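For concreteness, a toy version of the threshold-plus-seed-growing step might look as follows; the 2D image, thresholds, and seed are hypothetical, and the published pipeline operates on 3D murine XCT volumes with an additional signal detection stage.

```python
import numpy as np
from collections import deque

# Sketch of a threshold-plus-seed-growing step as used for XCT organ
# segmentation (toy 2D version with synthetic intensities).
def region_grow(img, seed, lo, hi):
    """Collect the connected set of pixels within [lo, hi] around seed."""
    mask = np.zeros(img.shape, bool)
    q = deque([seed])
    while q:
        y, x = q.popleft()
        if mask[y, x] or not (lo <= img[y, x] <= hi):
            continue
        mask[y, x] = True
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]:
                q.append((ny, nx))
    return mask

img = np.zeros((32, 32)); img[8:20, 8:20] = 1000.0   # toy "bone" block
bone = region_grow(img, seed=(10, 10), lo=500, hi=2000)
print(bone.sum())                                     # 144 pixels found
```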
NASA Astrophysics Data System (ADS)
Zhang, Han; Chen, Xuefeng; Du, Zhaohui; Li, Xiang; Yan, Ruqiang
2016-04-01
Fault information of aero-engine bearings presents two particular phenomena, i.e., waveform distortion and dispersion of the impulsive feature frequency band, which poses a challenging problem for current techniques of bearing fault diagnosis. Moreover, although sparse representation theory has made much progress in feature extraction of fault information, it also confronts inevitable performance degradation because relatively weak fault information does not have sufficiently prominent and sparse representations. Therefore, a novel nonlocal sparse model (coined NLSM) and its algorithmic framework are proposed in this paper, which go beyond simple sparsity by introducing more intrinsic structures of feature information. This work exploits the underlying prior information that feature information exhibits nonlocal self-similarity, by clustering similar signal fragments and stacking them together into groups. Within this framework, the prior information is transformed into a regularization term, and a sparse optimization problem is formulated that can be solved by the block coordinate descent (BCD) method. Additionally, an adaptive structural-clustering sparse dictionary learning technique, which uses k-Nearest-Neighbor (kNN) clustering and principal component analysis (PCA) learning, is adopted to further ensure sufficient sparsity of feature information. Moreover, the selection rule for the regularization parameter and the computational complexity are described in detail. The performance of the proposed framework is evaluated through numerical experiments, and its superiority over the state-of-the-art method in the field is demonstrated on vibration signals from an experimental aircraft engine bearing rig.
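The grouping-plus-PCA dictionary idea can be sketched roughly as below; the fragment length, neighborhood size, and signal are arbitrary choices, and the full NLSM regularization and BCD solver are not shown.

```python
import numpy as np

# Sketch of the adaptive structural-clustering dictionary idea: group
# similar signal fragments by nearest-neighbor distance and learn a
# per-group PCA basis in which the group is sparsely representable.
rng = np.random.default_rng(0)
sig = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.1 * rng.normal(size=4000)

L, step, k = 64, 16, 8
patches = np.stack([sig[i:i + L] for i in range(0, len(sig) - L, step)])

ref = patches[0]                                    # exemplar fragment
d = np.linalg.norm(patches - ref, axis=1)
group = patches[np.argsort(d)[:k]]                  # k nearest fragments

centered = group - group.mean(0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
basis = Vt[:3]                                      # leading PCA atoms
coeffs = centered @ basis.T                         # compact group codes
print(basis.shape, np.abs(coeffs).mean().round(3))
```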
Overview of refinement procedures within REFMAC5: utilizing data from different sources.
Kovalevskiy, Oleg; Nicholls, Robert A; Long, Fei; Carlon, Azzurra; Murshudov, Garib N
2018-03-01
Refinement is a process that involves bringing into agreement the structural model, available prior knowledge and experimental data. To achieve this, the refinement procedure optimizes a posterior conditional probability distribution of model parameters, including atomic coordinates, atomic displacement parameters (B factors), scale factors, parameters of the solvent model and twin fractions in the case of twinned crystals, given observed data such as observed amplitudes or intensities of structure factors. A library of chemical restraints is typically used to ensure consistency between the model and the prior knowledge of stereochemistry. If the observation-to-parameter ratio is small, for example when diffraction data only extend to low resolution, the Bayesian framework implemented in REFMAC5 uses external restraints to inject additional information extracted from structures of homologous proteins, prior knowledge about secondary-structure formation and even data obtained using different experimental methods, for example NMR. The refinement procedure also generates the `best' weighted electron-density maps, which are useful for further model (re)building. Here, the refinement of macromolecular structures using REFMAC5 and related tools distributed as part of the CCP4 suite is discussed.
Towards a general theory of neural computation based on prediction by single neurons.
Fiorillo, Christopher D
2008-10-01
Although there has been tremendous progress in understanding the mechanics of the nervous system, there has not been a general theory of its computational function. Here I present a theory that relates the established biophysical properties of single generic neurons to principles of Bayesian probability theory, reinforcement learning and efficient coding. I suggest that this theory addresses the general computational problem facing the nervous system. Each neuron is proposed to mirror the function of the whole system in learning to predict aspects of the world related to future reward. According to the model, a typical neuron receives current information about the state of the world from a subset of its excitatory synaptic inputs, and prior information from its other inputs. Prior information would be contributed by synaptic inputs representing distinct regions of space, and by different types of non-synaptic, voltage-regulated channels representing distinct periods of the past. The neuron's membrane voltage is proposed to signal the difference between current and prior information ("prediction error" or "surprise"). A neuron would apply a Hebbian plasticity rule to select those excitatory inputs that are the most closely correlated with reward but are the least predictable, since unpredictable inputs provide the neuron with the most "new" information about future reward. To minimize the error in its predictions and to respond only when excitation is "new and surprising," the neuron selects amongst its prior information sources through an anti-Hebbian rule. The unique inputs of a mature neuron would therefore result from learning about spatial and temporal patterns in its local environment, and by extension, the external world. Thus the theory describes how the structure of the mature nervous system could reflect the structure of the external world, and how the complexity and intelligence of the system might develop from a population of undifferentiated neurons, each implementing similar learning algorithms.
Opposite Effects of Context on Immediate Structural and Lexical Processing.
ERIC Educational Resources Information Center
Harris, John W.
The testing of a number of hypotheses about the effect of hearing a prior context sentence on immediate processing of a subsequent target sentence is described. According to the standard deep structure model, higher level processing (e.g. semantic interpretation, integration of context-target information) does not occur immediately as speech is…
Bayesian Structural Equation Modeling: A More Flexible Representation of Substantive Theory
ERIC Educational Resources Information Center
Muthen, Bengt; Asparouhov, Tihomir
2012-01-01
This article proposes a new approach to factor analysis and structural equation modeling using Bayesian analysis. The new approach replaces parameter specifications of exact zeros with approximate zeros based on informative, small-variance priors. It is argued that this produces an analysis that better reflects substantive theories. The proposed…
Fong, Ted C T; Ho, Rainbow T H
2015-01-01
The aim of this study was to reexamine the dimensionality of the widely used 9-item Utrecht Work Engagement Scale using the maximum likelihood (ML) approach and Bayesian structural equation modeling (BSEM) approach. Three measurement models (1-factor, 3-factor, and bi-factor models) were evaluated in two split samples of 1,112 health-care workers using confirmatory factor analysis and BSEM, which specified small-variance informative priors for cross-loadings and residual covariances. Model fit and comparisons were evaluated by posterior predictive p-value (PPP), deviance information criterion, and Bayesian information criterion (BIC). None of the three ML-based models showed an adequate fit to the data. The use of informative priors for cross-loadings did not improve the PPP for the models. The 1-factor BSEM model with approximately zero residual covariances displayed a good fit (PPP>0.10) to both samples and a substantially lower BIC than its 3-factor and bi-factor counterparts. The BSEM results demonstrate empirical support for the 1-factor model as a parsimonious and reasonable representation of work engagement.
Heudtlass, Peter; Guha-Sapir, Debarati; Speybroeck, Niko
2018-05-31
The crude death rate (CDR) is one of the defining indicators of humanitarian emergencies. When data from vital registration systems are not available, it is common practice to estimate the CDR from household surveys with cluster-sampling design. However, sample sizes are often too small to compare mortality estimates to emergency thresholds, at least in a frequentist framework. Several authors have proposed Bayesian methods for health surveys in humanitarian crises. Here, we develop an approach specifically for mortality data and cluster-sampling surveys. We describe a Bayesian hierarchical Poisson-Gamma mixture model with generic (weakly informative) priors that could be used as a default in the absence of any specific prior knowledge, and compare Bayesian and frequentist CDR estimates using five different mortality datasets. We provide an interpretation of the Bayesian estimates in the context of an emergency threshold and demonstrate how to interpret parameters at the cluster level and ways in which informative priors can be introduced. With the same set of weakly informative priors, Bayesian CDR estimates are equivalent to frequentist estimates for all practical purposes. The probability that the CDR surpasses the emergency threshold can be derived directly from the posterior of the mean of the mixing distribution. All observations in the datasets contribute to cluster-level estimates through the hierarchical structure of the model. In a context of sparse data, Bayesian mortality assessments have advantages over frequentist ones even when using only weakly informative priors. More informative priors offer a formal and transparent way of combining new data with existing data and expert knowledge, and can help to improve decision-making in humanitarian crises by complementing frequentist estimates.
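Assuming a single-level conjugate simplification (the paper's model is hierarchical with cluster-level effects), the threshold probability can be computed directly from a Gamma posterior, as in this sketch with invented data.

```python
import numpy as np
from scipy import stats

# Conjugate sketch of the idea: deaths ~ Poisson(CDR * exposure),
# CDR ~ Gamma(a0, b0) with a weakly informative prior. All numbers
# below are invented for illustration.
deaths = np.array([3, 1, 4, 0, 2, 5, 1, 2])          # per cluster
person_days = np.array([2100, 1800, 2500, 1600,
                        2000, 2700, 1500, 1900])     # per cluster
a0, b0 = 0.5, 500.0                                  # weak prior (rate per person-day)

a_post = a0 + deaths.sum()                           # conjugate Gamma update
b_post = b0 + person_days.sum()
threshold = 1.0 / 10000.0                            # 1 death/10,000/day emergency level
p_exceed = stats.gamma.sf(threshold, a_post, scale=1.0 / b_post)
print(f"posterior mean CDR: {a_post / b_post:.2e} per person-day")
print(f"P(CDR > emergency threshold) = {p_exceed:.3f}")
```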
NASA Astrophysics Data System (ADS)
Norton, Andrew S.
An integral component of managing game species is an understanding of population dynamics and relative abundance. Harvest data are frequently used to estimate abundance of white-tailed deer. Unless harvest age-structure is representative of the population age-structure and harvest vulnerability remains constant from year to year, these data alone are of limited value. Additional model structure and auxiliary information have accommodated this shortcoming. Specifically, integrated age-at-harvest (AAH) state-space population models can formally combine multiple sources of data, and regularization via hierarchical model structure can increase the flexibility of model parameters. I collected known-fates data, which I evaluated and used to inform trends in survival parameters for an integrated AAH model. I used temperature and snow depth covariates to predict survival outside of the hunting season, and opening-weekend temperature and percent-of-corn-harvest covariates to predict hunting season survival. When auxiliary empirical data were unavailable for the AAH model, moderately informative priors provided sufficient information for convergence and parameter estimates. The AAH model was most sensitive to errors in initial abundance, but this error was calibrated after 3 years. Among vital rates, the AAH model was most sensitive to reporting rates (the percentage of mortality during the hunting season related to harvest). The AAH model, using only harvest data, was able to track changing abundance trends due to changes in survival rates even when prior models did not inform these changes (i.e., prior models were constant when truth varied). I also compared AAH model results with estimates from the Wisconsin Department of Natural Resources (WIDNR). Trends in abundance estimates from both models were similar, although AAH model predictions were systematically higher than WIDNR estimates in the East study area. When I incorporated auxiliary information (i.e., the integrated AAH model) about survival outside the hunting season from known-fates data, predicted trends appeared more consistent with expectations. Disagreements between the AAH model and WIDNR estimates in the East were likely related to biased predictions of reporting and survival rates from the AAH model.
Reading Comprehension Performance of Adolescents with Learning Disabilities.
ERIC Educational Resources Information Center
Snider, Vicki E.
1989-01-01
The study found that instructing 13 learning-disabled junior high students in the necessary prior knowledge (information and vocabulary concepts) led to superior reading comprehension performance. Textually explicit text structure also improved reading comprehension. (DB)
Morris, William K; Vesk, Peter A; McCarthy, Michael A; Bunyavejchewin, Sarayudh; Baker, Patrick J
2015-01-01
Despite benefits for precision, ecologists rarely use informative priors. One reason that ecologists may prefer vague priors is the perception that informative priors reduce accuracy. To date, no ecological study has empirically evaluated data-derived informative priors' effects on precision and accuracy. To determine the impacts of priors, we evaluated mortality models for tree species using data from a forest dynamics plot in Thailand. Half the models used vague priors, and the remaining half had informative priors. We found precision was greater when using informative priors, but effects on accuracy were more variable. In some cases, prior information improved accuracy, while in others, it was reduced. On average, models with informative priors were no more or less accurate than models without. Our analyses provide a detailed case study on the simultaneous effect of prior information on precision and accuracy and demonstrate that when priors are specified appropriately, they lead to greater precision without systematically reducing model accuracy. PMID:25628867
Hommes, J; Rienties, B; de Grave, W; Bos, G; Schuwirth, L; Scherpbier, A
2012-12-01
Worldwide, universities in the health sciences have transformed their curricula to include collaborative learning and facilitate the students' learning process. Interaction has been acknowledged to be the synergistic element in this learning context. However, students spend the majority of their time outside the classroom, and interaction does not stop there. We therefore studied how informal social interaction influences student learning. Moreover, to explore what really matters in the students' learning process, we tested a model of how the well-established constructs of prior performance, motivation, and social integration relate to informal social interaction and student learning. 301 undergraduate medical students participated in this cross-sectional quantitative study. Informal social interaction was assessed using self-reported surveys following the network approach. Students' individual motivation, social integration, and prior performance were assessed by the Academic Motivation Scale, the College Adaptation Questionnaire, and students' GPA, respectively. A factual knowledge test represented students' learning. All social networks were significantly positively associated with student learning: friendships (β = 0.11), providing information to other students (β = 0.16), and receiving information from other students (β = 0.25). Structural equation modelling revealed a model in which social networks increased student learning (r = 0.43), followed by prior performance (r = 0.31). In contrast to prior literature, students' academic motivation and social integration were not associated with students' learning. Students' informal social interaction is strongly associated with their learning. These findings underline the need to shift our focus from the formal context (the classroom) to the informal context to optimize student learning and deliver modern medics.
ERIC Educational Resources Information Center
Rhoads, Christopher
2014-01-01
Recent publications have drawn attention to the idea of utilizing prior information about the correlation structure to improve statistical power in cluster randomized experiments. Because power in cluster randomized designs is a function of many different parameters, it has been difficult for applied researchers to discern a simple rule explaining…
Applying Schema Theory to Mass Media Information Processing: Moving toward a Formal Model.
ERIC Educational Resources Information Center
Wicks, Robert H.
Schema theory may be significant in determining if and how news audiences process information. For any given news topic, people have from none to many schemata (cognitive structures that represent organized knowledge about a given concept or type of stimulus abstracted from prior experience) upon which to draw. Models of how schemata are used…
Bayesian structural equation modeling in sport and exercise psychology.
Stenling, Andreas; Ivarsson, Andreas; Johnson, Urban; Lindwall, Magnus
2015-08-01
Bayesian statistics is on the rise in mainstream psychology, but applications in sport and exercise psychology research are scarce. In this article, the foundations of Bayesian analysis are introduced, and we will illustrate how to apply Bayesian structural equation modeling in a sport and exercise psychology setting. More specifically, we contrasted a confirmatory factor analysis on the Sport Motivation Scale II estimated with the most commonly used estimator, maximum likelihood, and a Bayesian approach with weakly informative priors for cross-loadings and correlated residuals. The results indicated that the model with Bayesian estimation and weakly informative priors provided a good fit to the data, whereas the model estimated with a maximum likelihood estimator did not produce a well-fitting model. The reasons for this discrepancy between maximum likelihood and Bayesian estimation are discussed as well as potential advantages and caveats with the Bayesian approach.
Rhodes, Ashley E; Rozell, Timothy G
2017-09-01
Cognitive flexibility is defined as the ability to assimilate previously learned information and concepts to generate novel solutions to new problems. This skill is crucial for success within ill-structured domains such as biology, physiology, and medicine, where many concepts are simultaneously required for understanding a complex problem, yet the problem consists of patterns or combinations of concepts that are not consistently used or needed across all examples. To succeed within ill-structured domains, a student must possess a certain level of cognitive flexibility: rigid thought processes and prepackaged informational retrieval schemes relying on rote memorization will not suffice. In this study, we assessed the cognitive flexibility of undergraduate physiology students using a validated instrument entitled Student's Approaches to Learning (SAL). The SAL evaluates how deeply and in what way information is processed, as well as the investment of time and mental energy that a student is willing to expend by measuring constructs such as elaboration and memorization. Our results indicate that students who rely primarily on memorization when learning new information have a smaller knowledge base about physiological concepts, as measured by a prior knowledge assessment and unit exams. However, students who rely primarily on elaboration when learning new information have a more well-developed knowledge base about physiological concepts, which is displayed by higher scores on a prior knowledge assessment and increased performance on unit exams. Thus students with increased elaboration skills possibly possess a higher level of cognitive flexibility and are more likely to succeed within ill-structured domains. Copyright © 2017 the American Physiological Society.
NASA Astrophysics Data System (ADS)
Koskela, J. J.; Croke, B. W. F.; Koivusalo, H.; Jakeman, A. J.; Kokkonen, T.
2012-11-01
Bayesian inference is used to study the effect of precipitation and model structural uncertainty on estimates of model parameters and confidence limits of predictive variables in a conceptual rainfall-runoff model in the snow-fed Rudbäck catchment (142 ha) in southern Finland. The IHACRES model is coupled with a simple degree day model to account for snow accumulation and melt. The posterior probability distribution of the model parameters is sampled by using the Differential Evolution Adaptive Metropolis (DREAM(ZS)) algorithm and the generalized likelihood function. Precipitation uncertainty is taken into account by introducing additional latent variables that were used as multipliers for individual storm events. Results suggest that occasional snow water equivalent (SWE) observations together with daily streamflow observations do not contain enough information to simultaneously identify model parameters, precipitation uncertainty and model structural uncertainty in the Rudbäck catchment. The addition of an autoregressive component to account for model structure error and latent variables having uniform priors to account for input uncertainty lead to dubious posterior distributions of model parameters. Thus our hypothesis that informative priors for latent variables could be replaced by additional SWE data could not be confirmed. The model was found to work adequately in 1-day-ahead simulation mode, but the results were poor in the simulation batch mode. This was caused by the interaction of parameters that were used to describe different sources of uncertainty. The findings may have lessons for other cases where parameterizations are similarly high in relation to available prior information.
Ter Wal, Anne L.J.; Alexy, Oliver; Block, Jörn; Sandner, Philipp G.
2016-01-01
Open networks give actors non-redundant information that is diverse, while closed networks offer redundant information that is easier to interpret. Integrating arguments about network structure and the similarity of actors’ knowledge, we propose two types of network configurations that combine diversity and ease of interpretation. Closed-diverse networks offer diversity in actors’ knowledge domains and shared third-party ties to help in interpreting that knowledge. In open-specialized networks, structural holes offer diversity, while shared interpretive schema and overlap between received information and actors’ prior knowledge help in interpreting new information without the help of third parties. In contrast, actors in open-diverse networks suffer from information overload due to the lack of shared schema or overlapping prior knowledge for the interpretation of diverse information, and actors in closed-specialized networks suffer from overembeddedness because they cannot access diverse information. Using CrunchBase data on early-stage venture capital investments in the U.S. information technology sector, we test the effect of investors’ social capital on the success of their portfolio ventures. We find that ventures have the highest chances of success if their syndicating investors have either open-specialized or closed-diverse networks. These effects are manifested beyond the direct effects of ventures’ or investors’ quality and are robust to controlling for the possibility that certain investors could have chosen more promising ventures at the time of first funding. PMID:27499546
ERIC Educational Resources Information Center
Bogo, Marion; Lee, Barbara; McKee, Eileen; Ramjattan, Roxanne; Baird, Stephanie L.
2017-01-01
To strengthen students' preparation for engaging in field learning, an innovation was implemented to teach and assess foundation-year students' performance prior to entering field education. An Objective Structured Clinical Examination informed the final evaluation of students' performance in two companion courses on practice theory and skills.…
ERIC Educational Resources Information Center
Vosniadou, Stella
Analogical reasoning is one mechanism that has been recognized as having the potential of bringing prior knowledge to bear on the acquisition of new information. Analogical reasoning involves the identification and transfer of structural information from a known system to a new and relatively unknown system. The productive use of analogy is often…
Integration of prior knowledge into dense image matching for video surveillance
NASA Astrophysics Data System (ADS)
Menze, M.; Heipke, C.
2014-08-01
Three-dimensional information from dense image matching is a valuable input for a broad range of vision applications. While reliable approaches exist for dedicated stereo setups, they do not easily generalize to more challenging camera configurations. In the context of video surveillance, the typically large spatial extent of the region of interest and repetitive structures in the scene render the application of dense image matching a challenging task. In this paper we present an approach that derives strong prior knowledge from a planar approximation of the scene. This information is integrated into a graph-cut-based image matching framework that treats the assignment of optimal disparity values as a labelling task. Introducing the planar prior heavily reduces ambiguities together with the search space and increases computational efficiency. The results provide a proof of concept of the proposed approach: it allows the reconstruction of dense point clouds in more general surveillance camera setups with wider stereo baselines.
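A much-simplified illustration of the planar prior follows: only the unary matching cost is re-weighted by a plane-derived disparity, whereas the paper embeds the prior in a graph-cut labelling with smoothness terms; all values here are synthetic.

```python
import numpy as np

# Simplified unary-cost illustration of a planar disparity prior.
rng = np.random.default_rng(0)
H, W, D = 48, 64, 32                       # image size, disparity labels

cost = rng.uniform(0, 1, (H, W, D))        # stand-in for matching costs

# Expected disparity from a planar scene approximation d = a*x + b*y + c.
a, b, c = 0.2, 0.05, 4.0
xs, ys = np.meshgrid(np.arange(W), np.arange(H))
d_plane = a * xs + b * ys + c

lam = 0.05                                 # prior strength
labels = np.arange(D)[None, None, :]
prior = lam * np.abs(labels - d_plane[:, :, None])
disparity = np.argmin(cost + prior, axis=2)  # per-pixel labelling (no graph cut)
print(disparity.min(), disparity.max())
```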
Li, Ziyi; Safo, Sandra E; Long, Qi
2017-07-11
Sparse principal component analysis (PCA) is a popular tool for dimensionality reduction, pattern recognition, and visualization of high dimensional data. It has been recognized that complex biological mechanisms occur through concerted relationships of multiple genes working in networks that are often represented by graphs. Recent work has shown that incorporating such biological information improves feature selection and prediction performance in regression analysis, but there has been limited work on extending this approach to PCA. In this article, we propose two new sparse PCA methods, called Fused and Grouped sparse PCA, that enable incorporation of prior biological information in variable selection. Our simulation studies suggest that, compared to existing sparse PCA methods, the proposed methods achieve higher sensitivity and specificity when the graph structure is correctly specified, and are fairly robust to misspecified graph structures. Application to a glioblastoma gene expression dataset identified pathways that are suggested in the literature to be related to glioblastoma. The proposed Fused and Grouped sparse PCA methods can effectively incorporate prior biological information in variable selection, leading to improved feature selection, more interpretable principal component loadings, and potentially insights into the molecular underpinnings of complex diseases.
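As a generic sketch (not the authors' Fused or Grouped estimators), a graph-guided sparse loading can be obtained by a penalized power iteration, as below; the Laplacian, penalties, and data are invented.

```python
import numpy as np

def soft(v, t):
    """Elementwise soft-thresholding (sparsity-inducing prox)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def graph_sparse_pc(X, Lap, alpha=0.5, tau=0.02, iters=200):
    """One sparse loading via penalized power iteration.

    Subtracting alpha * Lap rewards loadings that vary smoothly over
    the known gene network; soft-thresholding enforces sparsity.
    """
    S = X.T @ X / len(X)                   # sample covariance
    M = S - alpha * Lap
    v = np.ones(S.shape[0]) / np.sqrt(S.shape[0])
    for _ in range(iters):
        v = soft(M @ v, tau)
        n = np.linalg.norm(v)
        if n == 0:
            break
        v /= n
    return v

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X[:, :3] += rng.normal(size=(200, 1))      # genes 0-2 share a "pathway" factor
Lap = np.array([[ 2, -1, -1, 0, 0, 0],     # toy Laplacian linking genes 0-2
                [-1,  2, -1, 0, 0, 0],
                [-1, -1,  2, 0, 0, 0],
                [ 0,  0,  0, 0, 0, 0],
                [ 0,  0,  0, 0, 0, 0],
                [ 0,  0,  0, 0, 0, 0]], float)
print(graph_sparse_pc(X, Lap).round(2))    # loading concentrates on genes 0-2
```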
Zhang, Xiao-Fei; Ou-Yang, Le; Yan, Hong
2017-08-15
Understanding how gene regulatory networks change under different cellular states is important for revealing insights into network dynamics. Gaussian graphical models, which assume that the data follow a joint normal distribution, have been used recently to infer differential networks. However, the distributions of omics data are non-normal in general. Furthermore, although much biological knowledge (or prior information) has been accumulated, most existing methods ignore this valuable prior information. Therefore, new statistical methods are needed to relax the normality assumption and make full use of prior information. We propose a new differential network analysis method to address the above challenges. Instead of using Gaussian graphical models, we employ a non-paranormal graphical model that can relax the normality assumption. We develop a principled model to take into account the following prior information: (i) a differential edge is less likely to exist between two genes that do not participate together in the same pathway; (ii) changes in the networks are driven by certain regulator genes that are perturbed across different cellular states; and (iii) the differential networks estimated from multi-view gene expression data likely share common structures. Simulation studies demonstrate that our method outperforms other graphical model-based algorithms. We apply our method to identify the differential networks between platinum-sensitive and platinum-resistant ovarian tumors, and the differential networks between the proneural and mesenchymal subtypes of glioblastoma. Hub nodes in the estimated differential networks rediscover known cancer-related regulator genes and contain interesting predictions. The source code is at https://github.com/Zhangxf-ccnu/pDNA. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved.
Prospective regularization design in prior-image-based reconstruction
NASA Astrophysics Data System (ADS)
Dang, Hao; Siewerdsen, Jeffrey H.; Webster Stayman, J.
2015-12-01
Prior-image-based reconstruction (PIBR) methods leveraging patient-specific anatomical information from previous imaging studies and/or sequences have demonstrated dramatic improvements in dose utilization and image quality for low-fidelity data. However, a proper balance of information from the prior images and information from the measurements is required (e.g. through careful tuning of regularization parameters). Inappropriate selection of reconstruction parameters can lead to detrimental effects including false structures and failure to improve image quality. Traditional methods based on heuristics are subject to error and sub-optimal solutions, while exhaustive searches require a large number of computationally intensive image reconstructions. In this work, we propose a novel method that prospectively estimates the optimal amount of prior image information for accurate admission of specific anatomical changes in PIBR without performing full image reconstructions. This method leverages an analytical approximation to the implicitly defined PIBR estimator, and introduces a predictive performance metric leveraging this analytical form and knowledge of a particular presumed anatomical change whose accurate reconstruction is sought. Additionally, since model-based PIBR approaches tend to be space-variant, a spatially varying prior image strength map is proposed to optimally admit changes everywhere in the image (eliminating the need to know change locations a priori). Studies were conducted in both an ellipse phantom and a realistic thorax phantom emulating a lung nodule surveillance scenario. The proposed method demonstrated accurate estimation of the optimal prior image strength while achieving a substantial computational speedup (about a factor of 20) compared to traditional exhaustive search. Moreover, the use of the proposed prior strength map in PIBR demonstrated accurate reconstruction of anatomical changes without foreknowledge of change locations in phantoms where the optimal parameters vary spatially by an order of magnitude or more. In a series of studies designed to explore potential unknowns associated with accurate PIBR, optimal prior image strength was found to vary with attenuation differences associated with anatomical change but exhibited only small variations as a function of the shape and size of the change. The results suggest that, given a target change attenuation, prospective patient-, change-, and data-specific customization of the prior image strength can be performed to ensure reliable reconstruction of specific anatomical changes.
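The basic PIBR trade-off can be illustrated with a quadratic prior-image penalty solved in closed form (the actual PIBR objectives and the proposed analytical predictor are more elaborate); the system and the anatomical change below are synthetic.

```python
import numpy as np

# Sketch of the PIBR balance: x* = argmin ||A x - y||^2 + beta ||x - x_prior||^2.
# Too little prior strength fails to stabilize the under-determined
# problem; too much suppresses the true anatomical change.
rng = np.random.default_rng(0)
n, m = 40, 25
A = rng.normal(size=(m, n))                # under-determined system (low-fidelity data)
x_true = np.zeros(n); x_true[10:14] = 1.0  # anatomy with a change
x_prior = np.zeros(n)                      # prior image lacks the change
y = A @ x_true + 0.01 * rng.normal(size=m)

for beta in (0.01, 1.0, 100.0):
    x = np.linalg.solve(A.T @ A + beta * np.eye(n),
                        A.T @ y + beta * x_prior)
    err = np.linalg.norm(x - x_true)
    print(f"beta={beta:>6}: error vs truth {err:.3f}")
```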
Ruth, Veikko; Kolditz, Daniel; Steiding, Christian; Kalender, Willi A
2017-06-01
The performance of metal artifact reduction (MAR) methods in x-ray computed tomography (CT) suffers from incorrect identification of metallic implants in the artifact-affected volumetric images. The aim of this study was to investigate potential improvements of state-of-the-art MAR methods by using prior information on geometry and material of the implant. The influence of a novel prior knowledge-based segmentation (PS) compared with threshold-based segmentation (TS) on 2 MAR methods (linear interpolation [LI] and normalized-MAR [NORMAR]) was investigated. The segmentation is the initial step of both MAR methods. Prior knowledge-based segmentation uses 3-dimensional registered computer-aided design (CAD) data as prior knowledge to estimate the correct position and orientation of the metallic objects. Threshold-based segmentation uses an adaptive threshold to identify metal. Subsequently, for LI and NORMAR, the selected voxels are projected into the raw data domain to mark metal areas. Attenuation values in these areas are replaced by different interpolation schemes followed by a second reconstruction. Finally, the previously selected metal voxels are replaced by the metal voxels determined by PS or TS in the initial reconstruction. First, we investigated in an elaborate phantom study if the knowledge of the exact implant shape extracted from the CAD data provided by the manufacturer of the implant can improve the MAR result. Second, the leg of a human cadaver was scanned using a clinical CT system before and after the implantation of an artificial knee joint. The results were compared regarding segmentation accuracy, CT number accuracy, and the restoration of distorted structures. The use of PS improved the efficacy of LI and NORMAR compared with TS. Artifacts caused by insufficient segmentation were reduced, and additional information was made available within the projection data. The estimation of the implant shape was more exact and not dependent on a threshold value. Consequently, the visibility of structures was improved when comparing the new approach to the standard method. This was further confirmed by improved CT value accuracy and reduced image noise. The PS approach based on prior implant information provides image quality which is superior to TS-based MAR, especially when the shape of the metallic implant is complex. The new approach can be useful for improving MAR methods and dose calculations within radiation therapy based on the MAR corrected CT images.
NASA Astrophysics Data System (ADS)
Lauvaux, Thomas; Miles, Natasha L.; Deng, Aijun; Richardson, Scott J.; Cambaliza, Maria O.; Davis, Kenneth J.; Gaudet, Brian; Gurney, Kevin R.; Huang, Jianhua; O'Keefe, Darragh; Song, Yang; Karion, Anna; Oda, Tomohiro; Patarasuk, Risa; Razlivanov, Igor; Sarmiento, Daniel; Shepson, Paul; Sweeney, Colm; Turnbull, Jocelyn; Wu, Kai
2016-05-01
Based on a uniquely dense network of surface towers continuously measuring atmospheric concentrations of greenhouse gases (GHGs), we developed the first comprehensive monitoring system of CO2 emissions at high resolution over the city of Indianapolis. The urban inversion evaluated over the 2012-2013 dormant season showed a statistically significant increase of about 20% (from 4.5 to 5.7 MtC ± 0.23 MtC) compared to the Hestia CO2 emission estimate, a state-of-the-art building-level emission product. Spatial structures in prior emission errors, mostly undetermined, appeared to affect the spatial pattern in the inverse solution and the total carbon budget over the entire area by up to 15%, while the inverse solution remains fairly insensitive to the CO2 boundary inflow and to the choice of prior emissions (i.e., ODIAC). Preceding the surface emission optimization, we improved the atmospheric simulations using a meteorological data assimilation system, which also informed our Bayesian inversion through updated observation error variances. Finally, we estimated the uncertainties associated with undetermined parameters using an ensemble of inversions. The total CO2 emissions based on the ensemble mean and quartiles (5.26-5.91 MtC) were statistically different from the prior total emissions (4.1 to 4.5 MtC). Considering the relatively small sensitivity to the different parameters, we conclude that atmospheric inversions can potentially constrain the carbon budget of the city, assuming sufficient data to measure the inflow of GHGs over the city, but additional information on prior emission error structures is required to determine the spatial structures of urban emissions at high resolution.
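For readers unfamiliar with the machinery, a minimal toy version of such a Bayesian inversion update is sketched below, with an exponentially correlated prior error covariance standing in for the spatial structures discussed; all matrices are synthetic stand-ins for the real transport and observation operators.

```python
import numpy as np

# Minimal Bayesian flux-inversion update (toy values, not the
# Indianapolis system): x_a = x_b + B H^T (H B H^T + R)^{-1} (y - H x_b).
rng = np.random.default_rng(0)
n, m = 30, 12                              # grid cells, tower observations

# Prior emission error covariance with exponentially decaying spatial
# correlation, the component the study found most influential.
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
B = 0.2**2 * np.exp(-dist / 5.0)

H = rng.uniform(0, 1, (m, n)) / n          # stand-in transport footprints
R = 0.05**2 * np.eye(m)                    # observation error covariance

x_b = np.ones(n)                           # prior emissions (Hestia-like stand-in)
x_true = x_b * 1.2                         # 20% higher true emissions
y = H @ x_true + 0.01 * rng.normal(size=m)

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman-style gain
x_a = x_b + K @ (y - H @ x_b)
print(f"prior total: {x_b.sum():.1f}  posterior total: {x_a.sum():.1f}")
```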
Naive Physics, Event Perception, Lexical Semantics, and Language Acquisition
1993-04-01
settings within a framework of universal grammar. His central claim is that children use primarily unembedded material as evidence for the parameter...differentiate embedded from unembedded material. Deriving such structural information requires that the learner determine constituent order prior to other
Reactions of Thiocyanate Ions with Acid: A Laboratory Experiment.
ERIC Educational Resources Information Center
Glidewell, Christopher; And Others
1984-01-01
Background information, procedures, and typical results are provided for a three-part experiment involving reactions of potassium thiocyanate (KNCS) with sulfuric acid. The experiment represents the final stage of structured work prior to students' research projects during their final year. (JM)
Scott L. Stephens; Jamie M. Lydersen; Brandon M. Collins; Danny L. Fry; Marc D. Meyer
2015-01-01
Many managers today are tasked with restoring forests to mitigate the potential for uncharacteristically severe fire. One challenge to this mandate is the lack of large-scale reference information on forest structure prior to impacts from Euro-American settlement. We used a robust 1911 historical dataset that covers a large geographic extent (>10,000 ha) and has...
ERIC Educational Resources Information Center
Day, Jeanne D.; Engelhardt, Jean
Two studies examined how the factors of content-relevant knowledge and text organization influence students' abilities to study and to remember text information. The first experiment examined the effect of prior content knowledge on students' ability to identify important information in the text. Forty 7th- and forty 11th-grade students, experts…
Ter Wal, Anne L J; Alexy, Oliver; Block, Jörn; Sandner, Philipp G
2016-09-01
Open networks give actors non-redundant information that is diverse, while closed networks offer redundant information that is easier to interpret. Integrating arguments about network structure and the similarity of actors' knowledge, we propose two types of network configurations that combine diversity and ease of interpretation. Closed-diverse networks offer diversity in actors' knowledge domains and shared third-party ties to help in interpreting that knowledge. In open-specialized networks, structural holes offer diversity, while shared interpretive schema and overlap between received information and actors' prior knowledge help in interpreting new information without the help of third parties. In contrast, actors in open-diverse networks suffer from information overload due to the lack of shared schema or overlapping prior knowledge for the interpretation of diverse information, and actors in closed-specialized networks suffer from overembeddedness because they cannot access diverse information. Using CrunchBase data on early-stage venture capital investments in the U.S. information technology sector, we test the effect of investors' social capital on the success of their portfolio ventures. We find that ventures have the highest chances of success if their syndicating investors have either open-specialized or closed-diverse networks. These effects are manifested beyond the direct effects of ventures' or investors' quality and are robust to controlling for the possibility that certain investors could have chosen more promising ventures at the time of first funding.
A path analysis of Internet health information seeking behaviors among older adults.
Chang, Sun Ju; Im, Eun-Ok
2014-01-01
The Internet has emerged as an innovative tool that older adults can use to obtain health-related information. However, the relationships among predictors of Internet health information seeking behaviors (IHISB) in this population are not well understood. To fill this gap, this study examined the direct and indirect pathways of potential predictors of IHISB among older South Korean adults, using the modified Technology Acceptance Model 3. Participants were 300 older South Korean adults who had used the Internet to obtain health information within the past month. Data were collected via a self-report questionnaire and were analyzed through structural equation modeling. Two variables, prior experience and behavioral intention to use, had positive direct effects on IHISB. These findings imply that health care providers promoting IHISB among older adults should consider these individuals' prior experience with the Internet and their willingness to use the Internet as a source of health information. Copyright © 2014 Mosby, Inc. All rights reserved.
The Leisure Activities of Mental Patients Prior to Hospitalization.
ERIC Educational Resources Information Center
Babow, Irving; Simkin, Sol
To study the leisure activities, social participation, and organizational participation of mental patients before hospital admission, a three-part research instrument was developed consisting of a structured interview schedule requesting information on the patient's leisure activities, a self-administered questionnaire entitled Survey of Opinions…
NASA Astrophysics Data System (ADS)
Kountouris, Panagiotis; Gerbig, Christoph; Rödenbeck, Christian; Karstens, Ute; Koch, Thomas Frank; Heimann, Martin
2018-03-01
Atmospheric inversions are widely used in the optimization of surface carbon fluxes on a regional scale using information from atmospheric CO2 dry mole fractions. In many studies the prior flux uncertainty applied to the inversion schemes does not directly reflect the true flux uncertainties but is used to regularize the inverse problem. Here, we implement an inversion scheme using the Jena inversion system, applying a prior flux error structure derived from a model-data residual analysis at high spatial and temporal resolution over a full year period in the European domain. We analyzed the performance of the inversion system with a synthetic experiment, in which the flux constraint is derived following the same residual analysis but applied to the model-model mismatch. The synthetic study showed quite good agreement between posterior and true fluxes on European, country, annual and monthly scales. Posterior monthly, country-aggregated fluxes improved their correlation coefficient with the known truth by 7% compared to the prior estimates, with a mean correlation of 0.92. The ratio of the SD between the posterior and the reference and between the prior and the reference was also reduced by 33%, with a mean value of 1.15. We identified the temporal and spatial scales on which the inversion system maximizes the derived information; monthly temporal scales at around 200 km spatial resolution seem to maximize the information gain.
Ligand placement based on prior structures: the guided ligand-replacement method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klei, Herbert E.; Bristol-Myers Squibb, Princeton, NJ 08543-4000; Moriarty, Nigel W., E-mail: nwmoriarty@lbl.gov
2014-01-01
The process of iterative structure-based drug design involves the X-ray crystal structure determination of upwards of 100 ligands with the same general scaffold (i.e. chemotype) complexed with very similar, if not identical, protein targets. In conjunction with insights from computational models and assays, this collection of crystal structures is analyzed to improve potency, to achieve better selectivity and to reduce liabilities such as absorption, distribution, metabolism, excretion and toxicology. Current methods for modeling ligands into electron-density maps typically do not utilize information on how similar ligands bound in related structures. Even if the electron density is of sufficient quality and resolution to allow de novo placement, the process can take considerable time as the size, complexity and torsional degrees of freedom of the ligands increase. A new module, Guided Ligand Replacement (GLR), was developed in Phenix to increase the ease and success rate of ligand placement when prior protein–ligand complexes are available. At the heart of GLR is an algorithm based on graph theory that associates atoms in the target ligand with analogous atoms in the reference ligand. Based on this correspondence, a set of coordinates is generated for the target ligand. GLR is especially useful in two situations: (i) modeling a series of large, flexible, complicated or macrocyclic ligands in successive structures and (ii) modeling ligands as part of a refinement pipeline that can automatically select a reference structure. Even in those cases for which no reference structure is available, if there are multiple copies of the bound ligand per asymmetric unit GLR offers an efficient way to complete the model after the first ligand has been placed. In all of these applications, GLR leverages prior knowledge from earlier structures to facilitate ligand placement in the current structure.
Investigating the impact of spatial priors on the performance of model-based IVUS elastography
Richards, M S; Doyley, M M
2012-01-01
This paper describes methods that provide prerequisite information for computing circumferential stress in modulus elastograms recovered from vascular tissue—information that could help cardiologists detect life-threatening plaques and predict their propensity to rupture. The modulus recovery process is an ill-posed problem; therefore, additional information is needed to provide useful elastograms. In this work, prior geometrical information was used to impose hard or soft constraints on the reconstruction process. We conducted simulation and phantom studies to evaluate and compare modulus elastograms computed with soft and hard constraints versus those computed without any prior information. The results revealed that (1) the contrast-to-noise ratio of modulus elastograms achieved using the soft prior and hard prior reconstruction methods exceeded that of elastograms computed without any prior information; (2) the soft prior and hard prior reconstruction methods could tolerate up to 8% measurement noise; and (3) the performance of soft and hard prior modulus elastograms degraded when incomplete spatial priors were employed. This work demonstrates that including spatial priors in the reconstruction process should improve the performance of model-based elastography, and the soft prior approach should enhance the robustness of the reconstruction process to errors in the geometrical information. PMID:22037648
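A minimal sketch of the soft-prior idea, assuming the geometrical prior arrives as a region-label map and that deviations from each region's mean modulus are penalized quadratically (the authors' exact penalty form is not specified here):

```python
import numpy as np

def soft_prior_penalty(mu, region_labels, alpha):
    """Soft spatial constraint: discourage, but do not forbid, modulus
    variation inside each prior-defined region; a hard prior would
    instead assign a single modulus value per region."""
    penalty = 0.0
    for r in np.unique(region_labels):
        vals = mu[region_labels == r]
        penalty += np.sum((vals - vals.mean()) ** 2)
    return alpha * penalty  # added to the data-misfit term
```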
Minimally Informative Prior Distributions for PSA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dana L. Kelly; Robert W. Youngblood; Kurt G. Vedros
2010-06-01
A salient feature of Bayesian inference is its ability to incorporate information from a variety of sources into the inference model, via the prior distribution (hereafter simply “the prior”). However, over-reliance on old information can lead to priors that dominate new data. Some analysts seek to avoid this by trying to work with a minimally informative prior distribution. Another reason for choosing a minimally informative prior is to avoid the often-voiced criticism of subjectivity in the choice of prior. Minimally informative priors fall into two broad classes: 1) so-called noninformative priors, which attempt to be completely objective, in that the posterior distribution is determined as completely as possible by the observed data, the most well known example in this class being the Jeffreys prior, and 2) priors that are diffuse over the region where the likelihood function is nonnegligible, but that incorporate some information about the parameters being estimated, such as a mean value. In this paper, we compare four approaches in the second class, with respect to their practical implications for Bayesian inference in Probabilistic Safety Assessment (PSA). The most commonly used such prior, the so-called constrained noninformative prior, is a special case of the maximum entropy prior. This is formulated as a conjugate distribution for the most commonly encountered aleatory models in PSA, and is correspondingly mathematically convenient; however, it has a relatively light tail and this can cause the posterior mean to be overly influenced by the prior in updates with sparse data. A more informative prior that is capable, in principle, of dealing more effectively with sparse data is a mixture of conjugate priors. A particular diffuse nonconjugate prior, the logistic-normal, is shown to behave similarly for some purposes. Finally, we review the so-called robust prior. Rather than relying on the mathematical abstraction of entropy, as does the constrained noninformative prior, the robust prior places a heavy-tailed Cauchy prior on the canonical parameter of the aleatory model.
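As a hedged illustration of the mixture-of-conjugate-priors option for a binomial aleatory model common in PSA (the two-component setup and hyperparameters are assumptions for the sketch):

```python
import numpy as np
from scipy.stats import betabinom

def update_beta_mixture(weights, alphas, betas, x, n):
    """Posterior for a mixture of conjugate beta priors after observing
    x failures in n demands: each component updates conjugately, and
    the mixture weights are re-weighted by each component's marginal
    (beta-binomial) likelihood, which is what lets a diffuse component
    take over when the data conflict with the sharp one."""
    marg = np.array([betabinom.pmf(x, n, a, b)
                     for a, b in zip(alphas, betas)])
    post_w = np.asarray(weights) * marg
    post_w /= post_w.sum()
    return post_w, np.asarray(alphas) + x, np.asarray(betas) + n - x
```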
Filtering genetic variants and placing informative priors based on putative biological function.
Friedrichs, Stefanie; Malzahn, Dörthe; Pugh, Elizabeth W; Almeida, Marcio; Liu, Xiao Qing; Bailey, Julia N
2016-02-03
High-density genetic marker data, especially sequence data, imply an immense multiple testing burden. This can be ameliorated by filtering genetic variants, exploiting or accounting for correlations between variants, jointly testing variants, and by incorporating informative priors. Priors can be based on biological knowledge or predicted variant function, or even be used to integrate gene expression or other omics data. Based on Genetic Analysis Workshop (GAW) 19 data, this article discusses the diversity and usefulness of functional variant scores provided, for example, by PolyPhen2, SIFT, or RegulomeDB annotations. Incorporating functional scores into variant filters or weights and adjusting the significance level for correlations between variants yielded significant associations with blood pressure traits in a large family study of Mexican Americans (GAW19 data set). Marker rs218966 in gene PHF14 and rs9836027 in MAP4 significantly associated with hypertension; additionally, rare variants in SNUPN significantly associated with systolic blood pressure. Variant weights strongly influenced the power of kernel methods and burden tests. Apart from variant weights in test statistics, prior weights may also be used when combining test statistics or to informatively weight p values while controlling the false discovery rate (FDR). Indeed, power improved when gene expression data were used for FDR-controlled informative weighting of gene-level association test p values. Finally, approaches exploiting variant correlations included identity-by-descent mapping and the optimal strategy for jointly testing rare and common variants, which was observed to depend on linkage disequilibrium structure.
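One standard way to informatively weight p values while controlling FDR is the weighted Benjamini-Hochberg procedure; a sketch, assuming mean-one prior weights (not necessarily the exact GAW19 workflow):

```python
import numpy as np

def weighted_bh(pvals, weights, alpha=0.05):
    """Weighted Benjamini-Hochberg: divide each p value by its prior
    weight (normalized to mean one), then apply the usual step-up rule
    to the weighted p values."""
    w = np.asarray(weights, dtype=float)
    w /= w.mean()
    q = np.asarray(pvals) / w
    order = np.argsort(q)
    m = len(q)
    passed = q[order] <= alpha * np.arange(1, m + 1) / m
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    rejected = np.zeros(m, dtype=bool)
    rejected[order[:k]] = True
    return rejected
```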
Sequential Dictionary Learning From Correlated Data: Application to fMRI Data Analysis.
Seghouane, Abd-Krim; Iqbal, Asif
2017-03-22
Sequential dictionary learning via the K-SVD algorithm has emerged as a successful alternative to conventional data-driven methods such as independent component analysis (ICA) for functional magnetic resonance imaging (fMRI) data analysis. fMRI datasets are, however, structured data matrices with notions of spatio-temporal correlation and temporal smoothness. This prior information has not been included in the K-SVD algorithm when applied to fMRI data analysis. In this paper, we propose three variants of the K-SVD algorithm dedicated to fMRI data analysis that account for this prior information. The proposed algorithms differ from the K-SVD in their sparse coding and dictionary update stages. The first two algorithms account for the known correlation structure in the fMRI data by using the squared Q,R-norm instead of the Frobenius norm for matrix approximation. The third and last algorithm accounts for both the known correlation structure in the fMRI data and the temporal smoothness. The temporal smoothness is incorporated in the dictionary update stage via penalized regularization of the dictionary atoms. The performance of the proposed dictionary learning algorithms is illustrated through simulations and applications on real fMRI data.
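For orientation, a minimal Frobenius-norm K-SVD baseline of the kind the proposed variants modify (the Q,R-norm and smoothness-penalized updates would replace the plain SVD step below; dimensions, the OMP coder and iteration counts are assumptions):

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

def ksvd(Y, n_atoms, sparsity, n_iter=20, seed=0):
    """Plain K-SVD: alternate OMP sparse coding with rank-1 SVD
    updates of each dictionary atom and its coefficients."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((Y.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)
    X = None
    for _ in range(n_iter):
        X = orthogonal_mp(D, Y, n_nonzero_coefs=sparsity)
        for k in range(n_atoms):
            users = np.nonzero(X[k])[0]      # signals using atom k
            if users.size == 0:
                continue
            # residual with atom k removed, restricted to its users
            E = (Y[:, users] - D @ X[:, users]
                 + np.outer(D[:, k], X[k, users]))
            U, s, Vt = np.linalg.svd(E, full_matrices=False)
            D[:, k] = U[:, 0]
            X[k, users] = s[0] * Vt[0]
    return D, X
```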
Bayesian hierarchical models for regional climate reconstructions of the last glacial maximum
NASA Astrophysics Data System (ADS)
Weitzel, Nils; Hense, Andreas; Ohlwein, Christian
2017-04-01
Spatio-temporal reconstructions of past climate are important for the understanding of the long term behavior of the climate system and the sensitivity to forcing changes. Unfortunately, they are subject to large uncertainties, have to deal with a complex proxy-climate structure, and a physically reasonable interpolation between the sparse proxy observations is difficult. Bayesian Hierarchical Models (BHMs) are a class of statistical models that is well suited for spatio-temporal reconstructions of past climate because they permit the inclusion of multiple sources of information (e.g. records from different proxy types, uncertain age information, output from climate simulations) and quantify uncertainties in a statistically rigorous way. BHMs in paleoclimatology typically consist of three stages which are modeled individually and are combined using Bayesian inference techniques. The data stage models the proxy-climate relation (often named transfer function), the process stage models the spatio-temporal distribution of the climate variables of interest, and the prior stage consists of prior distributions of the model parameters. For our BHMs, we translate well-known proxy-climate transfer functions for pollen to a Bayesian framework. In addition, we can include Gaussian distributed local climate information from preprocessed proxy records. The process stage combines physically reasonable spatial structures from prior distributions with proxy records which leads to a multivariate posterior probability distribution for the reconstructed climate variables. The prior distributions that constrain the possible spatial structure of the climate variables are calculated from climate simulation output. We present results from pseudoproxy tests as well as new regional reconstructions of temperatures for the last glacial maximum (LGM, ~21,000 years BP). These reconstructions combine proxy data syntheses with information from climate simulations for the LGM that were performed in the PMIP3 project. The proxy data syntheses consist either of raw pollen data or of normally distributed climate data from preprocessed proxy records. Future extensions of our method include the inclusion of other proxy types (transfer functions), the implementation of other spatial interpolation techniques, the use of age uncertainties, and the extension to spatio-temporal reconstructions of the last deglaciation. Our work is part of the PalMod project funded by the German Federal Ministry of Education and Research (BMBF).
Sensitivity analyses for sparse-data problems using weakly informative Bayesian priors.
Hamra, Ghassan B; MacLehose, Richard F; Cole, Stephen R
2013-03-01
Sparse-data problems are common, and approaches are needed to evaluate the sensitivity of parameter estimates based on sparse data. We propose a Bayesian approach that uses weakly informative priors to quantify sensitivity of parameters to sparse data. The weakly informative prior is based on accumulated evidence regarding the expected magnitude of relationships using relative measures of disease association. We illustrate the use of weakly informative priors with an example of the association of lifetime alcohol consumption and head and neck cancer. When data are sparse and the observed information is weak, a weakly informative prior will shrink parameter estimates toward the prior mean. Additionally, the example shows that when data are not sparse and the observed information is not weak, a weakly informative prior is not influential. Advancements in implementation of Markov Chain Monte Carlo simulation make this sensitivity analysis easily accessible to the practicing epidemiologist.
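The shrinkage behavior described can be reproduced with a toy normal-normal approximation for a log odds ratio; the prior standard deviation below is an arbitrary illustrative choice, not a value from the paper:

```python
import numpy as np

def shrink_log_or(beta_hat, se, prior_mean=0.0, prior_sd=1.5):
    """Precision-weighted (conjugate) update: with sparse data the
    standard error is large, so the posterior is pulled toward the
    weakly informative prior mean; with rich data it barely moves."""
    w_data, w_prior = 1.0 / se**2, 1.0 / prior_sd**2
    post_mean = (w_data * beta_hat + w_prior * prior_mean) / (w_data + w_prior)
    post_sd = np.sqrt(1.0 / (w_data + w_prior))
    return post_mean, post_sd

print(shrink_log_or(2.0, se=1.5))   # sparse data: strong shrinkage
print(shrink_log_or(2.0, se=0.1))   # rich data: essentially unchanged
```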
Adaptive structured dictionary learning for image fusion based on group-sparse-representation
NASA Astrophysics Data System (ADS)
Yang, Jiajie; Sun, Bin; Luo, Chengwei; Wu, Yuzhong; Xu, Limei
2018-04-01
Dictionary learning is the key process of sparse representation, one of the most widely used image representation theories in image fusion. Existing dictionary learning methods do not make full use of group structure information or of the sparse coefficients. In this paper, we propose a new adaptive structured dictionary learning algorithm and an l1-norm maximum fusion rule that innovatively utilizes grouped sparse coefficients to merge the images. In the dictionary learning algorithm, we do not need prior knowledge about any group structure of the dictionary. By using the characteristics of the dictionary in expressing the signal, our algorithm can automatically find the desired potential structure information hidden in the dictionary. The fusion rule exploits the physical meaning of the group-structured dictionary and makes activity-level judgements on the structure information when the images are being merged. Therefore, the fused image can retain more significant information. Comparisons have been made with several state-of-the-art dictionary learning methods and fusion rules. The experimental results demonstrate that the dictionary learning algorithm and the fusion rule both outperform others in terms of several objective evaluation metrics.
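A hypothetical sketch of what an l1-norm maximum fusion rule over grouped sparse coefficients can look like (group assignments and names are assumptions, not the authors' exact rule):

```python
import numpy as np

def l1_max_fuse(coefs_a, coefs_b, groups):
    """For each coefficient group, keep the source image whose group
    has the larger l1 activity level, so structurally coherent blocks
    are transferred as a whole rather than coefficient by coefficient."""
    fused = coefs_a.copy()
    for g in np.unique(groups):
        idx = groups == g
        if np.abs(coefs_b[idx]).sum() > np.abs(coefs_a[idx]).sum():
            fused[idx] = coefs_b[idx]
    return fused
```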
Schürmann, Tim; Beckerle, Philipp; Preller, Julia; Vogt, Joachim; Christ, Oliver
2016-12-19
In product development for lower limb prosthetic devices, a set of special criteria needs to be met. Prosthetic devices have a direct impact on the rehabilitation process after an amputation, with both perceived technological and psychological aspects playing an important role. However, available psychometric questionnaires fail to consider the important links between these two dimensions. In this article, a probabilistic latent trait model is proposed with seven technical and psychological factors that measure satisfaction with the prosthesis. The results of a first study are used to determine the basic parameters of the statistical model. These distributions represent hypotheses about factor loadings between manifest items and latent factors of the proposed psychometric questionnaire. A study was conducted and analyzed to form hypotheses for the prior distributions of the questionnaire's measurement model. An expert agreement study with 22 experts was used to determine the prior distribution of item-factor loadings in the model. The model parameters that had to be specified as part of the measurement model were informed prior distributions on the item-factor loadings. For the current 70 items in the questionnaire, each factor loading was set to represent the certainty with which experts had assigned the items to their respective factors. Considering only the measurement model and not the structural model of the questionnaire, 70 out of 217 informed prior distributions on parameters were set. The use of preliminary studies to set prior distributions in latent trait models, while a relatively new approach in psychological research, provides helpful information towards the design of a seven-factor questionnaire that aims to identify relations between technical and psychological factors in prosthetic product design and rehabilitation medicine.
Parallelized Bayesian inversion for three-dimensional dental X-ray imaging.
Kolehmainen, Ville; Vanne, Antti; Siltanen, Samuli; Järvenpää, Seppo; Kaipio, Jari P; Lassas, Matti; Kalke, Martti
2006-02-01
Diagnostic and operational tasks based on dental radiology often require three-dimensional (3-D) information that is not available in a single X-ray projection image. Comprehensive 3-D information about tissues can be obtained by computerized tomography (CT) imaging. However, in dental imaging a conventional CT scan may not be available or practical because of high radiation dose, low resolution, or the cost of the CT scanner equipment. In this paper, we consider a novel type of 3-D imaging modality for dental radiology. We consider situations in which projection images of the teeth are taken from a few sparsely distributed projection directions using the dentist's regular (digital) X-ray equipment and the 3-D X-ray attenuation function is reconstructed. A complication in these experiments is that the reconstruction of the 3-D structure based on a few projection images becomes an ill-posed inverse problem. Bayesian inversion is a well-suited framework for reconstruction from such incomplete data. In Bayesian inversion, the ill-posed reconstruction problem is formulated in a well-posed probabilistic form in which a priori information is used to compensate for the incomplete information of the projection data. In this paper, we propose a Bayesian method for 3-D reconstruction in dental radiology. The method is partially based on Kolehmainen et al. (2003). The prior model for dental structures consists of a weighted l1- and total variation (TV) prior together with a positivity prior. The inverse problem is stated as finding the maximum a posteriori (MAP) estimate. To make the 3-D reconstruction computationally feasible, a parallelized version of an optimization algorithm is implemented for a Beowulf cluster computer. The method is tested with projection data from dental specimens and patient data. Tomosynthetic reconstructions are given as reference for the proposed method.
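A compact sketch of the type of MAP objective described, combining data misfit with weighted l1 and TV priors (the positivity prior would be enforced by projecting onto x >= 0 inside the optimizer; weights and discretization are assumptions):

```python
import numpy as np

def map_objective(x, A, y, lam_l1, lam_tv, eps=1e-8):
    """Negative log-posterior sketch for few-view reconstruction:
    quadratic data misfit + weighted l1 prior + isotropic TV prior."""
    gx = np.diff(x, axis=0, append=x[-1:, :])
    gy = np.diff(x, axis=1, append=x[:, -1:])
    tv = np.sum(np.sqrt(gx**2 + gy**2 + eps))
    misfit = np.sum((A @ x.ravel() - y) ** 2)
    return misfit + lam_l1 * np.sum(np.abs(x)) + lam_tv * tv
```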
On vital aid: the why, what and how of validation
Kleywegt, Gerard J.
2009-01-01
Limitations to the data and subjectivity in the structure-determination process may cause errors in macromolecular crystal structures. Appropriate validation techniques may be used to reveal problems in structures, ideally before they are analysed, published or deposited. Additionally, such techniques may be used a posteriori to assess the (relative) merits of a model by potential users. Weak validation methods and statistics assess how well a model reproduces the information that was used in its construction (i.e. experimental data and prior knowledge). Strong methods and statistics, on the other hand, test how well a model predicts data or information that were not used in the structure-determination process. These may be data that were excluded from the process on purpose, general knowledge about macromolecular structure, information about the biological role and biochemical activity of the molecule under study or its mutants or complexes and predictions that are based on the model and that can be tested experimentally. PMID:19171968
Building Enterprise Transition Plans Through the Development of Collapsing Design Structure Matrices
2015-09-17
processes from the earliest input to the final output to evaluate where change is needed to reduce costs, reduce waste, and improve the flow of information...from) integrating a large complex enterprise? • How should firms/enterprises evaluate systems prior to integration? What are some valid taxonomies
Strategies for Teaching Children with Autism in Physical Education
ERIC Educational Resources Information Center
Groft-Jones, Melissa; Block, Martin E.
2006-01-01
The purpose of this article is to summarize information presented in the prior articles into practical strategies physical educators can use when teaching children with autism. The authors divided the article into three areas: (1) structuring the environment; (2) accommodating communication challenges; and (3) preventing challenging behaviors.…
Distributed Data Processing in a United States Naval Shipyard.
1979-12-01
…motivations for, and the characteristics of, distributed processing as they apply to management information systems. Prior to the advent of
Yu, Rongjie; Abdel-Aty, Mohamed
2013-07-01
The Bayesian inference method has been frequently adopted to develop safety performance functions. One advantage of Bayesian inference is that prior information for the independent variables can be included in the inference procedures. However, few studies have discussed how to formulate informative priors for the independent variables or evaluated the effects of incorporating informative priors in developing safety performance functions. This paper addresses this deficiency by introducing four approaches to developing informative priors for the independent variables based on historical data and expert experience. Merits of these informative priors have been tested along with two types of Bayesian hierarchical models (Poisson-gamma and Poisson-lognormal models). Deviance information criterion (DIC), R-square values, and coefficients of variance for the estimations were utilized as evaluation measures to select the best model(s). Comparison across the models indicated that the Poisson-gamma model is superior, with a better model fit, and that it is much more robust with the informative priors. Moreover, the two-stage Bayesian updating informative priors provided the best goodness-of-fit and coefficient estimation accuracies. Furthermore, informative priors for the inverse dispersion parameter have also been introduced and tested. The effects of different types of informative priors on model estimation and goodness-of-fit have been compared and summarized. Finally, based on the results, recommendations for future research topics and study applications have been made. Copyright © 2013 Elsevier Ltd. All rights reserved.
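The conjugacy that makes the Poisson-gamma model convenient is easy to state; a minimal sketch, assuming site-level crash counts and a total exposure, with hyperparameters supplied by historical data or expert experience as the paper describes:

```python
def poisson_gamma_update(alpha0, beta0, counts, exposure):
    """Conjugate update for a Poisson rate with an informative
    gamma(alpha0, beta0) prior: the posterior is
    gamma(alpha0 + sum of counts, beta0 + total exposure), so the
    prior acts like alpha0 pseudo-events over beta0 pseudo-exposure."""
    return alpha0 + sum(counts), beta0 + exposure

# a two-stage scheme might first fit (alpha0, beta0) to historical
# data, then update with the current sample as above
a_post, b_post = poisson_gamma_update(2.0, 1.0, [3, 1, 4], 5.0)
print(a_post / b_post)  # posterior mean of the rate
```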
Graziele Rodrigues, Livia; De Souza, João Batista; De Torres, Erica Miranda; Ferreira Silva, Rhonan
2017-01-01
Background. The present study aimed to screen the knowledge and attitudes of dentists toward the use of informed consent forms prior to procedures involving operative dentistry. Methods. A research tool containing questions (questionnaire) regarding the use of informed consent forms was developed. The questionnaire consisted of seven questions structured to screen current practice in operative dentistry towards the use of informed consent forms. Results. The questionnaires were distributed among 731 dentists, of whom 179 returned them with answers. Sixty-seven dentists reported not using informed consent forms. The main reasons for not using informed consent forms were: having a complete dental record signed by the patient (67.2%) and having a good relationship with patients (43.6%). The dentists who reported using informed consent forms revealed that they obtained them from other dentists and made their own modifications (35.9%). Few dentists revealed contacting lawyers (1.7%) and experts in legal dentistry (0.9%) for the development of their informed consent forms. Conclusion. A high number of dentists working in the field of operative dentistry do not behave according to the ethical standards of clinical practice, becoming unprotected against ethical and legal actions. PMID:28413600
Searching Choices: Quantifying Decision-Making Processes Using Search Engine Data.
Moat, Helen Susannah; Olivola, Christopher Y; Chater, Nick; Preis, Tobias
2016-07-01
When making a decision, humans consider two types of information: information they have acquired through their prior experience of the world, and further information they gather to support the decision in question. Here, we present evidence that data from search engines such as Google can help us model both sources of information. We show that statistics from search engines on the frequency of content on the Internet can help us estimate the statistical structure of prior experience; specifically, we outline how such statistics can inform psychological theories concerning the valuation of human lives, or choices involving delayed outcomes. Turning to information gathering, we show that search query data might help measure human information gathering, and may predict subsequent decisions. Such data enable us to compare information gathered across nations, where analyses suggest, for example, a greater focus on the future in countries with a higher per capita GDP. We conclude that search engine data constitute a valuable new resource for cognitive scientists, offering a fascinating new tool for understanding the human decision-making process. Copyright © 2016 The Authors. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
Addressing potential prior-data conflict when using informative priors in proof-of-concept studies.
Mutsvari, Timothy; Tytgat, Dominique; Walley, Rosalind
2016-01-01
Bayesian methods are increasingly used in proof-of-concept studies. An important benefit of these methods is the potential to use informative priors, thereby reducing sample size. This is particularly relevant for treatment arms where there is a substantial amount of historical information, such as placebo and active comparators. One issue with using an informative prior is the possibility of a mismatch between the informative prior and the observed data, referred to as prior-data conflict. We focus on two methods for dealing with this: a testing approach and a mixture prior approach. The testing approach assesses prior-data conflict by comparing the observed data to the prior predictive distribution and resorting to a non-informative prior if prior-data conflict is declared. The mixture prior approach uses a prior with a precise and a diffuse component. We assess these approaches for the normal case via simulation and show they have some attractive features as compared with the standard one-component informative prior. For example, when the discrepancy between the prior and the data is sufficiently marked, and intuitively one feels less certain about the results, both the testing and mixture approaches typically yield wider posterior-credible intervals than when there is no discrepancy. In contrast, when there is no discrepancy, the results of these approaches are typically similar to the standard approach. Whilst, for any specific study, the operating characteristics of any selected approach should be assessed and agreed at the design stage, we believe these two approaches are each worthy of consideration. Copyright © 2015 John Wiley & Sons, Ltd.
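A small sketch of the mixture-prior mechanics for a normal mean, assuming a known sampling standard deviation and illustrative component choices (not the authors' formulation):

```python
import numpy as np
from scipy.stats import norm

def informative_weight(ybar, n, sigma, m0, s0, w_inf=0.8, s_dif=10.0):
    """Posterior weight on the informative component of a two-part
    prior: informative N(m0, s0^2) vs diffuse N(m0, s_dif^2). A marked
    prior-data conflict makes the informative component's prior
    predictive density small, driving its weight toward zero."""
    se = sigma / np.sqrt(n)
    m_inf = norm.pdf(ybar, m0, np.hypot(se, s0))    # prior predictive
    m_dif = norm.pdf(ybar, m0, np.hypot(se, s_dif))
    w = w_inf * m_inf
    return w / (w + (1.0 - w_inf) * m_dif)

print(informative_weight(0.2, 20, 1.0, 0.0, 0.3))  # consistent data
print(informative_weight(3.0, 20, 1.0, 0.0, 0.3))  # conflicting data
```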
NASA Astrophysics Data System (ADS)
Wu, Ying-Tien
2013-10-01
This study aims to provide insights into the role of learners' knowledge structures about a socio-scientific issue (SSI) in their informal reasoning on the issue. A total of 42 non-science major university students' knowledge structures and informal reasoning were assessed with multidimensional analyses. With both qualitative and quantitative analyses, this study revealed that students with more extended and better-organized knowledge structures, as well as those who more frequently used higher-order information processing modes, were more likely to achieve higher-quality informal reasoning. The regression analyses further showed that the "richness" of the students' knowledge structures explained 25 % of the variation in their rebuttal construction, an important indicator of reasoning quality, indicating the significant role of a sophisticated knowledge structure in SSI reasoning. This study also provides some initial evidence for the significant role of the "core" concept within one's knowledge structure in one's SSI reasoning. The findings suggest that, in SSI-based instruction, science instructors should try to identify students' core concepts within their prior knowledge regarding the SSI, and then guide students to construct and structure relevant concepts or ideas regarding the SSI based on those core concepts. Students could thus obtain extended and well-organized knowledge structures, which would help them achieve better learning transfer in dealing with SSIs.
Adjusting for founder relatedness in a linkage analysis using prior information.
Sheehan, N A; Egeland, T
2008-01-01
In genetic linkage studies, while the pedigrees are generally known, background relatedness between the founding individuals, assumed by definition to be unrelated, can seriously affect the results of the analysis. Likelihood approaches to relationship estimation from genetic marker data can all be expressed in terms of finding the most likely pedigree connecting the individuals of interest. When the true relationship is the main focus, the set of all possible alternative pedigrees can be too large to consider. However, prior information is often available which, when incorporated in a formal and structured way, can restrict this set to a manageable size thus enabling the calculation of a posterior distribution from which inferences can be drawn. Here, the unknown relationships are more of a nuisance factor than of interest in their own right, so the focus is on adjusting the results of the analysis rather than on direct estimation. In this paper, we show how prior information on founder relationships can be exploited in some applications to generate a set of candidate extended pedigrees. We then weight the relevant pedigree-specific likelihoods by their posterior probabilities to adjust the lod score statistics. (c) 2007 S. Karger AG, Basel
NASA Astrophysics Data System (ADS)
Chen, Yingxuan; Yin, Fang-Fang; Zhang, Yawei; Zhang, You; Ren, Lei
2018-04-01
Purpose: Compressed sensing reconstruction using total variation (TV) tends to over-smooth edge information by uniformly penalizing the image gradient. The goal of this study is to develop a novel prior contour based TV (PCTV) method to enhance the edge information in compressed sensing reconstruction for CBCT. Methods: The edge information is extracted from the prior planning-CT via edge detection. The prior CT is first registered with the on-board CBCT reconstructed with the TV method through rigid or deformable registration. The edge contours in the prior CT are then mapped to CBCT and used as the weight map for TV regularization to enhance edge information in CBCT reconstruction. The PCTV method was evaluated using the extended-cardiac-torso (XCAT) phantom, a physical CatPhan phantom and brain patient data. Results were compared with both the TV and edge-preserving TV (EPTV) methods, which are commonly used for limited-projection CBCT reconstruction. Relative error was used to quantify pixel value differences, and edge cross correlation was defined as the similarity of edge information between reconstructed images and ground truth in the quantitative evaluation. Results: Compared to TV and EPTV, PCTV enhanced the edge information of bone, lung vessels and tumor in the XCAT reconstruction and of complex bony structures in the brain patient CBCT. In the XCAT study using 45 half-fan CBCT projections, compared with ground truth, relative errors were 1.5%, 0.7% and 0.3% and edge cross correlations were 0.66, 0.72 and 0.78 for TV, EPTV and PCTV, respectively. PCTV is more robust to reductions in projection number. Edge enhancement was reduced slightly with noisy projections, but PCTV was still superior to the other methods. PCTV can maintain resolution while reducing noise in low-mAs CatPhan reconstructions. Low-contrast edges were preserved better with PCTV than with TV and EPTV. Conclusion: PCTV preserved edge information as well as reduced streak artifacts and noise in low-dose CBCT reconstruction. PCTV is superior to the TV and EPTV methods in edge enhancement, which can potentially improve localization accuracy in radiation therapy.
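A minimal sketch of the core PCTV ingredient, a TV penalty whose weight map comes from the prior-CT edge contours so that known edges are penalized less (discretization and the exact weighting form are assumptions):

```python
import numpy as np

def prior_contour_tv(img, edge_map, eps=1e-8):
    """Edge-weighted TV: edge_map is ~1 on contours mapped from the
    prior CT and ~0 elsewhere, so gradients on known edges contribute
    little to the penalty and stay sharp in the reconstruction."""
    gx = np.diff(img, axis=0, append=img[-1:, :])
    gy = np.diff(img, axis=1, append=img[:, -1:])
    w = 1.0 - edge_map
    return np.sum(w * np.sqrt(gx**2 + gy**2 + eps))
```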
Sie, Aisha S; Spruijt, Liesbeth; van Zelst-Stams, Wendy A G; Mensenkamp, Arjen R; Ligtenberg, Marjolijn J; Brunner, Han G; Prins, Judith B; Hoogerbrugge, Nicoline
2012-05-08
Current practice for patients with breast cancer referred for genetic counseling includes face-to-face consultations with a genetic counselor prior to and following DNA-testing. This is based on guidelines regarding Huntington's disease, in anticipation of the high psychosocial impact of DNA-testing for mutations in BRCA1/2 genes. The initial consultation covers generic information regarding hereditary breast cancer and the (im)possibilities of DNA-testing, prior to such testing. Patients with breast cancer may see this information as irrelevant or unnecessary because individual genetic advice depends on DNA-test results. Also, verbal information is not always remembered well by patients. A different format for this information prior to DNA-testing is possible: replacing initial face-to-face genetic counseling (DNA-intake procedure) by telephone, written and digital information sent to patients' homes (DNA-direct procedure). In this intervention study, 150 patients with breast cancer referred to the department of Clinical Genetics of the Radboud University Nijmegen Medical Centre are given the choice between two procedures, DNA-direct (intervention group) or DNA-intake (usual care, control group). During a triage telephone call, patients are excluded if they have problems with Dutch text, with family communication, or of a psychological or psychiatric nature. Primary outcome measures are satisfaction and psychological distress. Secondary outcome measures are determinants of the participant's choice of procedure, waiting and processing times, and family characteristics. Data are collected by self-report questionnaires at baseline and following completion of genetic counseling. A minority of participants will receive an invitation for a 30 min semi-structured telephone interview, e.g. confirmed carriers of a BRCA1/2 mutation and those who report problems with the procedure. This study compares current practice of an intake consultation (DNA-intake) to a home informational package of telephone, written and digital information (DNA-direct) prior to DNA-testing in patients with breast cancer. The aim is to determine whether DNA-direct is an acceptable procedure for BRCA1/2 testing, in order to provide customized care to patients with breast cancer, cutting down on the period of uncertainty during this diagnostic process.
Preparing learners with partly incorrect intuitive prior knowledge for learning
Ohst, Andrea; Fondu, Béatrice M. E.; Glogger, Inga; Nückles, Matthias; Renkl, Alexander
2014-01-01
Learners sometimes have incoherent and fragmented intuitive prior knowledge that is (partly) “incompatible” with the to-be-learned contents. Such knowledge in pieces can cause conceptual disorientation and cognitive overload while learning. We hypothesized that a pre-training intervention providing a generalized schema as a structuring framework for such knowledge in pieces would support (re)organizing-processes of prior knowledge and thus reduce unnecessary cognitive load during subsequent learning. Fifty-six student teachers participated in the experiment. A framework group underwent a pre-training intervention providing a generalized, categorical schema for categorizing primary learning strategies and related but different strategies as a cognitive framework for (re-)organizing their prior knowledge. Our control group received comparable factual information but no framework. Afterwards, all participants learned about primary learning strategies. The framework group claimed to possess higher levels of interest and self-efficacy, achieved higher learning outcomes, and learned more efficiently. Hence, providing a categorical framework can help overcome the barrier of incorrect prior knowledge in pieces. PMID:25071638
NASA Astrophysics Data System (ADS)
Villéger, Alice; Ouchchane, Lemlih; Lemaire, Jean-Jacques; Boire, Jean-Yves
2007-03-01
Symptoms of neurodegenerative pathologies such as Parkinson's disease can be relieved through Deep Brain Stimulation. This neurosurgical technique relies on high-precision positioning of electrodes in specific areas of the basal ganglia and the thalamus. These subcortical anatomical targets must be located at the pre-operative stage from a set of MRI acquired under stereotactic conditions. In order to assist surgical planning, we designed a semi-automated image analysis process for extracting anatomical areas of interest. Complementary information, provided by both the patient's data and expert knowledge, is represented as fuzzy membership maps, which are then fused by means of suitable possibilistic operators in order to achieve the segmentation of targets. More specifically, theoretical prior knowledge on brain anatomy is modelled within a 'virtual atlas' organised as a spatial graph: a list of vertices linked by edges, where each vertex represents an anatomical structure of interest and contains relevant information such as tissue composition, whereas each edge represents a spatial relationship between two structures, such as their relative directions. The model is built using heterogeneous sources of information such as qualitative descriptions from the expert or quantitative information from prelabelled images. For each patient, tissue membership maps are extracted from MR data through a classification step. The prior model and the patient's data are then matched by using a research algorithm (or 'strategy') which simultaneously computes an estimate of the location of every structure. The method was tested on 10 clinical images, with promising results. Location and segmentation results were statistically assessed, opening perspectives for enhancements.
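A toy sketch of the possibilistic fusion step for fuzzy membership maps (the conjunctive min operator is one standard t-norm; the operators actually chosen are application dependent):

```python
import numpy as np

def fuse_memberships(maps, conjunctive=True):
    """Fuse fuzzy membership maps from heterogeneous sources (tissue
    classification, atlas-derived spatial relations, ...): 'min' keeps
    only locations supported by every source, 'max' any source."""
    stack = np.stack(maps)
    return stack.min(axis=0) if conjunctive else stack.max(axis=0)
```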
A Method for Constructing Informative Priors for Bayesian Modeling of Occupational Hygiene Data.
Quick, Harrison; Huynh, Tran; Ramachandran, Gurumurthy
2017-01-01
In many occupational hygiene settings, the demand for more accurate, more precise results is at odds with limited resources. To combat this, practitioners have begun using Bayesian methods to incorporate prior information into their statistical models in order to obtain more refined inference from their data. This is not without risk, however, as incorporating prior information that disagrees with the information contained in data can lead to spurious conclusions, particularly if the prior is too informative. In this article, we propose a method for constructing informative prior distributions for normal and lognormal data that are intuitive to specify and robust to bias. To demonstrate the use of these priors, we walk practitioners through a step-by-step implementation of our priors using an illustrative example. We then conclude with recommendations for general use. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
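As one hypothetical way such an elicitation could look for lognormal exposure data (this scheme is an assumption for illustration, not the authors' construction):

```python
import numpy as np

def lognormal_prior_from_judgment(gm_guess, p95_guess):
    """Turn two elicited judgments, a plausible geometric mean and a
    plausible 95th percentile of exposure, into a normal prior on the
    log scale via log(p95) = mu + 1.645 * sigma."""
    mu = np.log(gm_guess)
    sigma = (np.log(p95_guess) - mu) / 1.645
    return mu, sigma

print(lognormal_prior_from_judgment(0.5, 4.0))
```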
From protein sequence to dynamics and disorder with DynaMine.
Cilia, Elisa; Pancsa, Rita; Tompa, Peter; Lenaerts, Tom; Vranken, Wim F
2013-01-01
Protein function and dynamics are closely related; however, accurate dynamics information is difficult to obtain. Here, based on a carefully assembled data set derived from experimental data for proteins in solution, we quantify backbone dynamics properties on the amino-acid level and develop DynaMine, a fast, high-quality predictor of protein backbone dynamics. DynaMine uses only protein sequence information as input and shows great potential in distinguishing regions of different structural organization, such as folded domains, disordered linkers, molten globules and pre-structured binding motifs of different sizes. It also identifies disordered regions within proteins with an accuracy comparable to the most sophisticated existing predictors, without depending on prior disorder knowledge or three-dimensional structural information. DynaMine provides molecular biologists with an important new method that grasps the dynamical characteristics of any protein of interest, as we show here for human p53 and E1A from human adenovirus 5.
Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography
NASA Astrophysics Data System (ADS)
Jesse, S.; Chi, M.; Belianinov, A.; Beekman, C.; Kalinin, S. V.; Borisevich, A. Y.; Lupini, A. R.
2016-05-01
Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. Here, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. However, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy.
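A minimal sketch of the unsupervised clustering step, assuming a stack of flattened, position-resolved patterns stored in a hypothetical ptycho_stack.npy (file name and cluster count are assumptions):

```python
import numpy as np
from sklearn.cluster import KMeans

# each probe position yields a high-dimensional observation, e.g. a
# flattened diffraction pattern; cluster them with no physical input
data = np.load("ptycho_stack.npy")          # shape (positions, ky, kx)
n, h, w = data.shape
labels = KMeans(n_clusters=2, n_init=10).fit_predict(data.reshape(n, h * w))
# reshaping labels back onto the scan grid reveals domain contrast
```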
NASA Astrophysics Data System (ADS)
Harvey, David J.; Struwe, Weston B.
2018-05-01
There is considerable potential for the use of ion mobility mass spectrometry in structural glycobiology, due in large part to gas-phase separation attributes not typically observed by orthogonal methods. Here, we evaluate the capability of traveling wave ion mobility combined with negative ion collision-induced dissociation to provide structural information on N-linked glycans containing multiple fucose residues forming the Lewis x and Lewis y epitopes. These epitopes are involved in processes such as cell-cell recognition and are important as cancer biomarkers. Specific information that could be obtained from the intact N-glycans by negative ion CID included the general topology of the glycan, such as the presence or absence of a bisecting GlcNAc residue, and the branching pattern of the triantennary glycans. Information on the location of the fucose residues was also readily obtainable from ions specific to each antenna. Some isobaric fragment ions produced prior to ion mobility could subsequently be separated and, in some cases, provided additional valuable structural information that was missing from the CID spectra alone.
Automated segmentation of midbrain structures with high iron content.
Garzón, Benjamín; Sitnikov, Rouslan; Bäckman, Lars; Kalpouzos, Grégoria
2018-04-15
The substantia nigra (SN), the subthalamic nucleus (STN), and the red nucleus (RN) are midbrain structures of ample interest in many neuroimaging studies, which may benefit from the availability of automated segmentation methods. The high iron content of these structures awards them high contrast in quantitative susceptibility mapping (QSM) images. We present a novel segmentation method that leverages the information of these images to produce automated segmentations of the SN, STN, and RN. The algorithm builds a map of spatial priors for the structures by non-linearly registering a set of manually-traced training labels to the midbrain. The priors are used to inform a Gaussian mixture model of the image intensities, with smoothness constraints imposed to ensure anatomical plausibility. The method was validated on manual segmentations from a sample of 40 healthy younger and older subjects. Average Dice scores were 0.81 (0.05) for the SN, 0.66 (0.14) for the STN and 0.88 (0.04) for the RN in the left hemisphere, and similar values were obtained for the right hemisphere. In all structures, volumes of manual and automatically obtained segmentations were significantly correlated. The algorithm showed lower accuracy on R2* and T2-weighted fluid-attenuated inversion recovery (FLAIR) images, which are also sensitive to iron content. To illustrate an application of the method, we show that the automated segmentations were comparable to the manual ones regarding detection of age-related differences in putative iron content. Copyright © 2017 Elsevier Inc. All rights reserved.
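The prior-informed mixture step can be sketched as an E-step in which registered atlas priors multiply the Gaussian intensity likelihoods (smoothness constraints omitted; the array layout is an assumption):

```python
import numpy as np
from scipy.stats import norm

def responsibilities(intensity, atlas_priors, means, sds):
    """Per-voxel class responsibilities for a Gaussian mixture with
    spatial priors: prior (from the registered training labels) times
    the QSM intensity likelihood, normalized over classes."""
    lik = np.stack([norm.pdf(intensity, m, s)
                    for m, s in zip(means, sds)], axis=-1)
    post = atlas_priors * lik          # atlas_priors: (..., n_classes)
    return post / post.sum(axis=-1, keepdims=True)
```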
Ting, Chih-Chung; Yu, Chia-Chen; Maloney, Laurence T.
2015-01-01
In Bayesian decision theory, knowledge about the probabilities of possible outcomes is captured by a prior distribution and a likelihood function. The prior reflects past knowledge and the likelihood summarizes current sensory information. The two combined (integrated) form a posterior distribution that allows estimation of the probability of different possible outcomes. In this study, we investigated the neural mechanisms underlying Bayesian integration using a novel lottery decision task in which both prior knowledge and likelihood information about reward probability were systematically manipulated on a trial-by-trial basis. Consistent with Bayesian integration, as sample size increased, subjects tended to weigh likelihood information more compared with prior information. Using fMRI in humans, we found that the medial prefrontal cortex (mPFC) correlated with the mean of the posterior distribution, a statistic that reflects the integration of prior knowledge and likelihood of reward probability. Subsequent analysis revealed that both prior and likelihood information were represented in mPFC and that the neural representations of prior and likelihood in mPFC reflected changes in the behaviorally estimated weights assigned to these different sources of information in response to changes in the environment. Together, these results establish the role of mPFC in prior-likelihood integration and highlight its involvement in representing and integrating these distinct sources of information. PMID:25632152
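A toy beta-binomial version of the prior-likelihood integration manipulated in the task (the actual lottery design is richer than this sketch):

```python
from scipy.stats import beta

def posterior_reward_probability(a_prior, b_prior, wins, n):
    """Beta prior + binomial likelihood: as the sample size n grows,
    the posterior mean moves from the prior mean toward the observed
    proportion, mirroring the increasing weight on likelihood."""
    a, b = a_prior + wins, b_prior + (n - wins)
    return beta.mean(a, b)

print(posterior_reward_probability(8, 2, wins=1, n=4))    # small sample
print(posterior_reward_probability(8, 2, wins=10, n=40))  # large sample
```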
Marginally specified priors for non-parametric Bayesian estimation
Kessler, David C.; Hoff, Peter D.; Dunson, David B.
2014-01-01
Summary: Prior specification for non-parametric Bayesian inference involves the difficult task of quantifying prior knowledge about a parameter of high, often infinite, dimension. A statistician is unlikely to have informed opinions about all aspects of such a parameter but will have real information about functionals of the parameter, such as the population mean or variance. The paper proposes a new framework for non-parametric Bayes inference in which the prior distribution for a possibly infinite dimensional parameter is decomposed into two parts: an informative prior on a finite set of functionals, and a non-parametric conditional prior for the parameter given the functionals. Such priors can be easily constructed from standard non-parametric prior distributions in common use and inherit the large support of the standard priors on which they are based. Additionally, posterior approximations under these informative priors can generally be made via minor adjustments to existing Markov chain approximation algorithms for standard non-parametric prior distributions. We illustrate the use of such priors in the context of multivariate density estimation using Dirichlet process mixture models, and in the modelling of high dimensional sparse contingency tables. PMID:25663813
Cone beam x-ray luminescence computed tomography reconstruction with a priori anatomical information
NASA Astrophysics Data System (ADS)
Lo, Pei-An; Lin, Meng-Lung; Jin, Shih-Chun; Chen, Jyh-Cheng; Lin, Syue-Liang; Chang, C. Allen; Chiang, Huihua Kenny
2014-09-01
X-ray luminescence computed tomography (XLCT) is a novel molecular imaging modality that reconstructs the optical distribution of x-ray-excited phosphor particles with prior information from the anatomical CT image. The prior information improves the accuracy of image reconstruction, and the system can also present the anatomical CT image. The optical system, based on a highly sensitive charge-coupled device (CCD), is perpendicular to the CT system. In the XLCT system, the x-ray beam excites the phosphor in the sample and the CCD camera acquires the luminescence emitted from the sample over 360 degrees of free-space projections. In this study, a fluorescence diffuse optical tomography (FDOT)-like algorithm was used for image reconstruction; the structural prior information was incorporated in the reconstruction by adding a penalty term to the minimization function. The phosphor used in this study is Gd2O2S:Tb. For the simulations and experiments, data were collected from 16 projections. The cylinder phantom was 40 mm in diameter and contained an 8 mm diameter inclusion; the phosphor in the in vivo study was 5 mm in diameter at a depth of 3 mm. In both cases the reconstruction errors were no more than 5%. Based on the results from these simulation and experimental studies, the novel XLCT method has demonstrated its feasibility for in vivo animal model studies.
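The penalty-term formulation in the abstract can be sketched in a few lines; this is a toy linear, one-dimensional stand-in (the forward model, the penalty operator and all names below are illustrative assumptions, not the authors' actual FDOT-like algorithm):

```python
import numpy as np

# Minimal sketch of penalized reconstruction in the spirit of the paper:
# minimize ||A x - b||^2 + lam * ||L x||^2, where L encodes structural
# prior information (here a toy difference operator).
rng = np.random.default_rng(0)
n_meas, n_vox = 16, 50               # 16 projections, toy 1D "phantom"
A = rng.normal(size=(n_meas, n_vox)) # stand-in for the optical forward model
x_true = np.zeros(n_vox); x_true[20:28] = 1.0   # 8-voxel "inclusion"
b = A @ x_true + 0.01 * rng.normal(size=n_meas)

# First-difference operator as a smoothness/structure penalty
L = np.eye(n_vox) - np.eye(n_vox, k=1)
lam = 1.0

# Normal equations of the regularized problem
x_hat = np.linalg.solve(A.T @ A + lam * L.T @ L, A.T @ b)
print(np.round(x_hat[18:30], 2))
```

With the structural prior encoded in L, neighboring voxels are encouraged to take similar values; relaxing the penalty across known anatomical boundaries is the same mechanism used to fold CT information into the XLCT reconstruction.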
Determining informative priors for cognitive models.
Lee, Michael D; Vanpaemel, Wolf
2018-02-01
The development of cognitive models involves the creative scientific formalization of assumptions, based on theory, observation, and other relevant information. In the Bayesian approach to implementing, testing, and using cognitive models, assumptions can influence both the likelihood function of the model, usually corresponding to assumptions about psychological processes, and the prior distribution over model parameters, usually corresponding to assumptions about the psychological variables that influence those processes. The specification of the prior is unique to the Bayesian context, but often raises concerns that lead to the use of vague or non-informative priors in cognitive modeling. Sometimes the concerns stem from philosophical objections, but more often practical difficulties with how priors should be determined are the stumbling block. We survey several sources of information that can help to specify priors for cognitive models, discuss some of the methods by which this information can be formalized in a prior distribution, and identify a number of benefits of including informative priors in cognitive modeling. Our discussion is based on three illustrative cognitive models, involving memory retention, categorization, and decision making.
Bayesian network prior: network analysis of biological data using external knowledge
Isci, Senol; Dogan, Haluk; Ozturk, Cengizhan; Otu, Hasan H.
2014-01-01
Motivation: Reverse engineering gene interaction (GI) networks from experimental data is a challenging task due to the complex nature of the networks and the noise inherent in the data. One way to overcome these hurdles would be incorporating the vast amounts of external biological knowledge when building interaction networks. We propose a framework where GI networks are learned from experimental data using Bayesian networks (BNs) and the incorporation of external knowledge is also done via a BN that we call Bayesian Network Prior (BNP). BNP depicts the relation between various evidence types that contribute to the event ‘gene interaction’ and is used to calculate the probability of a candidate graph (G) in the structure learning process. Results: Our simulation results on synthetic, simulated and real biological data show that the proposed approach can identify the underlying interaction network with high accuracy even when the prior information is distorted and outperforms existing methods. Availability: Accompanying BNP software package is freely available for academic use at http://bioe.bilgi.edu.tr/BNP. Contact: hasan.otu@bilgi.edu.tr Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:24215027
Chruscicki, Adam; Badke, Katherin; Peddie, David; Small, Serena; Balka, Ellen; Hohl, Corinne M
2016-01-01
Adverse drug events (ADEs), harmful unintended consequences of medication use, are a leading cause of hospital admissions, yet are rarely documented in a structured format between care providers. We describe pilot-testing structured ADE documentation fields prior to integration into an electronic medical record (EMR). We completed a qualitative study at two Canadian hospitals. Using data derived from a systematic review of the literature, we developed screen mock-ups for an ADE reporting platform, iteratively revised in participatory workshops with diverse end-user groups. We designed a paper-based form reflecting the data elements contained in the mock-ups. We distributed them to a convenience sample of clinical pharmacists, and completed ethnographic workplace observations while the forms were used. We reviewed completed forms, collected feedback from pharmacists using semi-structured interviews, and coded the data in NVivo for themes related to the ADE form. We completed 25 h of clinical observations, and 24 ADEs were documented. Pharmacists perceived the form as simple and clear, with sufficient detail to capture ADEs. They identified fields for omission, and others requiring more detail. Pharmacists encountered barriers to documenting ADEs including uncertainty about what constituted a reportable ADE, inability to complete patient follow-up, the need for inter-professional communication to rule out alternative diagnoses, and concern about creating a permanent record. Paper-based pilot-testing allowed planning for important modifications in an ADE documentation form prior to implementation in an EMR. While paper-based piloting is rarely reported prior to EMR implementations, it can inform design and enhance functionality. Piloting with other groups of care providers and in different healthcare settings will likely lead to further revisions prior to broader implementations.
An efficient semi-supervised community detection framework in social networks.
Li, Zhen; Gong, Yong; Pan, Zhisong; Hu, Guyu
2017-01-01
Community detection is an important task across a number of research fields, including social science, biology, and physics. In the real world, topology information alone is often inadequate to accurately recover community structure because of its sparsity and noise. Potentially useful prior information, such as pairwise constraints consisting of must-link and cannot-link constraints, can be obtained from domain knowledge in many applications. Thus, combining network topology with prior information to improve community detection accuracy is promising. Previous methods mainly utilize must-link constraints but fail to make full use of cannot-link constraints. In this paper, we propose a semi-supervised community detection framework which can effectively incorporate both types of pairwise constraints into the detection process. In particular, must-link and cannot-link constraints are represented as positive and negative links, and we encode them by adding different graph regularization terms to penalize closeness of the nodes. Experiments on multiple real-world datasets show that the proposed framework significantly improves the accuracy of community detection.
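To make the encoding concrete, here is a minimal sketch (the weighting scheme, the constant alpha, and the use of spectral clustering are my illustrative assumptions, not the paper's exact framework): must-link pairs receive extra positive affinity, cannot-link pairs have their affinity reduced, and the adjusted matrix is clustered.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

# Toy graph: two loose communities {0,1,2} and {3,4,5}
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)

must_link = [(0, 2)]      # pairs known to share a community
cannot_link = [(2, 3)]    # pairs known to be in different communities
alpha = 2.0               # illustrative constraint weight

W = A.copy()
for i, j in must_link:
    W[i, j] = W[j, i] = W[i, j] + alpha              # reward closeness
for i, j in cannot_link:
    W[i, j] = W[j, i] = max(W[i, j] - alpha, 0.0)    # penalize closeness

labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(W)
print(labels)
```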
Restoring fire suppressed Texas oak woodlands to historic conditions using prescribed fire
Jeff C. Sparks; Michael C. Stambaugh; Eric L. Keith
2012-01-01
Comparable to many oak ecosystems across the eastern United States, oak woodlands in Texas display characteristics of changing composition and structure due to altered fire regimes. Information describing historic fire regimes suggests woodlands underwent relatively frequent and repeated burning prior to major Euro-American influence in the early 19th century. Oak...
Information Theoretic Secret Key Generation: Structured Codes and Tree Packing
ERIC Educational Resources Information Center
Nitinawarat, Sirin
2010-01-01
This dissertation deals with a multiterminal source model for secret key generation by multiple network terminals with prior and privileged access to a set of correlated signals complemented by public discussion among themselves. Emphasis is placed on a characterization of secret key capacity, i.e., the largest rate of an achievable secret key,…
ERIC Educational Resources Information Center
Mills, Robert J.; Dupin-Bryant, Pamela A.; Johnson, John D.; Beaulieu, Tanya Y.
2015-01-01
The demand for Information Systems (IS) graduates with expertise in Structured Query Language (SQL) and database management is vast and projected to increase as "big data" becomes ubiquitous. To prepare students to solve complex problems in a data-driven world, educators must explore instructional strategies to help link prior knowledge…
Modeling Protein Expression and Protein Signaling Pathways
Telesca, Donatello; Müller, Peter; Kornblau, Steven M.; Suchard, Marc A.; Ji, Yuan
2015-01-01
High-throughput functional proteomic technologies provide a way to quantify the expression of proteins of interest. Statistical inference centers on identifying the activation state of proteins and their patterns of molecular interaction formalized as dependence structure. Inference on dependence structure is particularly important when proteins are selected because they are part of a common molecular pathway. In that case, inference on dependence structure reveals properties of the underlying pathway. We propose a probability model that represents molecular interactions at the level of hidden binary latent variables that can be interpreted as indicators for active versus inactive states of the proteins. The proposed approach exploits available expert knowledge about the target pathway to define an informative prior on the hidden conditional dependence structure. An important feature of this prior is that it provides an instrument to explicitly anchor the model space to a set of interactions of interest, favoring a local search approach to model determination. We apply our model to reverse-phase protein array data from a study on acute myeloid leukemia. Our inference identifies relevant subpathways in relation to the unfolding of the biological process under study. PMID:26246646
Effects of prior information on decoding degraded speech: an fMRI study.
Clos, Mareike; Langner, Robert; Meyer, Martin; Oechslin, Mathias S; Zilles, Karl; Eickhoff, Simon B
2014-01-01
Expectations and prior knowledge are thought to support the perceptual analysis of incoming sensory stimuli, as proposed by the predictive-coding framework. The current fMRI study investigated the effect of prior information on brain activity during the decoding of degraded speech stimuli. When prior information enabled the comprehension of the degraded sentences, the left middle temporal gyrus and the left angular gyrus were activated, highlighting a role of these areas in meaning extraction. In contrast, the activation of the left inferior frontal gyrus (area 44/45) appeared to reflect the search for meaningful information in degraded speech material that could not be decoded because of mismatches with the prior information. Our results show that degraded sentences evoke instantaneously different percepts and activation patterns depending on the type of prior information, in line with prediction-based accounts of perception. Copyright © 2012 Wiley Periodicals, Inc.
Shen, Yang; Bax, Ad
2015-01-01
Chemical shifts are obtained at the first stage of any protein structural study by NMR spectroscopy. Chemical shifts are known to be impacted by a wide range of structural factors and the artificial neural network based TALOS-N program has been trained to extract backbone and sidechain torsion angles from 1H, 15N and 13C shifts. The program is quite robust, and typically yields backbone torsion angles for more than 90% of the residues, and sidechain χ1 rotamer information for about half of these, in addition to reliably predicting secondary structure. The use of TALOS-N is illustrated for the protein DinI, and torsion angles obtained by TALOS-N analysis from the measured chemical shifts of its backbone and 13Cβ nuclei are compared to those seen in a prior, experimentally determined structure. The program is also particularly useful for generating torsion angle restraints, which then can be used during standard NMR protein structure calculations. PMID:25502373
Yang, Linglu; Yan, Bo; Reinhard, Björn M.
2009-01-01
The optical spectra of individual Ag-Au alloy hollow particles were correlated with the particles’ structures obtained by transmission electron microscopy (TEM). The TEM provided direct experimental access to the dimension of the cavity, thickness of the metal shell, and the interparticle distance of hollow particle dimers with high spatial resolution. The analysis of correlated spectral and structural information enabled the quantification of the influence of the core-shell structure on the resonance energy, plasmon lifetime, and plasmon coupling efficiency. Electron beam exposure during TEM inspection was observed to affect plasmon wavelength and lifetime, making optical inspection prior to structural characterization mandatory. PMID:19768108
Pometti, Carolina L; Bessega, Cecilia F; Saidman, Beatriz O; Vilardi, Juan C
2014-03-01
Bayesian clustering as implemented in STRUCTURE or GENELAND software is widely used to form genetic groups of populations or individuals. On the other hand, in order to satisfy the need for less computer-intensive approaches, multivariate analyses are specifically devoted to extracting information from large datasets. In this paper, we report the use of a dataset of AFLP markers belonging to 15 sampling sites of Acacia caven for studying the genetic structure and comparing the consistency of three methods: STRUCTURE, GENELAND and DAPC. Of these methods, DAPC was the fastest one and showed accuracy in inferring the K number of populations (K = 12 using the find.clusters option and K = 15 with a priori information of populations). GENELAND in turn, provides information on the area of membership probabilities for individuals or populations in the space, when coordinates are specified (K = 12). STRUCTURE also inferred the number of K populations and the membership probabilities of individuals based on ancestry, presenting the result K = 11 without prior information of populations and K = 15 using the LOCPRIOR option. Finally, in this work all three methods showed high consistency in estimating the population structure, inferring similar numbers of populations and the membership probabilities of individuals to each group, with a high correlation between each other.
Safo, Sandra E; Li, Shuzhao; Long, Qi
2018-03-01
Integrative analysis of high dimensional omics data is becoming increasingly popular. At the same time, incorporating known functional relationships among variables in analysis of omics data has been shown to help elucidate underlying mechanisms for complex diseases. In this article, our goal is to assess association between transcriptomic and metabolomic data from a Predictive Health Institute (PHI) study that includes healthy adults at a high risk of developing cardiovascular diseases. Adopting a strategy that is both data-driven and knowledge-based, we develop statistical methods for sparse canonical correlation analysis (CCA) with incorporation of known biological information. Our proposed methods use prior network structural information among genes and among metabolites to guide selection of relevant genes and metabolites in sparse CCA, providing insight on the molecular underpinning of cardiovascular disease. Our simulations demonstrate that the structured sparse CCA methods outperform several existing sparse CCA methods in selecting relevant genes and metabolites when structural information is informative and are robust to mis-specified structural information. Our analysis of the PHI study reveals that a number of gene and metabolic pathways including some known to be associated with cardiovascular diseases are enriched in the set of genes and metabolites selected by our proposed approach. © 2017, The International Biometric Society.
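One common way to write such a network-guided sparse CCA objective (a generic sketch in my own notation; the authors' exact penalties and constraints may differ) is

\[
\max_{u,v}\; u^{\top} X^{\top} Y v \;-\; \lambda_{1}\big(\|u\|_{1} + \tau\, u^{\top} L_{g}\, u\big) \;-\; \lambda_{2}\big(\|v\|_{1} + \tau\, v^{\top} L_{m}\, v\big) \quad \text{s.t. } \|Xu\|_{2}\le 1,\; \|Yv\|_{2}\le 1,
\]

where X and Y hold the gene and metabolite data, the ℓ1 terms induce sparsity, and the graph-Laplacian terms (L_g built from the gene network, L_m from the metabolite network) encourage connected variables to be selected together.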
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Jiamin; Hoffman, Joanne; Zhao, Jocelyn
2016-07-15
Purpose: To develop an automated system for mediastinal lymph node detection and station mapping for chest CT. Methods: The contextual organs, trachea, lungs, and spine, are first automatically identified to locate the region of interest (ROI) (mediastinum). The authors employ shape features derived from Hessian analysis, local object scale, and circular transformation that are computed per voxel in the ROI. Eight more anatomical structures are simultaneously segmented by multiatlas label fusion. Spatial priors are defined as the relative multidimensional distance vectors corresponding to each structure. Intensity, shape, and spatial prior features are integrated and parsed by a random forest classifier for lymph node detection. The detected candidates are then segmented by a curve evolution process. Texture features are computed on the segmented lymph nodes and a support vector machine committee is used for final classification. For lymph node station labeling, based on the segmentation results of the above anatomical structures, the textual definitions of the mediastinal lymph node map according to the International Association for the Study of Lung Cancer are converted into a patient-specific color-coded CT image, where the lymph node station can be automatically assigned for each detected node. Results: The chest CT volumes from 70 patients with 316 enlarged mediastinal lymph nodes are used for validation. For lymph node detection, their system achieves 88% sensitivity at eight false positives per patient. For lymph node station labeling, 84.5% of lymph nodes are correctly assigned to their stations. Conclusions: Multiple-channel shape, intensity, and spatial prior features aggregated by a random forest classifier improve mediastinal lymph node detection on chest CT. Using the location information of segmented anatomic structures from the multiatlas formulation enables accurate identification of lymph node stations.
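The feature-aggregation step lends itself to a short sketch; everything below (feature dimensions, stand-in data, hyperparameters) is an illustrative assumption rather than the authors' pipeline, but it shows how per-voxel intensity, shape and spatial-prior channels are concatenated and parsed by a random forest.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_vox = 2000
intensity = rng.normal(size=(n_vox, 1))
shape     = rng.normal(size=(n_vox, 3))   # e.g., Hessian-derived shape features
spatial   = rng.normal(size=(n_vox, 8))   # distances to 8 segmented structures

# Concatenate the channels into one per-voxel feature vector
X = np.hstack([intensity, shape, spatial])
y = rng.integers(0, 2, size=n_vox)        # stand-in voxel labels (node / not node)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
node_prob = clf.predict_proba(X)[:, 1]    # per-voxel lymph-node likelihood
print(node_prob[:5].round(3))
```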
3D microwave tomography of the breast using prior anatomical information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Golnabi, Amir H., E-mail: golnabia@montclair.edu; Meaney, Paul M.; Paulsen, Keith D.
2016-04-15
Purpose: The authors have developed a new 3D breast image reconstruction technique that utilizes the soft tissue spatial resolution of magnetic resonance imaging (MRI) and integrates the dielectric property differentiation from microwave imaging to produce a dual modality approach with the goal of augmenting the specificity of MR imaging, possibly without the need for nonspecific contrast agents. The integration is performed through the application of a soft prior regularization which imports segmented geometric meshes generated from MR exams and uses them to constrain the microwave tomography algorithm to recover nearly uniform property distributions within segmented regions with sharp delineation between these internal subzones. Methods: Previous investigations have demonstrated that this approach is effective in 2D simulation and phantom experiments and also in clinical exams. The current study extends the algorithm to 3D and provides a thorough analysis of the sensitivity and robustness to misalignment errors in size and location between the spatial prior information and the actual data. Results: Image results in 3D were not strongly dependent on reconstruction mesh density, and changes of less than 30% in recovered property values arose from variations of more than 125% in target region size, an outcome which was more robust than in 2D. Similarly, changes of less than 13% occurred in the 3D image results from variations in target location of nearly 90% of the inclusion size. Permittivity and conductivity errors were about 5 times and 2 times smaller, respectively, with the 3D spatial prior algorithm in actual phantom experiments than those which occurred without priors. Conclusions: The presented study confirms that the incorporation of structural information in the form of a soft constraint can considerably improve the accuracy of the property estimates in predefined regions of interest. These findings are encouraging and establish a strong foundation for using the soft prior technique in clinical studies, where their microwave imaging system and MRI can simultaneously collect breast exam data in patients.
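Soft prior regularization of this kind is commonly written as (a generic sketch in my own notation, not necessarily the authors' exact formulation)

\[
\hat{\sigma} \;=\; \arg\min_{\sigma}\; \|F(\sigma) - d\|_{2}^{2} \;+\; \lambda \|L\sigma\|_{2}^{2},
\]

where F is the microwave forward model, d the measured data, and L a matrix assembled from the MR-segmented regions so that ‖Lσ‖² penalizes property variation between nodes inside the same region while leaving jumps across region boundaries unpenalized; this is what drives the nearly uniform within-region property estimates described above.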
Bayesian generalized linear mixed modeling of Tuberculosis using informative priors.
Ojo, Oluwatobi Blessing; Lougue, Siaka; Woldegerima, Woldegebriel Assefa
2017-01-01
TB is rated as one of the world's deadliest diseases, and South Africa ranks 9th among the 22 countries hardest hit by TB. Although many pieces of research have been carried out on this subject, this paper steps further by incorporating past knowledge into the model, using a Bayesian approach with an informative prior. The Bayesian approach is becoming popular in data analysis, but most applications of Bayesian inference are limited to situations with non-informative priors, where there is no solid external information about the distribution of the parameter of interest. The main aim of this study is to profile people living with TB in South Africa. In this paper, identical regression models are fitted for the classical and Bayesian approaches, the latter with both non-informative and informative priors, using South Africa General Household Survey (GHS) data for the year 2014. For the Bayesian model with informative prior, the South Africa General Household Survey datasets for the years 2011 to 2013 are used to set up priors for the 2014 model.
Xiao, Jian; Cao, Hongyuan; Chen, Jun
2017-09-15
Next generation sequencing technologies have enabled the study of the human microbiome through direct sequencing of microbial DNA, resulting in an enormous amount of microbiome sequencing data. One unique characteristic of microbiome data is the phylogenetic tree that relates all the bacterial species. Closely related bacterial species have a tendency to exhibit a similar relationship with the environment or disease. Thus, incorporating the phylogenetic tree information can potentially improve the detection power for microbiome-wide association studies, where hundreds or thousands of tests are conducted simultaneously to identify bacterial species associated with a phenotype of interest. Despite much progress in multiple testing procedures such as false discovery rate (FDR) control, methods that take into account the phylogenetic tree are largely limited. We propose a new FDR control procedure that incorporates the prior structure information and apply it to microbiome data. The proposed procedure is based on a hierarchical model, where a structure-based prior distribution is designed to utilize the phylogenetic tree. By borrowing information from neighboring bacterial species, we are able to improve the statistical power of detecting associated bacterial species while controlling the FDR at desired levels. When the phylogenetic tree is mis-specified or non-informative, our procedure achieves a similar power as traditional procedures that do not take into account the tree structure. We demonstrate the performance of our method through extensive simulations and real microbiome datasets. We identified far more alcohol-drinking associated bacterial species than traditional methods. R package StructFDR is available from CRAN. chen.jun2@mayo.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
NASA Astrophysics Data System (ADS)
Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.
2018-05-01
The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. It involves two distributions: the prior and the posterior. The posterior distribution is influenced by the selection of the prior distribution. Jeffreys' prior distribution is a kind of non-informative prior distribution, used when information about the parameters is not available. The non-informative Jeffreys' prior distribution is combined with the sample information, resulting in the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of a multivariate regression model using the Bayesian method with the non-informative Jeffreys' prior distribution. The parameter estimates of β and Σ are obtained from the expected values of the corresponding marginal posterior distributions, which are multivariate normal for β and inverse Wishart for Σ. However, calculating these expected values involves integrals of functions whose values are difficult to determine analytically. Therefore, an approach is needed that generates random samples according to the posterior distribution of each parameter, using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
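The combination of a multivariate-normal conditional for β and an inverse-Wishart conditional for Σ is easy to sketch as a Gibbs sampler (a minimal sketch under Jeffreys' prior; the dimensions and data below are synthetic stand-ins):

```python
import numpy as np
from scipy.stats import invwishart

# Model: Y = X B + E, rows of E ~ N_q(0, Sigma), Jeffreys' prior on (B, Sigma)
rng = np.random.default_rng(0)
n, p, q = 200, 3, 2
X = rng.normal(size=(n, p))
B_true = np.array([[1.0, -0.5], [0.5, 0.2], [0.0, 1.0]])
Y = X @ B_true + rng.normal(size=(n, q)) * 0.3

XtX_inv = np.linalg.inv(X.T @ X)
B_hat = XtX_inv @ X.T @ Y
C = np.linalg.cholesky(XtX_inv)

Sigma = np.eye(q)
draws = []
for it in range(2000):
    # B | Sigma, Y ~ matrix normal centered at the least-squares estimate
    Z = rng.normal(size=(p, q))
    B = B_hat + C @ Z @ np.linalg.cholesky(Sigma).T
    # Sigma | B, Y ~ inverse Wishart with scale = residual cross-product
    R = Y - X @ B
    Sigma = invwishart(df=n, scale=R.T @ R).rvs()
    if it >= 500:                      # discard burn-in
        draws.append(B)
print(np.mean(draws, axis=0).round(2))  # posterior mean estimate of B
```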
Depaoli, Sarah
2013-06-01
Growth mixture modeling (GMM) represents a technique that is designed to capture change over time for unobserved subgroups (or latent classes) that exhibit qualitatively different patterns of growth. The aim of the current article was to explore the impact of latent class separation (i.e., how similar growth trajectories are across latent classes) on GMM performance. Several estimation conditions were compared: maximum likelihood via the expectation maximization (EM) algorithm and the Bayesian framework implementing diffuse priors, "accurate" informative priors, weakly informative priors, data-driven informative priors, priors reflecting partial-knowledge of parameters, and "inaccurate" (but informative) priors. The main goal was to provide insight about the optimal estimation condition under different degrees of latent class separation for GMM. Results indicated that optimal parameter recovery was obtained though the Bayesian approach using "accurate" informative priors, and partial-knowledge priors showed promise for the recovery of the growth trajectory parameters. Maximum likelihood and the remaining Bayesian estimation conditions yielded poor parameter recovery for the latent class proportions and the growth trajectories. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
Schmidt, Hiemke K; Rothgangel, Martin; Grube, Dietmar
2017-12-01
Awareness of various arguments can help interactants present opinions, stress points, and build counterarguments during discussions. At school, some topics are taught in a way that students learn to accumulate knowledge and gather arguments, and later employ them during debates. Prior knowledge may facilitate recalling information on well structured, fact-based topics, but does it facilitate recalling arguments during discussions on complex, interdisciplinary topics? We assessed the prior knowledge in domains related to a bioethical topic of 277 students from Germany (approximately 15 years old), their interest in the topic, and their general knowledge. The students read a text with arguments for and against prenatal diagnostics and tried to recall the arguments one week later and again six weeks later. Prior knowledge in various domains related to the topic individually and separately helped students recall the arguments. These relationships were independent of students' interest in the topic and their general knowledge. Copyright © 2017 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
Code of Federal Regulations, 2011 CFR
2011-04-01
... have received confirmation of a prior notice from FDA? 1.282 Section 1.282 Food and Drugs FOOD AND DRUG... changes after you have received confirmation of a prior notice from FDA? (a)(1) If any of the information... information), changes after you receive notice that FDA has confirmed your prior notice submission for review...
Code of Federal Regulations, 2012 CFR
2012-04-01
... have received confirmation of a prior notice from FDA? 1.282 Section 1.282 Food and Drugs FOOD AND DRUG... changes after you have received confirmation of a prior notice from FDA? (a)(1) If any of the information... information), changes after you receive notice that FDA has confirmed your prior notice submission for review...
Code of Federal Regulations, 2013 CFR
2013-04-01
... have received confirmation of a prior notice from FDA? 1.282 Section 1.282 Food and Drugs FOOD AND DRUG... changes after you have received confirmation of a prior notice from FDA? (a)(1) If any of the information... information), changes after you receive notice that FDA has confirmed your prior notice submission for review...
Code of Federal Regulations, 2014 CFR
2014-04-01
... have received confirmation of a prior notice from FDA? 1.282 Section 1.282 Food and Drugs FOOD AND DRUG... changes after you have received confirmation of a prior notice from FDA? (a)(1) If any of the information... information), changes after you receive notice that FDA has confirmed your prior notice submission for review...
Code of Federal Regulations, 2010 CFR
2010-04-01
... have received confirmation of a prior notice from FDA? 1.282 Section 1.282 Food and Drugs FOOD AND DRUG... changes after you have received confirmation of a prior notice from FDA? (a)(1) If any of the information... information), changes after you receive notice that FDA has confirmed your prior notice submission for review...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richards, Elizabeth H.; Schindel, Kay; Bosiljevac, Tom
2011-12-01
Structural Considerations for Solar Installers provides a comprehensive outline of structural considerations associated with simplified solar installations and recommends a set of best practices installers can follow when assessing such considerations. Information in the manual comes from engineering and solar experts as well as case studies. The objectives of the manual are to ensure safety and structural durability for rooftop solar installations and to potentially accelerate the permitting process by identifying and remedying structural issues prior to installation. The purpose of this document is to provide tools and guidelines for installers to help ensure that residential photovoltaic (PV) power systems are properly specified and installed with respect to the continuing structural integrity of the building.
Dokoumetzidis, Aristides; Aarons, Leon
2005-08-01
We investigated the propagation of population pharmacokinetic information across clinical studies by applying Bayesian techniques. The aim was to summarize the population pharmacokinetic estimates of a study in appropriate statistical distributions in order to use them as Bayesian priors in subsequent population pharmacokinetic analyses. Various data sets of simulated and real clinical data were fitted with WinBUGS, with and without informative priors. The posterior estimates of fittings with non-informative priors were used to build parametric informative priors, and the whole procedure was carried on in a consecutive manner. The posterior distributions of the fittings with informative priors were compared to those of the meta-analysis fittings of the respective combinations of data sets. Good agreement was found for the simulated and experimental datasets when the populations were exchangeable, with the posterior distributions from the fittings with the prior being nearly identical to the ones estimated with meta-analysis. However, when populations were not exchangeable, an alternative parametric form for the prior, the natural conjugate prior, had to be used in order to obtain consistent results. In conclusion, the results of a population pharmacokinetic analysis may be summarized in Bayesian prior distributions that can be used consecutively with other analyses. The procedure is an alternative to meta-analysis and gives comparable results. It has the advantage that it is faster than meta-analysis, due to the large datasets used with the latter, and can be performed when the data included in the prior are not actually available.
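The core idea, summarizing one study's posterior parametrically and reusing it as the next study's prior, reduces to a conjugate toy example (normal mean with known observation variance; the actual analyses use full population PK models in WinBUGS, so everything below is an illustrative assumption):

```python
import numpy as np

def update(prior_mu, prior_var, data, obs_var):
    # Conjugate normal-mean update with known observation variance
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mu = post_var * (prior_mu / prior_var + data.sum() / obs_var)
    return post_mu, post_var

rng = np.random.default_rng(1)
study1 = rng.normal(10.0, 2.0, size=30)     # first clinical dataset
study2 = rng.normal(10.0, 2.0, size=25)     # second clinical dataset

# Study 1 analysed with a vague prior; posterior summarised parametrically
mu1, v1 = update(0.0, 1e6, study1, obs_var=4.0)
# That summary becomes the informative prior for study 2
mu2, v2 = update(mu1, v1, study2, obs_var=4.0)

# Compare with a meta-analysis of the pooled data under the vague prior
mu_meta, v_meta = update(0.0, 1e6, np.concatenate([study1, study2]), obs_var=4.0)
print(round(mu2, 3), round(mu_meta, 3))     # nearly identical, as in the paper
```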
Efficient structure from motion on large scenes using UAV with position and pose information
NASA Astrophysics Data System (ADS)
Teng, Xichao; Yu, Qifeng; Shang, Yang; Luo, Jing; Wang, Gang
2018-04-01
In this paper, we exploit prior information from global positioning systems and inertial measurement units to speed up the process of large scene reconstruction from images acquired by unmanned aerial vehicles. We utilize weak pose information and the intrinsic parameters to obtain the projection matrix for each view. Since topographic relief can usually be ignored compared to the flight altitude of unmanned aerial vehicles, we assume that the scene is flat and use a weak perspective camera model to obtain projective transformations between two views. Furthermore, we propose an overlap criterion and select potentially matching view pairs among the projectively transformed views. A robust global structure from motion method is used for image-based reconstruction. Our real-world experiments show that the approach is accurate, scalable and computationally efficient. Moreover, the projective transformations between views can also be used to eliminate false matches.
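A minimal sketch of the overlap criterion (the homography, the threshold and the function names are my illustrative assumptions, not the authors' implementation): with the flat-scene approximation, weak pose plus intrinsics yield a projective transformation between two views, and the fraction of one image footprint that lands inside the other decides whether the pair is kept for matching.

```python
import numpy as np

def footprint_overlap(H, w, h, grid=200):
    # Fraction of image-2 pixels whose preimage under H falls inside image 1
    ys, xs = np.mgrid[0:h:grid * 1j, 0:w:grid * 1j]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    back = np.linalg.inv(H) @ pts
    back = back[:2] / back[2]
    inside = ((back[0] >= 0) & (back[0] < w) &
              (back[1] >= 0) & (back[1] < h))
    return inside.mean()

H = np.array([[1.0, 0.0, 120.0],    # stand-in homography from GPS/IMU pose
              [0.0, 1.0,  40.0],
              [0.0, 0.0,   1.0]])
ratio = footprint_overlap(H, w=640, h=480)
if ratio > 0.3:                      # illustrative overlap threshold
    print(f"overlap {ratio:.2f}: keep pair for feature matching")
```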
ERIC Educational Resources Information Center
Wijekumar, Kausalai; Meyer, Bonnie J. F.; Lei, Puiwa; Cheng, Weiyi; Ji, Xuejun; Joshi, R. M.
2017-01-01
Reading and comprehending content area texts require learners to effectively select and encode with hierarchically strategic memory structures in order to combine new information with prior knowledge. Unfortunately, evidence from state and national tests shows that children fail to successfully navigate the reading comprehension challenges they…
Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography
Jesse, S.; Chi, M.; Belianinov, A.; Beekman, C.; Kalinin, S. V.; Borisevich, A. Y.; Lupini, A. R.
2016-01-01
Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. Here, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. However, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy. PMID:27211523
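The purely statistical nature of the clustering is easy to reproduce in miniature (the synthetic image, patch size and cluster count below are illustrative stand-ins, not the BiFeO3 data): local patches are flattened into vectors and k-means groups them with no physical prior.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[:, 32:] = 1.0                          # two artificial "domains"
img += 0.1 * rng.normal(size=img.shape)    # noise

k = 5                                      # patch size
pad = k // 2
patches = np.stack([img[i - pad:i + pad + 1, j - pad:j + pad + 1].ravel()
                    for i in range(pad, 64 - pad)
                    for j in range(pad, 64 - pad)])

# Unsupervised clustering: no material, phase, or structure information given
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(patches)
label_map = labels.reshape(64 - 2 * pad, 64 - 2 * pad)
print(np.unique(labels, return_counts=True))
```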
Estimating Bayesian Phylogenetic Information Content
Lewis, Paul O.; Chen, Ming-Hui; Kuo, Lynn; Lewis, Louise A.; Fučíková, Karolina; Neupane, Suman; Wang, Yu-Bo; Shi, Daoyuan
2016-01-01
Measuring the phylogenetic information content of data has a long history in systematics. Here we explore a Bayesian approach to information content estimation. The entropy of the posterior distribution compared with the entropy of the prior distribution provides a natural way to measure information content. If the data have no information relevant to ranking tree topologies beyond the information supplied by the prior, the posterior and prior will be identical. Information in data discourages consideration of some hypotheses allowed by the prior, resulting in a posterior distribution that is more concentrated (has lower entropy) than the prior. We focus on measuring information about tree topology using marginal posterior distributions of tree topologies. We show that both the accuracy and the computational efficiency of topological information content estimation improve with use of the conditional clade distribution, which also allows topological information content to be partitioned by clade. We explore two important applications of our method: providing a compelling definition of saturation and detecting conflict among data partitions that can negatively affect analyses of concatenated data. [Bayesian; concatenation; conditional clade distribution; entropy; information; phylogenetics; saturation.] PMID:27155008
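In the notation suggested by the abstract, the information content of the data about topology can be written as the entropy drop from prior to posterior (a sketch in my own symbols):

\[
I \;=\; H(\pi_{\text{prior}}) - H(\pi_{\text{post}}), \qquad H(\pi) \;=\; -\sum_{\tau} \pi(\tau)\,\log \pi(\tau),
\]

where the sum runs over tree topologies τ. If the data carry no topological information, the posterior equals the prior and I = 0; informative data concentrate the posterior, lowering its entropy and increasing I.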
Worldwide Protein Data Bank validation information: usage and trends.
Smart, Oliver S; Horský, Vladimír; Gore, Swanand; Svobodová Vařeková, Radka; Bendová, Veronika; Kleywegt, Gerard J; Velankar, Sameer
2018-03-01
Realising the importance of assessing the quality of the biomolecular structures deposited in the Protein Data Bank (PDB), the Worldwide Protein Data Bank (wwPDB) partners established Validation Task Forces to obtain advice on the methods and standards to be used to validate structures determined by X-ray crystallography, nuclear magnetic resonance spectroscopy and three-dimensional electron cryo-microscopy. The resulting wwPDB validation pipeline is an integral part of the wwPDB OneDep deposition, biocuration and validation system. The wwPDB Validation Service webserver (https://validate.wwpdb.org) can be used to perform checks prior to deposition. Here, it is shown how validation metrics can be combined to produce an overall score that allows the ranking of macromolecular structures and domains in search results. The ValTrends DB database provides users with a convenient way to access and analyse validation information and other properties of X-ray crystal structures in the PDB, including investigating trends in and correlations between different structure properties and validation metrics.
Faulon, Jean-Loup; Misra, Milind; Martin, Shawn; ...
2007-11-23
Motivation: Identifying protein enzymatic or pharmacological activities is an important area of research in biology and chemistry. Biological and chemical databases are increasingly being populated with linkages between protein sequences and chemical structures. Additionally, there is now sufficient information to apply machine-learning techniques to predict interactions between chemicals and proteins at a genome scale. Current machine-learning techniques use as input either protein sequences and structures or chemical information. We propose here a method to infer protein–chemical interactions using heterogeneous input consisting of both protein sequence and chemical information. Results: Our method relies on expressing proteins and chemicals with a common cheminformatics representation. We demonstrate our approach by predicting whether proteins can catalyze reactions not present in training sets. We also predict whether a given drug can bind a target, in the absence of prior binding information for that drug and target. Lastly, such predictions cannot be made with current machine-learning techniques requiring binding information for individual reactions or individual targets.
Phantom experiments using soft-prior regularization EIT for breast cancer imaging.
Murphy, Ethan K; Mahara, Aditya; Wu, Xiaotian; Halter, Ryan J
2017-06-01
A soft-prior regularization (SR) electrical impedance tomography (EIT) technique for breast cancer imaging is described, which shows an ability to accurately reconstruct tumor/inclusion conductivity values within a dense breast model investigated using a cylindrical and a breast-shaped tank. The SR-EIT method relies on knowing the spatial location of a suspicious lesion initially detected from a second imaging modality. Standard approaches (using Laplace smoothing and total variation regularization) without prior structural information are unable to accurately reconstruct or detect the tumors. The soft-prior approach represents a very significant improvement to these standard approaches, and has the potential to improve conventional imaging techniques, such as automated whole breast ultrasound (AWB-US), by providing electrical property information of suspicious lesions to improve AWB-US's ability to discriminate benign from cancerous lesions. Specifically, the best soft-regularization technique found average absolute tumor/inclusion errors of 0.015 S m⁻¹ for the cylindrical test and 0.055 S m⁻¹ and 0.080 S m⁻¹ for the breast-shaped tank for 1.8 cm and 2.5 cm inclusions, respectively. The standard approaches were statistically unable to distinguish the tumor from the mammary gland tissue. An analysis of false tumors (benign suspicious lesions) provides extra insight into the potential and challenges EIT has for providing clinically relevant information. The ability to obtain accurate conductivity values of a suspicious lesion (>1.8 cm) detected from another modality (e.g. AWB-US) could significantly reduce false positives and result in a clinically important technology.
Barriers and facilitators of consumer use of nutrition labels at sit-down restaurant chains.
Auchincloss, Amy H; Young, Candace; Davis, Andrea L; Wasson, Sara; Chilton, Mariana; Karamanian, Vanesa
2013-12-01
Numerous localities have mandated that chain restaurants post nutrition information at the point of purchase. However, some studies suggest that consumers are not highly responsive to menu labelling. The present qualitative study explored influences on full-service restaurant customers’ noticing and using menu labelling. Five focus groups were conducted with thirty-six consumers. A semi-structured script elicited barriers and facilitators to using nutrition information by showing excerpts of real menus from full-service chain restaurants. Participants were recruited from a full-service restaurant chain in Philadelphia, Pennsylvania, USA, in September 2011. Focus group participants were mostly female, African American, with incomes <$US 60 000, mean age 36 years and education 14·5 years. At recruitment, 33 % (n 12) reported changing their order after seeing nutrition information on the menu. Three themes characterized influences on label use in restaurants: nutrition knowledge, menu design and display, and normative attitudes and behaviours. Barriers to using labels were low prior knowledge of nutrition; displaying nutrition information using codes; low expectations of the nutritional quality of restaurant food; and restaurant discounts, promotions and social influences that overwhelmed interest in nutrition and reinforced disinterest in nutrition. Facilitators were higher prior knowledge of recommended daily intake; spending time reading the menu; having strong prior interest in nutrition/healthy eating; and being with people who reinforced dietary priorities. Menu labelling use may increase if consumers learn a few key recommended dietary reference values, understand basic energy intake/expenditure scenarios and if chain restaurants present nutrition information in a user-friendly way and promote healthier items.
Conceptual change strategies in teaching genetics
NASA Astrophysics Data System (ADS)
Batzli, Laura Elizabeth
The purpose of this study was to evaluate the effectiveness of utilizing conceptual change strategies when teaching high school genetics. The study examined the effects of structuring instruction to provide students with cognitive situations which promote conceptual change, specifically instruction was structured to elicit students' prior knowledge. The goal of the study was that the students would not only be able to solve genetics problems and define basic terminology but they would also have constructed more scientific schemas of the actual processes involved in inheritance. This study is based on the constructivist theory of learning and conceptual change research which suggest that students are actively involved in the process of relating new information to prior knowledge as they construct new knowledge. Two sections of biology II classes received inquiry based instruction and participated in structured cooperative learning groups. However, the unique difference in the treatment group's instruction was the use of structured thought time and the resulting social interaction between the students. The treatment group students' instructional design allowed students to socially construct their cognitive knowledge after elicitation of their prior knowledge. In contrast, the instructional design for the control group students allowed them to socially construct their cognitive knowledge of genetics without the individually structured thought time. The results indicated that the conceptual change strategies with individually structured thought time improved the students' scientific mastery of genetics concepts and they maintained fewer post instructional alternative conceptions. Although all students gained the ability to correctly solve genetics problems, the treatment group students were able to explain the processes involved in terms of meiosis. The treatment group students were also able to better apply their knowledge to novel genetic situations. The implications for genetics instruction from these results were discussed.
Variable Selection with Prior Information for Generalized Linear Models via the Prior LASSO Method.
Jiang, Yuan; He, Yunxiao; Zhang, Heping
LASSO is a popular statistical tool often used in conjunction with generalized linear models that can simultaneously select variables and estimate parameters. When there are many variables of interest, as in current biological and biomedical studies, the power of LASSO can be limited. Fortunately, so much biological and biomedical data have been collected and they may contain useful information about the importance of certain variables. This paper proposes an extension of LASSO, namely, prior LASSO (pLASSO), to incorporate that prior information into penalized generalized linear models. The goal is achieved by adding in the LASSO criterion function an additional measure of the discrepancy between the prior information and the model. For linear regression, the whole solution path of the pLASSO estimator can be found with a procedure similar to the Least Angle Regression (LARS). Asymptotic theories and simulation results show that pLASSO provides significant improvement over LASSO when the prior information is relatively accurate. When the prior information is less reliable, pLASSO shows great robustness to the misspecification. We illustrate the application of pLASSO using a real data set from a genome-wide association study.
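A schematic form of the pLASSO criterion, as described verbally above (my notation; the paper's exact discrepancy measure and weighting may differ), is

\[
\hat{\beta} \;=\; \arg\min_{\beta}\; -\ell(\beta) \;+\; \lambda \|\beta\|_{1} \;+\; \eta\, d\big(\beta, \beta^{(0)}\big),
\]

where ℓ is the generalized-linear-model log-likelihood, the ℓ1 term performs selection as in ordinary LASSO, β(0) encodes the prior information, and d measures the discrepancy between the fitted model and that prior guess; η controls how strongly the prior is trusted, which is why the method degrades gracefully when the prior is unreliable.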
NASA Astrophysics Data System (ADS)
Aydin, Orhun; Caers, Jef Karel
2017-08-01
Faults are one of the building blocks for subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to the location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and the interpreter's intuition about fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods that address specific sources of fault network uncertainty and complexities of fault modeling exist, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, specifically a marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set based approach. A Markov chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from the Nankai Trough & Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data, where faults are only partially visible and many faults are missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone similar tectonics to the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces. We show that the proposed methodology generates realistic fault network models conditioned to data and a conceptual model of the underlying tectonics.
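For orientation, the unmarked Strauss density that underlies the model has the form (a textbook sketch; the paper's marked version attaches geometric marks such as orientation and size to each point)

\[
f(\mathbf{x}) \;\propto\; \beta^{\,n(\mathbf{x})}\, \gamma^{\,s_{r}(\mathbf{x})}, \qquad 0 \le \gamma \le 1,
\]

where n(x) is the number of points (faults), s_r(x) the number of point pairs closer than an interaction radius r, β controls intensity and γ penalizes clustering; γ = 1 recovers a Poisson process.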
Project Prospector: Unmanned Exploration and Apollo Support Program
NASA Technical Reports Server (NTRS)
1969-01-01
Prior to the establishment of a manned lunar observatory or base, it is essential that a compendium of information be available on the environment, composition, structure, and topography of the moon. In an effort to satisfy this need for improved and detailed information, NASA has undertaken a lunar program which ranges from the utilization of circumlunar flight vehicles, equipped with automatic photographic and radiation measuring equipment which responds to commands from the earth, to actual determination of surface composition and features obtained from unmanned instrumented spacecraft which impact the moon.
Liu, Bo; Wu, Huayi; Wang, Yandong; Liu, Wenming
2015-01-01
Main road features extracted from remotely sensed imagery play an important role in many civilian and military applications, such as updating Geographic Information System (GIS) databases, urban structure analysis, spatial data matching and road navigation. Current methods for road feature extraction from high-resolution imagery are typically based on threshold value segmentation. It is difficult, however, to completely separate road features from the background. We present a new method for extracting main roads from high-resolution grayscale imagery based on directional mathematical morphology and prior knowledge obtained from the Volunteered Geographic Information found in OpenStreetMap. The two salient steps in this strategy are: (1) using directional mathematical morphology to enhance the contrast between roads and non-roads; (2) using OpenStreetMap roads as prior knowledge to segment the remotely sensed imagery. Experiments were conducted on two ZiYuan-3 images and one QuickBird high-resolution grayscale image to compare our proposed method to other commonly used techniques for road feature extraction. The results demonstrated the validity and better performance of the proposed method for urban main road feature extraction. PMID:26397832
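Step (1) can be sketched directly (the structuring-element length, the angles and the threshold below are illustrative assumptions; the OSM-prior segmentation of step (2) is not reproduced here): opening the image with line segments at several orientations and taking the per-pixel maximum preserves bright elongated structures such as roads.

```python
import numpy as np
from scipy.ndimage import grey_opening, rotate

def line_se(length, angle_deg):
    # Binary line-shaped structuring element at the given orientation
    se = np.zeros((length, length))
    se[length // 2, :] = 1.0
    se = rotate(se, angle_deg, reshape=False, order=0)
    return se > 0.5

rng = np.random.default_rng(0)
img = rng.random((128, 128)) * 0.3
img[60:63, :] = 1.0                               # synthetic bright road

# Directional openings; the maximum keeps structures aligned with any angle
opened = [grey_opening(img, footprint=line_se(15, a))
          for a in range(0, 180, 15)]
enhanced = np.max(opened, axis=0)
road_mask = enhanced > 0.8                        # crude threshold; the OSM
print(road_mask.sum())                            # prior would guide this step
```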
Pometti, Carolina L.; Bessega, Cecilia F.; Saidman, Beatriz O.; Vilardi, Juan C.
2014-01-01
Bayesian clustering as implemented in the STRUCTURE or GENELAND software is widely used to form genetic groups of populations or individuals. On the other hand, in order to satisfy the need for less computer-intensive approaches, multivariate analyses are specifically devoted to extracting information from large datasets. In this paper, we report the use of a dataset of AFLP markers belonging to 15 sampling sites of Acacia caven for studying the genetic structure and comparing the consistency of three methods: STRUCTURE, GENELAND and DAPC. Of these methods, DAPC was the fastest and accurately inferred the number of populations K (K = 12 using the find.clusters option and K = 15 with a priori information on populations). GENELAND, in turn, provides spatial membership probabilities for individuals or populations when coordinates are specified (K = 12). STRUCTURE also inferred the number of populations K and the membership probabilities of individuals based on ancestry, yielding K = 11 without prior information on populations and K = 15 using the LOCPRIOR option. Finally, in this work all three methods showed high consistency in estimating the population structure, inferring similar numbers of populations and similar membership probabilities of individuals to each group, with a high correlation between each other. PMID:24688293
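DAPC itself is implemented in the R package adegenet; as a hedged illustration of its structure (dimension reduction, clustering, then discriminant analysis), here is a rough scikit-learn analogue on random stand-in data. The cluster count and component numbers are placeholders; the real procedure selects K with an information criterion.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# X: individuals x AFLP markers (0/1 band presence); random stand-in data
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(300, 500)).astype(float)

# Step 1 (analogue of find.clusters): PCA, then k-means on the components
pcs = PCA(n_components=50).fit_transform(X)
groups = KMeans(n_clusters=12, n_init=10, random_state=1).fit_predict(pcs)

# Step 2: discriminant analysis of the principal components (the "DA" in DAPC)
dapc = LinearDiscriminantAnalysis().fit(pcs, groups)
membership = dapc.predict_proba(pcs)   # per-individual membership probabilities
print(membership.shape)                # (300, n_groups)
```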
40 CFR 60.2953 - What information must I submit prior to initial startup?
Code of Federal Regulations, 2013 CFR
2013-07-01
... initial startup? 60.2953 Section 60.2953 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... and Reporting § 60.2953 What information must I submit prior to initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s...
40 CFR 60.2195 - What information must I submit prior to initial startup?
Code of Federal Regulations, 2014 CFR
2014-07-01
... initial startup? 60.2195 Section 60.2195 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... What information must I submit prior to initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s) of waste to be burned...
40 CFR 60.2953 - What information must I submit prior to initial startup?
Code of Federal Regulations, 2014 CFR
2014-07-01
... initial startup? 60.2953 Section 60.2953 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... and Reporting § 60.2953 What information must I submit prior to initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s...
40 CFR 60.2195 - What information must I submit prior to initial startup?
Code of Federal Regulations, 2013 CFR
2013-07-01
... initial startup? 60.2195 Section 60.2195 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... What information must I submit prior to initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s) of waste to be burned...
Biomedical image segmentation using geometric deformable models and metaheuristics.
Mesejo, Pablo; Valsecchi, Andrea; Marrakchi-Kacem, Linda; Cagnoni, Stefano; Damas, Sergio
2015-07-01
This paper describes a hybrid level set approach for medical image segmentation. This new geometric deformable model combines region- and edge-based information with prior shape knowledge introduced using deformable registration. Our proposal consists of two phases: training and test. The former involves learning the level set parameters by means of a Genetic Algorithm, while the latter is the segmentation proper, where another metaheuristic, in this case Scatter Search, derives the shape prior. In an experimental comparison, this approach outperformed a number of state-of-the-art methods when segmenting anatomical structures from different biomedical image modalities. Copyright © 2013 Elsevier Ltd. All rights reserved.
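The paper's GA-tuned level set and Scatter Search shape prior are beyond a short sketch, but the region-based term at the heart of such geometric deformable models can be illustrated with a piecewise-constant two-phase iteration; lambda1 and lambda2 stand in for weights the metaheuristics would tune, and a median filter stands in for the curvature/edge smoothing.

```python
import numpy as np
from scipy import ndimage

def two_phase_region_segmentation(image, n_iter=50, lambda1=1.0, lambda2=1.0):
    """Piecewise-constant two-phase region term (Chan-Vese flavour)."""
    image = np.asarray(image, dtype=float)
    mask = image > image.mean()                  # crude initial level set sign
    for _ in range(n_iter):
        if not mask.any() or mask.all():         # degenerate split: stop
            break
        c1, c2 = image[mask].mean(), image[~mask].mean()
        # assign each pixel to the region whose mean it matches best
        mask = lambda1 * (image - c1) ** 2 < lambda2 * (image - c2) ** 2
        mask = ndimage.median_filter(mask.astype(np.uint8), size=3).astype(bool)
    return mask

# Example on a synthetic blob image
rng = np.random.default_rng(0)
img = rng.normal(0.0, 0.2, (64, 64))
img[20:40, 20:40] += 1.0
seg = two_phase_region_segmentation(img)
print(seg.sum(), "pixels segmented as foreground")
```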
Collaborative learning in networks.
Mason, Winter; Watts, Duncan J
2012-01-17
Complex problems in science, business, and engineering typically require some tradeoff between exploitation of known solutions and exploration for novel ones, where, in many cases, information about known solutions can also disseminate among individual problem solvers through formal or informal networks. Prior research on complex problem solving by collectives has found the counterintuitive result that inefficient networks, meaning networks that disseminate information relatively slowly, can perform better than efficient networks for problems that require extended exploration. In this paper, we report on a series of 256 Web-based experiments in which groups of 16 individuals collectively solved a complex problem and shared information through different communication networks. As expected, we found that collective exploration improved average success over independent exploration because good solutions could diffuse through the network. In contrast to prior work, however, we found that efficient networks outperformed inefficient networks, even in a problem space with qualitative properties thought to favor inefficient networks. We explain this result in terms of individual-level explore-exploit decisions, which we find were influenced by the network structure as well as by strategic considerations and the relative payoff between maxima. We conclude by discussing implications for real-world problem solving and possible extensions. PMID:22184216
Perception on Informed Consent Regarding Nursing Care Practices in a Tertiary Care Center.
Paudel, B; Shrestha, G K
Background Consent for care procedures is mandatory after receipt of adequate information. It maintains patients' rights and autonomy to make thoughtful decisions. Poor communication often leads to poor health quality. Objective To assess hospitalized patients' perception of informed consent regarding nursing care practices in a tertiary care center. Method This is a descriptive cross-sectional study among 113 admitted patients conducted in February 2012 at Dhulikhel Hospital, Nepal. Patients of various wards were selected using purposive non-probability sampling, with at least 3 days of hospitalization. A close-ended, structured questionnaire was used to assess patients' perception of three different areas of informed consent (information giving, opportunity to make decisions and taking prior consent). Result Among the participants, 71.6% perceived positively regarding informed consent towards nursing care practices, with a mean score of 3.32 ± 1.28. Patients' perceptions of the various areas of informed consent, viz. information giving, opportunity to make specific decisions and taking prior consent, were all positive, with mean values of 3.43±1.12, 2.88±1.23 and 3.65±1.49 respectively. Comparison of mean perception of informed consent with various variables revealed insignificant correlation (p-value > 0.05) for age, educational level and previous hospitalization, while it was significant (p-value < 0.05) for communication skills of nurses. Conclusion The majority of patients have a positive perception of informed consent towards nursing care practices. Communication skills of nurses affect patients' perception regardless of age, education level and past experiences.
2 1/2-Year-Old Children Use Animacy and Syntax to Learn a New Noun
ERIC Educational Resources Information Center
Childers, Jane B.; Echols, Catharine H.
2004-01-01
We examine how attention to animacy information may contribute to children's developing knowledge of language. This research extends beyond prior research in that children were shown dynamic events with novel entities, and were asked not only to comprehend sentences but to use sentence structure to infer the meaning of a new word. In a 4 x 3…
Krasichkov, A S; Grigoriev, E B; Nifontov, E M; Shapovalov, V V
The paper presents an algorithm for cardio complex classification as part of processing the data of continuous cardiac monitoring. R-wave detection concurrently with cardio complex sorting is discussed. The core of this approach is the use of prior information about cardio complex forms, segmental structure, and degree of kindness. Results of testing the sorting algorithm are provided.
Structuring and extracting knowledge for the support of hypothesis generation in molecular biology
Roos, Marco; Marshall, M Scott; Gibson, Andrew P; Schuemie, Martijn; Meij, Edgar; Katrenko, Sophia; van Hage, Willem Robert; Krommydas, Konstantinos; Adriaans, Pieter W
2009-01-01
Background Hypothesis generation in molecular and cellular biology is an empirical process in which knowledge derived from prior experiments is distilled into a comprehensible model. The requirement for automated support is exemplified by the difficulty of considering all relevant facts that are contained in the millions of documents available from PubMed. The Semantic Web provides tools for sharing prior knowledge, while information retrieval and information extraction techniques enable its extraction from literature. Their combination makes prior knowledge available for computational analysis and inference. While some tools provide complete solutions that limit control over the modeling and extraction processes, we seek a methodology that supports control by the experimenter over these critical processes. Results We describe progress towards automated support for the generation of biomolecular hypotheses. Semantic Web technologies are used to structure and store knowledge, while a workflow extracts knowledge from text. We designed minimal proto-ontologies in OWL for capturing different aspects of a text mining experiment: the biological hypothesis, text and documents, text mining, and workflow provenance. The models fit a methodology that allows focus on the requirements of a single experiment while supporting reuse and posterior analysis of extracted knowledge from multiple experiments. Our workflow is composed of services from the 'Adaptive Information Disclosure Application' (AIDA) toolkit as well as a few others. The output is a semantic model with putative biological relations, with each relation linked to the corresponding evidence. Conclusion We demonstrated a 'do-it-yourself' approach for structuring and extracting knowledge in the context of experimental research on biomolecular mechanisms. The methodology can be used to bootstrap the construction of semantically rich biological models using the results of knowledge extraction processes. Models specific to particular experiments can be constructed that, in turn, link with other semantic models, creating a web of knowledge that spans experiments. Mapping mechanisms can link to other knowledge resources such as OBO ontologies or SKOS vocabularies. AIDA Web Services can be used to design personalized knowledge extraction procedures. In our example experiment, we found three proteins (NF-Kappa B, p21, and Bax) potentially playing a role in the interplay between nutrients and epigenetic gene regulation. PMID:19796406
Pedroza, Claudia; Han, Weilu; Thanh Truong, Van Thi; Green, Charles; Tyson, Jon E
2018-01-01
One of the main advantages of Bayesian analyses of clinical trials is their ability to formally incorporate skepticism about large treatment effects through the use of informative priors. We conducted a simulation study to assess the performance of informative normal, Student-t, and beta distributions in estimating relative risk (RR) or odds ratio (OR) for binary outcomes. Simulation scenarios varied the prior standard deviation (SD; level of skepticism of large treatment effects), outcome rate in the control group, true treatment effect, and sample size. We compared the priors with regard to bias, mean squared error (MSE), and coverage of 95% credible intervals. Simulation results show that the prior SD influenced the posterior to a greater degree than the particular distributional form of the prior. For RR, priors with a 95% interval of 0.50-2.0 performed well in terms of bias, MSE, and coverage under most scenarios. For OR, priors with a wider 95% interval of 0.23-4.35 had good performance. We recommend the use of informative priors that exclude implausibly large treatment effects in analyses of clinical trials, particularly for major outcomes such as mortality.
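As a worked example of such a skeptical prior, here is a minimal normal-normal update on the log-RR scale, with the prior SD chosen so that the 95% prior interval for RR is 0.50-2.0; the trial counts are hypothetical.

```python
import numpy as np

# Skeptical prior on log RR: 95% prior interval for RR of (0.5, 2.0)
prior_mean = 0.0
prior_sd = np.log(2.0) / 1.96                     # ~0.354

# Hypothetical trial: 20/100 events (treatment) vs 30/100 (control)
log_rr_hat = np.log((20 / 100) / (30 / 100))      # observed log RR
se = np.sqrt(1/20 - 1/100 + 1/30 - 1/100)         # delta-method SE of log RR

# Conjugate normal-normal update: precision-weighted average
w_prior, w_lik = 1 / prior_sd**2, 1 / se**2
post_mean = (w_prior * prior_mean + w_lik * log_rr_hat) / (w_prior + w_lik)
post_sd = np.sqrt(1 / (w_prior + w_lik))

lo, hi = np.exp(post_mean - 1.96 * post_sd), np.exp(post_mean + 1.96 * post_sd)
print(f"posterior RR {np.exp(post_mean):.2f}, 95% CrI ({lo:.2f}, {hi:.2f})")
```

The skeptical prior pulls the observed RR of 0.67 toward 1, illustrating how implausibly large effects are damped.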
Farid, Ahmed; Abdel-Aty, Mohamed; Lee, Jaeyoung; Eluru, Naveen
2017-09-01
Safety performance functions (SPFs) are essential tools for highway agencies to predict crashes, identify hotspots and assess safety countermeasures. In the Highway Safety Manual (HSM), a variety of SPFs are provided for different types of roadway facilities, crash types and severity levels. Agencies lacking the necessary resources to develop their own localized SPFs may opt to apply the HSM's SPFs to their jurisdictions. Yet, municipalities that want to develop and maintain their regional SPFs might encounter the issue of small-sample bias. Bayesian inference can address this issue by combining the current data with prior information to achieve reliable results. It follows that the essence of Bayesian statistics is the application of informative priors, obtained from other SPFs or experts' experiences. In this study, we investigate the applicability of informative priors for Bayesian negative binomial SPFs for rural divided multilane highway segments in Florida and California. An SPF with non-informative priors is developed for each state, and its parameters' distributions are assigned to the other state's SPF as informative priors. The performance of the SPFs is evaluated by applying each state's SPFs to the other state. The analysis is conducted for both total (KABCO) and severe (KAB) crashes. As per the results, applying one state's SPF with informative priors, which are the other state's SPF independent variable estimates, to the latter state's conditions yields better goodness of fit (GOF) values than applying the former state's SPF with non-informative priors to the conditions of the latter state. This holds for both total and severe crash SPFs. Hence, for localities that prefer not to develop their own localized SPFs and instead adopt SPFs from elsewhere to cut down on resources, the application of informative priors is shown to facilitate the process. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.
A multistage motion vector processing method for motion-compensated frame interpolation.
Huang, Ai- Mei; Nguyen, Truong Q
2008-05-01
In this paper, a novel, low-complexity motion vector processing algorithm at the decoder is proposed for motion-compensated frame interpolation or frame rate up-conversion. We address the problems of broken edges and deformed structures in an interpolated frame by hierarchically refining motion vectors on different block sizes. Our method explicitly considers the reliability of each received motion vector and is capable of preserving structure information. This is achieved by analyzing the distribution of residual energies and effectively merging blocks that have unreliable motion vectors. The motion vector reliability information is also used as prior knowledge in motion vector refinement using a constrained vector median filter, to avoid selecting identically unreliable vectors. We also propose using chrominance information in our method. Experimental results show that the proposed scheme has better visual quality and is also robust, even in video sequences with complex scenes and fast motion.
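As a hedged illustration of the constrained vector median idea, the sketch below picks, among candidate motion vectors, the one minimising a reliability-weighted sum of distances to neighbouring block vectors; the paper's residual-energy-based weighting and candidate construction differ in detail, and the example vectors are invented.

```python
import numpy as np

def constrained_vector_median(candidates, neighbors, reliabilities):
    """Pick the candidate motion vector minimising the reliability-weighted
    sum of Euclidean distances to neighbouring block vectors."""
    candidates = np.asarray(candidates, float)   # (k, 2)
    neighbors = np.asarray(neighbors, float)     # (m, 2)
    w = np.asarray(reliabilities, float)         # (m,) neighbor reliabilities
    costs = [np.sum(w * np.linalg.norm(c - neighbors, axis=1)) for c in candidates]
    return candidates[int(np.argmin(costs))]

# Hypothetical block neighbourhood: one outlier neighbour is down-weighted,
# so the unreliable candidate (9, 9) is not selected
mv = constrained_vector_median(
    candidates=[(1, 0), (2, 1), (9, 9)],
    neighbors=[(1, 0), (1, 1), (2, 0), (9, 9)],
    reliabilities=[1.0, 1.0, 1.0, 0.1],
)
print(mv)
```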
Autonomous learning based on cost assumptions: theoretical studies and experiments in robot control.
Ribeiro, C H; Hemerly, E M
2000-02-01
Autonomous learning techniques are based on experience acquisition. In most realistic applications, experience is time-consuming: it implies sensor reading, actuator control and algorithmic update, constrained by the learning system dynamics. The crudeness of the information upon which classical learning algorithms operate makes such problems too difficult and unrealistic. Nonetheless, additional information for facilitating the learning process ideally should be embedded in such a way that the structural, well-studied characteristics of these fundamental algorithms are maintained. We investigate in this article a more general formulation of the Q-learning method that allows for a spreading of information derived from single updates towards a neighbourhood of the instantly visited state and converges to optimality. We show how this new formulation can be used as a mechanism to safely embed prior knowledge about the structure of the state space, and demonstrate it in a modified implementation of a reinforcement learning algorithm in a real robot navigation task.
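A toy sketch of the spreading mechanism (not the paper's exact formulation or its convergence conditions): the temporal-difference update of ordinary Q-learning is applied, with kernel weights, to a neighbourhood of the visited state. The kernel encodes prior knowledge that nearby states behave similarly.

```python
import numpy as np

def spreading_q_update(Q, s, a, r, s_next, kernel, alpha=0.1, gamma_d=0.95):
    """One Q-learning step whose TD update is spread to a neighbourhood of the
    visited state via a similarity kernel.

    Q      : (n_states, n_actions) array
    kernel : (n_states,) weights, kernel[s] = 1, decaying with distance to s;
             a delta at s recovers standard Q-learning
    """
    td = r + gamma_d * Q[s_next].max() - Q[s, a]
    Q[:, a] += alpha * kernel * td
    return Q

# Toy 1-D corridor: Gaussian spreading around the visited state
n_states, n_actions = 20, 2
Q = np.zeros((n_states, n_actions))
s = 7
kernel = np.exp(-0.5 * ((np.arange(n_states) - s) / 1.5) ** 2)
Q = spreading_q_update(Q, s=s, a=1, r=1.0, s_next=8, kernel=kernel)
print(np.round(Q[:, 1], 3))   # the update leaks into neighbouring states
```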
Matranga, Domenica; Firenze, Alberto; Vullo, Angela
2013-10-01
The aim of this study was to show the potential of Bayesian analysis in the statistical modelling of dental caries data. Because of the bounded nature of the dmft (DMFT) index, zero-inflated binomial (ZIB) and beta-binomial (ZIBB) models were considered. The effects of incorporating available prior information about the model parameters were also shown. The data set used in this study was the Belo Horizonte Caries Prevention (BELCAP) study (Böhning et al., 1999), consisting of five variables collected among 797 Brazilian school children, designed to evaluate four programmes for reducing caries. Only the eight primary molar teeth were considered in the data set. A data augmentation algorithm was used for estimation. Firstly, noninformative priors were used to express our lack of knowledge about the regression parameters. Secondly, prior information about the probability of being a structural zero dmft and the probability of being caries affected in the subpopulation of susceptible children was incorporated. With noninformative priors, the best-fitting model was the ZIBB. Education (OR = 0.76, 95% CrI: 0.59, 0.99), all interventions (OR = 0.46, 95% CrI: 0.35, 0.62), rinsing (OR = 0.61, 95% CrI: 0.47, 0.80) and hygiene (OR = 0.65, 95% CrI: 0.49, 0.86) were demonstrated to be factors protecting children from being caries affected. Being male increased the probability of being caries diseased (OR = 1.19, 95% CrI: 1.01, 1.42). However, after incorporating informative priors, ZIB model estimates were not influenced, while ZIBB models reduced deviance and confirmed the association with all interventions and rinsing only. In our application, Bayesian estimates showed similar accuracy and precision to likelihood-based estimates, although they offered many computational advantages and the possibility of expressing all forms of uncertainty in terms of probability. The overdispersion parameter could explain why the introduction of prior information had significant effects on the parameters of the ZIBB model, while ZIB estimates remained unchanged. Finally, the ZIBB model outperformed the ZIB model in capturing the overdispersion in the data. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
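For orientation, the zero-inflated binomial likelihood underlying these models can be written down in a few lines; the sketch below fits it by maximum likelihood on hypothetical dmft counts over the n = 8 primary molars (covariates, the beta-binomial extension and the priors are omitted).

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def zib_negloglik(params, y, n=8):
    """Zero-inflated binomial: with probability pi a structural zero
    (caries-free child), otherwise Binomial(n, p) over the n teeth."""
    pi = 1 / (1 + np.exp(-params[0]))   # inflation probability (logit scale)
    p = 1 / (1 + np.exp(-params[1]))    # per-tooth caries probability
    logpmf = stats.binom.logpmf(y, n, p)
    ll = np.where(
        y == 0,
        np.log(pi + (1 - pi) * np.exp(logpmf)),   # zero from either component
        np.log(1 - pi) + logpmf,
    )
    return -ll.sum()

# Hypothetical dmft counts for illustration
y = np.array([0, 0, 0, 1, 2, 0, 4, 0, 3, 0, 1, 0])
fit = minimize(zib_negloglik, x0=[0.0, 0.0], args=(y,))
print("pi, p =", 1 / (1 + np.exp(-fit.x)))
```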
Bayesian estimation of the transmissivity spatial structure from pumping test data
NASA Astrophysics Data System (ADS)
Demir, Mehmet Taner; Copty, Nadim K.; Trinchero, Paolo; Sanchez-Vila, Xavier
2017-06-01
Estimating the statistical parameters (mean, variance, and integral scale) that define the spatial structure of the transmissivity or hydraulic conductivity fields is a fundamental step for the accurate prediction of subsurface flow and contaminant transport. In practice, the determination of the spatial structure is a challenge because of spatial heterogeneity and data scarcity. In this paper, we describe a novel approach that uses time-drawdown data from multiple pumping tests to determine the transmissivity statistical spatial structure. The method builds on the pumping test interpretation procedure of Copty et al. (2011) (Continuous Derivation method, CD), which uses the time-drawdown data and its time derivative to estimate apparent transmissivity values as a function of radial distance from the pumping well. A Bayesian approach is then used to infer the statistical parameters of the transmissivity field by combining prior information about the parameters and the likelihood function expressed in terms of radially-dependent apparent transmissivities determined from pumping tests. A major advantage of the proposed Bayesian approach is that the likelihood function is readily determined from randomly generated multiple realizations of the transmissivity field, without the need to solve the groundwater flow equation. Applying the method to synthetically-generated pumping test data, we demonstrate that, through a relatively simple procedure, information on the spatial structure of the transmissivity may be inferred from pumping test data. It is also shown that the prior parameter distribution has a significant influence on the estimation procedure, given the non-uniqueness of the estimation procedure. Results also indicate that the reliability of the estimated transmissivity statistical parameters increases with the number of available pumping tests.
34 CFR 99.30 - Under what conditions is prior consent required to disclose information?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 1 2010-07-01 2010-07-01 false Under what conditions is prior consent required to disclose information? 99.30 Section 99.30 Education Office of the Secretary, Department of Education FAMILY... Information From Education Records? § 99.30 Under what conditions is prior consent required to disclose...
ERIC Educational Resources Information Center
Woloshyn, Vera E.; And Others
1994-01-01
Thirty-two factual statements, half consistent and half not consistent with subjects' prior knowledge, were processed by 140 sixth and seventh graders. Half were directed to use elaborative interrogation (using prior knowledge) to answer why each statement was true. Across all memory measures, elaborative interrogation subjects performed better…
When generating answers benefits arithmetic skill: the importance of prior knowledge.
Rittle-Johnson, Bethany; Kmicikewycz, Alexander Oleksij
2008-09-01
People remember information better if they generate the information while studying rather than read the information. However, prior research has not investigated whether this generation effect extends to related but unstudied items and has not been conducted in classroom settings. We compared third graders' success on studied and unstudied multiplication problems after they spent a class period generating answers to problems or reading the answers from a calculator. The effect of condition interacted with prior knowledge. Students with low prior knowledge had higher accuracy in the generate condition, but as prior knowledge increased, the advantage of generating answers decreased. The benefits of generating answers may extend to unstudied items and to classroom settings, but only for learners with low prior knowledge.
Tree Biomass Estimation of Chinese fir (Cunninghamia lanceolata) Based on Bayesian Method
Zhang, Jianguo
2013-01-01
Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) is the most important conifer species for timber production, with a huge distribution area in southern China. Accurate estimation of biomass is required for accounting and monitoring Chinese forest carbon stocking. In this study, an allometric equation was used to analyze tree biomass of Chinese fir. The common methods for estimating allometric models have taken the classical approach based on the frequency interpretation of probability. However, many different biotic and abiotic factors introduce variability into the Chinese fir biomass model, suggesting that the parameters of the biomass model are better represented by probability distributions than by the fixed values of the classical method. To deal with this, a Bayesian method was used for estimating the Chinese fir biomass model. In the Bayesian framework, two kinds of priors were introduced: non-informative priors and informative priors. For the informative priors, 32 biomass equations of Chinese fir were collected from the published literature. The parameter distributions from the published literature were regarded as prior distributions in the Bayesian model for estimating Chinese fir biomass. The Bayesian method with informative priors performed better than both the non-informative priors and the classical method, providing a reasonable approach for estimating Chinese fir biomass. PMID:24278198
Tree biomass estimation of Chinese fir (Cunninghamia lanceolata) based on Bayesian method.
Zhang, Xiongqing; Duan, Aiguo; Zhang, Jianguo
2013-01-01
Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) is the most important conifer species for timber production, with a huge distribution area in southern China. Accurate estimation of biomass is required for accounting and monitoring Chinese forest carbon stocking. In this study, the allometric equation W = a(D^2H)^b was used to analyze tree biomass of Chinese fir. The common methods for estimating allometric models have taken the classical approach based on the frequency interpretation of probability. However, many different biotic and abiotic factors introduce variability into the Chinese fir biomass model, suggesting that the parameters of the biomass model are better represented by probability distributions than by the fixed values of the classical method. To deal with this, a Bayesian method was used for estimating the Chinese fir biomass model. In the Bayesian framework, two kinds of priors were introduced: non-informative priors and informative priors. For the informative priors, 32 biomass equations of Chinese fir were collected from the published literature. The parameter distributions from the published literature were regarded as prior distributions in the Bayesian model for estimating Chinese fir biomass. The Bayesian method with informative priors performed better than both the non-informative priors and the classical method, providing a reasonable approach for estimating Chinese fir biomass.
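A compact sketch of fitting W = a(D^2H)^b in a Bayesian framework: random-walk Metropolis on (log a, b) with informative normal priors. The tree data and the prior means/SDs below are invented placeholders, not the paper's pooled literature values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical tree data: diameter D (cm), height H (m), biomass W (kg)
D = rng.uniform(8, 30, 40)
H = 1.3 + 0.8 * D ** 0.7 + rng.normal(0, 0.5, 40)
W = np.exp(-2.5 + 0.85 * np.log(D**2 * H) + rng.normal(0, 0.15, 40))

x, y = np.log(D**2 * H), np.log(W)          # log W = log a + b log(D^2 H)

def log_post(theta, sigma=0.15):
    log_a, b = theta
    # informative priors, standing in for values pooled from published equations
    lp = -0.5 * ((log_a + 2.4) / 0.5) ** 2 - 0.5 * ((b - 0.9) / 0.1) ** 2
    return lp - 0.5 * np.sum((y - (log_a + b * x)) ** 2) / sigma**2

theta = np.array([-2.0, 0.8])
samples = []
for _ in range(20000):                       # random-walk Metropolis
    prop = theta + rng.normal(0, [0.05, 0.01])
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
samples = np.array(samples[5000:])           # discard burn-in
print("posterior means (log a, b):", samples.mean(axis=0))
```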
Sleep Spindle Density Predicts the Effect of Prior Knowledge on Memory Consolidation
Lambon Ralph, Matthew A.; Kempkes, Marleen; Cousins, James N.; Lewis, Penelope A.
2016-01-01
Information that relates to a prior knowledge schema is remembered better and consolidates more rapidly than information that does not. Another factor that influences memory consolidation is sleep, and growing evidence suggests that sleep-related processing is important for integration with existing knowledge. Here, we examine how sleep-related mechanisms interact with the schema-dependent memory advantage. Participants first established a schema over 2 weeks. Next, they encoded new facts, which were either related to the schema or completely unrelated. After a 24 h retention interval, including a night of sleep, which we monitored with polysomnography, participants encoded a second set of facts. Finally, memory for all facts was tested in a functional magnetic resonance imaging scanner. Behaviorally, sleep spindle density predicted an increase in the schema benefit to memory across the retention interval. Higher spindle densities were associated with reduced decay of schema-related memories. Functionally, spindle density predicted increased disengagement of the hippocampus across 24 h for schema-related memories only. Together, these results suggest that sleep spindle activity is associated with the effect of prior knowledge on memory consolidation. SIGNIFICANCE STATEMENT Episodic memories are gradually assimilated into long-term memory, and this process is strongly influenced by sleep. The consolidation of new information is also influenced by its relationship to existing knowledge structures, or schemas, but the role of sleep in such schema-related consolidation is unknown. We show that sleep spindle density predicts the extent to which schemas influence the consolidation of related facts. This is the first evidence that sleep is associated with the interaction between prior knowledge and long-term memory formation. PMID:27030764
Carré, Clément; Mas, André; Krouk, Gabriel
2017-01-01
Inferring transcriptional gene regulatory networks from transcriptomic datasets is a key challenge of systems biology, with potential impacts ranging from medicine to agronomy. Several techniques are presently used to experimentally assay transcription factor-to-target relationships, defining important information about real gene regulatory network connections. These techniques include classical ChIP-seq, yeast one-hybrid, or, more recently, DAP-seq or TARGET technologies. Such techniques are usually used to validate algorithm predictions. Here, we developed a reverse engineering approach based on mathematical and computer simulation to evaluate the impact that this prior knowledge on gene regulatory networks may have on training machine learning algorithms. First, we developed a gene regulatory network-simulating engine called FRANK (Fast Randomizing Algorithm for Network Knowledge) that is able to simulate large gene regulatory networks (containing 10^4 genes) with characteristics of gene regulatory networks observed in vivo. FRANK also generates stable or oscillatory gene expression directly produced by the simulated gene regulatory networks. The development of FRANK leads to important general conclusions concerning the design of large and stable gene regulatory networks harboring scale-free properties (built ex nihilo). In combination with a supervised (prior-knowledge-accepting) support vector machine algorithm, we (i) address biologically oriented questions concerning our capacity to accurately reconstruct gene regulatory networks, and in particular demonstrate that prior-knowledge structure is crucial for accurate learning, and (ii) draw conclusions to inform experimental design for learning able to solve gene regulatory networks in the future. By demonstrating that our predictions concerning the influence of the prior-knowledge structure on support vector machine learning capacity hold true on real data (Escherichia coli K14 network reconstruction using network and transcriptomic data), we show that the formalism used to build FRANK can to some extent be a reasonable model for gene regulatory networks in real cells.
NASA Astrophysics Data System (ADS)
Ghosh, S.; Lopez-Coto, I.; Prasad, K.; Karion, A.; Mueller, K.; Gourdji, S.; Martin, C.; Whetstone, J. R.
2017-12-01
The National Institute of Standards and Technology (NIST) supports the North-East Corridor Baltimore Washington (NEC-B/W) project and the Indianapolis Flux Experiment (INFLUX), aiming to quantify sources of greenhouse gas (GHG) emissions as well as their uncertainties. These projects employ different flux estimation methods, including top-down inversion approaches. The traditional Bayesian inversion method estimates emission distributions by updating prior information using atmospheric GHG observations coupled to an atmospheric transport and dispersion model. The magnitude of the update depends upon the observed enhancement along with the assumed errors, such as those associated with the prior information and the atmospheric transport and dispersion model. These errors are specified within the inversion covariance matrices. The assumed structure and magnitude of the specified errors can have a large impact on the emission estimates from the inversion. The main objective of this work is to build a data-adaptive model for these covariance matrices. We construct a synthetic data experiment using a Kalman Filter inversion framework (Lopez et al., 2017) employing different configurations of the transport and dispersion model and an assumed prior. Unlike previous traditional Bayesian approaches, we estimate posterior emissions using regularized sample covariance matrices associated with prior errors to investigate whether the structure of the matrices helps to better recover our hypothetical true emissions. To incorporate transport model error, we use an ensemble of transport models combined with a space-time analytical covariance to construct a covariance that accounts for errors in space and time. A Kalman Filter is then run using these covariances along with Maximum Likelihood Estimates (MLE) of the involved parameters. Preliminary results indicate that specifying spatio-temporally varying errors in the error covariances can improve the flux estimates and uncertainties. We also demonstrate that differences between the modeled and observed meteorology can be used to predict uncertainties associated with atmospheric transport and dispersion modeling, which can help improve the skill of an inversion at urban scales.
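As a hedged sketch of where these covariance matrices enter, a single Kalman-type analysis step might look as follows; the authors' regularized sample covariances and the MLE of the parameters are not reproduced here.

```python
import numpy as np

def flux_update(x_prior, B, H, y, R):
    """One Kalman-type analysis step of a flux inversion.

    x_prior : prior flux vector
    B       : prior-error covariance (the matrix whose structure is at issue)
    H       : Jacobian of the transport/dispersion model (n_obs x n_fluxes)
    y, R    : observed enhancements and model-data mismatch covariance
    """
    S = H @ B @ H.T + R                          # innovation covariance
    K = np.linalg.solve(S, H @ B).T              # Kalman gain  B H^T S^-1
    x_post = x_prior + K @ (y - H @ x_prior)
    B_post = (np.eye(len(x_prior)) - K @ H) @ B  # posterior covariance
    return x_post, B_post
```

The update shows directly why the structure of B and R matters: they set both the weight given to the observed enhancements and the spatial pattern into which the correction is spread.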
Commowick, Olivier; Warfield, Simon K
2010-01-01
In order to evaluate the quality of segmentations of an image and assess intra- and inter-expert variability in segmentation performance, an Expectation Maximization (EM) algorithm for Simultaneous Truth And Performance Level Estimation (STAPLE) was recently developed. This algorithm, originally presented for segmentation validation, has since been used for many applications, such as atlas construction and decision fusion. However, the manual delineation of structures of interest is a very time-consuming and burdensome task. Further, as the time required and burden of manual delineation increase, the accuracy of the delineation is decreased. Therefore, it may be desirable to ask the experts to delineate only a reduced number of structures, or the segmentation of all structures by all experts may simply not be achieved. Fusion from data with some structures not segmented by each expert should be carried out in a manner that accounts for the missing information. In other applications, locally inconsistent segmentations may drive the STAPLE algorithm into an undesirable local optimum, leading to misclassifications or misleading expert performance parameters. We present a new algorithm that allows fusion with partial delineation and which can avoid convergence to undesirable local optima in the presence of strongly inconsistent segmentations. The algorithm extends STAPLE by incorporating prior probabilities for the expert performance parameters. This is achieved through a Maximum A Posteriori formulation, where the prior probabilities for the performance parameters are modeled by a beta distribution. We demonstrate that this new algorithm enables dramatically improved fusion from data with partial delineation by each expert in comparison to fusion with STAPLE. PMID:20879379
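A simplified reading of the MAP extension for binary labels: the beta prior on each rater's sensitivity and specificity enters the M-step as pseudo-counts, and voxels a rater did not delineate are masked out of that rater's sums. The sketch below makes these assumptions explicit and is illustrative, not the authors' implementation.

```python
import numpy as np

def map_staple(D, alpha=5.0, beta=5.0, n_iter=50):
    """Binary STAPLE with Beta(alpha, beta) MAP priors on each rater's
    sensitivity p and specificity q; missing ratings allowed as np.nan.

    D : (n_voxels, n_raters) array of 0/1 labels, np.nan where not delineated.
    """
    mask = ~np.isnan(D)
    p = np.full(D.shape[1], 0.9)       # sensitivities
    q = np.full(D.shape[1], 0.9)       # specificities
    prior = 0.5                        # prior probability a voxel is foreground
    for _ in range(n_iter):
        # E-step: posterior that each voxel's true label is 1
        a = np.full(D.shape[0], np.log(prior))
        b = np.full(D.shape[0], np.log(1 - prior))
        for j in range(D.shape[1]):
            m, d = mask[:, j], D[:, j]
            a[m] += np.where(d[m] == 1, np.log(p[j]), np.log(1 - p[j]))
            b[m] += np.where(d[m] == 1, np.log(1 - q[j]), np.log(q[j]))
        W = 1 / (1 + np.exp(b - a))
        # M-step: MAP update = beta-prior pseudo-counts added to the sums
        for j in range(D.shape[1]):
            m, d = mask[:, j], D[:, j]
            p[j] = (np.sum(W[m] * (d[m] == 1)) + alpha - 1) / \
                   (W[m].sum() + alpha + beta - 2)
            q[j] = (np.sum((1 - W[m]) * (d[m] == 0)) + alpha - 1) / \
                   ((1 - W[m]).sum() + alpha + beta - 2)
    return W, p, q
```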
A comment on priors for Bayesian occupancy models.
Northrup, Joseph M; Gerber, Brian D
2018-01-01
Understanding patterns of species occurrence and the processes underlying these patterns is fundamental to the study of ecology. One of the more commonly used approaches to investigate species occurrence patterns is occupancy modeling, which can account for imperfect detection of a species during surveys. In recent years, there has been a proliferation of Bayesian modeling in ecology, which includes fitting Bayesian occupancy models. The Bayesian framework is appealing to ecologists for many reasons, including the ability to incorporate prior information through the specification of prior distributions on parameters. While ecologists almost exclusively intend to choose priors so that they are "uninformative" or "vague", such priors can easily be unintentionally highly informative. Here we report on how the specification of a "vague" normally distributed (i.e., Gaussian) prior on coefficients in Bayesian occupancy models can unintentionally influence parameter estimation. Using both simulated data and empirical examples, we illustrate how this issue likely compromises inference about species-habitat relationships. While the extent to which these informative priors influence inference depends on the data set, researchers fitting Bayesian occupancy models should conduct sensitivity analyses to ensure intended inference, or employ less commonly used priors that are less informative (e.g., logistic or t prior distributions). We provide suggestions for addressing this issue in occupancy studies, and an online tool for exploring this issue under different contexts.
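The paper's central caution is easy to reproduce numerically: a normal prior that looks "vague" on the logit scale concentrates prior mass at occupancy probabilities near 0 and 1. A minimal demonstration:

```python
import numpy as np

rng = np.random.default_rng(3)

# A "vague" Normal(0, sd) prior on the occupancy intercept (logit scale) ...
for sd in (1.0, 2.5, 10.0):
    beta0 = rng.normal(0.0, sd, 100000)
    psi = 1 / (1 + np.exp(-beta0))      # ... implies this prior on occupancy
    frac_extreme = np.mean((psi < 0.05) | (psi > 0.95))
    print(f"sd={sd:5.1f}: P(psi < 0.05 or psi > 0.95) = {frac_extreme:.2f}")
# Large sd piles prior mass near 0 and 1 -- far from "uninformative" for psi.
```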
Accommodating Uncertainty in Prior Distributions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Picard, Richard Roy; Vander Wiel, Scott Alan
2017-01-19
A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, W; Yin, F; Zhang, Y
Purpose: To investigate the feasibility of using structure-based principal component analysis (PCA) motion-modeling and weighted free-form deformation to estimate on-board 4D-CBCT using prior information and extremely limited angle projections for potential 4D target verification of lung radiotherapy. Methods: A technique for lung 4D-CBCT reconstruction has been previously developed using a deformation field map (DFM)-based strategy. In the previous method, each phase of the 4D-CBCT was generated by deforming a prior CT volume. The DFM was solved by a motion-model extracted by global PCA and a free-form deformation (GMM-FD) technique, using a data fidelity constraint and deformation energy minimization. In this study, a new structural-PCA method was developed to build a structural motion-model (SMM) by accounting for potential relative motion pattern changes between different anatomical structures from simulation to treatment. The motion model extracted from the planning 4DCT was divided into two structures: tumor and body excluding tumor, and the parameters of both structures were optimized together. Weighted free-form deformation (WFD) was employed afterwards to introduce flexibility in adjusting the weightings of different structures in the data fidelity constraint based on clinical interests. An XCAT (computerized patient model) simulation with a 30 mm diameter lesion was generated, with various anatomical and respirational changes from the planning 4D-CT to the on-board volume. The estimation accuracy was evaluated by the Volume-Percent-Difference (VPD)/Center-of-Mass-Shift (COMS) between lesions in the estimated and “ground-truth” on-board 4D-CBCT. Results: Among 6 different XCAT scenarios corresponding to respirational and anatomical changes from planning CT to on-board imaging, using single 30° on-board projections, the VPD/COMS for SMM-WFD was reduced to 10.64±3.04%/1.20±0.45mm from 21.72±9.24%/1.80±0.53mm for GMM-FD. Using 15° orthogonal projections, the VPD/COMS was further reduced to 1.91±0.86%/0.31±0.42mm based on SMM-WFD. Conclusion: Compared to the GMM-FD technique, the SMM-WFD technique can substantially improve the 4D-CBCT estimation accuracy using extremely small scan angles to provide ultra-fast 4D verification. This work was supported by the National Institutes of Health under Grant No. R01-CA184173 and a research grant from Varian Medical Systems.
Human action recognition with group lasso regularized-support vector machine
NASA Astrophysics Data System (ADS)
Luo, Huiwu; Lu, Huanzhang; Wu, Yabei; Zhao, Fei
2016-05-01
The bag-of-visual-words (BOVW) and Fisher kernel are two popular models in human action recognition, and the support vector machine (SVM) is the most commonly used classifier for the two models. We identify two kinds of group structure in the feature representations constructed by BOVW and the Fisher kernel, respectively. Since the structural information of a feature representation can be seen as a prior for the classifier and can improve its performance, as has been verified in several areas, this structure should be exploited. However, the standard SVM employs L2-norm regularization in its learning procedure, which penalizes each variable individually and cannot express the structural information of the feature representation. We replace the L2-norm regularization with group lasso regularization in the standard SVM, yielding the proposed group lasso regularized-support vector machine (GLRSVM). We then embed the group structural information of the feature representation into the GLRSVM. Finally, we introduce an algorithm that solves the optimization problem of the GLRSVM by the alternating direction method of multipliers. Experiments evaluated on the KTH, YouTube, and Hollywood2 datasets show that our method achieves promising results and improves on state-of-the-art methods on the KTH and YouTube datasets.
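The paper solves the GLRSVM with ADMM; as a hedged stand-in, the same kind of objective (here a squared hinge loss plus a group lasso penalty, bias omitted) can be minimized with proximal gradient steps, where the prox is group soft-thresholding. The toy data below are invented.

```python
import numpy as np

def group_lasso_svm(X, y, groups, lam=0.1, step=0.01, n_iter=500):
    """Proximal-gradient sketch of an SVM with a group lasso penalty:
    (1/n) sum max(0, 1 - y_i x_i.w)^2 + lam * sum_g ||w_g||_2.

    groups : list of index arrays, one per feature group
    y      : labels in {-1, +1}
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margin = 1 - y * (X @ w)
        active = margin > 0
        grad = -(2 / n) * X[active].T @ (y[active] * margin[active])
        w -= step * grad
        for g in groups:                     # group soft-thresholding (prox)
            norm_g = np.linalg.norm(w[g])
            w[g] = 0.0 if norm_g == 0 else w[g] * max(0.0, 1 - step * lam / norm_g)
    return w

# Toy use: 2 groups of 5 features; labels depend on the first group only
rng = np.random.default_rng(4)
X = rng.normal(size=(200, 10))
y = np.sign(X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=200))
w = group_lasso_svm(X, y, groups=[np.arange(5), np.arange(5, 10)])
print(np.round(w, 2))    # the uninformative group is typically driven to zero
```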
Spencer, Amy V; Cox, Angela; Lin, Wei-Yu; Easton, Douglas F; Michailidou, Kyriaki; Walters, Kevin
2016-04-01
There is a large amount of functional genetic data available, which can be used to inform fine-mapping association studies (in diseases with well-characterised disease pathways). Single nucleotide polymorphism (SNP) prioritization via Bayes factors is attractive because prior information can inform the effect size or the prior probability of causal association. This approach requires the specification of the effect size. If the information needed to estimate, a priori, the probability density for the effect sizes of causal SNPs in a genomic region is not consistent or is not available, then specifying a prior variance for the effect sizes is challenging. We propose both an empirical method to estimate this prior variance, and a coherent approach to using SNP-level functional data to inform the prior probability of causal association. Through simulation we show that when ranking SNPs by our empirical Bayes factor in a fine-mapping study, the causal SNP rank is generally as high as or higher than the rank obtained using Bayes factors with other plausible values of the prior variance. Importantly, we also show that assigning SNP-specific prior probabilities of association based on expert prior functional knowledge of the disease mechanism can lead to improved causal SNP ranks compared to ranking with identical prior probabilities of association. We demonstrate the use of our methods by applying them to the fine mapping of the CASP8 region of chromosome 2 using genotype data from the Collaborative Oncological Gene-Environment Study (COGS) Consortium. The data we analysed included approximately 46,000 breast cancer case and 43,000 healthy control samples. © 2016 The Authors. Genetic Epidemiology published by Wiley Periodicals, Inc.
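A sketch of Bayes-factor ranking with functional priors, using the standard Wakefield-style asymptotic Bayes factor rather than the authors' empirical variant, and an invented fixed prior variance; the summary statistics and prior probabilities below are placeholders.

```python
import numpy as np

def log_abf(beta_hat, se, prior_var):
    """Wakefield-style asymptotic log Bayes factor for H1 (association, with a
    N(0, prior_var) prior on the effect) against H0, from summary statistics."""
    v = se**2
    z2 = (beta_hat / se) ** 2
    return 0.5 * np.log(v / (v + prior_var)) + 0.5 * z2 * prior_var / (v + prior_var)

# Rank SNPs by combining the BF with SNP-specific functional prior odds
beta_hat = np.array([0.12, 0.05, 0.20])
se = np.array([0.03, 0.03, 0.08])
prior_prob = np.array([0.01, 0.001, 0.001])   # e.g. higher for a coding SNP
log_posterior_odds = (log_abf(beta_hat, se, prior_var=0.04)
                      + np.log(prior_prob / (1 - prior_prob)))
print(np.argsort(-log_posterior_odds))        # SNP ranking, best first
```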
NASA Technical Reports Server (NTRS)
Backus, George E.
1999-01-01
The purpose of the grant was to study how prior information about the geomagnetic field can be used to interpret surface and satellite magnetic measurements, to generate quantitative descriptions of prior information that might be so used, and to use this prior information to obtain from satellite data a model of the core field with statistically justifiable error estimates. The need for prior information in geophysical inversion has long been recognized. Data sets are finite, and faithful descriptions of aspects of the earth almost always require infinite-dimensional model spaces. By themselves, the data can confine the correct earth model only to an infinite-dimensional subset of the model space. Earth properties other than direct functions of the observed data cannot be estimated from those data without prior information about the earth. Prior information is based on what the observer already knows before the data become available. Such information can be "hard" or "soft". Hard information is a belief that the real earth must lie in some known region of model space. For example, the total ohmic dissipation in the core is probably less than the total observed geothermal heat flow out of the earth's surface. (In principle, ohmic heat in the core can be recaptured to help drive the dynamo, but this effect is probably small.) "Soft" information is a probability distribution on the model space, a distribution that the observer accepts as a quantitative description of her/his beliefs about the earth. The probability distribution can be a subjective prior in the sense of Bayes or the objective result of a statistical study of previous data or relevant theories.
Ander, Bradley P.; Zhang, Xiaoshuai; Xue, Fuzhong; Sharp, Frank R.; Yang, Xiaowei
2013-01-01
The discovery of genetic or genomic markers plays a central role in the development of personalized medicine. A notable challenge exists when dealing with the high dimensionality of the data sets, as thousands of genes or millions of genetic variants are collected on a relatively small number of subjects. Traditional gene-wise selection methods using univariate analyses have difficulty incorporating correlational, structural, or functional structures amongst the molecular measures. For microarray gene expression data, we first summarize solutions to ‘large p, small n’ problems, and then propose an integrative Bayesian variable selection (iBVS) framework for simultaneously identifying causal or marker genes and regulatory pathways. A novel partial least squares (PLS) g-prior for iBVS is developed to allow the incorporation of prior knowledge on gene-gene interactions or functional relationships. From the point of view of systems biology, iBVS enables users to directly target the joint effects of multiple genes and pathways in a hierarchical modeling diagram to predict disease status or phenotype. The estimated posterior selection probabilities offer probabilistic and biological interpretations. Both simulated data and a set of microarray data in predicting stroke status are used in validating the performance of iBVS in a Probit model with binary outcomes. iBVS offers a general framework for effective discovery of various molecular biomarkers by combining data-based statistics and knowledge-based priors. Guidelines on making posterior inferences, determining Bayesian significance levels, and improving computational efficiencies are also discussed. PMID:23844055
Peng, Bin; Zhu, Dianwen; Ander, Bradley P; Zhang, Xiaoshuai; Xue, Fuzhong; Sharp, Frank R; Yang, Xiaowei
2013-01-01
The discovery of genetic or genomic markers plays a central role in the development of personalized medicine. A notable challenge exists when dealing with the high dimensionality of the data sets, as thousands of genes or millions of genetic variants are collected on a relatively small number of subjects. Traditional gene-wise selection methods using univariate analyses have difficulty incorporating correlational, structural, or functional structures amongst the molecular measures. For microarray gene expression data, we first summarize solutions to 'large p, small n' problems, and then propose an integrative Bayesian variable selection (iBVS) framework for simultaneously identifying causal or marker genes and regulatory pathways. A novel partial least squares (PLS) g-prior for iBVS is developed to allow the incorporation of prior knowledge on gene-gene interactions or functional relationships. From the point of view of systems biology, iBVS enables users to directly target the joint effects of multiple genes and pathways in a hierarchical modeling diagram to predict disease status or phenotype. The estimated posterior selection probabilities offer probabilistic and biological interpretations. Both simulated data and a set of microarray data in predicting stroke status are used in validating the performance of iBVS in a Probit model with binary outcomes. iBVS offers a general framework for effective discovery of various molecular biomarkers by combining data-based statistics and knowledge-based priors. Guidelines on making posterior inferences, determining Bayesian significance levels, and improving computational efficiencies are also discussed.
Metadynamic metainference: Enhanced sampling of the metainference ensemble using metadynamics
Bonomi, Massimiliano; Camilloni, Carlo; Vendruscolo, Michele
2016-01-01
Accurate and precise structural ensembles of proteins and macromolecular complexes can be obtained with metainference, a recently proposed Bayesian inference method that integrates experimental information with prior knowledge and deals with all sources of error in the data as well as with sample heterogeneity. The study of complex macromolecular systems, however, requires extensive conformational sampling, which represents a separate challenge. To address this challenge and to exhaustively and efficiently generate structural ensembles, we combine metainference with metadynamics and illustrate its application to the calculation of the free energy landscape of the alanine dipeptide. PMID:27561930
BaTMAn: Bayesian Technique for Multi-image Analysis
NASA Astrophysics Data System (ADS)
Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.
2016-12-01
Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.
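The core merge decision can be caricatured in two functions, assuming a simple k-sigma consistency test and inverse-variance combination; BaTMAn's actual criterion and iteration order are more elaborate.

```python
import numpy as np

def consistent(s1, e1, s2, e2, k=1.0):
    """Are two measurements statistically consistent with the same signal?"""
    return abs(s1 - s2) <= k * np.hypot(e1, e2)

def merge(s1, e1, s2, e2):
    """Inverse-variance weighted combination of two consistent elements."""
    w1, w2 = 1 / e1**2, 1 / e2**2
    return (w1 * s1 + w2 * s2) / (w1 + w2), (w1 + w2) ** -0.5

# Two adjacent spaxels with identical signal within the errors get merged
if consistent(1.02, 0.05, 0.98, 0.05):
    print(merge(1.02, 0.05, 0.98, 0.05))   # combined signal, smaller error
```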
Heggland, Liv-Helen; Hausken, Kjell
2013-05-01
The aim of this article is to identify how health care professionals and patients experience patient participation in decision-making processes in hospitals. The data consist of eighteen semi-structured interviews with professionals from different disciplines, such as medicine and nursing, in surgical departments, as well as with patients who had undergone surgical treatment. Through content analysis, four categories of patient participation were identified: information dissemination, formulation of options, integration of information, and control. To meet the increasing demands for patient participation, this categorization, with four identified critical areas for participation in decision-making, has important implications in guiding information support for patients prior to surgery and during hospitalization.
On a full Bayesian inference for force reconstruction problems
NASA Astrophysics Data System (ADS)
Aucejo, M.; De Smet, O.
2018-05-01
In a previous paper, the authors introduced a flexible methodology for reconstructing mechanical sources in the frequency domain from prior local information on both their nature and location over a linear and time-invariant structure. The proposed approach was derived from Bayesian statistics, because of its ability to mathematically account for the experimenter's prior knowledge. However, since only the Maximum a Posteriori estimate was computed, the posterior uncertainty about the regularized solution given the measured vibration field, the mechanical model and the regularization parameter was not assessed. To answer this legitimate question, this paper fully exploits the Bayesian framework to provide, from a Markov Chain Monte Carlo algorithm, credible intervals and other statistical measures (mean, median, mode) for all the parameters of the force reconstruction problem.
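A toy version of the posterior exploration: random-walk Metropolis over force amplitudes in a linear model, with credible intervals read off the chain percentiles. The transfer matrix, noise level and prior below are invented stand-ins for the paper's mechanical model and local priors.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical linear(ised) problem: measured vibration y = G @ f + noise
G = rng.normal(size=(50, 3))                  # transfer matrix (assumed known)
f_true = np.array([2.0, 0.0, -1.0])
y = G @ f_true + rng.normal(0, 0.1, 50)

def log_post(f, sigma=0.1, tau=5.0):
    # Gaussian likelihood + Gaussian prior standing in for the paper's
    # local, nature-dependent priors on the excitation field
    return (-0.5 * np.sum((y - G @ f) ** 2) / sigma**2
            - 0.5 * np.sum(f**2) / tau**2)

f = np.zeros(3)
chain = []
for _ in range(30000):                        # random-walk Metropolis
    prop = f + rng.normal(0, 0.05, 3)
    if np.log(rng.random()) < log_post(prop) - log_post(f):
        f = prop
    chain.append(f)
chain = np.array(chain[5000:])                # discard burn-in
print("posterior median:", np.median(chain, axis=0))
print("95% credible intervals:", np.percentile(chain, [2.5, 97.5], axis=0).T)
```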
Ruck, Jessica M; Van Pilsum Rasmussen, Sarah E; Henderson, Macey L; Massie, Allan B; Segev, Dorry L
2018-06-08
Efforts are underway to improve living kidney donor (LKD) education, but current LKD concerns and information-gathering preferences have not been ascertained to inform evidence-based resource development. As a result, prior studies have found that donors desire information that is not included in current informed consent and/or educational materials. We conducted semi-structured interviews with 50 LKDs who donated at our center to assess (1) concerns about donation that they either had personally, before or after donation, or heard from family members or friends, (2) information that they had desired before donation, and (3) where they sought information about donation. We used thematic analysis of verbatim interview transcriptions to identify donation-related concerns. We compared the demographic characteristics of participants reporting specific concerns using Fisher's exact test. We identified 19 unique concerns that participants had or heard about living kidney donation. 20% of participants reported having had no pre-donation concerns; 38% reported no post-donation concerns. The most common concern pre-donation was future kidney failure (22%), post-donation was the recovery process (24%), and from family was endangering their family unit (16%). 44% of participants reported being less concerned than family. 26% of participants wished they had had additional information prior to donating, including practical advice for recovery (10%) and information about specific complications (14%). Caucasian participants were more likely to hear at least one concern from family (76% vs. 33%, p = 0.02). The most commonly consulted educational resources were health care providers (100%) and websites (79% of donors since 2000). 26% of participants had had contact with other donors; an additional 20% desired contact with other LKDs. Potential donors not only have personal donation-related concerns but frequently hear donation-related concerns from family members and friends. Current gaps in donor education include an absence of practical, peer-to-peer advice about donation from other prior donors and of materials directed at potential donors' family members and friends. These findings can inform the development of new educational practices and resources targeted not only at LKDs but also at their social networks.
Geometry-aware multiscale image registration via OBBTree-based polyaffine log-demons.
Seiler, Christof; Pennec, Xavier; Reyes, Mauricio
2011-01-01
Non-linear image registration is an important tool in many areas of image analysis. For instance, in morphometric studies of a population of brains, free-form deformations between images are analyzed to describe the structural anatomical variability. Such a simple deformation model is justified by the absence of an easily expressible prior about the shape changes. Applying the same algorithms used in brain imaging to orthopedic images might not be optimal due to the difference in the underlying prior on the inter-subject deformations. In particular, using an uninformed deformation prior often leads to local minima far from the expected solution. To improve robustness and promote anatomically meaningful deformations, we propose a locally affine and geometry-aware registration algorithm that automatically adapts to the data. We build upon the log-domain demons algorithm and introduce a new type of OBBTree-based regularization in the registration with a natural multiscale structure. The regularization model is composed of a hierarchy of locally affine transformations via their logarithms. Experiments on mandibles show improved accuracy and robustness when the method is used to initialize the demons, and similar performance in direct comparison to the demons with significantly fewer degrees of freedom. This closes the gap between polyaffine and non-rigid registration and opens new ways to statistically analyze the registration results.
Receptive Field Inference with Localized Priors
Park, Mijung; Pillow, Jonathan W.
2011-01-01
The linear receptive field describes a mapping from sensory stimuli to a one-dimensional variable governing a neuron's spike response. However, traditional receptive field estimators such as the spike-triggered average converge slowly and often require large amounts of data. Bayesian methods seek to overcome this problem by biasing estimates towards solutions that are more likely a priori, typically those with small, smooth, or sparse coefficients. Here we introduce a novel Bayesian receptive field estimator designed to incorporate locality, a powerful form of prior information about receptive field structure. The key to our approach is a hierarchical receptive field model that flexibly adapts to localized structure in both spacetime and spatiotemporal frequency, using an inference method known as empirical Bayes. We refer to our method as automatic locality determination (ALD), and show that it can accurately recover various types of smooth, sparse, and localized receptive fields. We apply ALD to neural data from retinal ganglion cells and V1 simple cells, and find it achieves error rates several times lower than standard estimators. Thus, estimates of comparable accuracy can be achieved with substantially less data. Finally, we introduce a computationally efficient Markov Chain Monte Carlo (MCMC) algorithm for fully Bayesian inference under the ALD prior, yielding accurate Bayesian confidence intervals for small or noisy datasets. PMID:22046110
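ALD learns the locality hyperparameters by empirical Bayes; the details are in the paper. As a stripped-down illustration of why a locality prior helps, the sketch below computes a MAP (ridge-like) receptive field estimate whose prior variance decays with distance from an assumed RF center. Here the center and scale are fixed by hand, whereas ALD infers them from the data; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: 1D receptive field, linear-Gaussian responses
d = 60                                   # number of stimulus dimensions (pixels)
x_axis = np.arange(d)
k_true = np.exp(-0.5 * ((x_axis - 30) / 3.0) ** 2)   # localized true RF
X = rng.normal(size=(200, d))            # stimuli (few samples vs. dimensions)
y = X @ k_true + rng.normal(scale=2.0, size=200)

def map_estimate(prior_var):
    """MAP estimate under k ~ N(0, diag(prior_var)) with Gaussian noise."""
    noise_var = 4.0
    C_inv = np.diag(1.0 / prior_var)
    return np.linalg.solve(X.T @ X + noise_var * C_inv, X.T @ y)

# Uninformative (flat) prior vs. locality prior centered at pixel 30
flat = map_estimate(np.full(d, 10.0))
local = map_estimate(10.0 * np.exp(-0.5 * ((x_axis - 30) / 6.0) ** 2) + 1e-6)

for name, k in [("flat prior    ", flat), ("locality prior", local)]:
    print(f"{name}: reconstruction error = {np.linalg.norm(k - k_true):.3f}")
```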
NASA Astrophysics Data System (ADS)
Feng, S.; Lauvaux, T.; Keller, K.; Davis, K. J.
2016-12-01
Current estimates of biogenic carbon fluxes over North America based on top-down atmospheric inversions are subject to considerable uncertainty. This uncertainty stems in large part from uncertain prior flux estimates with their associated error covariances, and from approximations in the atmospheric transport models that link observed carbon dioxide mixing ratios with surface fluxes. Specifically, approximations in the representation of vertical mixing associated with atmospheric turbulence or convective transport, together with largely under-determined prior fluxes and their error structures, significantly hamper our capacity to reliably estimate regional carbon fluxes. The Atmospheric Carbon and Transport - America (ACT-America) mission aims at reducing the uncertainties in inverse fluxes at the regional scale by deploying airborne and ground-based platforms to characterize atmospheric GHG mixing ratios and the concurrent atmospheric dynamics. Two aircraft measure the 3-dimensional distribution of greenhouse gases at synoptic scales, focusing on the atmospheric boundary layer and the free troposphere during both fair and stormy weather conditions. Here we analyze two main questions: (i) What level of information can we expect from the currently planned observations? (ii) How might ACT-America reduce the hindcast and predictive uncertainty of carbon estimates over North America?
Exploring information provision in reconstructive breast surgery: A qualitative study.
Potter, Shelley; Mills, Nicola; Cawthorn, Simon; Wilson, Sherif; Blazeby, Jane
2015-12-01
Women considering reconstructive breast surgery (RBS) require adequate information to make informed treatment decisions. This study explored patients' and health professionals' (HPs) perceptions of the adequacy of information provided for decision-making in RBS. Semi-structured interviews with a purposive sample of patients who had undergone RBS and HPs providing specialist care explored participants' experiences of information provision prior to RBS. Professionals reported providing standardised verbal, written and photographic information about the process and outcomes of surgery. Women, by contrast, reported varying levels of information provision. Some felt fully informed, but others perceived they had received insufficient information about available treatment options or possible outcomes of surgery to make an informed decision. Women need adequate information to make informed decisions about RBS, and current practice may not meet women's needs. Minimum agreed standards of information provision, especially about alternative types of reconstruction, are recommended to improve decision-making in RBS. Copyright © 2015 Elsevier Ltd. All rights reserved.
A comment on priors for Bayesian occupancy models
Gerber, Brian D.
2018-01-01
Understanding patterns of species occurrence and the processes underlying these patterns is fundamental to the study of ecology. One of the more commonly used approaches to investigate species occurrence patterns is occupancy modeling, which can account for imperfect detection of a species during surveys. In recent years, there has been a proliferation of Bayesian modeling in ecology, which includes fitting Bayesian occupancy models. The Bayesian framework is appealing to ecologists for many reasons, including the ability to incorporate prior information through the specification of prior distributions on parameters. While ecologists almost exclusively intend to choose priors so that they are “uninformative” or “vague”, such priors can easily be unintentionally highly informative. Here we report on how the specification of a “vague” normally distributed (i.e., Gaussian) prior on coefficients in Bayesian occupancy models can unintentionally influence parameter estimation. Using both simulated data and empirical examples, we illustrate how this issue likely compromises inference about species-habitat relationships. While the extent to which these informative priors influence inference depends on the data set, researchers fitting Bayesian occupancy models should conduct sensitivity analyses to ensure intended inference, or employ less commonly used priors that are less informative (e.g., logistic or t prior distributions). We provide suggestions for addressing this issue in occupancy studies, and an online tool for exploring this issue under different contexts. PMID:29481554
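The core observation, that a "vague" Gaussian prior on logit-scale coefficients is highly informative on the probability scale, is easy to reproduce. The sketch below (our illustration, not the paper's code or online tool) pushes Normal(0, sd) priors through the inverse logit and shows that larger standard deviations pile prior mass near occupancy probabilities of 0 and 1.

```python
import numpy as np

rng = np.random.default_rng(2)

def share_extreme(sd, n=100000):
    """Push a Normal(0, sd) prior on the logit scale through the inverse logit
       and report how much prior mass lands near psi = 0 or psi = 1."""
    beta = rng.normal(0.0, sd, size=n)
    psi = 1.0 / (1.0 + np.exp(-beta))          # implied occupancy probability
    return np.mean((psi < 0.05) | (psi > 0.95))

for sd in [1.0, 2.0, 5.0, 10.0]:
    print(f"Normal(0, {sd:>4}) prior: "
          f"{100 * share_extreme(sd):5.1f}% of prior mass at psi<0.05 or psi>0.95")
```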
Xi, Jianing; Wang, Minghui; Li, Ao
2018-06-05
Discovery of mutated driver genes is one of the primary objectives in studying tumorigenesis. To discover relatively infrequently mutated driver genes from somatic mutation data, many existing methods incorporate an interaction network as prior information. However, these network-based methods do not exploit prior information from mRNA expression patterns, which have also been shown to be highly informative of cancer progression. To incorporate prior information from both the interaction network and mRNA expression, we propose a robust and sparse co-regularized nonnegative matrix factorization to discover driver genes from mutation data. Our framework also applies Frobenius-norm regularization to overcome overfitting. A sparsity-inducing penalty is employed to obtain sparse scores in the gene representations, and the top-scored genes are selected as driver candidates. Evaluation experiments against known benchmark genes indicate that the performance of our method benefits from both types of prior information. Our method also outperforms existing network-based methods and detects some driver genes that are not predicted by the competing methods. In summary, the proposed method improves driver gene discovery by effectively incorporating prior information from the interaction network and mRNA expression patterns into a robust and sparse co-regularized matrix factorization framework.
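The paper's co-regularized objective is not spelled out in the abstract; the sketch below implements only generic ingredients it names, namely NMF with a Frobenius-norm penalty on one factor and an L1 sparsity penalty on the other, using standard multiplicative updates. The network/expression co-regularization terms are omitted, and the data and penalty weights are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def sparse_nmf(X, rank, lam=0.1, mu=0.1, n_iter=500, eps=1e-9):
    """NMF with a Frobenius penalty on W and an L1 sparsity penalty on H:
       min ||X - WH||_F^2 + lam*||W||_F^2 + mu*sum(H),  W, H >= 0."""
    n, m = X.shape
    W = rng.random((n, rank))
    H = rng.random((rank, m))
    for _ in range(n_iter):
        # Standard multiplicative updates extended with the penalty terms
        W *= (X @ H.T) / (W @ H @ H.T + lam * W + eps)
        H *= (W.T @ X) / (W.T @ W @ H + mu + eps)
    return W, H

# Toy mutation-like matrix: samples x genes, with 5 genuinely "active" genes
X = rng.random((40, 100)) * 0.05
X[:, :5] += rng.random((40, 5))            # high-signal columns
W, H = sparse_nmf(X, rank=3)
scores = H.sum(axis=0)                     # aggregate per-gene scores
print("top-scored genes:", np.argsort(scores)[::-1][:5])
```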
Acoustic Emission of Large PRSEUS Structures (Pultruded Rod Stitched Efficient Unitized Structure)
NASA Technical Reports Server (NTRS)
Horne, Michael R.; Juarez, Peter D.
2016-01-01
In the role of structural health monitoring (SHM), Acoustic Emission (AE) analysis is being investigated as an effective method for tracking damage development in large composite structures under load. Structures made using Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) for damage-tolerant, light, and economical airframe construction are being pursued by The Boeing Company and NASA under the Environmentally Responsible Aviation (ERA) project. The failure tests of two PRSEUS substructures based on the Boeing Hybrid Wing Body fuselage concept were conducted in the third quarter of 2011 and the second quarter of 2015. One fundamental concern of these tests was determining the effectiveness of the stitched integral stiffeners in inhibiting damage progression. By design, severe degradation of load-carrying capability should not occur prior to Design Ultimate Load (DUL). While minor damage prior to DUL was anticipated, the integral stitching should not fail, since failure would allow a stiffener-skin delamination to progress rapidly and alter the transfer of load into the stiffeners. In addition, the stiffeners should not fracture because they are fundamental to structural integrity. Getting the best information from each AE sensor is a primary consideration because a sparse network of sensors is implemented. Sensitivity to stiffener-contiguous degradation is supported by sensors near the stiffeners, which increases the coverage per sensor via AE waveguide actions. Some sensors are located near potentially critical areas or "critical zones" as identified by numerical analyses. The approach is compared with the damage progression monitored by other techniques (e.g., ultrasonic C-scan).
The neural basis of belief updating and rational decision making.
Achtziger, Anja; Alós-Ferrer, Carlos; Hügelschäfer, Sabine; Steinhauser, Marco
2014-01-01
Rational decision making under uncertainty requires forming beliefs that integrate prior and new information through Bayes' rule. Human decision makers typically deviate from Bayesian updating by either overweighting the prior (conservatism) or overweighting new information (e.g. the representativeness heuristic). We investigated these deviations through measurements of electrocortical activity in the human brain during incentivized probability-updating tasks and found evidence of extremely early commitment to boundedly rational heuristics. Participants who overweight new information display a lower sensibility to conflict detection, captured by an event-related potential (the N2) observed around 260 ms after the presentation of new information. Conservative decision makers (who overweight prior probabilities) make up their mind before new information is presented, as indicated by the lateralized readiness potential in the brain. That is, they do not inhibit the processing of new information but rather immediately rely on the prior for making a decision.
Development of uncertainty-based work injury model using Bayesian structural equation modelling.
Chatterjee, Snehamoy
2014-01-01
This paper proposes a Bayesian structural equation model (SEM) of miners' work injury for an underground coal mine in India. The environmental and behavioural variables for work injury were identified and causal relationships were developed. For Bayesian modelling, prior distributions of the SEM parameters are necessary to develop the model. In this paper, two approaches were adopted to obtain prior distributions for the factor loading parameters and structural parameters of the SEM. In the first approach, the prior distributions were taken as fixed distribution functions with specific parameter values, whereas in the second approach, prior distributions of the parameters were generated from experts' opinions. The posterior distributions of these parameters were obtained by applying Bayes' rule. Markov Chain Monte Carlo sampling, in the form of Gibbs sampling, was applied to sample from the posterior distribution. The results revealed that all coefficients of the structural and measurement model parameters are statistically significant under the experts' opinion-based priors, whereas two coefficients are not statistically significant when the fixed prior-based distributions are applied. The error statistics reveal that the Bayesian structural model provides a reasonably good fit for work injury, with a high coefficient of determination (0.91) and a lower mean squared error compared to traditional SEM.
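A full Bayesian SEM with Gibbs sampling is too long for a short sketch, but the phenomenon described, fixed versus expert-elicited priors leading to different significance conclusions, already appears in the simplest conjugate setting. The example below updates a normal prior on a single path coefficient with the same invented data under both prior choices; it illustrates the idea only and is not the paper's model.

```python
import numpy as np

def normal_update(prior_mean, prior_sd, data_mean, data_se):
    """Conjugate normal update for one coefficient estimate."""
    w_prior, w_data = 1 / prior_sd**2, 1 / data_se**2
    post_var = 1 / (w_prior + w_data)
    post_mean = post_var * (w_prior * prior_mean + w_data * data_mean)
    return post_mean, np.sqrt(post_var)

# Same (invented) data-based estimate of a path coefficient...
data_mean, data_se = 0.25, 0.15

# ...combined with a diffuse fixed prior vs. an expert-elicited prior
for label, pm, ps in [("fixed, diffuse prior ", 0.0, 10.0),
                      ("expert-elicited prior", 0.30, 0.10)]:
    m, s = normal_update(pm, ps, data_mean, data_se)
    sig = "significant" if abs(m) > 1.96 * s else "not significant"
    print(f"{label}: posterior = {m:.3f} +/- {s:.3f}  -> {sig}")
```

With these numbers, the diffuse prior leaves the coefficient non-significant while the elicited prior makes it significant, mirroring the qualitative pattern the abstract reports.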
NASA Astrophysics Data System (ADS)
Shi, Jade; Nobrega, R. Paul; Schwantes, Christian; Kathuria, Sagar V.; Bilsel, Osman; Matthews, C. Robert; Lane, T. J.; Pande, Vijay S.
2017-03-01
The dynamics of globular proteins can be described in terms of transitions between a folded native state and less-populated intermediates, or excited states, which can play critical roles in both protein folding and function. Excited states are by definition transient species, and therefore are difficult to characterize using current experimental techniques. Here, we report an atomistic model of the excited state ensemble of a stabilized mutant of an extensively studied flavodoxin fold protein CheY. We employed a hybrid simulation and experimental approach in which an aggregate 42 milliseconds of all-atom molecular dynamics were used as an informative prior for the structure of the excited state ensemble. This prior was then refined against small-angle X-ray scattering (SAXS) data employing an established method (EROS). The most striking feature of the resulting excited state ensemble was an unstructured N-terminus stabilized by non-native contacts in a conformation that is topologically simpler than the native state. Using these results, we then predict incisive single molecule FRET experiments as a means of model validation. This study demonstrates the paradigm of uniting simulation and experiment in a statistical model to study the structure of protein excited states and rationally design validating experiments.
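EROS itself is described in the cited literature; the following is only a generic sketch of refining a simulation prior against scattering data. Conformer weights are adjusted to fit synthetic SAXS intensities while an entropy-like penalty keeps them close to the simulation ensemble. All profiles, noise levels and the penalty strength theta are invented.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Synthetic setup: 20 conformers, each with a computed SAXS profile on 50 q-points
n_conf, n_q = 20, 50
profiles = rng.random((n_conf, n_q)) + np.linspace(2, 1, n_q)  # I_k(q), invented
w_prior = np.full(n_conf, 1.0 / n_conf)        # simulation (prior) weights
w_true = rng.dirichlet(np.ones(n_conf))        # "experimental" ensemble
I_exp = w_true @ profiles + rng.normal(scale=0.01, size=n_q)
sigma_exp = 0.01

def objective(z, theta=1.0):
    """chi^2 misfit + theta * KL(w || w_prior); z are weight logits."""
    w = np.exp(z - z.max()); w /= w.sum()
    chi2 = np.sum(((w @ profiles - I_exp) / sigma_exp) ** 2)
    kl = np.sum(w * np.log(w / w_prior + 1e-12))
    return chi2 + theta * kl

res = minimize(objective, np.zeros(n_conf), method="L-BFGS-B")
w = np.exp(res.x - res.x.max()); w /= w.sum()
print("chi^2 of refined ensemble:",
      np.sum(((w @ profiles - I_exp) / sigma_exp) ** 2).round(1))
```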
Joint reconstruction of PET-MRI by exploiting structural similarity
NASA Astrophysics Data System (ADS)
Ehrhardt, Matthias J.; Thielemans, Kris; Pizarro, Luis; Atkinson, David; Ourselin, Sébastien; Hutton, Brian F.; Arridge, Simon R.
2015-01-01
Recent advances in technology have enabled the combination of positron emission tomography (PET) with magnetic resonance imaging (MRI). These PET-MRI scanners simultaneously acquire functional PET and anatomical or functional MRI data. As function and anatomy are not independent of one another, the images to be reconstructed are likely to have shared structures. We aim to exploit this inherent structural similarity by reconstructing from both modalities in a joint reconstruction framework. The structural similarity between two modalities can be modelled in two different ways: edges are more likely to be at similar positions and/or to have similar orientations. We analyse the diffusion process generated by minimizing priors that encapsulate these different models. It turns out that the class of parallel level set priors always corresponds to anisotropic diffusion which is sometimes forward and sometimes backward diffusion. We perform numerical experiments where we jointly reconstruct from blurred Radon data with Poisson noise (PET) and under-sampled Fourier data with Gaussian noise (MRI). Our results show that both modalities benefit from each other in areas of shared edge information. The joint reconstructions have fewer artefacts and sharper edges compared to separate reconstructions, and the ℓ2-error can be reduced in all of the considered cases of under-sampling.
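As a numerical illustration of the "edges at similar positions/orientations" idea, the sketch below evaluates one common form of parallel level set prior, penalizing the degree to which the gradients of two images fail to be parallel, on a pair of toy images. The exact functional analysed in the paper may differ; this is a generic variant.

```python
import numpy as np

def parallel_level_set_prior(u, v):
    """Penalize non-parallel gradients: sum(|grad u||grad v| - |<grad u, grad v>|).
       By Cauchy-Schwarz this is >= 0, and 0 exactly where the edges align."""
    ux, uy = np.gradient(u)
    vx, vy = np.gradient(v)
    norm_u = np.sqrt(ux**2 + uy**2)
    norm_v = np.sqrt(vx**2 + vy**2)
    inner = ux * vx + uy * vy
    return np.sum(norm_u * norm_v - np.abs(inner))

# Toy "PET" and "MRI": identical square structure vs. a displaced structure
base = np.zeros((64, 64)); base[20:44, 20:44] = 1.0
shifted = np.zeros((64, 64)); shifted[26:50, 26:50] = 1.0

print("aligned structures  :", parallel_level_set_prior(base, 2.0 * base))
print("displaced structures:", parallel_level_set_prior(base, shifted))
```

The aligned pair scores (near) zero because its gradients are everywhere parallel, while the displaced pair is penalized where the two sets of edges cross at an angle.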
BAYESIAN PROTEIN STRUCTURE ALIGNMENT.
Rodriguez, Abel; Schmidler, Scott C
The analysis of the three-dimensional structure of proteins is an important topic in molecular biochemistry. Structure plays a critical role in defining the function of proteins and is more strongly conserved than amino acid sequence over evolutionary timescales. A key challenge is the identification and evaluation of structural similarity between proteins; such analysis can aid in understanding the role of newly discovered proteins and help elucidate evolutionary relationships between organisms. Computational biologists have developed many clever algorithmic techniques for comparing protein structures; however, all are based on heuristic optimization criteria, making statistical interpretation somewhat difficult. Here we present a fully probabilistic framework for pairwise structural alignment of proteins. Our approach has several advantages, including the ability to capture alignment uncertainty and to estimate key "gap" parameters which critically affect the quality of the alignment. We show that several existing alignment methods arise as maximum a posteriori estimates under specific choices of prior distributions and error models. Our probabilistic framework is also easily extended to incorporate additional information, which we demonstrate by including primary sequence information to generate simultaneous sequence-structure alignments that can resolve ambiguities obtained using structure alone. This combined model also provides a natural approach for the difficult task of estimating evolutionary distance based on structural alignments. The model is illustrated by comparison with well-established methods on several challenging protein alignment examples.
1987-12-01
the 50-year design service life. Since these structures were built prior to 1940, the concrete does not contain intentionally entrained air and is...with which designers and contractors are familiar from past experience on new construction. However, there is increasing evidence that rehabilitation...with designers and contractors. Although the information obtained from the various sources varied widely from project to project, attempts were made to
Garrard, Georgia E; McCarthy, Michael A; Vesk, Peter A; Radford, James Q; Bennett, Andrew F
2012-01-01
1. Informative Bayesian priors can improve the precision of estimates in ecological studies or estimate parameters for which little or no information is available. While Bayesian analyses are becoming more popular in ecology, the use of strongly informative priors remains rare, perhaps because examples of informative priors are not readily available in the published literature. 2. Dispersal distance is an important ecological parameter, but is difficult to measure and estimates are scarce. General models that provide informative prior estimates of dispersal distances will therefore be valuable. 3. Using a world-wide data set on birds, we develop a predictive model of median natal dispersal distance that includes body mass, wingspan, sex and feeding guild. This model predicts median dispersal distance well when using the fitted data and an independent test data set, explaining up to 53% of the variation. 4. Using this model, we predict a priori estimates of median dispersal distance for 57 woodland-dependent bird species in northern Victoria, Australia. These estimates are then used to investigate the relationship between dispersal ability and vulnerability to landscape-scale changes in habitat cover and fragmentation. 5. We find evidence that woodland bird species with poor predicted dispersal ability are more vulnerable to habitat fragmentation than those species with longer predicted dispersal distances, thus improving the understanding of this important phenomenon. 6. The value of constructing informative priors from existing information is also demonstrated. When used as informative priors for four example species, predicted dispersal distances reduced the 95% credible intervals of posterior estimates of dispersal distance by 8-19%. Further, should we have wished to collect information on avian dispersal distances and relate it to species' responses to habitat loss and fragmentation, data from 221 individuals across 57 species would have been required to obtain estimates with the same precision as those provided by the general model. © 2011 The Authors. Journal of Animal Ecology © 2011 British Ecological Society.
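Point 6 can be illustrated numerically: a model-based prediction used as an informative prior shrinks the credible interval relative to a vague prior. The sketch below combines a hypothetical predicted median dispersal distance, expressed as a lognormal prior, with a handful of invented observations via a conjugate update on the log scale; none of the numbers come from the paper.

```python
import numpy as np

# Hypothetical model prediction for one species: median dispersal 2.0 km,
# with prior uncertainty expressed on the log scale
prior_mean, prior_sd = np.log(2.0), 0.4

# A handful of observed natal dispersal distances (km), invented
obs = np.array([1.2, 3.5, 2.8, 1.9, 4.1])
log_obs = np.log(obs)
obs_sd = 0.6                                  # assumed known log-scale spread
n = len(obs)

def posterior(prior_prec):
    """Conjugate normal update for the log median dispersal distance."""
    data_prec = n / obs_sd**2
    var = 1 / (prior_prec + data_prec)
    mean = var * (prior_prec * prior_mean + data_prec * log_obs.mean())
    return mean, np.sqrt(var)

for label, prec in [("vague prior      ", 1e-6),
                    ("informative prior", 1 / prior_sd**2)]:
    m, s = posterior(prec)
    lo, hi = np.exp(m - 1.96 * s), np.exp(m + 1.96 * s)
    print(f"{label}: median dispersal 95% CI = ({lo:.2f}, {hi:.2f}) km")
```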
Smith, David; Woodman, Richard; Drummond, Aaron; Battersby, Malcolm
2016-03-30
Knowledge of a problem gambler's underlying gambling related cognitions plays an important role in treatment planning. The Gambling Related Cognitions Scale (GRCS) is therefore frequently used in clinical settings for screening and evaluation of treatment outcomes. However, GRCS validation studies have generated conflicting results regarding its latent structure using traditional confirmatory factor analyses (CFA). This may partly be due to the rigid constraints imposed on cross-factor loadings with traditional CFA. The aim of this investigation was to determine whether a Bayesian structural equation modelling (BSEM) approach to examination of the GRCS factor structure would better replicate substantive theory and also inform model re-specifications. Participants were 454 treatment-seekers at first presentation to a gambling treatment centre between January 2012 and December 2014. Model fit indices were well below acceptable standards for CFA. In contrast, the BSEM model which included small informative priors for the residual covariance matrix in addition to cross-loadings produced excellent model fit for the original hypothesised factor structure. The results also informed re-specification of the CFA model which provided more reasonable model fit. These conclusions have implications that should be useful to both clinicians and researchers evaluating measurement models relating to gambling related cognitions in treatment-seekers. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Alcohol Use as a Determinant of HIV Risk Behaviors Among Recent Latino Immigrants in South Florida
Rojas, Patria; Dillon, Frank R.; Cyrus, Elena; Ravelo, Gira J.; Malow, Robert M.; De La Rosa, Mario
2013-01-01
Information on the association between alcohol use and Latino sexual risk behavior prior to immigrating to the United States is scarce. Given this population's rapid growth, documenting the influence of alcohol use on Recent Latino Immigrants’ (RLI) sexual risk behaviors is essential. Data prior to immigration were retrospectively collected from 527 RLI ages 18-39. Quantity and frequency of alcohol use during the 90 days prior to immigration and pre-immigration sexual risk behaviors were measured. Structural equation modeling was used to examine the relationships. Males, single participants, and participants with higher incomes reported more alcohol use. Higher alcohol use was associated with lower condom use frequency, having sex under the influence, and more sexual partners among all participants. Results point to the importance of creating interventions targeting adult RLI men, given their likelihood to engage in alcohol consumption, sex under the influence of alcohol, and sex with multiple partners without condoms. PMID:23706771
Fienen, Michael N.; D'Oria, Marco; Doherty, John E.; Hunt, Randall J.
2013-01-01
The application bgaPEST is a highly parameterized inversion software package implementing the Bayesian Geostatistical Approach in a framework compatible with the parameter estimation suite PEST. Highly parameterized inversion refers to cases in which parameters are distributed in space or time and are correlated with one another. The Bayesian aspect of bgaPEST is related to Bayesian probability theory, in which prior information about parameters is formally revised on the basis of the calibration dataset used for the inversion. Conceptually, this approach formalizes the conditionality of estimated parameters on the specific data and model available. The geostatistical component of the method refers to the way in which prior information about the parameters is used. A geostatistical autocorrelation function is used to enforce structure on the parameters to avoid overfitting and unrealistic results. The Bayesian Geostatistical Approach is designed to provide the smoothest solution that is consistent with the data. Optionally, users can specify a level of fit or estimate a balance between fit and model complexity informed by the data. Groundwater and surface-water applications are used as examples in this text, but the possible uses of bgaPEST extend to any distributed parameter applications.
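bgaPEST's internals are documented with the software; the sketch below illustrates only the central idea of a geostatistical prior, an autocorrelation covariance over a distributed parameter field that favors smooth solutions, by computing the conditional (MAP) estimate for a toy linear model with sparse observations. The exponential covariance, scales and observation pattern are all invented.

```python
import numpy as np

rng = np.random.default_rng(6)

# Distributed parameter field on a 1D grid (e.g., a property varying in space)
n = 80
x = np.linspace(0, 1, n)
s_true = np.sin(2 * np.pi * x) + 0.5 * np.sin(7 * np.pi * x)

# Sparse, noisy linear observations: y = G s + noise
G = np.zeros((12, n))
G[np.arange(12), rng.choice(n, 12, replace=False)] = 1.0
R = 0.05**2 * np.eye(12)
y = G @ s_true + rng.normal(scale=0.05, size=12)

# Geostatistical prior: exponential autocorrelation enforces smooth structure
L, var = 0.2, 1.0
dist = np.abs(x[:, None] - x[None, :])
Q = var * np.exp(-dist / L)

# MAP estimate of the field conditioned on the data (GP-regression form)
s_hat = Q @ G.T @ np.linalg.solve(G @ Q @ G.T + R, y)
print("RMSE of smooth geostatistical estimate:",
      np.sqrt(np.mean((s_hat - s_true) ** 2)).round(3))
```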
Nudging toward Inquiry: Awakening and Building upon Prior Knowledge
ERIC Educational Resources Information Center
Fontichiaro, Kristin, Comp.
2010-01-01
"Prior knowledge" (sometimes called schema or background knowledge) is information one already knows that helps him/her make sense of new information. New learning builds on existing prior knowledge. In traditional reporting-style research projects, students bypass this crucial step and plow right into answer-finding. It's no wonder that many…
NASA Astrophysics Data System (ADS)
Truckenbrodt, Sina C.; Gómez-Dans, José; Stelmaszczuk-Górska, Martyna A.; Chernetskiy, Maxim; Schmullius, Christiane C.
2017-04-01
Throughout the past decades, various satellite sensors have been launched that record reflectance in the optical domain and facilitate comprehensive monitoring of the vegetation-covered land surface from space. The interaction of photons with the canopy, leaves and soil that determines the spectrum of reflected sunlight can be simulated with radiative transfer models (RTMs). The inversion of RTMs permits the derivation of state variables such as leaf area index (LAI) and leaf chlorophyll content from top-of-canopy reflectance. Space-borne data are, however, insufficient for an unambiguous derivation of state variables, and additional constraints are required to resolve this ill-posed problem. Data assimilation techniques permit the conflation of various sources of information with due allowance for their associated uncertainties. The Earth Observation Land Data Assimilation System (EO-LDAS) integrates RTMs into a dynamic process model that describes the temporal evolution of state variables. In addition, prior information is included to further constrain the inversion and enhance the state variable derivation. In previous studies on EO-LDAS, prior information was represented by temporally constant values for all investigated state variables, while information about their phenological evolution was neglected. Here, we examine to what extent the implementation of prior information reflecting the phenological variability improves the performance of EO-LDAS with respect to the monitoring of crops on the agricultural Gebesee test site (Central Germany). Various routines for the generation of prior information are tested. This involves the use of state-variable data acquired in previous years as well as the application of phenological models. The performance of EO-LDAS with the newly implemented prior information is tested on medium-resolution satellite imagery (e.g., RapidEye REIS, Sentinel-2 MSI, Landsat-7 ETM+ and Landsat-8 OLI). The predicted state variables are validated against in situ data from the Gebesee test site that were acquired at a weekly to fortnightly resolution throughout the growing seasons of 2010, 2013, 2014 and 2016. Furthermore, the results are compared with the outcome of using constant values as prior information. Here, the EO-LDAS scheme and the results obtained with the different forms of prior information are presented.
Strekalova, Yulia A; James, Vaughan S
2017-09-01
User-generated information on the Internet provides opportunities for the monitoring of health information consumer attitudes. For example, information about cancer prevention may cause decisional conflict. Yet posts and conversations shared by health information consumers online are often not readily actionable for interpretation and decision-making due to their unstandardized format. This study extends prior research on the use of natural language as a predictor of consumer attitudes and provides a link to decision-making by evaluating the predictive role of uncertainty indicators expressed in natural language. Analyzed data included free-text comments and structured scale responses related to information about skin cancer prevention options. The study identified natural language indicators of uncertainty and showed that it can serve as a predictor of decisional conflict. The natural indicators of uncertainty reported here can facilitate the monitoring of health consumer perceptions about cancer prevention recommendations and inform education and communication campaign planning and evaluation.
Predictive processing of novel compounds: evidence from Japanese.
Hirose, Yuki; Mazuka, Reiko
2015-03-01
Our study argues that pre-head anticipatory processing operates at a level below the level of the sentence. A visual-world eye-tracking study demonstrated that, in processing of Japanese novel compounds, the compound structure can be constructed prior to the head if the prosodic information on the preceding modifier constituent signals that the Compound Accent Rule (CAR) is being applied. This prosodic cue rules out the single head analysis of the modifier noun, which would otherwise be a natural and economical choice. Once the structural representation for the head is computed in advance, the parser becomes faster in identifying the compound meaning. This poses a challenge to models maintaining that structural integration and word recognition are separate processes. At the same time, our results, together with previous findings, suggest the possibility that there is some degree of staging during the processing of different sources of information during the comprehension of compound nouns. Copyright © 2014 Elsevier B.V. All rights reserved.
Isomer Information from Ion Mobility Separation of High-Mannose Glycan Fragments.
Harvey, David J; Seabright, Gemma E; Vasiljevic, Snezana; Crispin, Max; Struwe, Weston B
2018-05-01
Extracted arrival time distributions of negative ion CID-derived fragments produced prior to traveling-wave ion mobility separation were evaluated for their ability to provide structural information on N-linked glycans. Fragmentation of high-mannose glycans released from several glycoproteins, including those from viral sources, provided over 50 fragments, many of which gave unique collisional cross-sections and provided additional information used to assign structural isomers. For example, cross-ring fragments arising from cleavage of the reducing terminal GlcNAc residue on Man8GlcNAc2 isomers have unique collision cross-sections enabling isomers to be differentiated in mixtures. Specific fragment collision cross-sections enabled identification of glycans whose antennae terminated in the antigenic α-galactose residue, and ions defining the composition of the 6-antenna of several of the glycans were also found to have different cross-sections from isomeric ions produced in the same spectra. Potential mechanisms for the formation of the various ions are discussed and the estimated collisional cross-sections are tabulated.
Tertiary structural propensities reveal fundamental sequence/structure relationships.
Zheng, Fan; Zhang, Jian; Grigoryan, Gevorg
2015-05-05
Extracting useful generalizations from the continually growing Protein Data Bank (PDB) is of central importance. We hypothesize that the PDB contains valuable quantitative information on the level of local tertiary structural motifs (TERMs). We show that by breaking a protein structure into its constituent TERMs, and querying the PDB to characterize the natural ensemble matching each, we can estimate the compatibility of the structure with a given amino acid sequence through a metric we term "structure score." Considering submissions from recent Critical Assessment of Structure Prediction (CASP) experiments, we found a strong correlation (R = 0.69) between structure score and model accuracy, with poorly predicted regions readily identifiable. This performance exceeds that of leading atomistic statistical energy functions. Furthermore, TERM-based analysis of two prototypical multi-state proteins rapidly produced structural insights fully consistent with prior extensive experimental studies. We thus find that TERM-based analysis should have considerable utility for protein structural biology. Copyright © 2015 Elsevier Ltd. All rights reserved.
Predictive top-down integration of prior knowledge during speech perception.
Sohoglu, Ediz; Peelle, Jonathan E; Carlyon, Robert P; Davis, Matthew H
2012-06-20
A striking feature of human perception is that our subjective experience depends not only on sensory information from the environment but also on our prior knowledge or expectations. The precise mechanisms by which sensory information and prior knowledge are integrated remain unclear, with longstanding disagreement concerning whether integration is strictly feedforward or whether higher-level knowledge influences sensory processing through feedback connections. Here we used concurrent EEG and MEG recordings to determine how sensory information and prior knowledge are integrated in the brain during speech perception. We manipulated listeners' prior knowledge of speech content by presenting matching, mismatching, or neutral written text before a degraded (noise-vocoded) spoken word. When speech conformed to prior knowledge, subjective perceptual clarity was enhanced. This enhancement in clarity was associated with a spatiotemporal profile of brain activity uniquely consistent with a feedback process: activity in the inferior frontal gyrus was modulated by prior knowledge before activity in lower-level sensory regions of the superior temporal gyrus. In parallel, we parametrically varied the level of speech degradation, and therefore the amount of sensory detail, so that changes in neural responses attributable to sensory information and prior knowledge could be directly compared. Although sensory detail and prior knowledge both enhanced speech clarity, they had an opposite influence on the evoked response in the superior temporal gyrus. We argue that these data are best explained within the framework of predictive coding in which sensory activity is compared with top-down predictions and only unexplained activity propagated through the cortical hierarchy.
2012-01-01
Background: An important question in the analysis of biochemical data is that of identifying subsets of molecular variables that may jointly influence a biological response. Statistical variable selection methods have been widely used for this purpose. In many settings, it may be important to incorporate ancillary biological information concerning the variables of interest. Pathway and network maps are one example of a source of such information. However, although ancillary information is increasingly available, it is not always clear how it should be used nor how it should be weighted in relation to primary data. Results: We put forward an approach in which biological knowledge is incorporated using informative prior distributions over variable subsets, with prior information selected and weighted in an automated, objective manner using an empirical Bayes formulation. We employ continuous, linear models with interaction terms and exploit biochemically-motivated sparsity constraints to permit exact inference. We show an example of priors for pathway- and network-based information and illustrate our proposed method on both synthetic response data and by an application to cancer drug response data. Comparisons are also made to alternative Bayesian and frequentist penalised-likelihood methods for incorporating network-based information. Conclusions: The empirical Bayes method proposed here can aid prior elicitation for Bayesian variable selection studies and help to guard against mis-specification of priors. Empirical Bayes, together with the proposed pathway-based priors, results in an approach with a competitive variable selection performance. In addition, the overall procedure is fast, deterministic, and has very few user-set parameters, yet is capable of capturing interplay between molecular players. The approach presented is general and readily applicable in any setting with multiple sources of biological prior knowledge. PMID:22578440
Derbridge, Jonathan J; Merkle, Jerod A; Bucci, Melanie E; Callahan, Peggy; Koprowski, John L; Polfus, Jean L; Krausman, Paul R
2015-01-01
Stable isotope analysis of diet has become a common tool in conservation research. However, the multiple sources of uncertainty inherent in this analysis framework involve consequences that have not been thoroughly addressed. Uncertainty arises from the choice of trophic discrimination factors, and for Bayesian stable isotope mixing models (SIMMs), the specification of prior information; the combined effect of these aspects has not been explicitly tested. We used a captive feeding study of gray wolves (Canis lupus) to determine the first experimentally-derived trophic discrimination factors of C and N for this large carnivore of broad conservation interest. Using the estimated diet in our controlled system and data from a published study on wild wolves and their prey in Montana, USA, we then investigated the simultaneous effect of discrimination factors and prior information on diet reconstruction with Bayesian SIMMs. Discrimination factors for gray wolves and their prey were 1.97‰ for δ13C and 3.04‰ for δ15N. Specifying wolf discrimination factors, as opposed to the commonly used red fox (Vulpes vulpes) factors, made little practical difference to estimates of wolf diet, but prior information had a strong effect on bias, precision, and accuracy of posterior estimates. Without specifying prior information in our Bayesian SIMM, it was not possible to produce SIMM posteriors statistically similar to the estimated diet in our controlled study or the diet of wild wolves. Our study demonstrates the critical effect of prior information on estimates of animal diets using Bayesian SIMMs, and suggests species-specific trophic discrimination factors are of secondary importance. When using stable isotope analysis to inform conservation decisions researchers should understand the limits of their data. It may be difficult to obtain useful information from SIMMs if informative priors are omitted and species-specific discrimination factors are unavailable.
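Why the prior dominates in such a SIMM can be shown on a toy two-source model. The sketch below fits an invented one-isotope mixing model by Metropolis sampling, once with a flat prior on the diet proportion and once with an informative Beta prior; the discrimination factor is simply added to the source means, as in standard SIMMs. Only the δ15N discrimination value (3.04‰) is taken from the study; everything else is invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented source signatures (delta-15N, permil) for two prey types
source_means = np.array([6.0, 9.0])
tdf = 3.04                               # wolf trophic discrimination factor
sigma = 0.8                              # residual spread of consumer values

# Invented consumer data generated from a true diet of 70% source 1
p_true = 0.7
mix_mean = p_true * (source_means[0] + tdf) + (1 - p_true) * (source_means[1] + tdf)
obs = rng.normal(mix_mean, sigma, size=8)

def log_post(p, a, b):
    """Likelihood of consumer isotopes + Beta(a, b) prior on diet proportion p."""
    if not 0 < p < 1:
        return -np.inf
    mu = p * (source_means[0] + tdf) + (1 - p) * (source_means[1] + tdf)
    loglik = -0.5 * np.sum((obs - mu) ** 2) / sigma**2
    logprior = (a - 1) * np.log(p) + (b - 1) * np.log(1 - p)
    return loglik + logprior

def sample(a, b, n_iter=20000):
    p, lp = 0.5, log_post(0.5, a, b)
    out = np.empty(n_iter)
    for i in range(n_iter):
        prop = p + 0.05 * rng.normal()
        lpp = log_post(prop, a, b)
        if np.log(rng.uniform()) < lpp - lp:
            p, lp = prop, lpp
        out[i] = p
    return out[5000:]                    # discard burn-in

for label, a, b in [("flat prior       ", 1, 1),
                    ("informative prior", 14, 6)]:   # Beta(14, 6): prior mean 0.7
    draws = sample(a, b)
    print(f"{label}: p(source 1) = {draws.mean():.2f} "
          f"(95% CI {np.percentile(draws, 2.5):.2f}-{np.percentile(draws, 97.5):.2f})")
```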
False recognition depends on depth of prior word processing: a magnetoencephalographic (MEG) study.
Walla, P; Hufnagl, B; Lindinger, G; Deecke, L; Imhof, H; Lang, W
2001-04-01
Brain activity was measured with a whole-head magnetoencephalograph (MEG) during the test phases of word recognition experiments. Healthy young subjects had to discriminate between previously presented and new words. During the prior study phases, two different levels of word processing were induced by two different kinds of instructions (shallow and deep encoding). Event-related fields (ERFs) associated with falsely recognized words (false alarms) were found to depend on the depth of processing during the prior study phase. False alarms elicited higher brain activity (as reflected by dipole strength) for prior deep encoding compared with shallow encoding between 300 and 500 ms after stimulus onset over temporal brain areas. Between 500 and 700 ms, we found evidence for differences in the involvement of neural structures related to the two conditions of false alarms. Furthermore, the number of false alarms was found to depend on depth of processing: shallow encoding led to a higher number of false alarms than deep encoding. These data are discussed as strong support for the ideas that a given level of word processing is performed by a distinct set of neural systems and that the same neural systems which encode information are reactivated during retrieval.
Liang, Li-Jung; Weiss, Robert E; Redelings, Benjamin; Suchard, Marc A
2009-10-01
Statistical analyses of phylogenetic data culminate in uncertain estimates of underlying model parameters. Lack of additional data hinders the ability to reduce this uncertainty, as the original phylogenetic dataset is often complete, containing the entire gene or genome information available for the given set of taxa. Informative priors in a Bayesian analysis can reduce posterior uncertainty; however, publicly available phylogenetic software specifies vague priors for model parameters by default. We build objective and informative priors using hierarchical random effect models that combine additional datasets whose parameters are not of direct interest but are similar to the analysis of interest. We propose principled statistical methods that permit more precise parameter estimates in phylogenetic analyses by creating informative priors for parameters of interest. Using additional sequence datasets from our lab or public databases, we construct a fully Bayesian semiparametric hierarchical model to combine datasets. A dynamic iteratively reweighted Markov chain Monte Carlo algorithm conveniently recycles posterior samples from the individual analyses. We demonstrate the value of our approach by examining the insertion-deletion (indel) process in the enolase gene across the Tree of Life using the phylogenetic software BALI-PHY; we incorporate prior information about indels from 82 curated alignments downloaded from the BAliBASE database.
Unbiased, scalable sampling of protein loop conformations from probabilistic priors.
Zhang, Yajia; Hauser, Kris
2013-01-01
Protein loops are flexible structures that are intimately tied to function, but understanding loop motion and generating loop conformation ensembles remain significant computational challenges. Discrete search techniques scale poorly to large loops, optimization and molecular dynamics techniques are prone to local minima, and inverse kinematics techniques can only incorporate structural preferences in an ad hoc fashion. This paper presents Sub-Loop Inverse Kinematics Monte Carlo (SLIKMC), a new Markov chain Monte Carlo algorithm for generating conformations of closed loops according to experimentally available, heterogeneous structural preferences. Our simulation experiments demonstrate that the method computes high-scoring conformations of large loops (>10 residues) orders of magnitude faster than standard Monte Carlo and discrete search techniques. Two new developments contribute to the scalability of the new method. First, structural preferences are specified via a probabilistic graphical model (PGM) that links conformation variables, spatial variables (e.g., atom positions), constraints and prior information in a unified framework. The method uses a sparse PGM that exploits locality of interactions between atoms and residues. Second, a novel method for sampling sub-loops is developed to generate statistically unbiased samples of probability densities restricted by loop-closure constraints. Numerical experiments confirm that SLIKMC generates conformation ensembles that are statistically consistent with specified structural preferences. Protein conformations with 100+ residues are sampled on standard PC hardware in seconds. Application to proteins involved in ion-binding demonstrates its potential as a tool for loop ensemble generation and missing structure completion.
Nowakowska, Marzena
2017-04-01
The development of a Bayesian logistic regression model classifying road accident severity is discussed. The previously exploited informative priors (method of moments, maximum likelihood estimation, and two-stage Bayesian updating), along with an original Boot prior proposal, are investigated for the case where no expert opinion is available. In addition, two possible approaches to updating the priors, in the form of unbalanced and balanced training data sets, are presented. The obtained Bayesian logistic models are assessed on the basis of the deviance information criterion (DIC), highest probability density (HPD) intervals, and coefficients of variation estimated for the model parameters. The verification of model accuracy is based on sensitivity, specificity and the harmonic mean of sensitivity and specificity, all calculated from a test data set. The models obtained from the balanced training data set have better classification quality than those obtained from the unbalanced training data set. The two-stage Bayesian updating prior model and the Boot prior model, both identified with the use of the balanced training data set, outperform the non-informative, method-of-moments, and maximum likelihood estimation prior models. It is important to note that one should be careful when interpreting the parameters, since different priors can lead to different models. Copyright © 2017 Elsevier Ltd. All rights reserved.
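The construction of the "Boot prior" is detailed in the paper, not the abstract; one plausible reading, refitting the model on bootstrap resamples of earlier data and summarizing the coefficient distribution as a normal prior, is sketched below. Both this interpretation and all data are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)

# Invented "earlier period" accident records: features -> severe (1) / slight (0)
n_prior = 400
X_prior = rng.normal(size=(n_prior, 3))          # e.g., speed, lighting, road class
beta_true = np.array([1.0, -0.5, 0.3])
p = 1 / (1 + np.exp(-(X_prior @ beta_true - 0.2)))
y_prior = rng.binomial(1, p)

# Hypothetical "Boot prior": refit on bootstrap resamples of the earlier data
# and summarize the coefficient distribution as a normal prior
boot_coefs = []
for _ in range(200):
    idx = rng.integers(0, n_prior, n_prior)
    fit = LogisticRegression(max_iter=1000).fit(X_prior[idx], y_prior[idx])
    boot_coefs.append(fit.coef_.ravel())
boot_coefs = np.array(boot_coefs)

prior_mean = boot_coefs.mean(axis=0)
prior_sd = boot_coefs.std(axis=0)
for j in range(3):
    print(f"coefficient {j}: informative prior = "
          f"Normal({prior_mean[j]:+.2f}, {prior_sd[j]:.2f})")
```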
Using expert knowledge for test linking.
Bolsinova, Maria; Hoijtink, Herbert; Vermeulen, Jorine Adinda; Béguin, Anton
2017-12-01
Linking and equating procedures are used to make the results of different test forms comparable. In cases where no assumption of randomly equivalent groups can be made, some form of linking design is used. In practice, the amount of data available to link the two tests is often very limited for logistic and security reasons, which affects the precision of linking procedures. This study proposes to enhance the quality of linking procedures based on sparse data by using Bayesian methods which combine the information in the linking data with background information captured in informative prior distributions. We propose two methods for the elicitation of prior knowledge about the difference in difficulty of two tests from subject-matter experts and explain how these results can be used in the specification of priors. To illustrate the proposed methods and evaluate the quality of linking with and without informative priors, an empirical example of linking primary school mathematics tests is presented. The results suggest that informative priors can increase the precision of linking without decreasing the accuracy. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
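The precision gain from an elicited prior can be sketched with a simple precision-weighted combination: several hypothetical expert judgments of the difficulty difference between two test forms define a normal prior, which is then combined with a noisy estimate from sparse linking data. Numbers are invented; the paper's elicitation methods are more elaborate.

```python
import numpy as np

# Hypothetical expert judgments of the difficulty difference (logits)
# between test form A and test form B
expert_judgments = np.array([0.20, 0.35, 0.25, 0.40, 0.30])
prior_mean = expert_judgments.mean()
prior_sd = expert_judgments.std(ddof=1)          # spread as prior uncertainty

# Sparse linking data yields a noisy estimate of the same difference
data_mean, data_se = 0.10, 0.25

# Precision-weighted (conjugate normal) combination
w_p, w_d = 1 / prior_sd**2, 1 / data_se**2
post_var = 1 / (w_p + w_d)
post_mean = post_var * (w_p * prior_mean + w_d * data_mean)
print(f"linking data alone : {data_mean:.3f} (SE {data_se:.3f})")
print(f"with expert prior  : {post_mean:.3f} (SE {np.sqrt(post_var):.3f})")
```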
Exploring Encoding and Retrieval Effects of Background Information on Text Memory
ERIC Educational Resources Information Center
Rawson, Katherine A.; Kintsch, Walter
2004-01-01
Two experiments were conducted (a) to evaluate how providing background information at test may benefit retrieval and (b) to further examine how providing background information prior to study influences encoding. Half of the participants read background information prior to study, and the other half did not. In each group, half were presented…
McCarron, C Elizabeth; Pullenayegum, Eleanor M; Thabane, Lehana; Goeree, Ron; Tarride, Jean-Eric
2013-04-01
Bayesian methods have been proposed as a way of synthesizing all available evidence to inform decision making. However, few practical applications of the use of Bayesian methods for combining patient-level data (i.e., trial) with additional evidence (e.g., literature) exist in the cost-effectiveness literature. The objective of this study was to compare a Bayesian cost-effectiveness analysis using informative priors to a standard non-Bayesian nonparametric method to assess the impact of incorporating additional information into a cost-effectiveness analysis. Patient-level data from a previously published nonrandomized study were analyzed using traditional nonparametric bootstrap techniques and bivariate normal Bayesian models with vague and informative priors. Two different types of informative priors were considered to reflect different valuations of the additional evidence relative to the patient-level data (i.e., "face value" and "skeptical"). The impact of using different distributions and valuations was assessed in a sensitivity analysis. Models were compared in terms of incremental net monetary benefit (INMB) and cost-effectiveness acceptability frontiers (CEAFs). The bootstrapping and Bayesian analyses using vague priors provided similar results. The most pronounced impact of incorporating the informative priors was the increase in estimated life years in the control arm relative to what was observed in the patient-level data alone. Consequently, the incremental difference in life years originally observed in the patient-level data was reduced, and the INMB and CEAF changed accordingly. The results of this study demonstrate the potential impact and importance of incorporating additional information into an analysis of patient-level data, suggesting this could alter decisions as to whether a treatment should be adopted and whether more information should be acquired.
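The INMB and acceptability summaries used here have standard definitions: INMB(lambda) = lambda * (incremental effect) - (incremental cost), and the acceptability curve is the posterior probability that INMB is positive across willingness-to-pay values. The sketch below computes both from hypothetical posterior draws; it is not the study's analysis, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical posterior draws of incremental effects (life years) and costs ($)
n_draws = 10000
d_effect = rng.normal(0.05, 0.03, n_draws)
d_cost = rng.normal(1500.0, 600.0, n_draws)

for lam in [10000, 30000, 50000]:           # willingness-to-pay per unit of effect
    inmb = lam * d_effect - d_cost           # incremental net monetary benefit
    p_ce = np.mean(inmb > 0)                 # posterior prob. of cost-effectiveness
    print(f"lambda=${lam:>6}: mean INMB = {inmb.mean():8.0f}, "
          f"P(cost-effective) = {p_ce:.2f}")
```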
Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.
Fennell, John; Baddeley, Roland
2012-10-01
Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects; overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
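One minimal instantiation of this idea: treat a stated probability p as evidence worth n pseudo-observations and combine it with a Beta(a, b) prior by Bayes' rule; the posterior mean then overweights small probabilities and underweights large ones. The parameter values below are illustrative, not those estimated in the paper.

```python
import numpy as np

def weight(p, a=1.0, b=1.0, n=8.0):
    """Posterior mean of a Beta(a, b) prior updated with n pseudo-observations
       whose empirical frequency equals the stated probability p."""
    return (a + n * p) / (a + b + n)

for p in [0.01, 0.1, 0.5, 0.9, 0.99]:
    print(f"stated p = {p:4.2f}  ->  weighted w(p) = {weight(p):.3f}")
```

With a symmetric prior, w(0.01) comes out near 0.11 and w(0.99) near 0.89: the inverse-S shape of empirical probability weighting falls out of simple prior/evidence combination.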
An improved sampling method of complex network
NASA Astrophysics Data System (ADS)
Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing
2014-12-01
Sampling subnets is an important topic in complex network research, since the sampling method influences the structure and characteristics of the resulting subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It has the ability to explore global information and discover local structure at the same time. The experiments indicate that this novel sampling method can preserve the similarity between the sampled subnet and the original network in degree distribution, connectivity rate, and average shortest path. This method is applicable when prior knowledge about the degree distribution of the original network is insufficient.
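The abstract does not spell out the exact RMSC procedure; the sketch below is a generic snowball sampler with random restarts in the same spirit, using networkx, so the interplay of random seeding (global exploration) and neighborhood expansion (local structure) is visible:

```python
import random
import networkx as nx

# Generic snowball sampling with random restarts, in the spirit of (but not
# identical to) RMSC; the Cohen step of the paper is not reproduced here.
# Assumes target_size does not exceed the number of nodes in G.
def snowball_with_restarts(G, target_size, n_seeds=5, seed=0):
    rng = random.Random(seed)
    sampled = set(rng.sample(list(G.nodes()), n_seeds))  # random global seeds
    frontier = list(sampled)
    while len(sampled) < target_size:
        if not frontier:  # snowball died out: restart from a fresh random node
            restart = rng.choice(sorted(set(G.nodes()) - sampled))
            sampled.add(restart)
            frontier.append(restart)
        node = frontier.pop(0)
        for nbr in G.neighbors(node):  # expand neighborhood (local structure)
            if nbr not in sampled and len(sampled) < target_size:
                sampled.add(nbr)
                frontier.append(nbr)
    return G.subgraph(sampled).copy()

G = nx.barabasi_albert_graph(2000, 3)
sub = snowball_with_restarts(G, target_size=200)
print(sub.number_of_nodes(), "nodes,", sub.number_of_edges(), "edges")
```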
Mergers in health care: avoiding divorce IDS style.
Drazen, E; Kueber, M
1998-08-01
The recent flurry of merger activity in the healthcare industry has given rise to a significant number of integration efforts. Unfortunately, some of these "marriages" will end in "divorce." Reasons for failure can be found in four critical dimensions of integration: structural, operational, clinical, and informational. Each dimension has its associated pitfalls, and every merger confronts clearly identifiable risks. By taking steps to mitigate such risks, merging organizations can improve the chances that the merger will succeed. If the merger does fail, measures taken prior to the merger, such as including an escape clause in the merger contract, can help avoid problems in dividing operational assets, physician practices, and information assets.
Wissel, Tobias; Stüber, Patrick; Wagner, Benjamin; Bruder, Ralf; Schweikard, Achim; Ernst, Floris
2016-04-01
Patient immobilization and X-ray-based imaging provide neither a convenient nor a very accurate way to ensure low repositioning errors or to compensate for motion in cranial radiotherapy. We therefore propose an optical tracking device that exploits subcutaneous structures as landmarks in addition to purely spatial registration. To develop such head tracking algorithms, precise and robust computation of these structures is necessary. Here, we show that the tissue thickness can be predicted with high accuracy and, moreover, exploit local neighborhood information within the laser spot grid on the forehead to further increase this estimation accuracy. We use statistical learning with Support Vector Regression and Gaussian Processes to learn a relationship between optical backscatter features and an MR tissue thickness ground truth. We compare different kernel functions for the data of five different subjects. The incident angle of the laser on the forehead as well as local neighborhoods are incorporated into the feature space. The latter represent the backscatter features from four neighboring laser spots. We confirm that including the incident angle improves the estimation of the tissue thickness. The root-mean-square error falls even below 0.15 mm when adding the complete neighborhood information. This prior knowledge also leads to a smoothing effect on the reconstructed skin patch. Learning between different head poses yields similar results. The partial overlap of the point clouds makes the trade-off between novel information and increased feature space dimension obvious, and hence makes feature selection by, e.g., sequential forward selection necessary.
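A minimal sketch of the regression setup, with synthetic stand-ins for the optical data (the real study regresses MR-derived tissue thickness on measured backscatter, incident angle, and neighboring-spot features; the feature meanings, linear ground truth, and noise level below are invented):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import cross_val_score

# Five features playing the role of backscatter statistics plus incident angle.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = X @ np.array([0.4, -0.2, 0.1, 0.05, 0.3]) + 0.05 * rng.normal(size=300)

for name, model in [("SVR (RBF kernel)", SVR(kernel="rbf", C=10.0)),
                    ("GP  (RBF kernel)", GaussianProcessRegressor(kernel=RBF(),
                                                                  alpha=1e-2))]:
    mse = -cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
    print(f"{name}: cross-validated RMSE = {np.sqrt(mse).mean():.3f}")
```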
In vivo bioluminescence tomography based on multi-view projection and 3D surface reconstruction
NASA Astrophysics Data System (ADS)
Zhang, Shuang; Wang, Kun; Leng, Chengcai; Deng, Kexin; Hu, Yifang; Tian, Jie
2015-03-01
Bioluminescence tomography (BLT) is a powerful optical molecular imaging modality, which enables non-invasive, real-time in vivo imaging as well as 3D quantitative analysis in preclinical studies. In order to solve the inverse problem and reconstruct inner light sources accurately, prior structural information is commonly necessary and is obtained from computed tomography or magnetic resonance imaging. This strategy requires an expensive hybrid imaging system, a complicated operation protocol, and the possible involvement of ionizing radiation. The overall robustness highly depends on the fusion accuracy between the optical and structural information. In this study we present a pure optical bioluminescence tomographic system (POBTS) and a novel BLT method based on multi-view projection acquisition and 3D surface reconstruction. The POBTS acquired a sparse set of white light surface images and bioluminescent images of a mouse. The white light images were applied to an approximate surface model to generate a high quality textured 3D surface reconstruction of the mouse. After that we integrated multi-view luminescent images based on the previous reconstruction, and applied an algorithm to calibrate and quantify the surface luminescent flux in 3D. Finally, the internal bioluminescence source reconstruction was achieved with this prior information. A BALB/c mouse model bearing a breast tumor of 4T1-fLuc cells was used to evaluate the performance of the new system and technique. Compared with the conventional hybrid optical-CT approach using the same inverse reconstruction method, the reconstruction accuracy of this technique was improved: the distance error between the actual and reconstructed internal source was decreased by 0.184 mm.
1997-11-01
studies of business, law, management, the arts and ethics also focus on the nature and use of argument (Toulmin, Rieke, & Janik, 1984). They provide...another definition of argument and a graphical representation (see Figure 3). Toulmin conceives of arguments as a linked structure of claims (or conclusions...conditions... [Figure 3: Toulmin's representation of argument.] We have taken prior work by Kuhn and Toulmin
Olafsson, Kristinn; Pampoulie, Christophe; Hjorleifsdottir, Sigridur; Gudjonsson, Sigurdur; Hreggvidsson, Gudmundur O.
2014-01-01
Due to an improved understanding of past climatological conditions, it has now become possible to study the potential concordance between former climatological models and present-day genetic structure. Genetic variability was assessed in 26 samples from different rivers of Atlantic salmon in Iceland (total of 2,352 individuals), using 15 microsatellite loci. F-statistics revealed significant differences between the majority of the populations that were sampled. Bayesian cluster analyses using both prior information and no prior information on sampling location revealed the presence of two distinguishable genetic pools - namely, the Northern (Group 1) and Southern (Group 2) regions of Iceland. Furthermore, the random permutation of different allele sizes among allelic states revealed a significant mutational component to the genetic differentiation at four microsatellite loci (SsaD144, Ssa171, SSsp2201 and SsaF3), and supported the proposition of a historical origin behind the observed variation. The estimated time of divergence, using two different ABC methods, suggested that the observed genetic pattern originated between the Last Glacial Maximum and the Younger Dryas, which serves as additional evidence of the relative immaturity of Icelandic fish populations, on account of the re-colonisation of this young environment following the Last Glacial Maximum. Additional analyses suggested the presence of several genetic entities which were likely to originate from the original groups detected. PMID:24498283
The value of image coregistration during stereotactic radiosurgery.
Koga, T; Maruyama, K; Igaki, H; Tago, M; Saito, N
2009-05-01
Coregistration of any neuroimaging studies into treatment planning for stereotactic radiosurgery became easily applicable using the Leksell Gamma Knife 4C, a new model of gamma knife. The authors investigated the advantage of this image processing. Since installation of the Leksell Gamma Knife 4C at the authors' institute, 180 sessions of radiosurgery were performed. Before completion of planning, coregistration of frameless images of other modalities or previous images was considered to refine planning. Treatment parameters were compared for planning before and after refinement by use of coregistered images. Coregistered computed tomography clarified the anatomical structures indistinct on magnetic resonance imaging. Positron emission tomography visualized lesions disclosing metabolically high activity. Coregistration of prior imaging distinguished progressing lesions from stable ones. Diffusion-tensor tractography was integrated for lesions adjacent to the corticospinal tract or the optic radiation. After refinement of planning in 36 sessions, excess treated volume decreased (p = 0.0062) and Paddick conformity index improved (p < 0.001). Maximal dose to the white matter tracts was decreased (p < 0.001). Image coregistration provided direct information on anatomy, metabolic activity, chronological changes, and adjacent critical structures. This gathered information was sufficiently informative during treatment planning to supplement ambiguous information on stereotactic images, and was useful especially in reducing irradiation to surrounding normal structures.
[Inferential evaluation of intimacy based on observation of interpersonal communication].
Kimura, Masanori
2015-06-01
How do people inferentially evaluate others' levels of intimacy with friends? We examined the inferential evaluation of intimacy based on the observation of interpersonal communication. In Experiment 1, participants (N = 41) responded to questions after observing conversations between friends. Results indicated that participants inferentially evaluated not only goodness of communication, but also intimacy between friends, using an expressivity heuristic approach. In Experiment 2, we investigated how inferential evaluation of intimacy was affected by prior information about relationships and by individual differences in face-to-face interactional ability. Participants (N = 64) were divided into prior- and no-prior-information groups and all performed the same task as in Experiment 1. Additionally, their interactional ability was assessed. In the prior-information group, individual differences had no effect on inferential evaluation of intimacy. On the other hand, in the no-prior-information group, face-to-face interactional ability partially influenced evaluations of intimacy. Finally, we discuss the fact that to understand one's social environment, it is important to observe others' interpersonal communications.
Objective Bayesian analysis of neutrino masses and hierarchy
NASA Astrophysics Data System (ADS)
Heavens, Alan F.; Sellentin, Elena
2018-04-01
Given the precision of current neutrino data, priors still noticeably impact the constraints on neutrino masses and their hierarchy. To avoid our understanding of neutrinos being driven by prior assumptions, we construct a prior that is mathematically minimally informative. Using the constructed uninformative prior, we find that the normal hierarchy is favoured, but with inconclusive posterior odds of 5.1:1. Better data are hence needed before the neutrino masses and their hierarchy can be well constrained. We find that the next decade of cosmological data should provide conclusive evidence if the normal hierarchy with negligible minimum mass is correct and the uncertainty in the sum of neutrino masses drops below 0.025 eV. On the other hand, if neutrinos obey the inverted hierarchy, achieving strong evidence will be difficult with the same uncertainties. Our uninformative prior was constructed from principles of the Objective Bayesian approach. The prior is called a reference prior and is minimally informative in the specific sense that the information gain after collection of data is maximised. The prior is computed for the combination of neutrino oscillation data and cosmological data and still applies if the data improve.
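For orientation, the standard reference-prior construction alluded to here (stated in its textbook form, not taken from the paper) chooses the prior that maximizes the expected information gain, i.e. the mutual information between parameter and data,

$$\pi^{*} = \arg\max_{\pi} I(\Theta; Y) = \arg\max_{\pi} \int \pi(\theta) \int p(y \mid \theta)\, \log\frac{p(\theta \mid y)}{\pi(\theta)}\, dy\, d\theta,$$

which in the regular one-parameter case reduces to the Jeffreys prior $\pi(\theta) \propto \sqrt{\mathcal{I}(\theta)}$, with $\mathcal{I}(\theta)$ the Fisher information.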
GWM-VI: groundwater management with parallel processing for multiple MODFLOW versions
Banta, Edward R.; Ahlfeld, David P.
2013-01-01
Groundwater Management–Version Independent (GWM–VI) is a new version of the Groundwater Management Process of MODFLOW. The Groundwater Management Process couples groundwater-flow simulation with a capability to optimize stresses on the simulated aquifer based on an objective function and constraints imposed on stresses and aquifer state. GWM–VI extends prior versions of Groundwater Management in two significant ways—(1) it can be used with any version of MODFLOW that meets certain requirements on input and output, and (2) it is structured to allow parallel processing of the repeated runs of the MODFLOW model that are required to solve the optimization problem. GWM–VI uses the same input structure for files that describe the management problem as that used by prior versions of Groundwater Management. GWM–VI requires only minor changes to the input files used by the MODFLOW model. GWM–VI uses the Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER-API) to implement both version independence and parallel processing. GWM–VI communicates with the MODFLOW model by manipulating certain input files and interpreting results from the MODFLOW listing file and binary output files. Nearly all capabilities of prior versions of Groundwater Management are available in GWM–VI. GWM–VI has been tested with MODFLOW-2005, MODFLOW-NWT (a Newton formulation for MODFLOW-2005), MF2005-FMP2 (the Farm Process for MODFLOW-2005), SEAWAT, and CFP (Conduit Flow Process for MODFLOW-2005). This report provides sample problems that demonstrate a range of applications of GWM–VI and the directory structure and input information required to use the parallel-processing capability.
HEDEA: A Python Tool for Extracting and Analysing Semi-structured Information from Medical Records
Aggarwal, Anshul; Garhwal, Sunita; Kumar, Ajay
2018-01-01
Objectives: One of the most important functions for a medical practitioner while treating a patient is to study the patient's complete medical history by going through all records, from test results to doctor's notes. With the increasing use of technology in medicine, these records are mostly digital, alleviating the problem of looking through a stack of papers, which are easily misplaced, but some of these are in an unstructured form. Large parts of clinical reports are in written text form and are tedious to use directly without appropriate pre-processing. In medical research, such health records may be a good, convenient source of medical data; however, lack of structure means that the data is unfit for statistical evaluation. In this paper, we introduce a system to extract, store, retrieve, and analyse information from health records, with a focus on the Indian healthcare scene. Methods: A Python-based tool, Healthcare Data Extraction and Analysis (HEDEA), has been designed to extract structured information from various medical records using a regular expression-based approach. Results: The HEDEA system is working, covering a large set of formats, to extract and analyse health information. Conclusions: This tool can be used to generate analysis reports and charts using the central database. This information is only provided after prior approval has been received from the patient for medical research purposes. PMID:29770248
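As a minimal sketch of the regular-expression approach (the record layout and field names below are hypothetical, not HEDEA's actual formats):

```python
import re

# Hypothetical semi-structured record; real reports vary widely in layout.
RECORD = """Patient Name: A. Sharma
Age: 54 Sex: M
Hb: 11.2 g/dL  WBC: 7800 /cumm
Impression: Type 2 diabetes mellitus, well controlled."""

PATTERNS = {
    "name": re.compile(r"Patient Name:\s*(.+)"),
    "age": re.compile(r"Age:\s*(\d+)"),
    "hemoglobin_g_dl": re.compile(r"Hb:\s*([\d.]+)\s*g/dL"),
    "impression": re.compile(r"Impression:\s*(.+)"),
}

def extract(record):
    """Return a dict of structured fields, None where a pattern has no match."""
    out = {}
    for field, pat in PATTERNS.items():
        m = pat.search(record)
        out[field] = m.group(1).strip() if m else None
    return out

print(extract(RECORD))
```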
The Critical Role of Retrieval Processes in Release from Proactive Interference
ERIC Educational Resources Information Center
Bauml, Karl-Heinz T.; Kliegl, Oliver
2013-01-01
Proactive interference (PI) refers to the finding that memory for recently studied (target) information can be vastly impaired by the previous study of other (nontarget) information. PI can be reduced in a number of ways, for instance, by directed forgetting of the prior nontarget information, the testing of the prior nontarget information, or an…
On estimating the accuracy of monitoring methods using Bayesian error propagation technique
NASA Astrophysics Data System (ADS)
Zonta, Daniele; Bruschetta, Federico; Cappello, Carlo; Zandonini, R.; Pozzi, Matteo; Wang, Ming; Glisic, B.; Inaudi, D.; Posenato, D.; Zhao, Y.
2014-04-01
This paper illustrates an application of Bayesian logic to monitoring data analysis and structural condition state inference. The case study is a 260 m long cable-stayed bridge spanning the Adige River 10 km north of the town of Trento, Italy. This is a statically indeterminate structure with a composite steel-concrete deck, supported by 12 stay cables. Structural redundancy, possible relaxation losses, and an as-built condition differing from design suggest that long-term load redistribution between cables can be expected. To monitor load redistribution, the owner decided to install a monitoring system which combines built-on-site elasto-magnetic (EM) and fiber-optic (FOS) sensors. In this note, we discuss a rational way to improve the accuracy of the load estimate from the EM sensors by taking advantage of the FOS information. More specifically, we use a multi-sensor Bayesian data fusion approach which combines the information from the two sensing systems with the prior knowledge, including design information and the outcomes of laboratory calibration. Using the data acquired to date, we demonstrate that combining the two measurements allows a more accurate estimate of the cable load, to better than 50 kN.
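The essence of the fusion step can be sketched with a linear-Gaussian toy model: independent Gaussian information sources about the same cable load combine by adding precisions. The numbers below are invented, and the paper's full approach also folds in calibration outcomes and design knowledge:

```python
import numpy as np

# Minimal Gaussian data-fusion sketch: posterior precision is the sum of the
# individual precisions; the posterior mean is the precision-weighted mean.
def fuse(means, sds):
    prec = 1.0 / np.asarray(sds, float) ** 2
    mu = np.sum(prec * np.asarray(means, float)) / prec.sum()
    return mu, 1.0 / np.sqrt(prec.sum())

# design prior (kN), elasto-magnetic reading, fiber-optic-derived estimate
mu, sd = fuse(means=[2000.0, 2150.0, 2080.0], sds=[300.0, 120.0, 60.0])
print(f"fused cable load: {mu:.0f} kN +/- {sd:.0f} kN")
```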
The association of hospital governance with innovation in Taiwan.
Yang, Chen-Wei; Yan, Yu-Hua; Fang, Shih-Chieh; Inamdar, Syeda Noorein; Lin, Hsien-Cheng
2018-01-01
Hospitals in Taiwan are facing major changes and innovation is increasingly becoming a critical factor for remaining competitive. One determinant that can have a significant impact on innovation is hospital governance. However, there is limited prior research on the relationship between hospital governance and innovation. The purpose of this study is to propose a conceptual framework to hypothesize the relationship between governance mechanisms and innovation and to empirically test the hypotheses in hospital organizations. We examine the relationship between governance mechanisms and innovation using data on 102 hospitals in Taiwan from the Taiwan Joint Commission on Hospital Accreditation and Quality Improvement. We model governance mechanisms using board structure, information transparency and strategic decision-making processes. For our modeling and data analysis we use measurement and structural models. We find that in hospital governance, information transparency and strategic decision making did impact innovation. However, governance structure did not. To facilitate innovation, hospital boards can increase information transparency and improve the decision-making process when considering strategic investments in innovative initiatives. To remain competitive, hospital boards need to develop and monitor indices that measure hospital innovation to ensure ongoing progress. Copyright © 2017 John Wiley & Sons, Ltd.
Estimating the Uncertain Mathematical Structure of Hydrological Model via Bayesian Data Assimilation
NASA Astrophysics Data System (ADS)
Bulygina, N.; Gupta, H.; O'Donell, G.; Wheater, H.
2008-12-01
The structure of a hydrological model at the macro scale (e.g., watershed) is inherently uncertain due to many factors, including the lack of a robust hydrological theory at that scale. In this work, we assume that a suitable conceptual model for the hydrologic system has already been determined - i.e., the system boundaries have been specified, the important state variables and input and output fluxes to be included have been selected, and the major hydrological processes and the geometries of their interconnections have been identified. The structural identification problem then is to specify the mathematical form of the relationships between the inputs, state variables and outputs, so that a computational model can be constructed for making simulations and/or predictions of system input-state-output behaviour. We show how Bayesian data assimilation can be used to merge prior beliefs in the form of pre-assumed model equations with information derived from the data to construct a posterior model. The approach, entitled Bayesian Estimation of Structure (BESt), is used to estimate a hydrological model for a small basin in England, at hourly time scales, conditioned on the assumption of a three-dimensional state (soil moisture storage, fast and slow flow stores) conceptual model structure. Inputs to the system are precipitation and potential evapotranspiration, and outputs are actual evapotranspiration and streamflow discharge. Results show the difference between prior and posterior mathematical structures, and provide prediction confidence intervals that reflect three types of uncertainty: due to initial conditions, due to input, and due to mathematical structure.
Karvelis, Povilas; Seitz, Aaron R; Lawrie, Stephen M; Seriès, Peggy
2018-05-14
Recent theories propose that schizophrenia/schizotypy and autistic spectrum disorder (ASD) are related to impairments in Bayesian inference, that is, how the brain integrates sensory information (likelihoods) with prior knowledge. However, existing accounts fail to clarify: (i) how proposed theories differ in their accounts of ASD vs. schizophrenia and (ii) whether the impairments result from weaker priors or enhanced likelihoods. Here, we directly address these issues by characterizing how 91 healthy participants, scored for autistic and schizotypal traits, implicitly learned and combined priors with sensory information. This was accomplished through a visual statistical learning paradigm designed to quantitatively assess variations in individuals' likelihoods and priors. The acquisition of the priors was found to be intact along both trait spectra. However, autistic traits were associated with more veridical perception and weaker influence of expectations. Bayesian modeling revealed that this was due, not to weaker prior expectations, but to more precise sensory representations. © 2018, Karvelis et al.
Rapid and reliable protein structure determination via chemical shift threading.
Hafsa, Noor E; Berjanskii, Mark V; Arndt, David; Wishart, David S
2018-01-01
Protein structure determination using nuclear magnetic resonance (NMR) spectroscopy can be both time-consuming and labor intensive. Here we demonstrate how chemical shift threading can permit rapid, robust, and accurate protein structure determination using only chemical shift data. Threading is a relatively old bioinformatics technique that uses a combination of sequence information and predicted (or experimentally acquired) low-resolution structural data to generate high-resolution 3D protein structures. The key motivations behind using NMR chemical shifts for protein threading lie in the fact that they are easy to measure, they are available prior to 3D structure determination, and they contain vital structural information. The method we have developed uses not only sequence and chemical shift similarity but also chemical shift-derived secondary structure, shift-derived super-secondary structure, and shift-derived accessible surface area to generate a high quality protein structure regardless of the sequence similarity (or lack thereof) to a known structure already in the PDB. The method (called E-Thrifty) was found to be very fast (often < 10 min/structure) and to significantly outperform other shift-based or threading-based structure determination methods in terms of top template model accuracy, with an average TM-score of 0.68 (vs. 0.50-0.62 for other methods). Coupled with recent developments in chemical shift refinement, these results suggest that protein structure determination using only NMR chemical shifts is becoming increasingly practical and reliable. E-Thrifty is available as a web server at http://ethrifty.ca.
Structure Learning in Bayesian Sensorimotor Integration
Genewein, Tim; Hez, Eduard; Razzaghpanah, Zeynab; Braun, Daniel A.
2015-01-01
Previous studies have shown that sensorimotor processing can often be described by Bayesian learning, in particular the integration of prior and feedback information depending on its degree of reliability. Here we test the hypothesis that the integration process itself can be tuned to the statistical structure of the environment. We exposed human participants to a reaching task in a three-dimensional virtual reality environment where we could displace the visual feedback of their hand position in a two-dimensional plane. When introducing statistical structure between the two dimensions of the displacement, we found that over the course of several days participants adapted their feedback integration process in order to exploit this structure for performance improvement. In control experiments we found that this adaptation process critically depended on performance feedback and could not be induced by verbal instructions. Our results suggest that structural learning is an important meta-learning component of Bayesian sensorimotor integration. PMID:26305797
Exposure to asbestos: psychological responses of mesothelioma patients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lebovits, A.H.; Chahinian, A.P.; Holland, J.C.
1983-01-01
Thirty-eight patients with a diagnosis of malignant mesothelioma participated in a semi-structured interview to evaluate asbestos exposure, acquisition of increased risk information, and retrospective reporting of cognitive and behavioral reactions (particularly smoking behavior) to risk information. Twenty-eight patients (74%) had direct occupational contact with asbestos, and six patients (16%) reported indirect nonoccupational exposure to asbestos. Only two (10%) of the directly exposed patients acquired risk information from professional sources prior to diagnosis of mesothelioma. The most frequently reported reaction to learning of increased risk of cancer was a denial of the risk by minimizing personal exposure. Few patients reported being concerned about the information of increased risk. Smoking behavior did not change as a result of risk information, nor was there any increase in visits to physicians. Guidelines for psychosocial management of at-risk groups are recommended.
Cortical plasticity as a mechanism for storing Bayesian priors in sensory perception.
Köver, Hania; Bao, Shaowen
2010-05-05
Human perception of ambiguous sensory signals is biased by prior experiences. It is not known how such prior information is encoded, retrieved and combined with sensory information by neurons. Previous authors have suggested dynamic encoding mechanisms for prior information, whereby top-down modulation of firing patterns on a trial-by-trial basis creates short-term representations of priors. Although such a mechanism may well account for perceptual bias arising in the short-term, it does not account for the often irreversible and robust changes in perception that result from long-term, developmental experience. Based on the finding that more frequently experienced stimuli gain greater representations in sensory cortices during development, we reasoned that prior information could be stored in the size of cortical sensory representations. For the case of auditory perception, we use a computational model to show that prior information about sound frequency distributions may be stored in the size of primary auditory cortex frequency representations, read-out by elevated baseline activity in all neurons and combined with sensory-evoked activity to generate a perception that conforms to Bayesian integration theory. Our results suggest an alternative neural mechanism for experience-induced long-term perceptual bias in the context of auditory perception. They make the testable prediction that the extent of such perceptual prior bias is modulated by both the degree of cortical reorganization and the magnitude of spontaneous activity in primary auditory cortex. Given that cortical over-representation of frequently experienced stimuli, as well as perceptual bias towards such stimuli is a common phenomenon across sensory modalities, our model may generalize to sensory perception, rather than being specific to auditory perception.
Cooley, Richard L.
1983-01-01
This paper investigates factors influencing the degree of improvement in estimates of parameters of a nonlinear regression groundwater flow model by incorporating prior information of unknown reliability. Consideration of expected behavior of the regression solutions and results of a hypothetical modeling problem lead to several general conclusions. First, if the parameters are properly scaled, linearized expressions for the mean square error (MSE) in parameter estimates of a nonlinear model will often behave very nearly as if the model were linear. Second, by using prior information, the MSE in properly scaled parameters can be reduced greatly over the MSE of ordinary least squares estimates of parameters. Third, plots of estimated MSE and the estimated standard deviation of MSE versus an auxiliary parameter (the ridge parameter) specifying the degree of influence of the prior information on regression results can help determine the potential for improvement of parameter estimates. Fourth, proposed criteria can be used to make appropriate choices for the ridge parameter and another parameter expressing degree of overall bias in the prior information. Results of a case study of Truckee Meadows, Reno-Sparks area, Washoe County, Nevada, conform closely to the results of the hypothetical problem. In the Truckee Meadows case, incorporation of prior information did not greatly change the parameter estimates from those obtained by ordinary least squares. However, the analysis showed that both sets of estimates are more reliable than suggested by the standard errors from ordinary least squares.
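The ridge-parameter diagnostic described above can be mimicked on a generic ill-conditioned regression (a toy stand-in for the groundwater model; the design matrix, the true parameters, and the deliberately biased prior are all invented):

```python
import numpy as np

# Prior-information (ridge-type) estimator: solve (X'X + kI) b = X'y + k*prior.
# Sweeping the ridge parameter k shows how biased prior information can still
# reduce the mean square error of poorly determined parameters.
rng = np.random.default_rng(3)
n, p = 40, 5
X = rng.normal(size=(n, p)) @ np.diag([1, 1, 1, 0.05, 0.01])  # ill-conditioned
beta = np.ones(p)                                             # "true" parameters
y = X @ beta + 0.1 * rng.normal(size=n)
prior = np.zeros(p)                                           # biased prior info

for k in [0.0, 0.01, 0.1, 1.0]:
    b = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y + k * prior)
    print(f"ridge k = {k:5.2f}: MSE to truth = {np.mean((b - beta) ** 2):.4f}")
```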
[Patient information prior to sterilization].
Rasmussen, O V; Henriksen, L O; Baldur, B; Hansen, T
1992-09-14
The law in Denmark prescribes that the patient and the general practitioner to whom the patient directs his or her request for sterilization are obliged to confirm by their signatures that the patient has received information about sterilization, its risks and consequences. We asked 97 men and 96 women whether they had received this information prior to their sterilization. They were also asked about their knowledge of sterilization. 54% of the women and 35% of the men indicated that they had not received information. Only few of these wished further information from the hospital doctor. Knowledge about sterilization was good. It is concluded that the information given to patients prior to sterilization is far from optimal. The patient's signature confirming verbal information is not a sufficient safeguard. We recommend, among other things, that the patient should receive written information and that both the general practitioner and the hospital responsible for the operation should ensure that optimal information is received by the patient.
Regression analysis using dependent Polya trees.
Schörgendorfer, Angela; Branscum, Adam J
2013-11-30
Many commonly used models for linear regression analysis force overly simplistic shape and scale constraints on the residual structure of data. We propose a semiparametric Bayesian model for regression analysis that produces data-driven inference by using a new type of dependent Polya tree prior to model arbitrary residual distributions that are allowed to evolve across increasing levels of an ordinal covariate (e.g., time, in repeated measurement studies). By modeling residual distributions at consecutive covariate levels or time points using separate, but dependent Polya tree priors, distributional information is pooled while allowing for broad pliability to accommodate many types of changing residual distributions. We can use the proposed dependent residual structure in a wide range of regression settings, including fixed-effects and mixed-effects linear and nonlinear models for cross-sectional, prospective, and repeated measurement data. A simulation study illustrates the flexibility of our novel semiparametric regression model to accurately capture evolving residual distributions. In an application to immune development data on immunoglobulin G antibodies in children, our new model outperforms several contemporary semiparametric regression models based on a predictive model selection criterion. Copyright © 2013 John Wiley & Sons, Ltd.
Making connections: Listening to visitor conversations at different styles of sea jelly exhibits
NASA Astrophysics Data System (ADS)
Galvan, Tamara M.
This study sought to determine what types of connections to prior experiences and knowledge were being made at two different styles of exhibits focusing on sea jellies. Family groups, consisting of one or two adults with one or two children aged 6-11, were audio recorded and tracked as they visited a view-only or touch-pool sea jelly exhibit. A short interview was given after their visit to the sea jelly exhibit. The discourse from the exhibit and survey was coded for types of learning talk. Coding was also done to determine the inspiration for the connection and the subject of the connection (structural or behavioral). Visitors made connections regardless of the sea jelly exhibit design, and results showed no differences in the type or frequency of the connections made. However, visitors were more likely to make connections on the subject of sea jelly structure at the view-only exhibit. Many of the connections, regardless of subject or inspiration, were metaphoric connections, demonstrating the importance of metaphors for making prior experience connections. Findings provide useful information for future aquarium practice.
Confidence set inference with a prior quadratic bound
NASA Technical Reports Server (NTRS)
Backus, George E.
1989-01-01
In the uniqueness part of a geophysical inverse problem, the observer wants to predict all likely values of $P$ unknown numerical properties $z = (z_1, \ldots, z_P)$ of the earth from measurement of $D$ other numerical properties $y^0 = (y_1^0, \ldots, y_D^0)$, using full or partial knowledge of the statistical distribution of the random errors in $y^0$. The data space $Y$ containing $y^0$ is $D$-dimensional, so when the model space $X$ is infinite-dimensional the linear uniqueness problem usually is insoluble without prior information about the correct earth model $x$. If that information is a quadratic bound on $x$, Bayesian inference (BI) and stochastic inversion (SI) inject spurious structure into $x$, implied by neither the data nor the quadratic bound. Confidence set inference (CSI) provides an alternative inversion technique free of this objection. Confidence set inference is illustrated in the problem of estimating the geomagnetic field $B$ at the core-mantle boundary (CMB) from components of $B$ measured on or above the earth's surface.
Iterative Region-of-Interest Reconstruction from Limited Data Using Prior Information
NASA Astrophysics Data System (ADS)
Vogelgesang, Jonas; Schorr, Christian
2017-12-01
In practice, computed tomography and computed laminography applications suffer from incomplete data. In particular, when inspecting large objects with extremely different diameters in longitudinal and transversal directions or when high resolution reconstructions are desired, the physical conditions of the scanning system lead to restricted data and truncated projections, also known as the interior or region-of-interest (ROI) problem. To recover the searched-for density function of the inspected object, we derive a semi-discrete model of the ROI problem that inherently allows the incorporation of geometrical prior information in an abstract Hilbert space setting for bounded linear operators. Assuming that the attenuation inside the object is approximately constant, as for fibre reinforced plastics parts or homogeneous objects where one is interested in locating defects like cracks or porosities, we apply the semi-discrete Landweber-Kaczmarz method to recover the inner structure of the object inside the ROI from the measured data resulting in a semi-discrete iteration method. Finally, numerical experiments for three-dimensional tomographic applications with both an inherent restricted source and ROI problem are provided to verify the proposed method for the ROI reconstruction.
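A minimal discrete Landweber iteration conveys the flavor of the reconstruction step (the paper's semi-discrete Landweber-Kaczmarz scheme reduces to something of this form after full discretization; the operator, relaxation parameter, and stopping rule below are illustrative):

```python
import numpy as np

# Landweber iteration for a discretized linear problem A x = y: a fixed-point
# gradient descent on ||Ax - y||^2 with relaxation chosen for convergence.
def landweber(A, y, n_iter=500, omega=None, x0=None):
    if omega is None:
        omega = 1.0 / np.linalg.norm(A, 2) ** 2   # 0 < omega < 2/||A||^2
    x = np.zeros(A.shape[1]) if x0 is None else x0.copy()
    for _ in range(n_iter):                        # in practice: stop early
        x = x + omega * A.T @ (y - A @ x)          # (discrepancy principle)
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(80, 60))                      # toy restricted-data operator
x_true = rng.normal(size=60)
y = A @ x_true + 0.01 * rng.normal(size=80)
x_rec = landweber(A, y)
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```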
Cooley, Richard L.
1993-01-01
Calibration data (observed values corresponding to model-computed values of dependent variables) are incorporated into a general method of computing exact Scheffé-type confidence intervals analogous to the confidence intervals developed in part 1 (Cooley, this issue) for a function of parameters derived from a groundwater flow model. Parameter uncertainty is specified by a distribution of parameters conditioned on the calibration data. This distribution was obtained as a posterior distribution by applying Bayes' theorem to the hydrogeologically derived prior distribution of parameters from part 1 and a distribution of differences between the calibration data and corresponding model-computed dependent variables. Tests show that the new confidence intervals can be much smaller than the intervals of part 1 because the prior parameter variance-covariance structure is altered so that combinations of parameters that give poor model fit to the data are unlikely. The confidence intervals of part 1 and the new confidence intervals can be effectively employed in a sequential method of model construction whereby new information is used to reduce confidence interval widths at each stage.
NASA Astrophysics Data System (ADS)
Parrott, Annette M.
Problem. Science teachers are charged with preparing students to become scientifically literate individuals. Teachers are given curriculum that specifies the knowledge that students should come away with; however, they are not necessarily aware of the knowledge with which the student arrives or how best to help them navigate between the two knowledge states. Educators must be aware not only of where their students are conceptually, but of how their students move from their prior knowledge and naive theories to scientifically acceptable theories. The understanding of how students navigate this course has the potential to revolutionize educational practices. Methods. This study explored how five 9th grade biology students reconstructed their cognitive frameworks and navigated conceptual change from prior conception to consensual genetics knowledge. The research questions investigated were: (1) how do students in the process of changing their naive science theories to accepted science theories describe their journey from prior knowledge to current conception, and (2) what are the methods that students utilize to bridge the gap between alternate and consensual science conceptions to effect conceptual change. Qualitative and quantitative methods were employed to gather and analyze the data. In-depth, semi-structured interviews formed the primary data for probing the context and details of students' conceptual change experience. Primary interview data were coded by thematic analysis. Results and discussion. This study revealed information about students' perceived roles in learning, the role of articulation in the conceptual change process, and ways in which a community of learners aids conceptual change. It was ascertained that students see their role in learning primarily as repeating information until they can add that information to their knowledge. Students are more likely to consider challenges to their conceptual frameworks and be more motivated to become active participants in constructing their knowledge when they are working collaboratively with peers instead of receiving instruction from their teacher. Articulation was found to be instrumental in aiding learners in identifying their alternate conceptions as well as in revisiting, investigating and reconstructing their conceptual frameworks. Based on the assumptions generated, suggestions were offered to inform pedagogical practice in support of the conceptual change process.
An Intervention and Assessment to Improve Information Literacy
ERIC Educational Resources Information Center
Scharf, Davida
2013-01-01
Purpose: The goal of the study was to test an intervention using a brief essay as an instrument for evaluating higher-order information literacy skills in college students, while accounting for prior conditions such as socioeconomic status and prior academic achievement, and identify other predictors of information literacy through an evaluation…
External priors for the next generation of CMB experiments
Manzotti, Alessandro; Dodelson, Scott; Park, Youngsoo
2016-03-28
Planned cosmic microwave background (CMB) experiments can dramatically improve what we know about neutrino physics, inflation, and dark energy. The low level of noise, together with improved angular resolution, will increase the signal to noise of the CMB polarized signal as well as the reconstructed lensing potential of high redshift large scale structure. Projected constraints on cosmological parameters are extremely tight, but these can be improved even further with information from external experiments. Here, we examine quantitatively the extent to which external priors can lead to improvement in projected constraints from a CMB-Stage IV (S4) experiment on neutrino and dark energy properties. We find that CMB S4 constraints on neutrino mass could be strongly enhanced by external constraints on the cold dark matter density $\Omega_c h^2$ and the Hubble constant $H_0$. If polarization on the largest scales ($\ell < 50$) will not be measured, an external prior on the primordial amplitude $A_s$ or the optical depth $\tau$ will also be important. A CMB constraint on the number of relativistic degrees of freedom, $N_{\rm eff}$, will benefit from an external prior on the spectral index $n_s$ and the baryon energy density $\Omega_b h^2$. Lastly, an external prior on $H_0$ will help constrain the dark energy equation of state ($w$).
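Mechanically, an external prior enters such forecasts by adding its inverse variance to the corresponding diagonal entry of the Fisher matrix, which tightens the marginalized errors on all correlated parameters. A toy two-parameter example (the matrix values are invented, not the paper's S4 forecast):

```python
import numpy as np

# Standard Fisher-forecast bookkeeping: a Gaussian prior of width sigma on
# parameter i adds 1/sigma^2 to F[i, i]; marginalized errors come from the
# diagonal of the inverse Fisher matrix.
F = np.array([[4.0, 3.0],
              [3.0, 9.0]])                      # toy CMB Fisher matrix

def marginalized_sigmas(F):
    return np.sqrt(np.diag(np.linalg.inv(F)))

print("no prior:      ", marginalized_sigmas(F))
F_prior = F + np.diag([0.0, 1.0 / 0.5 ** 2])    # external prior, sigma = 0.5
print("with prior:    ", marginalized_sigmas(F_prior))
```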
Akazawa, Naoki; Okawa, Naomi; Kishi, Masaki; Nakatani, Kiyoshi; Nishikawa, Katsuya; Tokumura, Daichi; Matsui, Yuji; Moriyama, Hideki
2016-09-01
The purpose of this study was to examine the effect of long-term self-massage at the musculotendinous junction on hamstring extensibility, stiffness, stretch tolerance, and structural indices. Single-blind, randomized, controlled trial. Laboratory. Thirty-seven healthy men. The right or left leg of each participant was randomly assigned to the massage group, and the other leg was assigned to the control group. The participants conducted self-massage at the musculotendinous junction for 3 min daily, five times per week, for 12 weeks. Hamstring extensibility, stiffness, stretch tolerance, and structural indices were measured by a blinded examiner prior to the massage intervention and after 6 and 12 weeks of intervention. The maximum hip flexion angle (HFA) and the maximum passive pressure after 6 and 12 weeks of intervention in the massage group were significantly higher than prior to intervention. The visual analog scale (for pain perception) at maximum HFA, the stiffness of the hamstring, and the structural indices did not differ in either group over the 12 week period. Our results suggest that long-term self-massage at the musculotendinous junction increases hamstring extensibility by improving stretch tolerance. However, this intervention does not change hamstring stiffness. University Hospital Medical Information Network registration number UMIN000011233. Copyright © 2016 Elsevier Ltd. All rights reserved.
35 mm PHOTOGRAPH TAKEN PRIOR TO DEMOLITION OF STRUCTURE. SOUTH ...
35 mm PHOTOGRAPH TAKEN PRIOR TO DEMOLITION OF STRUCTURE. SOUTH (SIDE) AND EAST (FRONT) ELEVATIONS OF BUILDING. VIEW TO NORTHWEST - Plattsburgh Air Force Base, Gas Station, New York Road, Plattsburgh, Clinton County, NY
On predicting monitoring system effectiveness
NASA Astrophysics Data System (ADS)
Cappello, Carlo; Sigurdardottir, Dorotea; Glisic, Branko; Zonta, Daniele; Pozzi, Matteo
2015-03-01
While the objective of structural design is to achieve stability with an appropriate level of reliability, the design of systems for structural health monitoring is performed to identify a configuration that enables acquisition of data with an appropriate level of accuracy in order to understand the performance of a structure or its condition state. However, a rational standardized approach for monitoring system design is not fully available. Hence, when engineers design a monitoring system, their approach is often heuristic with performance evaluation based on experience, rather than on quantitative analysis. In this contribution, we propose a probabilistic model for the estimation of monitoring system effectiveness based on information available in prior condition, i.e. before acquiring empirical data. The presented model is developed considering the analogy between structural design and monitoring system design. We assume that the effectiveness can be evaluated based on the prediction of the posterior variance or covariance matrix of the state parameters, which we assume to be defined in a continuous space. Since the empirical measurements are not available in prior condition, the estimation of the posterior variance or covariance matrix is performed considering the measurements as a stochastic variable. Moreover, the model takes into account the effects of nuisance parameters, which are stochastic parameters that affect the observations but cannot be estimated using monitoring data. Finally, we present an application of the proposed model to a real structure. The results show how the model enables engineers to predict whether a sensor configuration satisfies the required performance.
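In the linear-Gaussian special case the idea reduces to a closed form: the predicted posterior variance depends only on the prior variance and the candidate sensors' noise variances, so configurations can be ranked before any data exist. A minimal sketch (all numbers invented; the paper's model additionally handles nuisance parameters and treats the future measurements as stochastic):

```python
import math

# Prior-stage effectiveness prediction for a linear-Gaussian model: posterior
# precision = prior precision + sum of sensor precisions, independent of the
# (not yet acquired) data, so candidate layouts can be compared up front.
prior_var = 4.0                    # prior variance of the state parameter
for config, sensor_vars in [("A", [1.0, 0.5, 2.0]), ("B", [2.0, 2.0])]:
    post_prec = 1.0 / prior_var + sum(1.0 / v for v in sensor_vars)
    print(f"configuration {config}: predicted posterior std = "
          f"{math.sqrt(1.0 / post_prec):.3f}")
```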
The SwissLipids knowledgebase for lipid biology
Liechti, Robin; Hyka-Nouspikel, Nevila; Niknejad, Anne; Gleizes, Anne; Götz, Lou; Kuznetsov, Dmitry; David, Fabrice P.A.; van der Goot, F. Gisou; Riezman, Howard; Bougueleret, Lydie; Xenarios, Ioannis; Bridge, Alan
2015-01-01
Motivation: Lipids are a large and diverse group of biological molecules with roles in membrane formation, energy storage and signaling. Cellular lipidomes may contain tens of thousands of structures, a staggering degree of complexity whose significance is not yet fully understood. High-throughput mass spectrometry-based platforms provide a means to study this complexity, but the interpretation of lipidomic data and its integration with prior knowledge of lipid biology suffers from a lack of appropriate tools to manage the data and extract knowledge from it. Results: To facilitate the description and exploration of lipidomic data and its integration with prior biological knowledge, we have developed a knowledge resource for lipids and their biology—SwissLipids. SwissLipids provides curated knowledge of lipid structures and metabolism which is used to generate an in silico library of feasible lipid structures. These are arranged in a hierarchical classification that links mass spectrometry analytical outputs to all possible lipid structures, metabolic reactions and enzymes. SwissLipids provides a reference namespace for lipidomic data publication, data exploration and hypothesis generation. The current version of SwissLipids includes over 244 000 known and theoretically possible lipid structures, over 800 proteins, and curated links to published knowledge from over 620 peer-reviewed publications. We are continually updating the SwissLipids hierarchy with new lipid categories and new expert curated knowledge. Availability: SwissLipids is freely available at http://www.swisslipids.org/. Contact: alan.bridge@isb-sib.ch Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25943471
The Power Prior: Theory and Applications
Ibrahim, Joseph G.; Chen, Ming-Hui; Gwon, Yeongjin; Chen, Fang
2015-01-01
The power prior has been widely used in many applications covering a large number of disciplines. The power prior is intended to be an informative prior constructed from historical data. It has been used in clinical trials, genetics, health care, psychology, environmental health, engineering, economics, and business. It has also been applied for a wide variety of models and settings, both in the experimental design and analysis contexts. In this review article, we give an A to Z exposition of the power prior and its applications to date. We review its theoretical properties, variations in its formulation, statistical contexts for which it has been used, applications, and its advantages over other informative priors. We review models for which it has been used, including generalized linear models, survival models, and random effects models. Statistical areas where the power prior has been used include model selection, experimental design, hierarchical modeling, and conjugate priors. Frequentist properties of power priors in posterior inference are established and a simulation study is conducted to further examine the empirical performance of the posterior estimates with power priors. Real data analyses are given illustrating the power prior as well as the use of the power prior in the Bayesian design of clinical trials. PMID:26346180
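In its basic formulation (Ibrahim and Chen's construction, stated here for reference), the power prior raises the historical-data likelihood to a fractional power and multiplies it by an initial prior:

$$\pi(\theta \mid D_0, a_0) \;\propto\; L(\theta \mid D_0)^{a_0}\, \pi_0(\theta), \qquad 0 \le a_0 \le 1,$$

where $D_0$ denotes the historical data and $a_0$ discounts them: $a_0 = 0$ discards the historical data entirely, while $a_0 = 1$ weights them as heavily as the current data.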
Paudel, M R; Mackenzie, M; Fallone, B G; Rathee, S
2013-08-01
To evaluate the metal artifacts in kilovoltage computed tomography (kVCT) images that are corrected using a normalized metal artifact reduction (NMAR) method with megavoltage CT (MVCT) prior images. Tissue characterization phantoms containing bilateral steel inserts are used in all experiments. Two MVCT images, one without any metal artifact corrections and the other corrected using a modified iterative maximum likelihood polychromatic algorithm for CT (IMPACT), are translated to pseudo-kVCT images. These are then used as prior images without tissue classification in an NMAR technique for correcting the experimental kVCT image. The IMPACT method in MVCT included an additional model for the pair/triplet production process and the energy dependent response of the MVCT detectors. An experimental kVCT image, without the metal inserts and reconstructed using the filtered back projection (FBP) method, is artificially patched with the known steel inserts to get a reference image. The regular NMAR image containing the steel inserts that uses a tissue-classified kVCT prior and the NMAR images reconstructed using MVCT priors are compared with the reference image for metal artifact reduction. The Eclipse treatment planning system is used to calculate radiotherapy dose distributions on the corrected images and on the reference image using the Anisotropic Analytical Algorithm with 6 MV parallel opposed 5×10 cm² fields passing through the bilateral steel inserts, and the results are compared. Gafchromic film is used to measure the actual dose delivered in a plane perpendicular to the beams at the isocenter. The streaking and shading in the NMAR image using tissue classifications are significantly reduced. However, the structures, including metal, are deformed. Some uniform regions appear to have eroded from one side. There is a large variation of attenuation values inside the metal inserts. Similar results are seen in the commercially corrected image. Use of MVCT prior images without tissue classification in NMAR significantly reduces these problems. The radiation dose calculated on the reference image is close to the dose measured using the film. Compared to the reference image, the calculated dose difference in the conventional NMAR image, the corrected images using the uncorrected MVCT image, and the IMPACT-corrected MVCT image as priors is ∼15.5%, ∼5%, and ∼2.7%, respectively, at the isocenter. The deformation and erosion of the structures present in regular NMAR-corrected images can be largely reduced by using MVCT priors without tissue segmentation. Because the attenuation value of metal is incorrect, large dose differences relative to the true value can result when using the conventional NMAR image. This difference can be significantly reduced if MVCT images are used as priors. Reduced tissue deformation, better tissue visualization, and correct information about the electron density of the tissues and metals in the artifact-corrected images could help delineate the structures better, as well as calculate radiation dose more correctly, thus enhancing the quality of radiotherapy treatment planning.
Superposing pure quantum states with partial prior information
NASA Astrophysics Data System (ADS)
Dogra, Shruti; Thomas, George; Ghosh, Sibasish; Suter, Dieter
2018-05-01
The principle of superposition is an intriguing feature of quantum mechanics, which is regularly exploited in many different circumstances. A recent work [M. Oszmaniec et al., Phys. Rev. Lett. 116, 110403 (2016), 10.1103/PhysRevLett.116.110403] shows that the fundamentals of quantum mechanics restrict the process of superimposing two unknown pure states, even though it is possible to superimpose two quantum states with partial prior knowledge. The prior knowledge imposes geometrical constraints on the choice of input states. We discuss an experimentally feasible protocol to superimpose multiple pure states of a d -dimensional quantum system and carry out an explicit experimental realization for two single-qubit pure states with partial prior information on a two-qubit NMR quantum information processor.
Bias in diet determination: incorporating traditional methods in Bayesian mixing models.
Franco-Trecu, Valentina; Drago, Massimiliano; Riet-Sapriza, Federico G; Parnell, Andrew; Frau, Rosina; Inchausti, Pablo
2013-01-01
There are not "universal methods" to determine diet composition of predators. Most traditional methods are biased because of their reliance on differential digestibility and the recovery of hard items. By relying on assimilated food, stable isotope and Bayesian mixing models (SIMMs) resolve many biases of traditional methods. SIMMs can incorporate prior information (i.e. proportional diet composition) that may improve the precision in the estimated dietary composition. However few studies have assessed the performance of traditional methods and SIMMs with and without informative priors to study the predators' diets. Here we compare the diet compositions of the South American fur seal and sea lions obtained by scats analysis and by SIMMs-UP (uninformative priors) and assess whether informative priors (SIMMs-IP) from the scat analysis improved the estimated diet composition compared to SIMMs-UP. According to the SIMM-UP, while pelagic species dominated the fur seal's diet the sea lion's did not have a clear dominance of any prey. In contrast, SIMM-IP's diets compositions were dominated by the same preys as in scat analyses. When prior information influenced SIMMs' estimates, incorporating informative priors improved the precision in the estimated diet composition at the risk of inducing biases in the estimates. If preys isotopic data allow discriminating preys' contributions to diets, informative priors should lead to more precise but unbiased estimated diet composition. Just as estimates of diet composition obtained from traditional methods are critically interpreted because of their biases, care must be exercised when interpreting diet composition obtained by SIMMs-IP. The best approach to obtain a near-complete view of predators' diet composition should involve the simultaneous consideration of different sources of partial evidence (traditional methods, SIMM-UP and SIMM-IP) in the light of natural history of the predator species so as to reliably ascertain and weight the information yielded by each method.
Decline in male circumcision in South Korea.
Kim, DaiSik; Koo, Sung-Ae; Pang, Myung-Geol
2012-12-11
To investigate the changing circumcision rate in South Korea in the last decade and to propose underlying causes for this change, in the context of the present fluctuating worldwide trends in circumcision. From 2009 to 2011, 3,296 South Korean males (or their parents) aged 0-64 years were asked about their circumcision status, their age at circumcision, and their information level regarding circumcision. We employed non-probability sampling in view of the sensitive nature of the questions. Currently, the age-standardized circumcision rate for South Korean males aged 14-29 is found to be 75.8%. In an earlier study performed in 2002, the rate for the same age group was 86.3%. Of particular interest, males aged 14-16 show a circumcision rate of 56.4%, while the same age group 10 years ago displayed a much higher percentage, at 88.4%. In addition, the extraordinarily high circumcision rate of 95.2% found 10 years ago for the 17-19 age group is now reduced to 74.4%. Interestingly, of the circumcised males, the percentage circumcised in the last decade was only 25.2%; i.e., the majority of the currently circumcised males had undergone the operation prior to 2002, indicating that the actual change in the last decade is far greater. Consistent with this conjecture, the 2002 survey showed that the majority of circumcised males (75.7%) had undergone the operation in the decade prior to that point. Focusing on the flagship age group of 14-16, and considering the population structure of Korean males, this drop suggests that approximately one million fewer circumcision operations have been performed in the last decade relative to the case of non-decline. This decline is strongly correlated with the information available through the internet, newspapers, lectures, books, and television: within the circumcised population, both the patients and their parents had little prior knowledge regarding circumcision other than information passed on by word of mouth. Within the uncircumcised population, the prior knowledge was far greater, suggesting that information discouraging circumcision played an important role. South Korean male circumcision is likely to be undergoing a steep decline. The cause of this decline seems to be the increase in available information on the pros and cons of circumcision.
When Generating Answers Benefits Arithmetic Skill: The Importance of Prior Knowledge
ERIC Educational Resources Information Center
Rittle-Johnson, Bethany; Kmicikewycz, Alexander Oleksij
2008-01-01
People remember information better if they generate the information while studying rather than read the information. However, prior research has not investigated whether this generation effect extends to related but unstudied items and has not been conducted in classroom settings. We compared third graders' success on studied and unstudied…
Whitmore, Rebecca; Crooks, Valorie A; Snyder, Jeremy
2015-09-01
This study examines the experiences of informal caregivers in medical tourism through an ethics of care lens. We conducted semi-structured interviews with 20 Canadians who had accompanied their friends or family members abroad for surgery, asking questions that dealt with their experiences prior to, during and after travel. Thematic analysis revealed three themes central to an ethics of care: responsibility, vulnerability and mutuality. Ethics of care theorists have highlighted how care has been historically devalued. We posit that medical tourism reproduces dominant narratives about care in a novel care landscape. Informal care goes unaccounted for by the industry, as it occurs in largely private spaces at a geographic distance from the home countries of medical tourists.
Using texts in science education: cognitive processes and knowledge representation.
van den Broek, Paul
2010-04-23
Texts form a powerful tool in teaching concepts and principles in science. How do readers extract information from a text, and what are the limitations in this process? Central to comprehension of and learning from a text is the construction of a coherent mental representation that integrates the textual information and relevant background knowledge. This representation engenders learning if it expands the reader's existing knowledge base or if it corrects misconceptions in this knowledge base. The Landscape Model captures the reading process and the influences of reader characteristics (such as working-memory capacity, reading goal, prior knowledge, and inferential skills) and text characteristics (such as content/structure of presented information, processing demands, and textual cues). The model suggests factors that can optimize--or jeopardize--learning science from text.
Integrating Informative Priors from Experimental Research with Bayesian Methods
Hamra, Ghassan; Richardson, David; MacLehose, Richard; Wing, Steve
2013-01-01
Informative priors can be a useful tool for epidemiologists to handle problems of sparse data in regression modeling. It is sometimes the case that an investigator is studying a population exposed to two agents, X and Y, where Y is the agent of primary interest. Previous research may suggest that the exposures have different effects on the health outcome of interest, one being more harmful than the other. Such information may be derived from epidemiologic analyses; however, in the case where such evidence is unavailable, knowledge can be drawn from toxicologic studies or other experimental research. Unfortunately, using toxicologic findings to develop informative priors in epidemiologic analyses requires strong assumptions, with no established method for its utilization. We present a method to help bridge the gap between animal and cellular studies and epidemiologic research by specification of an order-constrained prior. We illustrate this approach using an example from radiation epidemiology. PMID:23222512
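One simple way to encode the kind of ordering information described here (say, toxicology suggesting agent X is at least as harmful as agent Y) is rejection sampling from a truncated joint prior. The distributions and hyperparameters below are hypothetical, not the paper's:

```python
# Order-constrained prior sketch: draw (beta_x, beta_y) from independent
# normals truncated to the half-plane beta_x >= beta_y.
import numpy as np

rng = np.random.default_rng(1)

def order_constrained_prior(n, mu=(0.5, 0.2), sd=(1.0, 1.0)):
    """Rejection-sample coefficient pairs satisfying beta_x >= beta_y."""
    out = []
    while len(out) < n:
        bx = rng.normal(mu[0], sd[0], size=n)
        by = rng.normal(mu[1], sd[1], size=n)
        ok = bx >= by                    # the ordering constraint
        out.extend(zip(bx[ok], by[ok]))
    return np.array(out[:n])

draws = order_constrained_prior(10_000)
print("P(beta_x >= beta_y) =", np.mean(draws[:, 0] >= draws[:, 1]))  # 1.0
```

Such draws can then serve as the prior in a Bayesian regression, concentrating posterior mass on coefficient orderings consistent with the experimental evidence without fixing the effect sizes themselves.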
Nowak, Michael D.; Smith, Andrew B.; Simpson, Carl; Zwickl, Derrick J.
2013-01-01
Molecular divergence time analyses often rely on the age of fossil lineages to calibrate node age estimates. Most divergence time analyses are now performed in a Bayesian framework, where fossil calibrations are incorporated as parametric prior probabilities on node ages. It is widely accepted that an ideal parameterization of such node age prior probabilities should be based on a comprehensive analysis of the fossil record of the clade of interest, but there is currently no generally applicable approach for calculating such informative priors. We provide here a simple and easily implemented method that employs fossil data to estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade, which can be used to fit an informative parametric prior probability distribution on a node age. Specifically, our method uses the extant diversity and the stratigraphic distribution of fossil lineages confidently assigned to a clade to fit a branching model of lineage diversification. Conditioning this on a simple model of fossil preservation, we estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade. The likelihood surface of missing history can then be translated into a parametric prior probability distribution on the age of the clade of interest. We show that the method performs well with simulated fossil distribution data, but that the likelihood surface of missing history can at times be too complex for the distribution-fitting algorithm employed by our software tool. An empirical example of the application of our method is performed to estimate echinoid node ages. A simulation-based sensitivity analysis using the echinoid data set shows that node age prior distributions estimated under poor preservation rates are significantly less informative than those estimated under high preservation rates. PMID:23755303
Shi, Jade; Nobrega, R. Paul; Schwantes, Christian; ...
2017-03-08
The dynamics of globular proteins can be described in terms of transitions between a folded native state and less-populated intermediates, or excited states, which can play critical roles in both protein folding and function. Excited states are by definition transient species, and therefore are difficult to characterize using current experimental techniques. We report an atomistic model of the excited state ensemble of a stabilized mutant of an extensively studied flavodoxin fold protein CheY. We employed a hybrid simulation and experimental approach in which an aggregate 42 milliseconds of all-atom molecular dynamics were used as an informative prior for the structure of the excited state ensemble. The resulting prior was then refined against small-angle X-ray scattering (SAXS) data employing an established method (EROS). The most striking feature of the resulting excited state ensemble was an unstructured N-terminus stabilized by non-native contacts in a conformation that is topologically simpler than the native state. We then predict incisive single molecule FRET experiments, using these results, as a means of model validation. Our study demonstrates the paradigm of uniting simulation and experiment in a statistical model to study the structure of protein excited states and rationally design validating experiments.
Li, Zhan-Chao; Zhou, Xi-Bin; Dai, Zong; Zou, Xiao-Yong
2009-07-01
A prior knowledge of protein structural classes can provide useful information about a protein's overall structure, so quick and accurate computational determination of structural class is very important in protein science. A key requirement of any such computational method is an accurate representation of the protein sample. Here, based on the concept of Chou's pseudo-amino acid composition (AAC; Chou, Proteins: Structure, Function, and Genetics 43:246-255, 2001), a novel feature-extraction method that combines the continuous wavelet transform (CWT) with principal component analysis (PCA) is introduced for the prediction of protein structural classes. First, a digital signal is obtained by mapping each amino acid to values of various physicochemical properties. Second, the CWT is used to extract a new feature vector based on the wavelet power spectrum (WPS), which contains rich sequence-order information in both the frequency and time domains, and PCA is then used to reorganize the feature vector to reduce information redundancy and computational complexity. Finally, a pseudo-amino acid composition feature vector is formed to represent the primary sequence by coupling the AAC vector with the new WPS feature vector in an orthogonal space obtained by PCA. As a showcase, rigorous jackknife cross-validation tests were performed on the working datasets. The results indicate that prediction quality has been improved, and that the current approach to protein representation may serve as a useful complementary vehicle for classifying other attributes of proteins, such as enzyme family class, subcellular localization, membrane protein type and protein secondary structure.
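A minimal sketch of the described pipeline follows: map residues to a physicochemical scale, take the wavelet power spectrum of the resulting signal, and compress with PCA. The Kyte-Doolittle hydropathy values are standard; the wavelet, scales, and component count are illustrative choices rather than the authors' settings.

```python
# Sequence -> hydropathy signal -> CWT power spectrum -> PCA features.
import numpy as np
import pywt
from sklearn.decomposition import PCA

KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}

def wps_features(seq, scales=np.arange(1, 17)):
    signal = np.array([KD[a] for a in seq])       # digital signal
    coeffs, _ = pywt.cwt(signal, scales, 'morl')  # continuous wavelet transform
    power = np.abs(coeffs) ** 2                   # wavelet power spectrum
    return power.mean(axis=1)                     # one feature per scale

seqs = ["MKTAYIAKQR", "GAVLIMCFWP", "STYNQDEKRH", "MFILVAGPST"]
X = np.vstack([wps_features(s) for s in seqs])
X_red = PCA(n_components=2).fit_transform(X)      # decorrelate and reduce
print(X_red.shape)                                # (4, 2)
```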
Brayanov, Jordan B.
2010-01-01
Which is heavier: a pound of lead or a pound of feathers? This classic trick question belies a simple but surprising truth: when lifted, the pound of lead feels heavier—a phenomenon known as the size–weight illusion. To estimate the weight of an object, our CNS combines two imperfect sources of information: a prior expectation, based on the object's appearance, and direct sensory information from lifting it. Bayes' theorem (or Bayes' law) defines the statistically optimal way to combine multiple information sources for maximally accurate estimation. Here we asked whether the mechanisms for combining these information sources produce statistically optimal weight estimates for both perceptions and actions. We first studied the ability of subjects to hold one hand steady when the other removed an object from it, under conditions in which sensory information about the object's weight sometimes conflicted with prior expectations based on its size. Since the ability to steady the supporting hand depends on the generation of a motor command that accounts for lift timing and object weight, hand motion can be used to gauge biases in weight estimation by the motor system. We found that these motor system weight estimates reflected the integration of prior expectations with real-time proprioceptive information in a Bayesian, statistically optimal fashion that discounted unexpected sensory information. This produces a motor size–weight illusion that consistently biases weight estimates toward prior expectations. In contrast, when subjects compared the weights of two objects, their perceptions defied Bayes' law, exaggerating the value of unexpected sensory information. This produces a perceptual size–weight illusion that biases weight perceptions away from prior expectations. We term this effect “anti-Bayesian” because the bias is opposite that seen in Bayesian integration. Our findings suggest that two fundamentally different strategies for the integration of prior expectations with sensory information coexist in the nervous system for weight estimation. PMID:20089821
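The "statistically optimal" combination referred to here is the standard precision-weighted average. For a Gaussian prior expectation mu_p with variance sigma_p^2 and a sensory estimate s with noise variance sigma_s^2, the Bayesian weight estimate is:

```latex
% Precision-weighted Bayesian estimate: biased toward the prior,
% discounting unexpected sensory evidence (the motor-system pattern).
\hat{w} = \frac{\sigma_s^2\,\mu_p + \sigma_p^2\,s}{\sigma_p^2 + \sigma_s^2}
```

Estimates of this form are necessarily biased toward the prior, matching the motor behavior described above; the perceptual judgments instead overweight the unexpected sensory term, hence "anti-Bayesian".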
Berglund, Kristina; Fahlke, Claudia; Berggren, Ulf; Eriksson, Matts; Balldin, Jan
2006-01-01
Studies have shown that most individuals with alcohol problems have never received any treatment for their alcoholism. The purpose of the present study was to describe demographic and clinical characteristics of male individuals with excessive alcohol intake who were recruited by advertisements. These characteristics were compared between individuals with and without prior treatment histories. Subjects (n = 367) responded to the advertisements in a regional daily newspaper and called the investigators. A structured interview was performed, and a complete dataset of demographic and clinical information was collected for 342 individuals. Individuals with no prior treatment history (n = 238) were found to be more often cohabiting and employed, and they reported fewer ongoing psychiatric symptoms than individuals with treatment histories (n = 104). Since individuals with no prior treatment history seldom experience psychiatric symptoms, they are less likely to seek treatment in the health care system. It is therefore important to find ways of reaching this 'hidden' group of individuals with excessive alcohol consumption early. One way to do so might be via alcohol treatment programs at workplaces, since the majority of them are employed.
Communication: Three-fold covariance imaging of laser-induced Coulomb explosions
NASA Astrophysics Data System (ADS)
Pickering, James D.; Amini, Kasra; Brouard, Mark; Burt, Michael; Bush, Ian J.; Christensen, Lauge; Lauer, Alexandra; Nielsen, Jens H.; Slater, Craig S.; Stapelfeldt, Henrik
2016-04-01
We apply a three-fold covariance imaging method to analyse previously acquired data [C. S. Slater et al., Phys. Rev. A 89, 011401(R) (2014)] on the femtosecond laser-induced Coulomb explosion of spatially pre-aligned 3,5-dibromo-3',5'-difluoro-4'-cyanobiphenyl molecules. The data were acquired using the "Pixel Imaging Mass Spectrometry" camera. We show how three-fold covariance imaging of ionic photofragment recoil trajectories can be used to provide new information about the parent ion's molecular structure prior to its Coulomb explosion. In particular, we show how the analysis may be used to obtain information about molecular conformation and provide an alternative route for enantiomer determination.
Deformable segmentation via sparse representation and dictionary learning.
Zhang, Shaoting; Zhan, Yiqiang; Metaxas, Dimitris N
2012-10-01
"Shape" and "appearance", the two pillars of a deformable model, complement each other in object segmentation. In many medical imaging applications, while the low-level appearance information is weak or mis-leading, shape priors play a more important role to guide a correct segmentation, thanks to the strong shape characteristics of biological structures. Recently a novel shape prior modeling method has been proposed based on sparse learning theory. Instead of learning a generative shape model, shape priors are incorporated on-the-fly through the sparse shape composition (SSC). SSC is robust to non-Gaussian errors and still preserves individual shape characteristics even when such characteristics is not statistically significant. Although it seems straightforward to incorporate SSC into a deformable segmentation framework as shape priors, the large-scale sparse optimization of SSC has low runtime efficiency, which cannot satisfy clinical requirements. In this paper, we design two strategies to decrease the computational complexity of SSC, making a robust, accurate and efficient deformable segmentation system. (1) When the shape repository contains a large number of instances, which is often the case in 2D problems, K-SVD is used to learn a more compact but still informative shape dictionary. (2) If the derived shape instance has a large number of vertices, which often appears in 3D problems, an affinity propagation method is used to partition the surface into small sub-regions, on which the sparse shape composition is performed locally. Both strategies dramatically decrease the scale of the sparse optimization problem and hence speed up the algorithm. Our method is applied on a diverse set of biomedical image analysis problems. Compared to the original SSC, these two newly-proposed modules not only significant reduce the computational complexity, but also improve the overall accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.
How Judgments Change Following Comparison of Current and Prior Information
Albarracin, Dolores; Wallace, Harry M.; Hart, William; Brown, Rick D.
2013-01-01
Although much observed judgment change is superficial and occurs without considering prior information, other forms of change also occur. Comparison between prior and new information about an issue may trigger change by influencing either or both the perceived strength and direction of the new information. In four experiments, participants formed and reported initial judgments of a policy based on favorable written information about it. Later, these participants read a second passage containing strong favorable or unfavorable information on the policy. Compared to control conditions, subtle and direct prompts to compare the initial and new information led to more judgment change in the direction of a second passage perceived to be strong. Mediation analyses indicated that comparison yielded greater perceived strength of the second passage, which in turn correlated positively with judgment change. Moreover, self-reports of comparison mediated the judgment change resulting from comparison prompts. PMID:23599557
Tremel, Joshua J; Ortiz, Daniella M; Fiez, Julie A
2018-06-01
When making a decision, we have to identify, collect, and evaluate relevant bits of information to ensure an optimal outcome. How we approach a given choice can be influenced by prior experience. Contextual factors and structural elements of these past decisions can cause a shift in how information is encoded and can in turn influence later decision-making. In this two-experiment study, we sought to manipulate declarative memory efficacy and decision-making in a concurrent discrimination learning task by altering the amount of information to be learned. Subjects learned correct responses to pairs of items across several repetitions of a 50- or 100-pair set and were tested for memory retention. In one experiment, this memory test interrupted learning after an initial encoding experience in order to test for early encoding differences and associate those differences with changes in decision-making. In a second experiment, we used fMRI to probe neural differences between the two list-length groups related to decision-making across learning and assessed subsequent memory retention. We found that a striatum-based system was associated with decision-making patterns when learning a longer list of items, while a medial cortical network was associated with patterns when learning a shorter list. Additionally, the hippocampus was exclusively active for the shorter list group. Altogether, these behavioral, computational, and imaging results provide evidence that multiple types of mnemonic representations contribute to experience-based decision-making. Moreover, contextual and structural factors of the task and of prior decisions can influence what types of evidence are drawn upon during decision-making.
Fast equilibration protocol for million atom systems of highly entangled linear polyethylene chains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sliozberg, Yelena R.; TKC Global, Inc., Aberdeen Proving Ground, Maryland 21005; Kröger, Martin
Equilibrated systems of entangled polymer melts cannot be produced using direct brute force equilibration due to the slow reptation dynamics exhibited by high molecular weight chains. Instead, these dense systems are produced using computational techniques such as Monte Carlo-Molecular Dynamics hybrid algorithms, though the use of soft potentials has also shown promise mainly for coarse-grained polymeric systems. Through the use of soft potentials, the melt can be equilibrated via molecular dynamics at intermediate and long length scales prior to switching to a Lennard-Jones potential. We will outline two different equilibration protocols, which use various degrees of information to produce the starting configurations. In one protocol, we use only the equilibrium bond angle, bond length, and target density during the construction of the simulation cell, where the information is obtained from available experimental data and extracted from the force field without performing any prior simulation. In the second protocol, we moreover utilize the equilibrium radial distribution function and dihedral angle distribution. This information can be obtained from experimental data or from a simulation of short unentangled chains. Both methods can be used to prepare equilibrated and highly entangled systems, but the second protocol is much more computationally efficient. These systems can be strictly monodisperse or optionally polydisperse depending on the starting chain distribution. Our protocols, which utilize a soft-core harmonic potential, will be applied for the first time to equilibrate a million particle system of polyethylene chains consisting of 1000 united atoms at various temperatures. Calculations of structural and entanglement properties demonstrate that this method can be used as an alternative towards the generation of entangled equilibrium structures.
ERIC Educational Resources Information Center
Wetzels, Sandra A. J.; Kester, Liesbeth; van Merrienboer, Jeroen J. G.; Broers, Nick J.
2011-01-01
Background: Prior knowledge activation facilitates learning. Note taking during prior knowledge activation (i.e., note taking directed at retrieving information from memory) might facilitate the activation process by enabling learners to build an external representation of their prior knowledge. However, taking notes might be less effective in…
Shah, Abhik; Woolf, Peter
2009-01-01
In this paper, we introduce pebl, a Python library and application for learning Bayesian network structure from data and prior knowledge that provides features unmatched by alternative software packages: the ability to use interventional data, flexible specification of structural priors, modeling with hidden variables and exploitation of parallel processing. PMID:20161541
ERIC Educational Resources Information Center
Dickey, Patsy A.
1980-01-01
Forty female students participated in a comparison of the incremental difference in heart rate of shorthand writers when they were informed versus not informed of shorthand speeds prior to dictation. It was concluded that students' performances were enhanced by receiving instructions as to the speed of dictation prior to the take. (Author/CT)
Macedo, Paula G; Kapa, Suraj; Mears, Jennifer A; Fratianni, Amy; Asirvatham, Samuel J
2010-07-01
Ablation procedures for atrial fibrillation have become an established and increasingly used option for managing patients with symptomatic arrhythmia. The anatomic structures relevant to the pathogenesis of atrial fibrillation and ablation procedures are varied and include the pulmonary veins, other thoracic veins, the left atrial myocardium, and autonomic ganglia. Exact regional anatomic knowledge of these structures is essential to allow correlation with fluoroscopy and electrograms and, importantly, to avoid complications from damage of adjacent structures within the chest. We present this information as a series of 2 articles. In a prior issue, we have discussed the thoracic vein anatomy relevant to paroxysmal atrial fibrillation. In the present article, we focus on the atria themselves, the autonomic ganglia, and anatomic issues relevant for minimizing complications during atrial fibrillation ablation.
Evaluation of two methods for using MR information in PET reconstruction
NASA Astrophysics Data System (ADS)
Caldeira, L.; Scheins, J.; Almeida, P.; Herzog, H.
2013-02-01
Using magnetic resonance (MR) information in maximum a posteriori (MAP) algorithms for positron emission tomography (PET) image reconstruction has been investigated in recent years. Recently, three methods to introduce this information have been evaluated, and the Bowsher prior was considered the best. Its main advantage is that it does not require image segmentation. Another method that has been widely used for incorporating MR information is the use of boundaries obtained by segmentation. This method has also shown improvements in image quality. In this paper, two methods for incorporating MR information in PET reconstruction are compared. After a Bayes parameter optimization, the reconstructed images were compared using the mean squared error (MSE) and the coefficient of variation (CV). MSE values are 3% lower with the Bowsher prior than with boundaries. CV values are 10% lower with the Bowsher prior than with boundaries. Both methods performed better in terms of MSE and CV than using no prior, that is, maximum likelihood expectation maximization (MLEM) or MAP without anatomic information. Concluding, incorporating MR information using the Bowsher prior gives better results in terms of MSE and CV than using boundaries. MAP algorithms again proved effective in noise reduction and convergence, especially when MR information is incorporated. The robustness of the priors with respect to noise and inhomogeneities in the MR image, however, still remains to be assessed.
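The Bowsher prior referenced above penalizes PET differences only over the B neighbors whose MR values are most similar to the center voxel, which is what removes the need for explicit segmentation. A rough 2D sketch with an 8-neighborhood and brute-force loops, purely for clarity (names and shapes are illustrative):

```python
# Bowsher-style penalty: for each pixel, keep the B most MR-similar
# neighbors and sum squared PET differences over those edges only.
import numpy as np

def bowsher_penalty(pet, mr, B=3):
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    H, W = mr.shape
    penalty = 0.0
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            # MR similarity of all 8 neighbors of pixel (i, j)
            diffs = [(abs(mr[i + di, j + dj] - mr[i, j]), di, dj)
                     for di, dj in offsets]
            diffs.sort(key=lambda t: t[0])
            for _, di, dj in diffs[:B]:   # B most MR-similar neighbors
                penalty += (pet[i, j] - pet[i + di, j + dj]) ** 2
    return penalty
```

In a MAP reconstruction this penalty (times a regularization weight) would be added to the negative log-likelihood; because the neighborhoods follow MR similarity, smoothing stops at anatomical boundaries without any segmentation step.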
Pelcastre-Villafuerte, Blanca; Riquer-Fernández, Florinda; de León-Reyes, Verónica; Reyes-Morales, Hortensia; Gutiérrez-Trujillo, Gonzalo; Bronfman, Mario
2006-01-01
To describe and compare household dynamics in terms of structure, beliefs and nutrition-related behavior in the homes of malnourished and well-nourished children less than five years of age. The authors carried out a qualitative ethnographic study using participant observation and in-depth interviews. Interviews were conducted with the child's caretaker or key informants, with prior oral informed consent. Child care and childhood feeding practices at home and in the community were the focus of observations. The study included two periods of field work conducted in 2001 in three rural municipalities of the Río Balsas region, in Guerrero state, Mexico. The study's ethical and methodological aspects were approved by the National Research Commission of the Mexican Institute of Social Security. Households were differentially characterized by number of members, composition, type of relationship, source of income, and interactions among household members and with the community. Monoparental structures, in an early stage of the household cycle, give rise to conditions that render the child prone to malnutrition. An extended family structure represented more favorable household dynamics.
Toward link predictability of complex networks
Lü, Linyuan; Pan, Liming; Zhou, Tao; Zhang, Yi-Cheng; Stanley, H. Eugene
2015-01-01
The organization of real networks usually embodies both regularities and irregularities, and, in principle, the former can be modeled. The extent to which the formation of a network can be explained coincides with our ability to predict missing links. To understand network organization, we should be able to estimate link predictability. We assume that the regularity of a network is reflected in the consistency of structural features before and after a random removal of a small set of links. Based on the perturbation of the adjacency matrix, we propose a universal structural consistency index that is free of prior knowledge of network organization. Extensive experiments on disparate real-world networks demonstrate that (i) structural consistency is a good estimation of link predictability and (ii) a derivative algorithm outperforms state-of-the-art link prediction methods in both accuracy and robustness. This analysis has further applications in evaluating link prediction algorithms and monitoring sudden changes in evolving network mechanisms. It will provide unique fundamental insights into the above-mentioned academic research fields, and will foster the development of advanced information filtering technologies of interest to information technology practitioners. PMID:25659742
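The structural consistency index lends itself to a compact sketch: hold out a small fraction of links, correct the eigenvalues of the remaining adjacency matrix to first order, and measure how many held-out links rank near the top under the perturbed matrix. The simplified implementation below ignores eigenvalue degeneracies and the averaging over repeated removals that a careful analysis would include:

```python
# First-order structural perturbation: A_tilde = sum_k (lam_k + dlam_k) x_k x_k^T
import numpy as np

def structural_consistency(A, frac=0.1, seed=0):
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(A, k=1)
    edges = np.flatnonzero(A[iu])                    # observed links
    pert = rng.choice(edges, size=max(1, int(frac * edges.size)),
                      replace=False)                 # held-out set dE

    dA = np.zeros_like(A, dtype=float)
    dA[iu[0][pert], iu[1][pert]] = 1.0
    dA += dA.T
    AR = A - dA                                      # remaining network

    lam, X = np.linalg.eigh(AR)                      # AR = X diag(lam) X^T
    dlam = np.einsum('ij,jk,ki->i', X.T, dA, X)      # dlam_k = x_k^T dA x_k
    A_tilde = (X * (lam + dlam)) @ X.T               # perturbed matrix

    # Rank pairs absent from AR; index = share of dE links in the top-|dE|.
    cand = np.flatnonzero(AR[iu] == 0)
    scores = A_tilde[iu][cand]
    top = cand[np.argsort(scores)[::-1][:pert.size]]
    return len(set(pert.tolist()) & set(top.tolist())) / pert.size
```

A value near 1 indicates a highly regular (and hence predictable) network; a value near the random baseline indicates that missing links are essentially unpredictable.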
Isomer Information from Ion Mobility Separation of High-Mannose Glycan Fragments
NASA Astrophysics Data System (ADS)
Harvey, David J.; Seabright, Gemma E.; Vasiljevic, Snezana; Crispin, Max; Struwe, Weston B.
2018-05-01
Extracted arrival time distributions of negative ion CID-derived fragments produced prior to traveling-wave ion mobility separation were evaluated for their ability to provide structural information on N-linked glycans. Fragmentation of high-mannose glycans released from several glycoproteins, including those from viral sources, provided over 50 fragments, many of which gave unique collisional cross-sections and provided additional information used to assign structural isomers. For example, cross-ring fragments arising from cleavage of the reducing terminal GlcNAc residue on Man8GlcNAc2 isomers have unique collision cross-sections enabling isomers to be differentiated in mixtures. Specific fragment collision cross-sections enabled identification of glycans, the antennae of which terminated in the antigenic α-galactose residue, and ions defining the composition of the 6-antenna of several of the glycans were also found to have different cross-sections from isomeric ions produced in the same spectra. Potential mechanisms for the formation of the various ions are discussed and the estimated collisional cross-sections are tabulated.
Sang, Xiahan; LeBeau, James M
2014-03-01
We report the development of revolving scanning transmission electron microscopy--RevSTEM--a technique that enables characterization and removal of sample drift distortion from atomic resolution images without the need for a priori crystal structure information. To measure and correct the distortion, we acquire an image series while rotating the scan coordinate system between successive frames. Through theory and experiment, we show that the revolving image series captures the information necessary to analyze sample drift rate and direction. At atomic resolution, we quantify the image distortion using the projective standard deviation, a rapid, real-space method to directly measure lattice vector angles. By fitting these angles to a physical model, we show that the refined drift parameters provide the input needed to correct distortion across the series. We demonstrate that RevSTEM simultaneously removes the need for a priori structure information to correct distortion, leads to a dramatically improved signal-to-noise ratio, and enables picometer precision and accuracy regardless of drift rate.
NASA Astrophysics Data System (ADS)
Shidahara, M.; Tsoumpas, C.; McGinnity, C. J.; Kato, T.; Tamura, H.; Hammers, A.; Watabe, H.; Turkheimer, F. E.
2012-05-01
The objective of this study was to evaluate a resolution recovery (RR) method using a variety of simulated human brain [11C]raclopride positron emission tomography (PET) images. Simulated datasets of 15 numerical human phantoms were processed by a wavelet-based RR method using an anatomical prior. The anatomical prior was in the form of a hybrid segmented atlas, which combined an atlas for anatomical labelling and a PET image for functional labelling of each anatomical structure. We applied RR to both 60 min static and dynamic PET images. Recovery was quantified in 84 regions, comparing the typical ‘true’ value for the simulation, as obtained in normal subjects, simulated and RR PET images. The radioactivity concentration in the white matter, striatum and other cortical regions was successfully recovered for the 60 min static image of all 15 human phantoms; the dependence of the solution on accurate anatomical information was demonstrated by the difficulty of the technique to retrieve the subthalamic nuclei due to mismatch between the two atlases used for data simulation and recovery. Structural and functional synergy for resolution recovery (SFS-RR) improved quantification in the caudate and putamen, the main regions of interest, from -30.1% and -26.2% to -17.6% and -15.1%, respectively, for the 60 min static image and from -51.4% and -38.3% to -27.6% and -20.3% for the binding potential (BPND) image, respectively. The proposed methodology proved effective in the RR of small structures from brain [11C]raclopride PET images. The improvement is consistent across the anatomical variability of a simulated population as long as accurate anatomical segmentations are provided.
What are they up to? The role of sensory evidence and prior knowledge in action understanding.
Chambon, Valerian; Domenech, Philippe; Pacherie, Elisabeth; Koechlin, Etienne; Baraduc, Pierre; Farrer, Chlöé
2011-02-18
Explaining or predicting the behaviour of our conspecifics requires the ability to infer the intentions that motivate it. Such inferences are assumed to rely on two types of information: (1) the sensory information conveyed by movement kinematics and (2) the observer's prior expectations--acquired from past experience or derived from prior knowledge. However, the respective contribution of these two sources of information is still controversial. This controversy stems in part from the fact that "intention" is an umbrella term that may embrace various sub-types each being assigned different scopes and targets. We hypothesized that variations in the scope and target of intentions may account for variations in the contribution of visual kinematics and prior knowledge to the intention inference process. To test this hypothesis, we conducted four behavioural experiments in which participants were instructed to identify different types of intention: basic intentions (i.e. simple goal of a motor act), superordinate intentions (i.e. general goal of a sequence of motor acts), or social intentions (i.e. intentions accomplished in a context of reciprocal interaction). For each of the above-mentioned intentions, we varied (1) the amount of visual information available from the action scene and (2) participant's prior expectations concerning the intention that was more likely to be accomplished. First, we showed that intentional judgments depend on a consistent interaction between visual information and participant's prior expectations. Moreover, we demonstrated that this interaction varied according to the type of intention to be inferred, with participant's priors rather than perceptual evidence exerting a greater effect on the inference of social and superordinate intentions. The results are discussed by appealing to the specific properties of each type of intention considered and further interpreted in the light of a hierarchical model of action representation.
The cost of concreteness: the effect of nonessential information on analogical transfer.
Kaminski, Jennifer A; Sloutsky, Vladimir M; Heckler, Andrew F
2013-03-01
Most theories of analogical transfer focus on similarities between the learning and transfer domains, where transfer is more likely between domains that share common surface features, similar elements, or common interpretations of structure. We suggest that characteristics of the learning instantiation alone can give rise to different levels of transfer. We propose that concreteness of the learning instantiation can hinder analogical transfer of well-defined structured concepts, such as mathematical concepts. We operationalize the term concreteness as the amount of information communicated through a specific instantiation of a concept. The 5 reported experiments with undergraduate students tested the hypothesis by presenting participants with the concept of a commutative mathematical group of order 3. The experiments varied the level of concreteness of the training instantiation and measured transfer of learning to a new instantiation. The results support the hypothesis, demonstrating better transfer from more generic instantiations (i.e., ones that communicate minimal extraneous information) than from more concrete instantiations. Specifically, concreteness was found to create an obstacle to successful structural alignment across domains, whereas generic instantiations led to spontaneous structural alignment. These findings have important implications for the theory of learning and transfer and practical implications for the design of educational material. Although some concreteness may activate prior knowledge and perhaps offer a leg up in the learning process, this benefit may come at the cost of transfer.
Acar, Evrim; Plopper, George E.; Yener, Bülent
2012-01-01
The structure/function relationship is fundamental to our understanding of biological systems at all levels, and drives most, if not all, techniques for detecting, diagnosing, and treating disease. However, at the tissue level of biological complexity we encounter a gap in the structure/function relationship: having accumulated an extraordinary amount of detailed information about biological tissues at the cellular and subcellular level, we cannot assemble it in a way that explains the correspondingly complex biological functions these structures perform. To help close this information gap we define here several quantitative temperospatial features that link tissue structure to its corresponding biological function. Both histological images of human tissue samples and fluorescence images of three-dimensional cultures of human cells are used to compare the accuracy of in vitro culture models with their corresponding human tissues. To the best of our knowledge, there is no prior work on a quantitative comparison of histology and in vitro samples. Features are calculated from graph theoretical representations of tissue structures and the data are analyzed in the form of matrices and higher-order tensors using matrix and tensor factorization methods, with a goal of differentiating between cancerous and healthy states of brain, breast, and bone tissues. We also show that our techniques can differentiate between the structural organization of native tissues and their corresponding in vitro engineered cell culture models. PMID:22479315
The power prior: theory and applications.
Ibrahim, Joseph G; Chen, Ming-Hui; Gwon, Yeongjin; Chen, Fang
2015-12-10
The power prior has been widely used in many applications covering a large number of disciplines. The power prior is intended to be an informative prior constructed from historical data. It has been used in clinical trials, genetics, health care, psychology, environmental health, engineering, economics, and business. It has also been applied for a wide variety of models and settings, both in the experimental design and analysis contexts. In this review article, we give an A-to-Z exposition of the power prior and its applications to date. We review its theoretical properties, variations in its formulation, statistical contexts for which it has been used, applications, and its advantages over other informative priors. We review models for which it has been used, including generalized linear models, survival models, and random effects models. Statistical areas where the power prior has been used include model selection, experimental design, hierarchical modeling, and conjugate priors. Frequentist properties of power priors in posterior inference are established, and a simulation study is conducted to further examine the empirical performance of the posterior estimates with power priors. Real data analyses are given illustrating the power prior as well as the use of the power prior in the Bayesian design of clinical trials.
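For reference, the basic fixed-discount form of the power prior, with historical data D_0, initial prior pi_0(theta), and discounting parameter a_0 in [0, 1] (a_0 = 0 discards the historical data, a_0 = 1 pools them fully), is:

```latex
% Power prior: historical data enter through their likelihood raised
% to a discounting power a_0.
\pi(\theta \mid D_0, a_0) \;\propto\; L(\theta \mid D_0)^{a_0}\, \pi_0(\theta)
```

The variations reviewed in the article (e.g., normalized and random-a_0 formulations) all build on this basic construction.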
Structure of NCI Cooperative Groups Program Prior to NCTN
Learn how the National Cancer Institute’s Cooperative Groups Program was structured prior to its being replaced by NCI’s National Clinical Trials Network (NCTN). The NCTN gives funds and other support to cancer research organizations to conduct cancer clinical trials.
Development and Validation of a Lifecycle-based Prognostics Architecture with Test Bed Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hines, J. Wesley; Upadhyaya, Belle; Sharp, Michael
On-line monitoring and tracking of nuclear plant system and component degradation is being investigated as a method for improving the safety, reliability, and maintainability of aging nuclear power plants. Accurate prediction of the current degradation state of system components and structures is important for accurate estimates of their remaining useful life (RUL). The correct quantification and propagation of both the measurement uncertainty and model uncertainty is necessary for quantifying the uncertainty of the RUL prediction. This research project developed and validated methods to perform RUL estimation throughout the lifecycle of plant components. Prognostic methods should seamlessly operate from beginning of component life (BOL) to end of component life (EOL). We term this "Lifecycle Prognostics." When a component is put into use, the only information available may be past failure times of similar components used in similar conditions, and the predicted failure distribution can be estimated with reliability methods such as Weibull Analysis (Type I Prognostics). As the component operates, it begins to degrade and consume its available life. This life consumption may be a function of system stresses, and the failure distribution should be updated to account for the system operational stress levels (Type II Prognostics). When degradation becomes apparent, this information can be used to again improve the RUL estimate (Type III Prognostics). This research focused on developing prognostics algorithms for the three types of prognostics, developing uncertainty quantification methods for each of the algorithms, and, most importantly, developing a framework using Bayesian methods to transition between prognostic model types and update failure distribution estimates as new information becomes available. The developed methods were then validated on a range of accelerated degradation test beds. The ultimate goal of prognostics is to provide an accurate assessment for RUL predictions, with as little uncertainty as possible. From a reliability and maintenance standpoint, there would be improved safety by avoiding all failures. Calculated risk would decrease, saving money by avoiding unnecessary maintenance. One major bottleneck for data-driven prognostics is the availability of run-to-failure degradation data. Without enough degradation data leading to failure, prognostic models can yield RUL distributions with large uncertainty or mathematically unsound predictions. To address these issues a "Lifecycle Prognostics" method was developed to create RUL distributions from Beginning of Life (BOL) to End of Life (EOL). This employs established Type I, II, and III prognostic methods, and Bayesian transitioning between each Type. Bayesian methods, as opposed to classical frequency statistics, show how an expected value, a priori, changes with new data to form a posterior distribution. For example, when you purchase a component you have a prior belief, or estimation, of how long it will operate before failing. As you operate it, you may collect information related to its condition that will allow you to update your estimated failure time. Bayesian methods are best used when limited data are available. The use of a prior also means that information is conserved when new data are available.
The weightings of the prior belief and information contained in the sampled data are dependent on the variance (uncertainty) of the prior, the variance (uncertainty) of the data, and the amount of measured data (number of samples). If the variance of the prior is small compared to the uncertainty of the data, the prior will be weighted more heavily. However, as more data are collected, the data will be weighted more heavily and will eventually swamp out the prior in calculating the posterior distribution of model parameters. Fundamentally, Bayesian analysis updates a prior belief with new data to get a posterior belief. The general approach to applying the Bayesian method to lifecycle prognostics consisted of identifying the prior, which is the RUL estimate and uncertainty from the previous prognostics type, and combining it with observational data related to the newer prognostics type. The resulting lifecycle prognostics algorithm uses all available information throughout the component lifecycle.
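The weighting behavior described in this paragraph is exactly the conjugate normal-normal update. For a prior N(mu_0, sigma_0^2) and n observations with sample mean y-bar and known variance sigma^2, the posterior is:

```latex
% Precision-weighted posterior: the data term grows with the sample
% size n and eventually dominates the prior term.
\mu_{\text{post}}
  = \frac{\mu_0/\sigma_0^2 + n\bar{y}/\sigma^2}{1/\sigma_0^2 + n/\sigma^2},
\qquad
\sigma_{\text{post}}^2
  = \left(\frac{1}{\sigma_0^2} + \frac{n}{\sigma^2}\right)^{-1}
```

Reading off the formula: a tight prior (small sigma_0^2) dominates when data are scarce, while growing n shifts the weight to the data, which is precisely the prior-to-data transition the lifecycle framework exploits between prognostic types.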
Demographic analysis from summaries of an age-structured population
Link, William A.; Royle, J. Andrew; Hatfield, Jeff S.
2003-01-01
Demographic analyses of age-structured populations typically rely on life history data for individuals, or when individual animals are not identified, on information about the numbers of individuals in each age class through time. While it is usually difficult to determine the age class of a randomly encountered individual, it is often the case that the individual can be readily and reliably assigned to one of a set of age classes. For example, it is often possible to distinguish first-year from older birds. In such cases, the population age structure can be regarded as a latent variable governed by a process prior, and the data as summaries of this latent structure. In this article, we consider the problem of uncovering the latent structure and estimating process parameters from summaries of age class information. We present a demographic analysis for the critically endangered migratory population of whooping cranes (Grus americana), based only on counts of first-year birds and of older birds. We estimate age and year-specific survival rates. We address the controversial issue of whether management action on the breeding grounds has influenced recruitment, relating recruitment rates to the number of seventh-year and older birds, and examining the pattern of variation through time in this rate.
Structure Biology of Membrane Bound Enzymes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Dax
The overall goal of the proposed research is to understand the membrane-associated active processes catalyzed by an alkane ω-hydroxylase (AlkB) from the eubacterium Pseudomonas oleovorans. AlkB performs oxygenation of unactivated hydrocarbons found in crude oils. The enzymatic reaction involves energy-demanding steps in the membrane, using structurally unknown metal active sites featuring a diiron [FeFe] center. At present, a critical barrier to understanding the membrane-associated reaction mechanism is the lack of structural information. The structural biology efforts have been challenged by technical difficulties commonly encountered in the crystallization and structural determination of membrane proteins. The specific aims of the current budget cycle are to crystallize AlkB and initiate X-ray analysis to set the stage for structural determination. The long-term goals of our structural biology efforts are to provide an atomic description of the AlkB structure and to uncover the mechanisms of selective modification of hydrocarbons. The structural information will help elucidate how the unactivated C-H bonds of saturated hydrocarbons are oxidized to initiate biodegradation and biotransformation processes. The knowledge gained will be fundamental to biotechnological applications in biofuel transformation of non-edible oil feedstock. Renewable biodiesel is a promising energy carrier that can be used to reduce fossil fuel dependency. The proposed research capitalizes on prior BES-supported efforts on over-expression and purification of AlkB to explore the inner workings of a bioenergy-relevant membrane-bound enzyme.
Laidlaw, Toni Suzuki; Kaufman, David M; MacLeod, Heather; van Zanten, Sander; Simpson, David; Wrixon, William
2006-01-01
A substantial body of literature demonstrates that communication skills in medicine can be taught and retained through teaching and practice. Considerable evidence also reveals that characteristics such as gender, age, language and attitudes affect communication skills performance. Our study examined the characteristics, attitudes and prior communication skills training of residents to determine the relationship of each to patient-doctor communication. The relationship between communication skills proficiency and clinical knowledge application (biomedical and ethical) was also examined through the use of doctor-developed clinical content checklists, as very little research has been conducted in this area. A total of 78 first- and second-year residents across all departments at Dalhousie Medical School participated in a videotaped 4-station objective structured clinical examination presenting a range of communication and clinical knowledge challenges. A variety of instruments were used to gather information and assess performance. Two expert raters evaluated the videotapes. Significant relationships were observed between resident characteristics, prior communication skills training, clinical knowledge and communication skills performance. Females, younger residents and residents with English as first language scored significantly higher, as did residents with prior communication skills training. A significant positive relationship was found between the clinical content checklist and communication performance. Gender was the only characteristic related significantly to attitudes. Gender, age, language and prior communication skills training are related to communication skills performance and have implications for resident education. The positive relationship between communication skills proficiency and clinical knowledge application is important and should be explored further.
Informative priors on fetal fraction increase power of the noninvasive prenatal screen.
Xu, Hanli; Wang, Shaowei; Ma, Lin-Lin; Huang, Shuai; Liang, Lin; Liu, Qian; Liu, Yang-Yang; Liu, Ke-Di; Tan, Ze-Min; Ban, Hao; Guan, Yongtao; Lu, Zuhong
2017-11-09
Purpose: Noninvasive prenatal screening (NIPS) sequences a mixture of the maternal and fetal cell-free DNA. Fetal trisomy can be detected by examining chromosomal dosages estimated from sequencing reads. The traditional method uses the Z-test, which compares a subject against a set of euploid controls, where the information of fetal fraction is not fully utilized. Here we present a Bayesian method that leverages informative priors on the fetal fraction. Method: Our Bayesian method combines the Z-test likelihood and informative priors of the fetal fraction, which are learned from the sex chromosomes, to compute Bayes factors. The Bayesian framework can account for nongenetic risk factors through the prior odds, and our method can report individual positive/negative predictive values. Results: Our Bayesian method has more power than the Z-test method. We analyzed 3,405 NIPS samples and spotted at least 9 (of 51) possible Z-test false positives. Conclusion: Bayesian NIPS is more powerful than the Z-test method, is able to account for nongenetic risk factors through prior odds, and can report individual positive/negative predictive values. Genetics in Medicine advance online publication, 9 November 2017; doi:10.1038/gim.2017.186.
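As a toy illustration of the Bayes-factor construction (not the authors' implementation), one can average a shifted-Z likelihood over an informative fetal-fraction prior and divide by the euploid likelihood; the Beta prior, the dosage-sensitivity constant c, and all numbers below are invented:

```python
# Toy Bayes factor for trisomy: integrate the alternative likelihood
# over a fetal-fraction prior, compare against the euploid Z likelihood.
import numpy as np
from scipy import stats

def trisomy_bayes_factor(z, ff_a=8.0, ff_b=72.0, c=30.0, grid=2000):
    ff = np.linspace(1e-4, 0.5, grid)           # fetal-fraction grid
    prior = stats.beta.pdf(ff, ff_a, ff_b)      # informative prior on ff
    prior /= np.trapz(prior, ff)
    lik_t21 = stats.norm.pdf(z, loc=c * ff / 2.0, scale=1.0)  # shifted Z
    marginal_t21 = np.trapz(lik_t21 * prior, ff)  # integrate out ff
    lik_euploid = stats.norm.pdf(z, loc=0.0, scale=1.0)
    return marginal_t21 / lik_euploid

for z in (1.5, 3.0, 5.0):
    print(z, round(trisomy_bayes_factor(z), 3))
```

The key qualitative behavior survives the simplification: a borderline Z score is judged against how large a shift the subject's own fetal fraction could plausibly produce, rather than against a one-size-fits-all threshold.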
Anatomy structure creation and editing using 3D implicit surfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hibbard, Lyndon S.
2012-05-15
Purpose: To accurately reconstruct and interactively reshape the surfaces of 3D anatomy structures using small numbers of 2D contours drawn in the most visually informative views of 3D imagery. The innovation of this program is that the number of 2D contours can be very much smaller than the number of transverse sections, even for anatomy structures spanning many sections. This program can edit 3D structures from prior segmentations, including those from autosegmentation programs. The reconstruction and surface editing work with any image modality. Methods: Structures are represented by variational implicit surfaces defined by weighted sums of radial basis functions (RBFs). Such surfaces are smooth, continuous, and closed and can be reconstructed with RBFs optimally located to efficiently capture shape in any combination of transverse (T), sagittal (S), and coronal (C) views. The accuracy of implicit surface reconstructions was measured by comparisons with the corresponding expert-contoured surfaces in 103 prostate cancer radiotherapy plans. Editing a pre-existing surface is done by overdrawing its profiles in image views spanning the affected part of the structure, deleting an appropriate set of prior RBFs, and merging the remainder with the new edit contour RBFs. Two methods were devised to identify RBFs to be deleted based only on the geometry of the initial surface and the locations of the new RBFs. Results: Expert-contoured surfaces were compared with implicit surfaces reconstructed from them over varying numbers and combinations of T/S/C planes. Studies revealed that surface-surface agreement increases monotonically with increasing RBF-sample density, and that the rate of increase declines over the same range. These trends were observed for all surface agreement metrics and for all the organs studied--prostate, bladder, and rectum. In addition, S and C contours may convey more shape information than T views for CT studies in which the axial slice thickness is greater than the pixel size. Surface editing accuracy likewise improves with larger sampling densities, and the rate of improvement similarly declines over the same conditions. Conclusions: Implicit surfaces based on RBFs are accurate representations of anatomic structures and can be interactively generated or modified to correct segmentation errors. The number of input contours is typically smaller than the number of T contours spanned by the structure.
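A bare-bones sketch of fitting such a variational implicit surface: place RBF centers at the contour points plus small offsets along their normals, then solve a dense linear system for the weights. The kernel phi(r) = r^3, the offset, and the tiny ridge are illustrative choices; production implementations typically add a low-degree polynomial term as well.

```python
# Variational implicit surface: f = 0 on the surface, +1 outside, -1 inside.
import numpy as np

def fit_rbf_surface(pts, normals, eps=0.5):
    # Constraint centers: on-surface (0), outside (+1), inside (-1).
    centers = np.vstack([pts, pts + eps * normals, pts - eps * normals])
    values = np.concatenate([np.zeros(len(pts)),
                             np.ones(len(pts)), -np.ones(len(pts))])
    r = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    K = r ** 3                                  # kernel phi(r) = r^3
    w = np.linalg.solve(K + 1e-9 * np.eye(len(K)), values)

    def f(x):
        # Implicit function: negative inside, positive outside, ~0 on surface.
        d = np.linalg.norm(centers - x, axis=-1)
        return (d ** 3) @ w

    return f
```

The editing workflow in the abstract maps directly onto this representation: deleting RBFs removes rows/columns of the system local to the edited region, and the new edit-contour constraints are merged in before re-solving.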
22 CFR 129.8 - Prior notification.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 22 Foreign Relations 1 (2010-04-01): Prior notification, Section 129.8, Foreign Relations, DEPARTMENT OF STATE, INTERNATIONAL TRAFFIC IN ARMS REGULATIONS, REGISTRATION AND LICENSING OF ...,000, except for sharing of basic marketing information (e.g., information that does not include ...
Bayesian Phase II optimization for time-to-event data based on historical information.
Bertsche, Anja; Fleischer, Frank; Beyersmann, Jan; Nehmiz, Gerhard
2017-01-01
After exploratory drug development, companies face the decision whether to initiate confirmatory trials based on limited efficacy information. This proof-of-concept decision is typically performed after a Phase II trial studying a novel treatment versus either placebo or an active comparator. The article aims to optimize the design of such a proof-of-concept trial with respect to decision making. We incorporate historical information and develop pre-specified decision criteria accounting for the uncertainty of the observed treatment effect. We optimize these criteria based on sensitivity and specificity, given the historical information. Specifically, time-to-event data are considered in a randomized 2-arm trial with additional prior information on the control treatment. The proof-of-concept criterion uses treatment effect size, rather than significance. Criteria are defined on the posterior distribution of the hazard ratio given the Phase II data and the historical control information. Event times are exponentially modeled within groups, allowing for group-specific conjugate prior-to-posterior calculation. While a non-informative prior is placed on the investigational treatment, the control prior is constructed via the meta-analytic-predictive approach. The design parameters including sample size and allocation ratio are then optimized, maximizing the probability of taking the right decision. The approach is illustrated with an example in lung cancer.
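The group-specific conjugate prior-to-posterior calculation mentioned above is simple enough to sketch: with exponential event times, a Gamma(a, b) prior on the hazard updates to Gamma(a + d, b + T) after d events over total exposure T, and the posterior of the hazard ratio follows by Monte Carlo. All numerical values below are assumptions for illustration; the meta-analytic-predictive construction of the control prior is summarized here as a single informative Gamma prior.

```python
import numpy as np

rng = np.random.default_rng(1)

# Gamma-exponential conjugacy (rate parametrization): lambda ~ Gamma(a, b),
# d events over total exposure T  =>  posterior Gamma(a + d, b + T).
def posterior(a, b, d, T):
    return a + d, b + T

# Control arm: informative prior standing in for historical data (assumed);
# investigational arm: near non-informative prior, Phase II data only.
a_c, b_c = posterior(a=30.0, b=60.0, d=25, T=55.0)
a_t, b_t = posterior(a=0.01, b=0.01, d=15, T=50.0)

lam_c = rng.gamma(a_c, 1 / b_c, 100_000)
lam_t = rng.gamma(a_t, 1 / b_t, 100_000)
hr = lam_t / lam_c

# Go/no-go criterion on the effect size, not on significance:
print("P(HR < 0.8 | data) =", np.mean(hr < 0.8))
```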
A space-frequency multiplicative regularization for force reconstruction problems
NASA Astrophysics Data System (ADS)
Aucejo, M.; De Smet, O.
2018-05-01
Dynamic force reconstruction from vibration data is an ill-posed inverse problem. A standard approach to stabilize the reconstruction consists of using some prior information on the quantities to identify. This is generally done by including in the formulation of the inverse problem a regularization term as an additive or a multiplicative constraint. In the present article, a space-frequency multiplicative regularization is developed to identify mechanical forces acting on a structure. The proposed regularization strategy takes advantage of one's prior knowledge of the nature and the location of excitation sources, as well as that of their spectral content. Furthermore, it has the merit of being free from the preliminary definition of any regularization parameter. The validity of the proposed regularization procedure is assessed numerically and experimentally. It is more particularly pointed out that properly exploiting the space-frequency characteristics of the excitation field to be identified can improve the quality of the force reconstruction.
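As a hedged illustration of the parameter-free flavor of multiplicative regularization, consider minimizing J(f) = ||y - Hf||^2 * R(f) with R(f) = ||f||^2. Setting the gradient to zero yields a Tikhonov-like system whose effective parameter alpha = ||y - Hf||^2 / R(f) is recomputed from the current iterate, so no regularization parameter is chosen beforehand. The space-frequency regularizer of the article is richer than this scalar sketch.

```python
import numpy as np

def multiplicative_reg_solve(H, y, n_iter=20):
    """Sketch of multiplicative regularization with R(f) = ||f||^2.
    grad J = 0 gives (H.T H + alpha I) f = H.T y with
    alpha = ||y - H f||^2 / ||f||^2 updated at each iteration."""
    m, n = H.shape
    f = np.linalg.lstsq(H, y, rcond=None)[0]   # rough starting point
    for _ in range(n_iter):
        res = y - H @ f
        alpha = (res @ res) / max(f @ f, 1e-12)
        f = np.linalg.solve(H.T @ H + alpha * np.eye(n), H.T @ y)
    return f

# Tiny ill-posed demo with an assumed random operator
rng = np.random.default_rng(0)
H = rng.normal(size=(30, 60))
f_true = np.zeros(60); f_true[[5, 17]] = [1.0, -2.0]
y = H @ f_true + 0.01 * rng.normal(size=30)
print(np.round(multiplicative_reg_solve(H, y)[[5, 17]], 2))
```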
Zipf's word frequency law in natural language: a critical review and future directions.
Piantadosi, Steven T
2014-10-01
The frequency distribution of words has been a key object of study in statistical linguistics for the past 70 years. This distribution approximately follows a simple mathematical form known as Zipf's law. This article first shows that human language has a highly complex, reliable structure in the frequency distribution over and above this classic law, although prior data visualization methods have obscured this fact. A number of empirical phenomena related to word frequencies are then reviewed. These facts are chosen to be informative about the mechanisms giving rise to Zipf's law and are then used to evaluate many of the theoretical explanations of Zipf's law in language. No prior account straightforwardly explains all the basic facts or is supported with independent evaluation of its underlying assumptions. To make progress at understanding why language obeys Zipf's law, studies must seek evidence beyond the law itself, testing assumptions and evaluating novel predictions with new, independent data.
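The basic empirical check is easy to reproduce: rank words by frequency and fit the slope on log-log axes. The snippet below is a naive version of exactly the kind of fit the review cautions against, which is useful for seeing what such plots do and do not reveal; the inline text is a stand-in for a real corpus.

```python
from collections import Counter
import numpy as np

# Any sizable text will do; this inline sample just keeps the sketch runnable.
text = ("the quick brown fox jumps over the lazy dog " * 50 +
        "the cat sat on the mat " * 120).split()

freqs = np.array(sorted(Counter(text).values(), reverse=True), float)
ranks = np.arange(1, len(freqs) + 1)

# Zipf's law: f(r) ~ r^(-s) with s near 1, a straight line on log-log axes.
# A naive least-squares fit like this is the kind of summary the review
# argues can obscure the richer structure in the distribution.
s, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
print("fitted exponent:", -s)
```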
Supervised Learning for Dynamical System Learning.
Hefny, Ahmed; Downey, Carlton; Gordon, Geoffrey J
2015-01-01
Recently there has been substantial interest in spectral methods for learning dynamical systems. These methods are popular since they often offer a good tradeoff between computational and statistical efficiency. Unfortunately, they can be difficult to use and extend in practice: e.g., they can make it difficult to incorporate prior information such as sparsity or structure. To address this problem, we present a new view of dynamical system learning: we show how to learn dynamical systems by solving a sequence of ordinary supervised learning problems, thereby allowing users to incorporate prior knowledge via standard techniques such as L1 regularization. Many existing spectral methods are special cases of this new framework, using linear regression as the supervised learner. We demonstrate the effectiveness of our framework by showing examples where nonlinear regression or lasso let us learn better state representations than plain linear regression does; the correctness of these instances follows directly from our general analysis.
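A toy version of this reduction, under assumed dynamics: treat a window of past observations as a stand-in for the latent state and regress the next observation on it with any off-the-shelf supervised learner. Swapping the ridge penalty for an L1 penalty is how sparsity priors enter.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Simulate a 2D linear dynamical system observed with noise (assumed model).
A = np.array([[0.9, 0.2], [-0.2, 0.9]])
x, obs = np.zeros(2), []
for _ in range(2000):
    x = A @ x + 0.1 * rng.normal(size=2)
    obs.append(x + 0.05 * rng.normal(size=2))
obs = np.asarray(obs)

# Past windows act as a proxy for the latent state (the spectral-method
# idea); mapping past window -> next observation is plain regression.
k = 3
past = np.hstack([obs[i:len(obs) - k + i] for i in range(k)])
future = obs[k:]

# Any supervised learner plugs in here; replacing Ridge with Lasso adds an
# L1 (sparsity) prior, the kind of side information spectral methods lack.
model = Ridge(alpha=1.0).fit(past, future)
print("one-step prediction R^2:", round(model.score(past, future), 3))
```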
Bayesian hierarchical functional data analysis via contaminated informative priors.
Scarpa, Bruno; Dunson, David B
2009-09-01
A variety of flexible approaches have been proposed for functional data analysis, allowing both the mean curve and the distribution about the mean to be unknown. Such methods are most useful when there is limited prior information. Motivated by applications to modeling of temperature curves in the menstrual cycle, this article proposes a flexible approach for incorporating prior information in semiparametric Bayesian analyses of hierarchical functional data. The proposed approach is based on specifying the distribution of functions as a mixture of a parametric hierarchical model and a nonparametric contamination. The parametric component is chosen based on prior knowledge, while the contamination is characterized as a functional Dirichlet process. In the motivating application, the contamination component allows unanticipated curve shapes in unhealthy menstrual cycles. Methods are developed for posterior computation, and the approach is applied to data from a European fecundability study.
Discovering Network Structure Beyond Communities
NASA Astrophysics Data System (ADS)
Nishikawa, Takashi; Motter, Adilson E.
2011-11-01
To understand the formation, evolution, and function of complex systems, it is crucial to understand the internal organization of their interaction networks. Partly due to the impossibility of visualizing large complex networks, resolving network structure remains a challenging problem. Here we overcome this difficulty by combining the visual pattern recognition ability of humans with the high processing speed of computers to develop an exploratory method for discovering groups of nodes characterized by common network properties, including but not limited to communities of densely connected nodes. Without any prior information about the nature of the groups, the method simultaneously identifies the number of groups, the group assignment, and the properties that define these groups. The results of applying our method to real networks suggest the possibility that most group structures lurk undiscovered in the fast-growing inventory of social, biological, and technological networks of scientific interest.
Aydogan, Gökhan; Flaig, Nicole; Ravi, Srekar N; Large, Edward W; McClure, Samuel M; Margulis, Elizabeth Hellmuth
2018-04-18
Prior expectations can bias evaluative judgments of sensory information. We show that information about a performer's status can bias the evaluation of musical stimuli, reflected by differential activity of the ventromedial prefrontal cortex (vmPFC). Moreover, we demonstrate that decreased susceptibility to this confirmation bias is (a) accompanied by the recruitment of and (b) correlated with the white-matter structure of the executive control network, particularly related to the dorsolateral prefrontal cortex (dlPFC). By using long-duration musical stimuli, we were able to track the initial biasing, subsequent perception, and ultimate evaluation of the stimuli, examining the full evolution of these biases over time. Our findings confirm the persistence of confirmation bias effects even when ample opportunity exists to gather information about true stimulus quality, and underline the importance of executive control in reducing bias.
Standardizing the information architecture for spacecraft operations
NASA Technical Reports Server (NTRS)
Easton, C. R.
1994-01-01
This paper presents an information architecture developed for the Space Station Freedom as a model from which to derive an information architecture standard for advanced spacecraft. The information architecture provides a way of making information available across a program, and among programs, assuming that the information will be in a variety of local formats, structures and representations. It provides a format that can be expanded to define all of the physical and logical elements that make up a program, add definitions as required, and import definitions from prior programs to a new program. It allows a spacecraft and its control center to work in different representations and formats, with the potential for supporting existing spacecraft from new control centers. It supports a common view of data and control of all spacecraft, regardless of their own internal view of their data and control characteristics, and of their communications standards, protocols and formats. This information architecture is central to standardizing spacecraft operations, in that it provides a basis for information transfer and translation, such that diverse spacecraft can be monitored and controlled in a common way.
McLuckey, Scott A.; Mentinova, Marija
2011-01-01
A range of strategies and tools has been developed to facilitate the determination of primary structures of analyte molecules of interest via tandem mass spectrometry (MS/MS). The two main factors that determine the primary structural information present in an MS/MS spectrum are the type of ion generated from the analyte molecule and the dissociation method. The ion-type subjected to dissociation is determined by the ionization method/conditions and ion transformation processes that might take place after initial gas-phase ion formation. Furthermore, the range of analyte-related ion types can be expanded via derivatization reactions prior to mass spectrometry. Dissociation methods include those that simply alter the population of internal states of the mass-selected ion (i.e., activation methods like collision-induced dissociation) as well as processes that rely on transformation of the ion-type prior to dissociation (e.g., electron capture dissociation). A variety of ionic interactions has been studied for the purpose of ion dissociation and ion transformation that include ion/neutral, ion/photon, ion/electron, and ion/ion interactions. A wide range of phenomena has been observed, many of which have been explored/developed as means for structural analysis. The techniques arising from these phenomena are discussed within the context of the elements of structure determination in tandem mass spectrometry, viz., ion-type definition and dissociation. Unique aspects of the various ion interactions are emphasized along with any barriers to widespread implementation. PMID:21472539
2018-01-01
Abstract In real-world environments, humans comprehend speech by actively integrating prior knowledge (P) and expectations with sensory input. Recent studies have revealed effects of prior information in temporal and frontal cortical areas and have suggested that these effects are underpinned by enhanced encoding of speech-specific features, rather than a broad enhancement or suppression of cortical activity. However, in terms of the specific hierarchical stages of processing involved in speech comprehension, the effects of integrating bottom-up sensory responses and top-down predictions are still unclear. In addition, it is unclear whether the predictability that comes with prior information may differentially affect speech encoding relative to the perceptual enhancement that comes with that prediction. One way to investigate these issues is through examining the impact of P on indices of cortical tracking of continuous speech features. Here, we did this by presenting participants with degraded speech sentences that either were or were not preceded by a clear recording of the same sentences while recording non-invasive electroencephalography (EEG). We assessed the impact of prior information on an isolated index of cortical tracking that reflected phoneme-level processing. Our findings suggest the possibility that prior information affects the early encoding of natural speech in a dual manner. Firstly, the availability of prior information, as hypothesized, enhanced the perceived clarity of degraded speech, which was positively correlated with changes in phoneme-level encoding across subjects. In addition, P induced an overall reduction of this cortical measure, which we interpret as resulting from the increase in predictability. PMID:29662947
The Effects of Prior Knowledge Activation on Free Recall and Study Time Allocation.
ERIC Educational Resources Information Center
Machiels-Bongaerts, Maureen; And Others
The effects of mobilizing prior knowledge on information processing were studied. Two hypotheses, the cognitive set-point hypothesis and the selective attention hypothesis, try to account for the facilitation effects of prior knowledge activation. These hypotheses predict different recall patterns as a result of mobilizing prior knowledge. In…
21 CFR 1.280 - How must you submit prior notice?
Code of Federal Regulations, 2010 CFR
2010-04-01
... to FDA. You must submit all prior notice information in the English language, except that an... Commercial System (ABI/ACS); or (2) The FDA PNSI at http://www.access.fda.gov. You must submit prior notice through the FDA Prior Notice System Interface (FDA PNSI) for articles of food imported or offered for...
The Counter-Intuitive Non-Informative Prior for the Bernoulli Family
ERIC Educational Resources Information Center
Zhu, Mu; Lu, Arthur Y.
2004-01-01
In Bayesian statistics, the choice of the prior distribution is often controversial. Different rules for selecting priors have been suggested in the literature, which, sometimes, produce priors that are difficult for the students to understand intuitively. In this article, we use a simple heuristic to illustrate to the students the rather…
van de Schoot, Rens; Broere, Joris J.; Perryck, Koen H.; Zondervan-Zwijnenburg, Mariëlle; van Loey, Nancy E.
2015-01-01
Background: The analysis of small data sets in longitudinal studies can lead to power issues and often suffers from biased parameter values. These issues can be solved by using Bayesian estimation in conjunction with informative prior distributions. By means of a simulation study and an empirical example concerning posttraumatic stress symptoms (PTSS) following mechanical ventilation in burn survivors, we demonstrate the advantages and potential pitfalls of using Bayesian estimation. Methods: First, we show how to specify prior distributions, and by means of a sensitivity analysis we demonstrate how to check the exact influence of the prior (mis-)specification. Thereafter, we show by means of a simulation the situations in which the Bayesian approach outperforms the default maximum likelihood approach. Finally, we re-analyze empirical data on burn survivors which provided preliminary evidence of an aversive influence of a period of mechanical ventilation on the course of PTSS following burns. Results: Not surprisingly, maximum likelihood estimation showed insufficient coverage as well as power with very small samples. Only when Bayesian analysis, in conjunction with informative priors, was used did power increase to acceptable levels. As expected, we showed that the smaller the sample size, the more the results rely on the prior specification. Conclusion: We show that two issues often encountered during analysis of small samples, power and biased parameters, can be solved by including prior information in Bayesian analysis. We argue that the use of informative priors should always be reported together with a sensitivity analysis. PMID:25765534
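A minimal sketch of the prior sensitivity analysis the authors recommend, using a conjugate normal-normal model with known residual scale (all numbers assumed): re-running the update across prior widths shows how strongly a small sample leans on the prior.

```python
import numpy as np

def posterior_mean_sd(y, prior_mu, prior_sd, sigma=1.0):
    """Conjugate normal-normal update for a mean with known sigma."""
    n = len(y)
    prec = 1 / prior_sd**2 + n / sigma**2
    mu = (prior_mu / prior_sd**2 + y.sum() / sigma**2) / prec
    return mu, np.sqrt(1 / prec)

rng = np.random.default_rng(3)
y = rng.normal(0.5, 1.0, size=8)        # a "small sample"

# Sensitivity analysis: re-run with increasingly informative priors.
for prior_sd in (10.0, 1.0, 0.2):
    mu, sd = posterior_mean_sd(y, prior_mu=0.4, prior_sd=prior_sd)
    print(f"prior sd {prior_sd:>5}: posterior {mu:.3f} +/- {sd:.3f}")
```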
Heidari, M.; Ranjithan, S.R.
1998-01-01
In using non-linear optimization techniques for estimation of parameters in a distributed ground water model, the initial values of the parameters and prior information about them play important roles. In this paper, the genetic algorithm (GA) is combined with the truncated-Newton search technique to estimate groundwater parameters for a confined steady-state ground water model. Use of prior information about the parameters is shown to be important in estimating correct or near-correct values of parameters on a regional scale. The amount of prior information needed for an accurate solution is estimated by evaluation of the sensitivity of the performance function to the parameters. For the example presented here, it is experimentally demonstrated that only one piece of prior information of the least sensitive parameter is sufficient to arrive at the global or near-global optimum solution. For hydraulic head data with measurement errors, the error in the estimation of parameters increases as the standard deviation of the errors increases. Results from our experiments show that, in general, the accuracy of the estimated parameters depends on the level of noise in the hydraulic head data and the initial values used in the truncated-Newton search technique.
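A compact sketch of the two-stage idea, with a random-candidate search standing in for the GA and SciPy's truncated-Newton method ("TNC") as the local stage; the single quadratic penalty encodes one piece of prior information about one parameter. The linear toy model and all values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy "model": heads h = G(k) for parameters k, plus measurement noise.
G = rng.normal(size=(20, 4))
k_true = np.array([2.0, 0.5, 1.5, 1.0])
h_obs = G @ k_true + 0.05 * rng.normal(size=20)

k_prior, w = 1.0, 50.0   # one piece of prior information on parameter 3

def objective(k):
    misfit = np.sum((G @ k - h_obs) ** 2)
    prior = w * (k[3] - k_prior) ** 2    # penalize departure from the prior
    return misfit + prior

# Crude global stage (a stand-in for the GA): best of random candidates...
cands = rng.uniform(0, 3, size=(200, 4))
k0 = cands[np.argmin([objective(c) for c in cands])]
# ...refined by a truncated-Newton local search, as in the paper.
res = minimize(objective, k0, method="TNC")
print(np.round(res.x, 2))
```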
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paudel, M. R.; Mackenzie, M.; Rathee, S.
2013-08-15
Purpose: To evaluate the metal artifacts in kilovoltage computed tomography (kVCT) images that are corrected using a normalized metal artifact reduction (NMAR) method with megavoltage CT (MVCT) prior images. Methods: Tissue characterization phantoms containing bilateral steel inserts are used in all experiments. Two MVCT images, one without any metal artifact corrections and the other corrected using a modified iterative maximum likelihood polychromatic algorithm for CT (IMPACT), are translated to pseudo-kVCT images. These are then used as prior images without tissue classification in an NMAR technique for correcting the experimental kVCT image. The IMPACT method in MVCT included an additional model for the pair/triplet production process and the energy-dependent response of the MVCT detectors. An experimental kVCT image, without the metal inserts and reconstructed using the filtered back projection (FBP) method, is artificially patched with the known steel inserts to get a reference image. The regular NMAR image containing the steel inserts that uses a tissue-classified kVCT prior and the NMAR images reconstructed using MVCT priors are compared with the reference image for metal artifact reduction. The Eclipse treatment planning system is used to calculate radiotherapy dose distributions on the corrected images and on the reference image using the Anisotropic Analytical Algorithm with 6 MV parallel opposed 5 × 10 cm² fields passing through the bilateral steel inserts, and the results are compared. Gafchromic film is used to measure the actual dose delivered in a plane perpendicular to the beams at the isocenter. Results: The streaking and shading in the NMAR image using tissue classifications are significantly reduced. However, the structures, including metal, are deformed. Some uniform regions appear to have eroded from one side. There is a large variation of attenuation values inside the metal inserts. Similar results are seen in the commercially corrected image. Use of MVCT prior images without tissue classification in NMAR significantly reduces these problems. The radiation dose calculated on the reference image is close to the dose measured using the film. Compared to the reference image, the calculated dose difference in the conventional NMAR image, the corrected images using the uncorrected MVCT image, and the IMPACT-corrected MVCT image as priors is ∼15.5%, ∼5%, and ∼2.7%, respectively, at the isocenter. Conclusions: The deformation and erosion of the structures present in regular NMAR-corrected images can be largely reduced by using MVCT priors without tissue segmentation. Because the attenuation value of the metal is incorrect, large dose differences relative to the true value can result when using the conventional NMAR image. This difference can be significantly reduced if MVCT images are used as priors. Reduced tissue deformation, better tissue visualization, and correct information about the electron density of the tissues and metals in the artifact-corrected images could help delineate the structures better, as well as calculate radiation dose more correctly, thus enhancing the quality of radiotherapy treatment planning.
Active polarization descattering.
Treibitz, Tali; Schechner, Yoav Y
2009-03-01
Vision in scattering media is important but challenging. Images suffer from poor visibility due to backscattering and attenuation. Most prior methods for scene recovery use active illumination scanners (structured and gated), which can be slow and cumbersome, while natural illumination is inapplicable to dark environments. The current paper addresses the need for a non-scanning recovery method that uses active scene irradiance. We study the formation of images under widefield artificial illumination. Based on the formation model, the paper presents an approach for recovering the object signal. It also yields rough information about the 3D scene structure. The approach can work with compact, simple hardware, having active widefield, polychromatic polarized illumination. The camera is fitted with a polarization analyzer. Two frames of the scene are taken, with different states of the analyzer or polarizer. A recovery algorithm follows the acquisition. It allows both the backscatter and the object reflection to be partially polarized. It thus unifies and generalizes prior polarization-based methods, which had assumed exclusive polarization of either of these components. The approach is limited to an effective range due to image noise and illumination falloff. Thus, the limits and noise sensitivity are analyzed. We demonstrate the approach in underwater field experiments.
Moya, Cristina
2013-07-01
Ethnic categories uniquely structure human social worlds. People readily form stereotypes about these, and other social categories, but it is unclear whether certain dimensions are privileged for making predictions about strangers when information is limited. If humans have been living in culturally-structured groups for much of their evolutionary history, we might expect them to have adaptations for prioritizing ethno-linguistic cues as a basis for making predictions about others. We provide a strong test of this possibility through a series of studies in a field context along the Quechua-Aymara linguistic boundary in the Peruvian Altiplano where the language boundary is not particularly socially meaningful. We find evidence of such psychological priors among children and adults at this site by showing that their age, and the social categories' novelty affect participants' reliance on ethno-linguistic inductive inferences (i.e. one-to-many predictions). Studies 1-3 show that participants make more ethno-linguistic inferences when the social categories are more removed from their real-world context. Additionally, in Study 4 when the category is marked with acoustic cues of language use, young children rely heavily on ethno-linguistic predictions, even though adults do not.
Family Structure and Long-Term Care Insurance Purchase
Van Houtven, Courtney Harold; Coe, Norma B.; Konetzka, R. Tamara
2015-01-01
While it has long been assumed that family structure and potential sources of informal care play a large role in the purchase decisions for long-term care insurance (LTCI), current empirical evidence is inconclusive. Our study examines the relationship between family structure and LTCI purchase and addresses several major limitations of the prior literature by using a long panel of data and considering modern family relationships, such as presence of stepchildren. We find that family structure characteristics from one’s own generation, particularly about one’s spouse, are associated with purchase, but that few family structure attributes from the younger generation have an influence. Family factors that may indicate future caregiver supply are negatively associated with purchase: having a coresidential child, signaling close proximity, and having a currently working spouse, signaling a healthy and able spouse, that LTC planning has not occurred yet, or that there is less need for asset protection afforded by LTCI. Dynamic factors, such as increasing wealth or turning 65, are associated with higher likelihood of LTCI purchase. PMID:25760583
Seghouane, Abd-Krim; Iqbal, Asif
2017-09-01
Sequential dictionary learning algorithms have been successfully applied to functional magnetic resonance imaging (fMRI) data analysis. fMRI data sets are, however, structured data matrices with a notion of temporal smoothness in the column direction. This prior information, which can be converted into a constraint of smoothness on the learned dictionary atoms, has seldom been included in classical dictionary learning algorithms when applied to fMRI data analysis. In this paper, we tackle this problem by proposing two new sequential dictionary learning algorithms dedicated to fMRI data analysis that account for this prior information. These algorithms differ from existing ones in their dictionary update stage. The steps of this stage are derived as a variant of the power method for computing the SVD. The proposed algorithms generate regularized dictionary atoms via the solution of a left regularized rank-one matrix approximation problem in which temporal smoothness is enforced via regularization through basis expansion and sparse basis expansion in the dictionary update stage. Applications to synthetic data experiments and real fMRI data sets illustrating the performance of the proposed algorithms are provided.
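The dictionary update can be sketched as a power-method-like alternation in which the usual atom update is replaced by a linear solve against a smoothing penalty; the second-difference operator below is an assumed stand-in for the basis-expansion regularization used in the paper.

```python
import numpy as np

def smooth_rank1_update(E, lam=10.0, n_iter=50, seed=0):
    """Sketch: regularized rank-one approximation E ~ d a^T in which the
    temporal atom d is smoothed by a second-difference penalty lam*||Dd||^2.
    Each sweep is a power-method step with the left multiplication replaced
    by a linear solve, mirroring the SVD-variant update described above."""
    T, N = E.shape
    D = np.diff(np.eye(T), n=2, axis=0)     # second-difference operator
    M = np.eye(T) + lam * (D.T @ D)
    d = np.random.default_rng(seed).normal(size=T)
    for _ in range(n_iter):
        a = E.T @ d / (d @ d)               # code update (least squares)
        d = np.linalg.solve(M, E @ a)       # smoothed atom update
        d /= np.linalg.norm(d)
    return d, a

# Noisy rank-one data with a smooth time course (assumed toy example)
t = np.linspace(0, 1, 80)
E = np.outer(np.sin(2 * np.pi * t), np.random.default_rng(1).normal(size=40))
E += 0.3 * np.random.default_rng(2).normal(size=E.shape)
d, a = smooth_rank1_update(E)
```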
Bouhaddou, Omar; Davis, Mike; Donahue, Margaret; Mallia, Anthony; Griffin, Stephania; Teal, Jennifer; Nebeker, Jonathan
2016-01-01
Care coordination across healthcare organizations depends upon health information exchange. Various policies and laws govern permissible exchange, particularly when the information includes privacy-sensitive conditions. The Department of Veterans Affairs (VA) privacy policy has required either blanket consent or manual sensitivity review prior to exchanging any health information. The VA experience has been an expensive, administratively demanding burden on staff and Veterans alike, particularly for patients without privacy-sensitive conditions. Until recently, automatic sensitivity determination has not been feasible. This paper proposes a policy-driven algorithmic approach (Security Labeling Service or SLS) to health information exchange that automatically detects the presence or absence of specific privacy-sensitive conditions and then requires a Veteran-signed consent for release only when such conditions are actually present. The SLS was applied successfully to a sample of real patient Consolidated Clinical Document Architecture (C-CDA) documents. The SLS identified standard terminology codes by both parsing structured entries and analyzing textual information using Natural Language Processing (NLP). PMID:28269828
Physicians' Perceptions and Use of a Health Information Exchange: A Pilot Program in South Korea
Lee, Sang-il; Kim, Jeong-Whun; Hwang, Hee; Cho, Eun-Young; Kim, Yoon; Ha, Kyooseob
2012-01-01
Objective: We examined physicians' perceived needs, benefits, and concerns regarding health information exchange (HIE) technology prior to experiencing it and their actual usage of exchanged information in care processes during an HIE pilot program in South Korea. Materials and Methods: We conducted a survey through a structured questionnaire to collect data on physician perceptions about an HIE prior to implementation. We analyzed responses using descriptive statistics and one-way analyses of variance. We also conducted a post-implementation survey through a computerized tool designed to collect data on physician assessment of HIE item usefulness. We defined two indices to measure the volume of information flow and usefulness for individual HIE items and analyzed the indices with Fisher's exact test. Results: Physicians' overall perceptions about the need for an HIE and benefits of the technology were positive despite their concerns about information safety and security, system costs, and disputes between care providers in cases of malpractice. We found that physician practice settings significantly influenced the details of their perceptions. In both the pre- and post-implementation studies, the most needed and valued information were pathology and lab results, diagnostic imaging, medication, and working diagnosis. Physicians most agreed with the benefit potentials in the quality domain, least agreed with those in time and cost savings of healthcare delivery, and least worried about the decrease in revenues resulting from the technology. Conclusions: The overall physician acceptance of the HIE technology in South Korea is promising, but the adoption and diffusion strategy needs to be tailored to the type of physician practice. Concerted efforts may be needed to realize the much-anticipated potential of healthcare cost savings. PMID:22352898
Adaptive allocation for binary outcomes using decreasingly informative priors.
Sabo, Roy T
2014-01-01
A method of outcome-adaptive allocation is presented using Bayes methods, where a natural lead-in is incorporated through the use of informative yet skeptical prior distributions for each treatment group. These prior distributions are modeled on unobserved data in such a way that their influence on the allocation scheme decreases as the trial progresses. Simulation studies show this method to behave comparably to the Bayesian adaptive allocation method described by Thall and Wathen (2007), who incorporate a natural lead-in through sample-size-based exponents.
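One way to emulate a decreasingly informative prior (a sketch, not the paper's exact construction): encode the skeptical prior as pseudo-observations whose weight decays as enrollment approaches the planned total, then allocate by the posterior probability that one arm is better.

```python
import numpy as np

rng = np.random.default_rng(7)

def alloc_prob(sA, nA, sB, nB, n, N, m0=10, p0=0.5, draws=20_000):
    """Sketch: a skeptical prior of m0 pseudo-observations at rate p0 whose
    weight w decays as the trial accrues (n of N planned patients)."""
    w = m0 * max(1 - n / N, 0.0)
    a = rng.beta(w * p0 + sA + 1, w * (1 - p0) + nA - sA + 1, draws)
    b = rng.beta(w * p0 + sB + 1, w * (1 - p0) + nB - sB + 1, draws)
    return np.mean(a > b)          # posterior probability arm A is better

# Early in the trial the skeptical prior pulls allocation toward 0.5:
print(alloc_prob(sA=9, nA=15, sB=5, nB=15, n=30, N=100))
```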
Proactive interference effects on sentence production
Ferreira, Victor S.; Firato, Carla E.
2007-01-01
Proactive interference refers to recall difficulties caused by prior similar memory-related processing. Information-processing approaches to sentence production predict that retrievability affects sentence form: Speakers may word sentences so that material that is difficult to retrieve is spoken later. In this experiment, speakers produced sentence structures that could include an optional that, thereby delaying the mention of a subsequent noun phrase. This subsequent noun phrase was either (1) conceptually similar to three previous noun phrases in the same sentence, leading to greater proactive interference, or (2) conceptually dissimilar, leading to less proactive interference. Speakers produced more thats (and were more disfluent) before conceptually similar noun phrases, suggesting that retrieval difficulties during sentence production affect the syntactic structures of sentences that speakers produce. PMID:12613685
Hyde, Damon; Schulz, Ralf; Brooks, Dana; Miller, Eric; Ntziachristos, Vasilis
2009-04-01
Hybrid imaging systems combining x-ray computed tomography (CT) and fluorescence tomography can improve fluorescence imaging performance by incorporating anatomical x-ray CT information into the optical inversion problem. While the use of image priors has been investigated in the past, little is known about the optimal use of forward photon propagation models in hybrid optical systems. In this paper, we explore the impact on reconstruction accuracy of the use of propagation models of varying complexity, specifically in the context of these hybrid imaging systems where significant structural information is known a priori. Our results demonstrate that the use of generically known parameters provides near optimal performance, even when parameter mismatch remains.
Advanced prior modeling for 3D bright field electron tomography
NASA Astrophysics Data System (ADS)
Sreehari, Suhas; Venkatakrishnan, S. V.; Drummy, Lawrence F.; Simmons, Jeffrey P.; Bouman, Charles A.
2015-03-01
Many important imaging problems in material science involve reconstruction of images containing repetitive non-local structures. Model-based iterative reconstruction (MBIR) could in principle exploit such redundancies through the selection of a log prior probability term. However, in practice, determining such a log prior term that accounts for the similarity between distant structures in the image is quite challenging. Much progress has been made in the development of denoising algorithms like non-local means and BM3D, and these are known to successfully capture non-local redundancies in images. But the fact that these denoising operations are not explicitly formulated as cost functions makes it unclear as to how to incorporate them in the MBIR framework. In this paper, we formulate a solution to bright field electron tomography by augmenting the existing bright field MBIR method to incorporate any non-local denoising operator as a prior model. We accomplish this using a framework we call plug-and-play priors that decouples the log likelihood and the log prior probability terms in the MBIR cost function. We specifically use 3D non-local means (NLM) as the prior model in the plug-and-play framework, and showcase high quality tomographic reconstructions of a simulated aluminum spheres dataset, and two real datasets of aluminum spheres and ferritin structures. We observe that streak and smear artifacts are visibly suppressed, and that edges are preserved. Also, we report lower RMSE values compared to the conventional MBIR reconstruction using qGGMRF as the prior model.
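The decoupling itself fits in a few lines. In the plug-and-play ADMM sketch below, the log-likelihood enters through a least-squares solve and the prior enters only through a denoiser call, which is where 3D non-local means would plug in; the smoothing filter here is a toy stand-in, and the forward operator is assumed.

```python
import numpy as np

def pnp_admm(A, y, denoise, rho=1.0, n_iter=30):
    """Plug-and-play ADMM sketch: the prior enters only through an
    off-the-shelf denoiser, never as an explicit log-prior term.
    A : forward operator (m x n), y : data, denoise : x -> denoised x."""
    n = A.shape[1]
    x = np.zeros(n); v = np.zeros(n); u = np.zeros(n)
    AtA, Aty = A.T @ A, A.T @ y
    for _ in range(n_iter):
        # data-fidelity step (log-likelihood term):
        x = np.linalg.solve(AtA + rho * np.eye(n), Aty + rho * (v - u))
        # prior step: any denoiser (e.g., non-local means) plugs in here
        v = denoise(x + u)
        u += x - v            # dual update
    return v

# Toy usage: a moving-average "denoiser" standing in for 3D NLM
smooth = lambda z: np.convolve(z, np.ones(5) / 5, mode="same")
rng = np.random.default_rng(2)
A = rng.normal(size=(40, 80))
x_true = np.sin(np.linspace(0, 3 * np.pi, 80))
y = A @ x_true + 0.1 * rng.normal(size=40)
print(pnp_admm(A, y, smooth)[:5].round(2))
```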
Mueller, Jutta L; Rueschemeyer, Shirley-Ann; Ono, Kentaro; Sugiura, Motoaki; Sadato, Norihiro; Nakamura, Akinori
2014-01-01
The present study used functional magnetic resonance imaging (fMRI) to investigate the neural correlates of language acquisition in a realistic learning environment. Japanese native speakers were trained in a miniature version of German prior to fMRI scanning. During scanning they listened to (1) familiar sentences, (2) sentences including a novel sentence structure, and (3) sentences containing a novel word while visual context provided referential information. Learning-related decreases of brain activation over time were found in a mainly left-hemispheric network comprising classical frontal and temporal language areas as well as parietal and subcortical regions and were largely overlapping for novel words and the novel sentence structure in initial stages of learning. Differences occurred at later stages of learning during which content-specific activation patterns in prefrontal, parietal and temporal cortices emerged. The results are taken as evidence for a domain-general network supporting the initial stages of language learning which dynamically adapts as learners become proficient.
7 CFR 1955.145 - Land acquisition to effect sale.
Code of Federal Regulations, 2010 CFR
2010-01-01
... organization-type loans, prior approval must be obtained from the appropriate Assistant Administrator prior to... and where it is economically feasible to relocate the structure thereby making it a program property... structure if economically feasible. The remaining NP parcel of land will be sold for its market value. (b...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-06
... Proposed Information Collection to OMB Multifamily Project Applications and Construction Prior to Initial... facilities is also required as part of the application for firm commitment for mortgage insurance. Project owners/sponsors may apply for permission to commence construction prior to initial endorsement. DATES...
Calibrated birth-death phylogenetic time-tree priors for bayesian inference.
Heled, Joseph; Drummond, Alexei J
2015-05-01
Here we introduce a general class of multiple calibration birth-death tree priors for use in Bayesian phylogenetic inference. All tree priors in this class separate ancestral node heights into a set of "calibrated nodes" and "uncalibrated nodes" such that the marginal distribution of the calibrated nodes is user-specified whereas the density ratio of the birth-death prior is retained for trees with equal values for the calibrated nodes. We describe two formulations, one in which the calibration information informs the prior on ranked tree topologies, through the (conditional) prior, and the other which factorizes the prior on divergence times and ranked topologies, thus allowing uniform, or any arbitrary prior distribution on ranked topologies. Although the first of these formulations has some attractive properties, the algorithm we present for computing its prior density is computationally intensive. However, the second formulation is always faster and computationally efficient for up to six calibrations. We demonstrate the utility of the new class of multiple-calibration tree priors using both small simulations and a real-world analysis and compare the results to existing schemes. The two new calibrated tree priors described in this article offer greater flexibility and control of prior specification in calibrated time-tree inference and divergence time dating, and will remove the need for indirect approaches to the assessment of the combined effect of calibration densities and tree priors in Bayesian phylogenetic inference. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
Merging information in geophysics: the triumvirat of geology, geophysics, and petrophysics
NASA Astrophysics Data System (ADS)
Revil, A.
2016-12-01
We know that geophysical inversion is non-unique and that many classical regularization techniques are unphysical. Despite this, we like to use them because of their simplicity and because geophysicists are often afraid to bias the inverse problem by introducing too much prior information (in a broad sense). It is also clear that geophysics is done on geological objects that are not random structures. Spending some time with a geologist in the field, before organizing a field geophysical campaign, is always an instructive experience. Finally, the measured properties are connected to physicochemical and textural parameters of the porous media and the interfaces between the various phases of a porous body. Some fundamental parameters may control the geophysical observations or their time variations. If we want to improve our geophysical tomograms, we need to be risk-takers and acknowledge, or rather embrace, the cross-fertilization arising from coupling geology, geophysics, and petrophysics. In this presentation, I will discuss various techniques to do so. They will include non-stationary geostatistical descriptors, facies deformation, cross-coupled petrophysical properties using petrophysical clustering, and image-guided inversion. I will show various applications to a number of relevant cases in hydrogeophysics. From these applications, it may become clear that there are many ways to address inverse or time-lapse inverse problems and geophysicists have to be pragmatic regarding the methods used depending on the degree of available prior information.
Language identification from visual-only speech signals
Ronquest, Rebecca E.; Levi, Susannah V.; Pisoni, David B.
2010-01-01
Our goal in the present study was to examine how observers identify English and Spanish from visual-only displays of speech. First, we replicated the recent findings of Soto-Faraco et al. (2007) with Spanish and English bilingual and monolingual observers using different languages and a different experimental paradigm (identification). We found that prior linguistic experience affected response bias but not sensitivity (Experiment 1). In two additional experiments, we investigated the visual cues that observers use to complete the language-identification task. The results of Experiment 2 indicate that some lexical information is available in the visual signal but that it is limited. Acoustic analyses confirmed that our Spanish and English stimuli differed acoustically with respect to linguistic rhythmic categories. In Experiment 3, we tested whether this rhythmic difference could be used by observers to identify the language when the visual stimuli are temporally reversed, thereby eliminating lexical information but retaining rhythmic differences. The participants performed above chance even in the backward condition, suggesting that the rhythmic differences between the two languages may aid language identification in visual-only speech signals. The results of Experiments 3A and 3B also confirm previous findings that increased stimulus length facilitates language identification. Taken together, the results of these three experiments replicate earlier findings and also show that prior linguistic experience, lexical information, rhythmic structure, and utterance length influence visual-only language identification. PMID:20675804
Fan, Yue; Wang, Xiao; Peng, Qinke
2017-01-01
Gene regulatory networks (GRNs) play an important role in cellular systems and are important for understanding biological processes. Many algorithms have been developed to infer the GRNs. However, most algorithms only pay attention to the gene expression data but do not consider the topology information in their inference process, while incorporating this information can partially compensate for the lack of reliable expression data. Here we develop a Bayesian group lasso with spike and slab priors to perform gene selection and estimation for nonparametric models. B-spline basis functions are used to capture the nonlinear relationships flexibly and penalties are used to avoid overfitting. Further, we incorporate the topology information into the Bayesian method as a prior. We present the application of our method on DREAM3 and DREAM4 datasets and two real biological datasets. The results show that our method performs better than existing methods and the topology information prior can improve the result.
Terzyan, Simon S; Burgett, Anthony W G; Heroux, Annie; Smith, Clyde A; Mooers, Blaine H M; Hanigan, Marie H
2015-07-10
γ-Glutamyl transpeptidase 1 (GGT1) is a cell surface, N-terminal nucleophile hydrolase that cleaves glutathione and other γ-glutamyl compounds. GGT1 expression is essential in cysteine homeostasis, and its induction has been implicated in the pathology of asthma, reperfusion injury, and cancer. In this study, we report four new crystal structures of human GGT1 (hGGT1) that show conformational changes within the active site as the enzyme progresses from the free enzyme to inhibitor-bound tetrahedral transition states and finally to the glutamate-bound structure prior to the release of this final product of the reaction. The structure of the apoenzyme shows flexibility within the active site. The serine-borate-bound hGGT1 crystal structure demonstrates that serine-borate occupies the active site of the enzyme, resulting in an enzyme-inhibitor complex that replicates the enzyme's tetrahedral intermediate/transition state. The structure of GGsTop-bound hGGT1 reveals its interactions with the enzyme and why neutral phosphonate diesters are more potent inhibitors than monoanionic phosphonates. These structures are the first structures for any eukaryotic GGT that include a molecule in the active site covalently bound to the catalytic Thr-381. The glutamate-bound structure shows the conformation of the enzyme prior to release of the final product and reveals novel information regarding the displacement of the main chain atoms that form the oxyanion hole and movement of the lid loop region when the active site is occupied. These data provide new insights into the mechanism of hGGT1-catalyzed reactions and will be invaluable in the development of new classes of hGGT1 inhibitors for therapeutic use. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.
Influence of prior information on pain involves biased perceptual decision-making.
Wiech, Katja; Vandekerckhove, Joachim; Zaman, Jonas; Tuerlinckx, Francis; Vlaeyen, Johan W S; Tracey, Irene
2014-08-04
Prior information about features of a stimulus is a strong modulator of perception. For instance, the prospect of more intense pain leads to an increased perception of pain, whereas the expectation of analgesia reduces pain, as shown in placebo analgesia and expectancy modulations during drug administration. This influence is commonly assumed to be rooted in altered sensory processing, and expectancy-related modulations in the spinal cord are often taken as evidence for this notion. Contemporary models of perception, however, suggest that prior information can also modulate perception by biasing perceptual decision-making: the inferential process underlying perception in which prior information is used to interpret sensory information. In this type of bias, the information is already present in the system before the stimulus is observed. Computational models can distinguish between changes in sensory processing and altered decision-making because they result in different response times for incorrect choices in a perceptual decision-making task (Figure S1A,B). Using a drift-diffusion model, we investigated the influence of both processes in two independent experiments. The results of both experiments strongly suggest that these changes in pain perception are predominantly based on altered perceptual decision-making. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
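The diagnostic logic, error response times distinguishing a start-point bias from a drift-rate change, can be simulated directly (a sketch with assumed parameters, not the fitted model from the paper):

```python
import numpy as np

rng = np.random.default_rng(4)

def ddm_trial(drift, start, bound=1.0, dt=0.001, noise=1.0):
    """One drift-diffusion trial; returns (choice, reaction time).
    Prior information can enter as a shifted start point (decision bias)
    or as a changed drift rate (altered sensory evidence)."""
    x, t = start, 0.0
    while abs(x) < bound:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return (x >= bound), t

def mean_error_rt(drift, start, n=500):
    rts = [t for c, t in (ddm_trial(drift, start) for _ in range(n)) if not c]
    return np.mean(rts)

# Two loci for the prior's influence yield different error RT signatures:
print("bias in start point, error RT:", round(mean_error_rt(0.8, 0.3), 3))
print("bias in drift rate,  error RT:", round(mean_error_rt(1.2, 0.0), 3))
```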
Multilevel modeling of single-case data: A comparison of maximum likelihood and Bayesian estimation.
Moeyaert, Mariola; Rindskopf, David; Onghena, Patrick; Van den Noortgate, Wim
2017-12-01
The focus of this article is to describe Bayesian estimation, including construction of prior distributions, and to compare parameter recovery under the Bayesian framework (using weakly informative priors) and the maximum likelihood (ML) framework in the context of multilevel modeling of single-case experimental data. Bayesian estimation results were found similar to ML estimation results in terms of the treatment effect estimates, regardless of the functional form and degree of information included in the prior specification in the Bayesian framework. In terms of the variance component estimates, both the ML and Bayesian estimation procedures result in biased and less precise variance estimates when the number of participants is small (i.e., 3). By increasing the number of participants to 5 or 7, the relative bias is close to 5% and more precise estimates are obtained for all approaches, except for the inverse-Wishart prior using the identity matrix. When a more informative prior was added, more precise estimates for the fixed effects and random effects were obtained, even when only 3 participants were included. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Hobbs, Brian P.; Carlin, Bradley P.; Mandrekar, Sumithra J.; Sargent, Daniel J.
2011-01-01
Bayesian clinical trial designs offer the possibility of a substantially reduced sample size, increased statistical power, and reductions in cost and ethical hazard. However, when prior and current information conflict, Bayesian methods can lead to higher than expected Type I error, as well as the possibility of a costlier and lengthier trial. This motivates an investigation of the feasibility of hierarchical Bayesian methods for incorporating historical data that are adaptively robust to prior information that reveals itself to be inconsistent with the accumulating experimental data. In this paper, we present several models that allow the commensurability of the information in the historical and current data to determine how much historical information is used. A primary tool is an elaboration of the traditional power prior approach based upon a measure of commensurability for Gaussian data. We compare the frequentist performance of several methods using simulations, and close with an example of a colon cancer trial that illustrates a linear models extension of our adaptive borrowing approach. Our proposed methods produce more precise estimates of the model parameters, in particular conferring statistical significance to the observed reduction in tumor size for the experimental regimen as compared to the control regimen. PMID:21361892
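A numeric sketch (our construction with invented numbers, not the authors' commensurate model) of the basic power prior mechanism the paper elaborates: for a Gaussian mean with known variance, historical data enter the posterior through their likelihood raised to a discounting weight a0 in [0, 1]; the commensurate approach would, in effect, let the degree of prior-data conflict drive a0.

```python
import numpy as np

def power_prior_posterior(y_cur, y_hist, sigma2, a0, m0=0.0, v0=1e6):
    """Posterior mean/variance of a Gaussian mean under a power prior.

    The historical likelihood is tempered by a0 (0 = discard history,
    1 = pool fully); the initial N(m0, v0) prior is essentially flat here."""
    prec = 1.0 / v0 + len(y_cur) / sigma2 + a0 * len(y_hist) / sigma2
    mean = (m0 / v0 + np.sum(y_cur) / sigma2 + a0 * np.sum(y_hist) / sigma2) / prec
    return mean, 1.0 / prec

rng = np.random.default_rng(1)
hist = rng.normal(0.0, 1.0, 50)  # historical data centered at 0
cur = rng.normal(0.8, 1.0, 20)   # current data in conflict with history
for a0 in (0.0, 0.5, 1.0):
    m, v = power_prior_posterior(cur, hist, sigma2=1.0, a0=a0)
    print(f"a0={a0:.1f}: posterior mean={m:.3f}, sd={np.sqrt(v):.3f}")
```

With conflicting data, larger a0 drags the posterior mean toward the historical value while shrinking the reported uncertainty, which is exactly the hazard the adaptive borrowing methods are designed to limit.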
NASA Astrophysics Data System (ADS)
White, Dave D.; Virden, Randy J.; van Riper, Carena J.
2008-10-01
It is generally accepted that recreation use in natural environments results in some degree of negative social and environmental impact. Environmental managers are tasked with mitigating the impact while providing beneficial recreation opportunities. Research on the factors that influence visitors’ perceptions of environmental and social conditions is necessary to inform sound environmental management of protected natural areas. This study examines the effect of prior experience with the setting and two dimensions of place attachment (i.e., place identity and place dependence) on visitors’ perceptions of three types of recreation impacts (i.e., depreciative behavior, environmental impacts, and recreation conflict). Principal components analysis, confirmatory factor analysis, and structural equation modeling were used to test the study hypotheses using data collected from 351 visitors through on-site questionnaires (response rate of 93 percent). The results show that prior experience exhibited a moderate and significant direct positive effect on place identity, place dependence, and visitors’ perceptions of recreation impacts. Contrary to study hypotheses and prior research, neither place dependence nor place identity exhibited a significant effect on the dependent variables. The results show that prior experience causes visitors to be more sensitive to depreciative behaviors, environmental impacts, and recreation conflict. These findings raise concerns over potential visitor displacement and deterioration of site conditions. Implications for resource managers are discussed, which include education, modifying visitor use patterns, and site design strategies.
Prior-Based Quantization Bin Matching for Cloud Storage of JPEG Images.
Liu, Xianming; Cheung, Gene; Lin, Chia-Wen; Zhao, Debin; Gao, Wen
2018-07-01
Millions of user-generated images are uploaded to social media sites like Facebook daily, which translates to a large storage cost. However, there exists an asymmetry in upload and download data: only a fraction of the uploaded images are subsequently retrieved for viewing. In this paper, we propose a cloud storage system that reduces the storage cost of all uploaded JPEG photos, at the expense of a controlled increase in computation, mainly during download of the requested image subset. Specifically, the system first selectively re-encodes code blocks of uploaded JPEG images using coarser quantization parameters for smaller storage sizes. Then, during download, the system exploits known signal priors (a sparsity prior and a graph-signal smoothness prior) for reverse mapping to recover the original fine quantization bin indices, with either a deterministic guarantee (lossless mode) or a statistical guarantee (near-lossless mode). For fast reverse mapping, we use small dictionaries and sparse graphs that are tailored for specific clusters of similar blocks, which are classified via a tree-structured vector quantizer. During image upload, cluster indices identifying the appropriate dictionaries and graphs for the re-quantized blocks are encoded as side information using a differential distributed source coding scheme to facilitate reverse mapping during image download. Experimental results show that our system can reap significant storage savings (up to 12.05%) at roughly the same image PSNR (within 0.18 dB).
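An illustrative sketch (assumed step sizes and helper names, not the authors' implementation) of the core bin-matching problem: on upload a fine quantization bin index is re-quantized to a coarse one, and on download reverse mapping must pick among the fine bins compatible with the stored coarse bin, which is where the signal priors do their work.

```python
# Fine and coarse scalar quantization of a single DCT coefficient
# (illustrative step sizes, uniform quantizers).
Q_FINE, Q_COARSE = 4, 10

def quantize(x, step):
    return round(x / step)  # bin index

def candidate_fine_bins(coarse_idx):
    """Fine bin indices whose reconstruction lands inside the coarse bin."""
    lo = (coarse_idx - 0.5) * Q_COARSE
    hi = (coarse_idx + 0.5) * Q_COARSE
    return [k for k in range(int(lo // Q_FINE) - 1, int(hi // Q_FINE) + 2)
            if lo <= k * Q_FINE < hi]

coef = 23.0
fine = quantize(coef, Q_FINE)               # index the original JPEG stored
coarse = quantize(fine * Q_FINE, Q_COARSE)  # re-encoded index, cheaper to store
print(f"fine bin {fine} -> coarse bin {coarse}; "
      f"candidates on download: {candidate_fine_bins(coarse)}")
# A prior selects among the candidates; if exactly one candidate exists the
# reverse mapping is deterministic (the lossless mode described above).
```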
Influence of social norms and palatability on amount consumed and food choice.
Pliner, Patricia; Mann, Nikki
2004-04-01
In two parallel studies, we examined the effect of social influence and palatability on amount consumed and on food choice. In Experiment 1, which looked at amount consumed, participants were provided with either palatable or unpalatable food; they were also given information about how much previous participants had eaten (large or small amounts) or were given no information. In the case of palatable food, participants ate more when led to believe that prior participants had eaten a great deal than when led to believe that prior participants had eaten small amounts or when provided with no information. This social-influence effect was not present when participants received unpalatable food. In Experiment 2, which looked at food choice, some participants learned that prior participants had chosen the palatable food, others learned that prior participants had chosen the unpalatable food, while still others received no information about prior participants' choices. The social-influence manipulation had no effect on participants' food choices; nearly all of them chose the palatable food. The results were discussed in the context of Crutchfield's (1955) distinction between judgments about matters of fact and judgments about preferences. The results were also used to illustrate the importance of palatability as a determinant of eating behavior.
What Are They Up To? The Role of Sensory Evidence and Prior Knowledge in Action Understanding
Chambon, Valérian; Domenech, Philippe; Pacherie, Elisabeth; Koechlin, Etienne; Baraduc, Pierre; Farrer, Chloé
2011-01-01
Explaining or predicting the behaviour of our conspecifics requires the ability to infer the intentions that motivate it. Such inferences are assumed to rely on two types of information: (1) the sensory information conveyed by movement kinematics and (2) the observer's prior expectations – acquired from past experience or derived from prior knowledge. However, the respective contribution of these two sources of information is still controversial. This controversy stems in part from the fact that “intention” is an umbrella term that may embrace various sub-types, each being assigned different scopes and targets. We hypothesized that variations in the scope and target of intentions may account for variations in the contribution of visual kinematics and prior knowledge to the intention inference process. To test this hypothesis, we conducted four behavioural experiments in which participants were instructed to identify different types of intention: basic intentions (i.e. the simple goal of a motor act), superordinate intentions (i.e. the general goal of a sequence of motor acts), or social intentions (i.e. intentions accomplished in a context of reciprocal interaction). For each of the above-mentioned intentions, we varied (1) the amount of visual information available from the action scene and (2) participants' prior expectations concerning the intention that was more likely to be accomplished. First, we showed that intentional judgments depend on a consistent interaction between visual information and participants' prior expectations. Moreover, we demonstrated that this interaction varied according to the type of intention to be inferred, with participants' priors rather than perceptual evidence exerting a greater effect on the inference of social and superordinate intentions. The results are discussed by appealing to the specific properties of each type of intention considered and are further interpreted in the light of a hierarchical model of action representation. PMID:21364992
NASA Astrophysics Data System (ADS)
Rougier, Jonty; Cashman, Kathy; Sparks, Stephen
2016-04-01
We have analysed the Large Magnitude Explosive Volcanic Eruptions database (LaMEVE) for volcanoes that classify as stratovolcanoes. A non-parametric statistical approach is used to assess the global recording rate for large (M4+) eruptions. The approach imposes minimal structure on the shape of the recording rate through time. We find that the recording rates decline rapidly going backwards in time: prior to 1600 they are below 50%, and prior to 1100 they are below 20%. Even in the recent past, e.g. the 1800s, they are likely to be appreciably less than 100%. The assessment for very large (M5+) eruptions is more uncertain, due to the scarcity of events. Having taken under-recording into account, the large-eruption rates of stratovolcanoes are modelled exchangeably, in order to derive an informative prior distribution as an input into a subsequent volcano-by-volcano hazard assessment. The statistical model implies that volcano-by-volcano predictions can be grouped by the number of recorded large eruptions. Further, it is possible to combine all volcanoes together into a global large-eruption prediction, with an M4+ rate of 0.57/yr computed from the LaMEVE database.
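A worked gloss (our reading of the under-recording correction, not necessarily the authors' exact estimator): if true eruptions follow a Poisson process with rate $\lambda$ and an eruption at time $t$ is recorded with probability $r(t)$, the recorded eruptions form a thinned Poisson process with intensity $r(t)\lambda$, so

```latex
\[
  \hat{\lambda} \;=\; \frac{N_{\mathrm{obs}}}{\int_{t_0}^{t_1} r(t)\,\mathrm{d}t},
\]
```

which inflates the naive estimate $N_{\mathrm{obs}}/(t_1 - t_0)$ whenever $r(t) < 1$, e.g. by more than a factor of two over periods where the recording rate is below 50%.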
NASA Astrophysics Data System (ADS)
Mattern, Nancy Page Garland
Four causal models describing the relationships between attitudes and achievement have been proposed in the literature. The cross-effects, or reciprocal effects, model highlights the effects of prior attitudes on later achievement (over and above the effect of previous achievement) and of prior achievement on later attitudes (above the effect of previous attitudes). In the achievement predominant model, the effect of prior achievement on later attitudes is emphasized, controlling for the effect of previous attitudes. The effect of prior attitudes on later achievement, controlling for the effect of previous achievement, is emphasized in the attitudes predominant model. In the no cross-effects model there are no significant cross paths from prior attitudes to later achievement or from prior achievement to later attitudes. To determine the best-fitting model for rural seventh- and eighth-grade girls and boys in science, the causal relationships over time between attitudes toward science and achievement in science were examined by gender using structural equation modeling. Data were collected in two waves over one school year. A baseline measurement model was estimated in simultaneous two-group solutions and was a good fit to the data. Next, the four structural models were estimated and model fits compared. The three models nested within the structural cross-effects model showed significant decay of fit when compared to the fit of the cross-effects model. The cross-effects model was the best fit overall for middle school girls and boys. The cross-effects model was then tested for invariance across gender. There was significant decay of fit when model form, factor path loadings, and structural paths were constrained to be equal for girls and boys. Two structural paths, the path from prior achievement to later attitudes and the path from prior attitudes to later attitudes, were the sources of gender non-invariance. Separate models were estimated for girls and boys, and the fits of nested models were compared. The no cross-effects model was the best-fitting model for rural middle school girls. The new no attitudes-path model was the best-fitting model for boys. Implications of these findings for teaching middle school students were discussed.
SPHERE: SPherical Harmonic Elastic REgistration of HARDI Data
Yap, Pew-Thian; Chen, Yasheng; An, Hongyu; Yang, Yang; Gilmore, John H.; Lin, Weili
2010-01-01
In contrast to the more common Diffusion Tensor Imaging (DTI), High Angular Resolution Diffusion Imaging (HARDI) allows superior delineation of angular microstructures of brain white matter, and makes possible multiple-fiber modeling of each voxel for better characterization of brain connectivity. However, the complex orientation information afforded by HARDI makes registration of HARDI images more complicated than that of scalar images. In particular, the question of how much orientation information is needed for satisfactory alignment has not been sufficiently addressed. Low-order orientation representation is generally more robust than high-order representation, although the latter provides more information for correct alignment of fiber pathways. However, high-order representation, when naïvely utilized, might not necessarily improve registration accuracy, since similar structures with significant orientation differences prior to proper alignment might be mistakenly taken as non-matching structures. We present in this paper a HARDI registration algorithm, called SPherical Harmonic Elastic REgistration (SPHERE), which in a principled manner hierarchically extracts orientation information from HARDI data for structural alignment. The image volumes are first registered using robust, relatively direction-invariant features derived from the Orientation Distribution Function (ODF), and the alignment is then further refined using a spherical harmonic (SH) representation with gradually increasing orders. This progression from non-directional, through single-directional, to multi-directional representation provides a systematic means of extracting the directional information given by diffusion-weighted imaging. Coupled with a template-subject-consistent soft-correspondence-matching scheme, this approach allows robust and accurate alignment of HARDI data. Experimental results show a marked increase in accuracy over a state-of-the-art DTI registration algorithm. PMID:21147231
Side information in coded aperture compressive spectral imaging
NASA Astrophysics Data System (ADS)
Galvis, Laura; Arguello, Henry; Lau, Daniel; Arce, Gonzalo R.
2017-02-01
Coded aperture compressive spectral imagers sense a three-dimensional cube by using two-dimensional projections of the coded and spectrally dispersed source. These imaging systems often rely on focal plane array (FPA) detectors, spatial light modulators (SLMs), digital micromirror devices (DMDs), and dispersive elements. The use of DMDs to implement the coded apertures facilitates the capture of multiple projections, each admitting a different coded aperture pattern. The DMD makes it possible not only to collect a sufficient number of measurements for spectrally rich or spatially detailed scenes, but also to design the spatial structure of the coded apertures to maximize the information content of the compressive measurements. Although sparsity is the only signal characteristic usually assumed for reconstruction in compressive sensing, other forms of prior information, such as side information, have been included as a way to improve the quality of the reconstructions. This paper presents the coded aperture design in a compressive spectral imager with side information in the form of RGB images of the scene. The use of RGB images as side information in the compressive sensing architecture has two main advantages: the RGB image is used not only to improve the reconstruction quality but also to optimally design the coded apertures for the sensing process. The coded aperture design is based on the RGB scene, and thus the coded aperture structure exploits key features such as scene edges. Real reconstructions of noisy compressed measurements demonstrate the benefit of the designed coded apertures, in addition to the improvement in reconstruction quality obtained by the use of side information.
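A generic sparse-recovery sketch (not the paper's imager: the coded-aperture projection is stood in for by a random matrix, and the RGB side information is omitted) showing the sparsity-only baseline; side information would enter as an additional prior term in the same objective.

```python
import numpy as np

def ista(H, y, lam=0.01, n_iter=300):
    """Iterative soft-thresholding for min_x 0.5*||y - Hx||^2 + lam*||x||_1."""
    L = np.linalg.norm(H, 2) ** 2       # Lipschitz constant of the gradient
    x = np.zeros(H.shape[1])
    for _ in range(n_iter):
        g = x - (H.T @ (H @ x - y)) / L                         # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)   # soft threshold
    return x

rng = np.random.default_rng(0)
n, m, k = 64, 32, 4                     # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)
H = rng.normal(0, 1 / np.sqrt(m), (m, n))  # stand-in for the coded projections
x_hat = ista(H, H @ x_true)
print("true support:", sorted(np.flatnonzero(x_true)))
print("largest recovered entries:", sorted(np.argsort(-np.abs(x_hat))[:k]))
```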
Jiang, Yu; Simon, Steve; Mayo, Matthew S; Gajewski, Byron J
2015-02-20
Slow recruitment in clinical trials leads to increased costs and resource utilization, which includes both the clinic staff and patient volunteers. Careful planning and monitoring of the accrual process can prevent the unnecessary loss of these resources. We propose two hierarchical extensions to the existing Bayesian constant accrual model: the accelerated prior and the hedging prior. The new proposed priors are able to adaptively utilize the researcher's previous experience and current accrual data to produce the estimation of trial completion time. The performance of these models, including prediction precision, coverage probability, and correct decision-making ability, is evaluated using actual studies from our cancer center and simulation. The results showed that a constant accrual model with strongly informative priors is very accurate when accrual is on target or slightly off, producing smaller mean squared error, a high percentage of coverage, and a high number of correct decisions as to whether or not to continue the trial, but it is strongly biased when off target. Flat or weakly informative priors provide protection against an off-target prior but are less efficient when the accrual is on target. The accelerated prior performs similarly to a strong prior. The hedging prior performs much like the weak priors when the accrual is extremely off target but closer to the strong priors when the accrual is on target or only slightly off target. We suggest improvements in these models and propose new models for future research. Copyright © 2014 John Wiley & Sons, Ltd.
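A toy sketch (our own, with invented numbers; the paper's hierarchical priors extend this) of the constant accrual model being built on: with exponential waiting times between enrollments and a conjugate gamma prior on the accrual rate, the predicted time to complete recruitment follows from the posterior rate, and the strength of the prior controls how much the prediction trusts the protocol's planned rate.

```python
import numpy as np

rng = np.random.default_rng(2)
target_n, enrolled = 200, 60
waits = rng.exponential(1 / 0.8, enrolled)  # observed gaps; true rate 0.8/day

# Gamma(a, b) prior on the accrual rate; mean a/b, larger (a, b) = stronger prior.
priors = {"weak prior": (0.1, 0.1), "strong prior (1/day planned)": (50.0, 50.0)}
for label, (a, b) in priors.items():
    a_post, b_post = a + enrolled, b + waits.sum()  # conjugate update
    rate = a_post / b_post                          # posterior mean rate per day
    remaining = (target_n - enrolled) / rate        # plug-in completion estimate
    print(f"{label}: posterior rate={rate:.2f}/day, est. days remaining={remaining:.0f}")
```

When the planned rate encoded in the strong prior is close to the truth it sharpens the prediction; when accrual is badly off target it biases it, which is the trade-off the accelerated and hedging priors are designed to manage.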
Making Connections in Math: Activating a Prior Knowledge Analogue Matters for Learning
ERIC Educational Resources Information Center
Sidney, Pooja G.; Alibali, Martha W.
2015-01-01
This study investigated analogical transfer of conceptual structure from a prior-knowledge domain to support learning in a new domain of mathematics: division by fractions. Before a procedural lesson on division by fractions, fifth and sixth graders practiced with a surface analogue (other operations on fractions) or a structural analogue (whole…
ERIC Educational Resources Information Center
Song, H. S.; Kalet, A. L.; Plass, J. L.
2016-01-01
This study examined the direct and indirect effects of medical clerkship students' prior knowledge, self-regulation and motivation on learning performance in complex multimedia learning environments. The data from 386 medical clerkship students from six medical schools were analysed using structural equation modeling. The structural model revealed…
18 CFR 415.51 - Prior non-conforming structures.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 2 2010-04-01 2010-04-01 false Prior non-conforming structures. 415.51 Section 415.51 Conservation of Power and Water Resources DELAWARE RIVER BASIN COMMISSION... damaged by any means, including a flood, to the extent of 50 percent or more of its market value at that...
ERIC Educational Resources Information Center
Novick, Laura R.; Catley, Kefyn M.
2014-01-01
Science is an important domain for investigating students' responses to information that contradicts their prior knowledge. In previous studies of this topic, this information was communicated verbally. The present research used diagrams, specifically trees (cladograms) depicting evolutionary relationships among taxa. Effects of college…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-01
... Food Under the Public Health Security and Bioterrorism Preparedness and Response Act of 2002 AGENCY... appropriate, and other forms of information technology. Prior Notice of Imported Food Under the Public Health... 0910-0520)--Revision The Public Health Security and Bioterrorism Preparedness and Response Act of 2002...
Ten-Month-Old Infants Use Prior Information to Identify an Actor's Goal
ERIC Educational Resources Information Center
Sommerville, Jessica A.; Crane, Catharyn C.
2009-01-01
For adults, prior information about an individual's likely goals, preferences or dispositions plays a powerful role in interpreting ambiguous behavior and predicting and interpreting behavior in novel contexts. Across two studies, we investigated whether 10-month-old infants' ability to identify the goal of an ambiguous action sequence was…
78 FR 32359 - Information Required in Prior Notice of Imported Food
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-30
... or animal food based on food safety reasons, such as intentional or unintentional contamination of an... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Part 1 [Docket No. FDA-2011-N-0179] RIN 0910-AG65 Information Required in Prior Notice of Imported Food AGENCY: Food and Drug...
40 CFR 60.2953 - What information must I submit prior to initial startup?
Code of Federal Regulations, 2011 CFR
2011-07-01
... initial startup? 60.2953 Section 60.2953 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning...
40 CFR 60.2195 - What information must I submit prior to initial startup?
Code of Federal Regulations, 2011 CFR
2011-07-01
... initial startup? 60.2195 Section 60.2195 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY..., 2001 Recordkeeping and Reporting § 60.2195 What information must I submit prior to initial startup? You... startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning capacity. (c) The...
40 CFR 60.2953 - What information must I submit prior to initial startup?
Code of Federal Regulations, 2012 CFR
2012-07-01
... initial startup? 60.2953 Section 60.2953 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning...
40 CFR 60.2953 - What information must I submit prior to initial startup?
Code of Federal Regulations, 2010 CFR
2010-07-01
... initial startup? 60.2953 Section 60.2953 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning...
40 CFR 60.2195 - What information must I submit prior to initial startup?
Code of Federal Regulations, 2012 CFR
2012-07-01
... initial startup? 60.2195 Section 60.2195 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY..., 2001 Recordkeeping and Reporting § 60.2195 What information must I submit prior to initial startup? You... startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning capacity. (c) The...
40 CFR 60.2195 - What information must I submit prior to initial startup?
Code of Federal Regulations, 2010 CFR
2010-07-01
... initial startup? 60.2195 Section 60.2195 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY..., 2001 Recordkeeping and Reporting § 60.2195 What information must I submit prior to initial startup? You... startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning capacity. (c) The...
Okada, Toshiyuki; Linguraru, Marius George; Hori, Masatoshi; Summers, Ronald M; Tomiyama, Noriyuki; Sato, Yoshinobu
2015-12-01
This paper addresses the automated segmentation of multiple organs in upper abdominal computed tomography (CT) data. The aim of our study is to develop methods to effectively construct the conditional priors and use their prediction power for more accurate segmentation as well as easy adaptation to various imaging conditions in CT images, as observed in clinical practice. We propose a general framework of multi-organ segmentation which effectively incorporates interrelations among multiple organs and easily adapts to various imaging conditions without the need for supervised intensity information. The features of the framework are as follows: (1) A method for modeling conditional shape and location (shape-location) priors, which we call prediction-based priors, is developed to derive accurate priors specific to each subject, which enables the estimation of intensity priors without the need for supervised intensity information. (2) Organ correlation graph is introduced, which defines how the conditional priors are constructed and segmentation processes of multiple organs are executed. In our framework, predictor organs, whose segmentation is sufficiently accurate by using conventional single-organ segmentation methods, are pre-segmented, and the remaining organs are hierarchically segmented using conditional shape-location priors. The proposed framework was evaluated through the segmentation of eight abdominal organs (liver, spleen, left and right kidneys, pancreas, gallbladder, aorta, and inferior vena cava) from 134 CT data from 86 patients obtained under six imaging conditions at two hospitals. The experimental results show the effectiveness of the proposed prediction-based priors and the applicability to various imaging conditions without the need for supervised intensity information. Average Dice coefficients for the liver, spleen, and kidneys were more than 92%, and were around 73% and 67% for the pancreas and gallbladder, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.
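A schematic sketch of the organ correlation graph idea (the edges below are invented for illustration, not the paper's graph): predictor organs are segmented first with single-organ methods, and each remaining organ is then segmented conditioned on its already-segmented predecessors.

```python
from graphlib import TopologicalSorter

# Keys are organs; values are the predecessor organs whose segmentations the
# conditional shape-location priors are built from (illustrative edges only).
correlation_graph = {
    "liver": set(), "spleen": set(),  # predictor organs, pre-segmented
    "right_kidney": {"liver"},
    "left_kidney": {"spleen"},
    "gallbladder": {"liver"},
    "pancreas": {"liver", "spleen", "left_kidney"},
}

# Topological order guarantees every organ is processed after its predictors.
for organ in TopologicalSorter(correlation_graph).static_order():
    preds = sorted(correlation_graph[organ])
    print(f"segment {organ}" + (f" | conditioned on {preds}" if preds else " (predictor)"))
```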
WISDOM, a polarimetric GPR for the shallow subsurface characterization
NASA Astrophysics Data System (ADS)
Ciarletti, V.; Plettemeier, D.; Hassen-Kodja, R.; Clifford, S. M.; Wisdom Team
2011-12-01
WISDOM (Water Ice and Subsurface Deposit Observations on Mars) is a polarimetric Ground Penetrating Radar (GPR) that has been selected to be part of the Pasteur payload onboard the rover of the 2018 ExoMars mission. It will perform large-scale scientific investigations of the subsurface of the landing site and provide precise information about the subsurface structure prior to drilling. WISDOM has been designed to provide accurate information on the subsurface structure down to a depth in excess of 2 meters (commensurate with the drill's capabilities) with a vertical resolution of several centimetres. It will give access to the geological structure, electromagnetic nature and, possibly, the hydrological state of the shallow subsurface by retrieving the layering and the properties of layers and buried reflectors. The data will also be used to determine the most promising locations for collecting underground samples with the drilling system mounted on board the rover. Polarimetric measurements have recently been acquired on perfectly known targets as well as in natural environments. They demonstrate the ability to provide a better understanding of subsurface structure and to significantly reduce the ambiguity in locating off-nadir reflectors relative to the rover path. This work describes the instrument and its operating modes, with particular emphasis on its polarimetric capabilities.
Jacquet, Pierre O.; Roy, Alice C.; Chambon, Valérian; Borghi, Anna M.; Salemme, Roméo; Farnè, Alessandro; Reilly, Karen T.
2016-01-01
Predicting intentions from observing another agent’s behaviours is often thought to depend on motor resonance – i.e., the motor system’s response to a perceived movement by the activation of its stored motor counterpart, but observers might also rely on prior expectations, especially when actions take place in perceptually uncertain situations. Here we assessed motor resonance during an action prediction task using transcranial magnetic stimulation to probe corticospinal excitability (CSE) and report that experimentally-induced updates in observers’ prior expectations modulate CSE when predictions are made under situations of perceptual uncertainty. We show that prior expectations are updated on the basis of both biomechanical and probabilistic prior information and that the magnitude of the CSE modulation observed across participants is explained by the magnitude of change in their prior expectations. These findings provide the first evidence that when observers predict others’ intentions, motor resonance mechanisms adapt to changes in their prior expectations. We propose that this adaptive adjustment might reflect a regulatory control mechanism that shares some similarities with that observed during action selection. Such a mechanism could help arbitrate the competition between biomechanical and probabilistic prior information when appropriate for prediction. PMID:27243157
Comparing hard and soft prior bounds in geophysical inverse problems
NASA Technical Reports Server (NTRS)
Backus, George E.
1988-01-01
In linear inversion of a finite-dimensional data vector y to estimate a finite-dimensional prediction vector z, prior information about x_E is essential if y is to supply useful limits for z. The one exception occurs when all the prediction functionals are linear combinations of the data functionals. Two forms of prior information are compared: a soft bound on x_E is a probability distribution p_x on X which describes the observer's opinion about where x_E is likely to be in X; a hard bound on x_E is an inequality Q_x(x_E, x_E) <= 1, where Q_x is a positive definite quadratic form on X. A hard bound Q_x can be softened to many different probability distributions p_x, but all these p_x's carry much new information about x_E which is absent from Q_x, and some information which contradicts Q_x. Both stochastic inversion (SI) and Bayesian inference (BI) estimate z from y and a soft prior bound p_x. If that probability distribution was obtained by softening a hard prior bound Q_x, rather than by objective statistical inference independent of y, then p_x contains so much unsupported new information absent from Q_x that conclusions about z obtained with SI or BI would seem to be suspect.
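A worked gloss, in our own notation, of why softening injects information: the usual way to soften the hard bound Q_x(x_E, x_E) <= 1 is a Gaussian matched to the quadratic form, but that choice asserts far more than the inequality does.

```latex
% Hard bound: x_E lies in the ellipsoid E = { x : Q_x(x, x) \le 1 }.
% A typical softening takes
\[
  p_x(x) \;\propto\; \exp\!\left(-\tfrac{c}{2}\, Q_x(x, x)\right),
\]
% i.e. a zero-mean Gaussian with covariance c^{-1} Q_x^{-1}. In n dimensions
% Q_x(x, x) is then distributed as \chi^2_n / c, so p_x concentrates its mass
% near the shell Q_x(x, x) \approx n / c and fixes a definite correlation
% structure for x_E: "new information" absent from, and partly contradicting,
% the bare inequality Q_x(x, x) \le 1.
```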
NASA Astrophysics Data System (ADS)
Korteniemi, J.; Kukkonen, S.
2018-04-01
Outflow channel formation in the eastern Hellas rim region is traditionally thought to have been triggered by activity phases of the nearby volcanoes Hadriacus and Tyrrhenus Montes: as a result of volcanic heating, subsurface volatiles were mobilized. It is, however, under debate whether eastern Hellas volcanism was in fact more extensive, and whether there were volcanic centers separate from the identified central volcanoes. This work describes previously unrecognized structures in the Niger-Dao Valles outflow channel complex. We interpret them as volcanic edifices: cones, a shield, and a caldera. The structures provide evidence of an additional volcanic center within the valles and indicate volcanic activity both prior to and following the outflow events. They expand the extent, type, and duration of volcanic activity in the Circum-Hellas Volcanic Province and provide new information on the interaction between volcanism and fluvial activity.
Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ostrouchov, George; Doll, William E.; Beard, Les P.
2009-01-01
Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process and the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods needed to address the estimation problems are known, but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.
The Influence of Topic Status on Written and Spoken Sentence Production
Cowles, H. Wind; Ferreira, Victor S.
2012-01-01
Four experiments investigate the influence of topic status and givenness on how speakers and writers structure sentences. The results of these experiments show that when a referent is previously given, it is more likely to be produced early in both sentences and word lists, confirming prior work showing that givenness increases the accessibility of given referents. When a referent is previously given and assigned topic status, it is even more likely to be produced early in a sentence, but not in a word list. Thus, there appears to be an early mention advantage for topics that is present in both written and spoken modalities, but is specific to sentence production. These results suggest that information-structure constructs like topic exert an influence that is not based only on increased accessibility, but also reflects mapping to syntactic structure during sentence production. PMID:22408281
Ubiquitousness of link-density and link-pattern communities in real-world networks
NASA Astrophysics Data System (ADS)
Šubelj, L.; Bajec, M.
2012-01-01
Community structure appears to be an intrinsic property of many complex real-world networks. However, recent work shows that real-world networks reveal even more sophisticated modules than classical cohesive (link-density) communities. In particular, networks can also be naturally partitioned according to similar patterns of connectedness among the nodes, revealing link-pattern communities. We here propose a propagation-based algorithm that can extract both link-density and link-pattern communities, without any prior knowledge of the true structure. The algorithm was first validated on different classes of synthetic benchmark networks with community structure, and also on random networks. We have further applied the algorithm to different social, information, technological and biological networks, where it indeed reveals meaningful (composites of) link-density and link-pattern communities. The results thus seem to imply that, like their link-density counterparts, link-pattern communities appear ubiquitous in nature and design.
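A minimal label-propagation sketch (the classical link-density variant only; the paper's algorithm generalizes propagation to capture link-pattern communities as well) on a toy graph: every node repeatedly adopts the majority label among its neighbors until the labeling stabilizes.

```python
import random
from collections import Counter

def label_propagation(adj, seed=3, max_sweeps=100):
    """Plain label propagation: each node takes its neighbors' majority label."""
    rng = random.Random(seed)
    labels = {v: v for v in adj}  # start with one unique label per node
    nodes = list(adj)
    for _ in range(max_sweeps):
        rng.shuffle(nodes)
        changed = False
        for v in nodes:
            counts = Counter(labels[u] for u in adj[v])
            best = max(counts.values())
            choice = rng.choice([l for l, c in counts.items() if c == best])
            if choice != labels[v]:
                labels[v], changed = choice, True
        if not changed:
            break
    return labels

# Two 4-cliques joined by a single bridge edge (3-4).
adj = {0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {0, 1, 2, 4},
       4: {3, 5, 6, 7}, 5: {4, 6, 7}, 6: {4, 5, 7}, 7: {4, 5, 6}}
print(label_propagation(adj))  # typically two surviving labels, one per clique
```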
Low dose tomographic fluoroscopy: 4D intervention guidance with running prior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, Barbara; Kuntz, Jan; Brehm, Marcus
Purpose: Today's standard imaging technique in interventional radiology is single- or biplane x-ray fluoroscopy, which delivers 2D projection images as a function of time (2D+T). This state-of-the-art technology, however, suffers from its projective nature and is limited by the superposition of the patient's anatomy. Temporally resolved tomographic volumes (3D+T) would significantly improve the visualization of complex structures. A continuous tomographic data acquisition, if carried out with today's technology, would yield an excessive patient dose. Recently the authors proposed a method that enables tomographic fluoroscopy at the same dose level as projective fluoroscopy, meaning that if the scanning time of an intervention guided by projective fluoroscopy is the same as that of an intervention guided by tomographic fluoroscopy, almost the same dose is administered to the patient. The purpose of this work is to extend the authors' previous work and allow for patient motion during the intervention. Methods: The authors propose the running prior technique for adaptation of a prior image. This adaptation is realized by a combination of registration and projection replacement. In a first step the prior is deformed to the current position via affine and deformable registration. Then the information from outdated projections is replaced by newly acquired projections using forward- and backprojection steps. The volume adapted in this way is the running prior. The proposed method is validated on simulated as well as measured data. To investigate motion during an intervention, a moving head phantom was simulated. Real in vivo data of a pig were acquired with a prototype CT system consisting of a flat detector and a continuously rotating clinical gantry. Results: With the running prior technique it is possible to correct for motion without additional dose. For an application in intervention guidance both steps of the running prior technique, registration and replacement, are necessary. Reconstructed volumes based on the running prior show high image quality without introducing new artifacts, and the interventional materials are displayed at the correct position. Conclusions: The running prior improves the robustness of low dose 3D+T intervention guidance toward intended or unintended patient motion.
Explanation and Prior Knowledge Interact to Guide Learning
ERIC Educational Resources Information Center
Williams, Joseph J.; Lombrozo, Tania
2013-01-01
How do explaining and prior knowledge contribute to learning? Four experiments explored the relationship between explanation and prior knowledge in category learning. The experiments independently manipulated whether participants were prompted to explain the category membership of study observations and whether category labels were informative in…
Blind test of physics-based prediction of protein structures.
Shell, M Scott; Ozkan, S Banu; Voelz, Vincent; Wu, Guohong Albert; Dill, Ken A
2009-02-01
We report here a multiprotein blind test of a computer method to predict native protein structures based solely on an all-atom physics-based force field. We use the AMBER 96 potential function with an implicit (GB/SA) model of solvation, combined with replica-exchange molecular-dynamics simulations. Coarse conformational sampling is performed using the zipping and assembly method (ZAM), an approach that is designed to mimic the putative physical routes of protein folding. ZAM was applied to the folding of six proteins, from 76 to 112 monomers in length, in CASP7, a community-wide blind test of protein structure prediction. Because these predictions have about the same level of accuracy as typical bioinformatics methods, and do not utilize information from databases of known native structures, this work opens up the possibility of predicting the structures of membrane proteins, synthetic peptides, or other foldable polymers, for which there is little prior knowledge of native structures. This approach may also be useful for predicting physical protein folding routes, non-native conformations, and other physical properties from amino acid sequences.
Bayesian Multiscale Modeling of Closed Curves in Point Clouds
Gu, Kelvin; Pati, Debdeep; Dunson, David B.
2014-01-01
Modeling object boundaries based on image or point cloud data is frequently necessary in medical and scientific applications ranging from detecting tumor contours for targeted radiation therapy, to the classification of organisms based on their structural information. In low-contrast images or sparse and noisy point clouds, there is often insufficient data to recover local segments of the boundary in isolation. Thus, it becomes critical to model the entire boundary in the form of a closed curve. To achieve this, we develop a Bayesian hierarchical model that expresses highly diverse 2D objects in the form of closed curves. The model is based on a novel multiscale deformation process. By relating multiple objects through a hierarchical formulation, we can successfully recover missing boundaries by borrowing structural information from similar objects at the appropriate scale. Furthermore, the model’s latent parameters help interpret the population, indicating dimensions of significant structural variability and also specifying a ‘central curve’ that summarizes the collection. Theoretical properties of our prior are studied in specific cases and efficient Markov chain Monte Carlo methods are developed, evaluated through simulation examples and applied to panorex teeth images for modeling teeth contours and also to a brain tumor contour detection problem. PMID:25544786
Microscopic Characterization of the Brazilian Giant Samba Virus.
Schrad, Jason R; Young, Eric J; Abrahão, Jônatas S; Cortines, Juliana R; Parent, Kristin N
2017-02-14
Prior to the discovery of the mimivirus in 2003, viruses were thought to be physically small and genetically simple. Mimivirus, with its ~750-nm particle size and its ~1.2-Mbp genome, shattered these notions and changed what it meant to be a virus. Since this discovery, the isolation and characterization of giant viruses has exploded. One of the more recently discovered giant viruses, Samba virus, is a Mimivirus that was isolated from the Rio Negro in the Brazilian Amazon. Initial characterization of Samba has revealed some structural information, although the preparation techniques used are prone to the generation of structural artifacts. To generate more native-like structural information for Samba, we analyzed the virus through cryo-electron microscopy, cryo-electron tomography, scanning electron microscopy, and fluorescence microscopy. These microscopy techniques demonstrated that Samba particles have a capsid diameter of ~527 nm and a fiber length of ~155 nm, making Samba the largest Mimivirus yet characterized. We also compared Samba to a fiberless mimivirus variant. Samba particles, unlike those of mimivirus, do not appear to be rigid and quasi-icosahedral, although the two viruses share many common features, including a multi-layered capsid and an asymmetric nucleocapsid, which may be common amongst the Mimiviruses.
Tian, Xinyu; Wang, Xuefeng; Chen, Jun
2014-01-01
The classic multinomial logit model, commonly used in multiclass regression problems, is restricted to few predictors and does not take into account relationships among variables. It has limited use for genomic data, where the number of genomic features far exceeds the sample size. Genomic features such as gene expressions are usually related by an underlying biological network. Efficient use of the network information is important to improve classification performance as well as biological interpretability. We propose a multinomial logit model that is capable of addressing both the high dimensionality of the predictors and the underlying network information. Group lasso is used to induce model sparsity, and a network constraint is imposed to induce smoothness of the coefficients with respect to the underlying network structure. To deal with the non-smoothness of the objective function in optimization, we developed a proximal gradient algorithm for efficient computation. The proposed model was compared to models with no prior structure information in both simulations and a problem of cancer subtype prediction with real TCGA (The Cancer Genome Atlas) gene expression data. The network-constrained model outperformed the traditional ones in both cases.
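A condensed sketch of the proximal gradient recipe the abstract describes, written for binary logistic loss rather than the full multinomial model and with a made-up toy network (all names and numbers here are illustrative): a gradient step on the smooth part (log-likelihood plus the network-smoothness quadratic), followed by the group soft-thresholding proximal operator of the group lasso.

```python
import numpy as np

def prox_group_lasso(beta, groups, thresh):
    """Group soft-thresholding: shrink each group's Euclidean norm by thresh."""
    out = beta.copy()
    for g in groups:
        nrm = np.linalg.norm(beta[g])
        out[g] = 0.0 if nrm <= thresh else beta[g] * (1 - thresh / nrm)
    return out

def fit(X, y, L_graph, groups, lam1=0.1, lam2=0.1, step=0.1, n_iter=500):
    """Minimize logistic loss + lam2 * beta' L beta + lam1 * sum_g ||beta_g||."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (p - y) / len(y) + 2.0 * lam2 * (L_graph @ beta)  # smooth part
        beta = prox_group_lasso(beta - step * grad, groups, step * lam1)
    return beta

# Toy data: 6 features in 3 groups; features 0 and 1 are linked in the network.
rng = np.random.default_rng(4)
X = rng.normal(size=(200, 6))
y = (X[:, 0] + X[:, 1] + rng.normal(0, 0.5, 200) > 0).astype(float)
L_graph = np.zeros((6, 6))
L_graph[[0, 1], [0, 1]] = 1.0                     # Laplacian of the single
L_graph[0, 1] = L_graph[1, 0] = -1.0              # network edge 0-1
print(fit(X, y, L_graph, groups=[[0, 1], [2, 3], [4, 5]]).round(2))
```

The Laplacian term pulls the coefficients of networked features toward each other, while the group lasso zeroes out irrelevant groups wholesale.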
Management and organization reforms at the Muhimbili National Hospital: challenges and prospects.
Mwangu, M A; Mbembati, N A A; Muhondwa, E P Y; Leshabari, M T
2008-08-01
To establish the state of the organization structures and management situation existing at the Muhimbili National Hospital (MNH) and Muhimbili University College of Health Sciences (MUCHS) prior to the start of the MNH reforms and physical infrastructure rehabilitation. A checklist of key information items was used to obtain facts and figures about the organization and management situation of the MNH. Interviews with MNH and MUCHS leaders, together with a review of existing hospital data, were conducted to gather the necessary information. The survey reveals a number of organizational, managerial and human resource deficiencies that are impinging on the smooth running of the hospital as a national referral entity. The survey also revealed a complex relationship between the hospital and the college (MUCHS) that has a bearing on the functioning of both entities. In order for the hospital to function effectively as a referral hospital with an inbuilt training component, four basic things, among others, need to be put in place: a sound organization structure; adequate staffing levels, especially of the specialist cadre; a functional information system, especially for inpatient services; and a good working relationship with the college.
Basics for sensorimotor information processing: some implications for learning
Vidal, Franck; Meckler, Cédric; Hasbroucq, Thierry
2015-01-01
In sensorimotor activities, learning requires efficient information processing, whether in car driving, sport activities or human–machine interactions. Several factors may affect the efficiency of such processing: they may be extrinsic (i.e., task-related) or intrinsic (i.e., subject-related). The effects of these factors are intimately related to the structure of human information processing. In the present article we focus on some of them that are poorly taken into account, even when minimizing errors or their consequences is an essential issue at stake. Among the extrinsic factors, we discuss, first, the effects of the quantity and quality of information, secondly, the effects of instruction and, thirdly, motor program learning. Among the intrinsic factors, we discuss first the influence of prior information, secondly how individual strategies affect performance and, thirdly, we stress the fact that although the human brain is not structured to function error-free (which is not new), humans are able to detect their errors very quickly and, in most cases, fast enough to correct them before they result in an overt failure. Extrinsic and intrinsic factors are important to take into account for learning because (1) they strongly affect performance, either in terms of speed or accuracy, which facilitates or impairs learning, (2) the effect of certain extrinsic factors may be strongly modified by learning and (3) certain intrinsic factors might be exploited for learning strategies. PMID:25762944
Primary care contact prior to suicide in individuals with mental illness
Pearson, Anna; Saini, Pooja; Da Cruz, Damian; Miles, Caroline; While, David; Swinson, Nicola; Williams, Alyson; Shaw, Jenny; Appleby, Louis; Kapur, Navneet
2009-01-01
Background Previous studies have reported differing rates of consultation with GPs prior to suicide. Patients with a psychiatric history have higher rates of consultation and consult closer to the time of their death. Aim To investigate the frequency and nature of general practice consultations in the year before suicide for patients in current, or recent, contact with secondary mental health services. Design of study Retrospective case-note study and semi-structured interviews. Setting General practices in the northwest of England. Method General practice data were obtained by a retrospective review of medical records (n = 247) and semi-structured interviews with GPs (n = 159). Results GP records were reviewed in 247 of the 286 cases (86%). Overall, 91% of individuals (n = 224) consulted their GP on at least one occasion in the year before death. The median number of consultations was 7 (interquartile range = 3–10). Interviews were carried out with GPs with regard to 159 patients. GPs reported concerns about their patient's safety in 43 (27%) cases, but only 16% of them thought that the suicide could have been prevented. Agreement between GPs and mental health teams regarding risk of suicide was poor. Both sets of clinicians rated moderate to high levels of risk in only 3% of cases for whom information was available (n = 139) (overall κ = 0.024). Conclusion Consultation prior to suicide is common but suicide prevention in primary care is challenging. Possible strategies might include examining the potential benefits of risk assessment and collaborative working between primary and secondary care. PMID:19861027
A qualitative study of experience of parents of adolescents who received ECT.
Grover, Sandeep; Varadharajan, Natarajan; Avasthi, Ajit
2017-12-01
To evaluate the experience of parents of adolescents who received ECT for severe mental illness. Using qualitative methods, 6 parents of 5 adolescents were interviewed with a self-designed semi-structured interview after the completion of the ECT course. The clinicians involved in the ECT procedure, i.e., in seeking informed consent and administering ECT, were not aware of the study. All the interviews were recorded, the content was analysed and themes were generated. Parents of all 5 adolescents expressed that their children were considered for ECT only after they had failed to respond to medication and had become unmanageable. Prior to ECT the treating doctors explained the ECT procedure to them, gave them an information booklet and did not coerce them into consenting to ECT. Some of the parents reported that they faced a dilemma before giving consent and were scared prior to the first ECT. However, as the clinical condition of their children improved, they felt that ECT was a good treatment. The majority of the parents felt that ECT had been delayed for their children. When asked about restrictions on the use of ECT in children and adolescents, the parents expressed that it is important for lawmakers to understand the distress of the parents when their children are acutely ill. They expressed that the decision to administer ECT must be left to the family and the treating clinicians. Parents of adolescents considered for ECT are generally satisfied with the treatment procedure. Copyright © 2017 Elsevier B.V. All rights reserved.
Barriers and facilitators to ED physician use of the test and treatment for BPPV
Forman, Jane; Damschroder, Laura; Telian, Steven A.; Fagerlin, Angela; Johnson, Patricia; Brown, Devin L.; An, Lawrence C.; Morgenstern, Lewis B.; Meurer, William J.
2017-01-01
Background: The test and treatment for benign paroxysmal positional vertigo (BPPV) are evidence-based practices supported by clinical guideline statements. Yet these practices are underutilized in the emergency department (ED) and interventions to promote their use are needed. To inform the development of an intervention, we interviewed ED physicians to explore barriers and facilitators to the current use of the Dix-Hallpike test (DHT) and the canalith repositioning maneuver (CRM). Methods: We conducted semi-structured in-person interviews with ED physicians who were recruited at annual ED society meetings in the United States. We analyzed data thematically using qualitative content analysis methods. Results: Based on 50 interviews with ED physicians, the barriers that emerged as contributing to infrequent use of the DHT/CRM were (1) prior negative experiences with the maneuvers or forgetting how to perform them and (2) reliance on the history of present illness to identify BPPV, or using the DHT but misattributing patterns of nystagmus. Based on participants' responses, the principal facilitator of DHT/CRM use was prior positive experience with the maneuvers, even if infrequent. When asked which clinical supports would facilitate more frequent use of the DHT/CRM, participants agreed that supports needed to be brief, readily accessible, and easy to use, and to include well-annotated video examples. Conclusions: Interventions to promote the use of the DHT/CRM in the ED need to overcome prior negative experiences with the DHT/CRM, overreliance on the history of present illness, and the underuse and misattribution of patterns of nystagmus. Future resources need to be sensitive to provider preferences for succinct information and video examples. PMID:28680765
Inverse Problems in Complex Models and Applications to Earth Sciences
NASA Astrophysics Data System (ADS)
Bosch, M. E.
2015-12-01
The inference of the subsurface earth structure and properties requires the integration of different types of data, information and knowledge, by combined processes of analysis and synthesis. To support the process of integrating information, the regular concept of data inversion is evolving to expand its application to models with multiple inner components (properties, scales, structural parameters) that explain multiple data (geophysical survey data, well-logs, core data). The probabilistic inference methods provide the natural framework for the formulation of these problems, considering a posterior probability density function (PDF) that combines the information from a prior information PDF and the new sets of observations. To formulate the posterior PDF in the context of multiple datasets, the data likelihood functions are factorized, assuming independence of uncertainties for data originating across different surveys. A realistic description of the earth medium requires modeling several properties and structural parameters, which relate to each other according to notions of dependence and independence. Thus, conditional probabilities across model components also factorize. A common setting proceeds by structuring the model parameter space in hierarchical layers. A primary layer (e.g. lithology) conditions a secondary layer (e.g. physical medium properties), which conditions a third layer (e.g. geophysical data). In general, less structured relations within model components and data emerge from the analysis of other inverse problems. They can be described with flexibility via directed acyclic graphs, which map dependency relations between the model components. Examples of inverse problems in complex models can be shown at various scales. At local scale, for example, the distribution of gas saturation is inferred from pre-stack seismic data and a calibrated rock-physics model. At regional scale, joint inversion of gravity and magnetic data is applied for the estimation of the lithological structure of the crust, with the lithotype body regions conditioning the mass density and magnetic susceptibility fields. At planetary scale, the Earth's mantle temperature and element composition are inferred from seismic travel-time and geodetic data.
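In compact form, the factorizations described above read (notation ours, for illustration: m collects the model components and d_1, ..., d_K the independent datasets):

    p(\mathbf{m} \mid \mathbf{d}_1, \dots, \mathbf{d}_K) \propto p(\mathbf{m}) \prod_{k=1}^{K} p(\mathbf{d}_k \mid \mathbf{m}),

and the hierarchical layering factorizes the prior itself, e.g.

    p(\mathbf{m}_{\mathrm{lith}}, \mathbf{m}_{\mathrm{phys}}) = p(\mathbf{m}_{\mathrm{phys}} \mid \mathbf{m}_{\mathrm{lith}})\, p(\mathbf{m}_{\mathrm{lith}}),

with the geophysical data likelihood attached to the physical-property layer; a directed acyclic graph generalizes this chain to arbitrary dependency structures.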
Harris, Wendy; Zhang, You; Yin, Fang-Fang; Ren, Lei
2017-01-01
Purpose To investigate the feasibility of using structural-based principal component analysis (PCA) motion-modeling and weighted free-form deformation to estimate onboard 4D-CBCT using prior information and extremely limited angle projections for potential 4D target verification of lung radiotherapy. Methods A technique for lung 4D-CBCT reconstruction has been previously developed using a deformation field map (DFM)-based strategy. In the previous method, each phase of the 4D-CBCT was generated by deforming a prior CT volume. The DFM was solved by a motion-model extracted by a global PCA and free-form deformation (GMM-FD) technique, using a data fidelity constraint and deformation energy minimization. In this study, a new structural-PCA method was developed to build a structural motion-model (SMM) by accounting for potential relative motion pattern changes between different anatomical structures from simulation to treatment. The motion model extracted from the planning 4D-CT was divided into two structures: tumor and body excluding tumor, and the parameters of both structures were optimized together. Weighted free-form deformation (WFD) was employed afterwards to introduce flexibility in adjusting the weightings of different structures in the data fidelity constraint based on clinical interests. An XCAT (computerized patient model) phantom with a 30 mm diameter lesion was simulated with various anatomical and respiratory changes from planning 4D-CT to onboard volume to evaluate the method. The estimation accuracy was evaluated by the Volume-Percent-Difference (VPD)/Center-of-Mass-Shift (COMS) between lesions in the estimated and "ground-truth" onboard 4D-CBCT. Different onboard projection acquisition scenarios and projection noise levels were simulated to investigate their effects on the estimation accuracy. The method was also evaluated on data from 3 lung patients. Results The SMM-WFD method achieved substantially better accuracy than the GMM-FD method for CBCT estimation using extremely small scan angles or projections. Using orthogonal 15° scanning angles, the VPD/COMS were 3.47±2.94% and 0.23±0.22mm for SMM-WFD and 25.23±19.01% and 2.58±2.54mm for GMM-FD among all 8 XCAT scenarios. Compared to GMM-FD, SMM-WFD was more robust against reduction of the scanning angles down to orthogonal 10° with VPD/COMS of 6.21±5.61% and 0.39±0.49mm, and more robust against reduction of projection numbers down to only 8 projections in total for both orthogonal-view 30° and orthogonal-view 15° scan angles. The SMM-WFD method was also more robust than the GMM-FD method against increasing levels of noise in the projection images. Additionally, the SMM-WFD technique provided better tumor estimation for all three lung patients compared to the GMM-FD technique. Conclusion Compared to the GMM-FD technique, the SMM-WFD technique can substantially improve the 4D-CBCT estimation accuracy using extremely small scan angles and a low number of projections to provide fast low-dose 4D target verification. PMID:28079267
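A toy sketch of the two ingredients named above, a structural PCA motion model and a weighted data-fidelity score (all names and shapes are ours; warping and forward projection are deliberately abstracted away):

    import numpy as np

    # Structural motion model: tumor and body each carry their own PCA modes,
    # and the deformation field map (DFM) is assembled from both sets of
    # coefficients before the data-fidelity term is scored.
    def reconstruct_dfm(mean_dfm, modes, coeffs):
        # mean_dfm: (n,) vectorized mean deformation
        # modes: dict structure-name -> (n, k_s) PCA modes for that structure
        # coeffs: dict structure-name -> (k_s,) coefficients being optimized
        dfm = mean_dfm.copy()
        for name, basis in modes.items():
            dfm += basis @ coeffs[name]
        return dfm

    # Weighted free-form data fidelity: the weight map raises the influence of
    # rays/voxels covering the clinically important structure (e.g., tumor).
    def weighted_fidelity(residual, weight_map):
        return float(np.sum(weight_map * residual ** 2))

    rng = np.random.default_rng(0)
    n = 1000
    modes = {"tumor": rng.standard_normal((n, 3)),
             "body": rng.standard_normal((n, 5))}
    coeffs = {"tumor": rng.standard_normal(3), "body": rng.standard_normal(5)}
    dfm = reconstruct_dfm(np.zeros(n), modes, coeffs)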
Geometrical structure of Neural Networks: Geodesics, Jeffrey's Prior and Hyper-ribbons
NASA Astrophysics Data System (ADS)
Hayden, Lorien; Alemi, Alex; Sethna, James
2014-03-01
Neural networks are learning algorithms which are employed in a host of Machine Learning problems including speech recognition, object classification and data mining. In practice, neural networks learn a low dimensional representation of high dimensional data and define a model manifold which is an embedding of this low dimensional structure in the higher dimensional space. In this work, we explore the geometrical structure of a neural network model manifold. A Stacked Denoising Autoencoder and a Deep Belief Network are trained on handwritten digits from the MNIST database. Construction of geodesics along the surface and of slices taken from the high dimensional manifolds reveal a hierarchy of widths corresponding to a hyper-ribbon structure. This property indicates that neural networks fall into the class of sloppy models, in which certain parameter combinations dominate the behavior. Employing this information could prove valuable in designing both neural network architectures and training algorithms. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship under Grant No. DGE-1144153.
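The sloppy-model signature is easy to reproduce on a classic toy model; the sketch below (ours, a stand-in for the paper's autoencoder experiments) computes the singular values of the Jacobian of a sum-of-exponentials model and shows them spanning several decades, the hierarchy of widths behind the hyper-ribbon picture:

    import numpy as np

    t = np.linspace(0, 5, 40)
    theta = np.array([0.5, 1.0, 2.0])            # decay rates (parameters)

    def predictions(theta):
        # model output: sum of decaying exponentials sampled at times t
        return np.exp(-np.outer(t, theta)).sum(axis=1)

    # Jacobian of predictions w.r.t. parameters by central differences
    eps = 1e-6
    J = np.column_stack([
        (predictions(theta + eps * e) - predictions(theta - eps * e)) / (2 * eps)
        for e in np.eye(len(theta))
    ])
    sv = np.linalg.svd(J, compute_uv=False)
    print("singular values:", sv)                # roughly geometric spacing
    print("decades spanned:", np.log10(sv[0] / sv[-1]))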
Active sensing in the categorization of visual patterns
Yang, Scott Cheng-Hsin; Lengyel, Máté; Wolpert, Daniel M
2016-01-01
Interpreting visual scenes typically requires us to accumulate information from multiple locations in a scene. Using a novel gaze-contingent paradigm in a visual categorization task, we show that participants' scan paths follow an active sensing strategy that incorporates information already acquired about the scene and knowledge of the statistical structure of patterns. Intriguingly, categorization performance was markedly improved when locations were revealed to participants by an optimal Bayesian active sensor algorithm. By using a combination of a Bayesian ideal observer and the active sensor algorithm, we estimate that a major portion of this apparent suboptimality of fixation locations arises from prior biases, perceptual noise and inaccuracies in eye movements, and the central process of selecting fixation locations is around 70% efficient in our task. Our results suggest that participants select eye movements with the goal of maximizing information about abstract categories that require the integration of information from multiple locations. DOI: http://dx.doi.org/10.7554/eLife.12215.001 PMID:26880546
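A minimal sketch of what a Bayesian active sensor optimizes (a toy two-category, three-location problem with made-up probabilities; the paper's algorithm operates on rich pattern statistics):

    import numpy as np

    def entropy(p):
        # binary entropy of P(category 1)
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -(p * np.log(p) + (1 - p) * np.log(1 - p))

    p_white = np.array([[0.9, 0.2, 0.5],         # P(white | category 0, location)
                        [0.1, 0.8, 0.5]])        # P(white | category 1, location)
    posterior = np.array([0.5, 0.5])             # current belief over categories

    def expected_info_gain(loc):
        gain = entropy(posterior[0])
        for obs in (0, 1):                       # 1 = white, 0 = black
            like = p_white[:, loc] if obs else 1 - p_white[:, loc]
            p_obs = like @ posterior
            post = like * posterior / p_obs
            gain -= p_obs * entropy(post[0])     # subtract expected residual entropy
        return gain

    best = max(range(3), key=expected_info_gain)
    print("next fixation:", best)                # location 0 is most diagnostic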
Automatic segmentation of brain MRIs and mapping neuroanatomy across the human lifespan
NASA Astrophysics Data System (ADS)
Keihaninejad, Shiva; Heckemann, Rolf A.; Gousias, Ioannis S.; Rueckert, Daniel; Aljabar, Paul; Hajnal, Joseph V.; Hammers, Alexander
2009-02-01
A robust model for the automatic segmentation of human brain images into anatomically defined regions across the human lifespan would be highly desirable, but such structural segmentations of brain MRI are challenging due to age-related changes. We have developed a new method, based on established algorithms for automatic segmentation of young adults' brains. We used prior information from 30 anatomical atlases, which had been manually segmented into 83 anatomical structures. Target MRIs came from 80 subjects (~12 individuals/decade) aged 20 to 90 years, with equal numbers of men and women, and data from two different scanners (1.5T, 3T), using the IXI database. Each of the adult atlases was registered to each target MR image. By using additional information from segmentation into tissue classes (GM, WM and CSF) to initialise the warping based on label consistency similarity before feeding this into the previous normalised mutual information non-rigid registration, the registration became robust enough to accommodate atrophy and ventricular enlargement with age. The final segmentation was obtained by combination of the 30 propagated atlases using decision fusion. Kernel smoothing was used for modelling the structural volume changes with aging. Example linear correlation coefficients with age were, for lateral ventricular volume, r_male = 0.76, r_female = 0.58 and, for hippocampal volume, r_male = -0.6, r_female = -0.4 (all p < 0.01).
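Decision fusion in this setting is commonly a per-voxel majority vote over the propagated atlas labels; a minimal sketch (assumed vote rule, illustrative shapes):

    import numpy as np

    def fuse_labels(propagated, n_labels):
        # propagated: (n_atlases, X, Y, Z) integer label volumes after registration
        votes = (propagated[..., None] == np.arange(n_labels)).sum(axis=0)
        return votes.argmax(axis=-1)             # most frequent label per voxel

    rng = np.random.default_rng(1)
    atlases = rng.integers(0, 4, size=(30, 8, 8, 8))   # 30 atlases, tiny toy volume
    fused = fuse_labels(atlases, n_labels=4)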
Code of Federal Regulations, 2010 CFR
2010-07-01
..., the parties may submit any additional relevant information relating to the violation, either prior to... to submit additional information or request a safety and health conference with the District Manager... parties to discuss the issues involved prior to the conference. (d) MSHA will consider all relevant...
6 CFR 5.46 - Procedure when response to demand is required prior to receiving instructions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 6 Domestic Security 1 2010-01-01 2010-01-01 false Procedure when response to demand is required prior to receiving instructions. 5.46 Section 5.46 Domestic Security DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY DISCLOSURE OF RECORDS AND INFORMATION Disclosure of Information in Litigation § 5...
Is Bayesian Estimation Proper for Estimating the Individual's Ability? Research Report 80-3.
ERIC Educational Resources Information Center
Samejima, Fumiko
The effect of prior information in Bayesian estimation is considered, mainly from the standpoint of objective testing. In the estimation of a parameter belonging to an individual, the prior information is, in most cases, the density function of the population to which the individual belongs. Bayesian estimation was compared with maximum likelihood…
Small-Sample Equating with Prior Information. Research Report. ETS RR-09-25
ERIC Educational Resources Information Center
Livingston, Samuel A.; Lewis, Charles
2009-01-01
This report proposes an empirical Bayes approach to the problem of equating scores on test forms taken by very small numbers of test takers. The equated score is estimated separately at each score point, making it unnecessary to model either the score distribution or the equating transformation. Prior information comes from equatings of other…
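The empirical Bayes idea can be sketched as precision-weighted shrinkage at each score point (a hedged reading of the approach; the report's exact estimator may differ):

    # Shrink a noisy small-sample equated score toward a prior mean built from
    # equatings of other forms; the weight reflects the two variances.
    def eb_equated_score(direct, direct_var, prior_mean, prior_var):
        w = prior_var / (prior_var + direct_var)   # weight on the direct estimate
        return w * direct + (1 - w) * prior_mean

    # e.g. at one raw-score point: the direct equating from n = 20 examinees says
    # 31.0 (variance 4.0); prior equatings suggest 29.5 (variance 1.0).
    print(eb_equated_score(31.0, 4.0, 29.5, 1.0))  # 29.8, pulled toward the prior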
Rowe, Heather; Fisher, Jane; Quinlivan, Julie
2009-03-01
Prenatal maternal serum screening allows assessment of the risk of chromosomal abnormalities in the fetus and is increasingly being offered to all women regardless of age or prior risk. However, ensuring informed choice to participate in screening is difficult and the psychological implications of making an informed decision are uncertain. The aim of this study was to compare the growth of maternal-fetal emotional attachment in groups of women whose decisions about participation in screening were informed or not informed. A prospective longitudinal design was used. English-speaking women were recruited in antenatal clinics prior to the offer of second trimester maternal screening. Three self-report questionnaires completed over the course of pregnancy used validated measures of informed choice and maternal-fetal emotional attachment. Attachment scores throughout pregnancy in informed and not-informed groups were compared in repeated measures analysis. A total of 134 women completed the first assessment (recruitment rate 73%) and 68 (58%) provided complete data. The informed group had significantly lower attachment scores (p = 0.023) than the not-informed group prior to testing, but scores were similar (p = 0.482) after test results were known. The findings raise questions about the impact of delayed maternal-fetal attachment and appropriate interventions to facilitate informed choice to participate in screening.
Barba-Montoya, Jose; Dos Reis, Mario; Yang, Ziheng
2017-09-01
Fossil calibrations are the ultimate source of information for resolving the distances between molecular sequences into estimates of absolute times and absolute rates in molecular clock dating analysis. The quality of calibrations is thus expected to have a major impact on divergence time estimates even if a huge amount of molecular data is available. In Bayesian molecular clock dating, fossil calibration information is incorporated in the analysis through the prior on divergence times (the time prior). Here, we evaluate three strategies for converting fossil calibrations (in the form of minimum- and maximum-age bounds) into the prior on times, which differ according to whether they borrow information from the maximum age of ancestral nodes and the minimum age of descendant nodes to form constraints for any given node on the phylogeny. We study a simple example that is analytically tractable, and analyze two real datasets (one of 10 primate species and another of 48 seed plant species) using three Bayesian dating programs: MCMCTree, MrBayes and BEAST2. We examine how different calibration strategies, the birth-death process, and automatic truncation (to enforce the constraint that ancestral nodes are older than descendant nodes) interact to determine the time prior. In general, truncation has a great impact on calibrations, so that the effective priors on the calibration node ages after truncation can be very different from the user-specified calibration densities. The different strategies for generating the effective prior also had considerable impact, leading to very different marginal effective priors. Arbitrary parameters used to implement minimum-bound calibrations were found to have a strong impact upon the prior and posterior of the divergence times. Our results highlight the importance of inspecting the joint time prior used by the dating program before any Bayesian dating analysis. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
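The truncation effect described above is easy to demonstrate by rejection sampling (toy uniform calibration densities, illustrative ages):

    import numpy as np

    rng = np.random.default_rng(42)
    anc = rng.uniform(60, 100, size=200_000)   # specified density, ancestor (Ma)
    dec = rng.uniform(50, 90, size=200_000)    # specified density, descendant (Ma)
    keep = anc > dec                           # ancestral node must be older
    print("specified means:  ", anc.mean(), dec.mean())            # ~80, ~70
    print("effective priors: ", anc[keep].mean(), dec[keep].mean())
    # After truncation the ancestor's effective prior shifts older and the
    # descendant's shifts younger than the user-specified densities.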
2008-03-01
amount of arriving data, extract actionable information, and integrate it with prior knowledge. Add to that the pressures of today’s fusion center climate and it becomes clear that analysts, police... fusion centers, including specifics about how these problems manifest at the Illinois State Police (ISP) Statewide Terrorism and Intelligence Center
Roberts, J Scott; Gornick, Michele C; Carere, Deanna Alexis; Uhlmann, Wendy R; Ruffin, Mack T; Green, Robert C
2017-01-01
To describe the interests, decision making, and responses of consumers of direct-to-consumer personal genomic testing (DTC-PGT) services. Prior to 2013 regulatory restrictions on DTC-PGT services, 1,648 consumers from 2 leading companies completed Web surveys before and after receiving test results. Prior to testing, DTC-PGT consumers were as interested in ancestry (74% very interested) and trait information (72%) as they were in disease risks (72%). Among disease risks, heart disease (68% very interested), breast cancer (67%), and Alzheimer disease (66%) were of greatest interest prior to testing. Interest in disease risks was associated with female gender and poorer self-reported health (p < 0.01). Many consumers (38%) did not consider the possibility of unwanted information before purchasing services; this group was more likely to be older, male, and less educated (p < 0.05). After receiving results, 59% of respondents said test information would influence management of their health; 2% reported regret about seeking testing and 1% reported harm from results. DTC-PGT has attracted controversy because of the health-related information it provides, but nonmedical information is of equal or greater interest to consumers. Although many consumers did not fully consider potential risks prior to testing, DTC-PGT was generally perceived as useful in informing future health decisions. © 2017 S. Karger AG, Basel.
Younge, Sinead N.; Boyer, Cherrie B.; Geter, Angelica; Barker, Judith C.; Corneille, Maya
2015-01-01
The purpose of this study was to provide formative data on the sexual behaviors of emerging adult Black men who attended a historically Black college/university. A convenience sample of 19 participants completed a demographic questionnaire and a semi-structured interview. This study utilized a phenomenological qualitative approach to explore the role that the developmental stage of emerging adulthood plays in sexual health. Some of the major themes that emerged included maturation, sexual decision-making, respectability, a future orientation, and masculinity. Despite sexual initiation beginning prior to entering college, participants discussed how the college environment presented them with new information, experiences, and attitudes. This study provides useful information for the future investigation of emerging adult Black men who attend HBCUs. PMID:26146649
Effects of Prior Knowledge on Memory: Implications for Education
ERIC Educational Resources Information Center
Shing, Yee Lee; Brod, Garvin
2016-01-01
The encoding, consolidation, and retrieval of events and facts form the basis for acquiring new skills and knowledge. Prior knowledge can enhance those memory processes considerably and thus foster knowledge acquisition. But prior knowledge can also hinder knowledge acquisition, in particular when the to-be-learned information is inconsistent with…
Menarche: Prior Knowledge and Experience.
ERIC Educational Resources Information Center
Skandhan, K. P.; And Others
1988-01-01
Recorded menstruation information among 305 young women in India, assessing the differences between those who did and did not have knowledge of menstruation prior to menarche. Those with prior knowledge considered menarche to be a normal physiological function and had a higher rate of regularity, lower rate of dysmenorrhea, and earlier onset of…
The future of structural fieldwork - UAV assisted aerial photogrammetry
NASA Astrophysics Data System (ADS)
Vollgger, Stefan; Cruden, Alexander
2015-04-01
Unmanned aerial vehicles (UAVs), commonly referred to as drones, are opening new and low-cost possibilities to acquire high-resolution aerial images and digital surface models (DSM) for applications in structural geology. UAVs can be programmed to fly autonomously along a user-defined grid to systematically capture high-resolution photographs, even in difficult-to-access areas. The photographs are subsequently processed using software that employs SIFT (scale-invariant feature transform) and SfM (structure from motion) algorithms. These photogrammetric routines allow the extraction of spatial information (3D point clouds, digital elevation models, 3D meshes, orthophotos) from 2D images. Depending on flight altitude and camera setup, sub-centimeter spatial resolutions can be achieved. By “digitally mapping” georeferenced 3D models and images, orientation data can be extracted directly and used to analyse the structural framework of the mapped object or area. We present UAV-assisted aerial mapping results from a coastal platform near Cape Liptrap (Victoria, Australia), where deformed metasediments of the Palaeozoic Lachlan Fold Belt are exposed. We also show how orientation and spatial information of brittle and ductile structures extracted from the photogrammetric model can be linked to the progressive development of folds and faults in the region. Even though there are both technical and legislative limitations, which might prohibit the use of UAVs without prior commercial licensing and training, the benefits that arise from the resulting high-resolution, photorealistic models can substantially contribute to the collection of new data and insights for applications in structural geology.
Petit, Caroline; Samson, Adeline; Morita, Satoshi; Ursino, Moreno; Guedj, Jérémie; Jullien, Vincent; Comets, Emmanuelle; Zohar, Sarah
2018-06-01
The number of trials conducted and the number of patients per trial are typically small in paediatric clinical studies. This is due to ethical constraints and the complexity of the medical process for treating children. While incorporating prior knowledge from adults may be extremely valuable, this must be done carefully. In this paper, we propose a unified method for designing and analysing dose-finding trials in paediatrics, while bridging information from adults. The dose-range is calculated under three extrapolation options, linear, allometry and maturation adjustment, using adult pharmacokinetic data. To do this, it is assumed that target exposures are the same in both populations. The working model and prior distribution parameters of the dose-toxicity and dose-efficacy relationships are obtained using early-phase adult toxicity and efficacy data at several dose levels. Priors are integrated into the dose-finding process through Bayesian model selection or adaptive priors. This calibrates the model to adjust for misspecification if the adult and paediatric data are very different. We performed a simulation study which indicates that incorporating prior adult information in this way may improve dose selection in children.
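A sketch of the three extrapolation options named above, under the stated assumption of equal target exposures so that the child dose scales with clearance (the 0.75 exponent and the Hill maturation constants are standard pharmacokinetic conventions used for illustration, not values from the paper):

    # Scale an adult dose to a child via the clearance ratio.
    def child_dose(adult_dose, weight_kg, pma_weeks=None, option="allometry"):
        if option == "linear":
            cl_ratio = weight_kg / 70.0                    # linear in body weight
        elif option == "allometry":
            cl_ratio = (weight_kg / 70.0) ** 0.75          # allometric scaling
        elif option == "maturation":
            tm50, hill = 47.7, 3.4                         # illustrative constants
            frac = pma_weeks ** hill / (tm50 ** hill + pma_weeks ** hill)
            cl_ratio = (weight_kg / 70.0) ** 0.75 * frac   # allometry x maturation
        return adult_dose * cl_ratio

    for opt in ("linear", "allometry", "maturation"):
        print(opt, round(child_dose(100.0, 10.0, pma_weeks=60, option=opt), 1))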
The Evolution of Primate Communication and Metacommunication
2016-01-01
Abstract Against the prior view that primate communication is based only on signal decoding, comparative evidence suggests that primates are able, no less than humans, to intentionally perform or understand impulsive or habitual communicational actions with a structured evaluative nonconceptual content. These signals convey an affordance‐sensing that immediately motivates conspecifics to act. Although humans have access to a strategic form of propositional communication adapted to teaching and persuasion, they share with nonhuman primates the capacity to communicate in impulsive or habitual ways. They are also similarly able to monitor fluency, informativeness and relevance of messages or signals through nonconceptual cues. PMID:27134332
NASA Astrophysics Data System (ADS)
Dai, Meng-Xue; Chen, Jing-Bo; Cao, Jian
2017-07-01
Full-waveform inversion (FWI) is an ill-posed optimization problem which is sensitive to noise and to the initial model. To alleviate the ill-posedness of the problem, regularization techniques are usually adopted. The ℓ1-norm penalty is a robust regularization method that preserves contrasts and edges. The Orthant-Wise Limited-memory Quasi-Newton (OWL-QN) method extends the widely used limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) method to ℓ1-regularized optimization problems and inherits the efficiency of L-BFGS. To take advantage of the ℓ1-regularized method and of prior model information obtained from sonic logs and geological knowledge, we implement the OWL-QN algorithm for ℓ1-regularized FWI with prior model information in this paper. Numerical experiments show that this method not only improves the inversion results but is also robust to noise.
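A toy version of the ℓ1-regularized objective with a prior model m_p, minimizing (1/2)||Gm - d||^2 + lam*||m - m_p||_1 over m. OWL-QN itself is not available in SciPy, so a proximal-gradient (ISTA) loop stands in below; both target the same objective:

    import numpy as np

    def soft_threshold(x, t):
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def ista_l1_prior(G, d, m_prior, lam=0.1, n_iter=500):
        step = 1.0 / np.linalg.norm(G, 2) ** 2     # 1 / Lipschitz constant
        m = m_prior.copy()
        for _ in range(n_iter):
            grad = G.T @ (G @ m - d)               # gradient of the misfit term
            z = m - step * grad
            m = m_prior + soft_threshold(z - m_prior, step * lam)  # prox about m_p
        return m

    rng = np.random.default_rng(0)
    G = rng.standard_normal((40, 60))              # linearized forward operator
    m_true = np.zeros(60); m_true[[5, 20, 44]] = [1.0, -0.8, 0.5]
    m_prior = np.zeros(60)                         # e.g. built from sonic logs
    d = G @ m_true + 0.01 * rng.standard_normal(40)
    m_hat = ista_l1_prior(G, d, m_prior)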
NASA Astrophysics Data System (ADS)
Caticha, Ariel
2007-11-01
What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs and therefore it is the force that induces us to change our minds. This problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME), which is designed for updating from arbitrary priors given information in the form of arbitrary constraints, includes as special cases both MaxEnt (which allows arbitrary constraints) and Bayes' rule (which allows arbitrary priors). Thus, ME unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme that allows us to handle problems that lie beyond the reach of either of the two methods separately. I conclude with a couple of simple illustrative examples.
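In symbols (a standard rendering of the scheme just described, not a quotation from the paper): ME updates a prior q(x) to the distribution p(x) that maximizes the relative entropy

    S[p \mid q] = -\int dx\, p(x)\,\log\frac{p(x)}{q(x)}

subject to normalization and constraints of the form \int dx\, p(x) f(x) = F. The result is the exponential update

    p(x) = \frac{q(x)\, e^{\lambda f(x)}}{Z(\lambda)}, \qquad Z(\lambda) = \int dx\, q(x)\, e^{\lambda f(x)},

with the multiplier \lambda fixed by the constraint. A uniform prior q recovers MaxEnt, while constraints expressing observed data recover Bayes' rule, which is the unification referred to above.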
Bayesian sample size calculations in phase II clinical trials using a mixture of informative priors.
Gajewski, Byron J; Mayo, Matthew S
2006-08-15
A number of researchers have discussed phase II clinical trials from a Bayesian perspective. A recent article by Mayo and Gajewski focuses on sample size calculations, which they determine by specifying an informative prior distribution and then calculating a posterior probability that the true response will exceed a prespecified target. In this article, we extend these sample size calculations to include a mixture of informative prior distributions. The mixture comes from several sources of information. For example, consider information from two (or more) clinicians. The first clinician is pessimistic about the drug and the second clinician is optimistic. We tabulate the results for sample size design using the fact that the simple mixture of Betas is a conjugate family for the Beta-Binomial model. We discuss the theoretical framework for these types of Bayesian designs and show that the Bayesian designs in this paper approximate this theoretical framework. Copyright 2006 John Wiley & Sons, Ltd.
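Because the Beta mixture is conjugate to the binomial, the update has closed form: component parameters gain the counts and the mixture weights are reweighted by each component's Beta-Binomial marginal likelihood. A sketch (notation assumed; the paper's design calculations build on this update):

    from scipy.stats import beta, betabinom

    def posterior_mixture(weights, params, x, n):
        # prior: sum_i w_i * Beta(a_i, b_i); data: x responses out of n patients
        marg = [w * betabinom.pmf(x, n, a, b) for w, (a, b) in zip(weights, params)]
        z = sum(marg)
        new_w = [m / z for m in marg]                      # reweighted components
        new_params = [(a + x, b + n - x) for (a, b) in params]
        return new_w, new_params

    # pessimistic and optimistic clinicians, equally weighted a priori
    w, ab = posterior_mixture([0.5, 0.5], [(2, 8), (8, 2)], x=14, n=20)
    p_exceed = sum(wi * beta.sf(0.5, a, b) for wi, (a, b) in zip(w, ab))
    print(w, p_exceed)     # posterior P(true response rate > 0.5)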
Winograd, Michael R; Rosenfeld, J Peter
2014-12-01
In P300-Concealed Information Tests used with mock crime scenarios, the amount of detail revealed to a participant prior to the commission of the mock crime can have a serious impact on a study's validity. We predicted that exposure to crime details through instructions would bias detection rates toward enhanced sensitivity. In a 2 × 2 factorial design, participants were either informed (through mock crime instructions) or naïve as to the identity of a to-be-stolen item, and then either committed (guilty) or did not commit (innocent) the crime. Results showed that prior knowledge of the stolen item was sufficient to cause 69% of innocent-informed participants to be incorrectly classified as guilty. Further, we found a trend toward enhanced detection rate for guilty-informed participants over guilty-naïve participants. Results suggest that revealing details to participants through instructions biases detection rates in the P300-CIT toward enhanced sensitivity. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.
2017-12-01
In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all data available. However, classic inverse modeling frameworks typically only make use of information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information coming from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework to represent such ex-situ information by virtue of the prior distribution and combine them with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists in two steps, both relying on statistical and machine learning methods. The first step is data selection; it consists in selecting sites similar to the target site. We use clustering methods for selecting similar sites based on observable hydrogeological features. The second step is data assimilation; it consists in assimilating data from the selected similar sites into the informative prior. We use a Bayesian hierarchical model to account for inter-site variability and to allow for the assimilation of multiple types of site-specific data. We present the application and validation of the presented methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R-package and therefore facilitate easy use by other practitioners.
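A two-step toy sketch of the approach (synthetic data; the authors' open-source R-package will differ in detail): cluster sites on observable features, keep the target's cluster, and pool those sites' parameters into a simple hierarchical normal prior:

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    features = rng.standard_normal((50, 4))        # observable site descriptors
    site_logK = rng.normal(-4, 1, size=50)         # site-mean log-conductivities

    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)
    similar = labels == labels[0]                  # sites in the target's cluster
    similar[0] = False                             # exclude the target site itself

    mu = site_logK[similar].mean()                 # prior mean from similar sites
    tau2 = site_logK[similar].var(ddof=1)          # between-site variance
    print(f"informative prior for target: logK ~ N({mu:.2f}, {tau2:.2f})")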
Horodecki, Michał; Oppenheim, Jonathan; Winter, Andreas
2005-08-04
Information--be it classical or quantum--is measured by the amount of communication needed to convey it. In the classical case, if the receiver has some prior information about the messages being conveyed, less communication is needed. Here we explore the concept of prior quantum information: given an unknown quantum state distributed over two systems, we determine how much quantum communication is needed to transfer the full state to one system. This communication measures the partial information one system needs, conditioned on its prior information. We find that it is given by the conditional entropy--a quantity that was known previously, but lacked an operational meaning. In the classical case, partial information must always be positive, but we find that in the quantum world this physical quantity can be negative. If the partial information is positive, its sender needs to communicate this number of quantum bits to the receiver; if it is negative, then sender and receiver instead gain the corresponding potential for future quantum communication. We introduce a protocol that we term 'quantum state merging' which optimally transfers partial information. We show how it enables a systematic understanding of quantum network theory, and discuss several important applications including distributed compression, noiseless coding with side information, multiple access channels and assisted entanglement distillation.
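For concreteness, the quantity referred to is the quantum conditional entropy (standard definition, stated here for the reader):

    S(A \mid B) = S(AB) - S(B),

where S is the von Neumann entropy. For a maximally entangled Bell pair the joint state is pure, so S(AB) = 0, while the reduced state gives S(B) = 1; hence S(A \mid B) = -1: the partial information is negative, and state merging converts it into one ebit of entanglement usable for future quantum communication.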
Batke, Monika; Gütlein, Martin; Partosch, Falko; Gundert-Remy, Ursula; Helma, Christoph; Kramer, Stefan; Maunz, Andreas; Seeland, Madeleine; Bitsch, Annette
2016-01-01
Interest is increasing in the development of non-animal methods for toxicological evaluations. These methods are, however, particularly challenging for complex toxicological endpoints such as repeated dose toxicity. European legislation, e.g., the European Union's Cosmetics Directive and REACH, demands the use of alternative methods. Frameworks, such as the Read-Across Assessment Framework or the Adverse Outcome Pathway Knowledge Base, support the development of these methods. The aim of the project presented in this publication was to develop substance categories for a read-across with complex endpoints of toxicity based on existing databases. The basic conceptual approach was to combine structural similarity with shared mechanisms of action. Substances with similar chemical structure and toxicological profile form candidate categories suitable for read-across. We combined two databases on repeated dose toxicity, the RepDose database and the ELINCS database, to form a common database for the identification of categories. The resulting database contained physicochemical, structural, and toxicological data, which were refined and curated for cluster analyses. We applied the Predictive Clustering Tree (PCT) approach for clustering chemicals based on structural and on toxicological information to detect groups of chemicals with similar toxic profiles and pathways/mechanisms of toxicity. As many of the experimental toxicity values were not available, these data were imputed by predicting them with a multi-label classification method prior to clustering. The clustering results were evaluated by assessing chemical and toxicological similarities with the aim of identifying clusters with a concordance between structural information and toxicity profiles/mechanisms. From these chosen clusters, seven were selected for a quantitative read-across, based on a small ratio (< 5) between the highest and the lowest NOAEL among cluster members. We discuss the limitations of the approach. Based on this analysis we propose improvements for a follow-up approach, such as incorporation of metabolic information and more detailed mechanistic information. The software enables the user to allocate a substance in a cluster and to use this information for a possible read-across. The clustering tool is provided as a free web service, accessible at http://mlc-reach.informatik.uni-mainz.de.
Noorbaloochi, Sharareh; Sharon, Dahlia; McClelland, James L
2015-08-05
We used electroencephalography (EEG) and behavior to examine the role of payoff bias in a difficult two-alternative perceptual decision under deadline pressure in humans. The findings suggest that a fast guess process, biased by payoff and triggered by stimulus onset, occurred on a subset of trials and raced with an evidence accumulation process informed by stimulus information. On each trial, the participant judged whether a rectangle was shifted to the right or left and responded by squeezing a right- or left-hand dynamometer. The payoff for each alternative (which could be biased or unbiased) was signaled 1.5 s before stimulus onset. The choice response was assigned to the first hand reaching a squeeze force criterion and reaction time was defined as time to criterion. Consistent with a fast guess account, fast responses were strongly biased toward the higher-paying alternative and the EEG exhibited an abrupt rise in the lateralized readiness potential (LRP) on a subset of biased payoff trials contralateral to the higher-paying alternative ∼ 150 ms after stimulus onset and 50 ms before stimulus information influenced the LRP. This rise was associated with poststimulus dynamometer activity favoring the higher-paying alternative and predicted choice and response time. Quantitative modeling supported the fast guess account over accounts of payoff effects supported in other studies. Our findings, taken with previous studies, support the idea that payoff and prior probability manipulations produce flexible adaptations to task structure and do not reflect a fixed policy for the integration of payoff and stimulus information. Humans and other animals often face situations in which they must make choices based on uncertain sensory information together with information about expected outcomes (gains or losses) about each choice. We investigated how differences in payoffs between available alternatives affect neural activity, overt choice, and the timing of choice responses. In our experiment, in which participants were under strong time pressure, neural and behavioral findings together with model fitting suggested that our human participants often made a fast guess toward the higher reward rather than integrating stimulus and payoff information. Our findings, taken with findings from other studies, support the idea that payoff and prior probability manipulations produce flexible adaptations to task structure and do not reflect a fixed policy. Copyright © 2015 the authors.
Genetic diversity and patterns of population structure in Creole goats from the Americas.
Ginja, C; Gama, L T; Martínez, A; Sevane, N; Martin-Burriel, I; Lanari, M R; Revidatti, M A; Aranguren-Méndez, J A; Bedotti, D O; Ribeiro, M N; Sponenberg, P; Aguirre, E L; Alvarez-Franco, L A; Menezes, M P C; Chacón, E; Galarza, A; Gómez-Urviola, N; Martínez-López, O R; Pimenta-Filho, E C; da Rocha, L L; Stemmer, A; Landi, V; Delgado-Bermejo, J V
2017-06-01
Biodiversity studies are more efficient when large numbers of breeds belonging to several countries are involved, as they allow for an in-depth analysis of the within- and between-breed components of genetic diversity. A set of 21 microsatellites was used to investigate the genetic composition of 24 Creole goat breeds (910 animals) from 10 countries to estimate levels of genetic variability, infer population structure and understand genetic relationships among populations across the American continent. Three commercial transboundary breeds were included in the analyses to investigate admixture with Creole goats. Overall, the genetic diversity of Creole populations (mean number of alleles = 5.82 ± 1.14, observed heterozygosity = 0.585 ± 0.074) was moderate and slightly lower than what was detected in other studies with breeds from other regions. The Bayesian clustering analysis without prior information on source populations identified 22 breed clusters. Three groups comprised more than one population, namely from Brazil (Azul and Graúna; Moxotó and Repartida) and Argentina (Long and shorthair Chilluda, Pampeana Colorada and Angora-type goat). Substructure was found in Criolla Paraguaya. When prior information on sample origin was considered, 92% of the individuals were assigned to the source population (threshold q ≥ 0.700). Creole breeds are well-differentiated entities (mean coefficient of genetic differentiation = 0.111 ± 0.048, with the exception of isolated island populations). Dilution from admixture with commercial transboundary breeds appears to be negligible. Significant levels of inbreeding were detected (inbreeding coefficient > 0 in most Creole goat populations, P < 0.05). Our results provide a broad perspective on the extant genetic diversity of Creole goats, however further studies are needed to understand whether the observed geographical patterns of population structure may reflect the mode of goat colonization in the Americas. © 2017 Stichting International Foundation for Animal Genetics.
Martínez, Carlos Alberto; Khare, Kshitij; Banerjee, Arunava; Elzo, Mauricio A
2017-03-21
It is important to consider heterogeneity of marker effects and allelic frequencies in across-population genome-wide prediction studies. Moreover, all regression models used in genome-wide prediction overlook the randomness of genotypes. In this study, we developed a family of hierarchical Bayesian models for across-population genome-wide prediction that model genotypes as random variables and allow population-specific effects for each marker. Models shared a common structure and differed in the priors used and in the assumption about residual variances (homogeneous or heterogeneous). Randomness of genotypes was accounted for by deriving the joint probability mass function of marker genotypes conditional on allelic frequencies and pedigree information. As a consequence, these models incorporated kinship and genotypic information that not only permitted accounting for heterogeneity of allelic frequencies, but also the inclusion of individuals with missing genotypes at some or all loci without the need for previous imputation. This was possible because the non-observed fraction of the design matrix was treated as an unknown model parameter. For each model, a simpler version ignoring population structure, but still accounting for randomness of genotypes, was proposed. Implementation of these models and computation of some criteria for model comparison were illustrated using two simulated datasets. Theoretical and computational issues along with possible applications, extensions and refinements were discussed. Some features of the models developed in this study make them promising for genome-wide prediction; the use of the information contained in the probability distribution of genotypes is perhaps the most appealing. Further studies are needed to assess the performance of the models proposed here and to compare them with conventional models used in genome-wide prediction. Copyright © 2017 Elsevier Ltd. All rights reserved.
Implicit Priors in Galaxy Cluster Mass and Scaling Relation Determinations
NASA Technical Reports Server (NTRS)
Mantz, A.; Allen, S. W.
2011-01-01
Deriving the total masses of galaxy clusters from observations of the intracluster medium (ICM) generally requires some prior information, in addition to the assumptions of hydrostatic equilibrium and spherical symmetry. Often, this information takes the form of particular parametrized functions used to describe the cluster gas density and temperature profiles. In this paper, we investigate the implicit priors on hydrostatic masses that result from this fully parametric approach, and the implications of such priors for scaling relations formed from those masses. We show that the application of such fully parametric models of the ICM naturally imposes a prior on the slopes of the derived scaling relations, favoring the self-similar model, and argue that this prior may be influential in practice. In contrast, this bias does not exist for techniques which adopt an explicit prior on the form of the mass profile but describe the ICM non-parametrically. Constraints on the slope of the cluster mass-temperature relation in the literature show a separation based on the approach employed, with the results from fully parametric ICM modeling clustering nearer the self-similar value. Given that a primary goal of scaling relation analyses is to test the self-similar model, the application of methods subject to strong, implicit priors should be avoided. Alternative methods and best practices are discussed.
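For reference, the hydrostatic mass relation that underlies this discussion (spherical symmetry and an ideal-gas ICM; standard notation, not quoted from the paper) is

    M(<r) = -\frac{k_B\, T(r)\, r}{G\, \mu\, m_p} \left( \frac{d\ln n_g}{d\ln r} + \frac{d\ln T}{d\ln r} \right),

so choosing parametrized functional forms for the gas density n_g(r) and temperature T(r) fixes the admissible logarithmic derivatives, and through them implicitly constrains the mass profile and the slopes of any scaling relations built from these masses.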
The impact of the rate prior on Bayesian estimation of divergence times with multiple Loci.
Dos Reis, Mario; Zhu, Tianqi; Yang, Ziheng
2014-07-01
Bayesian methods provide a powerful way to estimate species divergence times by combining information from molecular sequences with information from the fossil record. With the explosive increase of genomic data, divergence time estimation increasingly uses data of multiple loci (genes or site partitions). Widely used computer programs to estimate divergence times use independent and identically distributed (i.i.d.) priors on the substitution rates for different loci. The i.i.d. prior is problematic. As the number of loci (L) increases, the prior variance of the average rate across all loci goes to zero at the rate 1/L. As a consequence, the rate prior dominates posterior time estimates when many loci are analyzed, and if the rate prior is misspecified, the estimated divergence times will converge to wrong values with very narrow credibility intervals. Here we develop a new prior on the locus rates based on the Dirichlet distribution that corrects the problematic behavior of the i.i.d. prior. We use computer simulation and real data analysis to highlight the differences between the old and new priors. For a dataset for six primate species, we show that with the old i.i.d. prior, if the prior rate is too high (or too low), the estimated divergence times are too young (or too old), outside the bounds imposed by the fossil calibrations. In contrast, with the new Dirichlet prior, posterior time estimates are insensitive to the rate prior and are compatible with the fossil calibrations. We re-analyzed a phylogenomic data set of 36 mammal species and show that using many fossil calibrations can alleviate the adverse impact of a misspecified rate prior to some extent. We recommend the use of the new Dirichlet prior in Bayesian divergence time estimation. [Bayesian inference, divergence time, relaxed clock, rate prior, partition analysis.]. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
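The 1/L behavior and its Dirichlet fix can be checked directly by simulation (illustrative gamma parameters):

    import numpy as np

    rng = np.random.default_rng(7)
    L, n = 100, 50_000
    a, b = 2.0, 20.0                               # gamma shape and rate, mean 0.1

    # i.i.d. prior: average of L independent gamma locus rates
    iid_mean = rng.gamma(a, 1 / b, size=(n, L)).mean(axis=1)

    # Dirichlet prior: draw the mean rate once, then partition it among loci
    mu = rng.gamma(a, 1 / b, size=n)
    props = rng.dirichlet(np.full(L, 1.0), size=n)
    dir_mean = (mu[:, None] * props * L).mean(axis=1)   # equals mu exactly

    print("i.i.d.:    var of mean rate =", iid_mean.var())  # ~ (a/b^2)/L = 5e-5
    print("Dirichlet: var of mean rate =", dir_mean.var())  # ~  a/b^2   = 5e-3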
Effects of Prior Knowledge and Concept-Map Structure on Disorientation, Cognitive Load, and Learning
ERIC Educational Resources Information Center
Amadieu, Franck; van Gog, Tamara; Paas, Fred; Tricot, Andre; Marine, Claudette
2009-01-01
This study explored the effects of prior knowledge (high vs. low; HPK and LPK) and concept-map structure (hierarchical vs. network; HS and NS) on disorientation, cognitive load, and learning from non-linear documents on "the infection process of a retrograde virus (HIV)". Participants in the study were 24 adults. Overall subjective ratings of…
Generalized multiple kernel learning with data-dependent priors.
Mao, Qi; Tsang, Ivor W; Gao, Shenghua; Wang, Li
2015-06-01
Multiple kernel learning (MKL) and classifier ensemble are two mainstream methods for solving learning problems in which some sets of features/views are more informative than others, or the features/views within a given set are inconsistent. In this paper, we first present a novel probabilistic interpretation of MKL such that maximum entropy discrimination with a noninformative prior over multiple views is equivalent to the formulation of MKL. Instead of using the noninformative prior, we introduce a novel data-dependent prior based on an ensemble of kernel predictors, which enhances the prediction performance of MKL by leveraging the merits of the classifier ensemble. With the proposed probabilistic framework of MKL, we propose a hierarchical Bayesian model to learn the proposed data-dependent prior and classification model simultaneously. The resultant problem is convex and other information (e.g., instances with either missing views or missing labels) can be seamlessly incorporated into the data-dependent priors. Furthermore, a variety of existing MKL models can be recovered under the proposed MKL framework and can be readily extended to incorporate these priors. Extensive experiments demonstrate the benefits of our proposed framework in supervised and semisupervised settings, as well as in tasks with partial correspondence among multiple views.
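The starting point of any MKL formulation is a weighted combination of base kernels; a bare-bones sketch with fixed uniform weights (learning the weights, or placing the data-dependent prior over them as proposed above, is the actual contribution):

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.metrics.pairwise import linear_kernel, rbf_kernel

    X = np.random.default_rng(0).standard_normal((60, 5))
    y = (X[:, 0] + X[:, 1] ** 2 > 0.5).astype(int)

    kernels = [linear_kernel(X), rbf_kernel(X, gamma=0.5), rbf_kernel(X, gamma=2.0)]
    mu = np.ones(len(kernels)) / len(kernels)      # kernel (view) weights
    K = sum(m * k for m, k in zip(mu, kernels))    # combined kernel

    clf = SVC(kernel="precomputed").fit(K, y)
    print("train accuracy:", clf.score(K, y))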
ERIC Educational Resources Information Center
Marcoulides, Katerina M.
2018-01-01
This study examined the use of Bayesian analysis methods for the estimation of item parameters in a two-parameter logistic item response theory model. Using simulated data under various design conditions with both informative and non-informative priors, the parameter recovery of Bayesian analysis methods were examined. Overall results showed that…
Code of Federal Regulations, 2010 CFR
2010-04-01
... submitting the information required by such form on magnetic tape or by other media, provided that the prior... required by such form on magnetic tape or other approved media, provided that the prior consent of the Commissioner of Social Security (or other authorized officer or employee thereof) has been obtained. [T.D. 6883...
21 CFR 1.281 - What information must be in a prior notice?
Code of Federal Regulations, 2010 CFR
2010-04-01
... by truck, bus, or rail, the trip number; (v) For food arriving as containerized cargo by water, air... arrived by truck, bus, or rail, the trip number; (v) For food that arrived as containerized cargo by water... 21 Food and Drugs 1 2010-04-01 2010-04-01 false What information must be in a prior notice? 1.281...
ERIC Educational Resources Information Center
Brown, Julie
2017-01-01
This article presents an overview of the findings of a recently completed study exploring the potentially transformative impact upon learners of recognition of prior informal learning (RPL). The specific transformative dimension being reported is learner identity. In addition to providing a starting point for an evidence base within Scotland, the…
Intracavitary applicator in relation to complications of pelvic radiation: the Ernst system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rotman, M.; John, M.J.; Roussis, K.
Case studies were reviewed for 100 consecutive patients with carcinoma of the cervix, Stages I to III, who were treated prior to 1968 at a large municipal hospital in New York City. Treatments consisted of orthovoltage therapy prior to or following intracavitary radium. A 250 kV x-ray unit delivered a 3000 rad tumor dose in 3 weeks through four oblique fields. Intracavitary radium delivered 6000 to 7200 mg-hr using the Ernst applicator. The 9% incidence of fistulae was 4-fold higher than that found in subsequent years using 60Co teletherapy and the afterloading Fletcher-Suit applicator. A review of the dosimetry relating to the use of the Ernst applicator demonstrates inherent structural characteristics which lend themselves to such complications. Where less than an ideal application is possible, this rigid applicator compacts itself in accordion-like fashion, producing a so-called short system. Coupled with a reduced source to applicator-surface distance, such applications result in unacceptable dose anisotropy and excessive radiation of critical structures where a predetermined dose is to be delivered to anatomic Point A. Information gleaned from this study can be extrapolated to other rigid, unprotected, short-surface-distance intravaginal applicators that have proliferated in recent years.
Collins, Doug; Benedict, Chris; Bary, Andy; Cogger, Craig
2015-01-01
The spatial heterogeneity of soil and weed populations poses a challenge to researchers. Unlike aboveground variability, below-ground variability is more difficult to discern without a strategic soil sampling pattern. While blocking is commonly used to control environmental variation, this strategy is rarely informed by data about current soil conditions. Fifty georeferenced sites were located in a 0.65 ha area prior to establishing a long-term field experiment. Soil organic matter (OM) and weed seed bank populations were analyzed at each site and the spatial structure was modeled with semivariograms and interpolated with kriging to map the surface. These maps were used to formulate three strategic blocking patterns and the efficiency of each pattern was compared to a completely randomized design and a west to east model not informed by soil variability. Compared to OM, weeds were more variable across the landscape and had a shorter range of autocorrelation, and models to increase blocking efficiency resulted in less increase in power. Weeds and OM were not correlated, so no model examined improved power equally for both parameters. Compared to the west to east blocking pattern, the final blocking pattern chosen resulted in a 7-fold increase in power for OM and a 36% increase in power for weeds.
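The first analysis step above, modeling spatial structure with a semivariogram, can be sketched with an empirical estimator: gamma(h) is half the mean squared difference between samples separated by roughly h (toy coordinates and values below):

    import numpy as np

    def empirical_semivariogram(coords, values, bin_edges):
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        sq = 0.5 * (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)     # each pair counted once
        d, sq = d[iu], sq[iu]
        return np.array([sq[(d >= lo) & (d < hi)].mean()
                         for lo, hi in zip(bin_edges[:-1], bin_edges[1:])])

    rng = np.random.default_rng(5)
    coords = rng.uniform(0, 80, size=(50, 2))      # 50 georeferenced sites (m)
    om = rng.normal(3.0, 0.4, size=50)             # soil OM (%), toy values
    print(empirical_semivariogram(coords, om, np.arange(0, 50, 10)))

A short range of autocorrelation, as found here for weeds, means blocks must be compact to capture the spatial structure; a long range, as for OM, tolerates larger blocks.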
Thrombus segmentation by texture dynamics from microscopic image sequences
NASA Astrophysics Data System (ADS)
Brieu, Nicolas; Serbanovic-Canic, Jovana; Cvejic, Ana; Stemple, Derek; Ouwehand, Willem; Navab, Nassir; Groher, Martin
2010-03-01
The genetic factors of thrombosis are commonly explored by microscopically imaging the coagulation of blood cells induced by injuring a vessel of mice or of zebrafish mutants. The latter species is particularly interesting since skin transparency permits to non-invasively acquire microscopic images of the scene with a CCD camera and to estimate the parameters characterizing the thrombus development. These parameters are currently determined by manual outlining, which is both error prone and extremely time consuming. Even though a technique for automatic thrombus extraction would be highly valuable for gene analysts, little work can be found, which is mainly due to very low image contrast and spurious structures. In this work, we propose to semi-automatically segment the thrombus over time from microscopic image sequences of wild-type zebrafish larvae. To compensate the lack of valuable spatial information, our main idea consists of exploiting the temporal information by modeling the variations of the pixel intensities over successive temporal windows with a linear Markov-based dynamic texture formalization. We then derive an image from the estimated model parameters, which represents the probability of a pixel to belong to the thrombus. We employ this probability image to accurately estimate the thrombus position via an active contour segmentation incorporating also prior and spatial information of the underlying intensity images. The performance of our approach is tested on three microscopic image sequences. We show that the thrombus is accurately tracked over time in each sequence if the respective parameters controlling prior influence and contour stiffness are correctly chosen.
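The linear dynamic-texture model referred to above is commonly fitted Doretto-style: appearance from an SVD of the frames, dynamics by least squares between successive hidden states. A sketch (ours; the probability image in the paper is then derived from how well each pixel's temporal behavior fits the model):

    import numpy as np

    def fit_dynamic_texture(Y, n_states=5):
        # Y: (n_pixels, n_frames) intensities over a temporal window
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        C = U[:, :n_states]                         # spatial appearance basis
        X = np.diag(s[:n_states]) @ Vt[:n_states]   # hidden state trajectory
        A = X[:, 1:] @ np.linalg.pinv(X[:, :-1])    # Markov transition matrix
        resid = X[:, 1:] - A @ X[:, :-1]            # innovation (model misfit)
        return C, A, resid

    Y = np.random.default_rng(2).standard_normal((256, 30))  # 16x16 patch, 30 frames
    C, A, resid = fit_dynamic_texture(Y)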
Structure-Based Low-Rank Model With Graph Nuclear Norm Regularization for Noise Removal.
Ge, Qi; Jing, Xiao-Yuan; Wu, Fei; Wei, Zhi-Hui; Xiao, Liang; Shao, Wen-Ze; Yue, Dong; Li, Hai-Bo
2017-07-01
Nonlocal image representation methods, including group-based sparse coding and block-matching 3-D filtering, have shown strong performance in low-level vision tasks. The nonlocal prior is extracted from each group consisting of patches with similar intensities. Grouping patches based on intensity similarity, however, gives rise to disturbance and inaccuracy in the estimation of the true images. To address this problem, we propose a structure-based low-rank model with graph nuclear norm regularization. We exploit the local manifold structure inside a patch and group the patches by the distance metric of manifold structure. With the manifold structure information, a graph nuclear norm regularization is established and incorporated into a low-rank approximation model. We then prove that the graph-based regularization is equivalent to a weighted nuclear norm and that the proposed model can be solved by a weighted singular-value thresholding algorithm. Extensive experiments on additive white Gaussian noise removal and mixed noise removal demonstrate that the proposed method achieves better performance than several state-of-the-art algorithms.
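The solver step that the equivalence above reduces to is weighted singular-value thresholding: shrink each singular value by its own weight, so that small weights preserve dominant structure while large weights suppress noise. A sketch with a common inverse-magnitude weighting (illustrative, not the paper's exact weights):

    import numpy as np

    def weighted_svt(M, weights):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        s_shrunk = np.maximum(s - weights, 0.0)     # per-singular-value shrinkage
        return (U * s_shrunk) @ Vt

    rng = np.random.default_rng(4)
    low_rank = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 50))
    noisy = low_rank + 0.5 * rng.standard_normal((50, 50))
    s = np.linalg.svd(noisy, compute_uv=False)
    weights = 5.0 / (s + 1e-6)                      # larger shrinkage on small SVs
    denoised = weighted_svt(noisy, weights)
    print(np.linalg.norm(denoised - low_rank) / np.linalg.norm(low_rank))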
A Neuro-Oncology Workstation for Structuring, Modeling, and Visualizing Patient Records
Hsu, William; Arnold, Corey W.; Taira, Ricky K.
2016-01-01
The patient medical record contains a wealth of information consisting of prior observations, interpretations, and interventions that need to be interpreted and applied towards decisions regarding current patient care. Given the time constraints and the large—often extraneous—amount of data available, clinicians are tasked with the challenge of performing a comprehensive review of how a disease progresses in individual patients. To facilitate this process, we demonstrate a neuro-oncology workstation that assists in structuring and visualizing medical data to promote an evidence-based approach for understanding a patient’s record. The workstation consists of three components: 1) a structuring tool that incorporates natural language processing to assist with the extraction of problems, findings, and attributes for structuring observations, events, and inferences stated within medical reports; 2) a data modeling tool that provides a comprehensive and consistent representation of concepts for the disease-specific domain; and 3) a visual workbench for visualizing, navigating, and querying the structured data to enable retrieval of relevant portions of the patient record. We discuss this workstation in the context of reviewing cases of glioblastoma multiforme patients. PMID:27583308
The application of SSADM to modelling the logical structure of proteins.
Saldanha, J; Eccles, J
1991-10-01
A logical design that describes the overall structure of proteins, together with a more detailed design describing secondary and some supersecondary structures, has been constructed using the computer-aided software engineering (CASE) tool, Auto-mate. Auto-mate embodies the philosophy of the Structured Systems Analysis and Design Method (SSADM) which enables the logical design of computer systems. Our design will facilitate the building of large information systems, such as databases and knowledgebases in the field of protein structure, by the derivation of system requirements from our logical model prior to producing the final physical system. In addition, the study has highlighted the ease of employing SSADM as a formalism in which to conduct the transferral of concepts from an expert into a design for a knowledge-based system that can be implemented on a computer (the knowledge-engineering exercise). It has been demonstrated how SSADM techniques may be extended for the purpose of modelling the constituent Prolog rules. This facilitates the integration of the logical system design model with the derived knowledge-based system.
Abiological origin of described stromatolites older than 3.2 Ga
NASA Technical Reports Server (NTRS)
Lowe, D. R.
1994-01-01
The three well-documented occurrences of three-dimensional stromatolites older than 3.2 Ga meet most criteria for biogenicity except the presence of fossil bacteria. However, they also show features more consistent with nonbiological origins. Small conical structures in the Strelley Pool chert in the upper part of the Warrawoona Group (3.5-3.2 Ga), Western Australia, lack the structure typical of stromatolites and probably formed mainly through evaporitic precipitation. A domal structure from the North Pole chert, Warrawoona Group, formed by soft-sediment deformation of originally flat layers. Laminated chert containing domal and pseudocolumnar structures in the Onverwacht Group (3.5-3.3 Ga), Barberton Greenstone Belt, South Africa, extends downward into veins and cavities, where it formed through inorganic precipitation. Although bacterial communities were widespread on Earth prior to 3.2 Ga, these particular three-dimensional structures are probably abiotic in origin and do not provide information on the paleobiology or paleoecology of early organisms. The paucity of Archean stromatolites older than 3.2 Ga probably reflects the paucity of known and possibly extant carbonate deposits of this age.
Chen, Jinsong; Zhang, Dake; Choi, Jaehwa
2015-12-01
It is common to encounter latent variables with ordinal data in social or behavioral research. Although a mediated effect of latent variables (latent mediated effect, or LME) with ordinal data may appear to be a straightforward combination of LME with continuous data and latent variables with ordinal data, the methodological challenges of combining the two are not trivial. This research covers model structures as complex as LME and formulates both point and interval estimates of LME for ordinal data using the Bayesian full-information approach. We also combine weighted least squares (WLS) estimation with the bias-corrected bootstrapping (BCB; Efron, Journal of the American Statistical Association, 82, 171-185, 1987) method or the traditional delta method as the limited-information approach. We evaluated the viability of these different approaches across various conditions through simulation studies, and provide an empirical example to illustrate the approaches. We found that the Bayesian approach with reasonably informative priors is preferred when both point and interval estimates are of interest and the sample size is 200 or above.
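The bias-corrected bootstrap (BCB) interval referenced above is mechanical to implement. The sketch below uses ordinary least-squares mediation in place of the WLS ordinal machinery, so it illustrates only the BCB step; the variable names and regression setup are assumptions:

```python
import numpy as np
from scipy.stats import norm

def ab_estimate(x, m, y):
    """Simple mediation: a from m ~ x, b from y ~ m + x; indirect effect a*b."""
    a = np.polyfit(x, m, 1)[0]
    Xmat = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(Xmat, y, rcond=None)[0][1]
    return a * b

def bc_bootstrap_ci(x, m, y, n_boot=2000, alpha=0.05, seed=0):
    """Bias-corrected percentile interval (Efron, 1987) for a*b."""
    rng = np.random.default_rng(seed)
    n = len(x)
    theta_hat = ab_estimate(x, m, y)
    boots = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)
        boots[i] = ab_estimate(x[idx], m[idx], y[idx])
    z0 = norm.ppf((boots < theta_hat).mean())        # bias-correction term
    lo = norm.cdf(2 * z0 + norm.ppf(alpha / 2))      # adjusted percentiles
    hi = norm.cdf(2 * z0 + norm.ppf(1 - alpha / 2))
    return theta_hat, np.quantile(boots, [lo, hi])
```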
Zipf’s word frequency law in natural language: A critical review and future directions
2014-01-01
The frequency distribution of words has been a key object of study in statistical linguistics for the past 70 years. This distribution approximately follows a simple mathematical form known as Zipf's law. This article first shows that human language has a highly complex, reliable structure in the frequency distribution over and above this classic law, although prior data visualization methods have obscured this fact. A number of empirical phenomena related to word frequencies are then reviewed. These facts are chosen to be informative about the mechanisms giving rise to Zipf's law and are then used to evaluate many of the theoretical explanations of Zipf's law in language. No prior account straightforwardly explains all the basic facts or is supported with independent evaluation of its underlying assumptions. To make progress at understanding why language obeys Zipf's law, studies must seek evidence beyond the law itself, testing assumptions and evaluating novel predictions with new, independent data. PMID:24664880
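For readers who want the baseline computation the review starts from, here is a minimal rank-frequency fit; note the review's own caution that naive log-log fits like this can obscure the finer structure it documents:

```python
import numpy as np
from collections import Counter

def zipf_fit(tokens):
    """Rank-frequency spectrum and a log-log slope estimate.
    Zipf's law predicts f(r) ~ r^(-alpha) with alpha near 1."""
    freqs = np.array(sorted(Counter(tokens).values(), reverse=True), float)
    ranks = np.arange(1, len(freqs) + 1, dtype=float)
    slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return ranks, freqs, -slope

# Toy corpus; real estimates require large corpora and care with the
# low-frequency tail, one of the points the review stresses.
text = "the cat sat on the mat and the dog sat on the log".split()
ranks, freqs, alpha = zipf_fit(text)
```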
Theories of Impaired Consciousness in Epilepsy
Yu, Lissa; Blumenfeld, Hal
2015-01-01
Although the precise mechanisms for control of consciousness are not fully understood, emerging data show that conscious information processing depends on the activation of certain networks in the brain and that the impairment of consciousness is related to abnormal activity in these systems. Epilepsy can lead to transient impairment of consciousness, providing a window into the mechanisms necessary for normal consciousness. Thus, despite differences in behavioral manifestations, cause, and electrophysiology, generalized tonic–clonic, absence, and partial seizures engage similar anatomical structures and pathways. We review prior concepts of impaired consciousness in epilepsy, focusing especially on temporal lobe complex partial seizures, which are a common and debilitating form of epileptic unconsciousness. We discuss a “network inhibition hypothesis” in which focal temporal lobe seizure activity disrupts normal cortical–subcortical interactions, leading to depressed neocortical function and impaired consciousness. This review of the major prior theories of impaired consciousness in epilepsy allows us to put more recent data into context and to reach a better understanding of the mechanisms important for normal consciousness. PMID:19351355
Observations of sea ice and icebergs in the western Barents Sea during the winter of 1987
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loeset, S.; Carstens, T.
1995-12-31
A multisensor ice data acquisition program for the western Barents Sea was carried out during three field campaigns in the mid winter and fall of 1987. The main purpose of the program was to obtain comprehensive information about the ice in the area at that time. The reasoning was that prior to any oil/gas exploration and production in the Barents Sea, the physical environment has to be quantitatively surveyed in order to ensure safe operations related to human safety, the regular operability and safety of the structure and protection of the environment. Prior to this field investigation program in 1987, data on sea ice and icebergs for engineering purposes for the western Barents Sea were meager. The present paper highlights some of the findings with emphasis on ice edge speeds, ice edge displacement and ice drift. For icebergs, the paper focuses on population, size distributions and geometric parameters.
Li, Jiansen; Song, Ying; Zhu, Zhen; Zhao, Jun
2017-05-01
Dual-dictionary learning (Dual-DL) utilizes both a low-resolution dictionary and a high-resolution dictionary, which are co-trained for sparse coding and image updating, respectively. It can effectively exploit a priori knowledge regarding the typical structures, specific features, and local details of the training set images, and this prior knowledge greatly improves reconstruction quality. The method has been successfully applied in magnetic resonance (MR) image reconstruction. However, it relies heavily on the training sets, and the dictionaries are fixed and nonadaptive. In this research, we improve Dual-DL by using self-adaptive dictionaries. The low- and high-resolution dictionaries are updated along with the image updating stage to ensure their self-adaptivity. The updated dictionaries directly incorporate prior information from both the training sets and the test image, improving their adaptability. Experimental results demonstrate that the proposed method can efficiently and significantly improve the quality and robustness of MR image reconstruction.
Proportion estimation using prior cluster purities
NASA Technical Reports Server (NTRS)
Terrell, G. R. (Principal Investigator)
1980-01-01
The prior distribution of CLASSY component purities is studied, and this information incorporated into maximum likelihood crop proportion estimators. The method is tested on Transition Year spring small grain segments.
Bayesian road safety analysis: incorporation of past evidence and effect of hyper-prior choice.
Miranda-Moreno, Luis F; Heydari, Shahram; Lord, Dominique; Fu, Liping
2013-09-01
This paper aims to address two related issues when applying hierarchical Bayesian models for road safety analysis, namely: (a) how to incorporate available information from previous studies or past experiences in the (hyper) prior distributions for model parameters and (b) what are the potential benefits of incorporating past evidence on the results of a road safety analysis when working with scarce accident data (i.e., when calibrating models with crash datasets characterized by a very low average number of accidents and a small number of sites). A simulation framework was developed to evaluate the performance of alternative hyper-priors, including informative and non-informative Gamma, Pareto, and Uniform distributions. Based on this simulation framework, different data scenarios (i.e., number of observations and years of data) were defined and tested using crash data collected at 3-legged rural intersections in California and crash data collected for rural 4-lane highway segments in Texas. This study shows how the accuracy of model parameter estimates (inverse dispersion parameter) is considerably improved by incorporating past evidence, in particular when working with a small number of observations and crash data with a low mean. The results also illustrate that when the sample size (more than 100 sites) and the number of years of crash data are relatively large, neither the incorporation of past experience nor the choice of the hyper-prior distribution materially affects the final results of a traffic safety analysis. As a potential solution to the problem of low sample mean and small sample size, this paper offers practical guidance on how to incorporate past evidence into informative hyper-priors. By combining evidence from past studies with the data available, model parameter estimates can be significantly improved. The choice of prior appears less important for hotspot identification. The results show the benefits of incorporating prior information when working with limited crash data in road safety studies. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
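The core effect, informative hyper-priors dominating when crash data are scarce and washing out when data are plentiful, can be seen even in a stripped-down conjugate model. The sketch below substitutes a simple Poisson-Gamma model for the paper's hierarchical setup; the rates, prior settings, and sample sizes are illustrative assumptions:

```python
import numpy as np

def gamma_posterior(counts, exposure, a0, b0):
    """Poisson rate with Gamma(a0, b0) prior (shape/rate):
    posterior is Gamma(a0 + sum(counts), b0 + sum(exposure))."""
    a = a0 + np.sum(counts)
    b = b0 + np.sum(exposure)
    return a, b, a / b   # posterior mean of the crash rate

rng = np.random.default_rng(0)
true_rate = 0.3                         # crashes per site-year (low mean)
for n_sites in (10, 500):
    y = rng.poisson(true_rate, size=n_sites)
    expo = np.ones(n_sites)
    _, _, m_inf = gamma_posterior(y, expo, a0=3.0, b0=10.0)  # informative, mean 0.3
    _, _, m_dif = gamma_posterior(y, expo, a0=0.1, b0=0.1)   # diffuse
    # With 10 sites the informative prior stabilizes the estimate;
    # with 500 sites the two posteriors nearly coincide.
    print(n_sites, round(m_inf, 3), round(m_dif, 3))
```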
Maximizing the information learned from finite data selects a simple model
NASA Astrophysics Data System (ADS)
Mattingly, Henry H.; Transtrum, Mark K.; Abbott, Michael C.; Machta, Benjamin B.
2018-02-01
We use the language of uninformative Bayesian prior choice to study the selection of appropriately simple effective models. We advocate for the prior which maximizes the mutual information between parameters and predictions, learning as much as possible from limited data. When many parameters are poorly constrained by the available data, we find that this prior puts weight only on boundaries of the parameter space. Thus, it selects a lower-dimensional effective theory in a principled way, ignoring irrelevant parameter directions. In the limit where there are sufficient data to tightly constrain any number of parameters, this reduces to the Jeffreys prior. However, we argue that this limit is pathological when applied to the hyperribbon parameter manifolds generic in science, because it leads to dramatic dependence on effects invisible to experiment.
NASA Astrophysics Data System (ADS)
McBride, S.; Tilley, E. N.; Johnston, D. M.; Becker, J.; Orchiston, C.
2015-12-01
This research evaluates the public earthquake education information produced prior to the Canterbury Earthquake sequence (2010-present) and draws communication lessons to recommend improvements for future campaigns of this type. The research takes the practitioner perspective of someone who worked on these campaigns in Canterbury before the earthquake sequence and who served as Public Information Manager Second in Command during the earthquake response in February 2011. Documents addressing seismic risk that were created prior to the earthquake sequence were analyzed for how closely they aligned with best-practice academic research, using a "best practice matrix" created by the researcher. Readability tests and word counts were also employed to help triangulate the data, as was practitioner involvement. The research also outlines lessons learned by practitioners and explores their experiences in creating these materials and how they perceive them now, given all that has happened since the booklets' inception. The findings showed these documents lacked many attributes of best practice. The overly long, jargon-filled text carried few positive outcome-expectancy messages and probably failed to persuade readers that earthquakes were a real threat in Canterbury. Paradoxically, the booklets may even have fostered fatalism among the publics who read them. While the intention was positive, namely for scientists to explain earthquakes, tsunami, landslides and other risks and so encourage the public to prepare for these events, the implementation could be greatly improved. The final component of the research highlights points of improvement for implementing more successful campaigns in future. The value of preparedness and science information campaigns lies not only in preparing the population but also in informing the development of crisis communication plans. Such plans are prepared in advance of a major emergency, and the symbiotic development of strategies, messages, themes and organizational structures in the preparedness stage can support successful crisis communication during an emergency.
Topical video object discovery from key frames by modeling word co-occurrence prior.
Zhao, Gangqiang; Yuan, Junsong; Hua, Gang; Yang, Jiong
2015-12-01
A topical video object refers to an object that is frequently highlighted in a video. It could be, e.g., the product logo or the leading actor/actress in a TV commercial. We propose a topic model that incorporates a word co-occurrence prior for efficient discovery of topical video objects from a set of key frames. Previous work using topic models, such as latent Dirichlet allocation (LDA), for video object discovery often takes a bag-of-visual-words representation, which ignores important co-occurrence information among the local features. We show that such data-driven, bottom-up co-occurrence information can conveniently be incorporated in LDA with a Gaussian Markov prior, which combines top-down probabilistic topic modeling with bottom-up priors in a unified model. Our experiments on challenging videos demonstrate that the proposed approach can discover different types of topical objects despite variations in scale, view-point, color and lighting changes, or even partial occlusions. The efficacy of the co-occurrence prior is clearly demonstrated when compared with topic models without such priors.
Gust prediction via artificial hair sensor array and neural network
NASA Astrophysics Data System (ADS)
Pankonien, Alexander M.; Thapa Magar, Kaman S.; Beblo, Richard V.; Reich, Gregory W.
2017-04-01
Gust Load Alleviation (GLA) is an important aspect of flight dynamics and control that reduces structural loadings and enhances ride quality. In conventional GLA systems, the structural response to aerodynamic excitation informs the control scheme. A phase lag, imposed by inertia, between the excitation and the measurement inherently limits the effectiveness of these systems. Hence, direct measurement of the aerodynamic loading can eliminate this lag, providing valuable information for effective GLA system design. Distributed arrays of Artificial Hair Sensors (AHS) are ideal for surface flow measurements that can be used to predict other necessary parameters such as aerodynamic forces, moments, and turbulence. In previous work, the spatially distributed surface flow velocities obtained from an array of artificial hair sensors using a Single-State (or feedforward) Neural Network were found to be effective in estimating steady aerodynamic parameters such as air speed, angle of attack, and lift and moment coefficients. This paper extends the investigation of the same configuration to unsteady force and moment estimation, which is important for active GLA control design. Implementing a Recurrent Neural Network that includes previous-timestep sensor information, the hair sensor array is shown to be capable of capturing gust disturbances with a wide range of periods, reducing predictive error in lift and moment by 68% and 52%, respectively. The L2 norms of the first layer of the weight matrices were compared, showing a 23% emphasis on prior versus current information. The Recurrent architecture also improves robustness, exhibiting only a 30% increase in predictive error when undertrained, compared to a 170% increase for the Single-State NN. This diverse, localized information can thus be directly implemented into a control scheme that alleviates gusts without waiting for a structural response or requiring user-intensive sensor calibration.
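The paper's key move, feeding previous-timestep sensor information into the estimator, can be illustrated without a neural network at all. The sketch below uses ridge regression with and without lagged hair-sensor features on synthetic data; it is a stand-in for the Recurrent Neural Network idea, not the authors' model, and every name in it is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_sensors = 2000, 8
X = rng.normal(size=(T, n_sensors))        # synthetic surface-flow readings
# Synthetic unsteady lift: depends on current AND previous-step flow state.
w_now = rng.normal(size=n_sensors)
w_prev = rng.normal(size=n_sensors)
lift = X @ w_now + np.vstack([X[:1], X[:-1]]) @ w_prev \
       + 0.1 * rng.normal(size=T)

def ridge_rmse(features, target, lam=1e-2):
    """Fit ridge regression in closed form and return training RMSE."""
    A = features.T @ features + lam * np.eye(features.shape[1])
    w = np.linalg.solve(A, features.T @ target)
    return np.sqrt(np.mean((target - features @ w) ** 2))

cur = X[1:]                                # current-only features
cur_prev = np.hstack([X[1:], X[:-1]])      # current + previous timestep
print("current only :", ridge_rmse(cur, lift[1:]))
print("with history :", ridge_rmse(cur_prev, lift[1:]))
```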
Greenwood, Daniel; Davids, Keith; Renshaw, Ian
2014-01-01
Coordination of dynamic interceptive movements is predicated on cyclical relations between an individual's actions and information sources from the performance environment. To identify dynamic informational constraints, which are interwoven with individual and task constraints, coaches' experiential knowledge provides a complementary source to support empirical understanding of performance in sport. In this study, 15 expert coaches from 3 sports (track and field, gymnastics and cricket) participated in a semi-structured interview process to identify potential informational constraints which they perceived to regulate action during run-up performance. Expert coaches' experiential knowledge revealed multiple information sources which may constrain performance adaptations in such locomotor pointing tasks. In addition to the locomotor pointing target, coaches' knowledge highlighted two other key informational constraints: vertical reference points located near the locomotor pointing target and a check mark located prior to the locomotor pointing target. This study highlights opportunities for broadening the understanding of perception and action coupling processes, and the identified information sources warrant further empirical investigation as potential constraints on athletic performance. Integration of experiential knowledge of expert coaches with theoretically driven empirical knowledge represents a promising avenue to drive future applied science research and pedagogical practice.
The Emergence of Knowledge and How it Supports the Memory for Novel Related Information.
Sommer, Tobias
2017-03-01
Current theories suggest that memories for novel information and events, over time and with repeated retrieval, lose the association to their initial learning context. They are consolidated into a more stable form and transformed into semantic knowledge, that is, semanticized. Novel, related information can then be rapidly integrated into such knowledge, leading to superior memory. We tested these hypotheses in a longitudinal, 302-day, human functional magnetic resonance imaging study in which participants first overlearned and consolidated associative structures. This phase was associated with a shift from hippocampal- to ventrolateral prefrontal cortex (vlPFC)-mediated retrieval, consistent with semanticization. Next, participants encoded novel, related information whose encoding into the already acquired knowledge was orchestrated by the ventromedial prefrontal cortex. Novel related information exhibited reduced forgetting compared with novel control information, which corresponded to a faster shift from hippocampal- to vlPFC-mediated retrieval. In sum, the current results suggest that memory for novel information can be enhanced by anchoring it to prior knowledge via acceleration of the processes observed during semanticization. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Implicit Coordination Strategies for Effective Team Communication.
Butchibabu, Abhizna; Sparano-Huiban, Christopher; Sonenberg, Liz; Shah, Julie
2016-06-01
We investigated implicit communication strategies for anticipatory information sharing during team performance of tasks with varying degrees of complexity. We compared the strategies used by teams with the highest level of performance to those used by the lowest-performing teams to evaluate the frequency and methods of communications used as a function of task structure. High-performing teams share information by anticipating the needs of their teammates rather than explicitly requesting the exchange of information. As the complexity of a task increases to involve more interdependence among teammates, the impact of coordination on team performance also increases. This observation motivated us to conduct a study of anticipatory information sharing as a function of task complexity. We conducted an experiment in which 13 teams of four people performed collaborative search-and-deliver tasks with varying degrees of complexity in a simulation environment. We elaborated upon prior characterizations of communication as implicit versus explicit by dividing implicit communication into two subtypes: (a) deliberative/goal information and (b) reactive status updates. We then characterized relationships between task structure, implicit communication, and team performance. We found that the five teams with the fastest task completion times and lowest idle times exhibited higher rates of deliberative communication versus reactive communication during high-complexity tasks compared with the five teams with the slowest completion times and longest idle times (p = .039). Teams in which members proactively communicated information about their next goal to teammates exhibited improved team performance. The findings from our work can inform the design of communication strategies for team training to improve performance of complex tasks. © 2016, Human Factors and Ergonomics Society.
Zengel, Bettina; Ambler, James K; McCarthy, Randy J; Skowronski, John J
2017-01-01
This article reports results from a study in which participants encountered either (a) previously known informants who were positive (e.g. Abraham Lincoln), neutral (e.g., Jay Leno), or negative (e.g., Adolf Hitler), or (b) previously unknown informants. The informants ostensibly described either a trait-implicative positive behavior, a trait-implicative negative behavior, or a neutral behavior. These descriptions were framed as either the behavior of the informant or the behavior of another person. Results yielded evidence of informant-trait linkages for both self-informants and for informants who described another person. These effects were not moderated by informant type, behavior valence, or the congruency or incongruency between the prior knowledge of the informant and the behavior valence. Results are discussed in terms of theories of Spontaneous Trait Inference and Spontaneous Trait Transference.
An investigation of multitasking information behavior and the influence of working memory and flow
NASA Astrophysics Data System (ADS)
Alexopoulou, Peggy; Hepworth, Mark; Morris, Anne
2015-02-01
This study explored the multitasking information behaviour of Web users and how it is influenced by working memory, flow, and the Personal, Artefact and Task characteristics described in the PAT model. The research was exploratory, using a pragmatic, mixed-method approach. Thirty university students participated: 10 psychologists, 10 accountants and 10 mechanical engineers. The data collection tools were: pre and post questionnaires, a working memory test, a flow state scale test, audio-visual data, web search logs, think-aloud data, observation, and the critical decision method. All participants searched the Web for information on four topics: two for which they had prior knowledge and two without. Perception of task complexity was found to be related to working memory: people with low working memory reported a significant increase in task complexity after completing search tasks for which they had no prior knowledge, but not for tasks with prior knowledge. Regarding flow and task complexity, the results confirmed the suggestion of the PAT model (Finneran and Zhang, 2003) that a complex task can lead to anxiety and low flow levels as well as to perceived challenge and high flow levels. However, the results did not confirm the PAT model's suggestion regarding the characteristics of web search systems, especially perceived vividness: all participants experienced high vividness, whereas according to the PAT model only people with high flow should experience high levels of vividness. Flow affected the degree of change in participants' knowledge: people with high flow gained more knowledge on tasks without prior knowledge than people with low flow. Furthermore, accountants felt that tasks without prior knowledge were less complex at the end of the web-searching procedure than psychologists and mechanical engineers did. Finally, the three disciplines appeared to differ in multitasking information behaviour characteristics such as queries, web search sessions and opened tabs/windows.
Verb bias and verb-specific competition effects on sentence production
Thothathiri, Malathi; Evans, Daniel G.; Poudel, Sonali
2017-01-01
How do speakers choose between structural options for expressing a given meaning? Overall preference for some structures over others as well as prior statistical association between specific verbs and sentence structures (“verb bias”) are known to broadly influence language use. However, the effects of prior statistical experience on the planning and execution of utterances and the mechanisms that facilitate structural choice for verbs with different biases have not been fully explored. In this study, we manipulated verb bias for English double-object (DO) and prepositional-object (PO) dative structures: some verbs appeared solely in the DO structure (DO-only), others solely in PO (PO-only) and yet others equally in both (Equi). Structural choices during subsequent free-choice sentence production revealed the expected dispreference for DO overall but critically also a reliable linear trend in DO production that was consistent with verb bias (DO-only > Equi > PO-only). Going beyond the general verb bias effect, three results suggested that Equi verbs, which were associated equally with the two structures, engendered verb-specific competition and required additional resources for choosing the dispreferred DO structure. First, DO production with Equi verbs but not the other verbs correlated with participants’ inhibition ability. Second, utterance duration prior to the choice of a DO structure showed a quadratic trend (DO-only < Equi > PO-only) with the longest durations for Equi verbs. Third, eye movements consistent with reimagining the event also showed a quadratic trend (DO-only < Equi > PO-only) prior to choosing DO, suggesting that participants used such recall particularly for Equi verbs. Together, these analyses of structural choices, utterance durations, eye movements and individual differences in executive functions shed light on the effects of verb bias and verb-specific competition on sentence production and the role of different executive functions in choosing between sentence structures. PMID:28672009
Steel, Amie; Adams, Jon
2011-06-01
The approach of evidence-based medicine (EBM), providing a paradigm to validate information sources and a process for critiquing their value, is an important platform for guiding practice. Researchers have explored the application and value of information sources in clinical practice for a range of health professions; however, naturopathic practice has been overlooked. An exploratory study of naturopaths' perspectives on the application and value of information sources was undertaken. Semi-structured interviews were conducted with 12 naturopaths in current clinical practice concerning the information sources they use and their perceptions of these sources. Thematic analysis identified differences in the application of the various information sources depending upon their perceived validity. Internet databases were viewed as highly valid. Textbooks, formal education and interpersonal interactions were judged on a variety of factors, whilst validation of general internet sites and manufacturers' information was required prior to use. The findings of this study will provide preliminary aid to those responsible for supporting naturopaths' information use and access. In particular, it may assist publishers, medical librarians and professional associations in developing strategies to expand the clinically useful information sources available to naturopaths. © 2011 The authors. Health Information and Libraries Journal © 2011 Health Libraries Group.
Lee, Jung Yeon; Brook, Judith S; Pahl, Kerstin; Brook, David W
2017-11-01
A quarter of people living with human immunodeficiency virus (HIV) infection in the United States are women. Furthermore, African American and Hispanic/Latina women continue to be disproportionately affected by HIV, compared with women of other races/ethnicities. Cannabis use prior to intercourse may be associated with increased risky sexual behaviors which are highly related to HIV. The ultimate goal of this research is to better understand the relationships between unconventional personal attributes (e.g., risk-taking behaviors) in the late 20s, substance use (e.g., alcohol) in the mid 30s, and cannabis use prior to intercourse in the late 30s using a community sample; such an understanding may inform interventions. This study employing data from the Harlem Longitudinal Development Study includes 343 female participants (50% African Americans, 50% Puerto Ricans). Structural equation modeling indicated that unconventional personal attributes in the late 20s were associated with substance use in the mid 30s (β=0.32, p<0.001), which in turn, was associated with cannabis use prior to sexual intercourse in the late 30s (β=0.64, p<0.001). Unconventional personal attributes in the late 20s were also directly related to cannabis use prior to sexual intercourse in the late 30s (β=0.39, p<0.01). The findings of this study suggest that interventions focused on decreasing unconventional personal attributes as well as substance use may reduce sexual risk behaviors among urban African American and Puerto Rican women. Also, the implications of this study for health care providers and researchers working in HIV prevention are that these precursors may be useful as patient screening tools. Copyright © 2017 Elsevier Ltd. All rights reserved.
Robust Measurement via A Fused Latent and Graphical Item Response Theory Model.
Chen, Yunxiao; Li, Xiaoou; Liu, Jingchen; Ying, Zhiliang
2018-03-12
Item response theory (IRT) plays an important role in psychological and educational measurement. Unlike classical test theory, IRT models aggregate item-level information, yielding more accurate measurements. Most IRT models assume local independence, an assumption not likely to be satisfied in practice, especially when the number of items is large. Results in the literature and simulation studies in this paper reveal that misspecifying the local independence assumption may result in inaccurate measurements and differential item functioning. To provide more robust measurements, we propose an integrated approach that adds a graphical component to a multidimensional IRT model to offset the effect of unknown local dependence. The new model contains a confirmatory latent variable component, which measures the targeted latent traits, and a graphical component, which captures the local dependence. An efficient proximal algorithm is proposed for parameter estimation and structure learning of the local dependence. This approach can substantially improve measurement when no prior information on the local dependence structure is available. The model can be applied to measure both unidimensional and multidimensional latent traits.
Determining building interior structures using compressive sensing
NASA Astrophysics Data System (ADS)
Lagunas, Eva; Amin, Moeness G.; Ahmad, Fauzia; Nájar, Montse
2013-04-01
We consider imaging of the building interior structures using compressive sensing (CS) with applications to through-the-wall imaging and urban sensing. We consider a monostatic synthetic aperture radar imaging system employing stepped frequency waveform. The proposed approach exploits prior information of building construction practices to form an appropriate sparse representation of the building interior layout. We devise a dictionary of possible wall locations, which is consistent with the fact that interior walls are typically parallel or perpendicular to the front wall. The dictionary accounts for the dominant normal angle reflections from exterior and interior walls for the monostatic imaging system. CS is applied to a reduced set of observations to recover the true positions of the walls. Additional information about interior walls can be obtained using a dictionary of possible corner reflectors, which is the response of the junction of two walls. Supporting results based on simulation and laboratory experiments are provided. It is shown that the proposed sparsifying basis outperforms the conventional through-the-wall CS model, the wavelet sparsifying basis, and the block sparse model for building interior layout detection.
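A toy version of the sparse-recovery idea above: build a dictionary whose columns are stepped-frequency responses of candidate wall standoff ranges, take a reduced random subset of frequency measurements, and recover the occupied wall positions greedily. The geometry, the specular point-response model, and the use of orthogonal matching pursuit are illustrative assumptions, not the paper's exact solver:

```python
import numpy as np

c = 3e8
freqs = np.linspace(1e9, 2e9, 201)            # stepped-frequency grid (Hz)
ranges = np.arange(1.0, 10.0, 0.05)           # candidate wall standoffs (m)
# Column for a wall at range r: exp(-j*4*pi*f*r/c), the monostatic
# two-way propagation delay across the frequency sweep.
D = np.exp(-1j * 4 * np.pi * np.outer(freqs, ranges) / c)

rng = np.random.default_rng(0)
true_idx = [20, 100, 140]                     # front wall + two interior walls
x_true = np.zeros(len(ranges), complex)
x_true[true_idx] = [1.0, 0.6, 0.5]
keep = rng.choice(len(freqs), size=40, replace=False)   # reduced observations
y = D[keep] @ x_true + 0.01 * (rng.normal(size=40) + 1j * rng.normal(size=40))
A = D[keep]

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select k dictionary atoms."""
    resid, support, coef = y.copy(), [], None
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.conj().T @ resid))))
        sub = A[:, support]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
        resid = y - sub @ coef
    return support, coef

support, coef = omp(A, y, k=3)
print(sorted(ranges[support]))                # recovered wall positions (m)
```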
Martignac, Marion; Balayssac, Stéphane; Gilard, Véronique; Benoit-Marquié, Florence
2015-06-18
We have investigated the removal of bortezomib, an anticancer drug prescribed in multiple myeloma, using the photochemical advanced oxidation process of V-UV/UV (185/254 nm). We used two complementary analytical techniques to follow the removal rate of bortezomib. Nuclear magnetic resonance (NMR) is a nonselective method requiring no prior knowledge of the structures of the byproducts and permits us to provide a spectral signature (fingerprinting approach). This untargeted method provides clues to the molecular structure changes and information on the degradation of the parent drug during the irradiation process. This holistic NMR approach could provide information for monitoring aromaticity evolution. We use liquid chromatography, coupled with high-resolution mass spectrometry (LC-MS), to correlate results obtained by (1)H NMR and for accurate identification of the byproducts, in order to understand the mechanistic degradation pathways of bortezomib. The results show that primary byproducts come from photoassisted deboronation of bortezomib at 254 nm. A secondary byproduct of pyrazinecarboxamide was also identified. We obtained a reliable correlation between these two analytical techniques.
Model verification of large structural systems
NASA Technical Reports Server (NTRS)
Lee, L. T.; Hasselman, T. K.
1977-01-01
A methodology was formulated, and a general computer code implemented for processing sinusoidal vibration test data to simultaneously make adjustments to a prior mathematical model of a large structural system, and resolve measured response data to obtain a set of orthogonal modes representative of the test model. The derivation of estimator equations is shown along with example problems. A method for improving the prior analytic model is included.
Guided transect sampling - a new design combining prior information and field surveying
Anna Ringvall; Goran Stahl; Tomas Lamas
2000-01-01
Guided transect sampling is a two-stage sampling design in which prior information is used to guide the field survey in the second stage. In the first stage, broad strips are randomly selected and divided into grid-cells. For each cell a covariate value is estimated from remote sensing data, for example. The covariate is the basis for subsampling of a transect through...
NASA Astrophysics Data System (ADS)
Qu, W.; Bogena, H. R.; Huisman, J. A.; Martinez, G.; Pachepsky, Y. A.; Vereecken, H.
2013-12-01
Soil water content (SWC) is a key variable in the soil-vegetation-atmosphere continuum, with high spatial and temporal variability. Temporal stability of SWC has been observed in multiple monitoring studies, and quantifying the controls on soil moisture variability and temporal stability is of substantial interest. The objective of this work was to assess the effect of soil hydraulic parameters on temporal stability. Inverse modeling based on long observed SWC time series from an in-situ sensor network was used to estimate the van Genuchten-Mualem (VGM) soil hydraulic parameters in a small grassland catchment located in western Germany. For the inverse modeling, the shuffled complex evolution (SCE) optimization algorithm was coupled with the HYDRUS 1D code. We considered two cases: without and with prior information about the correlation between VGM parameters. The temporal stability of observed SWC was well pronounced at all observation depths. Both the spatial variability of SWC and the robustness of temporal stability increased with depth. Models calibrated both with and without prior information provided reasonable correspondence between simulated and measured SWC time series. Furthermore, we found a linear relationship between the mean relative difference (MRD) of SWC and the saturated SWC (θs). The logarithm of saturated hydraulic conductivity (Ks), the VGM parameter n and the logarithm of α were also strongly correlated with the MRD of saturation degree in the prior-information case, but no correlation was found in the non-prior-information case except at the 50 cm depth. Based on these results, we propose that establishing relationships between temporal stability and the spatial variability of soil properties is a promising research avenue for better understanding the controls on soil moisture variability. [Figure caption: Correlation between the mean relative difference of soil water content (or saturation degree) and inversely estimated soil hydraulic parameters (log10(Ks), log10(α), n, and θs) at 5 cm, 20 cm and 50 cm depths; solid circles denote parameters estimated with prior information, open circles without.]
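The mean relative difference statistic used above is straightforward to compute from a sensor-network record. A minimal sketch, assuming a times-by-locations array (in the style of the classic Vachaud et al. temporal-stability analysis):

```python
import numpy as np

def mean_relative_difference(swc):
    """Temporal-stability statistics for SWC of shape (n_times, n_locations).
    delta_ij = (theta_ij - spatial_mean_i) / spatial_mean_i;
    MRD_j = time mean of delta_ij; its std over time measures how stably
    location j sits above or below the field average."""
    spatial_mean = swc.mean(axis=1, keepdims=True)   # field mean at each time
    rel_diff = (swc - spatial_mean) / spatial_mean
    return rel_diff.mean(axis=0), rel_diff.std(axis=0)

# Usage: mrd, sdrd = mean_relative_difference(swc_array)
# Ranking locations by mrd reproduces the familiar temporal-stability plot.
```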
Yang, Liang; Jin, Di; He, Dongxiao; Fu, Huazhu; Cao, Xiaochun; Fogelman-Soulie, Francoise
2017-03-29
Due to the importance of community structure in understanding networks and a surge of interest in community detectability, how to improve community identification performance with pairwise prior information has become a hot topic. However, most existing semi-supervised community detection algorithms focus only on improving accuracy and ignore the impact of priors on speeding up detection. They also require tuning additional parameters and cannot guarantee that pairwise constraints are satisfied. To address these drawbacks, we propose a general, high-speed, effective and parameter-free semi-supervised community detection framework. By constructing indivisible super-nodes from the connected subgraphs of the must-link constraints and by forming weighted super-edges based on network topology and cannot-link constraints, our framework transforms the original network into an equivalent but much smaller super-network. The super-network perfectly enforces the must-link constraints and effectively encodes the cannot-link constraints. Furthermore, the time complexity of the super-network construction is linear in the original network size, which makes it efficient. Since the constructed super-network is much smaller than the original one, any existing community detection algorithm runs much faster within our framework. Moreover, the overall process introduces no additional parameters, making it more practical.
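The super-network construction can be sketched directly: must-link connected components become super-nodes, and original edges are aggregated into weighted super-edges. The cannot-link handling below (a simple weight penalty) is an illustrative stand-in for the paper's encoding, and the function names are hypothetical:

```python
import networkx as nx

def build_super_network(G, must_links, cannot_links, penalty=10.0):
    """Collapse must-link connected components into super-nodes; super-edge
    weights sum the original inter-component edge weights; cannot-link
    pairs are discouraged by down-weighting their super-edge."""
    ML = nx.Graph()
    ML.add_nodes_from(G)
    ML.add_edges_from(must_links)
    comp_of = {}
    for cid, comp in enumerate(nx.connected_components(ML)):
        for v in comp:
            comp_of[v] = cid          # every node maps to its super-node id
    S = nx.Graph()
    S.add_nodes_from(set(comp_of.values()))
    for u, v, data in G.edges(data=True):
        cu, cv = comp_of[u], comp_of[v]
        if cu == cv:
            continue                  # internal to a super-node: absorbed
        w = data.get("weight", 1.0)
        prev = S.get_edge_data(cu, cv, {"weight": 0.0})["weight"]
        S.add_edge(cu, cv, weight=prev + w)
    for u, v in cannot_links:
        cu, cv = comp_of[u], comp_of[v]
        if cu != cv and S.has_edge(cu, cv):
            S[cu][cv]["weight"] -= penalty   # discourage merging these
    return S, comp_of

# Any off-the-shelf community detection algorithm can then be run on S,
# and labels mapped back to the original nodes through comp_of.
```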
Schema-driven facilitation of new hierarchy learning in the transitive inference paradigm
Kumaran, Dharshan
2013-01-01
Prior knowledge, in the form of a mental schema or framework, is thought to facilitate the learning of new information in a range of experimental and everyday scenarios. Despite rising interest in the cognitive and neural mechanisms underlying schema-driven facilitation of new learning, few paradigms have been developed to examine this issue in humans. Here we develop a multiphase experimental scenario aimed at characterizing schema-based effects in the context of a paradigm that has been very widely used across species, the transitive inference task. We show that an associative schema, comprising prior knowledge of the rank positions of familiar items in the hierarchy, has a marked effect on transitivity performance and the development of relational knowledge of the hierarchy that cannot be accounted for by more general changes in task strategy. Further, we show that participants are capable of deploying prior knowledge to successful effect under surprising conditions (i.e., when corrective feedback is totally absent), but only when the associative schema is robust. Finally, our results provide insights into the cognitive mechanisms underlying such schema-driven effects, and suggest that new hierarchy learning in the transitive inference task can occur through a contextual transfer mechanism that exploits the structure of associative experiences. PMID:23782509
Zhang, Yawei; Holford, Theodore R; Leaderer, Brian; Zahm, Shelia Hoar; Boyle, Peter; Morton, Lindsay McOmber; Zhang, Bing; Zou, Kaiyong; Flynn, Stuart; Tallini, Giovanni; Owens, Patricia H; Zheng, Tongzhang
2004-05-01
To further investigate the role of prior medical conditions and medication use in the etiology of non-Hodgkin lymphoma (NHL), we analyzed data from a population-based case-control study of NHL in Connecticut women. A total of 601 histologically confirmed incident cases of NHL and 717 population-based controls were included in this study. In-person interviews were administered using standardized, structured questionnaires to collect information on medical conditions and medication use. An increased risk was found among women who had a history of autoimmune disorders (such as rheumatoid arthritis, lupus erythematosus, Sjogren's syndrome, and multiple sclerosis), anemia, eczema, or psoriasis. An increased risk was also observed among women who had used steroidal anti-inflammatory drugs and tranquilizers. A reduced risk was found for women who had had scarlet fever or who had used estrogen replacement therapy, aspirin, medications for non-insulin-dependent diabetes, HMG-CoA reductase inhibitors, or beta-adrenergic blocking agents. Risk associated with past medical history appeared to vary by NHL subtype, but these results were based on small numbers of exposed subjects. A relationship between certain prior medical conditions and medication use and the risk of NHL was observed in this study. Further studies are warranted to confirm our findings.
NASA Astrophysics Data System (ADS)
Nawani, Jigna; Rixius, Julia; Neuhaus, Birgit J.
2016-08-01
Empirical analysis of secondary biology classrooms revealed that, on average, 68% of teaching time in Germany revolved around processing tasks. Quality of instruction can thus be assessed by analyzing the quality of tasks used in classroom discourse. This quasi-experimental study analyzed how teachers used tasks in 38 videotaped biology lessons on the topic 'blood and circulatory system'. Two fundamental characteristics were used to analyze the tasks: (1) the required cognitive level of processing (e.g., low-level information processing: repetition, summarizing, defining, classifying; high-level information processing: interpreting and analyzing data, formulating hypotheses) and (2) the complexity of the task content (e.g., whether tasks require use of factual, linking or concept-level content). Additionally, students' cognitive knowledge structure about the topic 'blood and circulatory system' was measured using student-drawn concept maps (N = 970 students). Finally, linear multilevel models were created with high-level cognitive processing tasks and higher content complexity tasks as class-level predictors and students' prior knowledge, students' interest in biology, and students' interest in biology activities as control covariates. Results showed a positive influence of high-level cognitive processing tasks (β = 0.07; p < .01) on students' cognitive knowledge structure. However, there was no observed effect of higher content complexity tasks on students' cognitive knowledge structure. The presented findings encourage the use of high-level cognitive processing tasks in biology instruction.
Segmentation of knee MRI using structure enhanced local phase filtering
NASA Astrophysics Data System (ADS)
Lim, Mikhiel; Hacihaliloglu, Ilker
2016-03-01
The segmentation of bone surfaces from magnetic resonance imaging (MRI) data has applications in the quantitative measurement of knee osteoarthritis, surgery planning for patient-specific total knee arthroplasty and the subsequent fabrication of artificial implants. However, due to problems associated with MRI such as low contrast between bone and surrounding tissues, noise, bias fields, and the partial volume effect, segmentation of bone surfaces remains challenging. In this paper, a new framework is presented for the enhancement of knee MRI scans prior to segmentation in order to obtain high-contrast bone images. In the first stage, a new contrast-enhanced relative total variation (RTV) regularization method is used to remove textural noise from the bone structures and the surrounding soft tissue interface. This salient bone edge information is further enhanced using a sparse gradient counting method based on L0 gradient minimization, which globally controls how many non-zero gradients result, in order to approximate prominent bone structures in a structure- and sparsity-aware manner. The last stage of the framework incorporates local phase bone boundary information to provide an intensity-invariant enhancement of contrast between bone and the surrounding soft tissue. The enhanced images are segmented using a fast random walker algorithm. Validation against expert segmentation was performed on 10 clinical knee MRI images, achieving a mean Dice similarity coefficient (DSC) of 0.975.
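The L0 gradient minimization stage referenced above is typically solved by half-quadratic splitting with closed-form FFT updates (in the style of Xu et al., 2011). A minimal grayscale sketch under that assumption, not the paper's full contrast-enhanced RTV pipeline:

```python
import numpy as np

def l0_smooth(img, lam=0.02, kappa=2.0, beta_max=1e5):
    """L0 gradient minimization for a grayscale image in [0, 1]:
    minimize |S - I|^2 + lam * #(nonzero gradients of S)."""
    S = img.astype(np.float64)
    H, W = S.shape
    # Frequency responses of the circular forward-difference operators,
    # built from their impulse responses so signs stay consistent.
    delta = np.zeros((H, W)); delta[0, 0] = 1.0
    Fx = np.fft.fft2(np.roll(delta, -1, axis=1) - delta)
    Fy = np.fft.fft2(np.roll(delta, -1, axis=0) - delta)
    denom_grad = np.abs(Fx) ** 2 + np.abs(Fy) ** 2
    FI = np.fft.fft2(S)
    beta = 2.0 * lam
    while beta < beta_max:
        # (h, v) subproblem: hard-threshold gradients (closed form).
        h = np.roll(S, -1, axis=1) - S
        v = np.roll(S, -1, axis=0) - S
        mask = (h ** 2 + v ** 2) <= lam / beta
        h[mask] = 0.0
        v[mask] = 0.0
        # S subproblem: quadratic, solved exactly in the Fourier domain.
        num = FI + beta * (np.conj(Fx) * np.fft.fft2(h)
                           + np.conj(Fy) * np.fft.fft2(v))
        S = np.real(np.fft.ifft2(num / (1.0 + beta * denom_grad)))
        beta *= kappa        # anneal the splitting penalty
    return S
```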
Building qualitative study design using nursing's disciplinary epistemology.
Thorne, Sally; Stephens, Jennifer; Truant, Tracy
2016-02-01
To discuss the implications of drawing on core nursing knowledge as theoretical scaffolding for qualitative nursing enquiry. Although nurse scholars have been using qualitative methods for decades, much of their methodological direction derives from conventional approaches developed for answering questions in the social sciences. The quality of available knowledge to inform practice can be enhanced through the selection of study design options informed by an appreciation for the nature of nursing knowledge. Discussion paper. Drawing on the body of extant literature dealing with nursing's theoretical and qualitative research traditions, we consider contextual factors that have shaped the application of qualitative research approaches in nursing, including prior attempts to align method with the structure and form of disciplinary knowledge. On this basis, we critically reflect on design considerations that would follow logically from core features associated with a nursing epistemology. The substantive knowledge used by nurses to inform their practice includes both aspects developed at the level of the general and also that which pertains to application in the unique context of the particular. It must be contextually relevant to a fluid and dynamic healthcare environment and adaptable to distinctive patient conditions. Finally, it must align with nursing's moral mandate and action imperative. Qualitative research design components informed by nursing's disciplinary epistemology will help ensure a logical line of reasoning in our enquiries that remains true to the nature and structure of practice knowledge. © 2015 John Wiley & Sons Ltd.
Gu, Hairong; Kim, Woojae; Hou, Fang; Lesmes, Luis Andres; Pitt, Mark A; Lu, Zhong-Lin; Myung, Jay I
2016-01-01
Measurement efficiency is of concern when a large number of observations are required to obtain reliable estimates for parametric models of vision. The standard entropy-based Bayesian adaptive testing procedures addressed the issue by selecting the most informative stimulus in sequential experimental trials. Noninformative, diffuse priors were commonly used in those tests. Hierarchical adaptive design optimization (HADO; Kim, Pitt, Lu, Steyvers, & Myung, 2014) further improves the efficiency of the standard Bayesian adaptive testing procedures by constructing an informative prior using data from observers who have already participated in the experiment. The present study represents an empirical validation of HADO in estimating the human contrast sensitivity function. The results show that HADO significantly improves the accuracy and precision of parameter estimates, and therefore requires many fewer observations to obtain reliable inference about contrast sensitivity, compared to the method of quick contrast sensitivity function (Lesmes, Lu, Baek, & Albright, 2010), which uses the standard Bayesian procedure. The improvement with HADO was maintained even when the prior was constructed from heterogeneous populations or a relatively small number of observers. These results of this case study support the conclusion that HADO can be used in Bayesian adaptive testing by replacing noninformative, diffuse priors with statistically justified informative priors without introducing unwanted bias.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-26
... constitutes prior art increase the need to have accurate and up-to-date ownership information about patent..., section 102(b)(2)(C) exempts as prior art those patent applications or issued patents that name different... issued patent may prevent its use as prior art against a later-filed patent application, patentability...
ERIC Educational Resources Information Center
Happ, Roland; Förster, Manuel; Zlatkin-Troitschanskaia, Olga; Carstensen, Vivian
2016-01-01
Study-related prior knowledge plays a decisive role in business and economics degree courses. Prior knowledge has a significant influence on knowledge acquisition in higher education, and teachers need information on it to plan their introductory courses accordingly. Very few studies have been conducted of first-year students' prior economic…
ERIC Educational Resources Information Center
Popova-Gonci, Viktoria; Lamb, Monica C.
2012-01-01
Prior learning assessment (PLA) students enter academia with different types of concepts--some of them have been formally accepted and labeled by academia and others are informally formulated by students via independent and/or experiential learning. The critical goal of PLA practices is to assess an intricate combination of prior learning…
An algorithmic approach to crustal deformation analysis
NASA Technical Reports Server (NTRS)
Iz, Huseyin Baki
1987-01-01
In recent years the analysis of crustal deformation measurements has become important as a result of current improvements in geodetic methods and an increasing amount of theoretical and observational data provided by several earth sciences. A first-generation data analysis algorithm that combines a priori information with current geodetic measurements was proposed. Relevant methods that can be used in the algorithm were discussed. Prior information is the unifying feature of this algorithm. Some of the problems which may arise through the use of a priori information in the analysis were indicated and preventive measures were demonstrated. The first step in the algorithm is the optimal design of deformation networks. The second step in the algorithm identifies the descriptive model of the deformation field. The final step in the algorithm is the improved estimation of deformation parameters. Although deformation parameters are estimated in the process of model discrimination, they can further be improved by the use of a priori information about them. According to the proposed algorithm, this information must first be tested against the estimates calculated using the sample data only. Null-hypothesis testing procedures were developed for this purpose. Six different estimators that employ a priori information were examined. Emphasis was placed on the case in which the prior information is wrong, and analytical expressions for possible improvements under incompatible prior information were derived.
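The compatibility-testing step lends itself to a simple illustration. The sketch below is only a schematic of the general idea, not the algorithm's actual test statistic or estimators: a prior value for a deformation parameter is checked against the sample estimate with a chi-square statistic, and the two are combined by inverse-variance weighting only if they pass the test.

```python
# Minimal sketch: test whether prior information about a deformation
# parameter is compatible with the sample estimate before combining them.
# The statistic, threshold, and numbers are illustrative assumptions.
import numpy as np
from scipy.stats import chi2

def compatible(x_prior, var_prior, x_sample, var_sample, alpha=0.05):
    """Chi-square test of H0: the prior value and sample estimate agree."""
    t = (x_prior - x_sample) ** 2 / (var_prior + var_sample)
    return t <= chi2.ppf(1 - alpha, df=1)

def combine(x_prior, var_prior, x_sample, var_sample):
    """Inverse-variance weighted estimate, appropriate when H0 holds."""
    w_p, w_s = 1.0 / var_prior, 1.0 / var_sample
    return (w_p * x_prior + w_s * x_sample) / (w_p + w_s)

# Example: a hypothetical strain-rate parameter; fall back to the sample
# estimate alone when the prior information is judged incompatible.
x_p, v_p = 0.12, 0.04 ** 2
x_s, v_s = 0.10, 0.03 ** 2
estimate = combine(x_p, v_p, x_s, v_s) if compatible(x_p, v_p, x_s, v_s) else x_s
```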
A prior feature SVM – MRF based method for mouse brain segmentation
Wu, Teresa; Bae, Min Hyeok; Zhang, Min; Pan, Rong; Badea, Alexandra
2012-01-01
We introduce an automated method, called prior feature Support Vector Machine-Markov Random Field (pSVMRF), to segment three-dimensional mouse brain Magnetic Resonance Microscopy (MRM) images. Our earlier work, extended MRF (eMRF), integrated Support Vector Machine (SVM) and Markov Random Field (MRF) approaches, leading to improved segmentation accuracy; however, the computation of eMRF is very expensive, which may limit its performance on segmentation and robustness. In this study pSVMRF reduces training and testing time for SVM, while boosting segmentation performance. Unlike the eMRF approach, where MR intensity information and location priors are linearly combined, pSVMRF combines this information in a nonlinear fashion, and enhances the discriminative ability of the algorithm. We validate the proposed method using MR imaging of unstained and actively stained mouse brain specimens, and compare segmentation accuracy with two existing methods: eMRF and MRF. C57BL/6 mice are used for training and testing, using cross validation. For formalin-fixed C57BL/6 specimens, pSVMRF outperforms both eMRF and MRF. The segmentation accuracy for C57BL/6 brains, stained or not, was similar for larger structures like the hippocampus and caudate putamen (~87%), but increased substantially for smaller regions like the substantia nigra (from 78.36% to 91.55%) and the anterior commissure (from ~50% to ~80%). To test segmentation robustness against increased anatomical variability we add two strains, BXD29 and a transgenic mouse model of Alzheimer's disease. Segmentation accuracy for the new strains is 80% for the hippocampus and caudate putamen, indicating that pSVMRF is a promising approach for phenotyping mouse models of human brain disorders. PMID:21988893
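For readers unfamiliar with how classifier outputs and MRF priors interact, the sketch below shows one generic way to couple them: per-voxel class probabilities serve as unary potentials, a Potts term encourages label smoothness, and iterated conditional modes (ICM) minimizes the energy. This is only a schematic baseline on a 2D slice; the energy terms, weights, and optimizer are assumptions and do not reproduce pSVMRF's nonlinear combination of intensity and location priors.

```python
# Minimal sketch: couple a probabilistic classifier with an MRF smoothness
# prior via iterated conditional modes (ICM). Illustrative only.
import numpy as np

def icm_segment(svm_prob, beta=1.0, n_iter=5):
    """svm_prob: (H, W, K) per-voxel class probabilities from a classifier.
    Unary cost = -log p(class); the pairwise Potts term penalizes label
    disagreement with the four grid neighbors."""
    unary = -np.log(svm_prob + 1e-12)
    labels = svm_prob.argmax(axis=-1)          # initialize at the ML labeling
    H, W, K = svm_prob.shape
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                nbrs = [labels[x, y] for x, y in
                        ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= x < H and 0 <= y < W]
                cost = unary[i, j] + beta * np.array(
                    [sum(k != n for n in nbrs) for k in range(K)])
                labels[i, j] = cost.argmin()   # greedy local energy minimum
    return labels
```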
Ahonen, Emily Q; Watson, Dennis P; Adams, Erin L; McGuire, Alan
2017-01-01
Detailed descriptions of implementation strategies are lacking, and there is a corresponding dearth of information regarding methods employed in implementation strategy development. This paper describes methods and findings related to the alpha testing of eLearning modules developed as part of the Housing First Technical Assistance and Training (HFTAT) program's development. Alpha testing is an approach for improving the quality of a product prior to beta (i.e., real-world) testing, with potential applications for intervention development. Ten participants in two cities tested the modules. We collected data through (1) a structured log in which participants were asked to record their experiences as they worked through the modules; (2) a brief online questionnaire delivered at the end of each module; and (3) focus groups. The alpha test provided useful data on the acceptability and feasibility of eLearning as an implementation strategy and identified a number of technical issues and bugs. Each of the qualitative methods used provided unique and valuable information. In particular, logs were the most useful for identifying technical issues, and focus groups provided high-quality data regarding how the intervention could best be used as an implementation strategy. Alpha testing was a valuable step in intervention development, providing us with an understanding of issues that would have been more difficult to address at a later stage of the study. As a result, we were able to improve the modules prior to pilot testing of the entire HFTAT. Researchers wishing to alpha test interventions prior to piloting should balance the unique benefits of different data collection approaches with the need to minimize burdens for themselves and participants.
Good, Benjamin M; Loguercio, Salvatore; Griffith, Obi L; Nanis, Max; Wu, Chunlei; Su, Andrew I
2014-07-29
Molecular signatures for predicting breast cancer prognosis could greatly improve care through personalization of treatment. Computational analyses of genome-wide expression datasets have identified such signatures, but these signatures leave much to be desired in terms of accuracy, reproducibility, and biological interpretability. Methods that take advantage of structured prior knowledge (e.g., protein interaction networks) show promise in helping to define better signatures, but most knowledge remains unstructured. Crowdsourcing via scientific discovery games is an emerging methodology that has the potential to tap into human intelligence at scales and in modes unheard of before. The main objective of this study was to test the hypothesis that knowledge linking expression patterns of specific genes to breast cancer outcomes could be captured from players of an open, Web-based game. We envisioned capturing knowledge both from players' prior experience and from their ability to interpret text related to candidate genes presented to them in the context of the game. We developed and evaluated an online game called The Cure that captured information from players regarding genes for use as predictors of breast cancer survival. Information gathered from game play was aggregated using a voting approach and used to create rankings of genes. The top genes from these rankings were evaluated using annotation enrichment analysis, comparison to prior predictor gene sets, and by using them to train and test machine learning systems for predicting 10-year survival. Between its launch in September 2012 and September 2013, The Cure attracted more than 1000 registered players, who collectively played nearly 10,000 games. Gene sets assembled through aggregation of the collected data showed significant enrichment for genes known to be related to key concepts such as cancer, disease progression, and recurrence. In terms of the predictive accuracy of models trained using this information, these gene sets provided performance comparable to gene sets generated using other methods, including those used in commercial tests. The Cure is available on the Internet. The principal contribution of this work is to show that crowdsourcing games can be developed as a means to address problems involving domain knowledge. While most prior work on scientific discovery games and crowdsourcing in general takes as a premise that contributors have little or no expertise, here we demonstrated a crowdsourcing system that succeeded in capturing expert knowledge.
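The voting-based aggregation is conceptually simple. The sketch below illustrates the idea of tallying players' gene selections across games and ranking genes by vote count; the data structure and field names are invented for the example and are not The Cure's actual data model.

```python
# Minimal sketch: aggregate players' gene selections by voting and rank
# genes by total votes. Field names and data are illustrative assumptions.
from collections import Counter

games = [
    {"player": "p1", "selected": ["BRCA1", "TP53", "AURKA"]},
    {"player": "p2", "selected": ["TP53", "CCNB1", "AURKA"]},
    {"player": "p3", "selected": ["TP53", "BRCA1", "MKI67"]},
]

votes = Counter(gene for game in games for gene in game["selected"])
ranking = [gene for gene, _ in votes.most_common()]
print(ranking[:5])  # top-ranked candidate predictor genes
```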
Tarrant, Marie; Dodgson, Joan E; Law, Beatrice V K K
2008-05-01
In today's environment of rapidly changing health care and information technology, nurses require a broad range of skills. One of the key skills required of all health professionals in this environment is information literacy. For registered nurses returning to a university setting to study for their baccalaureate degree, becoming information literate is one of many challenges they face. Also key to students' ability to use and communicate information in an appropriate and effective manner are their writing skills. This article describes a curricular intervention designed to develop and strengthen post-registration nurses' information literacy and academic writing competencies. An introductory information management module was developed and provided to three successive cohorts of students (n=159). Students were predominantly female (85.4%) with a mean age of 34.2 years (SD=6.8). Prior to commencing the program, students reported low information literacy and writing skills, especially in accessing and searching electronic databases and using referencing formats. The post-test evaluation of skills showed substantial and statistically significant increases in all assessed competencies. This intervention demonstrated that with structured but flexible learning activities early in the curriculum, post-registration nursing students can quickly become information literate.
How much to trust the senses: Likelihood learning
Sato, Yoshiyuki; Kording, Konrad P.
2014-01-01
Our brain often needs to estimate unknown variables from imperfect information. Our knowledge about the statistical distributions of quantities in our environment (called priors) and currently available information from sensory inputs (called likelihood) are the basis of all Bayesian models of perception and action. While we know that priors are learned, most studies of prior-likelihood integration simply assume that subjects know about the likelihood. However, as the quality of sensory inputs changes over time, we also need to learn about new likelihoods. Here, we show that human subjects readily learn the distribution of visual cues (likelihood function) in a way that can be predicted by models of statistically optimal learning. Using a likelihood that depended on color context, we found that a learned likelihood generalized to new priors. Thus, we conclude that subjects learn about likelihood. PMID:25398975
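The standard Gaussian model behind such studies is easy to state. The sketch below estimates the likelihood width from cue errors experienced on feedback trials and then integrates a prior with a new cue by reliability weighting; the Gaussian forms and all numbers are illustrative assumptions, not the paper's experimental design.

```python
# Minimal sketch: learn the likelihood width from observed cue errors, then
# integrate prior and cue by inverse-variance (reliability) weighting.
import numpy as np

rng = np.random.default_rng(0)
sigma_cue = 0.5
# Cue minus true position on trials where feedback reveals the truth
cue_errors = sigma_cue * rng.normal(size=200)
var_cue_hat = cue_errors.var(ddof=1)     # learned likelihood variance

mu_prior, var_prior = 0.0, 1.0           # assumed prior over positions

def integrate(x_cue):
    """Posterior mean under a Gaussian prior and the learned likelihood."""
    w_p, w_c = 1.0 / var_prior, 1.0 / var_cue_hat
    return (w_p * mu_prior + w_c * x_cue) / (w_p + w_c)

print(integrate(1.2))  # a noisy cue is pulled toward the prior mean
```

A noisier learned likelihood (larger variance) shifts the weighting toward the prior, which is the signature behavior these models predict.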
Assignment of a non-informative prior when using a calibration function
NASA Astrophysics Data System (ADS)
Lira, I.; Grientschnig, D.
2012-01-01
The evaluation of measurement uncertainty associated with the use of calibration functions was addressed in a talk at the 19th IMEKO World Congress 2009 in Lisbon (Proceedings, pp 2346-51). Therein, an example involving a cubic function was analysed by a Bayesian approach and by the Monte Carlo method described in Supplement 1 to the 'Guide to the Expression of Uncertainty in Measurement'. Results were found to be discrepant. In this paper we examine a simplified version of the example and show that the reported discrepancy is caused by the choice of the prior in the Bayesian analysis, which does not conform to formal rules for encoding the absence of prior knowledge. Two options for assigning a non-informative prior free from this shortcoming are considered; they are shown to be equivalent.
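For orientation, the Supplement 1 Monte Carlo approach referred to above can be sketched for a deliberately simplified straight-line calibration (the paper's example uses a cubic function, and all numbers here are invented): draw the calibration parameters and the new indication from their assigned distributions and propagate the samples to the measurand.

```python
# Minimal sketch of GUM Supplement 1 Monte Carlo propagation for a
# simplified linear calibration y = a + b*x, solved for the measurand
# x = (y - a)/b. All distributions and values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
M = 100_000
a = rng.normal(0.10, 0.02, M)    # intercept estimate and its uncertainty
b = rng.normal(2.00, 0.05, M)    # slope estimate and its uncertainty
y = rng.normal(1.50, 0.03, M)    # indication of the new measurement

x = (y - a) / b                  # propagated measurand samples
print(x.mean(), x.std(ddof=1))   # estimate and standard uncertainty
```

A Bayesian analysis of the same problem requires an explicit prior on the measurand, and, as the paper argues, the result can differ noticeably if that prior does not follow formal non-informativeness rules.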
Gaussian curvature analysis allows for automatic block placement in multi-block hexahedral meshing.
Ramme, Austin J; Shivanna, Kiran H; Magnotta, Vincent A; Grosland, Nicole M
2011-10-01
Musculoskeletal finite element analysis (FEA) has been essential to research in orthopaedic biomechanics. The generation of a volumetric mesh is often the most challenging step in an FEA. Hexahedral meshing tools that are based on a multi-block approach rely on the manual placement of building blocks for their mesh generation scheme. We hypothesise that Gaussian curvature analysis could be used to automatically develop a building block structure for multi-block hexahedral mesh generation. The Automated Building Block Algorithm incorporates principles from differential geometry, combinatorics, statistical analysis and computer science to automatically generate a building block structure to represent a given surface without prior information. We have applied this algorithm to 29 bones of varying geometries and successfully generated a usable mesh in all cases. This work represents a significant advancement in automating the definition of building blocks.
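The curvature analysis at the heart of such an approach is typically computed on a triangle mesh with the angle-deficit formula, K(v) = 2π minus the sum of face angles incident at vertex v. The sketch below shows that standard computation; it illustrates the kind of curvature analysis involved, not the published Automated Building Block Algorithm itself.

```python
# Minimal sketch: discrete Gaussian curvature via angle deficit.
# Vertices with large |K| are corner-like and would be natural candidates
# for block-vertex placement in a multi-block scheme (assumption).
import numpy as np

def gaussian_curvature(vertices, faces):
    """vertices: (N, 3) floats; faces: (M, 3) vertex indices."""
    K = np.full(len(vertices), 2 * np.pi)
    for f in faces:
        p = vertices[f]
        for i in range(3):
            u = p[(i + 1) % 3] - p[i]
            w = p[(i + 2) % 3] - p[i]
            cosang = np.dot(u, w) / (np.linalg.norm(u) * np.linalg.norm(w))
            K[f[i]] -= np.arccos(np.clip(cosang, -1.0, 1.0))
    return K

# Regular tetrahedron: each vertex has angle deficit 2*pi - 3*(pi/3) = pi
verts = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]], float)
faces = np.array([[0, 1, 2], [0, 3, 1], [0, 2, 3], [1, 3, 2]])
print(gaussian_curvature(verts, faces))  # ~[pi, pi, pi, pi]
```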