Sample records for random fields based

  1. Connectivity ranking of heterogeneous random conductivity models

    NASA Astrophysics Data System (ADS)

    Rizzo, C. B.; de Barros, F.

    2017-12-01

To overcome the challenges associated with hydrogeological data scarcity, the hydraulic conductivity (K) field is often represented by a spatial random process. The state-of-the-art provides several methods to generate 2D or 3D random K-fields, such as the classic multi-Gaussian fields or non-Gaussian fields, training image-based fields and object-based fields. We provide a systematic comparison of these models based on their connectivity. We use the minimum hydraulic resistance as a connectivity measure, which has been found to be strongly correlated with the early arrival time of dissolved contaminants. A computationally efficient graph-based algorithm is employed, allowing a stochastic treatment of the minimum hydraulic resistance through a Monte-Carlo approach and therefore enabling the computation of its uncertainty. The results show the impact of geostatistical parameters on the connectivity of each group of random fields and make it possible to rank the fields according to their minimum hydraulic resistance.
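The minimum-hydraulic-resistance idea lends itself to a short sketch: treat each grid cell's resistance as 1/K and run Dijkstra's algorithm from the inlet face to the outlet face. This is a minimal illustration under simplifying assumptions (cell-wise resistance weighting, 4-neighbour connectivity), not the authors' algorithm.

```python
import heapq

def min_hydraulic_resistance(K, source_col=0, target_col=None):
    """Minimum cumulative hydraulic resistance (sum of 1/K along a path)
    from the left edge to the right edge of a 2D conductivity grid.
    A simple Dijkstra sketch; the paper's graph algorithm is more refined."""
    rows, cols = len(K), len(K[0])
    if target_col is None:
        target_col = cols - 1
    INF = float("inf")
    dist = [[INF] * cols for _ in range(rows)]
    heap = []
    for r in range(rows):  # every left-edge cell is a source
        dist[r][source_col] = 1.0 / K[r][source_col]
        heapq.heappush(heap, (dist[r][source_col], r, source_col))
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r][c]:
            continue                  # stale heap entry
        if c == target_col:
            return d                  # first target cell popped is optimal
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 1.0 / K[nr][nc]
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return INF
```

Repeating this over many sampled K-fields gives the Monte-Carlo distribution of the minimum resistance described in the abstract.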

  2. Markov Random Fields, Stochastic Quantization and Image Analysis

    DTIC Science & Technology

    1990-01-01

Markov random fields based on the lattice Z2 have been extensively used in image analysis in a Bayesian framework as a priori models for the...of Image Analysis can be given some fundamental justification, then there is a remarkable connection between Probabilistic Image Analysis, Statistical Mechanics and Lattice-based Euclidean Quantum Field Theory.

  3. On Pfaffian Random Point Fields

    NASA Astrophysics Data System (ADS)

    Kargin, V.

    2014-02-01

    We study Pfaffian random point fields by using the Moore-Dyson quaternion determinants. First, we give sufficient conditions that ensure that a self-dual quaternion kernel defines a valid random point field, and then we prove a CLT for Pfaffian point fields. The proofs are based on a new quaternion extension of the Cauchy-Binet determinantal identity. In addition, we derive the Fredholm determinantal formulas for the Pfaffian point fields which use the quaternion determinant.

  4. SMERFS: Stochastic Markov Evaluation of Random Fields on the Sphere

    NASA Astrophysics Data System (ADS)

    Creasey, Peter; Lang, Annika

    2018-04-01

SMERFS (Stochastic Markov Evaluation of Random Fields on the Sphere) creates large realizations of random fields on the sphere. It uses a fast algorithm, based on Markov properties and one-dimensional fast Fourier transforms, that generates samples on an n × n grid in O(n² log n) time and efficiently derives the necessary conditional covariance matrices.
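The flavor of FFT-based sampling can be shown in one dimension: a stationary Gaussian field on a circle has a circulant covariance, so filtering white noise through the square root of its spectrum gives exact samples in O(n log n). This is only an analogy to SMERFS, which works on the sphere with Markov-based conditional covariances; the squared-exponential covariance below is an arbitrary choice for illustration.

```python
import numpy as np

def sample_circle_field(n, length_scale=0.1, seed=0):
    """Exact sample of a stationary Gaussian field on n equispaced points of
    a circle via spectral (FFT) synthesis; a 1-D illustration only."""
    rng = np.random.default_rng(seed)
    theta = 2 * np.pi * np.arange(n) / n
    ang = np.minimum(theta, 2 * np.pi - theta)          # angular distance
    cov = np.exp(-ang ** 2 / (2 * length_scale ** 2))   # circulant cov, 1st column
    lam = np.clip(np.fft.fft(cov).real, 0.0, None)      # its (nonneg.) eigenvalues
    eps = rng.standard_normal(n)
    # filter white noise with the square root of the spectrum
    return np.fft.ifft(np.sqrt(lam) * np.fft.fft(eps)).real
```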

  5. Surface plasmon enhanced cell microscopy with blocked random spatial activation

    NASA Astrophysics Data System (ADS)

    Son, Taehwang; Oh, Youngjin; Lee, Wonju; Yang, Heejin; Kim, Donghyun

    2016-03-01

We present surface plasmon enhanced fluorescence microscopy with random spatial sampling using a patterned block of silver nanoislands. Rigorous coupled wave analysis was performed to confirm near-field localization on the nanoislands. Random nanoislands were fabricated in silver by thermal annealing. By analyzing the random near-field distribution, the average size of the localized fields was found to be on the order of 135 nm. The randomly localized near-fields were used to spatially sample F-actin of J774 cells (a mouse macrophage cell line). An image deconvolution algorithm based on linear imaging theory was established for stochastic estimation of the fluorescent molecular distribution. Alignment between the near-field distribution and the raw image was performed using the patterned block. The achieved resolution depends on factors including the size of the localized fields and is estimated to be 100-150 nm.

  6. Clustering, randomness and regularity in cloud fields. I - Theoretical considerations. II - Cumulus cloud fields

    NASA Technical Reports Server (NTRS)

    Weger, R. C.; Lee, J.; Zhu, Tianri; Welch, R. M.

    1992-01-01

The ongoing controversy over regularity vs. clustering in cloud fields is examined by means of analysis and simulation studies based upon nearest-neighbor cumulative distribution statistics. It is shown that the Poisson representation of random point processes is superior to pseudorandom-number-generated models, which bias the observed nearest-neighbor statistics towards regularity. Interpretation of these nearest-neighbor statistics is discussed for many cases of superimposed clustering, randomness, and regularity. A detailed analysis is carried out of cumulus cloud field spatial distributions based upon Landsat, AVHRR, and Skylab data, showing that, when both large and small clouds are included in the cloud field distributions, the cloud field always has a strong clustering signal.
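The nearest-neighbor cumulative statistic is easy to compute for a simulated point field. The sketch below draws a fixed number of uniform points (the fixed-n "pseudorandom" case the paper criticizes; a true Poisson model would also randomize n) and compares the empirical fraction below the theoretical median distance with the Poisson CDF F(r) = 1 − exp(−λπr²). Edge effects are ignored for brevity.

```python
import math
import random

def nn_distances(points):
    """Nearest-neighbour distance for each point in a unit square."""
    out = []
    for i, (x, y) in enumerate(points):
        d = min(math.hypot(x - u, y - v)
                for j, (u, v) in enumerate(points) if j != i)
        out.append(d)
    return out

def poisson_nn_cdf(r, lam):
    """Theoretical NN cumulative distribution for a 2-D Poisson process
    of intensity lam: F(r) = 1 - exp(-lam * pi * r^2)."""
    return 1.0 - math.exp(-lam * math.pi * r * r)

random.seed(42)
n = 200
pts = [(random.random(), random.random()) for _ in range(n)]
d = nn_distances(pts)
# empirical CDF evaluated at the theoretical median NN distance
r_med = math.sqrt(math.log(2) / (math.pi * n))
frac = sum(1 for x in d if x <= r_med) / n
```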

  7. Summer School Effects in a Randomized Field Trial

    ERIC Educational Resources Information Center

    Zvoch, Keith; Stevens, Joseph J.

    2013-01-01

    This field-based randomized trial examined the effect of assignment to and participation in summer school for two moderately at-risk samples of struggling readers. Application of multiple regression models to difference scores capturing the change in summer reading fluency revealed that kindergarten students randomly assigned to summer school…

  8. Note: The design of thin gap chamber simulation signal source based on field programmable gate array.

    PubMed

    Hu, Kun; Lu, Houbing; Wang, Xu; Li, Feng; Liang, Futian; Jin, Ge

    2015-01-01

    The Thin Gap Chamber (TGC) is an important part of ATLAS detector and LHC accelerator. Targeting the feature of the output signal of TGC detector, we have designed a simulation signal source. The core of the design is based on field programmable gate array, randomly outputting 256-channel simulation signals. The signal is generated by true random number generator. The source of randomness originates from the timing jitter in ring oscillators. The experimental results show that the random number is uniform in histogram, and the whole system has high reliability.
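A software analogue of the histogram uniformity check mentioned above is a Pearson chi-square test on byte counts. This sketch uses Python's PRNG in place of the FPGA ring-oscillator source, purely to show the shape of the test.

```python
import random
from collections import Counter

def chi_square_uniformity(samples, bins=256):
    """Pearson chi-square statistic of a byte histogram against the
    uniform distribution; near the degrees of freedom (bins - 1) for
    genuinely uniform data."""
    counts = Counter(samples)
    expected = len(samples) / bins
    return sum((counts.get(b, 0) - expected) ** 2 / expected
               for b in range(bins))

random.seed(1)
data = [random.randrange(256) for _ in range(65536)]
stat = chi_square_uniformity(data)
```

For 255 degrees of freedom the statistic should land near 255; a hardware source whose histogram check fails would show a much larger value.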

  9. Note: The design of thin gap chamber simulation signal source based on field programmable gate array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Kun; Wang, Xu; Li, Feng

    The Thin Gap Chamber (TGC) is an important part of ATLAS detector and LHC accelerator. Targeting the feature of the output signal of TGC detector, we have designed a simulation signal source. The core of the design is based on field programmable gate array, randomly outputting 256-channel simulation signals. The signal is generated by true random number generator. The source of randomness originates from the timing jitter in ring oscillators. The experimental results show that the random number is uniform in histogram, and the whole system has high reliability.

  10. Random numbers from vacuum fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com; Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543

    2016-07-25

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.

  11. COMPUTERIZED EXPERT SYSTEM FOR EVALUATION OF AUTOMATED VISUAL FIELDS FROM THE ISCHEMIC OPTIC NEUROPATHY DECOMPRESSION TRIAL: METHODS, BASELINE FIELDS, AND SIX-MONTH LONGITUDINAL FOLLOW-UP

    PubMed Central

    Feldon, Steven E

    2004-01-01

Purpose: To validate a computerized expert system for evaluating visual fields in a prospective clinical trial, the Ischemic Optic Neuropathy Decompression Trial (IONDT), and to identify the pattern and within-pattern severity of field defects for study eyes at baseline and 6-month follow-up. Design: Humphrey visual field (HVF) change was used as the outcome measure for a prospective, randomized, multi-center trial to test the null hypothesis that optic nerve sheath decompression was ineffective in treating nonarteritic anterior ischemic optic neuropathy and to ascertain the natural history of the disease. Methods: An expert panel established criteria for the type and severity of visual field defects. Using these criteria, a rule-based computerized expert system interpreted HVFs from baseline and 6-month visits for patients randomized to surgery or careful follow-up and for patients who were not randomized. Results: A computerized expert system was devised and validated. The system was then used to analyze HVFs. The pattern of defects found at baseline for patients randomized to surgery did not differ from that of patients randomized to careful follow-up. The most common pattern of defect was a superior and inferior arcuate with central scotoma for randomized eyes (19.2%) and a superior and inferior arcuate for nonrandomized eyes (30.6%). Field patterns at 6 months and baseline were not different. For randomized study eyes, the superior altitudinal defects improved (P = .03), as did the inferior altitudinal defects (P = .01). For nonrandomized study eyes, only the inferior altitudinal defects improved (P = .02). No treatment effect was noted. Conclusions: A novel rule-based expert system successfully interpreted visual field defects at baseline in eyes enrolled in the IONDT. PMID:15747764

  12. Statistical analysis of loopy belief propagation in random fields

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki

    2015-10-01

    Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.
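Loopy BP itself, the object whose quenched average the paper analyses, can be sketched on a toy pairwise MRF: an Ising model on a 4-cycle with local fields, with sum-product messages iterated to convergence and beliefs checked against exact enumeration. The couplings and fields below are arbitrary illustrative values.

```python
import itertools
import math

# pairwise Ising MRF on a 4-cycle: p(s) ∝ exp(sum_i h_i s_i + J sum_<ij> s_i s_j)
spins = (-1, 1)
h = [0.3, -0.2, 0.5, 0.1]          # local "random fields"
J = 0.4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]

def lbp_marginals(iters=50):
    """Sum-product loopy belief propagation; returns per-node marginals."""
    m = {(i, j): {s: 1.0 for s in spins}
         for a, b in edges for i, j in ((a, b), (b, a))}
    for _ in range(iters):
        new = {}
        for (i, j) in m:
            msg = {}
            for sj in spins:
                total = 0.0
                for si in spins:
                    incoming = 1.0        # product of messages into i, except from j
                    for (k, l) in m:
                        if l == i and k != j:
                            incoming *= m[(k, i)][si]
                    total += math.exp(h[i] * si + J * si * sj) * incoming
                msg[sj] = total
            z = sum(msg.values())
            new[(i, j)] = {s: v / z for s, v in msg.items()}
        m = new
    marg = []
    for i in range(4):
        b = {s: math.exp(h[i] * s) for s in spins}
        for (k, l) in m:
            if l == i:
                for s in spins:
                    b[s] *= m[(k, i)][s]
        z = sum(b.values())
        marg.append({s: v / z for s, v in b.items()})
    return marg

def exact_marginals():
    """Brute-force marginals over the 16 spin configurations, for comparison."""
    weights = {}
    for s in itertools.product(spins, repeat=4):
        e = (sum(h[i] * s[i] for i in range(4))
             + J * sum(s[a] * s[b] for a, b in edges))
        weights[s] = math.exp(e)
    Z = sum(weights.values())
    return [{v: sum(w for s, w in weights.items() if s[i] == v) / Z
             for v in spins} for i in range(4)]
```

On a single loop with moderate coupling, LBP's beliefs land close to the exact marginals, which is the regime where Bethe-type analyses like the paper's are informative.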

  13. Random Fields

    NASA Astrophysics Data System (ADS)

    Vanmarcke, Erik

    1983-03-01

Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  14. Random walk study of electron motion in helium in crossed electromagnetic fields

    NASA Technical Reports Server (NTRS)

    Englert, G. W.

    1972-01-01

    Random walk theory, previously adapted to electron motion in the presence of an electric field, is extended to include a transverse magnetic field. In principle, the random walk approach avoids mathematical complexity and concomitant simplifying assumptions and permits determination of energy distributions and transport coefficients within the accuracy of available collisional cross section data. Application is made to a weakly ionized helium gas. Time of relaxation of electron energy distribution, determined by the random walk, is described by simple expressions based on energy exchange between the electron and an effective electric field. The restrictive effect of the magnetic field on electron motion, which increases the required number of collisions per walk to reach a terminal steady state condition, as well as the effect of the magnetic field on electron transport coefficients and mean energy can be quite adequately described by expressions involving only the Hall parameter.

  15. Modeling and statistical analysis of non-Gaussian random fields with heavy-tailed distributions.

    PubMed

    Nezhadhaghighi, Mohsen Ghasemi; Nakhlband, Abbas

    2017-04-01

    In this paper, we investigate and develop an alternative approach to the numerical analysis and characterization of random fluctuations with the heavy-tailed probability distribution function (PDF), such as turbulent heat flow and solar flare fluctuations. We identify the heavy-tailed random fluctuations based on the scaling properties of the tail exponent of the PDF, power-law growth of qth order correlation function, and the self-similar properties of the contour lines in two-dimensional random fields. Moreover, this work leads to a substitution for the fractional Edwards-Wilkinson (EW) equation that works in the presence of μ-stable Lévy noise. Our proposed model explains the configuration dynamics of the systems with heavy-tailed correlated random fluctuations. We also present an alternative solution to the fractional EW equation in the presence of μ-stable Lévy noise in the steady state, which is implemented numerically, using the μ-stable fractional Lévy motion. Based on the analysis of the self-similar properties of contour loops, we numerically show that the scaling properties of contour loop ensembles can qualitatively and quantitatively distinguish non-Gaussian random fields from Gaussian random fluctuations.

  16. Classification of high resolution remote sensing image based on geo-ontology and conditional random fields

    NASA Astrophysics Data System (ADS)

    Hong, Liang

    2013-10-01

The availability of high spatial resolution remote sensing data provides new opportunities for urban land-cover classification. More geometric detail can be observed in high resolution imagery, and ground objects display rich texture, structure, shape and hierarchical semantic characteristics, with more landscape elements represented by small groups of pixels. In recent years, object-based remote sensing analysis has been widely accepted and applied in high resolution image processing. A classification method based on geo-ontology and conditional random fields is presented in this paper. The proposed method is made up of four blocks: (1) a hierarchical ground-object semantic framework is constructed based on geo-ontology; (2) image objects are generated by mean-shift segmentation, which yields boundary-preserving and spectrally homogeneous over-segmented regions; (3) the relations between the hierarchical ground-object semantics and the over-segmented regions are defined within a conditional random field framework; (4) hierarchical classification results are obtained based on geo-ontology and conditional random fields. Finally, high-resolution GeoEye imagery is used to test the performance of the presented method. The experimental results show the superiority of this method over the eCognition method in both effectiveness and accuracy, which implies it is suitable for the classification of high resolution remote sensing images.

  17. A biorthogonal decomposition for the identification and simulation of non-stationary and non-Gaussian random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zentner, I.; Ferré, G., E-mail: gregoire.ferre@ponts.org; Poirion, F.

    2016-06-01

In this paper, a new method for the identification and simulation of non-Gaussian and non-stationary stochastic fields from a database is proposed. It is based on two successive biorthogonal decompositions aimed at representing spatio-temporal stochastic fields. The proposed double expansion makes it possible to build the model even for large-size problems by separating the time, space and random parts of the field. A Gaussian kernel estimator is used to simulate the high-dimensional set of random variables appearing in the decomposition. The capability of the method to reproduce the non-stationary and non-Gaussian features of random phenomena is illustrated by applications to earthquakes (seismic ground motion) and sea states (wave heights).

  18. Nonlinear wave chaos: statistics of second harmonic fields.

    PubMed

    Zhou, Min; Ott, Edward; Antonsen, Thomas M; Anlage, Steven M

    2017-10-01

    Concepts from the field of wave chaos have been shown to successfully predict the statistical properties of linear electromagnetic fields in electrically large enclosures. The Random Coupling Model (RCM) describes these properties by incorporating both universal features described by Random Matrix Theory and the system-specific features of particular system realizations. In an effort to extend this approach to the nonlinear domain, we add an active nonlinear frequency-doubling circuit to an otherwise linear wave chaotic system, and we measure the statistical properties of the resulting second harmonic fields. We develop an RCM-based model of this system as two linear chaotic cavities coupled by means of a nonlinear transfer function. The harmonic field strengths are predicted to be the product of two statistical quantities and the nonlinearity characteristics. Statistical results from measurement-based calculation, RCM-based simulation, and direct experimental measurements are compared and show good agreement over many decades of power.

  19. In Defense of the Randomized Controlled Trial for Health Promotion Research

    PubMed Central

    Rosen, Laura; Manor, Orly; Engelhard, Dan; Zucker, David

    2006-01-01

    The overwhelming evidence about the role lifestyle plays in mortality, morbidity, and quality of life has pushed the young field of modern health promotion to center stage. The field is beset with intense debate about appropriate evaluation methodologies. Increasingly, randomized designs are considered inappropriate for health promotion research. We have reviewed criticisms against randomized trials that raise philosophical and practical issues, and we will show how most of these criticisms can be overcome with minor design modifications. By providing rebuttal to arguments against randomized trials, our work contributes to building a sound methodological base for health promotion research. PMID:16735622

  20. Chemical Distances for Percolation of Planar Gaussian Free Fields and Critical Random Walk Loop Soups

    NASA Astrophysics Data System (ADS)

    Ding, Jian; Li, Li

    2018-05-01

    We initiate the study on chemical distances of percolation clusters for level sets of two-dimensional discrete Gaussian free fields as well as loop clusters generated by two-dimensional random walk loop soups. One of our results states that the chemical distance between two macroscopic annuli away from the boundary for the random walk loop soup at the critical intensity is of dimension 1 with positive probability. Our proof method is based on an interesting combination of a theorem of Makarov, isomorphism theory, and an entropic repulsion estimate for Gaussian free fields in the presence of a hard wall.

  1. Chemical Distances for Percolation of Planar Gaussian Free Fields and Critical Random Walk Loop Soups

    NASA Astrophysics Data System (ADS)

    Ding, Jian; Li, Li

    2018-06-01

    We initiate the study on chemical distances of percolation clusters for level sets of two-dimensional discrete Gaussian free fields as well as loop clusters generated by two-dimensional random walk loop soups. One of our results states that the chemical distance between two macroscopic annuli away from the boundary for the random walk loop soup at the critical intensity is of dimension 1 with positive probability. Our proof method is based on an interesting combination of a theorem of Makarov, isomorphism theory, and an entropic repulsion estimate for Gaussian free fields in the presence of a hard wall.

  2. A Gaussian random field model for similarity-based smoothing in Bayesian disease mapping.

    PubMed

    Baptista, Helena; Mendes, Jorge M; MacNab, Ying C; Xavier, Miguel; Caldas-de-Almeida, José

    2016-08-01

Conditionally specified Gaussian Markov random field (GMRF) models with an adjacency-based neighbourhood weight matrix, commonly known as neighbourhood-based GMRF models, have been the mainstream approach to spatial smoothing in Bayesian disease mapping. In the present paper, we propose a conditionally specified Gaussian random field (GRF) model with a similarity-based non-spatial weight matrix to facilitate non-spatial smoothing in Bayesian disease mapping. The model, named similarity-based GRF, is motivated by disease mapping situations in which the underlying small area relative risks and the associated determinant factors do not vary systematically in space, with similarity defined with respect to the associated disease determinant factors. The neighbourhood-based GMRF and the similarity-based GRF are compared and assessed via a simulation study and two case studies, using new data on alcohol abuse in Portugal collected by the World Mental Health Survey Initiative and the well-known lip cancer data in Scotland. In the presence of disease data with no evidence of positive spatial correlation, the simulation study showed a consistent gain in efficiency from the similarity-based GRF, compared with the adjacency-based GMRF with the determinant risk factors as covariates. This new approach broadens the scope of the existing conditional autocorrelation models. © The Author(s) 2016.
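The adjacency-based weight structure that the paper contrasts with similarity weights can be sketched as a proper conditional autoregressive (CAR) precision matrix Q = τ(D − ρW); swapping the 0/1 adjacency W for a similarity matrix gives the non-spatial variant in spirit. The chain graph and parameter values below are arbitrary illustrative choices.

```python
import numpy as np

def car_precision(W, rho=0.9, tau=1.0):
    """Precision matrix Q = tau * (D - rho * W) of a proper CAR/GMRF model,
    where W is a symmetric 0/1 adjacency matrix and D its diagonal degree
    matrix. Q is positive definite for |rho| < 1."""
    W = np.asarray(W, dtype=float)
    D = np.diag(W.sum(axis=1))
    return tau * (D - rho * W)

# a 4-area chain graph: 1-2-3-4
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
Q = car_precision(W)
```

Replacing W with, say, exp(-|x_i - x_j|) for covariate values x would mimic the similarity-based weighting proposed in the paper.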

  3. A Multilevel, Hierarchical Sampling Technique for Spatially Correlated Random Fields

    DOE PAGES

    Osborn, Sarah; Vassilevski, Panayot S.; Villa, Umberto

    2017-10-26

In this paper, we propose an alternative method to generate samples of a spatially correlated random field with applications to large-scale problems for forward propagation of uncertainty. A classical approach for generating these samples is the Karhunen--Loève (KL) decomposition. However, the KL expansion requires solving a dense eigenvalue problem and is therefore computationally infeasible for large-scale problems. Sampling methods based on stochastic partial differential equations provide a highly scalable way to sample Gaussian fields, but the resulting parametrization is mesh dependent. We propose a multilevel decomposition of the stochastic field to allow for scalable, hierarchical sampling based on solving a mixed finite element formulation of a stochastic reaction-diffusion equation with a random, white noise source function. Lastly, numerical experiments are presented to demonstrate the scalability of the sampling method as well as numerical results of multilevel Monte Carlo simulations for a subsurface porous media flow application using the proposed sampling method.
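The SPDE route to sampling can be sketched in one dimension: solving the discretized reaction-diffusion equation (κ² − d²/dx²)u = W with a white-noise right-hand side yields a Matérn-like Gaussian field. The paper uses a mixed finite element formulation with a multilevel hierarchy; the dense finite-difference solve below is a deliberately simplified stand-in.

```python
import numpy as np

def spde_sample_1d(n=200, kappa=5.0, seed=0):
    """Draw an approximate Gaussian (Matern-like) field on [0, 1] by solving
    the discretized SPDE (kappa^2 - d^2/dx^2) u = W with white-noise load."""
    rng = np.random.default_rng(seed)
    h = 1.0 / (n - 1)
    # tridiagonal kappa^2*I - Laplacian (zero-Dirichlet ends for simplicity)
    A = np.zeros((n, n))
    idx = np.arange(n)
    A[idx, idx] = kappa ** 2 + 2.0 / h ** 2
    A[idx[:-1], idx[:-1] + 1] = -1.0 / h ** 2
    A[idx[1:], idx[1:] - 1] = -1.0 / h ** 2
    w = rng.standard_normal(n) / np.sqrt(h)   # discrete white noise
    return np.linalg.solve(A, w)

u = spde_sample_1d()
```

Because A is sparse and local, a scalable implementation would use sparse or multigrid solvers, which is what makes the SPDE approach attractive for large problems.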

  4. A Multilevel, Hierarchical Sampling Technique for Spatially Correlated Random Fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Sarah; Vassilevski, Panayot S.; Villa, Umberto

In this paper, we propose an alternative method to generate samples of a spatially correlated random field with applications to large-scale problems for forward propagation of uncertainty. A classical approach for generating these samples is the Karhunen--Loève (KL) decomposition. However, the KL expansion requires solving a dense eigenvalue problem and is therefore computationally infeasible for large-scale problems. Sampling methods based on stochastic partial differential equations provide a highly scalable way to sample Gaussian fields, but the resulting parametrization is mesh dependent. We propose a multilevel decomposition of the stochastic field to allow for scalable, hierarchical sampling based on solving a mixed finite element formulation of a stochastic reaction-diffusion equation with a random, white noise source function. Lastly, numerical experiments are presented to demonstrate the scalability of the sampling method as well as numerical results of multilevel Monte Carlo simulations for a subsurface porous media flow application using the proposed sampling method.

  5. Nuclear test ban treaty verification: Improving test ban monitoring with empirical and model-based signal processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, David B.; Gibbons, Steven J.; Rodgers, Arthur J.

In this approach, small scale-length medium perturbations not modeled in the tomographic inversion might be described as random fields, characterized by particular distribution functions (e.g., normal with specified spatial covariance). Conceivably, random field parameters (scatterer density or scale length) might themselves be the targets of tomographic inversions of the scattered wave field. As a result, such augmented models may provide processing gain through the use of probabilistic signal subspaces rather than deterministic waveforms.

  6. Nuclear test ban treaty verification: Improving test ban monitoring with empirical and model-based signal processing

    DOE PAGES

    Harris, David B.; Gibbons, Steven J.; Rodgers, Arthur J.; ...

    2012-05-01

In this approach, small scale-length medium perturbations not modeled in the tomographic inversion might be described as random fields, characterized by particular distribution functions (e.g., normal with specified spatial covariance). Conceivably, random field parameters (scatterer density or scale length) might themselves be the targets of tomographic inversions of the scattered wave field. As a result, such augmented models may provide processing gain through the use of probabilistic signal subspaces rather than deterministic waveforms.

  7. Subcritical Multiplicative Chaos for Regularized Counting Statistics from Random Matrix Theory

    NASA Astrophysics Data System (ADS)

    Lambert, Gaultier; Ostrovsky, Dmitry; Simm, Nick

    2018-05-01

For an N × N Haar distributed random unitary matrix U_N, we consider the random field defined by counting the number of eigenvalues of U_N in a mesoscopic arc centered at the point u on the unit circle. We prove that after regularizing at a small scale ε_N > 0, the renormalized exponential of this field converges as N → ∞ to a Gaussian multiplicative chaos measure in the whole subcritical phase. We discuss implications of this result for obtaining a lower bound on the maximum of the field. We also show that the moments of the total mass converge to a Selberg-like integral and, by taking a further limit as the size of the arc diverges, we establish part of the conjectures in Ostrovsky (Nonlinearity 29(2):426-464, 2016). By an analogous construction, we prove that the multiplicative chaos measure coming from the sine process has the same distribution, which strongly suggests that this limiting object should be universal. Our approach to the L¹-phase is based on a generalization of the construction in Berestycki (Electron Commun Probab 22(27):12, 2017) to random fields which are only asymptotically Gaussian. In particular, our method could have applications to other random fields coming from either random matrix theory or a different context.

  8. Sparse Forward-Backward for Fast Training of Conditional Random Fields

    DTIC Science & Technology

    2006-01-01

On the NetTalk text-to-speech data set [5], we can now train a conditional random field (CRF) in about 6 hours, a task for which training previously…

  9. Markov random field model-based edge-directed image interpolation.

    PubMed

    Li, Min; Nguyen, Truong Q

    2008-07-01

This paper presents an edge-directed image interpolation algorithm. In the proposed algorithm, the edge directions are implicitly estimated with a statistics-based approach. In contrast to explicit edge directions, the local edge directions are indicated by length-16 weighting vectors. Implicitly, the weighting vectors are used to formulate a geometric regularity (GR) constraint (smoothness along edges and sharpness across edges), and the GR constraint is imposed on the interpolated image through a Markov random field (MRF) model. Furthermore, under the maximum a posteriori MRF framework, the desired interpolated image corresponds to the minimal-energy state of a 2-D random field given the low-resolution image. Simulated annealing methods are used to search the state space for the minimal-energy state. To lower the computational complexity of the MRF, a single-pass implementation is designed, which performs nearly as well as the iterative optimization. Simulation results show that the proposed MRF model-based edge-directed interpolation method produces edges with strong geometric regularity. Compared to traditional methods and other edge-directed interpolation methods, the proposed method improves the subjective quality of the interpolated edges while maintaining a high PSNR level.

  10. Single-image super-resolution based on Markov random field and contourlet transform

    NASA Astrophysics Data System (ADS)

    Wu, Wei; Liu, Zheng; Gueaieb, Wail; He, Xiaohai

    2011-04-01

Learning-based methods are well adopted in image super-resolution. In this paper, we propose a new learning-based approach using the contourlet transform and a Markov random field. The proposed algorithm employs the contourlet transform rather than the conventional wavelet to represent image features and accounts for the correlation between adjacent pixels or image patches through the Markov random field (MRF) model. The input low-resolution (LR) image is decomposed with the contourlet transform and fed to the MRF model together with the contourlet transform coefficients from the low- and high-resolution image pairs in the training set. The unknown high-frequency components/coefficients for the input low-resolution image are inferred by a belief propagation algorithm. Finally, the inverse contourlet transform converts the LR input and the inferred high-frequency coefficients into the super-resolved image. The effectiveness of the proposed method is demonstrated with experiments on facial, vehicle plate, and real scene images. Better quality is achieved in terms of peak signal-to-noise ratio and the structural similarity measurement.

  11. Sampling Strategies for Evaluating the Rate of Adventitious Transgene Presence in Non-Genetically Modified Crop Fields.

    PubMed

    Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine

    2017-09-01

    According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, and random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, the auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two reweighting methods, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
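    The contrast between simple random sampling and auxiliary-variable stratification can be illustrated on synthetic data. Everything below is invented for illustration: the exponential decay of the presence rate with distance stands in for the paper's zero-inflated Poisson gene-flow model, and the near/far cutoff is an arbitrary two-stratum design.

    ```python
    import math
    import random
    import statistics

    random.seed(1)

    # Synthetic field: per-grain transgene probability decays with distance
    # to the GM field edge (illustrative numbers, not the paper's model).
    N = 10_000
    distance = [random.uniform(0.0, 100.0) for _ in range(N)]
    rate = [0.3 * math.exp(-d / 10.0) for d in distance]
    grain = [1 if random.random() < r else 0 for r in rate]
    true_mean = sum(grain) / N

    def srs_estimate(n=200):
        """Simple random sampling: draw n grains uniformly at random."""
        return sum(grain[i] for i in random.sample(range(N), n)) / n

    def stratified_estimate(n=200, cutoff=20.0):
        """Stratify on the auxiliary variable (here: distance), allocate the
        sample proportionally, and combine the per-stratum means."""
        near = [i for i in range(N) if distance[i] < cutoff]
        far = [i for i in range(N) if distance[i] >= cutoff]
        n_near = max(1, round(n * len(near) / N))
        m_near = sum(grain[i] for i in random.sample(near, n_near)) / n_near
        m_far = sum(grain[i] for i in random.sample(far, n - n_near)) / (n - n_near)
        return (m_near * len(near) + m_far * len(far)) / N

    srs = [srs_estimate() for _ in range(400)]
    strat = [stratified_estimate() for _ in range(400)]
    print(f"true={true_mean:.4f}  srs sd={statistics.pstdev(srs):.4f}  "
          f"stratified sd={statistics.pstdev(strat):.4f}")
    ```

    Because the presence rate is concentrated near the GM edge, the stratified estimator typically shows the smaller spread, which is the paper's point about needing substantially smaller samples for the same precision.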

  12. Approximate ground states of the random-field Potts model from graph cuts

    NASA Astrophysics Data System (ADS)

    Kumar, Manoj; Kumar, Ravinder; Weigel, Martin; Banerjee, Varsha; Janke, Wolfhard; Puri, Sanjay

    2018-05-01

    While the ground-state problem for the random-field Ising model is polynomial, and can be solved using a number of well-known algorithms for maximum flow or graph cut, the analog random-field Potts model corresponds to a multiterminal flow problem that is known to be NP-hard. Hence an efficient exact algorithm is very unlikely to exist. As we show here, it is nevertheless possible to use an embedding of binary degrees of freedom into the Potts spins in combination with graph-cut methods to solve the corresponding ground-state problem approximately in polynomial time. We benchmark this heuristic algorithm using a set of quasiexact ground states found for small systems from long parallel tempering runs. For a not-too-large number q of Potts states, the method based on graph cuts finds the same solutions in a fraction of the time. We employ the new technique to analyze the breakup length of the random-field Potts model in two dimensions.
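    For intuition about the ground-state problem itself, a tiny random-field Potts instance can still be solved by exhaustive enumeration (the coupling, field strength, and lattice size below are arbitrary). The exponential cost of this brute force, q**(L*L) states, is exactly what the paper's graph-cut heuristic avoids:

    ```python
    import itertools
    import random

    random.seed(2)

    L, q = 3, 3                      # tiny L x L lattice, q Potts states
    J, Delta = 1.0, 1.5              # coupling and random-field strength
    # Random field: each site prefers one randomly chosen Potts state.
    h = [[random.randrange(q) for _ in range(L)] for _ in range(L)]

    def energy(s):
        """Random-field Potts energy:
        -J * sum over nearest neighbors of delta(s_i, s_j)
        -Delta * sum over sites of delta(s_i, h_i), open boundaries."""
        e = 0.0
        for i in range(L):
            for j in range(L):
                if i + 1 < L:
                    e -= J * (s[i][j] == s[i + 1][j])
                if j + 1 < L:
                    e -= J * (s[i][j] == s[i][j + 1])
                e -= Delta * (s[i][j] == h[i][j])
        return e

    # Exhaustive search over all q**(L*L) configurations.
    best, best_e = None, float("inf")
    for flat in itertools.product(range(q), repeat=L * L):
        s = [list(flat[i * L:(i + 1) * L]) for i in range(L)]
        e = energy(s)
        if e < best_e:
            best, best_e = s, e
    print(best_e, best)
    ```

    Already at L = 4, q = 3 this loop would visit over 43 million states; the graph-cut approach in the paper instead embeds binary (Ising-like) moves that each reduce to a polynomial max-flow computation.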

  13. Collision Models for Particle Orbit Code on SSX

    NASA Astrophysics Data System (ADS)

    Fisher, M. W.; Dandurand, D.; Gray, T.; Brown, M. R.; Lukin, V. S.

    2011-10-01

    Coulomb collision models are being developed and incorporated into the Hamiltonian particle pushing code (PPC) for applications to the Swarthmore Spheromak eXperiment (SSX). A Monte Carlo model based on that of Takizuka and Abe [JCP 25, 205 (1977)] performs binary collisions between test particles and thermal plasma field particles randomly drawn from a stationary Maxwellian distribution. A field-based electrostatic fluctuation model scatters particles from a spatially uniform random distribution of positive and negative spherical potentials generated throughout the plasma volume. The number, radii, and amplitude of these potentials are chosen to mimic the correct particle diffusion statistics without the use of random particle draws or collision frequencies. An electromagnetic fluctuating field model will be presented, if available. These numerical collision models will be benchmarked against known analytical solutions, including beam diffusion rates and Spitzer resistivity, as well as each other. The resulting collisional particle orbit models will be used to simulate particle collection with electrostatic probes in the SSX wind tunnel, as well as particle confinement in typical SSX fields. This work has been supported by US DOE, NSF and ONR.
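    The binary-collision step of the Takizuka-Abe model can be sketched as a single scattering event: rotate the relative velocity by a small random angle and split the change between the two particles so momentum and kinetic energy are conserved exactly. This is a simplified single-collision routine; in practice the variance `var_theta` of tan(theta/2) is set from the collision frequency and time step, which are omitted here.

    ```python
    import math
    import random

    random.seed(3)

    def maxwellian(vth):
        """Velocity of a thermal field particle: each component ~ N(0, vth)."""
        return [random.gauss(0.0, vth) for _ in range(3)]

    def binary_collision(v1, v2, m1, m2, var_theta):
        """One Takizuka-Abe-style binary collision: rotate the relative
        velocity by a random angle with tan(theta/2) ~ N(0, var_theta) and a
        uniform azimuth; momentum and kinetic energy are conserved exactly."""
        u = [a - b for a, b in zip(v1, v2)]
        umag = math.sqrt(sum(c * c for c in u))
        uperp = math.sqrt(u[0] ** 2 + u[1] ** 2)
        d = random.gauss(0.0, math.sqrt(var_theta))             # tan(theta/2)
        s, c = 2 * d / (1 + d * d), (1 - d * d) / (1 + d * d)   # sin, cos theta
        phi = random.uniform(0.0, 2.0 * math.pi)
        if uperp > 1e-12 * umag:
            du = [u[0] / uperp * u[2] * s * math.cos(phi)
                  - u[1] / uperp * umag * s * math.sin(phi) - u[0] * (1 - c),
                  u[1] / uperp * u[2] * s * math.cos(phi)
                  + u[0] / uperp * umag * s * math.sin(phi) - u[1] * (1 - c),
                  -uperp * s * math.cos(phi) - u[2] * (1 - c)]
        else:  # relative velocity along z: any perpendicular azimuth works
            du = [umag * s * math.cos(phi), umag * s * math.sin(phi),
                  -umag * (1 - c)]
        return ([v + m2 / (m1 + m2) * d_ for v, d_ in zip(v1, du)],
                [v - m1 / (m1 + m2) * d_ for v, d_ in zip(v2, du)])

    # Scatter a fast test particle off a thermal field particle (toy masses).
    v1, v2 = [1.0, 0.0, 0.0], maxwellian(0.1)
    w1, w2 = binary_collision(v1, v2, m1=1.0, m2=1836.0, var_theta=0.05)
    print(w1, w2)
    ```

    Because the update is a pure rotation of the relative velocity, conservation holds to round-off regardless of the sampled angle, which is what makes the scheme attractive for long particle-pushing runs.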

  14. Speckle phase near random surfaces

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoyi; Cheng, Chuanfu; An, Guoqiang; Han, Yujing; Rong, Zhenyu; Zhang, Li; Zhang, Meina

    2018-03-01

    Based on Kirchhoff approximation theory, the speckle phase near random surfaces with different roughness is numerically simulated. As expected, the properties of the speckle phase near the random surfaces differ from those in the far field. In addition, as scattering distances and roughness increase, the average fluctuations of the speckle phase become larger. Unusually, the speckle phase is somewhat similar to the corresponding surface topography. We have performed experiments to verify the theoretical simulation results. Studies in this paper contribute to understanding the evolution of the speckle phase near a random surface and provide a possible way to identify a random surface structure based on its speckle phase.

  15. Random crystal field effect on the magnetic and hysteresis behaviors of a spin-1 cylindrical nanowire

    NASA Astrophysics Data System (ADS)

    Zaim, N.; Zaim, A.; Kerouad, M.

    2017-02-01

    In this work, the magnetic behavior of a cylindrical nanowire, consisting of a ferromagnetic core of spin-1 atoms surrounded by a ferromagnetic shell of spin-1 atoms, is studied in the presence of a random crystal field interaction. Monte Carlo simulation based on the Metropolis algorithm has been used to investigate the effects of the concentration of the random crystal field p, the crystal field D, and the shell exchange interaction Js on the phase diagrams and the hysteresis behavior of the system. Some characteristic behaviors have been found, such as first- and second-order phase transitions joined by a tricritical point for appropriate values of the system parameters; triple and isolated critical points can also be found. Depending on the Hamiltonian parameters, single, double, and para hysteresis regions are explicitly determined.
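    A minimal Metropolis sketch for a spin-1 model with a random crystal field conveys the simulation idea (a flat 2-D lattice with illustrative parameters, not the paper's core-shell nanowire geometry or its Js coupling):

    ```python
    import math
    import random

    random.seed(4)

    L, J, D, p, T = 8, 1.0, 0.5, 0.5, 0.8
    # Random crystal field: site has crystal field D with probability p, else 0.
    Dfield = [[D if random.random() < p else 0.0 for _ in range(L)] for _ in range(L)]
    spin = [[random.choice((-1, 0, 1)) for _ in range(L)] for _ in range(L)]

    def site_energy(i, j, s):
        """Energy of site (i, j) if it held spin value s (periodic boundaries)."""
        nn = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
              + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
        return -J * s * nn - Dfield[i][j] * s * s

    def total_energy():
        e = 0.0
        for i in range(L):
            for j in range(L):  # count each bond once (right and down)
                e += -J * spin[i][j] * (spin[(i + 1) % L][j] + spin[i][(j + 1) % L]) \
                     - Dfield[i][j] * spin[i][j] ** 2
        return e

    e0 = total_energy()
    for sweep in range(200):                  # Metropolis single-spin updates
        for _ in range(L * L):
            i, j = random.randrange(L), random.randrange(L)
            new = random.choice((-1, 0, 1))
            dE = site_energy(i, j, new) - site_energy(i, j, spin[i][j])
            if dE <= 0 or random.random() < math.exp(-dE / T):
                spin[i][j] = new
    m = sum(sum(row) for row in spin) / L ** 2
    print(f"E0={e0:.1f}  E={total_energy():.1f}  m={m:.3f}")
    ```

    Phase diagrams like those in the paper are then traced by sweeping T, D, and p and recording the magnetization and its moments.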

  16. COMPARISON OF RANDOM AND SYSTEMATIC SITE SELECTION FOR ASSESSING ATTAINMENT OF AQUATIC LIFE USES IN SEGMENTS OF THE OHIO RIVER

    EPA Science Inventory

    This report is a description of field work and data analysis results comparing a design comparable to systematic site selection with one based on random selection of sites. The report is expected to validate the use of random site selection in the bioassessment program for the O...

  17. Effects of Check and Connect on Attendance, Behavior, and Academics: A Randomized Effectiveness Trial

    ERIC Educational Resources Information Center

    Maynard, Brandy R.; Kjellstrand, Elizabeth K.; Thompson, Aaron M.

    2014-01-01

    Objectives: This study examined the effects of Check & Connect (C&C) on the attendance, behavior, and academic outcomes of at-risk youth in a field-based effectiveness trial. Method: A multisite randomized block design was used, wherein 260 primarily Hispanic (89%) and economically disadvantaged (74%) students were randomized to treatment…

  18. Smooth invariant densities for random switching on the torus

    NASA Astrophysics Data System (ADS)

    Bakhtin, Yuri; Hurth, Tobias; Lawley, Sean D.; Mattingly, Jonathan C.

    2018-04-01

    We consider a random dynamical system obtained by switching between the flows generated by two smooth vector fields on the 2d-torus, with the random switchings happening according to a Poisson process. Assuming that the driving vector fields are transversal to each other at all points of the torus and that each of them allows for a smooth invariant density and no periodic orbits, we prove that the switched system also has a smooth invariant density, for every switching rate. Our approach is based on an integration by parts formula inspired by techniques from Malliavin calculus.

  19. Ice Water Classification Using Statistical Distribution Based Conditional Random Fields in RADARSAT-2 Dual Polarization Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.

    2017-09-01

    In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited for improving marginal ice-water classification. Pixel-level ice concentration is presented for the comparison of the CRF-based methods. Furthermore, in order to explore the most effective statistical distribution model to be integrated into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method resolves the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.

  20. Stochastic Resonance and Safe Basin of Single-Walled Carbon Nanotubes with Strongly Nonlinear Stiffness under Random Magnetic Field.

    PubMed

    Xu, Jia; Li, Chao; Li, Yiran; Lim, Chee Wah; Zhu, Zhiwen

    2018-05-04

    In this paper, a kind of single-walled carbon nanotube nonlinear model is developed and the strongly nonlinear dynamic characteristics of such carbon nanotubes subjected to random magnetic field are studied. The nonlocal effect of the microstructure is considered based on Eringen’s differential constitutive model. The natural frequency of the strongly nonlinear dynamic system is obtained by the energy function method, the drift coefficient and the diffusion coefficient are verified. The stationary probability density function of the system dynamic response is given and the fractal boundary of the safe basin is provided. Theoretical analysis and numerical simulation show that stochastic resonance occurs when varying the random magnetic field intensity. The boundary of safe basin has fractal characteristics and the area of safe basin decreases when the intensity of the magnetic field permeability increases.

  1. Robotic Range Clearance Competition (R2C2)

    DTIC Science & Technology

    2011-10-01

    unexploded ordnance (UXO). A large part of the debris field consists of ferrous metal objects that magnetic ... was set at 7 degrees above horizontal based on terrain around the Base station. We used the BSUBR file for all fields except the Subsurface ... and subsurface clearance test areas had numerous pieces of simulated unexploded ordnance (SUXO) buried at random locations around the field. These

  2. How well do mean field theories of spiking quadratic-integrate-and-fire networks work in realistic parameter regimes?

    PubMed

    Grabska-Barwińska, Agnieszka; Latham, Peter E

    2014-06-01

    We use mean field techniques to compute the distribution of excitatory and inhibitory firing rates in large networks of randomly connected spiking quadratic integrate and fire neurons. These techniques are based on the assumption that activity is asynchronous and Poisson. For most parameter settings these assumptions are strongly violated; nevertheless, so long as the networks are not too synchronous, we find good agreement between mean field prediction and network simulations. Thus, much of the intuition developed for randomly connected networks in the asynchronous regime applies to mildly synchronous networks.

  3. The influence of an uncertain force environment on reshaping trial-to-trial motor variability.

    PubMed

    Izawa, Jun; Yoshioka, Toshinori; Osu, Rieko

    2014-09-10

    Motor memory is updated to generate ideal movements in a novel environment. When the environment changes randomly from trial to trial, how does the brain incorporate this uncertainty into motor memory? To investigate how the brain adapts to an uncertain environment, we considered a reach adaptation protocol in which individuals practiced moving in a force field into which noise was injected. After they had adapted, we measured the trial-to-trial variability in the temporal profiles of the produced hand force. We found that the motor variability was significantly magnified by the adaptation to the random force field. Temporal profiles of the motor variance were significantly dissociable between the two different types of random force fields experienced. A model-based analysis suggests that the variability is generated by noise in the gains of the internal model. It further suggests that the trial-to-trial motor variability magnified by the adaptation in a random force field is generated by the uncertainty of the internal model formed in the brain as a result of the adaptation.

  4. Conditional Random Field-Based Offline Map Matching for Indoor Environments

    PubMed Central

    Bataineh, Safaa; Bahillo, Alfonso; Díez, Luis Enrique; Onieva, Enrique; Bataineh, Ikram

    2016-01-01

    In this paper, we present an offline map matching technique designed for indoor localization systems based on conditional random fields (CRF). The proposed algorithm can refine the results of existing indoor localization systems and match them with the map, using loose coupling between the existing localization system and the proposed map matching technique. The purpose of this research is to investigate the efficiency of using the CRF technique in offline map matching problems for different scenarios and parameters. The algorithm was applied to several real and simulated trajectories of different lengths. The results were then refined and matched with the map using the CRF algorithm. PMID:27537892

  6. Field Line Random Walk in Isotropic Magnetic Turbulence up to Infinite Kubo Number

    NASA Astrophysics Data System (ADS)

    Sonsrettee, W.; Wongpan, P.; Ruffolo, D. J.; Matthaeus, W. H.; Chuychai, P.; Rowlands, G.

    2013-12-01

    In astrophysical plasmas, the magnetic field line random walk (FLRW) plays a key role in the transport of energetic particles. Here we consider isotropic magnetic turbulence, which is a reasonable model for interstellar space. Theoretical conceptions of the FLRW have been strongly influenced by studies of the limit of weak fluctuations (or a strong mean field) (e.g., Isichenko 1991a, b). In this case, the behavior of the FLRW can be characterized by the Kubo number R = (b/B0)(l∥/l⊥), where l∥ and l⊥ are the turbulence coherence scales parallel and perpendicular to the mean field, respectively, and b is the root-mean-squared fluctuation field. In the 2D limit (R ≫ 1), there has been an apparent conflict between concepts of Bohm diffusion, which is based on Corrsin's independence hypothesis, and percolative diffusion. Here we have used three non-perturbative analytic techniques based on Corrsin's independence hypothesis for B0 = 0 (R = ∞): diffusive decorrelation (DD), random ballistic decorrelation (RBD), and a general ordinary differential equation (ODE), and compared them with direct computer simulations. All the analytical models and computer simulations agree that isotropic turbulence at R = ∞ has a field line diffusion coefficient consistent with Bohm diffusion. Partially supported by the Thailand Research Fund, NASA, and NSF.

  7. Influence of Embedded Inhomogeneities on the Spectral Ratio of the Horizontal Components of a Random Field of Rayleigh Waves

    NASA Astrophysics Data System (ADS)

    Tsukanov, A. A.; Gorbatnikov, A. V.

    2018-01-01

    Study of the statistical parameters of the Earth's random microseismic field makes it possible to obtain estimates of the properties and structure of the Earth's crust and upper mantle. Different approaches are used to observe and process the microseismic records, which are divided into several groups of passive seismology methods. Among them are the well-known methods of surface-wave tomography, the spectral H/V ratio of the components in the surface wave, and microseismic sounding, currently under development, which uses the spectral ratio V/V0 of the vertical components between pairs of spatially separated stations. In the course of previous experiments, it became clear that these ratios are stable statistical parameters of the random field that do not depend on the properties of microseism sources. This paper proposes to expand the mentioned approach and study the possibilities for using the ratio of the horizontal components H1/H2 of the microseismic field. Numerical simulation was used to study the influence of an embedded velocity inhomogeneity on the spectral ratio of the horizontal components of the random field of fundamental Rayleigh modes, based on the concept that the Earth's microseismic field is represented by these waves in a significant part of the frequency spectrum.

  8. Using Norm-Based Appeals to Increase Response Rates in Evaluation Research: A Field Experiment

    ERIC Educational Resources Information Center

    Misra, Shalini; Stokols, Daniel; Marino, Anne Heberger

    2012-01-01

    A field experiment was conducted to test the effectiveness of norm-based persuasive messages for increasing response rates in online survey research. Participants in an interdisciplinary conference were asked to complete two successive postconference surveys and randomly assigned to one of two groups at each time point. The experimental group…

  9. Probability distribution of the entanglement across a cut at an infinite-randomness fixed point

    NASA Astrophysics Data System (ADS)

    Devakul, Trithep; Majumdar, Satya N.; Huse, David A.

    2017-03-01

    We calculate the probability distribution of entanglement entropy S across a cut of a finite one-dimensional spin chain of length L at an infinite-randomness fixed point using Fisher's strong randomness renormalization group (RG). Using the random transverse-field Ising model as an example, the distribution is shown to take the form p(S|L) ~ L^(-ψ(k)), where k ≡ S/ln[L/L0], the large deviation function ψ(k) is found explicitly, and L0 is a nonuniversal microscopic length. We discuss the implications of such a distribution on numerical techniques that rely on entanglement, such as matrix-product-state-based techniques. Our results are verified with numerical RG simulations, as well as the actual entanglement entropy distribution for the random transverse-field Ising model, which we calculate for large L via a mapping to Majorana fermions.

  10. Replication, randomization, and treatment design concepts for on-farm research

    USDA-ARS?s Scientific Manuscript database

    For most agronomists, randomization and replication are fundamental concepts that have a nearly sacred or spiritual status. They are an integral part of nearly all of our field-based activities. Some on-farm research falls into this category, simply because it is driven and designed by researchers w...

  11. Improving Preschoolers' Mathematics Achievement with Tablets: A Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Schacter, John; Jo, Booil

    2017-01-01

    With a randomized field experiment of 433 preschoolers, we tested a tablet mathematics program designed to increase young children's mathematics learning. Intervention students played Math Shelf, a comprehensive iPad preschool and year 1 mathematics app, while comparison children received research-based hands-on mathematics instruction delivered…

  12. Medical Students' and Tutors' Experiences of Directed and Self-Directed Learning Programs in Evidence-Based Medicine: A Qualitative Evaluation Accompanying a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Bradley, Peter; Oterholt, Christina; Nordheim, Lena; Bjorndal, Arild

    2005-01-01

    This qualitative study aims to interpret the results of a randomized controlled trial comparing two educational programs (directed learning and self-directed learning) in evidence-based medicine (EBM) for medical students at the University of Oslo from 2002 to 2003. There is currently very little comparative educational research in this field. In…

  13. A modified hybrid uncertain analysis method for dynamic response field of the LSOAAC with random and interval parameters

    NASA Astrophysics Data System (ADS)

    Zi, Bin; Zhou, Bin

    2016-07-01

    For the prediction of the dynamic response field of the luffing system of an automobile crane (LSOAAC) with random and interval parameters, a hybrid uncertain model is introduced. In the hybrid uncertain model, the parameters with certain probability distribution are modeled as random variables, whereas the parameters with lower and upper bounds are modeled as interval variables instead of given precise values. Based on the hybrid uncertain model, the hybrid uncertain dynamic response equilibrium equation, in which different random and interval parameters are simultaneously included in input and output terms, is constructed. Then a modified hybrid uncertain analysis method (MHUAM) is proposed. In the MHUAM, based on the random interval perturbation method, the first-order Taylor series expansion, and the first-order Neumann series, the dynamic response expression of the LSOAAC is developed. Moreover, the mathematical characteristics of the extrema of the bounds of the dynamic response are determined by the random interval moment method and a monotonic analysis technique. Compared with the hybrid Monte Carlo method (HMCM) and interval perturbation method (IPM), numerical results show the feasibility and efficiency of the MHUAM for solving the hybrid LSOAAC problems. The effects of different uncertain models and parameters on the LSOAAC response field are also investigated deeply, and numerical results indicate that the impact made by the randomness in the thrust of the luffing cylinder F is larger than that made by the gravity of the weight in suspension Q. In addition, the impact made by the uncertainty in the displacement between the lower end of the lifting arm and the luffing cylinder a is larger than that made by the length of the lifting arm L.

  14. A Novel Weighted Kernel PCA-Based Method for Optimization and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Thimmisetty, C.; Talbot, C.; Chen, X.; Tong, C. H.

    2016-12-01

    It has been demonstrated that machine learning methods can be successfully applied to uncertainty quantification for geophysical systems through the use of the adjoint method coupled with kernel PCA-based optimization. In addition, it has been shown through weighted linear PCA how optimization with respect to both observation weights and feature space control variables can accelerate convergence of such methods. Linear machine learning methods, however, are inherently limited in their ability to represent features of non-Gaussian stochastic random fields, as they are based on only the first two statistical moments of the original data. Nonlinear spatial relationships and multipoint statistics leading to the tortuosity characteristic of channelized media, for example, are captured only to a limited extent by linear PCA. With the aim of coupling the kernel-based and weighted methods discussed, we present a novel mathematical formulation of kernel PCA, Weighted Kernel Principal Component Analysis (WKPCA), that both captures nonlinear relationships and incorporates the attribution of significance levels to different realizations of the stochastic random field of interest. We also demonstrate how new instantiations retaining defining characteristics of the random field can be generated using Bayesian methods. In particular, we present a novel WKPCA-based optimization method that minimizes a given objective function with respect to both feature space random variables and observation weights through which optimal snapshot significance levels and optimal features are learned. We showcase how WKPCA can be applied to nonlinear optimal control problems involving channelized media, and in particular demonstrate an application of the method to learning the spatial distribution of material parameter values in the context of linear elasticity, and discuss further extensions of the method to stochastic inversion.
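    A minimal weighted kernel PCA sketch conveys the core idea: an RBF kernel, centering with respect to the weighted mean in feature space, and an eigendecomposition of the sqrt(w)-symmetrized kernel. This is illustrative only; the paper's WKPCA additionally optimizes the weights and feature-space variables, whereas here the weights are fixed inputs.

    ```python
    import numpy as np

    def weighted_kpca(X, w, n_components=2, gamma=2.0):
        """Weighted kernel PCA: returns the leading eigenvalues and an
        embedding of the samples, with per-sample significance weights w."""
        X = np.asarray(X, float)
        w = np.asarray(w, float)
        w = w / w.sum()
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        K = np.exp(-gamma * d2)                     # RBF Gram matrix
        # Center w.r.t. the weighted feature-space mean:
        # Kc_ij = K_ij - sum_k w_k K_kj - sum_l w_l K_il + sum_kl w_k w_l K_kl
        Kc = K - w @ K - (K @ w)[:, None] + w @ K @ w
        sw = np.sqrt(w)
        M = sw[:, None] * Kc * sw[None, :]          # symmetric PSD matrix
        lam, V = np.linalg.eigh(M)
        idx = np.argsort(lam)[::-1][:n_components]
        lam, V = lam[idx], V[:, idx]
        Y = V * np.sqrt(np.clip(lam, 0.0, None))    # embedding coordinates
        return lam, Y

    rng = np.random.default_rng(5)
    X = rng.normal(size=(40, 3))                    # 40 "snapshots"
    w = rng.uniform(0.1, 1.0, size=40)              # snapshot significance levels
    lam, Y = weighted_kpca(X, w)
    print(lam, Y.shape)
    ```

    Down-weighting a realization shrinks its influence on both the centering and the principal directions, which is how snapshot significance levels enter the decomposition.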

  15. The Effects of CBI Lesson Sequence Type and Field Dependence on Learning from Computer-Based Cooperative Instruction in Web

    ERIC Educational Resources Information Center

    Ipek, Ismail

    2010-01-01

    The purpose of this study was to investigate the effects of CBI lesson sequence type and cognitive style of field dependence on learning from Computer-Based Cooperative Instruction (CBCI) in WEB on the dependent measures, achievement, reading comprehension and reading rate. Eighty-seven college undergraduate students were randomly assigned to…

  16. Multi-fidelity Gaussian process regression for prediction of random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parussini, L.; Venturi, D., E-mail: venturi@ucsc.edu; Perdikaris, P.

    We propose a new multi-fidelity Gaussian process regression (GPR) approach for prediction of random fields based on observations of surrogate models or hierarchies of surrogate models. Our method builds upon recent work on recursive Bayesian techniques, in particular recursive co-kriging, and extends it to vector-valued fields and various types of covariances, including separable and non-separable ones. The framework we propose is general and can be used to perform uncertainty propagation and quantification in model-based simulations, multi-fidelity data fusion, and surrogate-based optimization. We demonstrate the effectiveness of the proposed recursive GPR techniques through various examples. Specifically, we study the stochastic Burgers equation and the stochastic Oberbeck–Boussinesq equations describing natural convection within a square enclosure. In both cases we find that the standard deviation of the Gaussian predictors as well as the absolute errors relative to benchmark stochastic solutions are very small, suggesting that the proposed multi-fidelity GPR approaches can yield highly accurate results.
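    A stripped-down two-fidelity sketch of the recursive co-kriging idea (Kennedy-O'Hagan style, posterior means only): the toy fidelity pair, length scales, and point counts below are invented, and the paper's vector-valued and non-separable covariance extensions are omitted.

    ```python
    import numpy as np

    def rbf(X, Y, ls):
        return np.exp(-0.5 * ((X[:, None] - Y[None, :]) / ls) ** 2)

    def gp_predict(Xtr, ytr, Xte, ls, noise=1e-6):
        """Posterior mean of a zero-mean GP with an RBF kernel."""
        K = rbf(Xtr, Xtr, ls) + noise * np.eye(len(Xtr))
        return rbf(Xte, Xtr, ls) @ np.linalg.solve(K, ytr)

    # Toy fidelity pair: the cheap model captures the shape, the expensive
    # one adds a scale factor and a trend.
    f_lo = lambda x: np.sin(8 * x)
    f_hi = lambda x: 1.2 * np.sin(8 * x) + 0.3 * x

    X_lo = np.linspace(0, 1, 30)   # plentiful low-fidelity data
    X_hi = np.linspace(0, 1, 6)    # scarce high-fidelity data
    Xte = np.linspace(0, 1, 50)

    # Recursive co-kriging: y_hi(x) ~ rho * mu_lo(x) + delta(x),
    # with delta a GP fitted to the high-fidelity residuals.
    mu_lo_hi = gp_predict(X_lo, f_lo(X_lo), X_hi, ls=0.3)
    rho = np.polyfit(mu_lo_hi, f_hi(X_hi), 1)[0]
    delta = f_hi(X_hi) - rho * mu_lo_hi
    mu_mf = rho * gp_predict(X_lo, f_lo(X_lo), Xte, ls=0.3) \
        + gp_predict(X_hi, delta, Xte, ls=0.5)

    err_mf = np.max(np.abs(mu_mf - f_hi(Xte)))
    err_hf_only = np.max(np.abs(gp_predict(X_hi, f_hi(X_hi), Xte, ls=0.3)
                                - f_hi(Xte)))
    print(f"max error, multi-fidelity: {err_mf:.3f}   "
          f"high-fidelity only: {err_hf_only:.3f}")
    ```

    With only six expensive samples, the high-fidelity-only GP cannot resolve the oscillation, while the multi-fidelity predictor borrows that structure from the cheap model and only has to learn the smooth residual.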

  17. Sample Size Estimation in Cluster Randomized Educational Trials: An Empirical Bayes Approach

    ERIC Educational Resources Information Center

    Rotondi, Michael A.; Donner, Allan

    2009-01-01

    The educational field has now accumulated an extensive literature reporting on values of the intraclass correlation coefficient, a parameter essential to determining the required size of a planned cluster randomized trial. We propose here a simple simulation-based approach including all relevant information that can facilitate this task. An…

  18. Table Extraction from Web Pages Using Conditional Random Fields to Extract Toponym Related Data

    NASA Astrophysics Data System (ADS)

    Luthfi Hanifah, Hayyu'; Akbar, Saiful

    2017-01-01

    Table is one of the ways to visualize information on web pages. The abundance of web pages that compose the World Wide Web has motivated research on information extraction and information retrieval, including research on table extraction. Besides, there is a need for a system designed specifically to handle location-related information. Against this background, this research provides a way to extract location-related data from web tables so that they can be used in the development of a Geographic Information Retrieval (GIR) system. The location-related data are identified by the toponym (location name). In this research, a rule-based approach with a gazetteer is used to recognize toponyms in web tables. Meanwhile, to extract data from a table, a combination of a rule-based approach and a statistical approach is used. In the statistical approach, a Conditional Random Fields (CRF) model is used to understand the schema of the table. The result of table extraction is presented in JSON format. If a web table contains a toponym, a field is added to the JSON document to store the toponym values. This field can be used to index the table data in accordance with the toponym, which can then be used in the development of the GIR system.
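    The output format described above might look like the following (a hypothetical example: the key names, URL, and values are invented for illustration, since the paper does not publish its exact schema):

    ```python
    import json

    # Hypothetical JSON document for one extracted web table. The column
    # schema would come from the CRF labeling, and "toponyms" is the extra
    # field added when the gazetteer-based recognizer finds location names.
    extracted = {
        "source_url": "http://example.org/page-with-table",
        "columns": ["City", "Population", "Area (km2)"],
        "rows": [
            ["Bandung", "2500000", "167.7"],
            ["Jakarta", "10560000", "661.5"],
        ],
        "toponyms": ["Bandung", "Jakarta"],  # index key for GIR retrieval
    }
    print(json.dumps(extracted, indent=2))
    ```

    A GIR system can then build an inverted index from toponym to table documents, so a query like "population of Bandung" retrieves the table directly.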

  19. Brownian Motion in a Speckle Light Field: Tunable Anomalous Diffusion and Selective Optical Manipulation

    PubMed Central

    Volpe, Giorgio; Volpe, Giovanni; Gigan, Sylvain

    2014-01-01

    The motion of particles in random potentials occurs in several natural phenomena ranging from the mobility of organelles within a biological cell to the diffusion of stars within a galaxy. A Brownian particle moving in the random optical potential associated to a speckle pattern, i.e., a complex interference pattern generated by the scattering of coherent light by a random medium, provides an ideal model system to study such phenomena. Here, we derive a theory for the motion of a Brownian particle in a speckle field and, in particular, we identify its universal characteristic timescale. Based on this theoretical insight, we show how speckle light fields can be used to control the anomalous diffusion of a Brownian particle and to perform some basic optical manipulation tasks such as guiding and sorting. Our results might broaden the perspectives of optical manipulation for real-life applications. PMID:24496461
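    A minimal overdamped Langevin sketch illustrates trapping in a random optical potential. The sum-of-random-cosines potential below is a crude 1-D stand-in for a speckle pattern, and all parameters (mode count, well depth, time step) are illustrative:

    ```python
    import math
    import random

    random.seed(6)

    # Random potential U(x) = depth * sum_m cos(k_m x + phi_m) / n_modes,
    # so the force is F(x) = depth * sum_m k_m sin(k_m x + phi_m) / n_modes.
    modes = [(random.uniform(0.5, 3.0), random.uniform(0.0, 2.0 * math.pi))
             for _ in range(12)]

    def force(x, depth):
        return depth * sum(k * math.sin(k * x + ph) for k, ph in modes) / len(modes)

    def msd(depth, n_traj=80, steps=1000, dt=2e-3, kT=1.0):
        """Mean squared displacement from overdamped Langevin dynamics:
        dx = F(x) dt + sqrt(2 kT dt) * xi, unit mobility."""
        out = 0.0
        for _ in range(n_traj):
            x = 0.0
            for _ in range(steps):
                x += force(x, depth) * dt + math.sqrt(2.0 * kT * dt) * random.gauss(0, 1)
            out += x * x
        return out / n_traj

    free = msd(depth=0.0)       # plain Brownian motion: MSD ~ 2 kT t
    trapped = msd(depth=60.0)   # wells several kT deep suppress diffusion
    print(f"MSD free={free:.2f}  in random potential={trapped:.2f}")
    ```

    Tuning the well depth relative to kT moves the particle between nearly free diffusion and strongly subdiffusive, trapped behavior, which is the tunability the paper exploits for guiding and sorting.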

  20. Weak scattering of scalar and electromagnetic random fields

    NASA Astrophysics Data System (ADS)

    Tong, Zhisong

    This dissertation encompasses several studies relating to the theory of weak potential scattering of scalar and electromagnetic random, wide-sense statistically stationary fields from various types of deterministic or random linear media. The proposed theory is largely based on the first Born approximation for potential scattering and on the angular spectrum representation of fields. The main focus of the scalar counterpart of the theory is made on calculation of the second-order statistics of scattered light fields in cases when the scattering medium consists of several types of discrete particles with deterministic or random potentials. It is shown that the knowledge of the correlation properties for the particles of the same and different types, described with the newly introduced pair-scattering matrix, is crucial for determining the spectral and coherence states of the scattered radiation. The approach based on the pair-scattering matrix is then used for solving an inverse problem of determining the location of an "alien" particle within the scattering collection of "normal" particles, from several measurements of the spectral density of scattered light. Weak scalar scattering of light from a particulate medium in the presence of optical turbulence existing between the scattering centers is then approached using the combination of the Born's theory for treating the light interaction with discrete particles and the Rytov's theory for light propagation in extended turbulent medium. It is demonstrated how the statistics of scattered radiation depend on scattering potentials of particles and the power spectra of the refractive index fluctuations of turbulence. This theory is of utmost importance for applications involving atmospheric and oceanic light transmission. 
The second part of the dissertation includes the theoretical procedure developed for predicting the second-order statistics of the electromagnetic random fields, such as polarization and linear momentum, scattered from static media. The spatial distribution of these properties of scattered fields is shown to be substantially dependent on the correlation and polarization properties of incident fields and on the statistics of the refractive index distribution within the scatterers. Further, an example is considered which illustrates the usefulness of the electromagnetic scattering theory of random fields in the case when the scattering medium is a thin bio-tissue layer with the prescribed power spectrum of the refractive index fluctuations. The polarization state of the scattered light is shown to be influenced by correlation and polarization states of the illumination as well as by the particle size distribution of the tissue slice.

  1. Random-phase metasurfaces at optical wavelengths

    NASA Astrophysics Data System (ADS)

    Pors, Anders; Ding, Fei; Chen, Yiting; Radko, Ilya P.; Bozhevolnyi, Sergey I.

    2016-06-01

    Random-phase metasurfaces, in which the constituents scatter light with random phases, have the property that an incident plane wave will diffusely scatter, thereby leading to a complex far-field response that is most suitably described by statistical means. In this work, we present and exemplify the statistical description of the far-field response, particularly highlighting how the response for polarised and unpolarised light might be alike or different depending on the correlation of scattering phases for two orthogonal polarisations. By utilizing gap plasmon-based metasurfaces, consisting of an optically thick gold film overlaid by a subwavelength thin glass spacer and an array of gold nanobricks, we design and realize random-phase metasurfaces at a wavelength of 800 nm. Optical characterisation of the fabricated samples convincingly demonstrates the diffuse scattering of reflected light, with statistics obeying the theoretical predictions. We foresee the use of random-phase metasurfaces for camouflage applications and as high-quality reference structures in dark-field microscopy, while the control of the statistics for polarised and unpolarised light might find usage in security applications. Finally, by incorporating a certain correlation between scattering by neighbouring metasurface constituents, new types of functionalities can be realised, such as a Lambertian reflector.

  2. Probabilistic finite elements for transient analysis in nonlinear continua

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Mani, A.

    1985-01-01

    The probabilistic finite element method (PFEM), which is a combination of finite element methods and second-moment analysis, is formulated for linear and nonlinear continua with inhomogeneous random fields. Analogous to the discretization of the displacement field in finite element methods, the random field is also discretized. The formulation is simplified by transforming the correlated variables to a set of uncorrelated variables through an eigenvalue orthogonalization. Furthermore, it is shown that a reduced set of the uncorrelated variables is sufficient for the second-moment analysis. Based on the linear formulation of the PFEM, the method is then extended to transient analysis in nonlinear continua. The accuracy and efficiency of the method are demonstrated by application to a one-dimensional, elastic/plastic wave propagation problem. The moments calculated compare favorably with those obtained by Monte Carlo simulation. Also, the procedure is amenable to implementation in deterministic FEM-based computer programs.
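
    The eigenvalue orthogonalization at the heart of the PFEM formulation can be sketched numerically: the covariance matrix of the discretized random field is diagonalized, and a reduced set of uncorrelated variables is retained. The mesh size, the exponential covariance, and the 99% variance cutoff below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical 1D mesh: covariance of a discretized random field with
# exponential correlation (an illustrative choice, not from the paper).
n = 50
x = np.linspace(0.0, 1.0, n)
corr_len = 0.2
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Eigenvalue orthogonalization: correlated nodal variables -> uncorrelated ones.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # largest eigenvalues first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# A reduced set of uncorrelated variables often suffices: keep enough modes
# to capture, say, 99% of the total variance.
energy = np.cumsum(eigvals) / np.sum(eigvals)
m = int(np.searchsorted(energy, 0.99)) + 1

# Sample the discretized field from the m retained standard normal variables.
rng = np.random.default_rng(0)
xi = rng.standard_normal(m)
field = eigvecs[:, :m] @ (np.sqrt(eigvals[:m]) * xi)
print(m, field.shape)
```

    In a PFEM setting the second moments would then be propagated through the discretized equations of motion using only these m variables, rather than all n nodal values.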

  3. Adaptive Markov Random Fields for Example-Based Super-resolution of Faces

    NASA Astrophysics Data System (ADS)

    Stephenson, Todd A.; Chen, Tsuhan

    2006-12-01

    Image enhancement of low-resolution images can be done through methods such as interpolation, super-resolution using multiple video frames, and example-based super-resolution. Example-based super-resolution, in particular, is suited to images that have a strong prior (for those frameworks that work on only a single image, it is more like image restoration than traditional, multiframe super-resolution). For example, hallucination and Markov random field (MRF) methods use examples drawn from the same domain as the image being enhanced to determine what the missing high-frequency information is likely to be. We propose to use even stronger prior information by extending MRF-based super-resolution to use adaptive observation and transition functions, that is, to make these functions region-dependent. We show with face images how we can adapt the modeling for each image patch so as to improve the resolution.

  4. Localized surface plasmon enhanced cellular imaging using random metallic structures

    NASA Astrophysics Data System (ADS)

    Son, Taehwang; Lee, Wonju; Kim, Donghyun

    2017-02-01

    We have studied fluorescence cellular imaging with randomly distributed localized near-fields induced by silver nano-islands. For the fabrication of the nano-islands, a 10-nm silver thin film was evaporated on a BK7 glass substrate with a 2-nm chromium adhesion layer. Micrometer-sized silver square patterns were defined using e-beam lithography, and the film was then annealed at 200°C. Raw images were restored using the electric field distribution produced on the surface of the random nano-islands. The nano-islands were modeled from SEM images. A 488-nm p-polarized light source was set to be incident at 60°. Simulation results show that localized electric fields were created among the nano-islands and that their average size was 135 nm. Feasibility was tested using conventional total internal reflection fluorescence microscopy while the angle of incidence was adjusted to maximize field enhancement. Mouse macrophage cells were cultured on the nano-islands, and actin filaments were selectively stained with FITC-conjugated phalloidin. Acquired images were deconvolved based on linear imaging theory, in which the molecular distribution is sampled by the randomly distributed localized near-fields and blurred by the point spread function of the far-field optics. The optimum fluorophore distribution was probabilistically estimated by repetitively matching a raw image. The deconvolved images are estimated to have a resolution in the range of 100-150 nm, largely determined by the size of the localized near-fields. We also discuss and compare the results with images acquired with periodic nano-aperture arrays in various optical configurations to excite localized plasmonic fields and produce super-resolved molecular images.

  5. Traffic Video Image Segmentation Model Based on Bayesian and Spatio-Temporal Markov Random Field

    NASA Astrophysics Data System (ADS)

    Zhou, Jun; Bao, Xu; Li, Dawei; Yin, Yongwen

    2017-10-01

    Traffic video is a dynamic image sequence whose background and foreground change continually, producing occlusions; under such conditions, general-purpose methods struggle to segment the images accurately. A segmentation algorithm based on Bayesian inference and a spatio-temporal Markov random field (ST-MRF) is put forward. Energy-function models are built for the observation field and the label field of the motion image sequence, which possesses the Markov property. Following Bayes' rule, the interaction between the two fields, that is, the label field's prior probability combined with the observation field's likelihood, yields the maximum a posteriori estimate of the label field, and the ICM algorithm then extracts the moving objects, completing the segmentation. Finally, plain ST-MRF segmentation and the Bayesian ST-MRF method were compared. Experimental results show that the Bayesian ST-MRF algorithm requires less segmentation time and computation than ST-MRF alone, and it achieves better segmentation, especially in heavy-traffic dynamic scenes.
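
    The ICM-based extraction step can be illustrated with a minimal sketch: a noisy binary image plays the role of the observation field, and iterated conditional modes greedily minimizes a likelihood-plus-Potts-prior energy over the label field. The synthetic image, noise level, class means, and smoothness weight `beta` are invented for illustration and are not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "observation field": a binary square corrupted by Gaussian noise.
truth = np.zeros((32, 32))
truth[8:24, 8:24] = 1.0
obs = truth + 0.6 * rng.standard_normal(truth.shape)

labels = (obs > 0.5).astype(int)   # initial label field by thresholding
beta = 1.5                         # Potts smoothness weight (assumed value)
means = np.array([0.0, 1.0])       # class means of the observation model

def icm_sweep(labels, obs):
    """One ICM pass: at each pixel pick the label minimizing local energy."""
    h, w = labels.shape
    for i in range(h):
        for j in range(w):
            best, best_e = labels[i, j], np.inf
            for k in (0, 1):
                # likelihood energy of the observation ...
                e = (obs[i, j] - means[k]) ** 2
                # ... plus prior energy: penalty per disagreeing neighbour
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w and labels[ni, nj] != k:
                        e += beta
                if e < best_e:
                    best, best_e = k, e
            labels[i, j] = best
    return labels

for _ in range(5):
    labels = icm_sweep(labels, obs)
err = np.mean(labels != truth)
print(err)
```

    The spatio-temporal model in the paper additionally couples label fields across frames; this sketch shows only the spatial coupling that ICM exploits.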

  6. Micromechanics-based magneto-elastic constitutive modeling of particulate composites

    NASA Astrophysics Data System (ADS)

    Yin, Huiming

    Modified Green's functions are derived for three situations: a magnetic field caused by a local magnetization, a displacement field caused by a local body force and a displacement field caused by a local prescribed eigenstrain. Based on these functions, an explicit solution is derived for two magnetic particles embedded in the infinite medium under external magnetic and mechanical loading. A general solution for numerable magnetic particles embedded in an infinite domain is then provided in integral form. Two-phase composites containing spherical magnetic particles of the same size are considered for three kinds of microstructures. With chain-structured composites, particle interactions in the same chain are considered and a transversely isotropic effective elasticity is obtained. For periodic composites, an eight-particle interaction model is developed and provides a cubic symmetric effective elasticity. In the random composite, pair-wise particle interactions are integrated from all possible positions and an isotropic effective property is reached. This method is further extended to functionally graded composites. Magneto-mechanical behavior is studied for the chain-structured composite and the random composite. Effective magnetic permeability, effective magnetostriction and field-dependent effective elasticity are investigated. It is seen that the chain-structured composite is more sensitive to the magnetic field than the random composite; a composite consisting of only 5% of chain-structured particles can provide a larger magnetostriction and a larger change of effective elasticity than an equivalent composite consisting of 30% of random dispersed particles. Moreover, the effective shear modulus of the chain-structured composite rapidly increases with the magnetic field, while that for the random composite decreases. 
An effective hyperelastic constitutive model is further developed for a magnetostrictive particle-filled elastomer, which is sampled by using a network of body-centered cubic lattices of particles connected by macromolecular chains. The proposed hyperelastic model is able to characterize overall nonlinear elastic stress-stretch relations of the composites under general three-dimensional loading. It is seen that the effective strain energy density is proportional to the length of stretched chains in unit volume and volume fraction of particles.

  7. Recognizing suspicious activities in infrared imagery using appearance-based features and the theory of hidden conditional random fields for outdoor perimeter surveillance

    NASA Astrophysics Data System (ADS)

    Rogotis, Savvas; Palaskas, Christos; Ioannidis, Dimosthenis; Tzovaras, Dimitrios; Likothanassis, Spiros

    2015-11-01

    This work aims to present an extended framework for automatically recognizing suspicious activities in outdoor perimeter surveillance systems based on infrared video processing. By combining size-, speed-, and appearance-based features, such as local phase quantization and histograms of oriented gradients, actions of short duration are recognized and used as input, along with spatial information, for modeling target activities using the theory of hidden conditional random fields (HCRFs). HCRFs are used to classify an observation sequence into the most appropriate activity label class, thus discriminating high-risk activities like trespassing from zero-risk activities, such as loitering outside the perimeter. The effectiveness of this approach is demonstrated with experimental results in various scenarios that represent suspicious activities in perimeter surveillance systems.

  8. Construction of a stochastic model of track geometry irregularities and validation through experimental measurements of dynamic loading

    NASA Astrophysics Data System (ADS)

    Panunzio, Alfonso M.; Puel, G.; Cottereau, R.; Simon, S.; Quost, X.

    2017-03-01

    This paper describes the construction of a stochastic model of urban railway track geometry irregularities, based on experimental data. The considered irregularities are track gauge, superelevation, horizontal and vertical curvatures. They are modelled as random fields whose statistical properties are extracted from a large set of on-track measurements of the geometry of an urban railway network. About 300-1000 terms are used in the Karhunen-Loève/Polynomial Chaos expansions to represent the random fields with appropriate accuracy. The construction of the random fields is then validated by comparing on-track measurements of the contact forces and numerical dynamics simulations for different operational conditions (train velocity and car load) and horizontal layouts (alignment, curve). The dynamics simulations are performed both with and without randomly generated geometrical irregularities for the track. The power spectrum densities obtained from the dynamics simulations with the model of geometrical irregularities compare extremely well with those obtained from the experimental contact forces. Without irregularities, the spectrum is 10-50 dB too low.

  9. A sparse reconstruction method for the estimation of multiresolution emission fields via atmospheric inversion

    DOE PAGES

    Ray, J.; Lee, J.; Yadav, V.; ...

    2014-08-20

    We present a sparse reconstruction scheme that can also be used to ensure non-negativity when fitting wavelet-based random field models to limited observations in non-rectangular geometries. The method is relevant when multiresolution fields are estimated using linear inverse problems. Examples include the estimation of emission fields for many anthropogenic pollutants using atmospheric inversion or hydraulic conductivity in aquifers from flow measurements. The scheme is based on three new developments. Firstly, we extend an existing sparse reconstruction method, Stagewise Orthogonal Matching Pursuit (StOMP), to incorporate prior information on the target field. Secondly, we develop an iterative method that uses StOMP to impose non-negativity on the estimated field. Finally, we devise a method, based on compressive sensing, to limit the estimated field within an irregularly shaped domain. We demonstrate the method on the estimation of fossil-fuel CO₂ (ffCO₂) emissions in the lower 48 states of the US. The application uses a recently developed multiresolution random field model and synthetic observations of ffCO₂ concentrations from a limited set of measurement sites. We find that our method for limiting the estimated field within an irregularly shaped region is about a factor of 10 faster than conventional approaches. It also reduces the overall computational cost by a factor of two. Further, the sparse reconstruction scheme imposes non-negativity without introducing strong nonlinearities, such as those introduced by employing log-transformed fields, and thus reaps the benefits of simplicity and computational speed that are characteristic of linear inverse problems.
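
    The non-negative greedy fitting idea can be sketched with a much-simplified stand-in for StOMP: plain orthogonal matching pursuit on a toy linear inverse problem, with a crude clipping step for non-negativity. The actual StOMP stagewise thresholding, the prior information, and the compressive-sensing domain restriction are not reproduced; all sizes and data here are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy linear inverse problem y = A x + noise with a sparse, non-negative x.
n_obs, n_unknown, n_active = 40, 100, 5
A = rng.standard_normal((n_obs, n_unknown))
x_true = np.zeros(n_unknown)
x_true[rng.choice(n_unknown, n_active, replace=False)] = rng.uniform(1, 3, n_active)
y = A @ x_true + 0.01 * rng.standard_normal(n_obs)

# Greedy pursuit: pick the column most positively correlated with the
# residual, re-fit the selected columns by least squares, clip negatives.
support, x_hat = [], np.zeros(n_unknown)
residual = y.copy()
for _ in range(n_active):
    scores = A.T @ residual
    k = int(np.argmax(scores))          # positive correlation only
    if k not in support:
        support.append(k)
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    coef = np.clip(coef, 0.0, None)     # crude non-negativity projection
    x_hat[:] = 0.0
    x_hat[support] = coef
    residual = y - A @ x_hat

print(np.linalg.norm(residual) / np.linalg.norm(y))
```

    In the paper the analogue of `A` maps multiresolution wavelet coefficients of the emission field to concentration observations through an atmospheric transport model.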

  10. Reducing Achievement Gaps in Academic Writing for Latinos and English Learners in Grades 7-12

    ERIC Educational Resources Information Center

    Olson, Carol Booth; Matuchniak, Tina; Chung, Huy Q.; Stumpf, Rachel; Farkas, George

    2017-01-01

    This study reports 2 years of findings from a randomized controlled trial designed to replicate and demonstrate the efficacy of an existing, successful professional development program, the Pathway Project, that uses a cognitive strategies approach to text-based analytical writing. Building on an earlier randomized field trial in a large, urban,…

  11. Fractal planetary rings: Energy inequalities and random field model

    NASA Astrophysics Data System (ADS)

    Malyarenko, Anatoliy; Ostoja-Starzewski, Martin

    2017-12-01

    This study is motivated by a recent observation, based on photographs from the Cassini mission, that Saturn’s rings have a fractal structure in radial direction. Accordingly, two questions are considered: (1) What Newtonian mechanics argument in support of such a fractal structure of planetary rings is possible? (2) What kinematics model of such fractal rings can be formulated? Both challenges are based on taking planetary rings’ spatial structure as being statistically stationary in time and statistically isotropic in space, but statistically nonstationary in space. An answer to the first challenge is given through an energy analysis of circular rings having a self-generated, noninteger-dimensional mass distribution [V. E. Tarasov, Int. J. Mod Phys. B 19, 4103 (2005)]. The second issue is approached by taking the random field of angular velocity vector of a rotating particle of the ring as a random section of a special vector bundle. Using the theory of group representations, we prove that such a field is completely determined by a sequence of continuous positive-definite matrix-valued functions defined on the Cartesian square F2 of the radial cross-section F of the rings, where F is a fat fractal.

  12. THE WESTERN LAKE SUPERIOR COMPARATIVE WATERSHED FRAMEWORK: A FIELD TEST OF GEOGRAPHICALLY-DEPENDENT VS. THRESHOLD-BASED GEOGRAPHICALLY-INDEPENDENT CLASSIFICATION

    EPA Science Inventory

    Stratified random selection of watersheds allowed us to compare geographically-independent classification schemes based on watershed storage (wetland + lake area/watershed area) and forest fragmentation with a geographically-based classification scheme within the Northern Lakes a...

  13. The Investigation of Attitude Changes of Elementary Preservice Teachers in a Competency-Based, Field-Oriented Science Methods Course and Attitude Changes of Classroom Teachers Cooperating with the Field Component.

    ERIC Educational Resources Information Center

    Piper, Martha K.

    Thirty-six students enrolled in an elementary science methods course were randomly selected and given an instrument using Osgood's semantic differential approach the first week of class, the sixth week on campus prior to field experiences, and the thirteenth week following field experiences. The elementary teachers who had observed the university…

  14. Spectral turning bands for efficient Gaussian random fields generation on GPUs and accelerators

    NASA Astrophysics Data System (ADS)

    Hunger, L.; Cosenza, B.; Kimeswenger, S.; Fahringer, T.

    2015-11-01

    A random field (RF) is a set of correlated random variables associated with different spatial locations. RF generation algorithms are of crucial importance for many scientific areas, such as astrophysics, geostatistics, computer graphics, and many others. Current approaches commonly make use of 3D fast Fourier transform (FFT), which does not scale well for RF bigger than the available memory; they are also limited to regular rectilinear meshes. We introduce random field generation with the turning band method (RAFT), an RF generation algorithm based on the turning band method that is optimized for massively parallel hardware such as GPUs and accelerators. Our algorithm replaces the 3D FFT with a lower-order, one-dimensional FFT followed by a projection step and is further optimized with loop unrolling and blocking. RAFT can easily generate RF on non-regular (non-uniform) meshes and efficiently produce fields with mesh sizes bigger than the available device memory by using a streaming, out-of-core approach. Our algorithm generates RF with the correct statistical behavior and is tested on a variety of modern hardware, such as NVIDIA Tesla, AMD FirePro and Intel Phi. RAFT is faster than the traditional methods on regular meshes and has been successfully applied to two real case scenarios: planetary nebulae and cosmological simulations.
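
    The turning-band idea of replacing a full 3D FFT with lower-dimensional processes projected along random directions can be sketched in 2D: each band contributes a random cosine process evaluated at the projection of every grid point, and the normalized sum is approximately a stationary Gaussian field. The grid, the frequency distribution, and the number of bands are illustrative assumptions, not RAFT's actual choices.

```python
import numpy as np

rng = np.random.default_rng(3)

# 2D grid of points where the field is evaluated (works equally for
# non-regular point sets, since only point coordinates are used).
n = 64
xs, ys = np.meshgrid(np.linspace(0, 10, n), np.linspace(0, 10, n))
pts = np.stack([xs.ravel(), ys.ravel()], axis=1)

# Each band: a random direction plus a 1D random cosine process along it.
# Summing many bands gives an approximately Gaussian stationary field;
# the sampling distribution of the frequencies sets the covariance.
n_bands = 200
field = np.zeros(len(pts))
for _ in range(n_bands):
    theta = rng.uniform(0, np.pi)
    direction = np.array([np.cos(theta), np.sin(theta)])
    proj = pts @ direction                  # project points onto the band
    freq = rng.uniform(0.5, 3.0)            # assumed spectral sampling
    phase = rng.uniform(0, 2 * np.pi)
    field += np.cos(freq * proj + phase)
field *= np.sqrt(2.0 / n_bands)             # normalise to unit variance
field = field.reshape(n, n)
print(field.mean(), field.var())
```

    Because each band is a one-dimensional computation, the per-point work is trivially parallel, which is what makes the method attractive for GPUs and for out-of-core streaming over meshes larger than device memory.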

  15. Additional motional-magnetic-field considerations for electric-dipole-moment experiments

    NASA Astrophysics Data System (ADS)

    Lamoreaux, S. K.

    1996-06-01

    Electric-dipole-moment experiments based on spin-precession measurements of stored atoms or neutrons are generally considered to be immune from the effects of v×E or motional magnetic fields. This is because the average velocity for such systems is zero. We show here that the fluctuating field associated with the random velocity, heretofore not considered, can in fact lead to sizable systematic effects.

  16. Reduced Wiener Chaos representation of random fields via basis adaptation and projection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsilifis, Panagiotis, E-mail: tsilifis@usc.edu; Department of Civil Engineering, University of Southern California, Los Angeles, CA 90089; Ghanem, Roger G., E-mail: ghanem@usc.edu

    2017-07-15

    A new characterization of random fields appearing in physical models is presented that is based on their well-known Homogeneous Chaos expansions. We take advantage of the adaptation capabilities of these expansions where the core idea is to rotate the basis of the underlying Gaussian Hilbert space, in order to achieve reduced functional representations that concentrate the induced probability measure in a lower dimensional subspace. For a smooth family of rotations along the domain of interest, the uncorrelated Gaussian inputs are transformed into a Gaussian process, thus introducing a mesoscale that captures intermediate characteristics of the quantity of interest.

  17. Reduced Wiener Chaos representation of random fields via basis adaptation and projection

    NASA Astrophysics Data System (ADS)

    Tsilifis, Panagiotis; Ghanem, Roger G.

    2017-07-01

    A new characterization of random fields appearing in physical models is presented that is based on their well-known Homogeneous Chaos expansions. We take advantage of the adaptation capabilities of these expansions where the core idea is to rotate the basis of the underlying Gaussian Hilbert space, in order to achieve reduced functional representations that concentrate the induced probability measure in a lower dimensional subspace. For a smooth family of rotations along the domain of interest, the uncorrelated Gaussian inputs are transformed into a Gaussian process, thus introducing a mesoscale that captures intermediate characteristics of the quantity of interest.

  18. Rigorous Program Evaluations on a Budget: How Low-Cost Randomized Controlled Trials Are Possible in Many Areas of Social Policy

    ERIC Educational Resources Information Center

    Coalition for Evidence-Based Policy, 2012

    2012-01-01

    The increasing ability of social policy researchers to conduct randomized controlled trials (RCTs) at low cost could revolutionize the field of performance-based government. RCTs are widely judged to be the most credible method of evaluating whether a social program is effective, overcoming the demonstrated inability of other, more common methods…

  19. The mean field theory in EM procedures for blind Markov random field image restoration.

    PubMed

    Zhang, J

    1993-01-01

    A Markov random field (MRF) model-based EM (expectation-maximization) procedure for simultaneously estimating the degradation model and restoring the image is described. The MRF is a coupled one which provides continuity (inside regions of smooth gray tones) and discontinuity (at region boundaries) constraints for the restoration problem which is, in general, ill posed. The computational difficulty associated with the EM procedure for MRFs is resolved by using the mean field theory from statistical mechanics. An orthonormal blur decomposition is used to reduce the chances of undesirable locally optimal estimates. Experimental results on synthetic and real-world images show that this approach provides good blur estimates and restored images. The restored images are comparable to those obtained by a Wiener filter in mean-square error, but are more visually pleasing.

  20. Semantic segmentation of 3D textured meshes for urban scene analysis

    NASA Astrophysics Data System (ADS)

    Rouhani, Mohammad; Lafarge, Florent; Alliez, Pierre

    2017-01-01

    Classifying 3D measurement data has become a core problem in photogrammetry and 3D computer vision, since the rise of modern multiview geometry techniques, combined with affordable range sensors. We introduce a Markov Random Field-based approach for segmenting textured meshes generated via multi-view stereo into urban classes of interest. The input mesh is first partitioned into small clusters, referred to as superfacets, from which geometric and photometric features are computed. A random forest is then trained to predict the class of each superfacet as well as its similarity with the neighboring superfacets. Similarity is used to assign the weights of the Markov Random Field pairwise-potential and to account for contextual information between the classes. The experimental results illustrate the efficacy and accuracy of the proposed framework.

  1. Critical exponents for diluted resistor networks

    NASA Astrophysics Data System (ADS)

    Stenull, O.; Janssen, H. K.; Oerding, K.

    1999-05-01

    An approach by Stephen [Phys. Rev. B 17, 4444 (1978)] is used to investigate the critical properties of randomly diluted resistor networks near the percolation threshold by means of renormalized field theory. We reformulate an existing field theory by Harris and Lubensky [Phys. Rev. B 35, 6964 (1987)]. By a decomposition of the principal Feynman diagrams, we obtain diagrams which again can be interpreted as resistor networks. This interpretation provides an alternative way of evaluating the Feynman diagrams for random resistor networks. We calculate the resistance crossover exponent φ up to second order in ε = 6 − d, where d is the spatial dimension. Our result φ = 1 + ε/42 + 4ε²/3087 verifies a previous calculation by Lubensky and Wang, which itself was based on the Potts-model formulation of the random resistor network.
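
    As a quick numerical reading of the second-order expansion, φ can be evaluated at a few spatial dimensions (this is a purely illustrative evaluation of the quoted formula):

```python
# Evaluate phi = 1 + eps/42 + 4*eps^2/3087 with eps = 6 - d
# for a few spatial dimensions d.
for d in (3, 4, 5):
    eps = 6 - d
    phi = 1 + eps / 42 + 4 * eps**2 / 3087
    print(d, round(phi, 4))
```

    The correction terms are small, so the crossover exponent stays close to 1 in all physical dimensions below the upper critical dimension 6.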

  2. Fuzzy Markov random fields versus chains for multispectral image segmentation.

    PubMed

    Salzenstein, Fabien; Collet, Christophe

    2006-11-01

    This paper deals with a comparison of recent statistical models based on fuzzy Markov random fields and chains for multispectral image segmentation. The fuzzy scheme takes into account discrete and continuous classes which model the imprecision of the hidden data. In this framework, we assume the dependence between bands and we express the general model for the covariance matrix. A fuzzy Markov chain model is developed in an unsupervised way. This method is compared with the fuzzy Markovian field model previously proposed by one of the authors. The segmentation task is processed with Bayesian tools, such as the well-known MPM (Mode of Posterior Marginals) criterion. Our goal is to compare the robustness and rapidity for both methods (fuzzy Markov fields versus fuzzy Markov chains). Indeed, such fuzzy-based procedures seem to be a good answer, e.g., for astronomical observations when the patterns present diffuse structures. Moreover, these approaches allow us to process missing data in one or several spectral bands which correspond to specific situations in astronomy. To validate both models, we perform and compare the segmentation on synthetic images and raw multispectral astronomical data.

  3. Scalable hierarchical PDE sampler for generating spatially correlated random fields using nonmatching meshes: Scalable hierarchical PDE sampler using nonmatching meshes

    DOE PAGES

    Osborn, Sarah; Zulian, Patrick; Benson, Thomas; ...

    2018-01-30

    This work describes a domain embedding technique between two nonmatching meshes used for generating realizations of spatially correlated random fields with applications to large-scale sampling-based uncertainty quantification. The goal is to apply the multilevel Monte Carlo (MLMC) method for the quantification of output uncertainties of PDEs with random input coefficients on general and unstructured computational domains. We propose a highly scalable, hierarchical sampling method to generate realizations of a Gaussian random field on a given unstructured mesh by solving a reaction–diffusion PDE with a stochastic right-hand side. The stochastic PDE is discretized using the mixed finite element method on an embedded domain with a structured mesh, and then, the solution is projected onto the unstructured mesh. This work describes implementation details on how to efficiently transfer data from the structured and unstructured meshes at coarse levels, assuming that this can be done efficiently on the finest level. We investigate the efficiency and parallel scalability of the technique for the scalable generation of Gaussian random fields in three dimensions. An application of the MLMC method is presented for quantifying uncertainties of subsurface flow problems. Here, we demonstrate the scalability of the sampling method with nonmatching mesh embedding, coupled with a parallel forward model problem solver, for large-scale 3D MLMC simulations with up to 1.9·10⁹ unknowns.
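
    The PDE-sampling idea, solving a reaction-diffusion equation with a white-noise right-hand side to realize a Gaussian random field, can be sketched in one dimension with a dense finite-difference solve. The paper's mixed finite elements, mesh embedding, and parallel multilevel machinery are omitted, and all parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# 1D structured mesh; (kappa^2 - Laplacian) u = white noise yields a
# Matern-like Gaussian random field with correlation length ~ 1/kappa.
n = 200
h = 1.0 / n
kappa = 10.0
main = np.full(n, kappa**2 + 2.0 / h**2)
off = np.full(n - 1, -1.0 / h**2)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

# Stochastic right-hand side: mesh-scaled white noise.
w = rng.standard_normal(n) / np.sqrt(h)

# One realization of the field is one linear solve; drawing many samples
# reuses the same operator with fresh noise vectors.
u = np.linalg.solve(A, w)
print(u.shape)
```

    In the paper this solve happens on a structured embedding mesh at each MLMC level, after which the realization is projected onto the unstructured application mesh.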

  4. Scalable hierarchical PDE sampler for generating spatially correlated random fields using nonmatching meshes: Scalable hierarchical PDE sampler using nonmatching meshes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Sarah; Zulian, Patrick; Benson, Thomas

    This work describes a domain embedding technique between two nonmatching meshes used for generating realizations of spatially correlated random fields with applications to large-scale sampling-based uncertainty quantification. The goal is to apply the multilevel Monte Carlo (MLMC) method for the quantification of output uncertainties of PDEs with random input coefficients on general and unstructured computational domains. We propose a highly scalable, hierarchical sampling method to generate realizations of a Gaussian random field on a given unstructured mesh by solving a reaction–diffusion PDE with a stochastic right-hand side. The stochastic PDE is discretized using the mixed finite element method on an embedded domain with a structured mesh, and then, the solution is projected onto the unstructured mesh. This work describes implementation details on how to efficiently transfer data from the structured and unstructured meshes at coarse levels, assuming that this can be done efficiently on the finest level. We investigate the efficiency and parallel scalability of the technique for the scalable generation of Gaussian random fields in three dimensions. An application of the MLMC method is presented for quantifying uncertainties of subsurface flow problems. Here, we demonstrate the scalability of the sampling method with nonmatching mesh embedding, coupled with a parallel forward model problem solver, for large-scale 3D MLMC simulations with up to 1.9·10⁹ unknowns.

  5. Suspicious activity recognition in infrared imagery using Hidden Conditional Random Fields for outdoor perimeter surveillance

    NASA Astrophysics Data System (ADS)

    Rogotis, Savvas; Ioannidis, Dimosthenis; Tzovaras, Dimitrios; Likothanassis, Spiros

    2015-04-01

    The aim of this work is to present a novel approach for automatic recognition of suspicious activities in outdoor perimeter surveillance systems based on infrared video processing. Through the combination of size-, speed-, and appearance-based features, like the Center-Symmetric Local Binary Patterns, short-term actions are identified and serve as input, along with user location, for modeling target activities using the theory of Hidden Conditional Random Fields. HCRFs are used to directly link a set of observations to the most appropriate activity label and as such to discriminate high-risk activities (e.g., trespassing) from zero-risk activities (e.g., loitering outside the perimeter). Experimental results demonstrate the effectiveness of our approach in identifying suspicious activities for video surveillance systems.

  6. Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao

    2016-07-15

    We present a real-time and fully integrated quantum random number generator (QRNG) by measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is featured for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies on the limited speed of randomness extraction. To close the gap between the fast randomness generation and the slow post-processing, we propose a pipeline extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction can reach 3.2 Gbps.
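Toeplitz matrix hashing, the extraction step named above, compresses raw bits by multiplying them with a random Toeplitz matrix over GF(2). A minimal reference implementation of that multiplication (the FPGA pipeline in the paper is far more elaborate) might look like this:

```python
def toeplitz_extract(raw_bits, out_len, diag_bits):
    """Multiply an out_len x len(raw_bits) Toeplitz matrix by raw_bits over GF(2).
    The matrix is fully defined by the out_len + len(raw_bits) - 1 bits in
    diag_bits, since entry T[i][j] depends only on i - j."""
    n = len(raw_bits)
    assert len(diag_bits) == out_len + n - 1
    out = []
    for i in range(out_len):
        acc = 0
        for j in range(n):
            acc ^= diag_bits[i - j + n - 1] & raw_bits[j]  # XOR-accumulate mod 2
        out.append(acc)
    return out
```

In practice `diag_bits` is a fixed random seed string, and the compression ratio `out_len / n` is chosen from the estimated min-entropy of the raw source.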

  7. Dynamical behavior of the random field on the pulsating and snaking solitons in cubic-quintic complex Ginzburg-Landau equation

    NASA Astrophysics Data System (ADS)

    Bakhtiar, Nurizatul Syarfinas Ahmad; Abdullah, Farah Aini; Hasan, Yahya Abu

    2017-08-01

    In this paper, we consider the dynamical behaviour of a random field on the pulsating and snaking solitons in a dissipative system described by the one-dimensional cubic-quintic complex Ginzburg-Landau equation (cqCGLE). The dynamical behaviour of the random field was simulated by adding a random field to the initial pulse. We then solve the equation numerically, fixing the initial amplitude profile for the pulsating and snaking solitons without loss of generality. To create the random field, we choose 0 ≤ ɛ ≤ 1.0. As a result, multiple soliton trains are formed when the random field is applied to a pulse-like initial profile for the parameters of the pulsating and snaking solitons. The results also show the effects of varying the random field on the transient energy peaks of the pulsating and snaking solitons.

  8. Random technique to encode complex valued holograms with on axis reconstruction onto phase-only displays.

    PubMed

    Luis Martínez Fuentes, Jose; Moreno, Ignacio

    2018-03-05

    A new technique for encoding the amplitude and phase of diffracted fields in digital holography is proposed. It is based on a random spatial multiplexing of two phase-only diffractive patterns. The first one is the phase information of the intended pattern, while the second one is a diverging optical element whose purpose is the control of the amplitude. A random number determines the choice between these two diffractive patterns at each pixel, and the amplitude information of the desired field governs its discrimination threshold. This proposed technique is computationally fast and does not require iterative methods, and the complex field reconstruction appears on axis. We experimentally demonstrate this new encoding technique with holograms implemented onto a flicker-free phase-only spatial light modulator (SLM), which allows the axial generation of such holograms. The experimental verification includes the phase measurement of generated patterns with a phase-shifting polarization interferometer implemented in the same experimental setup.
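The per-pixel rule described above can be sketched directly. This is an illustrative reading of the technique, not the authors' exact recipe: the wavelength, pixel pitch and lens focal length below are arbitrary placeholder values.

```python
import cmath
import math
import random

def encode_random_multiplex(target_field, wavelength=633e-9, focal=-0.5,
                            pixel_pitch=8e-6, seed=0):
    """Phase-only encoding of a complex field by random spatial multiplexing:
    each pixel shows either the target phase or a diverging-lens phase, chosen
    at random with probability given by the normalized target amplitude."""
    rng = random.Random(seed)
    rows, cols = len(target_field), len(target_field[0])
    amp_max = max(abs(v) for row in target_field for v in row) or 1.0
    phase_out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            v = target_field[i][j]
            if rng.random() < abs(v) / amp_max:
                phase_out[i][j] = cmath.phase(v)  # signal phase
            else:
                # Diverging-lens phase sends this pixel's light off-axis,
                # lowering the on-axis amplitude where the target is dim.
                x = (j - cols / 2) * pixel_pitch
                y = (i - rows / 2) * pixel_pitch
                phase_out[i][j] = (math.pi / (wavelength * focal)
                                   * (x * x + y * y)) % (2 * math.pi)
    return phase_out
```

Since the choice at each pixel is an independent draw, no iterative optimization is needed, matching the paper's claim of computational speed.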

  9. Real-time fast physical random number generator with a photonic integrated circuit.

    PubMed

    Ugajin, Kazusa; Terashima, Yuta; Iwakawa, Kento; Uchida, Atsushi; Harayama, Takahisa; Yoshimura, Kazuyuki; Inubushi, Masanobu

    2017-03-20

    Random number generators are essential for applications in information security and numerical simulations. Most optical-chaos-based random number generators produce random bit sequences by offline post-processing with large optical components. We demonstrate a real-time hardware implementation of a fast physical random number generator with a photonic integrated circuit and a field programmable gate array (FPGA) electronic board. We generate 1-Tbit random bit sequences and evaluate their statistical randomness using NIST Special Publication 800-22 and TestU01. All of the BigCrush tests in TestU01 are passed using 410-Gbit random bit sequences. A maximum real-time generation rate of 21.1 Gb/s is achieved for random bit sequences in binary format stored in a computer, which can be directly used for applications involving secret keys in cryptography and random seeds in large-scale numerical simulations.

  10. Ferroelectric field-effect transistors based on solution-processed electrochemically exfoliated graphene

    NASA Astrophysics Data System (ADS)

    Heidler, Jonas; Yang, Sheng; Feng, Xinliang; Müllen, Klaus; Asadi, Kamal

    2018-06-01

    Memories based on graphene that could be mass-produced using low-cost methods have not yet received much attention. Here we demonstrate graphene ferroelectric (dual-gate) field-effect transistors. The graphene was obtained by electrochemical exfoliation of graphite. Field-effect transistors are realized using a monolayer of graphene flakes deposited by the Langmuir-Blodgett protocol. Ferroelectric field-effect transistor memories are realized using a random ferroelectric copolymer, poly(vinylidenefluoride-co-trifluoroethylene), in a top-gated geometry. The memory transistors reveal ambipolar behaviour with both electron and hole accumulation channels. We show that the non-ferroelectric bottom gate can be advantageously used to tune the on/off ratio.

  11. Vortex-Core Reversal Dynamics: Towards Vortex Random Access Memory

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Koog

    2011-03-01

    An energy-efficient, ultrahigh-density, ultrafast, and nonvolatile solid-state universal memory is a long-held dream in the field of information-storage technology. The magnetic random access memory (MRAM), along with a spin-transfer-torque switching mechanism, is a strong candidate for realizing that dream, given its nonvolatility, infinite endurance, and fast random access. Magnetic vortices in patterned soft magnetic dots promise ground-breaking applications in information-storage devices, owing to the very stable twofold ground states of either their upward or downward core magnetization orientation and plausible core switching by in-plane alternating magnetic fields or spin-polarized currents. However, the two most important and technically challenging issues, low-power recording and reliable selection of each memory cell within existing cross-point architectures, have not yet been resolved for the basic operations in information storage, that is, writing (recording) and readout. Here, we experimentally demonstrate a magnetic vortex random access memory (VRAM) in the basic cross-point architecture. This unique VRAM offers reliable cell selection and low-power-consumption control of the switching of out-of-plane core magnetizations using specially designed rotating magnetic fields generated by two orthogonal and unipolar Gaussian-pulse currents with optimized pulse width and time delay. Our achievement of a new device based on a new medium composed of patterned vortex-state disks, together with new physics on ultrafast vortex-core switching dynamics, can stimulate further fruitful research on MRAMs based on vortex-state dot arrays.

  12. Linear and angular coherence momenta in the classical second-order coherence theory of vector electromagnetic fields.

    PubMed

    Wang, Wei; Takeda, Mitsuo

    2006-09-01

    A new concept of vector and tensor densities is introduced into the general coherence theory of vector electromagnetic fields that is based on energy and energy-flow coherence tensors. Related coherence conservation laws are presented in the form of continuity equations that provide new insights into the propagation of second-order correlation tensors associated with stationary random classical electromagnetic fields.

  13. RF-Phos: A Novel General Phosphorylation Site Prediction Tool Based on Random Forest.

    PubMed

    Ismail, Hamid D; Jones, Ahoi; Kim, Jung H; Newman, Robert H; Kc, Dukka B

    2016-01-01

    Protein phosphorylation is one of the most widespread regulatory mechanisms in eukaryotes. Over the past decade, phosphorylation site prediction has emerged as an important problem in the field of bioinformatics. Here, we report a new method, termed Random Forest-based Phosphosite predictor 2.0 (RF-Phos 2.0), to predict phosphorylation sites given only the primary amino acid sequence of a protein as input. RF-Phos 2.0, which uses random forest with sequence and structural features, is able to identify putative sites of phosphorylation across many protein families. In side-by-side comparisons based on 10-fold cross validation and an independent dataset, RF-Phos 2.0 compares favorably to other popular mammalian phosphosite prediction methods, such as PhosphoSVM, GPS2.1, and Musite.
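The kind of input such a predictor consumes can be sketched as follows. This is a generic sequence-window encoding, not the actual RF-Phos 2.0 feature set (which also includes structural features): candidate S/T/Y sites are located, a fixed-width window around each is extracted, and the window is one-hot encoded into the binary vector a random forest would train on.

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def site_windows(sequence, half_width=4):
    """Yield (position, window) for each candidate phosphosite (S, T or Y),
    padding with '-' at the sequence ends so every window has equal width."""
    padded = "-" * half_width + sequence + "-" * half_width
    for pos, residue in enumerate(sequence):
        if residue in "STY":
            yield pos, padded[pos: pos + 2 * half_width + 1]

def one_hot(window):
    """Flatten a window into a binary feature vector (20 bits per position);
    padding characters encode as all zeros."""
    vec = []
    for residue in window:
        vec.extend(1 if residue == aa else 0 for aa in AMINO_ACIDS)
    return vec
```

Each vector, labeled by whether the site is experimentally phosphorylated, would then feed an off-the-shelf random forest trainer.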

  14. On random field Completely Automated Public Turing Test to Tell Computers and Humans Apart generation.

    PubMed

    Kouritzin, Michael A; Newton, Fraser; Wu, Biao

    2013-04-01

    Herein, we propose generating CAPTCHAs through random field simulation and give a novel, effective and efficient algorithm to do so. Indeed, we demonstrate that sufficient information about word tests for easy human recognition is contained in the site marginal probabilities and the site-to-nearby-site covariances and that these quantities can be embedded directly into certain conditional probabilities, designed for effective simulation. The CAPTCHAs are then partial random realizations of the random CAPTCHA word. We start with an initial random field (e.g., randomly scattered letter pieces) and use Gibbs resampling to re-simulate portions of the field repeatedly using these conditional probabilities until the word becomes human-readable. The residual randomness from the initial random field together with the random implementation of the CAPTCHA word provide significant resistance to attack. This results in a CAPTCHA, which is unrecognizable to modern optical character recognition but is recognized about 95% of the time in a human readability study.
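The Gibbs resampling idea can be illustrated on a toy binary field. This is a simplified Ising-style sketch, not the paper's algorithm: real CAPTCHA generation embeds letter-piece marginals and site-to-nearby-site covariances in the conditionals, whereas here the conditional at each site just combines nearest-neighbour coupling with a bias toward a target word mask, so the field drifts toward readability while keeping residual randomness. The `coupling` and `bias` values are arbitrary.

```python
import math
import random

def gibbs_captcha(mask, sweeps=20, coupling=1.5, bias=2.0, seed=1):
    """Start from a random binary field and repeatedly resample each site from
    a conditional biased toward the target mask (a toy sketch of the idea)."""
    rows, cols = len(mask), len(mask[0])
    rng = random.Random(seed)
    field = [[rng.randint(0, 1) for _ in range(cols)] for _ in range(rows)]
    for _ in range(sweeps):
        for i in range(rows):
            for j in range(cols):
                # Local field for setting site (i, j) to 1 rather than 0.
                h = bias * (2 * mask[i][j] - 1)
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < rows and 0 <= nj < cols:
                        h += coupling * (2 * field[ni][nj] - 1)
                p_one = 1.0 / (1.0 + math.exp(-2 * h))
                field[i][j] = 1 if rng.random() < p_one else 0
    return field
```

After a few sweeps the field agrees with the mask on most sites while individual pixels remain stochastic, which is the property the paper exploits against OCR.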

  15. Seismic random noise attenuation method based on empirical mode decomposition of Hausdorff dimension

    NASA Astrophysics Data System (ADS)

    Yan, Z.; Luan, X.

    2017-12-01

    Introduction: Empirical mode decomposition (EMD) is a noise-suppression algorithm based on wave-field separation, exploiting the scale differences between the effective signal and noise. However, because the complexity of the real seismic wave field produces serious mode aliasing, denoising with this method alone is neither ideal nor effective. Building on the multi-scale decomposition characteristics of the EMD algorithm and combining them with Hausdorff-dimension constraints, we propose a new method for seismic random noise attenuation. First, we apply the EMD algorithm to adaptively decompose the seismic data and obtain a series of intrinsic mode functions (IMFs) at different scales. Based on the difference in Hausdorff dimension between effective signals and random noise, we identify the IMF components mixed with random noise. We then apply threshold correlation filtering to separate the valid signal from the random noise effectively. Compared with the traditional EMD method, the results show that the new method achieves better suppression of seismic random noise. Implementation: The EMD algorithm is used to decompose the seismic signals into IMF sets and analyze their spectra. Since most of the random noise is high-frequency noise, the IMF sets can be divided into three categories: the first is the effective-wave content at larger scales; the second is the noise at smaller scales; the third comprises the IMF components containing a mixture of signal and random noise. The third kind of IMF component is then processed by the Hausdorff-dimension algorithm, selecting an appropriate time-window size, initial step and increment to calculate the instantaneous Hausdorff dimension of each component. The dimension of the random noise lies between 1.0 and 1.05, while the dimension of the effective wave lies between 1.05 and 2.0.
On the basis of the previous steps, and according to the dimension difference between the random noise and the effective signal, we extract, for each IMF component, the sample points whose fractal dimension is less than or equal to 1.05, thereby separating out the residual noise. Reconstructing from the IMF components after dimension filtering, together with the effective-wave IMF components from the first selection, we obtain the denoised result.
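The dimension-thresholding step relies on estimating a fractal (box-counting) dimension of a sampled curve. A generic estimator is sketched below; the paper computes a windowed instantaneous dimension, whereas this version scores a whole signal, so it only illustrates why noise scores higher than a smooth arrival.

```python
import math
import random

def box_counting_dimension(signal, scales=(4, 8, 16, 32, 64)):
    """Estimate the box-counting dimension of a sampled curve by counting
    occupied grid cells at several scales and fitting log N(s) vs log s."""
    n = len(signal)
    lo, hi = min(signal), max(signal)
    span = (hi - lo) or 1.0
    xs, ys = [], []
    for s in scales:
        occupied = set()
        for k in range(n):
            col = min(int(k * s / n), s - 1)
            row = min(int((signal[k] - lo) / span * s), s - 1)
            occupied.add((col, row))
        xs.append(math.log(s))
        ys.append(math.log(len(occupied)))
    # Least-squares slope of log N(s) against log s.
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

A smooth ramp scores near 1, while white noise scores well above it, which is the separation the 1.05 threshold in the text exploits.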

  16. Spectral filtering of gradient for l2-norm frequency-domain elastic waveform inversion

    NASA Astrophysics Data System (ADS)

    Oh, Ju-Won; Min, Dong-Joo

    2013-05-01

    To enhance the robustness of l2-norm elastic full-waveform inversion (FWI), we propose a denoise function that is incorporated into the single-frequency gradients. Because field data are noisy and modelled data are noise-free, the denoise function is designed based on the ratio of modelled data to field data summed over shots and receivers. We first take the sums of the modelled data and field data over shots, then take the sums of the absolute values of the resultant modelled and field data over the receivers. Due to the monochromatic property of wavefields at each frequency, signals in both the modelled and field data tend to be cancelled out or maintained, whereas certain types of noise, particularly random noise, are amplified in the field data. As a result, the spectral distribution of the denoise function is inversely proportional to the noise-to-signal ratio at each frequency, which helps prevent noise-dominant gradients from contributing to model parameter updates. Numerical examples show that the spectral distribution of the denoise function resembles a frequency filter determined by the spectrum of the signal-to-noise (S/N) ratio during the inversion process, with little human intervention. The denoise function is applied to the elastic FWI of synthetic data generated from a modified version of the Marmousi-2 model and contaminated with three types of random noise: white, low-frequency and high-frequency. Based on the spectrum of S/N ratios at each frequency, the denoise function mainly suppresses noise-dominant single-frequency gradients, which improves the inversion results at the cost of spatial resolution.
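The shot-then-receiver summation recipe above reduces to a few lines for a single frequency. The data layout below (a `[shot][receiver]` table of complex samples) is an assumed convention for illustration:

```python
def denoise_weight(modelled, observed):
    """Single-frequency denoise weight following the recipe in the text:
    sum the complex data over shots, take absolute values, sum over
    receivers, then form the modelled/observed ratio. Both arguments are
    [shot][receiver] lists of complex samples at one frequency."""
    n_rec = len(modelled[0])
    num = sum(abs(sum(shot[r] for shot in modelled)) for r in range(n_rec))
    den = sum(abs(sum(shot[r] for shot in observed)) for r in range(n_rec))
    return num / den if den else 0.0
```

Coherent signal survives the shot sum in both datasets, so the ratio stays near 1; incoherent noise inflates only the observed-data denominator, pulling the weight (and hence that frequency's gradient contribution) down.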

  17. Determination of the Spatial Distribution in Hydraulic Conductivity Using Genetic Algorithm Optimization

    NASA Astrophysics Data System (ADS)

    Aksoy, A.; Lee, J. H.; Kitanidis, P. K.

    2016-12-01

    Heterogeneity in hydraulic conductivity (K) impacts the transport and fate of contaminants in the subsurface, as well as the design and operation of managed aquifer recharge (MAR) systems. Recently, improvements in computational resources and the availability of big data through electrical resistivity tomography (ERT) and remote sensing have provided opportunities to better characterize the subsurface. Yet there is a need to improve prediction and evaluation methods in order to extract information from field measurements for better field characterization. In this study, genetic algorithm optimization, which has been widely used in optimal aquifer remediation design, was used to determine the spatial distribution of K. A hypothetical 2 km by 2 km aquifer was considered. A genetic algorithm library, PGAPack, was linked with a fast Fourier transform-based random field generator as well as a groundwater flow and contaminant transport simulation model (BIO2D-KE). The objective of the optimization model was to minimize the total squared error between measured and predicted field values. It was assumed that measured K values were available through ERT. The performance of the genetic algorithm in predicting the distribution of K was tested for different cases. In the first, it was assumed that observed K values were evaluated using the random field generator alone as the forward model. In the second case, in addition to K values obtained through ERT, measured head values were incorporated into the evaluation, with BIO2D-KE and the random field generator used as the forward models. Lastly, tracer concentrations were used as additional information in the optimization model. Initial results indicated enhanced performance when the random field generator and BIO2D-KE are used in combination to predict the spatial distribution of K.
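The optimization loop described above can be sketched generically. This is a minimal genetic algorithm (not PGAPack, and with a plain vector standing in for a K-field realization): tournament selection, one-point crossover and Gaussian mutation evolve candidates to minimize the total squared error against measured values.

```python
import random

def genetic_minimize(measured, pop_size=30, generations=60, seed=0):
    """Minimal GA sketch: evolve candidate conductivity vectors to minimize
    total squared error against measured values."""
    rng = random.Random(seed)
    n = len(measured)

    def fitness(c):
        return -sum((a - b) ** 2 for a, b in zip(c, measured))

    pop = [[rng.uniform(0, 10) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1 = max(rng.sample(pop, 3), key=fitness)  # tournament selection
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n)                  # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.3:                     # Gaussian mutation
                k = rng.randrange(n)
                child[k] += rng.gauss(0.0, 0.5)
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

In the study's setting, evaluating `fitness` is the expensive part, since each candidate requires a forward run of the random field generator and/or BIO2D-KE.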

  18. A Randomized Field Trial of the Fast ForWord Language Computer-Based Training Program

    ERIC Educational Resources Information Center

    Borman, Geoffrey D.; Benson, James G.; Overman, Laura

    2009-01-01

    This article describes an independent assessment of the Fast ForWord Language computer-based training program developed by Scientific Learning Corporation. Previous laboratory research involving children with language-based learning impairments showed strong effects on their abilities to recognize brief and fast sequences of nonspeech and speech…

  19. Skin cancer texture analysis of OCT images based on Haralick, fractal dimension, Markov random field features, and the complex directional field features

    NASA Astrophysics Data System (ADS)

    Raupov, Dmitry S.; Myakinin, Oleg O.; Bratchenko, Ivan A.; Zakharov, Valery P.; Khramov, Alexander G.

    2016-10-01

    In this paper, we examine the validity of OCT for identifying tissue changes using skin cancer texture analysis based on Haralick texture features, fractal dimension, the Markov random field method and complex directional features computed for different tissues. The described features were used to detect specific spatial characteristics that can differentiate healthy tissue from diverse skin cancers in cross-sectional OCT images (B- and/or C-scans). In this work, we used an interval type-II fuzzy anisotropic diffusion algorithm for speckle noise reduction in OCT images. The Haralick texture features contrast, correlation, energy, and homogeneity were calculated in various directions. A box-counting method was performed to evaluate the fractal dimension of skin probes. Markov random field features were used to enhance the quality of the classification. Additionally, we used the complex directional field, calculated by the local gradient methodology, to further improve the quality of the diagnostic assessment. Our results demonstrate that these texture features may provide helpful information to discriminate tumor from healthy tissue. The experimental data set contains 488 OCT images of normal skin and tumors, including Basal Cell Carcinoma (BCC), Malignant Melanoma (MM) and Nevus. All images were acquired from our laboratory SD-OCT setup based on a broadband light source delivering an output power of 20 mW at a central wavelength of 840 nm with a bandwidth of 25 nm. We obtained a sensitivity of about 97% and a specificity of about 73% for the task of discriminating between MM and Nevus.
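Haralick features are derived from a grey-level co-occurrence matrix (GLCM). The sketch below computes three of the four features named above for one offset direction (correlation is omitted because it degenerates on constant patches); it is a generic illustration, not the paper's pipeline, and assumes the image is already quantized to a few grey levels.

```python
def haralick_features(image, levels=4, dx=1, dy=0):
    """Contrast, energy and homogeneity from a normalized grey-level
    co-occurrence matrix built for one pixel offset (dx, dy)."""
    rows, cols = len(image), len(image[0])
    glcm = [[0.0] * levels for _ in range(levels)]
    total = 0
    for i in range(rows):
        for j in range(cols):
            ni, nj = i + dy, j + dx
            if 0 <= ni < rows and 0 <= nj < cols:
                glcm[image[i][j]][image[ni][nj]] += 1
                total += 1
    for a in range(levels):
        for b in range(levels):
            glcm[a][b] /= total
    contrast = sum(glcm[a][b] * (a - b) ** 2
                   for a in range(levels) for b in range(levels))
    energy = sum(p * p for row in glcm for p in row)
    homogeneity = sum(glcm[a][b] / (1 + abs(a - b))
                      for a in range(levels) for b in range(levels))
    return contrast, energy, homogeneity
```

Computing the same features at several offsets ("in various directions", as the abstract puts it) yields a direction-sensitive texture signature.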

  20. Paradigm War Revived? On the Diagnosis of Resistance to Randomized Controlled Trials and Systematic Review in Education

    ERIC Educational Resources Information Center

    Hammersley, Martyn

    2008-01-01

    There has been considerable discussion in recent years about the role in educational research of randomized controlled trials (RCTs) and systematic reviews (SR). Advocacy of these methods arose partly as a result of the spread of the notion of evidence-based practice from medicine into other fields, and of the rise of the "new public…

  1. Assessing the significance of pedobarographic signals using random field theory.

    PubMed

    Pataky, Todd C

    2008-08-07

    Traditional pedobarographic statistical analyses are conducted over discrete regions. Recent studies have demonstrated that regionalization can corrupt pedobarographic field data through conflation when arbitrary dividing lines inappropriately delineate smooth field processes. An alternative is to register images such that homologous structures optimally overlap and then conduct statistical tests at each pixel to generate statistical parametric maps (SPMs). The significance of SPM processes may be assessed within the framework of random field theory (RFT). RFT is ideally suited to pedobarographic image analysis because its fundamental data unit is a lattice sampling of a smooth and continuous spatial field. To correct for the vast number of multiple comparisons inherent in such data, recent pedobarographic studies have employed a Bonferroni correction to retain a constant family-wise error rate. This approach unfortunately neglects the spatial correlation of neighbouring pixels, so provides an overly conservative (albeit valid) statistical threshold. RFT generally relaxes the threshold depending on field smoothness and on the geometry of the search area, but it also provides a framework for assigning p values to suprathreshold clusters based on their spatial extent. The current paper provides an overview of basic RFT concepts and uses simulated and experimental data to validate both RFT-relevant field smoothness estimations and RFT predictions regarding the topological characteristics of random pedobarographic fields. Finally, previously published experimental data are re-analysed using RFT inference procedures to demonstrate how RFT yields easily understandable statistical results that may be incorporated into routine clinical and laboratory analyses.
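The threshold-relaxation argument above can be made concrete with the standard RFT result for a smooth, zero-mean, unit-variance Gaussian field on a 2D search region (a textbook formula from the RFT literature, quoted here for context rather than taken from this paper):

```latex
P\!\left(\max_{s} Z(s) \ge z\right)
  \;\approx\; \mathbb{E}\!\left[\chi(A_z)\right]
  \;\approx\; R \,\frac{4\ln 2}{(2\pi)^{3/2}}\, z\, e^{-z^2/2},
\qquad
R = \frac{\text{search area}}{\mathrm{FWHM}_x \,\mathrm{FWHM}_y}
```

Here $\chi(A_z)$ is the Euler characteristic of the suprathreshold set and $R$ is the resel count. Because $R$ shrinks as the field gets smoother, the corrected threshold $z$ relaxes relative to a Bonferroni correction over all pixels, exactly as the abstract describes.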

  2. Do Human-Figure Drawings of Children and Adolescents Mirror Their Cognitive Style and Self-Esteem?

    ERIC Educational Resources Information Center

    Dey, Anindita; Ghosh, Paromita

    2016-01-01

    The investigation probed relationships among human-figure drawing, field-dependent-independent cognitive style and self-esteem of 10-15 year olds. It also attempted to predict human-figure drawing scores of participants based on their field-dependence-independence and self-esteem. Area, stratified and multi-stage random sampling were used to…

  3. A randomized controlled trial comparing 2 interventions for visual field loss with standard occupational therapy during inpatient stroke rehabilitation.

    PubMed

    Mödden, Claudia; Behrens, Marion; Damke, Iris; Eilers, Norbert; Kastrup, Andreas; Hildebrandt, Helmut

    2012-06-01

    Compensatory and restorative treatments have been developed to improve visual field defects after stroke. However, no controlled trials have compared these interventions with standard occupational therapy (OT). A total of 45 stroke participants with visual field defects admitted for inpatient rehabilitation were randomized to restorative computerized training (RT), using computer-based stimulation of the border areas of their visual field defects, or to computer-based compensatory therapy (CT), teaching a visual search strategy. OT, in which different compensation strategies were used to train activities of daily living, served as the standard treatment for the active control group. Each treatment group received 15 single sessions of 30 minutes distributed over 3 weeks. The primary outcome measures were visual field expansion for RT, visual search performance for CT, and reading performance for both treatments. Visual conjunction search, alertness, and the Barthel Index were secondary outcomes. Compared with OT, CT resulted in better visual search performance, whereas RT did not result in a larger expansion of the visual field. Intragroup pre-post comparisons demonstrated that CT improved all defined outcome parameters and RT improved several, whereas OT improved only one. CT improved functional deficits after visual field loss compared with standard OT and may be the intervention of choice during inpatient rehabilitation. A larger trial that includes lesion location in the analysis is recommended.

  4. A Methodological Analysis of Randomized Clinical Trials of Computer-Assisted Therapies for Psychiatric Disorders: Toward Improved Standards for an Emerging Field

    PubMed Central

    Kiluk, Brian D.; Sugarman, Dawn E.; Nich, Charla; Gibbons, Carly J.; Martino, Steve; Rounsaville, Bruce J.; Carroll, Kathleen M.

    2013-01-01

    Objective Computer-assisted therapies offer a novel, cost-effective strategy for providing evidence-based therapies to a broad range of individuals with psychiatric disorders. However, the extent to which the growing body of randomized trials evaluating computer-assisted therapies meets current standards of methodological rigor for evidence-based interventions is not clear. Method A methodological analysis of randomized clinical trials of computer-assisted therapies for adult psychiatric disorders, published between January 1990 and January 2010, was conducted. Seventy-five studies that examined computer-assisted therapies for a range of axis I disorders were evaluated using a 14-item methodological quality index. Results Results indicated marked heterogeneity in study quality. No study met all 14 basic quality standards, and three met 13 criteria. Consistent weaknesses were noted in evaluation of treatment exposure and adherence, rates of follow-up assessment, and conformity to intention-to-treat principles. Studies utilizing weaker comparison conditions (e.g., wait-list controls) had poorer methodological quality scores and were more likely to report effects favoring the computer-assisted condition. Conclusions While several well-conducted studies have indicated promising results for computer-assisted therapies, this emerging field has not yet achieved a level of methodological quality equivalent to those required for other evidence-based behavioral therapies or pharmacotherapies. Adoption of more consistent standards for methodological quality in this field, with greater attention to potential adverse events, is needed before computer-assisted therapies are widely disseminated or marketed as evidence based. PMID:21536689

  5. Analytical connection between thresholds and immunization strategies of SIS model in random networks

    NASA Astrophysics Data System (ADS)

    Zhou, Ming-Yang; Xiong, Wen-Man; Liao, Hao; Wang, Tong; Wei, Zong-Wen; Fu, Zhong-Qian

    2018-05-01

    Devising effective strategies for hindering the propagation of viruses and protecting the population against epidemics is critical for public security and health. Despite a number of studies based on the susceptible-infected-susceptible (SIS) model devoted to this topic, we still lack a general framework to compare different immunization strategies in completely random networks. Here, we address this problem by suggesting a novel method based on heterogeneous mean-field theory for the SIS model. Our method builds the relationship between the thresholds and different immunization strategies in completely random networks. Besides, we provide an analytical argument that the targeted large-degree strategy achieves the best performance in random networks with arbitrary degree distribution. Moreover, the experimental results demonstrate the effectiveness of the proposed method in both artificial and real-world networks.
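The heterogeneous mean-field result underlying this comparison is that, on an uncorrelated random network, the SIS epidemic threshold is λ_c = ⟨k⟩/⟨k²⟩. The sketch below illustrates why the targeted large-degree strategy performs best: removing the highest-degree nodes slashes ⟨k²⟩ and so raises the threshold. (For simplicity the sketch ignores the edges that removed nodes take away from their neighbours, so it is an idealized illustration, not a full network calculation.)

```python
def sis_threshold(degrees):
    """Heterogeneous mean-field SIS threshold on an uncorrelated random
    network: lambda_c = <k> / <k^2>."""
    k1 = sum(degrees) / len(degrees)
    k2 = sum(d * d for d in degrees) / len(degrees)
    return k1 / k2

def targeted_immunization(degrees, fraction):
    """Immunize (remove) the top-degree fraction of nodes and return the
    remaining degree sequence."""
    kept = sorted(degrees)[: int(len(degrees) * (1 - fraction))]
    return kept
```

On a heavy-tailed degree sequence the hubs dominate ⟨k²⟩, so immunizing even a small high-degree fraction raises λ_c far more than random immunization of the same number of nodes would.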

  6. Simulating and mapping spatial complexity using multi-scale techniques

    USGS Publications Warehouse

    De Cola, L.

    1994-01-01

    A central problem in spatial analysis is the mapping of data for complex spatial fields using relatively simple data structures, such as those of a conventional GIS. This complexity can be measured using such indices as multi-scale variance, which reflects spatial autocorrelation, and multi-fractal dimension, which characterizes the values of fields. These indices are computed for three spatial processes: Gaussian noise, a simple mathematical function, and data for a random walk. Fractal analysis is then used to produce a vegetation map of the central region of California based on a satellite image. This analysis suggests that real-world data lie on a continuum between the simple and the random, and that a major GIS challenge is the scientific representation and understanding of rapidly changing multi-scale fields.

  7. Epidemic spreading in weighted networks: an edge-based mean-field solution.

    PubMed

    Yang, Zimo; Zhou, Tao

    2012-05-01

    Weight distribution greatly impacts the epidemic spreading taking place on top of networks. This paper presents a study of a susceptible-infected-susceptible model on regular random networks with different kinds of weight distributions. Simulation results show that the more homogeneous weight distribution leads to higher epidemic prevalence, which, unfortunately, could not be captured by the traditional mean-field approximation. This paper gives an edge-based mean-field solution for general weight distribution, which can quantitatively reproduce the simulation results. This method could be applied to characterize the nonequilibrium steady states of dynamical processes on weighted networks.

  8. Kernel-Correlated Levy Field Driven Forward Rate and Application to Derivative Pricing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bo Lijun; Wang Yongjin; Yang Xuewei, E-mail: xwyangnk@yahoo.com.cn

    2013-08-01

    We propose a term structure of forward rates driven by a kernel-correlated Levy random field under the HJM framework. The kernel-correlated Levy random field is composed of a kernel-correlated Gaussian random field and a centered Poisson random measure. We shall give a criterion to preclude arbitrage under the risk-neutral pricing measure. As applications, an interest rate derivative with general payoff functional is priced under this pricing measure.

  9. Universality of Critically Pinned Interfaces in Two-Dimensional Isotropic Random Media

    NASA Astrophysics Data System (ADS)

    Grassberger, Peter

    2018-05-01

    Based on extensive simulations, we conjecture that critically pinned interfaces in two-dimensional isotropic random media with short-range correlations are always in the universality class of ordinary percolation. Thus, in contrast to interfaces in >2 dimensions, there is no distinction between fractal (i.e., percolative) and rough but nonfractal interfaces. Our claim includes interfaces in zero-temperature random field Ising models (both with and without spontaneous nucleation), in heterogeneous bootstrap percolation, and in susceptible-weakened-infected-removed epidemics. It does not include models with long-range correlations in the randomness and models where overhangs are explicitly forbidden (which would imply nonisotropy of the medium).

  10. Incorporating conditional random fields and active learning to improve sentiment identification.

    PubMed

    Zhang, Kunpeng; Xie, Yusheng; Yang, Yi; Sun, Aaron; Liu, Hengchang; Choudhary, Alok

    2014-10-01

    Many machine learning, statistical, and computational linguistic methods have been developed to identify the sentiment of sentences in documents, yielding promising results. However, most state-of-the-art methods focus on individual sentences and ignore the impact of context on the meaning of a sentence. In this paper, we propose a method based on conditional random fields to incorporate sentence structure and context information, in addition to syntactic information, for improving sentiment identification. We also investigate how human interaction affects the accuracy of sentiment labeling using limited training data. We propose and evaluate two different active learning strategies for labeling sentiment data. Our experiments with the proposed approach demonstrate a 5%-15% improvement in accuracy on Amazon customer reviews compared to existing supervised learning and rule-based methods.

  11. Electron spin resonance of nitrogen-vacancy centers in optically trapped nanodiamonds

    PubMed Central

    Horowitz, Viva R.; Alemán, Benjamín J.; Christle, David J.; Cleland, Andrew N.; Awschalom, David D.

    2012-01-01

    Using an optical tweezers apparatus, we demonstrate three-dimensional control of nanodiamonds in solution with simultaneous readout of ground-state electron-spin resonance (ESR) transitions in an ensemble of diamond nitrogen-vacancy color centers. Despite the motion and random orientation of nitrogen-vacancy centers suspended in the optical trap, we observe distinct peaks in the measured ESR spectra qualitatively similar to the same measurement in bulk. Accounting for the random dynamics, we model the ESR spectra observed in an externally applied magnetic field to enable dc magnetometry in solution. We estimate the dc magnetic field sensitivity based on variations in ESR line shapes to be approximately . This technique may provide a pathway for spin-based magnetic, electric, and thermal sensing in fluidic environments and biophysical systems inaccessible to existing scanning probe techniques. PMID:22869706

  12. Phase unwrapping using region-based markov random field model.

    PubMed

    Dong, Ying; Ji, Jim

    2010-01-01

Phase unwrapping is a classical problem in Magnetic Resonance Imaging (MRI), Interferometric Synthetic Aperture Radar and Sonar (InSAR/InSAS), fringe pattern analysis, and spectroscopy. Although many methods have been proposed to address this problem, robust and effective phase unwrapping remains a challenge. This paper presents a novel phase unwrapping method using a region-based Markov Random Field (MRF) model. Specifically, the phase image is segmented into regions within which the phase is not wrapped. Then, the phase image is unwrapped between different regions using an improved Highest Confidence First (HCF) algorithm to optimize the MRF model. The proposed method has desirable theoretical properties as well as an efficient implementation. Simulations and experimental results on MRI images show that the proposed method provides phase unwrapping similar or superior to that of the Phase Unwrapping Max-flow/min-cut (PUMA) method and the ZπM method.
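For readers unfamiliar with the underlying problem, a minimal 1D sketch (the classical Itoh algorithm, not the paper's region-based MRF method) shows what "unwrapping" means: adding multiples of 2π so that neighboring samples differ by less than π.

```python
# Minimal sketch of the phase-unwrapping problem (classical 1D Itoh method,
# not the paper's MRF approach): recover a smooth ramp from wrapped values.
import numpy as np

def unwrap_1d(wrapped):
    out = [float(wrapped[0])]
    for w in wrapped[1:]:
        d = w - out[-1]
        d -= 2 * np.pi * np.round(d / (2 * np.pi))  # wrap difference into (-pi, pi]
        out.append(out[-1] + d)
    return np.array(out)

true_phase = np.linspace(0.0, 12.0, 200)       # smooth ramp spanning ~2 turns
wrapped = np.angle(np.exp(1j * true_phase))    # wrapped into (-pi, pi]
recovered = unwrap_1d(wrapped)
print(float(np.max(np.abs(recovered - true_phase))))  # ~0 (machine precision)
```

This works only while true neighboring differences stay below π; noisy or undersampled data break that assumption, which is what motivates global methods such as MRF-based ones.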

  13. Is the Non-Dipole Magnetic Field Random?

    NASA Technical Reports Server (NTRS)

    Walker, Andrew D.; Backus, George E.

    1996-01-01

    Statistical modelling of the Earth's magnetic field B has a long history. In particular, the spherical harmonic coefficients of scalar fields derived from B can be treated as Gaussian random variables. In this paper, we give examples of highly organized fields whose spherical harmonic coefficients pass tests for independent Gaussian random variables. The fact that coefficients at some depth may be usefully summarized as independent samples from a normal distribution need not imply that there really is some physical, random process at that depth. In fact, the field can be extremely structured and still be regarded for some purposes as random. In this paper, we examined the radial magnetic field B(sub r) produced by the core, but the results apply to any scalar field on the core-mantle boundary (CMB) which determines B outside the CMB.

  14. Low-Energy Truly Random Number Generation with Superparamagnetic Tunnel Junctions for Unconventional Computing

    NASA Astrophysics Data System (ADS)

    Vodenicarevic, D.; Locatelli, N.; Mizrahi, A.; Friedman, J. S.; Vincent, A. F.; Romera, M.; Fukushima, A.; Yakushiji, K.; Kubota, H.; Yuasa, S.; Tiwari, S.; Grollier, J.; Querlioz, D.

    2017-11-01

Low-energy random number generation is critical for many emerging computing schemes proposed to complement or replace von Neumann architectures. However, current random number generators are always associated with an energy cost that is prohibitive for these computing schemes. We introduce random number bit generation based on specific nanodevices: superparamagnetic tunnel junctions. We experimentally demonstrate high-quality random bit generation that represents an orders-of-magnitude improvement in energy efficiency over current solutions. We show that the random generation speed improves with nanodevice scaling, and we investigate the impact of temperature, magnetic field, and cross talk. Finally, we show how alternative computing schemes can be implemented using superparamagnetic tunnel junctions as random number generators. These results open the way for fabricating efficient hardware computing devices leveraging stochasticity, and they highlight an alternative use for emerging nanodevices.
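The device behavior described above can be caricatured as a thermally activated telegraph process. The switching rate and read interval below are assumptions chosen for illustration, not the measured junction parameters.

```python
# Hedged sketch (idealized model, not the paper's device): a superparamagnetic
# junction hops thermally between two states; reading the state at intervals
# much longer than the mean dwell time yields nearly unbiased random bits.
import random

def telegraph_bits(n_bits, rate=1.0, read_interval=20.0, rng=None):
    rng = rng or random.Random()
    state = 0
    bits = []
    t = 0.0
    t_next = rng.expovariate(rate)      # time of the next thermal switch
    for _ in range(n_bits):
        t += read_interval
        while t_next < t:               # apply all switches before this read
            state ^= 1
            t_next += rng.expovariate(rate)
        bits.append(state)
    return bits

bits = telegraph_bits(2000, rng=random.Random(42))
print(sum(bits) / len(bits))  # close to 0.5
```

Because the number of switches per read interval is large, the parity of the state at each read is essentially a fair coin, mirroring why slow sampling of a fast stochastic junction gives high-quality bits.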

  15. Application of the Hotelling and ideal observers to detection and localization of exoplanets.

    PubMed

    Caucci, Luca; Barrett, Harrison H; Devaney, Nicholas; Rodríguez, Jeffrey J

    2007-12-01

    The ideal linear discriminant or Hotelling observer is widely used for detection tasks and image-quality assessment in medical imaging, but it has had little application in other imaging fields. We apply it to detection of planets outside of our solar system with long-exposure images obtained from ground-based or space-based telescopes. The statistical limitations in this problem include Poisson noise arising mainly from the host star, electronic noise in the image detector, randomness or uncertainty in the point-spread function (PSF) of the telescope, and possibly a random background. PSF randomness is reduced but not eliminated by the use of adaptive optics. We concentrate here on the effects of Poisson and electronic noise, but we also show how to extend the calculation to include a random PSF. For the case where the PSF is known exactly, we compare the Hotelling observer to other observers commonly used for planet detection; comparison is based on receiver operating characteristic (ROC) and localization ROC (LROC) curves.

  16. Application of the Hotelling and ideal observers to detection and localization of exoplanets

    PubMed Central

    Caucci, Luca; Barrett, Harrison H.; Devaney, Nicholas; Rodríguez, Jeffrey J.

    2008-01-01

    The ideal linear discriminant or Hotelling observer is widely used for detection tasks and image-quality assessment in medical imaging, but it has had little application in other imaging fields. We apply it to detection of planets outside of our solar system with long-exposure images obtained from ground-based or space-based telescopes. The statistical limitations in this problem include Poisson noise arising mainly from the host star, electronic noise in the image detector, randomness or uncertainty in the point-spread function (PSF) of the telescope, and possibly a random background. PSF randomness is reduced but not eliminated by the use of adaptive optics. We concentrate here on the effects of Poisson and electronic noise, but we also show how to extend the calculation to include a random PSF. For the case where the PSF is known exactly, we compare the Hotelling observer to other observers commonly used for planet detection; comparison is based on receiver operating characteristic (ROC) and localization ROC (LROC) curves. PMID:18059905

  17. Object-based change detection method using refined Markov random field

    NASA Astrophysics Data System (ADS)

    Peng, Daifeng; Zhang, Yongjun

    2017-01-01

In order to fully consider the local spatial constraints between neighboring objects in object-based change detection (OBCD), an OBCD approach is presented by introducing a refined Markov random field (MRF). First, two periods of images are stacked and segmented to produce image objects. Second, object spectral and textural histogram features are extracted, and the G-statistic is implemented to measure the distance between different histogram distributions. Meanwhile, object heterogeneity is calculated by combining spectral and textural histogram distances using adaptive weights. Third, an expectation-maximization algorithm is applied to determine the change category of each object, and the initial change map is then generated. Finally, a refined change map is produced by employing the proposed refined object-based MRF method. Three experiments were conducted and compared with some state-of-the-art unsupervised OBCD methods to evaluate the effectiveness of the proposed method. Experimental results demonstrate that the proposed method obtains the highest accuracy among the methods used in this paper, which confirms its validity and effectiveness for OBCD.
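The G-statistic used above as a histogram distance is a log-likelihood ratio against the pooled expectation. The minimal version below is our own sketch and does not reproduce the paper's adaptive weighting.

```python
# Sketch of the G-statistic histogram distance (our own minimal version, not
# the paper's exact weighting): zero for identical histograms, positive and
# growing as the two distributions diverge.
import math

def g_statistic(h1, h2):
    n1, n2 = sum(h1), sum(h2)
    g = 0.0
    for o1, o2 in zip(h1, h2):
        pooled = o1 + o2
        if pooled == 0:
            continue
        for o, n in ((o1, n1), (o2, n2)):
            e = pooled * n / (n1 + n2)  # expected count if both came from one distribution
            if o > 0:
                g += 2.0 * o * math.log(o / e)
    return g

same = g_statistic([10, 20, 30], [10, 20, 30])
diff = g_statistic([10, 20, 30], [30, 20, 10])
print(same, diff)  # 0.0 for identical histograms, > 0 for different ones
```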

  18. Random-field-induced disordering mechanism in a disordered ferromagnet: Between the Imry-Ma and the standard disordering mechanism

    NASA Astrophysics Data System (ADS)

    Andresen, Juan Carlos; Katzgraber, Helmut G.; Schechter, Moshe

    2017-12-01

    Random fields disorder Ising ferromagnets by aligning single spins in the direction of the random field in three space dimensions, or by flipping large ferromagnetic domains at dimensions two and below. While the former requires random fields of typical magnitude similar to the interaction strength, the latter Imry-Ma mechanism only requires infinitesimal random fields. Recently, it has been shown that for dilute anisotropic dipolar systems a third mechanism exists, where the ferromagnetic phase is disordered by finite-size glassy domains at a random field of finite magnitude that is considerably smaller than the typical interaction strength. Using large-scale Monte Carlo simulations and zero-temperature numerical approaches, we show that this mechanism applies to disordered ferromagnets with competing short-range ferromagnetic and antiferromagnetic interactions, suggesting its generality in ferromagnetic systems with competing interactions and an underlying spin-glass phase. A finite-size-scaling analysis of the magnetization distribution suggests that the transition might be first order.

  19. Active classifier selection for RGB-D object categorization using a Markov random field ensemble method

    NASA Astrophysics Data System (ADS)

    Durner, Maximilian; Márton, Zoltán; Hillenbrand, Ulrich; Ali, Haider; Kleinsteuber, Martin

    2017-03-01

    In this work, a new ensemble method for the task of category recognition in different environments is presented. The focus is on service robotic perception in an open environment, where the robot's task is to recognize previously unseen objects of predefined categories, based on training on a public dataset. We propose an ensemble learning approach to be able to flexibly combine complementary sources of information (different state-of-the-art descriptors computed on color and depth images), based on a Markov Random Field (MRF). By exploiting its specific characteristics, the MRF ensemble method can also be executed as a Dynamic Classifier Selection (DCS) system. In the experiments, the committee- and topology-dependent performance boost of our ensemble is shown. Despite reduced computational costs and using less information, our strategy performs on the same level as common ensemble approaches. Finally, the impact of large differences between datasets is analyzed.

  20. [A magnetic therapy apparatus with an adaptable electromagnetic spectrum for the treatment of prostatitis and gynecopathies].

    PubMed

    Kuz'min, A A; Meshkovskiĭ, D V; Filist, S A

    2008-01-01

Problems in the engineering and algorithm development of magnetic therapy apparatuses with a pseudo-random radiation spectrum within the audio range for the treatment of prostatitis and gynecopathies are considered. A typical design based on a PIC 16F microcontroller is suggested. It includes a keyboard, an LCD indicator, an audio amplifier, an inducer, and software units. The problem of pseudo-random signal generation within the audio range is considered. A series of rectangular pulses is generated over a random-length interval on the basis of a three-component random vector. This series provides the required spectral characteristics of the therapeutic magnetic field and their adaptation to the therapeutic conditions and individual features of the patient.

  1. [Realization of design regarding experimental research in the clinical real-world research].

    PubMed

    He, Q; Shi, J P

    2018-04-10

Real-world study (RWS), a further verification of and supplement to explanatory randomized controlled trials for evaluating the effectiveness of intervention measures in real clinical environments, has increasingly become a focus in the field of research on medical and health care services. However, some people mistakenly equate real-world study with observational research and argue that intervention and randomization cannot be carried out in real-world study. In fact, both observational and experimental designs are basic designs in real-world study; the latter usually refers to the pragmatic randomized controlled trial and the registry-based randomized controlled trial. Other nonrandomized controlled and adaptive designs can also be adopted in RWS.

  2. Direct Simulation of Extinction in a Slab of Spherical Particles

    NASA Technical Reports Server (NTRS)

    Mackowski, D.W.; Mishchenko, Michael I.

    2013-01-01

    The exact multiple sphere superposition method is used to calculate the coherent and incoherent contributions to the ensemble-averaged electric field amplitude and Poynting vector in systems of randomly positioned nonabsorbing spherical particles. The target systems consist of cylindrical volumes, with radius several times larger than length, containing spheres with positional configurations generated by a Monte Carlo sampling method. Spatially dependent values for coherent electric field amplitude, coherent energy flux, and diffuse energy flux, are calculated by averaging of exact local field and flux values over multiple configurations and over spatially independent directions for fixed target geometry, sphere properties, and sphere volume fraction. Our results reveal exponential attenuation of the coherent field and the coherent energy flux inside the particulate layer and thereby further corroborate the general methodology of the microphysical radiative transfer theory. An effective medium model based on plane wave transmission and reflection by a plane layer is used to model the dependence of the coherent electric field on particle packing density. The effective attenuation coefficient of the random medium, computed from the direct simulations, is found to agree closely with effective medium theories and with measurements. In addition, the simulation results reveal the presence of a counter-propagating component to the coherent field, which arises due to the internal reflection of the main coherent field component by the target boundary. The characteristics of the diffuse flux are compared to, and found to be consistent with, a model based on the diffusion approximation of the radiative transfer theory.

  3. Coherent light scattering of heterogeneous randomly rough films and effective medium in the theory of electromagnetic wave multiple scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berginc, G

    2013-11-30

We have developed a general formalism based on Green's functions to calculate the coherent electromagnetic field scattered by a random medium with rough boundaries. The approximate expression derived makes it possible to determine the effective permittivity, which is generalised for a layer of an inhomogeneous random medium with different types of particles and bounded with randomly rough interfaces. This effective permittivity describes the coherent propagation of an electromagnetic wave in a random medium with randomly rough boundaries. We have obtained an expression, which contains the Maxwell-Garnett formula at the low-frequency limit, and the Keller formula; the latter has been proved to be in good agreement with experiments for particles whose dimensions are larger than a wavelength.

  4. Outcomes of Parent Education Programs Based on Reevaluation Counseling

    ERIC Educational Resources Information Center

    Wolfe, Randi B.; Hirsch, Barton J.

    2003-01-01

    We report two studies in which a parent education program based on Reevaluation Counseling was field-tested on mothers randomly assigned to treatment groups or equivalent, no-treatment comparison groups. The goal was to evaluate the program's viability, whether there were measurable effects, whether those effects were sustained over time, and…

  5. Optoenergy storage and random walks assisted broadband amplification in Er3+-doped (Pb,La)(Zr,Ti)O3 disordered ceramics.

    PubMed

    Xu, Long; Zhao, Hua; Xu, Caixia; Zhang, Siqi; Zou, Yingyin K; Zhang, Jingwen

    2014-02-01

A broadband optical amplification was observed and investigated in Er3+-doped electrostrictive ceramics of lanthanum-modified lead zirconate titanate under a corona atmosphere. Ceramic structure changes caused by UV light and the electric field, together with random walks originating from the diffusive process in intrinsically disordered materials, may all contribute to the optical amplification and the associated energy storage. A discussion based on optical energy storage and diffusive equations was given to explain the findings. The experiments performed made it possible to study random walks and optical amplification in transparent ceramic materials.

  6. Development of a Random Field Model for Gas Plume Detection in Multiple LWIR Images.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heasler, Patrick G.

    This report develops a random field model that describes gas plumes in LWIR remote sensing images. The random field model serves as a prior distribution that can be combined with LWIR data to produce a posterior that determines the probability that a gas plume exists in the scene and also maps the most probable location of any plume. The random field model is intended to work with a single pixel regression estimator--a regression model that estimates gas concentration on an individual pixel basis.
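A stationary random field of the kind used as a spatial prior here can be drawn by filtering white noise in the Fourier domain. The grid size and Gaussian spectral filter below are illustrative assumptions, not the report's plume model.

```python
# Illustrative sketch (parameters assumed, not taken from the report): draw a
# stationary Gaussian random field on a grid by spectral filtering of white
# noise, the kind of smooth spatial prior a plume map could be given.
import numpy as np

def gaussian_random_field(n, corr_len, rng):
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    # Gaussian-shaped spectral filter sets the correlation length.
    amp = np.exp(-(kx**2 + ky**2) * (np.pi * corr_len) ** 2)
    noise = rng.standard_normal((n, n))
    field = np.fft.ifft2(np.fft.fft2(noise) * amp).real
    return (field - field.mean()) / field.std()   # normalize to zero mean, unit std

rng = np.random.default_rng(0)
field = gaussian_random_field(64, corr_len=8, rng=rng)
print(field.shape, round(float(field.std()), 3))  # → (64, 64) 1.0
```

Such a field, thresholded or scaled, can serve as a prior over plume presence that a per-pixel regression estimator is then combined with.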

  7. Tensor Minkowski Functionals for random fields on the sphere

    NASA Astrophysics Data System (ADS)

    Chingangbam, Pravabati; Yogendran, K. P.; Joby, P. K.; Ganesan, Vidhya; Appleby, Stephen; Park, Changbom

    2017-12-01

    We generalize the translation invariant tensor-valued Minkowski Functionals which are defined on two-dimensional flat space to the unit sphere. We apply them to level sets of random fields. The contours enclosing boundaries of level sets of random fields give a spatial distribution of random smooth closed curves. We outline a method to compute the tensor-valued Minkowski Functionals numerically for any random field on the sphere. Then we obtain analytic expressions for the ensemble expectation values of the matrix elements for isotropic Gaussian and Rayleigh fields. The results hold on flat as well as any curved space with affine connection. We elucidate the way in which the matrix elements encode information about the Gaussian nature and statistical isotropy (or departure from isotropy) of the field. Finally, we apply the method to maps of the Galactic foreground emissions from the 2015 PLANCK data and demonstrate their high level of statistical anisotropy and departure from Gaussianity.

  8. Correcting Biases in a lower resolution global circulation model with data assimilation

    NASA Astrophysics Data System (ADS)

    Canter, Martin; Barth, Alexander

    2016-04-01

With this work, we aim to develop a new method of bias correction using data assimilation. This method is based on the stochastic forcing of a model to correct bias. First, through a preliminary run, we estimate the bias of the model and its possible sources. Then, we establish a forcing term which is directly added inside the model's equations. We create an ensemble of runs and consider the forcing term as a control variable during the assimilation of observations. We then use this analysed forcing term to correct the bias of the model. Since the forcing is added inside the model, it acts as a source term, unlike external forcings such as wind. This procedure has been developed and successfully tested with a twin experiment on a Lorenz 95 model. It is currently being applied and tested on the sea ice ocean NEMO LIM model, which is used in the PredAntar project. NEMO LIM is a global and low-resolution (2 degrees) coupled model (hydrodynamic model and sea ice model) with long time steps allowing simulations over several decades. Due to its low resolution, the model is subject to bias in areas where strong currents are present. We aim to correct this bias by using perturbed current fields from higher-resolution models and randomly generated perturbations. The random perturbations need to be constrained in order to respect the physical properties of the ocean and not create unwanted phenomena. To construct these random perturbations, we first create a random field with the Diva tool (Data-Interpolating Variational Analysis). Using a cost function, this tool penalizes abrupt variations in the field, while using a custom correlation length. It also decouples disconnected areas based on topography. Then, we filter the field to smooth it and remove small-scale variations. We use this field as a random stream function, and take its derivatives to get zonal and meridional velocity fields. We also constrain the stream function along the coasts so that no currents run perpendicular to the coast. The randomly generated stochastic forcings are then directly injected into the NEMO LIM model's equations in order to force the model at each time step, and not only during the assimilation step. Results from a twin experiment will be presented. This method is being applied to a real case, with observations of the sea surface height available from the mean dynamic topography of CNES (Centre national d'études spatiales). The model, the bias correction, and more extensive forcings, in particular with a three-dimensional structure and a time-varying component, will also be presented.
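The stream-function construction described above can be sketched in a few lines. The grid, smoothing, and finite differences below are our own stand-ins for the Diva-based procedure, but they show the key property: velocities derived from a scalar stream function are automatically non-divergent.

```python
# Sketch of the stream-function trick (our own grid and smoothing, not the
# NEMO/Diva setup): u = -dpsi/dy, v = dpsi/dx gives a divergence-free flow.
import numpy as np

rng = np.random.default_rng(1)
psi = rng.standard_normal((64, 64))
for _ in range(10):                      # crude smoothing in place of Diva
    psi = 0.25 * (np.roll(psi, 1, 0) + np.roll(psi, -1, 0)
                  + np.roll(psi, 1, 1) + np.roll(psi, -1, 1))

# Periodic central differences; axis 0 plays the role of x, axis 1 of y.
u = -(np.roll(psi, -1, axis=1) - np.roll(psi, 1, axis=1)) / 2.0
v = (np.roll(psi, -1, axis=0) - np.roll(psi, 1, axis=0)) / 2.0

# Divergence du/dx + dv/dy cancels because the mixed derivatives commute.
div = ((np.roll(u, -1, 0) - np.roll(u, 1, 0)) / 2.0
       + (np.roll(v, -1, 1) - np.roll(v, 1, 1)) / 2.0)
print(float(np.abs(div).max()))          # ~0 (machine precision)
```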

  9. Field size, length, and width distributions based on LACIE ground truth data. [large area crop inventory experiment

    NASA Technical Reports Server (NTRS)

    Pitts, D. E.; Badhwar, G.

    1980-01-01

    The development of agricultural remote sensing systems requires knowledge of agricultural field size distributions so that the sensors, sampling frames, image interpretation schemes, registration systems, and classification systems can be properly designed. Malila et al. (1976) studied the field size distribution for wheat and all other crops in two Kansas LACIE (Large Area Crop Inventory Experiment) intensive test sites using ground observations of the crops and measurements of their field areas based on current year rectified aerial photomaps. The field area and size distributions reported in the present investigation are derived from a representative subset of a stratified random sample of LACIE sample segments. In contrast to previous work, the obtained results indicate that most field-size distributions are not log-normally distributed. The most common field size observed in this study was 10 acres for most crops studied.

  10. Authoritative knowledge, evidence-based medicine, and behavioral pediatrics.

    PubMed

    Kennell, J H

    1999-12-01

    Evidence-based medicine is the conscientious and judicious use of current best knowledge in making decisions about the care of individual patients, often from well-designed, randomized, controlled trials. Authoritative medicine is the traditional approach to learning and practicing medicine, but no one authority has comprehensive scientific knowledge. Archie Cochrane proposed that every medical specialty should compile a list of all of the randomized, controlled trials within its field to be available for those who wish to know what treatments are effective. This was done first for obstetrics by a group collecting and critically analyzing all of the randomized trials and then indicating procedures every mother should have and those that no mother should have. Support during labor was used as an example. Similar groups are now active in almost all specialties, with information available on the Internet in the Cochrane Database of Systematic Reviews. Developmental-behavioral pediatrics should be part of this movement to evidence-based medicine.

  11. Choice of optical system is critical for the security of double random phase encryption systems

    NASA Astrophysics Data System (ADS)

    Muniraj, Inbarasan; Guo, Changliang; Malallah, Ra'ed; Cassidy, Derek; Zhao, Liang; Ryle, James P.; Healy, John J.; Sheridan, John T.

    2017-06-01

    The linear canonical transform (LCT) is used in modeling a coherent light-field propagation through first-order optical systems. Recently, a generic optical system, known as the quadratic phase encoding system (QPES), for encrypting a two-dimensional image has been reported. In such systems, two random phase keys and the individual LCT parameters (α,β,γ) serve as secret keys of the cryptosystem. It is important that such encryption systems also satisfy some dynamic security properties. We, therefore, examine such systems using two cryptographic evaluation methods, the avalanche effect and bit independence criterion, which indicate the degree of security of the cryptographic algorithms using QPES. We compared our simulation results with the conventional Fourier and the Fresnel transform-based double random phase encryption (DRPE) systems. The results show that the LCT-based DRPE has an excellent avalanche and bit independence characteristics compared to the conventional Fourier and Fresnel-based encryption systems.
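As background, the conventional Fourier-plane DRPE that the paper compares against can be sketched directly with FFTs. The image and phase masks below are toy data, and the LCT-based QPES generalization is not implemented here.

```python
# Minimal sketch of classical Fourier-plane double random phase encryption
# (our own toy, not the paper's LCT/QPES system): two random phase masks
# scramble an image; conjugate keys in reverse order invert the process.
import numpy as np

rng = np.random.default_rng(7)
img = rng.random((32, 32))                          # stand-in "plaintext" image
mask1 = np.exp(2j * np.pi * rng.random(img.shape))  # input-plane phase key
mask2 = np.exp(2j * np.pi * rng.random(img.shape))  # Fourier-plane phase key

cipher = np.fft.ifft2(np.fft.fft2(img * mask1) * mask2)

# Decryption: undo the Fourier-plane key, then the input-plane key.
decrypted = np.fft.ifft2(np.fft.fft2(cipher) * np.conj(mask2)) * np.conj(mask1)
print(float(np.max(np.abs(decrypted - img))))       # ~0 (machine precision)
```

In the LCT-based systems the two FFTs are replaced by parameterized quadratic-phase transforms, so the transform parameters themselves become additional secret keys.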

  12. The role of treatment fidelity on outcomes during a randomized field trial of an autism intervention

    PubMed Central

    Mandell, David S; Stahmer, Aubyn C; Shin, Sujie; Xie, Ming; Reisinger, Erica; Marcus, Steven C

    2013-01-01

    This randomized field trial comparing Strategies for Teaching based on Autism Research and Structured Teaching enrolled educators in 33 kindergarten-through-second-grade autism support classrooms and 119 students, aged 5–8 years in the School District of Philadelphia. Students were assessed at the beginning and end of the academic year using the Differential Ability Scales. Program fidelity was measured through video coding and use of a checklist. Outcomes were assessed using linear regression with random effects for classroom and student. Average fidelity was 57% in Strategies for Teaching based on Autism Research classrooms and 48% in Structured Teaching classrooms. There was a 9.2-point (standard deviation = 9.6) increase in Differential Ability Scales score over the 8-month study period, but no main effect of program. There was a significant interaction between fidelity and group. In classrooms with either low or high program fidelity, students in Strategies for Teaching based on Autism Research experienced a greater gain in Differential Ability Scales score than students in Structured Teaching (11.2 vs 5.5 points and 11.3 vs 8.9 points, respectively). In classrooms with moderate fidelity, students in Structured Teaching experienced a greater gain than students in Strategies for Teaching based on Autism Research (10.1 vs 4.4 points). The results suggest significant variability in implementation of evidence-based practices, even with supports, and also suggest the need to address challenging issues related to implementation measurement in community settings. PMID:23592849

  13. Patterns of relapse from a phase 3 Study of response-based therapy for intermediate-risk Hodgkin lymphoma (AHOD0031): a report from the Children's Oncology Group.

    PubMed

    Dharmarajan, Kavita V; Friedman, Debra L; Schwartz, Cindy L; Chen, Lu; FitzGerald, T J; McCarten, Kathleen M; Kessel, Sandy K; Iandoli, Matt; Constine, Louis S; Wolden, Suzanne L

    2015-05-01

The study was designed to determine whether response-based therapy improves outcomes in intermediate-risk Hodgkin lymphoma. We examined patterns of first relapse in the study. From September 2002 to July 2010, 1712 patients <22 years old with stage I-IIA with bulk, I-IIAE, I-IIB, and IIIA-IVA disease, with or without bulk, were enrolled. Patients were categorized as rapid (RER) or slow early responders (SER) after 2 cycles of doxorubicin, bleomycin, vincristine, etoposide, prednisone, and cyclophosphamide (ABVE-PC). The SER patients were randomized to 2 additional ABVE-PC cycles or augmented chemotherapy with 21 Gy involved field radiation therapy (IFRT). RER patients were stipulated to undergo 2 additional ABVE-PC cycles and were then randomized to 21 Gy IFRT or no further treatment if complete response (CR) was achieved. RER without CR patients were non-randomly assigned to 21 Gy IFRT. Relapses were characterized with respect to site (initial, new, or both; and initial bulk or initial nonbulk) and involved field radiation therapy field (in-field, out-of-field, or both). Patients were grouped by treatment assignment (SER; RER/no CR; RER/CR/IFRT; and RER/CR/no IFRT). Summary statistics were reported. At 4-year median follow-up, 244 patients had experienced relapse, 198 of whom were fully evaluable for review. Those who progressed during treatment (n=30) or lacked relapse imaging (n=16) were excluded. The median time to relapse was 12.8 months. Of the 198 evaluable patients, 30% were RER/no CR, 26% were SER, 26% were RER/CR/no IFRT, 16% were RER/CR/IFRT, and 2% remained uncategorized. Relapse involved initially bulky sites in 74% of cases and initially nonbulky sites in 75%. First relapses rarely occurred at exclusively new or out-of-field sites. By contrast, relapses usually occurred at nodal sites of initial bulky and nonbulky disease. Although response-based therapy has helped define treatment for selected RER patients, it has not improved outcomes for SER patients or facilitated refinement of IFRT volumes or doses. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Global mean-field phase diagram of the spin-1 Ising ferromagnet in a random crystal field

    NASA Astrophysics Data System (ADS)

    Borelli, M. E. S.; Carneiro, C. E. I.

    1996-02-01

We study the phase diagram of the mean-field spin-1 Ising ferromagnet in a uniform magnetic field H and a random crystal field Δi, with probability distribution P(Δi) = pδ(Δi − Δ) + (1 − p)δ(Δi). We analyse the effects of randomness on the first-order surfaces of the Δ-T-H phase diagram for different values of the concentration p and show how these surfaces are affected by the dilution of the crystal field.

  15. Electron avalanche structure determined by random walk theory

    NASA Technical Reports Server (NTRS)

    Englert, G. W.

    1973-01-01

    A self-consistent avalanche solution which accounts for collective long range Coulomb interactions as well as short range elastic and inelastic collisions between electrons and background atoms is made possible by a random walk technique. Results show that the electric field patterns in the early formation stages of avalanches in helium are close to those obtained from theory based on constant transport coefficients. Regions of maximum and minimum induced electrostatic potential phi are located on the axis of symmetry and within the volume covered by the electron swarm. As formation time continues, however, the region of minimum phi moves to slightly higher radii and the electric field between the extrema becomes somewhat erratic. In the intermediate formation periods the avalanche growth is slightly retarded by the high concentration of ions in the tail which oppose the external electric field. Eventually the formation of ions and electrons in the localized regions of high field strength more than offset this effect causing a very abrupt increase in avalanche growth.

  16. Directionality fields generated by a local Hilbert transform

    NASA Astrophysics Data System (ADS)

    Ahmed, W. W.; Herrero, R.; Botey, M.; Hayran, Z.; Kurt, H.; Staliunas, K.

    2018-03-01

We propose an approach based on a local Hilbert transform to design non-Hermitian potentials generating arbitrary vector fields of directionality, p⃗(r⃗), with desired shapes and topologies. We derive a local Hilbert transform to systematically build such potentials by modifying background potentials (being either regular or random, extended or localized). We explore particular directionality fields, for instance in the form of a focus to create sinks for probe fields (which could help to increase absorption at the sink), or to generate vortices in the probe fields. Physically, the proposed directionality fields provide a flexible mechanism for dynamical shaping and precise control over probe fields leading to novel effects in wave dynamics.

  17. Study protocol for a group randomized controlled trial of a classroom-based intervention aimed at preventing early risk factors for drug abuse: integrating effectiveness and implementation research.

    PubMed

    Poduska, Jeanne; Kellam, Sheppard; Brown, C Hendricks; Ford, Carla; Windham, Amy; Keegan, Natalie; Wang, Wei

    2009-09-02

    While a number of preventive interventions delivered within schools have shown both short-term and long-term impact in epidemiologically based randomized field trials, programs are not often sustained with high-quality implementation over time. This study was designed to support two purposes. The first purpose was to test the effectiveness of a universal classroom-based intervention, the Whole Day First Grade Program (WD), aimed at two early antecedents to drug abuse and other problem behaviors, namely, aggressive, disruptive behavior and poor academic achievement. The second purpose--the focus of this paper--was to examine the utility of a multilevel structure to support high levels of implementation during the effectiveness trial, to sustain WD practices across additional years, and to train additional teachers in WD practices. The WD intervention integrated three components, each previously tested separately: classroom behavior management; instruction, specifically reading; and family-classroom partnerships around behavior and learning. Teachers and students in 12 schools were randomly assigned to receive either the WD intervention or the standard first-grade program of the school system (SC). Three consecutive cohorts of first graders were randomized within schools to WD or SC classrooms and followed through the end of third grade to test the effectiveness of the WD intervention. Teacher practices were assessed over three years to examine the utility of the multilevel structure to support sustainability and scaling-up. The design employed in this trial appears to have considerable utility to provide data on WD effectiveness and to inform the field with regard to structures required to move evidence-based programs into practice. NCT00257088.

  18. Three-Dimensional Electromagnetic Scattering from Layered Media with Rough Interfaces for Subsurface Radar Remote Sensing

    NASA Astrophysics Data System (ADS)

    Duan, Xueyang

    The objective of this dissertation is to develop forward scattering models for active microwave remote sensing of natural features represented by layered media with rough interfaces. In particular, soil profiles are considered, for which a model of electromagnetic scattering from multilayer rough surfaces with or without buried random media is constructed. Starting from a single rough surface, radar scattering is modeled using the stabilized extended boundary condition method (SEBCM). This method solves the long-standing instability issue of the classical EBCM, and gives three-dimensional full wave solutions over large ranges of surface roughnesses with higher computational efficiency than pure numerical solutions, e.g., method of moments (MoM). Based on this single surface solution, multilayer rough surface scattering is modeled using the scattering matrix approach and the model is used for a comprehensive sensitivity analysis of the total ground scattering as a function of layer separation, subsurface statistics, and sublayer dielectric properties. The buried inhomogeneities such as rocks and vegetation roots are considered for the first time in the forward scattering model. Radar scattering from buried random media is modeled by the aggregate transition matrix using either the recursive transition matrix approach for spherical or short-length cylindrical scatterers, or the generalized iterative extended boundary condition method we developed for long cylinders or root-like cylindrical clusters. These approaches take the field interactions among scatterers into account with high computational efficiency. The aggregate transition matrix is transformed to a scattering matrix for the full solution to the layered-medium problem. This step is based on the near-to-far field transformation of the numerical plane wave expansion of the spherical harmonics and the multipole expansion of plane waves. 
This transformation consolidates volume scattering from the buried random medium with the scattering from the layered structure in general. Combined with scattering from multilayer rough surfaces, scattering contributions from subsurfaces and vegetation roots can then be simulated. Solutions of both the rough surface scattering and random media scattering are validated numerically, experimentally, or both. The experimental validations have been carried out using a laboratory-based transmit-receive system for scattering from random media and a new bistatic tower-mounted radar system for field-based surface scattering measurements.

  19. Automated feature extraction and spatial organization of seafloor pockmarks, Belfast Bay, Maine, USA

    USGS Publications Warehouse

    Andrews, Brian D.; Brothers, Laura L.; Barnhardt, Walter A.

    2010-01-01

Seafloor pockmarks occur worldwide and may represent millions of m3 of continental shelf erosion, but few numerical analyses of their morphology and spatial distribution exist. We introduce a quantitative definition of pockmark morphology and, based on this definition, propose a three-step geomorphometric method to identify and extract pockmarks from high-resolution swath bathymetry. We apply this GIS-implemented approach to 25 km2 of bathymetry collected in the Belfast Bay, Maine (USA) pockmark field. Our model extracted 1767 pockmarks and found a linear depth-to-diameter relationship for pockmarks field-wide. Mean pockmark depth is 7.6 m and mean diameter is 84.8 m. Pockmark distribution is non-random, and nearly half of the field's pockmarks occur in chains. The most prominent chains are oriented semi-normal to the steepest gradient in Holocene sediment thickness. A descriptive model yields field-wide spatial statistics indicating that pockmarks are distributed in non-random clusters. Results enable quantitative comparison of pockmarks in fields worldwide as well as similar concave features, such as impact craters, dolines, or salt pools.

  20. Criticality of the random field Ising model in and out of equilibrium: A nonperturbative functional renormalization group description

    NASA Astrophysics Data System (ADS)

    Balog, Ivan; Tarjus, Gilles; Tissier, Matthieu

    2018-03-01

We show that, contrary to previous suggestions based on computer simulations or erroneous theoretical treatments, the critical points of the random-field Ising model out of equilibrium, when quasistatically changing the applied source at zero temperature, and in equilibrium are not in the same universality class below some critical dimension d_DR ≈ 5.1. We demonstrate this by implementing a nonperturbative functional renormalization group for the associated dynamical field theory. Above d_DR, the avalanches, which characterize the evolution of the system at zero temperature, become irrelevant at large distance, and hysteresis and equilibrium critical points are then controlled by the same fixed point. We explain how to use computer simulation and finite-size scaling to check the correspondence between in- and out-of-equilibrium criticality in a far less ambiguous way than done so far.

  1. On the design of random metasurface based devices.

    PubMed

    Dupré, Matthieu; Hsu, Liyi; Kanté, Boubacar

    2018-05-08

    Metasurfaces are generally designed by placing scatterers in periodic or pseudo-periodic grids. We propose and discuss design rules for functional metasurfaces with randomly placed anisotropic elements that randomly sample a well-defined phase function. By analyzing the focusing performance of random metasurface lenses as a function of their density and the density of the phase-maps used to design them, we find that the performance of 1D metasurfaces is mostly governed by their density while 2D metasurfaces strongly depend on both the density and the near-field coupling configuration of the surface. The proposed approach is used to design all-polarization random metalenses at near infrared frequencies. Challenges, as well as opportunities of random metasurfaces compared to periodic ones are discussed. Our results pave the way to new approaches in the design of nanophotonic structures and devices from lenses to solar energy concentrators.

  2. Exploration of Teaching Skills of Pre-Service High School Teachers' through Self-Regulated Learning Based on Learning Style

    ERIC Educational Resources Information Center

    Habibi; Kuswanto, Heru; Yanti, Fitri April

    2017-01-01

An expert in the field of science often finds it difficult to teach his or her knowledge to students. Conversely, someone who is an expert in the field of education is more adept at transferring knowledge. The purpose of this research is to explore the teaching skills of pre-service high school physics teachers. Samples were taken randomly as…

  3. Using the Major Field Test for a Bachelor's Degree in Business as a Learning Outcomes Assessment: Evidence from a Review of 20 Years of Institution-Based Research

    ERIC Educational Resources Information Center

    Ling, Guangming; Bochenek, Jennifer; Burkander, Kri

    2015-01-01

    By applying multilevel models with random effects, the authors reviewed and synthesized findings from 30 studies that were published in the last 20 years exploring the relationship between the Educational Testing Service Major Field Test for a Bachelor's Degree in Business (MFTB) and related factors. The results suggest that MFTB scores correlated…

  4. Source-Device-Independent Ultrafast Quantum Random Number Generation.

    PubMed

    Marangon, Davide G; Vallone, Giuseppe; Villoresi, Paolo

    2017-02-10

Secure random numbers are a fundamental element of many applications in science, statistics, cryptography and, more generally, in security protocols. We present a method that enables the generation of high-speed unpredictable random numbers from the quadratures of an electromagnetic field without any assumption on the input state. The method allows us to eliminate the numbers that can be predicted due to the presence of classical and quantum side information. In particular, we introduce a procedure to estimate a bound on the conditional min-entropy based on the entropic uncertainty principle for position and momentum observables of infinite dimensional quantum systems. By the above method, we experimentally demonstrated the generation of secure true random bits at a rate greater than 1.7 Gbit/s.
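    The min-entropy quantifies the extractable randomness of a digitized source as H_min = −log₂(max_i p_i) over the discretized outcomes. The toy sketch below illustrates only this definition on simulated Gaussian "quadrature" samples with an assumed 8-bit ADC range; it does not reproduce the paper's entropic-uncertainty bound, which additionally accounts for classical and quantum side information.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def min_entropy(samples, bin_edges):
        """Worst-case (min-)entropy per sample, H_min = -log2(max_i p_i),
        of ADC-discretized outcomes."""
        counts, _ = np.histogram(samples, bins=bin_edges)
        p_max = counts.max() / counts.sum()
        return -np.log2(p_max)

    # Toy stand-in for digitized quadrature measurements (Gaussian noise, 8-bit ADC).
    x = rng.normal(0.0, 1.0, 1_000_000)
    edges = np.linspace(-4, 4, 257)   # 256 uniform ADC bins over an assumed range
    print(min_entropy(x, edges))      # well below the raw 8 bits per sample
    ```

    A randomness extractor would then compress the raw samples down to roughly this many bits each; the gap between the ADC resolution and H_min is exactly the predictable part that must be discarded.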

  5. Cover estimation and payload location using Markov random fields

    NASA Astrophysics Data System (ADS)

    Quach, Tu-Thach

    2014-02-01

Payload location is an approach to find the message bits hidden in steganographic images, but not necessarily their logical order. Its success relies primarily on the accuracy of the underlying cover estimators and can be improved if more estimators are used. This paper presents an approach based on a Markov random field to estimate the cover image given a stego image. It uses pairwise constraints to capture the natural two-dimensional statistics of cover images and forms a basis for more sophisticated models. Experimental results show that it is competitive against current state-of-the-art estimators and can locate payload embedded by simple LSB steganography and group-parity steganography. Furthermore, when combined with existing estimators, payload location accuracy improves significantly.

  6. Inflation with a graceful exit in a random landscape

    NASA Astrophysics Data System (ADS)

    Pedro, F. G.; Westphal, A.

    2017-03-01

    We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N ≪ 10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.

  7. Optimal estimation of spatially variable recharge and transmissivity fields under steady-state groundwater flow. Part 1. Theory

    NASA Astrophysics Data System (ADS)

    Graham, Wendy D.; Tankersley, Claude D.

    1994-05-01

    Stochastic methods are used to analyze two-dimensional steady groundwater flow subject to spatially variable recharge and transmissivity. Approximate partial differential equations are developed for the covariances and cross-covariances between the random head, transmissivity and recharge fields. Closed-form solutions of these equations are obtained using Fourier transform techniques. The resulting covariances and cross-covariances can be incorporated into a Bayesian conditioning procedure which provides optimal estimates of the recharge, transmissivity and head fields given available measurements of any or all of these random fields. Results show that head measurements contain valuable information for estimating the random recharge field. However, when recharge is treated as a spatially variable random field, the value of head measurements for estimating the transmissivity field can be reduced considerably. In a companion paper, the method is applied to a case study of the Upper Floridan Aquifer in NE Florida.
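    The conditioning step described above can be illustrated with a standard Gaussian (kriging-type) update on a one-dimensional grid. Everything below — the exponential covariance model, the grid, and the noise level — is an illustrative assumption; the paper's method additionally derives cross-covariances between head, transmissivity, and recharge from the flow equations, which this sketch omits.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # 1-D grid and an exponential covariance model (assumed, for illustration)
    x = np.linspace(0.0, 10.0, 101)
    def cov(a, b, var=1.0, ell=2.0):
        return var * np.exp(-np.abs(a[:, None] - b[None, :]) / ell)

    C = cov(x, x)

    # Synthetic "truth" and three noisy point measurements of the field
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))
    truth = L @ rng.standard_normal(len(x))
    idx = np.array([10, 50, 90])
    noise_var = 0.01
    y = truth[idx] + rng.normal(0.0, np.sqrt(noise_var), idx.size)

    # Bayesian (kriging) update: posterior mean and covariance of the field
    K = C[:, idx]
    A = C[np.ix_(idx, idx)] + noise_var * np.eye(idx.size)
    mean_post = K @ np.linalg.solve(A, y)
    cov_post = C - K @ np.linalg.solve(A, K.T)

    print(np.abs(mean_post[idx] - y).max())   # posterior honors the data
    ```

    The posterior variance (diagonal of cov_post) collapses near the measurement locations and relaxes back to the prior variance away from them, which is the mechanism the paper exploits when head data inform the recharge and transmissivity estimates.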

  8. The random field Blume-Capel model revisited

    NASA Astrophysics Data System (ADS)

    Santos, P. V.; da Costa, F. A.; de Araújo, J. M.

    2018-04-01

We have revisited the mean-field treatment for the Blume-Capel model under the presence of a discrete random magnetic field as introduced by Kaufman and Kanner (1990). The magnetic field (H) versus temperature (T) phase diagrams for given values of the crystal field D were recovered in accordance with Kaufman and Kanner's original work. However, our main goal in the present work was to investigate the distinct structures of the crystal field versus temperature phase diagrams as the random magnetic field is varied, because similar models have presented reentrant phenomena due to randomness. Following previous works we have classified the distinct phase diagrams according to five different topologies. The topological structure of the phase diagrams is maintained for both the H - T and D - T cases. Although the phase diagrams exhibit a richness of multicritical phenomena, we did not find any reentrant effect such as has been seen in similar models.

  9. Compact quantum random number generator based on superluminescent light-emitting diodes

    NASA Astrophysics Data System (ADS)

    Wei, Shihai; Yang, Jie; Fan, Fan; Huang, Wei; Li, Dashuang; Xu, Bingjie

    2017-12-01

    By measuring the amplified spontaneous emission (ASE) noise of the superluminescent light emitting diodes, we propose and realize a quantum random number generator (QRNG) featured with practicability. In the QRNG, after the detection and amplification of the ASE noise, the data acquisition and randomness extraction which is integrated in a field programmable gate array (FPGA) are both implemented in real-time, and the final random bit sequences are delivered to a host computer with a real-time generation rate of 1.2 Gbps. Further, to achieve compactness, all the components of the QRNG are integrated on three independent printed circuit boards with a compact design, and the QRNG is packed in a small enclosure sized 140 mm × 120 mm × 25 mm. The final random bit sequences can pass all the NIST-STS and DIEHARD tests.

  10. A New Algorithm with Plane Waves and Wavelets for Random Velocity Fields with Many Spatial Scales

    NASA Astrophysics Data System (ADS)

    Elliott, Frank W.; Majda, Andrew J.

    1995-03-01

A new Monte Carlo algorithm for constructing and sampling stationary isotropic Gaussian random fields with power-law energy spectrum, infrared divergence, and fractal self-similar scaling is developed here. The theoretical basis for this algorithm involves the fact that such a random field is well approximated by a superposition of random one-dimensional plane waves involving a fixed finite number of directions. In general each one-dimensional plane wave is the sum of a random shear layer and a random acoustical wave. These one-dimensional random plane waves are then simulated by a wavelet Monte Carlo method for a single space variable developed recently by the authors. The computational results reported in this paper demonstrate remarkably low variance and economical representation of such Gaussian random fields through this new algorithm. In particular, the velocity structure function for an incompressible isotropic Gaussian random field in two space dimensions with the Kolmogoroff spectrum can be simulated accurately over 12 decades with only 100 realizations of the algorithm with the scaling exponent accurate to 1.1% and the constant prefactor accurate to 6%; in fact, the exponent of the velocity structure function can be computed over 12 decades within 3.3% with only 10 realizations. Furthermore, only 46,592 active computational elements are utilized in each realization to achieve these results for 12 decades of scaling behavior.
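    For contrast, the conventional baseline that plane-wave/wavelet methods improve upon can be sketched: direct FFT spectral synthesis of a stationary isotropic Gaussian field with a power-law spectrum. This is a standard textbook technique, not the authors' algorithm, and the grid size and spectral exponent below are illustrative; it shows why covering many decades of scaling on a grid is expensive, since each decade multiplies the number of grid points.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def gaussian_random_field(n, alpha=8/3):
        """2-D stationary isotropic Gaussian field with power-law spectrum
        S(k) ~ k**(-alpha), via standard FFT spectral synthesis."""
        kx = np.fft.fftfreq(n)[:, None]
        ky = np.fft.fftfreq(n)[None, :]
        k = np.hypot(kx, ky)
        k[0, 0] = np.inf                  # suppress the (infrared-divergent) mean mode
        amp = k ** (-alpha / 2.0)         # amplitude = sqrt(spectrum)
        noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        field = np.fft.ifft2(amp * noise).real
        return field / field.std()        # normalize to unit variance

    f = gaussian_random_field(256)
    print(f.shape, round(float(f.std()), 3))
    ```

    A 256-point grid resolves only about two decades of wavenumber, which is the scaling limitation the plane-wave/wavelet construction in the abstract is designed to overcome.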

  11. Evidence-based medicine in plastic surgery: where did it come from and where is it going?

    PubMed

    Ricci, Joseph A; Desai, Naman S

    2014-05-01

Evidence-based medicine, particularly randomized controlled trials, influences many of the daily decisions within plastic surgery as well as nearly every other medical specialty, and will continue to play a larger role in medicine in the future. Even though it is certainly not a new idea, evidence-based medicine continues to remain a hot topic among members of the healthcare community. As evidence-based medicine continues to grow and evolve, it is becoming more important for all physicians to understand the fundamentals of evidence-based medicine: how evidence-based medicine has changed, and how to successfully incorporate it into the daily practice of medicine. Admittedly, the wide acceptance and implementation of evidence-based medicine has been slower in surgical fields such as plastic surgery given the difficulty in performing large-scale blinded randomized controlled trials due to the inherent nature of a surgical intervention as a treatment modality. Despite these challenges, the plastic surgery literature has recently begun to respond to the demand for more evidence-based medicine. Today's plastic surgeons are making a concerted effort to embrace evidence-based medicine by increasing the output of high-level clinical evidence, and should be encouraged to continue to further their endeavors in the field of evidence-based medicine in the future. © 2014 Chinese Cochrane Center, West China Hospital of Sichuan University and Wiley Publishing Asia Pty Ltd.

  12. Estimation of beam material random field properties via sensitivity-based model updating using experimental frequency response functions

    NASA Astrophysics Data System (ADS)

    Machado, M. R.; Adhikari, S.; Dos Santos, J. M. C.; Arruda, J. R. F.

    2018-03-01

    Structural parameter estimation is affected not only by measurement noise but also by unknown uncertainties which are present in the system. Deterministic structural model updating methods minimise the difference between experimentally measured data and computational prediction. Sensitivity-based methods are very efficient in solving structural model updating problems. Material and geometrical parameters of the structure such as Poisson's ratio, Young's modulus, mass density, modal damping, etc. are usually considered deterministic and homogeneous. In this paper, the distributed and non-homogeneous characteristics of these parameters are considered in the model updating. The parameters are taken as spatially correlated random fields and are expanded in a spectral Karhunen-Loève (KL) decomposition. Using the KL expansion, the spectral dynamic stiffness matrix of the beam is expanded as a series in terms of discretized parameters, which can be estimated using sensitivity-based model updating techniques. Numerical and experimental tests involving a beam with distributed bending rigidity and mass density are used to verify the proposed method. This extension of standard model updating procedures can enhance the dynamic description of structural dynamic models.
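    The Karhunen-Loève expansion mentioned above reduces a spatially correlated random field to a small number of uncorrelated random variables. A minimal numerical sketch, assuming an exponential covariance model and illustrative parameter values (not taken from the paper), is the discrete eigendecomposition of the covariance matrix on a grid along the beam:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Assumed exponential covariance of a material-property field on a unit beam
    x = np.linspace(0.0, 1.0, 200)
    sigma, ell = 0.1, 0.3
    C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

    # Discrete Karhunen-Loeve decomposition: C = sum_k lam_k phi_k phi_k^T
    lam, phi = np.linalg.eigh(C)
    order = np.argsort(lam)[::-1]
    lam, phi = lam[order], phi[:, order]

    # Truncate: keep enough modes to capture 95% of the total variance
    n_kl = int(np.searchsorted(np.cumsum(lam) / lam.sum(), 0.95)) + 1

    # One realization: theta(x) = mean + sum_k sqrt(lam_k) * xi_k * phi_k(x)
    xi = rng.standard_normal(n_kl)
    theta = 1.0 + phi[:, :n_kl] @ (np.sqrt(lam[:n_kl]) * xi)
    print(n_kl, theta.shape)
    ```

    The truncated coefficients (the xi_k, or equivalently the discretized modal amplitudes) are the low-dimensional parameter set that a sensitivity-based model-updating scheme can then estimate from measured frequency response functions.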

  13. Markov-random-field-based super-resolution mapping for identification of urban trees in VHR images

    NASA Astrophysics Data System (ADS)

    Ardila, Juan P.; Tolpekin, Valentyn A.; Bijker, Wietske; Stein, Alfred

    2011-11-01

Identification of tree crowns from remote sensing requires detailed spectral information and submeter spatial resolution imagery. Traditional pixel-based classification techniques do not fully exploit the spatial and spectral characteristics of remote sensing datasets. We propose a contextual and probabilistic method for detection of tree crowns in urban areas using a Markov-random-field-based super-resolution mapping (SRM) approach in very high resolution images. Our method defines an objective energy function in terms of the conditional probabilities of panchromatic and multispectral images and it locally optimizes the labeling of tree crown pixels. Energy and model parameter values are estimated from multiple implementations of SRM in tuning areas and the method is applied in QuickBird images to produce a 0.6 m tree crown map in a city of The Netherlands. The SRM output shows an identification rate of 66% and commission and omission errors in small trees and shrub areas. The method outperforms tree crown identification results obtained with maximum likelihood, support vector machine, and nominal-resolution (2.4 m) SRM approaches.

  14. Portal imaging based definition of the planning target volume during pelvic irradiation for gynecological malignancies.

    PubMed

    Mock, U; Dieckmann, K; Wolff, U; Knocke, T H; Pötter, R

    1999-08-01

Geometrical accuracy in patient positioning can vary substantially during external radiotherapy. This study estimated the set-up accuracy during pelvic irradiation for gynecological malignancies for determination of safety margins (planning target volume, PTV). Based on electronic portal imaging devices (EPID), 25 patients undergoing 4-field pelvic irradiation for gynecological malignancies were analyzed with regard to set-up accuracy during the treatment course. Regularly performed EPID images were used in order to systematically assess the systematic and random components of set-up displacements. Anatomical matching of verification and simulation images was followed by measuring corresponding distances between the central axis and anatomical features. Data analysis of set-up errors referred to the x-, y-, and z-axes. Additionally, cumulative frequencies were evaluated. A total of 50 simulation films and 313 verification images were analyzed. For the anterior-posterior (AP) beam direction mean deviations along the x- and z-axes were 1.5 mm and -1.9 mm, respectively. Moreover, random errors of 4.8 mm (x-axis) and 3.0 mm (z-axis) were determined. Concerning the latero-lateral treatment fields, the systematic errors along the two axes were calculated to 2.9 mm (y-axis) and -2.0 mm (z-axis) and random errors of 3.8 mm and 3.5 mm were found, respectively. The cumulative frequency of misalignments < or =5 mm showed values of 75% (AP fields) and 72% (latero-lateral fields). With regard to cumulative frequencies < or =10 mm, quantification revealed values of 97% for both beam directions. During external pelvic irradiation therapy for gynecological malignancies, EPID images on a regular basis revealed acceptable set-up inaccuracies. Safety margins (PTV) of 1 cm appear to be sufficient, accounting for more than 95% of all deviations.

  15. Anomalous diffusion on a random comblike structure

    NASA Astrophysics Data System (ADS)

    Havlin, Shlomo; Kiefer, James E.; Weiss, George H.

    1987-08-01

We have recently studied a random walk on a comblike structure as an analog of diffusion on a fractal structure. In our earlier work, the comb was assumed to have a deterministic structure, the comb having teeth of infinite length. In the present paper we study diffusion on a one-dimensional random comb, the lengths of whose teeth are random variables with an asymptotic stable-law distribution φ(L) ~ L^-(1+γ), where 0 < γ <= 1. Two mean-field methods are used for the analysis, one based on the continuous-time random walk, and the second a self-consistent scaling theory. Both lead to the same conclusions. We find that the diffusion exponent characterizing the mean-square displacement along the backbone of the comb is d_w = 4/(1+γ) for γ < 1 and d_w = 2 for γ >= 1. The probability of being at the origin at time t is P_0(t) ~ t^(-d_s/2) for large t, with d_s = (3-γ)/2 for γ < 1 and d_s = 1 for γ > 1. When a field is applied along the backbone of the comb, the diffusion exponent is d_w = 2/(1+γ) for γ < 1 and d_w = 1 for γ >= 1. The theoretical results are confirmed using the exact enumeration method.
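    The two ingredients of such a model are easy to sketch: drawing tooth lengths from a distribution with the stated power-law tail (via inverse-transform sampling), and evaluating the predicted backbone exponent. The sketch below is a minimal illustration with assumed parameters, not the exact-enumeration calculation of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def sample_tooth_lengths(n, gamma, l_min=1.0):
        """Inverse-transform sampling of phi(L) ~ L**-(1+gamma) for L >= l_min
        (Pareto tail with index gamma)."""
        u = rng.random(n)
        return l_min * u ** (-1.0 / gamma)

    def backbone_exponent(gamma):
        """Anomalous-diffusion exponent d_w along the backbone (no bias field):
        d_w = 4/(1+gamma) for gamma < 1, else 2 (normal diffusion)."""
        return 4.0 / (1.0 + gamma) if gamma < 1 else 2.0

    L = sample_tooth_lengths(100_000, gamma=0.5)
    print(L.min() >= 1.0, backbone_exponent(0.5))  # d_w = 4/(1+gamma) = 8/3
    ```

    For gamma = 0.5 the mean tooth length diverges, which is exactly the regime where trapping in long teeth slows the backbone motion and d_w exceeds the normal-diffusion value of 2.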

  16. Flexible random lasers with tunable lasing emissions.

    PubMed

    Lee, Ya-Ju; Chou, Chun-Yang; Yang, Zu-Po; Nguyen, Thi Bich Hanh; Yao, Yung-Chi; Yeh, Ting-Wei; Tsai, Meng-Tsan; Kuo, Hao-Chun

    2018-04-19

    In this study, we experimentally demonstrated a flexible random laser fabricated on a polyethylene terephthalate (PET) substrate with a high degree of tunability in lasing emissions. Random lasing oscillation arises mainly from the resonance coupling between the emitted photons of gain medium (Rhodamine 6G, R6G) and the localized surface plasmon (LSP) of silver nanoprisms (Ag NPRs), which increases the effective cross-section for multiple light scattering, thus stimulating the lasing emissions. More importantly, it was found that the random lasing wavelength is blue-shifted monolithically with the increase in bending strains exerted on the PET substrate, and a maximum shift of ∼15 nm was achieved in the lasing wavelength, when a 50% bending strain was exerted on the PET substrate. Such observation is highly repeatable and reversible, and this validates that we can control the lasing wavelength by simply bending the flexible substrate decorated with the Ag NPRs. The scattering spectrum of the Ag NPRs was obtained using a dark-field microscope to understand the mechanism for the dependence of the wavelength shift on the exerted bending strains. As a result, we believe that the experimental demonstration of tunable lasing emissions based on the revealed structure is expected to open up a new application field of random lasers.

  17. Restoration of dimensional reduction in the random-field Ising model at five dimensions

    NASA Astrophysics Data System (ADS)

    Fytas, Nikolaos G.; Martín-Mayor, Víctor; Picco, Marco; Sourlas, Nicolas

    2017-04-01

The random-field Ising model is one of the few disordered systems where the perturbative renormalization group can be carried out to all orders of perturbation theory. This analysis predicts dimensional reduction, i.e., that the critical properties of the random-field Ising model in D dimensions are identical to those of the pure Ising ferromagnet in D-2 dimensions. It is well known that dimensional reduction is not true in three dimensions, thus invalidating the perturbative renormalization group prediction. Here, we report high-precision numerical simulations of the 5D random-field Ising model at zero temperature. We illustrate universality by comparing different probability distributions for the random fields. We compute all the relevant critical exponents (including the critical slowing down exponent for the ground-state finding algorithm), as well as several other renormalization-group invariants. The estimated values of the critical exponents of the 5D random-field Ising model are statistically compatible to those of the pure 3D Ising ferromagnet. These results support the restoration of dimensional reduction at D = 5. We thus conclude that the failure of the perturbative renormalization group is a low-dimensional phenomenon. We close our contribution by comparing universal quantities for the random-field problem at dimensions 3 ≤ D < 6 to their values in the pure Ising model at D-2 dimensions, and we provide a clear verification of the Rushbrooke equality at all studied dimensions.

  18. Restoration of dimensional reduction in the random-field Ising model at five dimensions.

    PubMed

    Fytas, Nikolaos G; Martín-Mayor, Víctor; Picco, Marco; Sourlas, Nicolas

    2017-04-01

    The random-field Ising model is one of the few disordered systems where the perturbative renormalization group can be carried out to all orders of perturbation theory. This analysis predicts dimensional reduction, i.e., that the critical properties of the random-field Ising model in D dimensions are identical to those of the pure Ising ferromagnet in D-2 dimensions. It is well known that dimensional reduction is not true in three dimensions, thus invalidating the perturbative renormalization group prediction. Here, we report high-precision numerical simulations of the 5D random-field Ising model at zero temperature. We illustrate universality by comparing different probability distributions for the random fields. We compute all the relevant critical exponents (including the critical slowing down exponent for the ground-state finding algorithm), as well as several other renormalization-group invariants. The estimated values of the critical exponents of the 5D random-field Ising model are statistically compatible to those of the pure 3D Ising ferromagnet. These results support the restoration of dimensional reduction at D=5. We thus conclude that the failure of the perturbative renormalization group is a low-dimensional phenomenon. We close our contribution by comparing universal quantities for the random-field problem at dimensions 3≤D<6 to their values in the pure Ising model at D-2 dimensions, and we provide a clear verification of the Rushbrooke equality at all studied dimensions.

  19. Efficient 3D porous microstructure reconstruction via Gaussian random field and hybrid optimization.

    PubMed

    Jiang, Z; Chen, W; Burkhart, C

    2013-11-01

Obtaining an accurate three-dimensional (3D) structure of a porous microstructure is important for assessing the material properties based on finite element analysis. Whereas directly obtaining 3D images of the microstructure is impractical under many circumstances, two sets of methods have been developed in the literature to generate (reconstruct) a 3D microstructure from its 2D images: one characterizes the microstructure based on certain statistical descriptors, typically the two-point correlation function and cluster correlation function, and then performs an optimization process to build a 3D structure that matches those statistical descriptors; the other models the microstructure using stochastic models like a Gaussian random field and generates a 3D structure directly from the function. The former obtains a relatively accurate 3D microstructure, but computationally the optimization process can be very intensive, especially for problems with large image size; the latter generates a 3D microstructure quickly but sacrifices accuracy due to issues in numerical implementations. A hybrid optimization approach to modelling the 3D porous microstructure of random isotropic two-phase materials is proposed in this paper, which combines the two sets of methods and hence maintains the accuracy of the correlation-based method with improved efficiency. The proposed technique is verified for 3D reconstructions based on silica polymer composite images with different volume fractions. A comparison of the reconstructed microstructures and the optimization histories for both the original correlation-based method and our hybrid approach demonstrates the improved efficiency of the approach. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
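The Gaussian-random-field route described above can be sketched in a few lines: filter white noise spectrally to impose a correlation length, then level-cut the field at the target volume fraction. This is a minimal illustration, not the paper's hybrid method; the Gaussian spectral filter and all parameter values are illustrative assumptions.

```python
import numpy as np

def gaussian_random_field(n, corr_len, seed=0):
    """Generate an n x n Gaussian random field by spectral filtering
    of white noise (Gaussian filter chosen purely for illustration)."""
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    # The spectral filter width sets the field's correlation length.
    filt = np.exp(-(kx**2 + ky**2) * (corr_len**2) * (2 * np.pi**2))
    noise = rng.standard_normal((n, n))
    field = np.fft.ifft2(np.fft.fft2(noise) * np.sqrt(filt)).real
    return (field - field.mean()) / field.std()

def threshold_to_phases(field, volume_fraction):
    """Level-cut the Gaussian field so the pore phase occupies the
    target volume fraction, giving a two-phase (0/1) microstructure."""
    cut = np.quantile(field, 1.0 - volume_fraction)
    return (field >= cut).astype(np.uint8)

micro = threshold_to_phases(gaussian_random_field(64, 4.0), 0.3)
```

Matching two-point correlations, as in the correlation-based branch of the hybrid approach, would add an optimization loop on top of this generator.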

  20. 3D polarisation speckle as a demonstration of tensor version of the van Cittert-Zernike theorem for stochastic electromagnetic beams

    NASA Astrophysics Data System (ADS)

    Ma, Ning; Zhao, Juan; Hanson, Steen G.; Takeda, Mitsuo; Wang, Wei

    2016-10-01

    Laser speckle has received extensive studies of its basic properties and associated applications. In the majority of research on speckle phenomena, the random optical field has been treated as a scalar optical field, and the main interest has been concentrated on their statistical properties and applications of its intensity distribution. Recently, statistical properties of random electric vector fields referred to as Polarization Speckle have come to attract new interest because of their importance in a variety of areas with practical applications such as biomedical optics and optical metrology. Statistical phenomena of random electric vector fields have close relevance to the theories of speckles, polarization and coherence theory. In this paper, we investigate the correlation tensor for stochastic electromagnetic fields modulated by a depolarizer consisting of a rough-surfaced retardation plate. Under the assumption that the microstructure of the scattering surface on the depolarizer is as fine as to be unresolvable in our observation region, we have derived a relationship between the polarization matrix/coherency matrix for the modulated electric fields behind the rough-surfaced retardation plate and the coherence matrix under the free space geometry. This relation is regarded as entirely analogous to the van Cittert-Zernike theorem of classical coherence theory. Within the paraxial approximation as represented by the ABCD-matrix formalism, the three-dimensional structure of the generated polarization speckle is investigated based on the correlation tensor, indicating a typical carrot structure with a much longer axial dimension than the extent in its transverse dimension.

  1. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-01-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
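The second-moment idea can be illustrated on a single random input. The sketch below is not the PFEM formulation itself (which discretizes random fields over a mesh); it propagates the mean and variance of a Young's modulus through a cantilever-deflection formula by first-order Taylor expansion and checks the result against Monte Carlo. All numbers are illustrative.

```python
import numpy as np

# Tip deflection of a cantilever, u(E) = F L^3 / (3 E I); E is random.
F, L, I = 1.0e3, 2.0, 8.0e-6        # illustrative load, length, inertia
mu_E, sigma_E = 200e9, 10e9         # mean and std of Young's modulus

def deflection(E):
    return F * L**3 / (3.0 * E * I)

# First-order second-moment (FOSM) estimates: expand u(E) about mu_E.
du_dE = -F * L**3 / (3.0 * mu_E**2 * I)     # sensitivity at the mean
mean_u_fosm = deflection(mu_E)
std_u_fosm = abs(du_dE) * sigma_E

# Monte Carlo reference (E stays positive here since its CV is only 5%).
rng = np.random.default_rng(1)
E_samples = rng.normal(mu_E, sigma_E, 200_000)
u_mc = deflection(E_samples)
```

With a 5% coefficient of variation the first-order moments agree with Monte Carlo to within a few percent; the approximation degrades as the scale of randomness grows, which is exactly the applicability limit noted in the abstract.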

  2. Variational approach to probabilistic finite elements

    NASA Astrophysics Data System (ADS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-08-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.

  3. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1987-01-01

    Probabilistic finite element method (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties, and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.

  4. Bayesian approach to non-Gaussian field statistics for diffusive broadband terahertz pulses.

    PubMed

    Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M

    2005-11-01

    We develop a closed-form expression for the probability distribution function for the field components of a diffusive broadband wave propagating through a random medium. We consider each spectral component to provide an individual observation of a random variable, the configurationally averaged spectral intensity. Since the intensity determines the variance of the field distribution at each frequency, this random variable serves as the Bayesian prior that determines the form of the non-Gaussian field statistics. This model agrees well with experimental results.
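The construction in this abstract, a Gaussian field whose variance is itself a random variable drawn from a prior, can be mimicked with a toy variance mixture. Assuming an exponential prior on the intensity purely for illustration (the paper derives its prior from configurational averaging of the spectral intensity), the mixture produces visibly heavier-than-Gaussian tails:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500_000

# Variance mixture: each draw's variance comes from a prior on the
# (configurationally averaged) intensity; the field component is
# Gaussian conditional on that variance.
intensity = rng.exponential(scale=1.0, size=n)        # illustrative prior
field = rng.standard_normal(n) * np.sqrt(intensity)   # compound draw

def excess_kurtosis(x):
    """Excess kurtosis: 0 for a Gaussian, positive for heavy tails."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2)**2 - 3.0

gauss = rng.standard_normal(n)      # pure-Gaussian reference
```

An exponential variance mixture of Gaussians is in fact a Laplace distribution, whose excess kurtosis is 3, so the non-Gaussian field statistics are visible directly in the fourth moment.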

  5. Examining the Use of Web-Based Tests for Testing Academic Vocabulary in EAP Instruction

    ERIC Educational Resources Information Center

    Dashtestani, Reza

    2015-01-01

Interest in Web-based and computer-assisted language testing is growing in the field of English for academic purposes (EAP). In this study, four groups of undergraduate EAP students (n = 120), each consisting of 30 students, were randomly selected from four different disciplines, i.e. biology, political sciences, psychology, and law. The four…

  6. Psychophysical spectro-temporal receptive fields in an auditory task.

    PubMed

    Shub, Daniel E; Richards, Virginia M

    2009-05-01

    Psychophysical relative weighting functions, which provide information about the importance of different regions of a stimulus in forming decisions, are traditionally estimated using trial-based procedures, where a single stimulus is presented and a single response is recorded. Everyday listening is much more "free-running" in that we often must detect randomly occurring signals in the presence of a continuous background. Psychophysical relative weighting functions have not been measured with free-running paradigms. Here, we combine a free-running paradigm with the reverse correlation technique used to estimate physiological spectro-temporal receptive fields (STRFs) to generate psychophysical relative weighting functions that are analogous to physiological STRFs. The psychophysical task required the detection of a fixed target signal (a sequence of spectro-temporally coherent tone pips with a known frequency) in the presence of a continuously presented informational masker (spectro-temporally random tone pips). A comparison of psychophysical relative weighting functions estimated with the current free-running paradigm and trial-based paradigms, suggests that in informational-masking tasks subjects' decision strategies are similar in both free-running and trial-based paradigms. For more cognitively challenging tasks there may be differences in the decision strategies with free-running and trial-based paradigms.
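Reverse correlation of the kind used for such psychophysical STRFs can be sketched with a linear-threshold observer: present random "tone-pip" stimuli, record binary detections, and average the stimuli that accompanied a response. The observer model, grid size, and thresholds below are illustrative assumptions, not the study's paradigm.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "STRF": target weights over 8 frequencies x 5 time bins.
true_strf = np.zeros((8, 5))
true_strf[3, 2] = 1.0          # the observer weights one tone pip heavily
true_strf[4, 2] = 0.5

n_trials = 20_000
stim = rng.standard_normal((n_trials, 8 * 5))      # random tone-pip field
drive = stim @ true_strf.ravel()                   # linear observer drive
resp = (drive + 0.5 * rng.standard_normal(n_trials) > 1.0).astype(float)

# Reverse correlation: covariance between stimulus and binary response
# recovers the relative weighting function up to a scale factor.
sta = (stim * resp[:, None]).mean(axis=0) - stim.mean(axis=0) * resp.mean()
est_strf = sta.reshape(8, 5)
```

For Gaussian stimuli this spike-triggered-average style estimate is proportional to the true weights, which is why the same machinery transfers from physiological STRFs to psychophysical weighting functions.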

  7. An Efficient ERP-Based Brain-Computer Interface Using Random Set Presentation and Face Familiarity

    PubMed Central

    Müller, Klaus-Robert; Lee, Seong-Whan

    2014-01-01

Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300 based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare P300-based BCI performances of a conventional row-column (RC)-based paradigm with our approach that combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, and thereby improving P300-based spelling performance. This led to a significant reduction of stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup. PMID:25384045

  8. An efficient ERP-based brain-computer interface using random set presentation and face familiarity.

    PubMed

    Yeom, Seul-Ki; Fazli, Siamac; Müller, Klaus-Robert; Lee, Seong-Whan

    2014-01-01

Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300 based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare P300-based BCI performances of a conventional row-column (RC)-based paradigm with our approach that combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, and thereby improving P300-based spelling performance. This led to a significant reduction of stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup.

  9. Safety assessment of a shallow foundation using the random finite element method

    NASA Astrophysics Data System (ADS)

    Zaskórski, Łukasz; Puła, Wojciech

    2015-04-01

The complex structure of soil and its random character make soil modeling a cumbersome task. Heterogeneity of soil has to be considered even within a homogeneous layer. Estimating the shear strength parameters of soil for the purposes of a geotechnical analysis therefore causes many problems. The applicable standard (Eurocode 7) presents no explicit method for evaluating characteristic values of soil parameters, only general guidelines on how these values should be estimated. Hence many approaches to assessing characteristic values of soil parameters are presented in the literature and can be applied in practice. In this paper, the reliability assessment of a shallow strip footing was conducted using a reliability index β. Several approaches to estimating characteristic values of soil properties were compared by evaluating the reliability index β achievable with each of them: the method of Orr and Breysse, Duncan's method, Schneider's method, Schneider's method accounting for the influence of fluctuation scales, and the method included in Eurocode 7. Design values of the bearing capacity based on these approaches were referred to the stochastic bearing capacity estimated by the random finite element method (RFEM). Design values of the bearing capacity were computed for various widths and depths of a foundation in conjunction with the design approaches (DA) defined in Eurocode. RFEM was presented by Griffiths and Fenton (1993); it combines the deterministic finite element method, random field theory, and Monte Carlo simulation. Random field theory allows the random character of soil parameters to be considered within a homogeneous layer: a soil property is treated as a separate random variable in every element of the finite element mesh, with a proper correlation structure between points of the given area. RFEM was applied to estimate which theoretical probability distribution best fits the empirical probability distribution of the bearing capacity, based on 3000 realizations. The assessed probability distribution was then used to compute design values of the bearing capacity and the related reliability indices β. The analyses were carried out for a cohesive soil; hence the friction angle and cohesion were defined as random parameters and characterized by two-dimensional random fields. The friction angle was described by a bounded distribution, as it varies within a limited range, while a lognormal distribution was applied to the cohesion. Other properties (Young's modulus, Poisson's ratio, and unit weight) were assumed to be deterministic, because they have negligible influence on the stochastic bearing capacity. Griffiths D. V., & Fenton G. A. (1993). Seepage beneath water retaining structures founded on spatially random soil. Géotechnique, 43(6), 577-587.
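The random-field ingredient of RFEM (assigning each element a spatially correlated random soil property) can be sketched by Cholesky factorization of an exponential correlation matrix, with the lognormal marginal used for cohesion in the study. Grid size, correlation length, and moments below are illustrative; a full RFEM run would feed such fields into a finite element bearing-capacity solver inside a Monte Carlo loop.

```python
import numpy as np

def lognormal_field_2d(nx, ny, dx, corr_len, mean_c, cov_c, seed=0):
    """Sample a 2D lognormal random field (e.g. cohesion) on a grid with
    an exponential correlation structure, via Cholesky factorization."""
    rng = np.random.default_rng(seed)
    xs, ys = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dx)
    pts = np.column_stack([xs.ravel(), ys.ravel()])
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    rho = np.exp(-2.0 * d / corr_len)       # exponential correlation
    # Moments of the underlying normal field from the lognormal target.
    s2 = np.log(1.0 + cov_c**2)             # cov_c: coefficient of variation
    mu = np.log(mean_c) - 0.5 * s2
    L = np.linalg.cholesky(rho + 1e-10 * np.eye(len(pts)))
    g = mu + np.sqrt(s2) * (L @ rng.standard_normal(len(pts)))
    return np.exp(g).reshape(ny, nx)

# Illustrative cohesion field: mean 20 kPa, 30% coefficient of variation.
field = lognormal_field_2d(10, 10, 0.5, 2.0, mean_c=20.0, cov_c=0.3)
```

Cholesky sampling is exact but scales cubically with the number of grid points; production RFEM codes typically use local average subdivision or spectral methods instead.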

  10. Effect of antifreeze protein on heterogeneous ice nucleation based on a two-dimensional random-field Ising model

    NASA Astrophysics Data System (ADS)

    Dong, Zhen; Wang, Jianjun; Zhou, Xin

    2017-05-01

Antifreeze proteins (AFPs) are the key biomolecules that protect many species from extreme conditions. Their unique antifreeze properties offer the potential for a wide range of applications. Inspired by current experimental approaches of creating an antifreeze surface by coating it with AFPs, here we present a two-dimensional random-field lattice Ising model to study the effect of AFPs on heterogeneous ice nucleation. The model shows that both the size and the free-energy effect of individual AFPs and their surface coverage dominate the antifreeze capacity of an AFP-coated surface. The simulation results are qualitatively consistent with recent experiments, revealing the origin of the surprisingly low antifreeze capacity of an AFP-coated surface when the coverage is not particularly high, as shown in experiments. These results will hopefully deepen our understanding of antifreeze effects and thus be potentially useful for designing novel antifreeze coating materials based on biomolecules.
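A bare-bones version of the underlying model, a 2D Ising lattice with a quenched Gaussian random field evolved by Metropolis dynamics, looks as follows. This omits the surface and AFP coupling specific to the paper; lattice size, field strength, and temperature are illustrative.

```python
import numpy as np

def energy(spins, h):
    """Total energy: nearest-neighbour ferromagnetic term (J = 1,
    periodic boundaries) plus the quenched random-field term."""
    return -(np.sum(spins * np.roll(spins, 1, axis=0))
             + np.sum(spins * np.roll(spins, 1, axis=1))
             + np.sum(h * spins))

def rfim_metropolis(n, h_std, beta, steps, seed=0):
    """Single-spin-flip Metropolis dynamics for the 2D random-field
    Ising model with Gaussian fields h_i of standard deviation h_std."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(n, n))
    h = rng.normal(0.0, h_std, size=(n, n))
    for _ in range(steps):
        i, j = rng.integers(n, size=2)
        nb = (spins[(i + 1) % n, j] + spins[(i - 1) % n, j]
              + spins[i, (j + 1) % n] + spins[i, (j - 1) % n])
        dE = 2.0 * spins[i, j] * (nb + h[i, j])   # cost of flipping (i, j)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] = -spins[i, j]
    return spins, h

spins, h = rfim_metropolis(16, h_std=0.5, beta=2.0, steps=50_000)
```

At low temperature the quenched field pins domains individually, which is the mechanism by which site-dependent biases (here standing in for AFP effects) alter nucleation behaviour.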

  11. Evidence-Based Medicine in Aesthetic Surgery: The Significance of Level to Aesthetic Surgery.

    PubMed

    Rohrich, Rod J; Cho, Min-Jeong

    2017-05-01

    Since its popularization in the 1980s, evidence-based medicine has become the cornerstone of American health care. Many specialties rapidly adapted to the paradigm shift of health care by delivering treatment using the evidence-based guidelines. However, the field of plastic surgery has been slow to implement evidence-based medicine compared with the other specialties because of the challenges of performing randomized controlled trials, such as funding, variability in surgical skills, and difficulty with standardization of techniques. To date, aesthetic surgery has been at the forefront of evidence-based medicine in plastic surgery by having the most randomized controlled trials. Nevertheless, a detailed analysis of these studies has not been previously performed. In this article, the level I and II articles of aesthetic surgery are discussed to increase awareness of high-quality evidence-based medicine in aesthetic surgery.

  12. Applying a weighted random forests method to extract karst sinkholes from LiDAR data

    NASA Astrophysics Data System (ADS)

    Zhu, Junfeng; Pierskalla, William P.

    2016-02-01

Detailed mapping of sinkholes provides critical information for mitigating sinkhole hazards and understanding groundwater and surface water interactions in karst terrains. LiDAR (Light Detection and Ranging) measures the earth's surface at high resolution and high density and has shown great potential to drastically improve the locating and delineating of sinkholes. However, processing LiDAR data to extract sinkholes requires separating sinkholes from other depressions, which can be laborious because of the sheer number of depressions commonly generated from LiDAR data. In this study, we applied random forests, a machine learning method, to automatically separate sinkholes from other depressions in a karst region in central Kentucky. The sinkhole-extraction random forest was grown on a training dataset built from an area where LiDAR-derived depressions were manually classified through a visual inspection and field verification process. Based on the geometry of depressions, as well as natural and human factors related to sinkholes, 11 parameters were selected as predictive variables to form the dataset. Because the training dataset was imbalanced, with the majority of depressions being non-sinkholes, a weighted random forests method was used to improve the accuracy of predicting sinkholes. The weighted random forest achieved an average accuracy of 89.95% on the training dataset, demonstrating that the random forest can be an effective sinkhole classifier. Testing of the random forest in another area, however, resulted in moderate success, with an average accuracy of 73.96%. This study suggests that an automatic sinkhole extraction procedure like the random forest classifier can significantly reduce time and labor costs and makes it more tractable to map sinkholes from LiDAR data over large areas. However, the random forests method cannot totally replace manual procedures, such as visual inspection and field verification.
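The class-weighting trick for imbalanced depression data can be seen at the level of a single split: weight each sample by its class so the rare sinkhole class influences the impurity criterion as strongly as the majority class. The sketch below implements a weighted-Gini threshold search in plain NumPy; a full random forest would bag many such trees over random feature subsets, and the data and weights here are synthetic.

```python
import numpy as np

def weighted_gini(labels, weights):
    """Gini impurity with per-sample weights (rebalances classes)."""
    total = weights.sum()
    if total == 0:
        return 0.0
    p = np.array([weights[labels == c].sum() for c in (0, 1)]) / total
    return 1.0 - np.sum(p**2)

def best_split(x, labels, weights):
    """Scan thresholds on one feature, minimizing weighted child impurity."""
    order = np.argsort(x)
    x, labels, weights = x[order], labels[order], weights[order]
    best_t, best_score = None, np.inf
    for t in np.unique(x)[:-1]:
        left = x <= t
        score = (weights[left].sum() * weighted_gini(labels[left], weights[left])
                 + weights[~left].sum() * weighted_gini(labels[~left], weights[~left]))
        if score < best_score:
            best_t, best_score = t, score
    return best_t

# Imbalanced toy feature: 90 non-sinkholes near 0, 10 sinkholes near 1.
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0.0, 0.3, 90), rng.normal(1.0, 0.1, 10)])
y = np.concatenate([np.zeros(90, int), np.ones(10, int)])
w = np.where(y == 1, 9.0, 1.0)      # up-weight the rare sinkhole class
split = best_split(x, y, w)
```

Without the weights, a split that misclassifies all ten sinkholes still scores well; the weighting makes those errors as costly as ninety majority-class errors.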

  13. Persistence and Lifelong Fidelity of Phase Singularities in Optical Random Waves.

    PubMed

    De Angelis, L; Alpeggiani, F; Di Falco, A; Kuipers, L

    2017-11-17

    Phase singularities are locations where light is twisted like a corkscrew, with positive or negative topological charge depending on the twisting direction. Among the multitude of singularities arising in random wave fields, some can be found at the same location, but only when they exhibit opposite topological charge, which results in their mutual annihilation. New pairs can be created as well. With near-field experiments supported by theory and numerical simulations, we study the persistence and pairing statistics of phase singularities in random optical fields as a function of the excitation wavelength. We demonstrate how such entities can encrypt fundamental properties of the random fields in which they arise.

  14. Persistence and Lifelong Fidelity of Phase Singularities in Optical Random Waves

    NASA Astrophysics Data System (ADS)

    De Angelis, L.; Alpeggiani, F.; Di Falco, A.; Kuipers, L.

    2017-11-01

    Phase singularities are locations where light is twisted like a corkscrew, with positive or negative topological charge depending on the twisting direction. Among the multitude of singularities arising in random wave fields, some can be found at the same location, but only when they exhibit opposite topological charge, which results in their mutual annihilation. New pairs can be created as well. With near-field experiments supported by theory and numerical simulations, we study the persistence and pairing statistics of phase singularities in random optical fields as a function of the excitation wavelength. We demonstrate how such entities can encrypt fundamental properties of the random fields in which they arise.
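Phase singularities of this kind can be located numerically by summing the wrapped phase differences around each grid plaquette: a nonzero winding marks a singularity, and its sign gives the topological charge. The isotropic random-wave model below is a standard stand-in for a fully developed speckle field, not the near-field data of these experiments; grid size, wavenumber, and wave count are illustrative.

```python
import numpy as np

def random_wave_field(n, n_waves, k, seed=0):
    """Superpose plane waves with random directions and phases: the
    standard isotropic random-wave model of a monochromatic speckle field."""
    rng = np.random.default_rng(seed)
    y, x = np.mgrid[0:n, 0:n]
    field = np.zeros((n, n), dtype=complex)
    for theta, phi in zip(rng.uniform(0, 2 * np.pi, n_waves),
                          rng.uniform(0, 2 * np.pi, n_waves)):
        field += np.exp(1j * (k * (x * np.cos(theta) + y * np.sin(theta)) + phi))
    return field

def singularity_charges(field):
    """Topological charge per plaquette: winding of the wrapped phase
    around each elementary square; +1 / -1 at singularities, 0 elsewhere."""
    ph = np.angle(field)
    def wrap(a):
        return (a + np.pi) % (2 * np.pi) - np.pi
    d1 = wrap(ph[:-1, 1:] - ph[:-1, :-1])
    d2 = wrap(ph[1:, 1:] - ph[:-1, 1:])
    d3 = wrap(ph[1:, :-1] - ph[1:, 1:])
    d4 = wrap(ph[:-1, :-1] - ph[1:, :-1])
    return np.rint((d1 + d2 + d3 + d4) / (2 * np.pi)).astype(int)

charges = singularity_charges(random_wave_field(64, 20, 0.5))
```

Tracking how oppositely charged pairs approach and annihilate as a parameter (here, the wavelength) is varied is exactly the pairing statistics studied above.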

  15. Study on the algorithm of computational ghost imaging based on discrete fourier transform measurement matrix

    NASA Astrophysics Data System (ADS)

    Zhang, Leihong; Liang, Dong; Li, Bei; Kang, Yi; Pan, Zilan; Zhang, Dawei; Gao, Xiumin; Ma, Xiuhua

    2016-07-01

On the basis of analyzing the cosine light field with a determined analytic expression and the pseudo-inverse method, the object is illuminated by a preset light field with a determined discrete Fourier transform measurement matrix, and the object image is reconstructed by the pseudo-inverse method. The analytic expression of the algorithm of computational ghost imaging based on a discrete Fourier transform measurement matrix is deduced theoretically and compared with the algorithm of compressive computational ghost imaging based on a random measurement matrix. The reconstruction process and the reconstruction error are analyzed, and simulations are performed to verify the theoretical analysis. When the number of sampling measurements is similar to the number of object pixels, the rank of the discrete Fourier transform matrix is the same as that of the random measurement matrix, the PSNRs of the images reconstructed by the FGI and PGI algorithms are similar, and the reconstruction error of the traditional CGI algorithm is lower than that of the FGI and PGI algorithms. As the number of sampling measurements decreases, the PSNR of the image reconstructed by the FGI algorithm decreases slowly, whereas the PSNRs of the images reconstructed by the PGI and CGI algorithms decrease sharply. The reconstruction time of the FGI algorithm is lower than that of the other algorithms and is not affected by the number of sampling measurements. The FGI algorithm can effectively filter out random white noise through a low-pass filter, achieving a higher denoising capability than the CGI algorithm. The FGI algorithm can thus improve both the reconstruction accuracy and the reconstruction speed of computational ghost imaging.
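The core of the reconstruction step (illuminate with rows of a deterministic DFT measurement matrix, then invert with the pseudo-inverse) reduces, in the fully sampled case, to a few lines. A 1D object is used for brevity, and all sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 64                           # number of object pixels (1D for brevity)
obj = rng.random(n)              # unknown object reflectivity

# Preset illumination patterns: rows of the (unitary) DFT matrix.
idx = np.arange(n)
F = np.exp(-2j * np.pi * np.outer(idx, idx) / n) / np.sqrt(n)

m = n                            # full sampling: measurements = pixels
A = F[:m]                        # measurement matrix (first m DFT rows)
y = A @ obj                      # simulated bucket-detector measurements

obj_hat = np.linalg.pinv(A) @ y  # pseudo-inverse reconstruction
```

With m = n the DFT matrix is unitary, so the pseudo-inverse is its conjugate transpose and recovery is exact; undersampling (m < n) is where the error behaviour compared above between the deterministic and random measurement matrices appears.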

  16. Methodological and Ethical Challenges in a Web-Based Randomized Controlled Trial of a Domestic Violence Intervention

    PubMed Central

    Valpied, Jodie; Koziol-McLain, Jane; Glass, Nancy; Hegarty, Kelsey

    2017-01-01

    The use of Web-based methods to deliver and evaluate interventions is growing in popularity, particularly in a health care context. They have shown particular promise in responding to sensitive or stigmatized issues such as mental health and sexually transmitted infections. In the field of domestic violence (DV), however, the idea of delivering and evaluating interventions via the Web is still relatively new. Little is known about how to successfully navigate several challenges encountered by the researchers while working in this area. This paper uses the case study of I-DECIDE, a Web-based healthy relationship tool and safety decision aid for women experiencing DV, developed in Australia. The I-DECIDE website has recently been evaluated through a randomized controlled trial, and we outline some of the methodological and ethical challenges encountered during recruitment, retention, and evaluation. We suggest that with careful consideration of these issues, randomized controlled trials can be safely conducted via the Web in this sensitive area. PMID:28351830

  17. Random scalar fields and hyperuniformity

    NASA Astrophysics Data System (ADS)

    Ma, Zheng; Torquato, Salvatore

    2017-06-01

    Disordered many-particle hyperuniform systems are exotic amorphous states of matter that lie between crystals and liquids. Hyperuniform systems have attracted recent attention because they are endowed with novel transport and optical properties. Recently, the hyperuniformity concept has been generalized to characterize two-phase media, scalar fields, and random vector fields. In this paper, we devise methods to explicitly construct hyperuniform scalar fields. Specifically, we analyze spatial patterns generated from Gaussian random fields, which have been used to model the microwave background radiation and heterogeneous materials, the Cahn-Hilliard equation for spinodal decomposition, and Swift-Hohenberg equations that have been used to model emergent pattern formation, including Rayleigh-Bénard convection. We show that the Gaussian random scalar fields can be constructed to be hyperuniform. We also numerically study the time evolution of spinodal decomposition patterns and demonstrate that they are hyperuniform in the scaling regime. Moreover, we find that labyrinth-like patterns generated by the Swift-Hohenberg equation are effectively hyperuniform. We show that thresholding (level-cutting) a hyperuniform Gaussian random field to produce a two-phase random medium tends to destroy the hyperuniformity of the progenitor scalar field. We then propose guidelines to achieve effectively hyperuniform two-phase media derived from thresholded non-Gaussian fields. Our investigation paves the way for new research directions to characterize the large-structure spatial patterns that arise in physics, chemistry, biology, and ecology. Moreover, our theoretical results are expected to guide experimentalists to synthesize new classes of hyperuniform materials with novel physical properties via coarsening processes and using state-of-the-art techniques, such as stereolithography and 3D printing.
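One explicit construction in the spirit of this work: shape the spectrum of Gaussian white noise so that the structure factor vanishes as a power law at the origin, the defining signature of hyperuniformity. The particular spectral envelope below is an illustrative choice, not one of the paper's models:

```python
import numpy as np

def hyperuniform_gaussian_field(n, alpha=2.0, seed=0):
    """Gaussian random field whose structure factor vanishes as k**alpha
    at the origin (an illustrative Gaussian cutoff tames large k)."""
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx**2 + ky**2)
    spectrum = k**alpha * np.exp(-(k / 0.2)**2)   # suppressed as k -> 0
    noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return np.fft.ifft2(np.sqrt(spectrum) * noise).real

def structure_factor(field):
    """Empirical structure factor S(k) and the radial wavenumber grid."""
    n = field.shape[0]
    S = np.abs(np.fft.fft2(field))**2 / field.size
    k = np.sqrt(np.fft.fftfreq(n)[:, None]**2
                + np.fft.fftfreq(n)[None, :]**2)
    return k.ravel(), S.ravel()

f = hyperuniform_gaussian_field(128)
k, S = structure_factor(f)
```

Thresholding such a field destroys exact hyperuniformity, as the abstract notes, which is why the two-phase constructions require the extra care discussed there.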

  18. Theory and implementation of a very high throughput true random number generator in field programmable gate array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong

The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.

  19. Theory and implementation of a very high throughput true random number generator in field programmable gate array.

    PubMed

    Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao

    2016-04-01

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.
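The entropy-harvesting idea (strobe a jittery free-running oscillator with an incommensurate sampling clock, then debias the raw bits) can be caricatured in software. This is a toy statistical model only: real entropy comes from physical jitter in the FPGA carry chains, and the von Neumann extractor here merely stands in for whatever conditioning a real design uses. All parameters are invented.

```python
import random

def ring_oscillator_trng(n_raw, period=1.0, jitter=0.12, seed=1):
    """Toy model: a sampling clock, incommensurate with the oscillator,
    strobes a free-running ring oscillator whose half-period carries
    Gaussian jitter; the sampled logic level is one raw bit."""
    rng = random.Random(seed)
    t_osc, level, raw = 0.0, 0, []
    sample_period = 7.37 * period          # deliberately incommensurate
    t_sample = sample_period
    while len(raw) < n_raw:
        t_osc += period / 2.0 + rng.gauss(0.0, jitter)  # jittered half-period
        if t_osc >= t_sample:
            raw.append(level)
            t_sample += sample_period
        level ^= 1
    # von Neumann extractor: 01 -> 0, 10 -> 1, discard 00 and 11.
    out = [a for a, b in zip(raw[::2], raw[1::2]) if a != b]
    return raw, out

raw, bits = ring_oscillator_trng(4000)
```

Because jitter accumulates over the many oscillator half-periods between samples, the sampled phase decorrelates quickly, which is the property the multi-phase sampling in the paper is engineered to maximize.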

  20. Document page structure learning for fixed-layout e-books using conditional random fields

    NASA Astrophysics Data System (ADS)

    Tao, Xin; Tang, Zhi; Xu, Canhui

    2013-12-01

In this paper, a model is proposed to learn the logical structure of fixed-layout document pages by combining a support vector machine (SVM) and conditional random fields (CRF). Features related to each logical label and their dependencies are extracted from various original Portable Document Format (PDF) attributes. Both local evidence and contextual dependencies are integrated in the proposed model so as to achieve better logical labeling performance. With the merits of the SVM as a local discriminative classifier and the CRF modeling contextual correlations of adjacent fragments, the model is capable of resolving ambiguities in semantic labels. The experimental results show that CRF-based models with both tree and chain graph structures outperform the SVM model, with an increase in macro-averaged F1 of about 10%.
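At decoding time, the CRF side of such a model is a max-product (Viterbi) pass over the chain, combining per-fragment classifier scores with transition scores between adjacent labels. The sketch below uses made-up scores for a two-label (title/paragraph) chain; in the paper the emission scores would come from the SVM.

```python
import numpy as np

def viterbi(emission, transition):
    """Max-product decoding on a chain: emission[t, s] is the local
    classifier's score for label s at fragment t; transition[s, s'] scores
    adjacent-label pairs (the contextual dependency the CRF adds)."""
    T, S = emission.shape
    score = emission[0].copy()
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + transition      # all previous-label options
        back[t] = np.argmax(cand, axis=0)
        score = cand[back[t], np.arange(S)] + emission[t]
    path = [int(np.argmax(score))]
    for t in range(T - 1, 0, -1):               # backtrack the best path
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Labels: 0 = title, 1 = paragraph. Local scores mildly favour 'title'
# at fragment 2, but transitions penalize a title between paragraphs.
emission = np.array([[2.0, 0.0], [0.0, 2.0], [1.1, 1.0], [0.0, 2.0]])
transition = np.array([[0.5, 0.0], [-1.0, 0.5]])
labels = viterbi(emission, transition)
```

Here the local scores alone would label fragment 2 as a title, but the transition penalty flips it to paragraph, which is precisely the ambiguity resolution the combined SVM-CRF model provides.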

  1. Design elements in implementation research: a structured review of child welfare and child mental health studies.

    PubMed

    Landsverk, John; Brown, C Hendricks; Rolls Reutz, Jennifer; Palinkas, Lawrence; Horwitz, Sarah McCue

    2011-01-01

    Implementation science is an emerging field of research with considerable penetration in physical medicine and less in the fields of mental health and social services. There remains a lack of consensus on methodological approaches to the study of implementation processes and tests of implementation strategies. This paper addresses the need for methods development through a structured review that describes design elements in nine studies testing implementation strategies for evidence-based interventions addressing mental health problems of children in child welfare and child mental health settings. Randomized trial designs were dominant with considerable use of mixed method designs in the nine studies published since 2005. The findings are discussed in reference to the limitations of randomized designs in implementation science and the potential for use of alternative designs.

  2. Random Assignment: Practical Considerations from Field Experiments.

    ERIC Educational Resources Information Center

    Dunford, Franklyn W.

    1990-01-01

    Seven qualitative issues associated with randomization that have the potential to weaken or destroy otherwise sound experimental designs are reviewed and illustrated via actual field experiments. Issue areas include ethics and legality, liability risks, manipulation of randomized outcomes, hidden bias, design intrusiveness, case flow, and…

  3. Patterns of Relapse From a Phase 3 Study of Response-Based Therapy for Intermediate-Risk Hodgkin Lymphoma (AHOD0031): A Report From the Children's Oncology Group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dharmarajan, Kavita V.; Friedman, Debra L.; Schwartz, Cindy L.

    2015-05-01

    Purpose: The study was designed to determine whether response-based therapy improves outcomes in intermediate-risk Hodgkin lymphoma. We examined patterns of first relapse in the study. Patients and Methods: From September 2002 to July 2010, 1712 patients <22 years old with stage I-IIA with bulk, I-IIAE, I-IIB, or IIIA-IVA Hodgkin lymphoma were enrolled. Patients were categorized as rapid early responders (RER) or slow early responders (SER) after 2 cycles of doxorubicin, bleomycin, vincristine, etoposide, prednisone, and cyclophosphamide (ABVE-PC). The SER patients were randomized to 2 additional ABVE-PC cycles or augmented chemotherapy, with 21 Gy involved field radiation therapy (IFRT). RER patients were stipulated to undergo 2 additional ABVE-PC cycles and were then randomized to 21 Gy IFRT or no further treatment if complete response (CR) was achieved. RER patients without CR were non-randomly assigned to 21 Gy IFRT. Relapses were characterized with respect to site (initial, new, or both; and initial bulk or initial nonbulk) and IFRT field (in-field, out-of-field, or both). Patients were grouped by treatment assignment (SER; RER/no CR; RER/CR/IFRT; and RER/CR/no IFRT). Summary statistics were reported. Results: At 4-year median follow-up, 244 patients had experienced relapse, 198 of whom were fully evaluable for review. Those who progressed during treatment (n=30) or lacked relapse imaging (n=16) were excluded. The median time to relapse was 12.8 months. Of the 198 evaluable patients, 30% were RER/no CR, 26% were SER, 26% were RER/CR/no IFRT, 16% were RER/CR/IFRT, and 2% remained uncategorized. Relapse involved initially bulky sites in 74% of cases and initially nonbulky sites in 75%. First relapses rarely occurred at exclusively new or out-of-field sites. By contrast, relapses usually occurred at nodal sites of initial bulky and nonbulky disease. Conclusion: Although response-based therapy has helped define treatment for selected RER patients, it has not improved outcome for SER patients or facilitated refinement of IFRT volumes or doses.

  4. IMAGINE: Interstellar MAGnetic field INference Engine

    NASA Astrophysics Data System (ADS)

    Steininger, Theo

    2018-03-01

    IMAGINE (Interstellar MAGnetic field INference Engine) performs inference on generic parametric models of the Galaxy. The modular open source framework uses highly optimized tools and technology such as the MultiNest sampler (ascl:1109.006) and the information field theory framework NIFTy (ascl:1302.013) to create an instance of the Milky Way based on a set of parameters for physical observables, using Bayesian statistics to judge the mismatch between measured data and model prediction. The flexibility of the IMAGINE framework allows for simple refitting for newly available data sets and makes state-of-the-art Bayesian methods easily accessible particularly for random components of the Galactic magnetic field.

  5. Probabilistic models for reactive behaviour in heterogeneous condensed phase media

    NASA Astrophysics Data System (ADS)

    Baer, M. R.; Gartling, D. K.; DesJardin, P. E.

    2012-02-01

    This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.

  6. Functional response and capture timing in an individual-based model: predation by northern squawfish (Ptychocheilus oregonensis) on juvenile salmonids in the Columbia River

    USGS Publications Warehouse

    Petersen, James H.; DeAngelis, Donald L.

    1992-01-01

    The behavior of individual northern squawfish (Ptychocheilus oregonensis) preying on juvenile salmonids was modeled to address questions about capture rate and the timing of prey captures (random versus contagious). Prey density, predator weight, prey weight, temperature, and diel feeding pattern were first incorporated into predation equations analogous to Holling Type 2 and Type 3 functional response models. Type 2 and Type 3 equations fit field data from the Columbia River equally well, and both models predicted predation rates on five of seven independent dates. Selecting a functional response type may be complicated by variable predation rates, analytical methods, and assumptions of the model equations. Using the Type 2 functional response, random versus contagious timing of prey capture was tested using two related models. In the simpler model, salmon captures were assumed to be controlled by a Poisson renewal process; in the second model, several salmon captures were assumed to occur during brief "feeding bouts", modeled with a compound Poisson process. Salmon captures by individual northern squawfish were clustered through time, rather than random, based on comparison of model simulations and field data. The contagious-feeding result suggests that salmonids may be encountered as patches or schools in the river.
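    The two capture-timing hypotheses can be contrasted in a small simulation (an illustrative sketch; rates and bout sizes are assumed, not the paper's fitted values). A simple Poisson renewal process gives binned counts with an index of dispersion (variance/mean) near 1, while a compound Poisson "feeding bout" process gives clustered counts with dispersion well above 1:

```python
import random

def poisson_times(rate, horizon, rng):
    # event times of a homogeneous Poisson process on [0, horizon]
    t, out = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return out
        out.append(t)

def counts(times, horizon, width):
    # bin event times into intervals of the given width
    bins = [0] * int(horizon / width)
    for t in times:
        bins[min(int(t / width), len(bins) - 1)] += 1
    return bins

def dispersion(bins):
    # index of dispersion: variance / mean of the binned counts
    m = sum(bins) / len(bins)
    v = sum((b - m) ** 2 for b in bins) / len(bins)
    return v / m

rng = random.Random(7)
horizon, width = 10000.0, 10.0

# model 1: simple Poisson renewal captures
simple = poisson_times(0.2, horizon, rng)

# model 2: Poisson bout times, with 1-6 captures clustered within each bout
bouts = poisson_times(0.05, horizon, rng)
clustered = [t + 0.1 * rng.random() for t in bouts
             for _ in range(1 + rng.randrange(6))]

d_simple = dispersion(counts(simple, horizon, width))
d_clustered = dispersion(counts(clustered, horizon, width))
print(d_simple, d_clustered)
```

    Comparing the observed index of dispersion against these two simulated baselines is one simple way to discriminate random from contagious capture timing.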

  7. New constraints on modelling the random magnetic field of the MW

    NASA Astrophysics Data System (ADS)

    Beck, Marcus C.; Beck, Alexander M.; Beck, Rainer; Dolag, Klaus; Strong, Andrew W.; Nielaba, Peter

    2016-05-01

    We extend the description of the isotropic and anisotropic random components of the small-scale magnetic field within the existing magnetic field model of the Milky Way from Jansson & Farrar, by including random realizations of the small-scale component. Using a magnetic-field power spectrum with Gaussian random fields, the NE2001 model for the thermal electrons, and the Galactic cosmic-ray electron distribution from the current GALPROP model, we derive full-sky maps for the total and polarized synchrotron intensity as well as the Faraday rotation-measure distribution. While previous work either assumed that small-scale fluctuations average out along the line of sight or computed only ensemble averages of random fields, we show that these fluctuations need to be carefully taken into account. Comparing with observational data, we obtain not only good agreement with 408 MHz total and WMAP7 22 GHz polarized intensity emission maps, but also an improved agreement with Galactic foreground rotation-measure maps and power spectra, whose amplitude and shape strongly depend on the parameters of the random field. We demonstrate that a correlation length of ≈22 pc (5 pc being a 5σ lower limit) is needed to match the slope of the observed power spectrum of Galactic foreground rotation-measure maps. Using multiple realizations also allows us to infer errors on individual observables. We find that previously used amplitudes for the random and anisotropic random magnetic field components need to be rescaled by factors of ≈0.3 and 0.6 to account for the new small-scale contributions. Our model predicts a rotation measure of -2.8±7.1 rad/m2 and 4.4±11 rad/m2 for the north and south Galactic poles, respectively, in good agreement with observations. Applying our model to deflections of ultra-high-energy cosmic rays, we infer a mean deflection of ≈3.5±1.1 degrees for 60 EeV protons arriving from Cen A.
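    A Gaussian random field with a prescribed power spectrum, of the kind used here for the small-scale component, can be generated with the standard spectral (FFT) method. This is a generic sketch, not the authors' code; the grid size and the Kolmogorov-like spectral slope are arbitrary illustrative choices:

```python
import numpy as np

def gaussian_random_field(n=128, slope=-11/3, seed=0):
    """Periodic 2D Gaussian random field with power-law spectrum P(k) ~ k^slope,
    built by coloring white noise in Fourier space."""
    rng = np.random.default_rng(seed)
    k = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    kk = np.sqrt(kx**2 + ky**2)
    kk[0, 0] = 1.0                      # avoid division by zero at k = 0
    amp = kk ** (slope / 2)             # amplitude ~ sqrt(P(k))
    amp[0, 0] = 0.0                     # zero out the DC mode -> zero-mean field
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    field = np.fft.ifft2(amp * noise).real
    return field / field.std()          # normalize to unit variance

f = gaussian_random_field()
print(f.shape, f.mean(), f.std())
```

    Drawing many such realizations (different seeds) is what enables the per-realization scatter, and hence the error bars on observables, described in the abstract.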

  8. Modelling past land use using archaeological and pollen data

    NASA Astrophysics Data System (ADS)

    Pirzamanbein, Behnaz; Lindström, Johan; Poska, Anneli; Gaillard-Lemdahl, Marie-José

    2016-04-01

    Accurate maps of past land use are necessary for studying the impact of anthropogenic land-cover changes on climate and biodiversity. We develop a Bayesian hierarchical model to reconstruct land use using Gaussian Markov random fields. The model uses two observation sets: 1) archaeological data, representing human settlements, urbanization and agricultural findings; and 2) pollen-based estimates of the three land-cover types Coniferous forest, Broadleaved forest and Unforested/Open land. The pollen-based estimates are obtained from the REVEALS model, based on pollen counts from lakes and bogs. Our model uses the sparse pollen-based estimates to reconstruct the spatially continuous cover of the three land-cover types. Using the open-land component and the archaeological data, the extent of land use is reconstructed. The model is applied to three time periods - centred around 1900 CE, 1000 and 4000 BCE - over Sweden, for which both pollen-based estimates and archaeological data are available. To estimate the model parameters and land use, a block-updated Markov chain Monte Carlo (MCMC) algorithm is applied. Using the MCMC posterior samples, uncertainties in the land-use predictions are computed. Due to the lack of reliable historical land-use data, model results are evaluated by cross-validation. Keywords: Spatial reconstruction, Gaussian Markov random field, Fossil pollen records, Archaeological data, Human land-use, Prediction uncertainty

  9. 3D displacement field measurement with correlation based on the micro-geometrical surface texture

    NASA Astrophysics Data System (ADS)

    Bubaker-Isheil, Halima; Serri, Jérôme; Fontaine, Jean-François

    2011-07-01

    Image correlation methods are widely used in experimental mechanics to obtain displacement field measurements. Currently, these methods are applied using digital images of the initial and deformed surfaces sprayed with black or white paint. Speckle patterns are then captured and the correlation is performed with a high degree of accuracy, to the order of 0.01 pixels. In 3D, however, stereo-correlation leads to a lower degree of accuracy. Correlation techniques are based on the search for a sub-image (or pattern) displacement field. The work presented in this paper introduces a new correlation-based approach for 3D displacement field measurement that uses an additional 3D laser scanner and a CMM (Coordinate Measuring Machine). Unlike most existing methods, which require markers on the observed object (such as black speckle, grids or random patterns), this approach relies solely on micro-geometrical surface textures such as waviness, roughness and aperiodic random defects. The latter are assumed to remain sufficiently small, thus providing an adequate estimate of the particle displacement. The proposed approach can be used in a wide range of applications such as sheet metal forming with large strains. The method proceeds by first obtaining point clouds using the 3D laser scanner mounted on the CMM. These points are used to create 2D maps that are then correlated. In this respect, various criteria have been investigated for creating maps consisting of patterns that facilitate the correlation procedure. Once the maps are created, the correlation between the two configurations (initial and moved) is carried out using traditional methods developed for field measurements. Measurement validation was conducted through 2D and 3D experiments, with good results for rigid 2D and 3D displacements and 2D rotations.
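    The core correlation step, locating a surface patch in the moved configuration by maximizing normalized cross-correlation, can be sketched as follows. This is a minimal integer-pixel version on synthetic "roughness" data; the actual method builds its maps from laser-scanned point clouds and refines to sub-pixel accuracy:

```python
import numpy as np

def ncc(a, b):
    # normalized cross-correlation of two equally sized patches
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum()))

def find_shift(ref, moved, window, search=6):
    """Locate the integer-pixel displacement of `window` (given in ref
    coordinates as (y, x, h, w)) by exhaustive NCC search in `moved`."""
    y, x, h, w = window
    patch = ref[y:y+h, x:x+w]
    best, best_score = (0, 0), -2.0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = moved[y+dy:y+dy+h, x+dx:x+dx+w]
            s = ncc(patch, cand)
            if s > best_score:
                best, best_score = (dy, dx), s
    return best, best_score

rng = np.random.default_rng(3)
ref = rng.normal(size=(64, 64))              # stand-in for a roughness height map
moved = np.roll(ref, (2, -3), axis=(0, 1))   # rigid displacement of the surface
shift, score = find_shift(ref, moved, (20, 20, 16, 16))
print(shift, score)
```

    The same search succeeds here without any sprayed speckle because the random "roughness" itself provides the unique pattern, which is the central claim of the micro-geometrical texture approach.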

  10. Phase-only asymmetric optical cryptosystem based on random modulus decomposition

    NASA Astrophysics Data System (ADS)

    Xu, Hongfeng; Xu, Wenhui; Wang, Shuaihua; Wu, Shaofan

    2018-06-01

    We propose a phase-only asymmetric optical cryptosystem based on random modulus decomposition (RMD). The cryptosystem is presented for effectively improving the capacity to resist various attacks, including the attack of iterative algorithms. On the one hand, RMD and phase encoding are combined to remove the constraints that can be used in the attacking process. On the other hand, the security keys (geometrical parameters) introduced by Fresnel transform can increase the key variety and enlarge the key space simultaneously. Numerical simulation results demonstrate the strong feasibility, security and robustness of the proposed cryptosystem. This cryptosystem will open up many new opportunities in the application fields of optical encryption and authentication.

  11. Quantitative model of price diffusion and market friction based on trading as a mechanistic random process.

    PubMed

    Daniels, Marcus G; Farmer, J Doyne; Gillemot, László; Iori, Giulia; Smith, Eric

    2003-03-14

    We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.

  12. Quantitative Model of Price Diffusion and Market Friction Based on Trading as a Mechanistic Random Process

    NASA Astrophysics Data System (ADS)

    Daniels, Marcus G.; Farmer, J. Doyne; Gillemot, László; Iori, Giulia; Smith, Eric

    2003-03-01

    We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.
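    A minimal zero-intelligence sketch of such a market, with random limit-order arrivals, market orders, and cancellations, shows how purely random order flow still produces a diffusing midprice. This toy model only loosely follows the paper's setup, and all rates and price offsets are assumed:

```python
import random

def simulate_mid(n_events=40000, seed=2):
    """Zero-intelligence order book: random limit orders near the midpoint,
    market orders at the best quote, and random cancellations.
    Returns the midprice series."""
    rng = random.Random(seed)
    bids, asks = {100: 5}, {101: 5}          # price level -> resting volume

    def take(side, pick):
        p = pick(side)
        side[p] -= 1
        if side[p] == 0:
            if len(side) > 1:
                del side[p]
            else:
                side[p] = 1                  # keep the book from emptying

    mids = []
    for _ in range(n_events):
        mid = (max(bids) + min(asks)) / 2
        u = rng.random()
        if u < 0.5:                          # limit-order arrival near the mid
            if rng.random() < 0.5:
                p = int(mid) - rng.randrange(1, 6)
                bids[p] = bids.get(p, 0) + 1
            else:
                p = int(mid + 1) + rng.randrange(0, 5)
                asks[p] = asks.get(p, 0) + 1
        elif u < 0.85:                       # market order hits the best quote
            take(asks, min) if rng.random() < 0.5 else take(bids, max)
        else:                                # random cancellation
            side = bids if rng.random() < 0.5 else asks
            take(side, lambda s: rng.choice(list(s)))
        mids.append((max(bids) + min(asks)) / 2)
    return mids

mids = simulate_mid()

def msd(lag):
    # mean squared midprice displacement at a given event lag
    d = [(mids[i + lag] - mids[i]) ** 2 for i in range(0, len(mids) - lag, lag)]
    return sum(d) / len(d)

print(msd(1), msd(100))
```

    Plotting msd(lag) against lag in such a model is the simplest way to see the temporal structure (non-trivial lag dependence) that the paper attributes to the stored supply and demand in the book.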

  13. Magnetic microstructures for regulating Brownian motion

    NASA Astrophysics Data System (ADS)

    Sooryakumar, Ratnasingham

    2013-03-01

    Nature has proven that it is possible to engineer complex nanoscale machines in the presence of thermal fluctuations. These biological complexes, which harness random thermal energy to provide functionality, yield a framework to develop related artificial, i.e., nonbiological, phenomena and devices. A major challenge to achieving positional control of fluid-borne submicron sized objects is regulating their Brownian fluctuations. In this talk a magnetic-field-based trap that regulates the thermal fluctuations of superparamagnetic beads in suspension will be presented. Local domain-wall fields originating from patterned magnetic wires, whose strength and profile are tuned by weak external fields, enable bead trajectories within the trap to be managed and easily varied between strong confinements and delocalized spatial excursions. Moreover, the frequency spectrum of the trapped bead responds to fields as a power-law function with a tunable, non-integer exponent. When extended to a cluster of particles, the trapping landscape preferentially stabilizes them into formations of 5-fold symmetry, while their Brownian fluctuations result in frequent transitions between different cluster configurations. The quantitative understanding of the Brownian dynamics together with the ability to tune the extent of the fluctuations enables the wire-based platform to serve as a model system to investigate the competition between random and deterministic forces. Funding from the U.S. Army Research Office under contract W911NF-10-1-0353 is acknowledged.
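    The regulation of Brownian fluctuations by a tunable trap can be illustrated with an overdamped Langevin simulation of a bead in a harmonic potential. This is a generic sketch, not the wire-based platform itself; the stiffness and diffusion constants are arbitrary, and equipartition predicts a positional variance of D/k:

```python
import random

def trap_variance(k, n_steps=20000, dt=1e-3, diff=1.0, seed=0):
    """Overdamped bead in a harmonic trap: dx = -k*x*dt + sqrt(2*D*dt)*dW.
    Returns the sample variance of the position (theory: D/k)."""
    rng = random.Random(seed)
    noise = (2 * diff * dt) ** 0.5
    x, s1, s2 = 0.0, 0.0, 0.0
    for _ in range(n_steps):
        x += -k * x * dt + noise * rng.gauss(0, 1)
        s1 += x
        s2 += x * x
    m = s1 / n_steps
    return s2 / n_steps - m * m

weak = trap_variance(k=2.0)      # soft trap: large thermal excursions
strong = trap_variance(k=20.0)   # stiff trap: fluctuations suppressed
print(weak, strong)
```

    Sweeping the stiffness k here mimics tuning the domain-wall field with an external bias: stronger confinement shrinks the bead's spatial excursions without eliminating the underlying thermal noise.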

  14. Strong flux pinning at 4.2 K in SmBa2Cu3Oy coated conductors with BaHfO3 nanorods controlled by low growth temperature

    NASA Astrophysics Data System (ADS)

    Miura, S.; Tsuchiya, Y.; Yoshida, Y.; Ichino, Y.; Awaji, S.; Matsumoto, K.; Ibi, A.; Izumi, T.

    2017-08-01

    In order to apply REBa2Cu3Oy (REBCO, RE = rare earth elements or Y) coated conductors in high-magnetic-field, coil-based applications, the isotropic improvement of their critical current performance with respect to the directions of the magnetic field under these operating conditions is required. Most applications operate at temperatures lower than 50 K and magnetic fields over 2 T. In this study, the improvement of critical current density (Jc) performance for various applied magnetic field directions was achieved by controlling the nanostructure of the BaHfO3 (BHO)-doped SmBa2Cu3Oy (SmBCO) films on metallic substrates. The corresponding minimum Jc value of the films at 40 K under an applied 3 T field was 5.2 MA cm-2, which is over ten times higher than that of a fully optimized Nb-Ti wire at 4.2 K. At 4.2 K, under a 17.5 T field, a flux pinning force density of 1.4 TN m-3 for B//c was realized; this value is among the highest values reported for REBCO films to date. More importantly, the Fp for B//c corresponds to the minimum value for various applied magnetic field directions. We investigated the dominant flux pinning centers of the films at 4.2 K using the anisotropic scaling approach based on the effective mass model. The dominant flux pinning centers are random pinning centers at 4.2 K, i.e., a high pinning performance was achieved by the high number density of random pins in the matrix of the BHO-doped SmBCO films.

  15. Development of a Model Competency-Based Orientation Program

    DTIC Science & Technology

    1988-05-01

    Freud, S. (1938). Basic writings of Sigmund Freud. New York: Random House. Hagerty, B.K. (1986). A competency-based orientation program for psychiatric... education, and nursing will be presented. Beginning with the field of psychology, Freud (1938) described motivation using the concept of psychic... Gosnell, D.J. (1987). Comparing two methods of hospital orientation for cost effectiveness. Journal of Nursing Staff Development, 3, 3-8.

  16. Do mobile phone base stations affect sleep of residents? Results from an experimental double-blind sham-controlled field study.

    PubMed

    Danker-Hopfe, Heidi; Dorn, Hans; Bornkessel, Christian; Sauter, Cornelia

    2010-01-01

    The aim of the present double-blind, sham-controlled, balanced randomized cross-over study was to disentangle effects of electromagnetic fields (EMF) and non-EMF effects of mobile phone base stations on objective and subjective sleep quality. In total 397 residents aged 18-81 years (50.9% female) from 10 German sites, where no mobile phone service was available, were exposed to sham and GSM (Global System for Mobile Communications, 900 MHz and 1,800 MHz) base station signals by an experimental base station while their sleep was monitored at their homes during 12 nights. Participants were randomly exposed to real (GSM) or sham exposure for five nights each. Individual measurement of EMF exposure, questionnaires on sleep disorders, overall sleep quality, attitude towards mobile communication, and on subjective sleep quality (morning and evening protocols) as well as objective sleep data (frontal EEG and EOG recordings) were gathered. Analysis of the subjective and objective sleep data did not reveal any significant differences between the real and sham condition. During sham exposure nights, objective and subjective sleep efficiency, wake after sleep onset, and subjective sleep latency were significantly worse in participants with concerns about possible health risks resulting from base stations than in participants who were not concerned. The study did not provide any evidence for short-term physiological effects of EMF emitted by mobile phone base stations on objective and subjective sleep quality. However, the results indicate that mobile phone base stations as such (not the electromagnetic fields) may have a significant negative impact on sleep quality. (c) 2010 Wiley-Liss, Inc.

  17. Dispersion Analysis Using Particle Tracking Simulations Through Heterogeneity Based on Outcrop Lidar Imagery

    NASA Astrophysics Data System (ADS)

    Klise, K. A.; Weissmann, G. S.; McKenna, S. A.; Tidwell, V. C.; Frechette, J. D.; Wawrzyniec, T. F.

    2007-12-01

    Solute plumes are believed to disperse in a non-Fickian manner due to small-scale heterogeneity and variable velocities that create preferential pathways. In order to accurately predict dispersion in naturally complex geologic media, the connection between heterogeneity and dispersion must be better understood. Since aquifer properties cannot be measured at every location, it is common to simulate small-scale heterogeneity with random field generators based on a two-point covariance (e.g., through use of sequential simulation algorithms). While these random fields can produce preferential flow pathways, it is unknown how well the results simulate solute dispersion through natural heterogeneous media. To evaluate the influence that complex heterogeneity has on dispersion, we utilize high-resolution terrestrial lidar to identify and model lithofacies from outcrop for application in particle tracking solute transport simulations using RWHet. The lidar scan data are used to produce a lab (meter) scale two-dimensional model that captures 2-8 mm scale natural heterogeneity. Numerical simulations utilize various methods to populate the outcrop structure captured by the lidar-based image with reasonable hydraulic conductivity values. The particle tracking simulations result in residence time distributions used to evaluate the nature of dispersion through complex media. Particle tracking simulations through conductivity fields produced from the lidar images are then compared to particle tracking simulations through hydraulic conductivity fields produced from sequential simulation algorithms. Based on this comparison, the study aims to quantify the difference in dispersion when using realistic and simplified representations of aquifer heterogeneity. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
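    The particle-tracking idea can be sketched with a toy layered-conductivity model (hypothetical values, not the lidar-derived fields of the study): particles advect at the speed set by the local layer's conductivity while a transverse random walk moves them between layers, and the spread of breakthrough times at a fixed distance measures the resulting dispersion:

```python
import random

def breakthrough_times(k_layers, n_particles=2000, length=50.0,
                       d_trans=0.3, dt=0.1, seed=11):
    """Track particles through horizontal layers of differing conductivity.
    Horizontal velocity is proportional to the current layer's K; a transverse
    random walk lets each particle sample several layers en route."""
    rng = random.Random(seed)
    n_lay = len(k_layers)
    times = []
    for _ in range(n_particles):
        x = 0.0
        y = rng.random() * n_lay         # continuous transverse coordinate
        t = 0.0
        while x < length:
            x += k_layers[int(y)] * dt   # advection at the local layer speed
            y += rng.gauss(0, (2 * d_trans * dt) ** 0.5)
            y = min(max(y, 0.0), n_lay - 1e-9)   # reflecting-ish boundaries
            t += dt
        times.append(t)
    return times

# hypothetical layered aquifer: one fast "preferential pathway" layer
times = breakthrough_times([0.5, 0.5, 5.0, 0.5])
mean = sum(times) / len(times)
spread = (sum((t - mean) ** 2 for t in times) / len(times)) ** 0.5
print(mean, spread)
```

    Early arrivals in such a model come from particles that spend most of their trip in the fast layer; comparing breakthrough curves for different layer structures is a stripped-down version of the lidar-versus-geostatistical comparison described above.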

  18. Spatial Distribution of Phase Singularities in Optical Random Vector Waves.

    PubMed

    De Angelis, L; Alpeggiani, F; Di Falco, A; Kuipers, L

    2016-08-26

    Phase singularities are dislocations widely studied in optical fields as well as in other areas of physics. With experiment and theory we show that the vectorial nature of light affects the spatial distribution of phase singularities in random light fields. While in scalar random waves phase singularities exhibit spatial distributions reminiscent of particles in isotropic liquids, in vector fields their distribution for the different vector components becomes anisotropic due to the direct relation between propagation and field direction. By incorporating this relation in the theory for scalar fields by Berry and Dennis [Proc. R. Soc. A 456, 2059 (2000)], we quantitatively describe our experiments.
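    Phase singularities in a scalar random wave field can be located numerically by superposing random plane waves and testing each grid plaquette for a ±2π phase winding. This is an illustrative sketch for the scalar case only; the paper's result concerns how the statistics change for the separate vector components:

```python
import numpy as np

def random_wave_field(n_waves=50, grid=64, k=10.0, seed=5):
    """Isotropic superposition of random plane waves (a scalar speckle model)."""
    rng = np.random.default_rng(seed)
    th = rng.uniform(0, 2 * np.pi, n_waves)   # random propagation directions
    ph = rng.uniform(0, 2 * np.pi, n_waves)   # random phases
    x = np.linspace(0, 1, grid)
    X, Y = np.meshgrid(x, x, indexing="ij")
    E = np.zeros((grid, grid), dtype=complex)
    for t, p in zip(th, ph):
        E += np.exp(1j * (k * (np.cos(t) * X + np.sin(t) * Y) + p))
    return E

def count_singularities(E):
    """Count grid plaquettes around which the phase winds by +-2*pi."""
    ph = np.angle(E)
    def d(a, b):  # phase difference wrapped to (-pi, pi]
        return np.angle(np.exp(1j * (b - a)))
    w = (d(ph[:-1, :-1], ph[:-1, 1:]) + d(ph[:-1, 1:], ph[1:, 1:])
         + d(ph[1:, 1:], ph[1:, :-1]) + d(ph[1:, :-1], ph[:-1, :-1]))
    return int(np.sum(np.abs(w) > np.pi))

E = random_wave_field()
n_sing = count_singularities(E)
print(n_sing)
```

    Collecting the singularity positions over many realizations and computing their pair correlation function is how the liquid-like spatial statistics mentioned above are measured.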

  19. Magnetic Field Line Random Walk in Arbitrarily Stretched Isotropic Turbulence

    NASA Astrophysics Data System (ADS)

    Wongpan, P.; Ruffolo, D.; Matthaeus, W. H.; Rowlands, G.

    2006-12-01

    Many types of space and laboratory plasmas involve turbulent fluctuations with an approximately uniform mean magnetic field B_0, and the field line random walk plays an important role in guiding particle motions. Much of the relevant literature concerns isotropic turbulence, and has mostly been perturbative, i.e., for small fluctuations, or based on numerical simulations for specific conditions. On the other hand, solar wind turbulence is apparently anisotropic, and has been modeled as a sum of idealized two-dimensional and one-dimensional (slab) components, but with the deficiency of containing no oblique wave vectors. In the present work, we address the above issues with non-perturbative analytic calculations of diffusive field line random walks for unpolarized, arbitrarily stretched isotropic turbulence, including the limits of nearly one-dimensional (highly stretched) and nearly two-dimensional (highly squashed) turbulence. We develop implicit analytic formulae for the diffusion coefficients D_x and D_z: two coupled integral equations, in which D_x and D_z appear inside three-dimensional integrals over all k-space, are solved numerically with the aid of Mathematica routines for specific cases. We can vary the parameters B_0 and β, the stretching along z at constant turbulent energy. Furthermore, we obtain analytic closed-form solutions in all extreme cases. We obtain 0.54 < D_z/D_x < 2, indicating an approximately isotropic random walk even for very anisotropic (unpolarized) turbulence, a surprising result. For a given β, the diffusion coefficient vs. B_0 can be described by a Padé approximant. We find quasilinear behavior at high B_0 and percolative behavior at low B_0. Partially supported by a Sritrangthong Scholarship from the Faculty of Science, Mahidol University; the Thailand Research Fund; NASA Grant NNG05GG83G; and Thailand's Commission for Higher Education.

  20. Exchange bias mechanism in FM/FM/AF spin valve systems in the presence of random unidirectional anisotropy field at the AF interface: The role played by the interface roughness due to randomness

    NASA Astrophysics Data System (ADS)

    Yüksel, Yusuf

    2018-05-01

    We propose an atomistic model and present Monte Carlo simulation results regarding the influence of the FM/AF interface structure on the hysteresis mechanism and exchange bias behavior for a spin-valve-type FM/FM/AF magnetic junction. We simulate both perfectly flat and roughened interface structures with uncompensated interfacial AF moments. In order to simulate the rough-interface effect, we introduce the concept of a random exchange anisotropy field induced at the interface and acting on the interface AF spins. Our results show that different distributions of the random anisotropy field may lead to different exchange bias behavior.

  1. A Numerical Simulation of Scattering from One-Dimensional Inhomogeneous Dielectric Random Surfaces

    NASA Technical Reports Server (NTRS)

    Sarabandi, Kamal; Oh, Yisok; Ulaby, Fawwaz T.

    1996-01-01

    In this paper, an efficient numerical solution for the scattering problem of inhomogeneous dielectric rough surfaces is presented. The inhomogeneous dielectric random surface represents a bare soil surface and is considered to be comprised of a large number of randomly positioned dielectric humps of different sizes, shapes, and dielectric constants above an impedance surface. Clods with nonuniform moisture content and rocks are modeled by inhomogeneous dielectric humps and the underlying smooth wet soil surface is modeled by an impedance surface. In this technique, an efficient numerical solution for the constituent dielectric humps over an impedance surface is obtained using Green's function derived by the exact image theory in conjunction with the method of moments. The scattered field from a sample of the rough surface is obtained by summing the scattered fields from all the individual humps of the surface coherently, ignoring the effect of multiple scattering between the humps. The statistical behavior of the scattering coefficient σ0 is obtained from the calculation of scattered fields of many different realizations of the surface. Numerical results are presented for several different roughnesses and dielectric constants of the random surfaces. The numerical technique is verified by comparing the numerical solution with the solution based on the small perturbation method and the physical optics model for homogeneous rough surfaces. This technique can be used to study the behavior of scattering coefficient and phase difference statistics of rough soil surfaces for which no analytical solution exists.

  2. A random walk approach to quantum algorithms.

    PubMed

    Kendon, Vivien M

    2006-12-15

    The development of quantum algorithms based on quantum versions of random walks is placed in the context of the emerging field of quantum computing. Constructing a suitable quantum version of a random walk is not trivial; pure quantum dynamics is deterministic, so randomness only enters during the measurement phase, i.e. when converting the quantum information into classical information. The outcome of a quantum random walk is very different from the corresponding classical random walk owing to the interference between the different possible paths. The upshot is that quantum walkers find themselves further from their starting point than a classical walker on average, and this forms the basis of a quantum speed up, which can be exploited to solve problems faster. Surprisingly, the effect of making the walk slightly less than perfectly quantum can optimize the properties of the quantum walk for algorithmic applications. Looking to the future, even with a small quantum computer available, the development of quantum walk algorithms might proceed more rapidly than it has, especially for solving real problems.
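    The quadratic speed-up in spreading can be seen directly by simulating a discrete-time Hadamard walk on the line and comparing its position spread with the √t scaling of a classical walk. This is a standard textbook construction, not tied to any specific algorithm in the paper:

```python
import numpy as np

def quantum_walk(steps=100):
    """Discrete-time Hadamard walk on the line; returns the position std dev."""
    n = 2 * steps + 1
    amp = np.zeros((n, 2), dtype=complex)   # amplitudes indexed [position, coin]
    amp[steps, 0] = 1 / np.sqrt(2)
    amp[steps, 1] = 1j / np.sqrt(2)         # symmetric initial coin state
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(steps):
        amp = amp @ H.T                     # coin toss (Hadamard on coin space)
        new = np.zeros_like(amp)
        new[:-1, 0] = amp[1:, 0]            # coin 0 shifts one site left
        new[1:, 1] = amp[:-1, 1]            # coin 1 shifts one site right
        amp = new
    prob = (np.abs(amp) ** 2).sum(axis=1)   # measure position; interference kept until now
    x = np.arange(n) - steps
    mean = (prob * x).sum()
    return float(np.sqrt((prob * (x - mean) ** 2).sum()))

steps = 100
sigma_q = quantum_walk(steps)
sigma_c = np.sqrt(steps)                    # classical unbiased walk spreads as sqrt(t)
print(sigma_q, sigma_c)
```

    After 100 steps the quantum walker's spread grows linearly in t (roughly 0.54·t for this coin and initial state) while the classical spread is only √t, which is the ballistic-versus-diffusive contrast underlying walk-based quantum speed-ups.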

  3. Modal energy analysis for mechanical systems excited by spatially correlated loads

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Fei, Qingguo; Li, Yanbin; Wu, Shaoqing; Chen, Qiang

    2018-10-01

    MODal ENergy Analysis (MODENA) is an energy-based method proposed to deal with vibroacoustic problems. The performance of MODENA in the energy analysis of a mechanical system under spatially correlated excitation is investigated. A plate/cavity coupling system excited by a pressure field is studied in a numerical example involving four kinds of pressure fields: the purely random pressure field, the perfectly correlated pressure field, the incident diffuse field, and the turbulent boundary layer pressure fluctuation. The total energies of the subsystems differ from the reference solution only in the case of the purely random pressure field, and only for the non-excited subsystem (the cavity). A deeper analysis at the scale of individual modal energies is then conducted via another numerical example, in which two structural modes excited by correlated forces are coupled with one acoustic mode. A dimensionless correlation strength factor is proposed to quantify the correlation strength between modal forces. Results show that the error on modal energy increases with the correlation strength factor. A criterion is proposed to link the error to the correlation strength factor: the error is negligible when the correlation is weak, i.e., when the correlation strength factor is less than a critical value.

  4. Coordination and Management of Multisite Complementary and Alternative Medicine (CAM) Therapies: Experience from a Multisite Reflexology Intervention Trial

    PubMed Central

    Rahbar, Mohammad H.; Wyatt, Gwen; Sikorskii, Alla; Victorson, David; Ardjomand-Hessabi, Manouchehr

    2011-01-01

    Background Multisite randomized clinical trials allow for increased research collaboration among investigators and expedite data collection efforts. As a result, government funding agencies typically look favorably upon this approach. As the field of complementary and alternative medicine (CAM) continues to evolve, so do increased calls for the use of more rigorous study design and trial methodologies, which can present challenges for investigators. Purpose To describe the processes involved in the coordination and management of a multisite randomized clinical trial of a CAM intervention. Methods Key aspects related to the coordination and management of a multisite CAM randomized clinical trial are presented, including organizational and site selection considerations, recruitment concerns and issues related to data collection and randomization to treatment groups. Management and monitoring of data, as well as quality assurance procedures are described. Finally, a real world perspective is shared from a recently conducted multisite randomized clinical trial of reflexology for women diagnosed with advanced breast cancer. Results The use of multiple sites in the conduct of CAM-based randomized clinical trials can provide an efficient, collaborative and robust approach to study coordination and data collection that maximizes efficiency and ensures the quality of results. Conclusions Multisite randomized clinical trial designs can offer the field of CAM research a more standardized and efficient approach to examine the effectiveness of novel therapies and treatments. Special attention must be given to intervention fidelity, consistent data collection and ensuring data quality. Assessment and reporting of quantitative indicators of data quality should be required. PMID:21664296

  5. Clustering, randomness, and regularity in cloud fields: 2. Cumulus cloud fields

    NASA Astrophysics Data System (ADS)

    Zhu, T.; Lee, J.; Weger, R. C.; Welch, R. M.

    1992-12-01

    During the last decade a major controversy has been brewing concerning the proper characterization of cumulus convection. The prevailing view has been that cumulus clouds form in clusters, in which cloud spacing is closer than that found for the overall cloud field and which maintains its identity over many cloud lifetimes. This "mutual protection hypothesis" of Randall and Huffman (1980) has been challenged by the "inhibition hypothesis" of Ramirez et al. (1990) which strongly suggests that the spatial distribution of cumuli must tend toward a regular distribution. A dilemma has resulted because observations have been reported to support both hypotheses. The present work reports a detailed analysis of cumulus cloud field spatial distributions based upon Landsat, Advanced Very High Resolution Radiometer, and Skylab data. Both nearest-neighbor and point-to-cloud cumulative distribution function statistics are investigated. The results show unequivocally that when both large and small clouds are included in the cloud field distribution, the cloud field always has a strong clustering signal. The strength of clustering is largest at cloud diameters of about 200-300 m, diminishing with increasing cloud diameter. In many cases, clusters of small clouds are found which are not closely associated with large clouds. As the small clouds are eliminated from consideration, the cloud field typically tends towards regularity. Thus it would appear that the "inhibition hypothesis" of Ramirez and Bras (1990) has been verified for the large clouds. However, these results are based upon the analysis of point processes. A more exact analysis also is made which takes into account the cloud size distributions. Since distinct clouds are by definition nonoverlapping, cloud size effects place a restriction upon the possible locations of clouds in the cloud field. 
The net effect of this analysis is that the large clouds appear to be randomly distributed, with only weak tendencies towards regularity. For clouds less than 1 km in diameter, the average nearest-neighbor distance is equal to 3-7 cloud diameters. For larger clouds, the ratio of cloud nearest-neighbor distance to cloud diameter increases sharply with increasing cloud diameter. This demonstrates that large clouds inhibit the growth of other large clouds in their vicinity. Nevertheless, this leads to random distributions of large clouds, not regularity.
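    The nearest-neighbor statistics used above can be illustrated with a simple Clark-Evans-style ratio, which compares the observed mean nearest-neighbor distance with its expectation under complete spatial randomness (a ratio below 1 indicates clustering, near 1 randomness, above 1 regularity). This is a rough sketch with no edge correction; the function name and the cluster parameters are illustrative:

```python
import numpy as np

def clark_evans_ratio(points, area):
    """Mean nearest-neighbour distance over its Poisson-field expectation."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude self-distances
    mean_nn = d.min(axis=1).mean()
    expected = 0.5 / np.sqrt(n / area)     # E[NN distance] under CSR
    return mean_nn / expected

rng = np.random.default_rng(0)
# a spatially random point field in the unit square
random_pts = rng.uniform(0.0, 1.0, (300, 2))
# a clustered field: points scattered tightly around a few parent centres
parents = rng.uniform(0.0, 1.0, (10, 2))
clustered = parents[rng.integers(0, 10, 300)] + rng.normal(0.0, 0.01, (300, 2))
```

    Applied to cloud-center coordinates, the clustered field yields a ratio well below 1, while the random field stays close to 1, mirroring the clustering signal of the small clouds versus the near-random distribution of the large ones.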

  6. A home-based exercise program for children with congenital heart disease following interventional cardiac catheterization: study protocol for a randomized controlled trial.

    PubMed

    Du, Qing; Salem, Yasser; Liu, Hao Howe; Zhou, Xuan; Chen, Sun; Chen, Nan; Yang, Xiaoyan; Liang, Juping; Sun, Kun

    2017-01-23

    Cardiac catheterization has opened an innovative treatment field for cardiac disease; it is becoming the most popular approach for pediatric congenital heart disease (CHD) and has led to significant growth in the number of children undergoing cardiac catheterization. Unfortunately, evidence has demonstrated that the majority of children with CHD are at increased risk of "non-cardiac" problems. Effective exercise therapy could improve their functional status significantly. As studies identifying the efficacy of exercise therapy are rare in this field, the aims of this study are to (1) identify the efficacy of a home-based exercise program in improving the motor function of children with CHD undergoing cardiac catheterization, (2) reduce parental anxiety and parenting burden, and (3) improve the quality of life of parents whose children are diagnosed with CHD and undergo cardiac catheterization. A total of 300 children who will undergo cardiac catheterization will be randomly assigned to two groups: a home-based intervention group and a control group. The home-based intervention group will carry out a home-based exercise program, and the control group will receive only home-based exercise education. Assessments will be undertaken before catheterization and at 1, 3, and 6 months after catheterization. Motor ability quotients will be assessed as the primary outcomes. The modified Ross score, cardiac function, speed of sound at the tibia, functional independence of the children, and the anxiety, quality of life, and caregiver burden of their parents or main caregivers will be the secondary outcome measurements. The proposed prospective randomized controlled trial will evaluate the efficacy of a home-based exercise program for children with CHD undergoing cardiac catheterization. We anticipate that the home-based exercise program may represent a valuable and efficient intervention for children with CHD and their families.
    Trial registration: ChiCTR-IOR-16007762 at http://www.chictr.org.cn/. Registered on 13 January 2016.

  7. Alternative Goal Structures for Computer Game-Based Learning

    ERIC Educational Resources Information Center

    Ke, Fengfeng

    2008-01-01

    This field study investigated the application of cooperative, competitive, and individualistic goal structures in classroom use of computer math games and its impact on students' math performance and math learning attitudes. One hundred and sixty 5th-grade students were recruited and randomly assigned to Teams-Games-Tournament cooperative gaming,…

  8. An Interaction-Based Approach to Enhancing Secondary School Instruction and Student Achievement

    ERIC Educational Resources Information Center

    Allen, Joseph; Pianta, Robert; Gregory, Anne; Mikami, Amori; Lun, Janetta

    2011-01-01

    Improving teaching quality is widely recognized as critical to addressing deficiencies in secondary school education, yet the field has struggled to identify rigorously evaluated teacher-development approaches that can produce reliable gains in student achievement. A randomized controlled trial of My Teaching Partner-Secondary--a Web-mediated…

  9. Spin dynamics of random Ising chain in coexisting transverse and longitudinal magnetic fields

    NASA Astrophysics Data System (ADS)

    Liu, Zhong-Qiang; Jiang, Su-Rong; Kong, Xiang-Mu; Xu, Yu-Liang

    2017-05-01

    The dynamics of the random Ising spin chain in coexisting transverse and longitudinal magnetic fields is studied by the recursion method. Both the spin autocorrelation function and its spectral density are investigated by numerical calculations. It is found that the system's dynamical behaviors depend on the deviation σJ of the random exchange coupling between nearest-neighbor spins and the ratio rlt of the longitudinal to the transverse field: (i) For rlt = 0, the system undergoes two crossovers, from N independent spins precessing about the transverse magnetic field to a collective-mode behavior, and then to a central-peak behavior as σJ increases. (ii) For rlt ≠ 0, the system may exhibit a coexistence of collective-mode and central-peak behavior. When σJ is small (or large enough), the system undergoes a crossover from a coexistence behavior (or a disordered behavior) to a central-peak behavior as rlt increases. (iii) Increasing σJ depresses the effects of both the transverse and the longitudinal magnetic fields. (iv) The quantum random Ising chain in coexisting magnetic fields may exhibit under-damping and critical-damping characteristics simultaneously. These results indicate that changing the external magnetic fields may control and manipulate the dynamics of the random Ising chain.

  10. Cosmic Rays in Intermittent Magnetic Fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shukurov, Anvar; Seta, Amit; Bushby, Paul J.

    The propagation of cosmic rays in turbulent magnetic fields is a diffusive process driven by the scattering of the charged particles by random magnetic fluctuations. Such fields are usually highly intermittent, consisting of intense magnetic filaments and ribbons surrounded by weaker, unstructured fluctuations. Studies of cosmic-ray propagation have largely overlooked intermittency, instead adopting Gaussian random magnetic fields. Using test particle simulations, we calculate cosmic-ray diffusivity in intermittent, dynamo-generated magnetic fields. The results are compared with those obtained from non-intermittent magnetic fields having identical power spectra. The presence of magnetic intermittency significantly enhances cosmic-ray diffusion over a wide range of particle energies. We demonstrate that the results can be interpreted in terms of a correlated random walk.

  11. Effect of alignment of easy axes on dynamic magnetization of immobilized magnetic nanoparticles

    NASA Astrophysics Data System (ADS)

    Yoshida, Takashi; Matsugi, Yuki; Tsujimura, Naotaka; Sasayama, Teruyoshi; Enpuku, Keiji; Viereck, Thilo; Schilling, Meinhard; Ludwig, Frank

    2017-04-01

    In some biomedical applications of magnetic nanoparticles (MNPs), the particles are physically immobilized. In this study, we explore the effect of the alignment of the magnetic easy axes on the dynamic magnetization of immobilized MNPs under an AC excitation field. We prepared three immobilized MNP samples: (1) a sample in which the easy axes are randomly oriented, (2) a parallel-aligned sample in which the easy axes are parallel to the AC field, and (3) an orthogonally aligned sample in which the easy axes are perpendicular to the AC field. First, we show that the parallel-aligned sample has the largest hysteresis in the magnetization curve and the largest harmonic magnetization spectra, followed by the randomly oriented and orthogonally aligned samples. For example, a 1.6-fold increase was observed in the area of the hysteresis loop of the parallel-aligned sample compared to that of the randomly oriented sample. To discuss the experimental results quantitatively, we perform a numerical simulation based on a Fokker-Planck equation, in which probability distributions for the directions of the easy axes are taken into account in simulating the prepared MNP samples. We obtain quantitative agreement between experiment and simulation. These results indicate that the dynamic magnetization of immobilized MNPs is significantly affected by the alignment of the easy axes.

  12. Introduction to the Special Issue.

    ERIC Educational Resources Information Center

    Petrosino, Anthony

    2003-01-01

    Introduces the articles of this special issue focusing on randomized field trials in criminology. In spite of the overall lack of randomized field trials in criminology, some agencies and individuals are able to mount an impressive number of field trials, and these articles focus on their experiences. (SLD)

  13. Space-time models based on random fields with local interactions

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios T.; Tsantili, Ivi C.

    2016-08-01

    The analysis of space-time data from complex, real-life phenomena requires the use of flexible and physically motivated covariance functions. In most cases, it is not possible to explicitly solve the equations of motion for the fields or the respective covariance functions. In the statistical literature, covariance functions are often based on mathematical constructions. In this paper, we propose deriving space-time covariance functions by solving “effective equations of motion”, which can be used as statistical representations of systems with diffusive behavior. In particular, we propose to formulate space-time covariance functions based on an equilibrium effective Hamiltonian using the linear response theory. The effective space-time dynamics is then generated by a stochastic perturbation around the equilibrium point of the classical field Hamiltonian leading to an associated Langevin equation. We employ a Hamiltonian which extends the classical Gaussian field theory by including a curvature term and leads to a diffusive Langevin equation. Finally, we derive new forms of space-time covariance functions.
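    Schematically, and in notation assumed here rather than taken from the paper (the coefficients η1, λ and the relaxation rate Γ are illustrative), the construction pairs a Gaussian Hamiltonian carrying gradient and curvature terms with a Langevin relaxation toward its equilibrium:

```latex
% Effective Hamiltonian: classical Gaussian field theory plus a curvature term
\mathcal{H}[\phi] = \frac{1}{2}\int \mathrm{d}^{d}x\,
  \Big[\, \phi^{2}(\mathbf{x})
        + \eta_{1}\lambda^{2}\,\lvert\nabla\phi(\mathbf{x})\rvert^{2}
        + \lambda^{4}\,\big(\nabla^{2}\phi(\mathbf{x})\big)^{2} \,\Big]

% Stochastic perturbation around equilibrium: the associated Langevin equation
\frac{\partial\phi(\mathbf{x},t)}{\partial t}
  = -\,\Gamma\,\frac{\delta\mathcal{H}}{\delta\phi(\mathbf{x},t)}
    + \zeta(\mathbf{x},t),
\qquad
\langle \zeta(\mathbf{x},t)\,\zeta(\mathbf{x}',t')\rangle
  = 2\Gamma\,\delta(\mathbf{x}-\mathbf{x}')\,\delta(t-t')
```

    Because this Langevin equation is linear in φ, its space-time covariance function follows in closed form, e.g. in Fourier space as a diffusively damped spectrum.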

  14. Structure-based Markov random field model for representing evolutionary constraints on functional sites.

    PubMed

    Jeong, Chan-Seok; Kim, Dongsup

    2016-02-24

    Elucidating the cooperative mechanism of interconnected residues is an important component toward understanding the biological function of a protein. Coevolution analysis has been developed to model the coevolutionary information reflecting structural and functional constraints. Recently, several methods have been developed based on a probabilistic graphical model called the Markov random field (MRF), which have led to significant improvements for coevolution analysis; however, thus far, the performance of these models has mainly been assessed by focusing on the aspect of protein structure. In this study, we built an MRF model whose graphical topology is determined by the residue proximity in the protein structure, and derived a novel positional coevolution estimate utilizing the node weight of the MRF model. This structure-based MRF method was evaluated for three data sets, each of which annotates catalytic site, allosteric site, and comprehensively determined functional site information. We demonstrate that the structure-based MRF architecture can encode the evolutionary information associated with biological function. Furthermore, we show that the node weight can more accurately represent positional coevolution information compared to the edge weight. Lastly, we demonstrate that the structure-based MRF model can be reliably built with only a few aligned sequences in linear time. The results show that adoption of a structure-based architecture could be an acceptable approximation for coevolution modeling with efficient computation complexity.

  15. Unsupervised change detection of multispectral images based on spatial constraint chi-squared transform and Markov random field model

    NASA Astrophysics Data System (ADS)

    Shi, Aiye; Wang, Chao; Shen, Shaohong; Huang, Fengchen; Ma, Zhenli

    2016-10-01

    Chi-squared transform (CST), as a statistical method, can describe the difference degree between vectors. The CST-based methods operate directly on information stored in the difference image and are simple and effective methods for detecting changes in remotely sensed images that have been registered and aligned. However, the technique does not take spatial information into consideration, which leads to much noise in the result of change detection. An improved unsupervised change detection method is proposed based on spatial constraint CST (SCCST) in combination with a Markov random field (MRF) model. First, the mean and variance matrix of the difference image of bitemporal images are estimated by an iterative trimming method. In each iteration, spatial information is injected to reduce scattered changed points (also known as "salt and pepper" noise). To determine the key parameter confidence level in the SCCST method, a pseudotraining dataset is constructed to estimate the optimal value. Then, the result of SCCST, as an initial solution of change detection, is further improved by the MRF model. The experiments on simulated and real multitemporal and multispectral images indicate that the proposed method performs well in comprehensive indices compared with other methods.
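    The basic CST step, before any spatial constraint or MRF refinement, can be sketched as a per-pixel Mahalanobis distance of the band-wise difference image; for unchanged pixels the output is approximately chi-squared distributed with (number of bands) degrees of freedom, which is what the confidence-level threshold is applied to. The function name and the synthetic test images below are illustrative assumptions:

```python
import numpy as np

def chi_squared_transform(img1, img2):
    """Per-pixel Mahalanobis distance of the band-wise difference image."""
    bands = img1.shape[-1]
    diff = (img2.astype(float) - img1.astype(float)).reshape(-1, bands)
    mu = diff.mean(axis=0)                         # mean of the difference
    inv_cov = np.linalg.inv(np.cov(diff, rowvar=False))
    centered = diff - mu
    # y_i = (d_i - mu)^T  C^{-1}  (d_i - mu), computed for every pixel at once
    y = np.einsum('ij,jk,ik->i', centered, inv_cov, centered)
    return y.reshape(img1.shape[:-1])

rng = np.random.default_rng(1)
base = rng.normal(100.0, 10.0, (64, 64, 3))        # 3-band "image", time 1
repeat = base + rng.normal(0.0, 2.0, (64, 64, 3))  # time 2: noise, no change
y = chi_squared_transform(base, repeat)            # mean should be near 3
```

    Thresholding y at a chi-squared quantile for the chosen confidence level then yields the initial change mask that the SCCST iteration and the MRF model refine.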

  16. Incorporating interaction networks into the determination of functionally related hit genes in genomic experiments with Markov random fields

    PubMed Central

    Robinson, Sean; Nevalainen, Jaakko; Pinna, Guillaume; Campalans, Anna; Radicella, J. Pablo; Guyon, Laurent

    2017-01-01

    Motivation: Incorporating gene interaction data into the identification of ‘hit’ genes in genomic experiments is a well-established approach leveraging the ‘guilt by association’ assumption to obtain a network based hit list of functionally related genes. We aim to develop a method to allow for multivariate gene scores and multiple hit labels in order to extend the analysis of genomic screening data within such an approach. Results: We propose a Markov random field-based method to achieve our aim and show that the particular advantages of our method compared with those currently used lead to new insights in previously analysed data as well as for our own motivating data. Our method additionally achieves the best performance in an independent simulation experiment. The real data applications we consider comprise of a survival analysis and differential expression experiment and a cell-based RNA interference functional screen. Availability and implementation: We provide all of the data and code related to the results in the paper. Contact: sean.j.robinson@utu.fi or laurent.guyon@cea.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28881978

  17. An Optimization-based Framework to Learn Conditional Random Fields for Multi-label Classification

    PubMed Central

    Naeini, Mahdi Pakdaman; Batal, Iyad; Liu, Zitao; Hong, CharmGil; Hauskrecht, Milos

    2015-01-01

    This paper studies multi-label classification problem in which data instances are associated with multiple, possibly high-dimensional, label vectors. This problem is especially challenging when labels are dependent and one cannot decompose the problem into a set of independent classification problems. To address the problem and properly represent label dependencies we propose and study a pairwise conditional random Field (CRF) model. We develop a new approach for learning the structure and parameters of the CRF from data. The approach maximizes the pseudo likelihood of observed labels and relies on the fast proximal gradient descend for learning the structure and limited memory BFGS for learning the parameters of the model. Empirical results on several datasets show that our approach outperforms several multi-label classification baselines, including recently published state-of-the-art methods. PMID:25927015

  18. Power spectral density of Markov texture fields

    NASA Technical Reports Server (NTRS)

    Shanmugan, K. S.; Holtzman, J. C.

    1984-01-01

    Texture is an important image characteristic. A variety of spatial domain techniques have been proposed for extracting and utilizing textural features to segment and classify images. For the most part, these spatial domain techniques are ad hoc in nature. A Markov random field model for image texture is discussed. A frequency domain description of image texture is derived in terms of the power spectral density. This model is used for designing optimum frequency domain filters for enhancing, restoring, and segmenting images based on their textural properties.
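    The frequency-domain description rests on the power spectral density, which for a sampled texture can be estimated with a simple 2-D periodogram (the function name is illustrative; no windowing or averaging is applied in this sketch):

```python
import numpy as np

def texture_psd(texture):
    """2-D periodogram estimate of a texture's power spectral density."""
    t = texture - texture.mean()           # remove the DC component
    F = np.fft.fftshift(np.fft.fft2(t))    # centre zero frequency
    return (np.abs(F) ** 2) / t.size       # |FFT|^2 scaled by pixel count

rng = np.random.default_rng(0)
img = rng.normal(size=(32, 32))            # white-noise "texture"
psd = texture_psd(img)
# by Parseval's theorem, psd.sum() equals the total signal energy
```

    A Markov texture concentrates power at low spatial frequencies according to its neighbor correlations, and a frequency-domain filter designed from this PSD can selectively enhance or suppress a given texture class.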

  19. Reactivating dynamics for the susceptible-infected-susceptible model: a simple method to simulate the absorbing phase

    NASA Astrophysics Data System (ADS)

    Macedo-Filho, A.; Alves, G. A.; Costa Filho, R. N.; Alves, T. F. A.

    2018-04-01

    We investigated the susceptible-infected-susceptible model on a square lattice in the presence of a conjugated field, based on recently proposed reactivating dynamics. Reactivating dynamics consists of reactivating the infection by adding one randomly chosen infected site whenever the infection dies out, preventing the dynamics from being trapped in the absorbing state. We show that the reactivating dynamics can be interpreted as the usual dynamics performed in the presence of an effective conjugated field, named the reactivating field. The reactivating field scales as the inverse of the number of lattice vertices n, which vanishes in the thermodynamic limit and does not affect any scaling properties, including those related to the conjugated field.
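    A minimal Monte Carlo sketch of the scheme, assuming standard contact-process-style rates (the update rule, rate parameterization, and function name are illustrative, not the authors' exact implementation):

```python
import numpy as np

def sis_density(L, lam, steps, seed=0):
    """SIS on an L x L periodic lattice with reactivating dynamics.

    lam is the infection/recovery rate ratio. Whenever the infection dies
    out, a single randomly chosen site is re-infected: the 'reactivating
    field', which scales as 1/n with n = L*L lattice sites.
    """
    rng = np.random.default_rng(seed)
    state = np.zeros((L, L), dtype=bool)
    state[rng.integers(L), rng.integers(L)] = True
    total = 0.0
    for _ in range(steps):
        infected = np.argwhere(state)
        i, j = infected[rng.integers(len(infected))]   # pick an infected site
        if rng.random() < 1.0 / (1.0 + lam):
            state[i, j] = False                        # recovery
        else:                                          # infect a random neighbour
            di, dj = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(4)]
            state[(i + di) % L, (j + dj) % L] = True
        if not state.any():                            # absorbing state reached
            state[rng.integers(L), rng.integers(L)] = True   # reactivate
        total += state.mean()
    return total / steps                               # time-averaged density
```

    Above the critical point the time-averaged density is finite, while below it the reactivation keeps the density at the O(1/n) floor set by the reactivating field, so it vanishes in the thermodynamic limit as the paper states.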

  20. The Differential Effect of Basic Mathematics Skills Homework via a Web-Based Intelligent Tutoring System across Achievement Subgroups and Mathematics Domains: A Randomized Field Experiment

    ERIC Educational Resources Information Center

    Bartelet, Dimona; Ghysels, Joris; Groot, Wim; Haelermans, Carla; van den Brink, Henriëtte Maassen

    2016-01-01

    This article examines an educational experiment with a unique combination of 3 elements: homework, the use of information and communication technology and a large degree of freedom of choice (student autonomy). More particularly, we study the effectiveness of a web-based intelligent tutoring system (ITS) that a school offers to its students as…

  1. Magnetic field line random walk in models and simulations of reduced magnetohydrodynamic turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snodin, A. P.; Ruffolo, D.; Oughton, S.

    2013-12-10

    The random walk of magnetic field lines is examined numerically and analytically in the context of reduced magnetohydrodynamic (RMHD) turbulence, which provides a useful description of plasmas dominated by a strong mean field, such as in the solar corona. A recently developed non-perturbative theory of magnetic field line diffusion is compared with the diffusion coefficients obtained by accurate numerical tracing of magnetic field lines for both synthetic models and direct numerical simulations of RMHD. Statistical analysis of an ensemble of trajectories confirms the applicability of the theory, which very closely matches the numerical field line diffusion coefficient as a function of distance z along the mean magnetic field for a wide range of the Kubo number R. This theory employs Corrsin's independence hypothesis, sometimes thought to be valid only at low R. However, the results demonstrate that it works well up to R = 10, both for a synthetic RMHD model and an RMHD simulation. The numerical results from the RMHD simulation are compared with and without phase randomization, demonstrating a clear effect of coherent structures on the field line random walk for a very low Kubo number.

  2. Improving guideline concordance in multidisciplinary teams: preliminary results of a cluster-randomized trial evaluating the effect of a web-based audit and feedback intervention with outreach visits

    PubMed Central

    van Engen-Verheul, Mariëtte M.; Gude, Wouter T.; van der Veer, Sabine N.; Kemps, Hareld M.C.; Jaspers, Monique M.W.; de Keizer, Nicolette F.; Peek, Niels

    2015-01-01

    Despite their widespread use, audit and feedback (A&F) interventions show variable effectiveness on improving professional performance. Based on known facilitators of successful A&F interventions, we developed a web-based A&F intervention with indicator-based performance feedback, benchmark information, action planning and outreach visits. The goal of the intervention was to engage with multidisciplinary teams to overcome barriers to guideline concordance and to improve overall team performance in the field of cardiac rehabilitation (CR). To assess its effectiveness we conducted a cluster-randomized trial in 18 CR clinics (14,847 patients) already working with computerized decision support (CDS). Our preliminary results showed no increase in concordance with guideline recommendations regarding prescription of CR therapies. Future analyses will investigate whether our intervention did improve team performance on other quality indicators. PMID:26958310

  3. Decision tree modeling using R.

    PubMed

    Zhang, Zhongheng

    2016-08-01

    In the machine learning field, the decision tree learner is powerful and easy to interpret. It employs a recursive binary partitioning algorithm that splits the sample on the partitioning variable with the strongest association with the response variable. The process continues until some stopping criteria are met. In the example I focus on the conditional inference tree, which incorporates tree-structured regression models into conditional inference procedures. Because a single tree is sensitive to small changes in the training data, the random forests procedure is introduced to address this problem. The sources of diversity for random forests are random sampling and the restricted set of input variables available for selection at each split. Finally, I introduce R functions to perform model-based recursive partitioning. This method incorporates recursive partitioning into conventional parametric model building.
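    The record works with R's tree packages; as a language-neutral illustration of recursive binary partitioning itself, here is a toy regression-tree sketch in Python (the depth limit, squared-error split criterion, and function names are simplifying assumptions, not the conditional inference procedure the record describes):

```python
def best_split(x, y):
    """Exhaustive search for the threshold minimizing total squared error."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    xs, ys = [x[i] for i in order], [y[i] for i in order]
    best_err, best_t = float('inf'), None
    for cut in range(1, len(xs)):
        left, right = ys[:cut], ys[cut:]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = sum((v - ml) ** 2 for v in left) + sum((v - mr) ** 2 for v in right)
        if err < best_err:
            best_err, best_t = err, (xs[cut - 1] + xs[cut]) / 2
    return best_t

def grow(x, y, depth):
    """Recursive binary partitioning; leaves store the mean response."""
    if depth == 0 or len(set(y)) == 1 or len(y) < 2:
        return sum(y) / len(y)                 # stopping criterion: leaf
    t = best_split(x, y)
    li = [i for i, v in enumerate(x) if v <= t]
    ri = [i for i, v in enumerate(x) if v > t]
    if not li or not ri:
        return sum(y) / len(y)
    return (t,
            grow([x[i] for i in li], [y[i] for i in li], depth - 1),
            grow([x[i] for i in ri], [y[i] for i in ri], depth - 1))

def predict(tree, v):
    """Follow splits down to a leaf value."""
    while isinstance(tree, tuple):
        t, left, right = tree
        tree = left if v <= t else right
    return tree
```

    A random forest then repeats `grow` on bootstrap resamples, restricting each `best_split` to a random subset of input variables, and averages the resulting predictions.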

  4. Unsteady Fast Random Particle Mesh method for efficient prediction of tonal and broadband noises of a centrifugal fan unit

    NASA Astrophysics Data System (ADS)

    Heo, Seung; Cheong, Cheolung; Kim, Taehoon

    2015-09-01

    In this study, an efficient numerical method is proposed for predicting the tonal and broadband noise of a centrifugal fan unit. The proposed method is based on Hybrid Computational Aero-Acoustic (H-CAA) techniques combined with the Unsteady Fast Random Particle Mesh (U-FRPM) method. The U-FRPM method is developed by extending the FRPM method proposed by Ewert et al. and is utilized to synthesize the turbulence flow field from unsteady RANS solutions. The H-CAA technique combined with the U-FRPM method is applied to predict the broadband as well as tonal noise of a centrifugal fan unit in a household refrigerator. First, the unsteady flow field driven by the rotating fan is computed by solving the RANS equations with Computational Fluid Dynamics (CFD) techniques. Main source regions around the rotating fan are identified by examining the computed flow fields. Then, turbulence flow fields in the main source regions are synthesized by applying the U-FRPM method. The acoustic analogy is applied to model acoustic sources in the main source regions. Finally, the centrifugal fan noise is predicted by feeding the modeled acoustic sources into an acoustic solver based on the Boundary Element Method (BEM). The sound spectral levels predicted using the current numerical method show good agreement with the measured spectra at the Blade Pass Frequencies (BPFs) as well as in the high frequency range. Moreover, the present method enables quantitative assessment of the relative contributions of the identified source regions to the sound field by comparing the predicted sound pressure spectra due to the modeled sources.

  5. The effects of the one-step replica symmetry breaking on the Sherrington-Kirkpatrick spin glass model in the presence of random field with a joint Gaussian probability density function for the exchange interactions and random fields

    NASA Astrophysics Data System (ADS)

    Hadjiagapiou, Ioannis A.; Velonakis, Ioannis N.

    2018-07-01

    The Sherrington-Kirkpatrick Ising spin glass model, in the presence of a random magnetic field, is investigated within the framework of the one-step replica symmetry breaking. The two random variables (exchange integral interaction Jij and random magnetic field hi) are drawn from a joint Gaussian probability density function characterized by a correlation coefficient ρ, assuming positive and negative values. The thermodynamic properties, the three different phase diagrams, and the system's parameters are computed with respect to the natural parameters of the joint Gaussian probability density function at non-zero and zero temperatures. The low temperature negative entropy controversy, a result of the replica symmetry approach, has been partly remedied in the current study, leading to a less negative result. In addition, the present system possesses two successive spin glass phase transitions with characteristic temperatures.

  6. Random number generators for large-scale parallel Monte Carlo simulations on FPGA

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Wang, F.; Liu, B.

    2018-05-01

    Through parallelization, field programmable gate array (FPGA) can achieve unprecedented speeds in large-scale parallel Monte Carlo (LPMC) simulations. FPGA presents both new constraints and new opportunities for the implementations of random number generators (RNGs), which are key elements of any Monte Carlo (MC) simulation system. Using empirical and application based tests, this study evaluates all of the four RNGs used in previous FPGA based MC studies and newly proposed FPGA implementations for two well-known high-quality RNGs that are suitable for LPMC studies on FPGA. One of the newly proposed FPGA implementations: a parallel version of additive lagged Fibonacci generator (Parallel ALFG) is found to be the best among the evaluated RNGs in fulfilling the needs of LPMC simulations on FPGA.
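    The additive lagged Fibonacci generator named above can be sketched in a few lines; the recurrence is x_n = (x_{n-j} + x_{n-k}) mod 2^m. The lags (j, k) = (5, 17), word size m = 32, and class name are common illustrative choices, not necessarily those of the paper's FPGA implementation:

```python
class AdditiveLFG:
    """Additive lagged Fibonacci generator: x_n = (x_{n-j} + x_{n-k}) mod 2^m.

    Parallel streams are obtained by giving each instance distinct seed
    words; at least one seed word must be odd for the full period.
    """
    def __init__(self, seed_words, j=5, k=17, m=32):
        assert len(seed_words) == k and any(w % 2 for w in seed_words)
        self.state = list(seed_words)      # state[0] is x_{n-k}
        self.j = j
        self.mask = (1 << m) - 1
    def next(self):
        x = (self.state[-self.j] + self.state[0]) & self.mask
        self.state.append(x)               # slide the lag window forward
        self.state.pop(0)
        return x

g = AdditiveLFG(list(range(1, 18)))        # toy seed; use random odd words
```

    On an FPGA the k-word state maps naturally to a small shift-register memory, and one output per clock requires only an adder, which is why lagged Fibonacci generators suit large-scale parallel Monte Carlo designs.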

  7. A Memory-Based Programmable Logic Device Using Look-Up Table Cascade with Synchronous Static Random Access Memories

    NASA Astrophysics Data System (ADS)

    Nakamura, Kazuyuki; Sasao, Tsutomu; Matsuura, Munehiro; Tanaka, Katsumasa; Yoshizumi, Kenichi; Nakahara, Hiroki; Iguchi, Yukihiro

    2006-04-01

A large-scale memory-technology-based programmable logic device (PLD) using a look-up table (LUT) cascade is developed in the 0.35-μm standard complementary metal oxide semiconductor (CMOS) logic process. Eight 64 K-bit synchronous SRAMs are connected to form an LUT cascade with a few additional circuits. The features of the LUT cascade include: 1) a flexible cascade connection structure, 2) multi-phase pseudo-asynchronous operations with synchronous static random access memory (SRAM) cores, and 3) LUT-bypass redundancy. The chip operates at 33 MHz in an 8-LUT cascade, consuming 122 mW. Benchmark results show that it achieves performance comparable to field programmable gate arrays (FPGAs).
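
    The cascade evaluation idea (each memory cell maps the previous cell's rail outputs plus fresh primary inputs to new rail values) can be modeled in a few lines; the cell widths and the parity example are illustrative, not the chip's benchmark circuits:

```python
# Minimal software model of a look-up table (LUT) cascade: each cell maps
# (rail bits from the previous cell + fresh primary inputs) to new rail bits
# via a stored truth table, analogous to one SRAM read per cell.
def eval_cascade(cells, inputs_per_cell, primary_inputs):
    rails = 0
    it = iter(primary_inputs)
    for table in cells:
        addr = rails
        for _ in range(inputs_per_cell):     # append fresh input bits
            addr = (addr << 1) | next(it)
        rails = table[addr]                  # one table lookup per cell
    return rails

# Example: 4-bit parity as a cascade of four 1-rail, 1-input cells.
# table[(rail << 1) | bit] = rail XOR bit
parity_cell = [0, 1, 1, 0]
cells = [parity_cell] * 4
```

    The chip's "flexible cascade connection structure" corresponds here to being free to choose the number of cells and the rail/input split per cell.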

  8. Digital Sound Encryption with Logistic Map and Number Theoretic Transform

    NASA Astrophysics Data System (ADS)

    Satria, Yudi; Gabe Rizky, P. H.; Suryadi, MT

    2018-03-01

Digital sound security has limits when encrypting in the frequency domain. A Number Theoretic Transform based on the field GF(2^521 − 1) improves on and solves that problem. The sound encryption algorithm is based on a combination of a chaos function and the Number Theoretic Transform; the chaos function used in this paper is the logistic map. The trials and simulations were conducted using 5 different digital sound files in WAV format as test data, each simulated at least 100 times. The resulting key stream is random, as verified by 15 NIST randomness tests. The key space formed is very large, at more than 10^469. The processing speed of the encryption algorithm is only slightly affected by the Number Theoretic Transform.
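
    The chaos half of the scheme can be sketched as a logistic-map keystream (the NTT half is omitted); r = 3.99, the burn-in length, and the byte-extraction rule are illustrative assumptions, not the paper's exact parameters:

```python
# Sketch of a logistic-map keystream generator.  The logistic map
# x <- r*x*(1-x) is chaotic for r near 4, so nearby seeds diverge quickly.
def logistic_keystream(x0, n, r=3.99, burn_in=100):
    assert 0.0 < x0 < 1.0
    x = x0
    for _ in range(burn_in):              # discard transient iterates
        x = r * x * (1.0 - x)
    out = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) & 0xFF)   # quantize each iterate to a byte
    return out

def xor_encrypt(data, key):
    # XOR stream cipher: applying the same keystream twice decrypts.
    return bytes(d ^ k for d, k in zip(data, key))
```

    Sensitivity to the initial condition x0 is what makes the keystream key-dependent; a production design would derive x0 (and r) from the secret key.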

  9. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    PubMed

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed, and 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent, and only rarely was local knowledge required to identify and locate households.
This method provides an important technique that can be applied to other developing countries where a randomized study design is needed but infrastructure is lacking to implement more traditional participant selection methods.
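
    The selection step (a random subset of mapped homes, drawn in Excel in the study) can be reproduced with any uniform sampler; this sketch uses Python's random.sample with hypothetical coordinates and a fixed seed so the draw is auditable:

```python
# Sketch of the household-selection step: given homes digitized from Google
# Earth imagery, draw a reproducible random subset for field visits.  The
# coordinates, sample size, and seed are illustrative stand-ins.
import random

def select_households(homes, n, seed=2012):
    rng = random.Random(seed)        # fixed seed -> auditable, repeatable draw
    return rng.sample(homes, n)      # uniform sampling without replacement

# hypothetical (lat, lon) pairs standing in for the 537 mapped homes
homes = [(18.0 + i * 1e-3, -72.5 + i * 1e-3) for i in range(537)]
survey_sites = select_households(homes, 96)   # the randomized subset of 96
```

    Each selected coordinate pair would then be loaded onto the handheld GPS units for navigation to the household.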

  10. Analysis And Validation of the Field Coupled Through an Aperture in an Avionics Enclosure

    NASA Astrophysics Data System (ADS)

    Bakore, Rahul

This work focused on accurately predicting the current response of an equipment under test (EUT) to a random electromagnetic field representing a threat source, in order to model radio frequency directed energy weapons (RFDEWs). The modeled EUT consists of a single wire attached to the interior wall of a shielding enclosure that includes an aperture on one face. An in-house computational electromagnetic (CEM) code based on the method of moments (MOM) and accelerated by the multi-level fast multipole algorithm (MLFMA) was enhanced through the implementation of first-order vector basis functions that approximate the EUT surface current. The electric field integral equation (EFIE) is solved using MOM/MLFMA. Use of first-order basis functions yields large savings in computational time over the previous implementation with zero-order Rao-Wilton-Glisson basis functions. A sample EUT was fabricated and tested within an anechoic chamber and a reverberation chamber over a wide frequency band. In the anechoic chamber measurements, the current response on the wire within the EUT due to a single uniform plane wave was found and compared with the numerical simulations. In the reverberation chamber measurements, the mean current magnitude excited on the wire within the EUT by a mechanically stirred random field was measured and compared with the numerical simulations. The measured scattering parameter between the source antenna and the EUT measurement port was used to derive the current response on the wire in both chambers. The numerically simulated currents agree very well with the measurements in both the anechoic and reverberation chambers over the measured frequency band, confirming the validity of the numerical approach for calculating the EUT response due to a random field. 
An artificial neural network (ANN) was trained that can rapidly provide the mean induced current response of an EUT due to a random field under different aperture configurations arbitrarily placed on one face of the EUT. However, the ANN proved no better than simple linear interpolation in approximating the induced currents on EUTs that exhibit strong resonances and nulls in the response.

  11. Effect of random errors in planar PIV data on pressure estimation in vortex dominated flows

    NASA Astrophysics Data System (ADS)

    McClure, Jeffrey; Yarusevych, Serhiy

    2015-11-01

The sensitivity of pressure estimation techniques from Particle Image Velocimetry (PIV) measurements to random errors in measured velocity data is investigated using the flow over a circular cylinder as a test case. Direct numerical simulations are performed for ReD = 100, 300 and 1575, spanning the laminar, transitional, and turbulent wake regimes, respectively. A range of random errors typical for PIV measurements is applied to synthetic PIV data extracted from the numerical results. A parametric study is then performed using a number of common pressure estimation techniques. Optimal temporal and spatial resolutions are derived based on the sensitivity of the estimated pressure fields to the simulated random error in velocity measurements, and the results are compared to an optimization model derived from error propagation theory. It is shown that the reductions in spatial and temporal scales at higher Reynolds numbers lead to notable changes in the optimal pressure evaluation parameters. The effect of smaller-scale wake structures is also quantified. The errors in the estimated pressure fields are shown to depend significantly on the pressure estimation technique employed. The results are used to provide recommendations for the use of pressure and force estimation techniques from experimental PIV measurements in vortex-dominated laminar and turbulent wake flows.

  12. Enhancement of Electron Acceleration in Laser Wakefields by Random Fields

    NASA Astrophysics Data System (ADS)

    Tataronis, J. A.; Petržílka, V.

    1999-11-01

    There is increasing evidence that intense laser pulses can accelerate electrons to high energies. The energy appears to increase with the distance over which the electrons are accelerated. This is difficult to explain by electron trapping in a single wakefield wave.^1 We demonstrate that enhanced electron acceleration can arise in inhomogeneous laser wakefields through the effects of spontaneously excited random fields. This acceleration mechanism is analogous to fast electron production by random fields near rf antennae in fusion devices and helicon plasma sources.^2 Electron acceleration in a transverse laser wave due to random field effects was recently found.^3 In the present study we solve numerically the governing equations of an ensemble of test electrons in a longitudinal electric wakefield perturbed by random fields. [1pt] Supported by the Czech grant IGA A1043701 and the U.S. DOE under grant No. DE-FG02-97ER54398. [1pt] 1. A. Pukhov and J. Meyer-ter-Vehn, in Superstrong Fields in Plasmas, AIP Conf. Proc. 426, p. 93 (1997). 2. V. Petržílka, J. A. Tataronis, et al., in Proc. Varenna - Lausanne Fusion Theory Workshop, p. 95 (1998). 3. J. Meyer-ter-Vehn and Z. M. Sheng, Phys. Plasmas 6, 641 (1999).

  13. What is quantum in quantum randomness?

    PubMed

    Grangier, P; Auffèves, A

    2018-07-13

    It is often said that quantum and classical randomness are of different nature, the former being ontological and the latter epistemological. However, so far the question of 'What is quantum in quantum randomness?', i.e. what is the impact of quantization and discreteness on the nature of randomness, remains to be answered. In a first part, we make explicit the differences between quantum and classical randomness within a recently proposed ontology for quantum mechanics based on contextual objectivity. In this view, quantum randomness is the result of contextuality and quantization. We show that this approach strongly impacts the purposes of quantum theory as well as its areas of application. In particular, it challenges current programmes inspired by classical reductionism, aiming at the emergence of the classical world from a large number of quantum systems. In a second part, we analyse quantum physics and thermodynamics as theories of randomness, unveiling their mutual influences. We finally consider new technological applications of quantum randomness that have opened up in the emerging field of quantum thermodynamics.This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).

  14. The Effects of ABRACADABRA on Reading Outcomes: A Meta-Analysis of Applied Field Research

    ERIC Educational Resources Information Center

    Abrami, Philip; Borohkovski, Eugene; Lysenko, Larysa

    2015-01-01

    This meta-analysis summarizes research on the effects of a comprehensive, interactive web-based software (AXXX) on the development of reading competencies among kindergarteners and elementary students. Findings from seven randomized control trials and quasi-experimental studies undertaken in a variety of contexts across Canada, Australia and Kenya…

  15. A Deep-Structured Conditional Random Field Model for Object Silhouette Tracking

    PubMed Central

    Shafiee, Mohammad Javad; Azimifar, Zohreh; Wong, Alexander

    2015-01-01

In this work, we introduce a deep-structured conditional random field (DS-CRF) model for the purpose of state-based object silhouette tracking. The proposed DS-CRF model consists of a series of state layers, where each state layer spatially characterizes the object silhouette at a particular point in time. The interactions between adjacent state layers are established by inter-layer connectivity dynamically determined based on inter-frame optical flow. By incorporating both spatial and temporal context in a dynamic fashion within such a deep-structured probabilistic graphical model, the proposed DS-CRF model allows us to develop a framework that can accurately and efficiently track object silhouettes that change greatly over time, as well as under different situations such as occlusion and multiple targets within the scene. Experimental results using video surveillance datasets containing different scenarios such as occlusion and multiple targets showed that the proposed DS-CRF approach provides strong object silhouette tracking performance when compared to baseline methods such as mean-shift tracking, as well as state-of-the-art methods such as context tracking and boosted particle filtering. PMID:26313943

  16. BMRF-Net: a software tool for identification of protein interaction subnetworks by a bagging Markov random field-based method.

    PubMed

    Shi, Xu; Barnes, Robert O; Chen, Li; Shajahan-Haq, Ayesha N; Hilakivi-Clarke, Leena; Clarke, Robert; Wang, Yue; Xuan, Jianhua

    2015-07-15

Identification of protein interaction subnetworks is an important step to help us understand complex molecular mechanisms in cancer. In this paper, we develop the BMRF-Net package, implemented in Java and C++, to identify protein interaction subnetworks based on a bagging Markov random field (BMRF) framework. By integrating gene expression data and protein-protein interaction data, this software tool can be used to identify biologically meaningful subnetworks. A user-friendly graphical user interface is developed as a Cytoscape plugin for the BMRF-Net software to handle the input/output interface. The detailed structure of the identified networks can be visualized conveniently in Cytoscape. The BMRF-Net package has been applied to breast cancer data to identify significant subnetworks related to breast cancer recurrence. The BMRF-Net package is available at http://sourceforge.net/projects/bmrfcjava/. The package is tested under Ubuntu 12.04 (64-bit), Java 7, glibc 2.15 and Cytoscape 3.1.0. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  17. Fourier transform infrared spectroscopy microscopic imaging classification based on spatial-spectral features

    NASA Astrophysics Data System (ADS)

    Liu, Lian; Yang, Xiukun; Zhong, Mingliang; Liu, Yao; Jing, Xiaojun; Yang, Qin

    2018-04-01

    The discrete fractional Brownian incremental random (DFBIR) field is used to describe the irregular, random, and highly complex shapes of natural objects such as coastlines and biological tissues, for which traditional Euclidean geometry cannot be used. In this paper, an anisotropic variable window (AVW) directional operator based on the DFBIR field model is proposed for extracting spatial characteristics of Fourier transform infrared spectroscopy (FTIR) microscopic imaging. Probabilistic principal component analysis first extracts spectral features, and then the spatial features of the proposed AVW directional operator are combined with the former to construct a spatial-spectral structure, which increases feature-related information and helps a support vector machine classifier to obtain more efficient distribution-related information. Compared to Haralick’s grey-level co-occurrence matrix, Gabor filters, and local binary patterns (e.g. uniform LBPs, rotation-invariant LBPs, uniform rotation-invariant LBPs), experiments on three FTIR spectroscopy microscopic imaging datasets show that the proposed AVW directional operator is more advantageous in terms of classification accuracy, particularly for low-dimensional spaces of spatial characteristics.

  18. Influence of stochastic geometric imperfections on the load-carrying behaviour of thin-walled structures using constrained random fields

    NASA Astrophysics Data System (ADS)

    Lauterbach, S.; Fina, M.; Wagner, W.

    2018-04-01

Since structural engineering requires highly developed and optimized structures, the thickness dependency is one of the most controversially debated topics. This paper deals with stability analysis of lightweight thin structures combined with arbitrary geometrical imperfections. Generally known design guidelines only consider imperfections for simple shapes and loading, whereas for complex structures the lower-bound design philosophy still holds. Herein, uncertainties are considered with an empirical knockdown factor representing a lower bound of existing measurements. To fully understand and predict expected bearable loads, numerical investigations are essential, including geometrical imperfections. These are implemented into a stand-alone program code with a stochastic approach to compute random fields as geometric imperfections that are applied to the nodes of the finite element mesh of selected structural examples. The stochastic approach uses the Karhunen-Loève expansion for the random field discretization. In this approach, the so-called correlation length l_c controls the random field in a powerful way. This parameter has a major influence on the buckling shape and also on the stability load. First, the impact of the correlation length is studied for simple structures. Second, since most structures for engineering devices are more complex and combined structures, these are intensively discussed with a focus on constrained random fields for, e.g., flange-web intersections. Specific constraints for those random fields are pointed out with regard to the finite element model. Further, geometrical imperfections vanish where the structure is supported.
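
    The Karhunen-Loève discretization mentioned above can be sketched for a 1-D Gaussian field; the exponential covariance, grid, and truncation level are illustrative assumptions, not the paper's shell models:

```python
# Sketch of a truncated Karhunen-Loeve (KL) expansion of a 1-D Gaussian
# random field with exponential covariance exp(-|x - x'| / l_c), where l_c is
# the correlation length that controls the field.
import numpy as np

def kl_random_field(x, l_c, n_terms, rng):
    # covariance matrix C_ij = exp(-|x_i - x_j| / l_c)
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / l_c)
    lam, phi = np.linalg.eigh(C)            # eigenpairs, ascending order
    lam, phi = lam[::-1], phi[:, ::-1]      # largest eigenvalues first
    xi = rng.standard_normal(n_terms)       # independent N(0,1) coefficients
    # truncated KL expansion: sum_k sqrt(lam_k) * xi_k * phi_k(x)
    return (phi[:, :n_terms] * np.sqrt(lam[:n_terms])) @ xi

x = np.linspace(0.0, 1.0, 200)
field = kl_random_field(x, l_c=0.2, n_terms=20, rng=np.random.default_rng(0))
```

    A larger l_c concentrates the spectrum in fewer modes, which is why the correlation length so strongly shapes the imperfection patterns; in the paper each realization would be imposed on the finite element node coordinates, with constrained values at intersections and supports.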

  19. Random forest regression modelling for forest aboveground biomass estimation using RISAT-1 PolSAR and terrestrial LiDAR data

    NASA Astrophysics Data System (ADS)

    Mangla, Rohit; Kumar, Shashi; Nandy, Subrata

    2016-05-01

SAR and LiDAR remote sensing have already shown the potential of active sensors for forest parameter retrieval. A SAR sensor in its fully polarimetric mode has the advantage of retrieving the scattering properties of the different components of the forest structure, and LiDAR has the capability to measure structural information with very high accuracy. This study focused on retrieval of forest aboveground biomass (AGB) using Terrestrial Laser Scanner (TLS) based point clouds and the scattering properties of forest vegetation obtained from decomposition modelling of RISAT-1 fully polarimetric SAR data. TLS data were acquired for 14 plots of the Timli forest range, Uttarakhand, India. The forest area is dominated by Sal trees, and random sampling with a plot size of 0.1 ha (31.62 m × 31.62 m) was adopted for TLS and field data collection. RISAT-1 data were processed to retrieve SAR-based variables, and TLS point-cloud-based 3D imaging was done to retrieve LiDAR-based variables. Surface scattering, double-bounce scattering, volume scattering, helix and wire scattering were the SAR-based variables retrieved from polarimetric decomposition. Tree heights and stem diameters were used as LiDAR-based variables, retrieved from single-tree vertical height and least-squares circle fit methods, respectively. All the variables obtained for the forest plots were used as input to a machine-learning-based Random Forest Regression Model, which was developed in this study for forest AGB estimation. Modelled output for forest AGB showed reliable accuracy (RMSE = 27.68 t/ha), and a good coefficient of determination (0.63) was obtained through linear regression between modelled AGB and field-estimated AGB. The sensitivity analysis showed that the model was most sensitive to the major contributing variables (stem diameter and volume scattering), and these variables were measured with two different remote sensing techniques. 
This study strongly recommends the integration of SAR and LiDAR data for forest AGB estimation.
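
    The random-forest regression step can be illustrated with a toy bootstrap-aggregated stump model; this is a didactic stand-in, not the study's model, and the synthetic predictors merely stand in for the SAR/LiDAR variables listed above:

```python
# Toy random forest regression: each tree is a depth-1 regression stump fit
# on a bootstrap sample of the rows and a random subset of predictors;
# predictions are averaged across trees.
import numpy as np

def fit_stump(X, y, feat_ids):
    best = (np.inf, 0, 0.0, y.mean(), y.mean())
    for j in feat_ids:
        for t in np.percentile(X[:, j], [25, 50, 75]):  # candidate thresholds
            mask = X[:, j] <= t
            left, right = y[mask], y[~mask]
            if len(left) == 0 or len(right) == 0:
                continue
            sse = ((left - left.mean()) ** 2).sum() \
                + ((right - right.mean()) ** 2).sum()
            if sse < best[0]:
                best = (sse, j, t, left.mean(), right.mean())
    return best[1:]   # (feature, threshold, left-leaf value, right-leaf value)

def random_forest(X, y, n_trees=50, seed=0):
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        rows = rng.integers(0, len(X), len(X))            # bootstrap sample
        feats = rng.choice(X.shape[1], size=2, replace=False)  # feature subset
        trees.append(fit_stump(X[rows], y[rows], feats))
    def predict(Xq):
        preds = [np.where(Xq[:, j] <= t, lo, hi) for j, t, lo, hi in trees]
        return np.mean(preds, axis=0)                     # bagging average
    return predict
```

    Sensitivity analysis of the kind reported above amounts to perturbing one predictor at a time and observing the change in the averaged prediction.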

  20. The effects of Internet or interactive computer-based patient education in the field of breast cancer: a systematic literature review.

    PubMed

    Ryhänen, Anne M; Siekkinen, Mervi; Rankinen, Sirkku; Korvenranta, Heikki; Leino-Kilpi, Helena

    2010-04-01

The aim of this systematic review was to analyze what kind of Internet or interactive computer-based patient education programs have been developed and to analyze the effectiveness of these programs in the field of breast cancer patient education. Patient education for breast cancer patients is an important intervention to empower the patient. However, we know very little about the effects and potential of Internet-based patient education in the empowerment of breast cancer patients. Complete databases were searched covering the period from the beginning of each database to November 2008. Studies were included if they concerned patient education for breast cancer patients with Internet or interactive computer programs and were based on randomized controlled trials, clinical trials or quasi-experimental studies. We identified 14 articles involving 2374 participants. The design was a randomized controlled trial in nine papers, a clinical trial in two and quasi-experimental in three. Seven of the studies randomized participants to experimental and control groups, in two papers participants were grouped by ethnic and racial differences and by mode of Internet use, and three studies measured the same group with pre- and post-tests after using a computer program. The interventions used were described as interactive computer or multimedia programs and use of the Internet. The methodological solutions of the studies varied. The effects of the studies were diverse except for knowledge-related issues. Internet or interactive computer-based patient education programs in the care of breast cancer patients may have a positive effect in increasing breast cancer knowledge. The results suggest a positive relationship between use of Internet or computer-based patient education programs and the knowledge level of patients with breast cancer, but a diverse relationship between patients' participation and other outcome measures. 
There is need to develop and research more Internet-based patient education. 2009 Elsevier Ireland Ltd. All rights reserved.

  1. Detection and inpainting of facial wrinkles using texture orientation fields and Markov random field modeling.

    PubMed

    Batool, Nazre; Chellappa, Rama

    2014-09-01

Facial retouching is widely used in the media and entertainment industry. Professional software usually requires a minimum level of user expertise to achieve desirable results. In this paper, we present an algorithm to detect facial wrinkles/imperfections. We believe that any such algorithm would be amenable to facial retouching applications. The detection of wrinkles/imperfections can allow these skin features to be processed differently than the surrounding skin without much user interaction. For detection, Gabor filter responses along with a texture orientation field are used as image features. A bimodal Gaussian mixture model (GMM) represents the distributions of Gabor features of normal skin versus skin imperfections. Then, a Markov random field model is used to incorporate the spatial relationships among neighboring pixels for their GMM distributions and texture orientations. An expectation-maximization algorithm then classifies skin versus skin wrinkles/imperfections. Once detected automatically, wrinkles/imperfections are removed completely instead of being blended or blurred. We propose an exemplar-based constrained texture synthesis algorithm to inpaint the irregularly shaped gaps left by the removal of detected wrinkles/imperfections. We present results on images downloaded from the Internet to show the efficacy of our algorithms.
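
    The bimodal GMM step above can be sketched with a plain EM loop; this 1-D stand-in for the Gabor feature vectors is an illustrative simplification (the paper's features are multi-dimensional, and the MRF spatial coupling between neighboring pixels is omitted):

```python
# Sketch of fitting a two-component (bimodal) Gaussian mixture by
# expectation-maximization, separating "normal skin" from "imperfection"
# feature values in 1-D.
import numpy as np

def fit_bimodal_gmm(x, n_iter=100):
    mu = np.array([x.min(), x.max()])        # crude but effective init
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each sample
        d = (x[:, None] - mu) ** 2
        p = pi * np.exp(-d / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        n_k = r.sum(axis=0)
        pi = n_k / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n_k
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k
    return pi, mu, var
```

    In the full method, the per-pixel responsibilities would become the unary terms of the MRF, with the pairwise terms enforcing spatial smoothness.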

  2. Analysis of dependent scattering mechanism in hard-sphere Yukawa random media

    NASA Astrophysics Data System (ADS)

    Wang, B. X.; Zhao, C. Y.

    2018-06-01

    The structural correlations in the microscopic structures of random media can induce the dependent scattering mechanism and thus influence the optical scattering properties. Based on our recent theory on the dependent scattering mechanism in random media composed of discrete dipolar scatterers [B. X. Wang and C. Y. Zhao, Phys. Rev. A 97, 023836 (2018)], in this paper, we study the hard-sphere Yukawa random media, in order to further elucidate the role of structural correlations in the dependent scattering mechanism and hence optical scattering properties. Here, we consider charged colloidal suspensions, whose effective pair interaction between colloids is described by a screened Coulomb (Yukawa) potential. By means of adding salt ions, the pair interaction between the charged particles can be flexibly tailored and therefore the structural correlations are modified. It is shown that this strategy can affect the optical properties significantly. For colloidal TiO2 suspensions, the modification of electric and magnetic dipole excitations induced by the structural correlations can substantially influence the optical scattering properties, in addition to the far-field interference effect described by the structure factor. However, this modification is only slightly altered by different salt concentrations and is mainly because of the packing-density-dependent screening effect. On the other hand, for low refractive index colloidal polystyrene suspensions, the dependent scattering mechanism mainly involves the far-field interference effect, and the effective exciting field amplitude for the electric dipole almost remains unchanged under different structural correlations. The present study has profound implications for understanding the role of structural correlations in the dependent scattering mechanism.

  3. Delivering successful randomized controlled trials in surgery: Methods to optimize collaboration and study design.

    PubMed

    Blencowe, Natalie S; Cook, Jonathan A; Pinkney, Thomas; Rogers, Chris; Reeves, Barnaby C; Blazeby, Jane M

    2017-04-01

    Randomized controlled trials in surgery are notoriously difficult to design and conduct due to numerous methodological and cultural challenges. Over the last 5 years, several UK-based surgical trial-related initiatives have been funded to address these issues. These include the development of Surgical Trials Centers and Surgical Specialty Leads (individual surgeons responsible for championing randomized controlled trials in their specialist fields), both funded by the Royal College of Surgeons of England; networks of research-active surgeons in training; and investment in methodological research relating to surgical randomized controlled trials (to address issues such as recruitment, blinding, and the selection and standardization of interventions). This article discusses these initiatives more in detail and provides exemplar cases to illustrate how the methodological challenges have been tackled. The initiatives have surpassed expectations, resulting in a renaissance in surgical research throughout the United Kingdom, such that the number of patients entering surgical randomized controlled trials has doubled.

  4. Random isotropic one-dimensional XY-model

    NASA Astrophysics Data System (ADS)

    Gonçalves, L. L.; Vieira, A. P.

    1998-01-01

The 1D isotropic s = 1/2 XY-model (N sites), with random exchange interaction in a transverse random field, is considered. The random variables satisfy bimodal quenched distributions. The solution is obtained by using the Jordan-Wigner fermionization and a canonical transformation, reducing the problem to diagonalizing an N × N matrix, corresponding to a system of N noninteracting fermions. The calculations are performed numerically for N = 1000, and the field-induced magnetization at T = 0 is obtained by averaging the results for the different samples. For the dilute case, in the uniform field limit, the magnetization exhibits various discontinuities, which are the consequence of the existence of disconnected finite clusters distributed along the chain. Also in this limit, for finite exchange constants J_A and J_B, as the probability of J_A varies from one to zero, the saturation field is seen to vary from Γ_A to Γ_B, where Γ_A (Γ_B) is the value of the saturation field for the pure case with exchange constant equal to J_A (J_B).
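
    The reduction described above can be sketched numerically: after Jordan-Wigner, the isotropic XY chain maps to free fermions whose spectrum comes from diagonalizing an N × N matrix. The open-chain convention below (fields h_i on the diagonal, J_i/2 on the off-diagonals), the bimodal values, and N = 200 are illustrative assumptions, not necessarily the paper's exact setup:

```python
# Sketch of the free-fermion reduction of the random isotropic XY chain:
# build the tridiagonal single-particle matrix for one disorder sample and
# diagonalize it.
import numpy as np

def xy_chain_spectrum(J, h):
    N = len(h)
    M = np.diag(h.astype(float))
    idx = np.arange(N - 1)
    M[idx, idx + 1] = M[idx + 1, idx] = J / 2.0   # hopping from the XY exchange
    return np.linalg.eigvalsh(M)                   # single-fermion energies

rng = np.random.default_rng(0)
N = 200
J = rng.choice([0.5, 1.5], size=N - 1)   # bimodal quenched exchange (J_A, J_B)
h = rng.choice([0.2, 1.0], size=N)       # bimodal quenched transverse field
energies = xy_chain_spectrum(J, h)
```

    Averaging T = 0 observables over many such disorder samples is the Monte Carlo step the abstract refers to; each sample costs only one dense (here) or tridiagonal eigensolve.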

  5. Micromechanical analysis of composites with fibers distributed randomly over the transverse cross-section

    NASA Astrophysics Data System (ADS)

    Weng, Jingmeng; Wen, Weidong; Cui, Haitao; Chen, Bo

    2018-06-01

A new method to generate a random distribution of fibers in the transverse cross-section of fiber reinforced composites with a high fiber volume fraction is presented in this paper. Based on microscopy observation of the transverse cross-sections of unidirectional composite laminates, a hexagonal arrangement is set as the initial configuration, and each fiber is assigned an arbitrary initial velocity in an arbitrary direction; the micro-scale representative volume element (RVE) is then established by simulating perfectly elastic collisions. Combined with the proposed periodic boundary conditions, which are suitable for multi-axial loading, the effective elastic properties of composite materials can be predicted. The predicted properties show reasonable agreement with experimental results. Comparing the stress fields of an RVE with randomly distributed fibers and an RVE with periodically distributed fibers shows that the predicted elastic modulus of the former is greater than that of the latter.
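
    For contrast with the collision-driven packing above, a minimal random sequential addition (RSA) sketch places fiber centers one by one with a non-overlap check. RSA jams well below the volume fractions the collision approach reaches, which is precisely the motivation for the paper's method; the cell size, radius, and count here are illustrative:

```python
# Random sequential addition (RSA) of non-overlapping fiber cross-sections
# (disks) in a square cell -- a simple baseline packing generator, NOT the
# paper's elastic-collision method.
import math, random

def rsa_fibers(cell, radius, target_count, rng, max_tries=100000):
    centers = []
    for _ in range(max_tries):
        if len(centers) == target_count:
            break
        x = rng.uniform(radius, cell - radius)
        y = rng.uniform(radius, cell - radius)
        # accept only if the new fiber overlaps no existing one
        if all(math.hypot(x - cx, y - cy) >= 2 * radius for cx, cy in centers):
            centers.append((x, y))
    return centers

rng = random.Random(0)
fibers = rsa_fibers(cell=1.0, radius=0.05, target_count=30, rng=rng)
```

    At the fiber volume fractions typical of structural composites (60% and above), the acceptance rate of RSA collapses, whereas letting an initially regular arrangement evolve through elastic collisions preserves the packing density while randomizing positions.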

  6. Exact PDF equations and closure approximations for advective-reactive transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venturi, D.; Tartakovsky, Daniel M.; Tartakovsky, Alexandre M.

    2013-06-01

Mathematical models of advection–reaction phenomena rely on advective flow velocities and (bio)chemical reaction rates that are notoriously random. By using functional integral methods, we derive exact evolution equations for the probability density function (PDF) of the state variables of the advection–reaction system in the presence of random transport velocity and random reaction rates with rather arbitrary distributions. These PDF equations are solved analytically for transport with deterministic flow velocity and a linear reaction rate represented mathematically by a heterogeneous and strongly-correlated random field. Our analytical solution is then used to investigate the accuracy and robustness of the recently proposed large-eddy diffusivity (LED) closure approximation [1]. We find that the solution to the LED-based PDF equation, which is exact for uncorrelated reaction rates, is accurate even in the presence of strong correlations, and it provides an upper bound of predictive uncertainty.

  7. Investigating the Contextual Interference Effect Using Combination Sports Skills in Open and Closed Skill Environments

    PubMed Central

    Cheong, Jadeera P.G.; Lay, Brendan; Razman, Rizal

    2016-01-01

This study attempted to present conditions that were closer to the real-world setting of team sports. The primary purpose was to examine the effects of blocked, random and game-based training practice schedules on the learning of the field hockey trap, close dribble and push pass that were practiced in combination. The secondary purpose was to investigate the effects of predictability of the environment on the learning of field hockey sport skills according to different practice schedules. A game-based training protocol represented a form of random practice in an unstable environment and was compared against a blocked and a traditional random practice schedule. In general, all groups improved dribble and push accuracy performance during the acquisition phase when assessed in a closed environment. In the retention phase, there were no differences between the three groups. When assessed in an open skills environment, all groups improved their percentage of successful executions for trapping and passing execution, and improved total number of attempts and total number of successful executions for both dribbling and shooting execution. Between-group differences were detected for dribbling execution, with the game-based group scoring a higher number of dribbling successes. The CI effect did not emerge when practicing and assessing multiple sport skills in a closed skill environment, even when the skills were practiced in combination. However, when skill assessment was conducted in a real-world situation, there appeared to be some support for the CI effect. Key points: (1) The contextual interference effect was not supported when several skills were practiced in combination and assessed in a closed skill environment. (2) There appeared to be some support for the contextual interference effect when sports skills were assessed in an open skill environment, similar to a real game situation. (3) A game-based training schedule can be used as an alternative practice schedule, as it displayed superior learning compared to a blocked practice schedule when assessed by the game performance test (real-world setting); it also matched the blocked and random practice schedules in the other tests. PMID:26957940

  8. Investigating the Contextual Interference Effect Using Combination Sports Skills in Open and Closed Skill Environments.

    PubMed

    Cheong, Jadeera P G; Lay, Brendan; Razman, Rizal

    2016-03-01

    This study attempted to present conditions that were closer to the real-world setting of team sports. The primary purpose was to examine the effects of blocked, random and game-based training practice schedules on the learning of the field hockey trap, close dribble and push pass that were practiced in combination. The secondary purpose was to investigate the effects of predictability of the environment on the learning of field hockey sport skills according to different practice schedules. A game-based training protocol represented a form of random practice in an unstable environment and was compared against a blocked and a traditional random practice schedule. In general, all groups improved dribble and push accuracy performance during the acquisition phase when assessed in a closed environment. In the retention phase, there were no differences between the three groups. When assessed in an open skills environment, all groups improved their percentage of successful executions for trapping and passing execution, and improved total number of attempts and total number of successful executions for both dribbling and shooting execution. Between-group differences were detected for dribbling execution with the game-based group scoring a higher number of dribbling successes. The CI effect did not emerge when practicing and assessing multiple sport skills in a closed skill environment, even when the skills were practiced in combination. However, when skill assessment was conducted in a real-world situation, there appeared to be some support for the CI effect. 
Key points: The contextual interference effect was not supported when practicing several skills in combination when the sports skills were assessed in a closed skill environment. There appeared to be some support for the contextual interference effect when sports skills were assessed in an open skill environment, which was similar to a real game situation. A game-based training schedule can be used as an alternative practice schedule, as it displayed superior learning compared to a blocked practice schedule when assessed by the game performance test (real-world setting). The game-based training schedule also matched the blocked and random practice schedules in the other tests.

  9. Microseismic response characteristics modeling and locating of underground water supply pipe leak

    NASA Astrophysics Data System (ADS)

    Wang, J.; Liu, J.

    2015-12-01

In traditional methods of pipeline leak location, geophones must be located on the pipe wall. If the exact location of the pipeline is unknown, the leaks cannot be identified accurately. To solve this problem, taking into account the characteristics of pipeline leaks, we propose a continuous random seismic source model and construct geological models to investigate the proposed method for locating underground pipeline leaks. Based on two-dimensional (2D) viscoacoustic equations and the staggered-grid finite-difference (FD) algorithm, the microseismic wave field generated by a leaking pipe is modeled. Cross-correlation analysis and the simulated annealing (SA) algorithm were utilized to obtain the time difference and the leak location. We also analyze and discuss the effects of the number of recorded traces, the survey layout, and the offset and interval of the traces on the accuracy of the estimated location. The preliminary results of the simulation and the field data experiment indicate that (1) a continuous random source can realistically represent the leak microseismic wave field in a simulation using 2D viscoacoustic equations and a staggered-grid FD algorithm. (2) The cross-correlation method is effective for calculating the time difference of the direct wave relative to the reference trace. However, outside the refraction blind zone, the accuracy of the time difference is reduced by the effects of the refracted wave. (3) The time-difference acquisition method based on microseismic theory and the SA algorithm has great potential for locating leaks from underground pipelines using an array located on the ground surface. Keywords: viscoacoustic finite-difference simulation; continuous random source; simulated annealing algorithm; pipeline leak location
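The cross-correlation step described in this abstract can be sketched in a few lines: the lag that maximizes the cross-correlation between a reference trace and a second trace estimates their arrival-time difference. This is a minimal illustration of the general technique, not the paper's code; the sampling rate, delay, and white-noise source model are assumptions.

```python
import numpy as np

# Hypothetical illustration: estimate the arrival-time difference of a leak
# signal between a reference trace (A) and a second geophone trace (B).
fs = 1000.0                          # sampling rate in Hz (assumed)
true_lag = 25                        # delay of B relative to A, in samples
rng = np.random.default_rng(0)
source = rng.standard_normal(2000)   # continuous random source model (assumed)
trace_a = source
trace_b = np.roll(source, true_lag)  # same signal arriving later at B

# Full cross-correlation; the index of its maximum gives the time difference.
xcorr = np.correlate(trace_b, trace_a, mode="full")
lag_samples = np.argmax(xcorr) - (len(trace_a) - 1)
time_diff = lag_samples / fs
```

With a simulated annealing step over candidate leak positions, such pairwise time differences can then be matched against travel-time predictions, as the abstract describes.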

  10. How many photons are needed to reconstruct random objects in coherent X-ray diffractive imaging?

    PubMed

    Jahn, T; Wilke, R N; Chushkin, Y; Salditt, T

    2017-01-01

This paper presents an investigation of the reconstructibility of diffraction patterns in coherent X-ray diffractive imaging for a class of binary random `bitmap' objects. Combining analytical results and numerical simulations, the critical fluence per bitmap pixel is determined for arbitrary contrast values (absorption level and phase shift), both in the optical near-field and in the far-field. This work extends previous investigations based on information theory, enabling a comparison of the amount of information carried by single photons in different diffraction regimes. The experimental results show order-of-magnitude agreement.

  11. MAGNETIC FIELD LINE RANDOM WALK FOR DISTURBED FLUX SURFACES: TRAPPING EFFECTS AND MULTIPLE ROUTES TO BOHM DIFFUSION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghilea, M. C.; Ruffolo, D.; Sonsrettee, W.

    2011-11-01

The magnetic field line random walk (FLRW) is important for the transport of energetic particles in many astrophysical situations. While all authors agree on the quasilinear diffusion of field lines for fluctuations that mainly vary parallel to a large-scale field, for the opposite case of fluctuations that mainly vary in the perpendicular directions, there has been an apparent conflict between concepts of Bohm diffusion and percolation/trapping effects. Here computer simulation and non-perturbative analytic techniques are used to re-examine the FLRW in magnetic turbulence with slab and two-dimensional (2D) components, in which 2D flux surfaces are disturbed by the slab fluctuations. Previous non-perturbative theories for D⊥, based on Corrsin's hypothesis, have identified a slab contribution with quasilinear behavior and a 2D contribution due to Bohm diffusion with diffusive decorrelation (DD), combined in a quadratic formula. Here we present analytic theories for other routes to Bohm diffusion, with random ballistic decorrelation (RBD) either due to the 2D component itself (for a weak slab contribution) or the total fluctuation field (for a strong slab contribution), combined in a direct sum with the slab contribution. Computer simulations confirm the applicability of the RBD routes for weak or strong slab contributions, while the DD route applies for a moderate slab contribution. For a very low slab contribution, interesting trapping effects are found, including a depressed diffusion coefficient and subdiffusive behavior. Thus quasilinear, Bohm, and trapping behaviors are all found in the same system, together with an overall viewpoint to explain these behaviors.

  12. Mapping Sub-Saharan African Agriculture in High-Resolution Satellite Imagery with Computer Vision & Machine Learning

    NASA Astrophysics Data System (ADS)

    Debats, Stephanie Renee

Smallholder farms dominate in many parts of the world, including Sub-Saharan Africa. These systems are characterized by small, heterogeneous, and often indistinct field patterns, requiring a specialized methodology to map agricultural landcover. In this thesis, we developed a benchmark labeled data set of high-resolution satellite imagery of agricultural fields in South Africa. We presented a new approach to mapping agricultural fields, based on efficient extraction of a vast set of simple, highly correlated, and interdependent features, followed by a random forest classifier. The algorithm achieved similarly high performance across agricultural types, including spectrally indistinct smallholder fields, and demonstrated the ability to generalize across large geographic areas. In sensitivity analyses, we determined that multi-temporal images provided greater performance gains than the addition of multi-spectral bands. We also demonstrated how active learning can be incorporated into the algorithm to create smaller, more efficient training data sets, which reduced computational resources, minimized the need for humans to hand-label data, and boosted performance. We designed a patch-based uncertainty metric to drive the active learning framework, based on the regular grid of a crowdsourcing platform, and demonstrated how subject matter experts can be replaced with fleets of crowdsourcing workers. Our active learning algorithm achieved performance similar to that of an algorithm trained with randomly selected data, but with 62% fewer data samples. This thesis furthers the goal of providing accurate agricultural landcover maps at a scale that is relevant for the dominant smallholder class. Accurate maps are crucial for monitoring and promoting agricultural production.
Furthermore, improved agricultural landcover maps will aid a host of other applications, including landcover change assessments, cadastral surveys to strengthen smallholder land rights, and constraints for crop modeling and famine prediction.
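The patch-based uncertainty idea behind the active learning framework can be sketched as follows: given per-pixel class probabilities from some classifier, each grid patch is scored by how close its mean probability is to 0.5, and the most uncertain patches are selected for labeling first. The patch size, scoring rule, and simulated probabilities below are illustrative assumptions, not the thesis's actual metric.

```python
import numpy as np

# Minimal sketch of patch-based uncertainty sampling: score each patch of a
# probability map by closeness to 0.5 and pick the most uncertain patches.
rng = np.random.default_rng(1)
probs = rng.random((64, 64))          # stand-in for classifier "field" probabilities
patch = 16                            # crowdsourcing-grid patch size (assumed)

n = probs.shape[0] // patch
patch_means = probs.reshape(n, patch, n, patch).mean(axis=(1, 3))
uncertainty = 1.0 - 2.0 * np.abs(patch_means - 0.5)   # 1 = most uncertain

# Indices (into the flattened patch grid) of the 4 patches to label first.
order = np.argsort(uncertainty.ravel())[::-1]
most_uncertain = order[:4]
```

In an active learning loop, the newly labeled patches would be added to the training set and the classifier retrained, shrinking the amount of hand-labeled data needed.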

  13. The non-equilibrium allele frequency spectrum in a Poisson random field framework.

    PubMed

    Kaj, Ingemar; Mugal, Carina F

    2016-10-01

    In population genetic studies, the allele frequency spectrum (AFS) efficiently summarizes genome-wide polymorphism data and shapes a variety of allele frequency-based summary statistics. While existing theory typically features equilibrium conditions, emerging methodology requires an analytical understanding of the build-up of the allele frequencies over time. In this work, we use the framework of Poisson random fields to derive new representations of the non-equilibrium AFS for the case of a Wright-Fisher population model with selection. In our approach, the AFS is a scaling-limit of the expectation of a Poisson stochastic integral and the representation of the non-equilibrium AFS arises in terms of a fixation time probability distribution. The known duality between the Wright-Fisher diffusion process and a birth and death process generalizing Kingman's coalescent yields an additional representation. The results carry over to the setting of a random sample drawn from the population and provide the non-equilibrium behavior of sample statistics. Our findings are consistent with and extend a previous approach where the non-equilibrium AFS solves a partial differential forward equation with a non-traditional boundary condition. Moreover, we provide a bridge to previous coalescent-based work, and hence tie several frameworks together. Since frequency-based summary statistics are widely used in population genetics, for example, to identify candidate loci of adaptive evolution, to infer the demographic history of a population, or to improve our understanding of the underlying mechanics of speciation events, the presented results are potentially useful for a broad range of topics. Copyright © 2016 Elsevier Inc. All rights reserved.
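The equilibrium Poisson random field AFS that this work generalizes (the classical Sawyer–Hartl result) can be evaluated numerically: the expected number of sites at derived-allele count i in a sample of size n is the stationary frequency density integrated against a binomial sampling kernel. The parameter values below are illustrative assumptions.

```python
import numpy as np
from math import comb

# Equilibrium Poisson-random-field AFS (Sawyer-Hartl) that the paper's
# non-equilibrium representations extend. theta, gamma, n are illustrative.
theta, gamma, n = 1.0, 2.0, 10   # mutation rate, scaled selection, sample size

def density(q, gamma):
    """Stationary density of derived-allele frequency q under selection.
    As gamma -> 0 this reduces to the neutral density 1/q."""
    return (1.0 - np.exp(-2.0 * gamma * (1.0 - q))) / (
        (1.0 - np.exp(-2.0 * gamma)) * q * (1.0 - q))

def trapezoid(y, x):
    """Trapezoid-rule integral (kept explicit for portability)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# E[xi_i] = theta * int_0^1 C(n,i) q^i (1-q)^(n-i) f(q) dq, for i = 1..n-1.
q = np.linspace(1e-6, 1.0 - 1e-6, 20001)
f = density(q, gamma)
afs = np.array([
    theta * trapezoid(comb(n, i) * q**i * (1.0 - q)**(n - i) * f, q)
    for i in range(1, n)
])
```

For gamma near zero this recovers the familiar neutral expectation theta/i; positive gamma inflates the high-frequency classes, which is the baseline against which the paper's non-equilibrium build-up is measured.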

  14. Filling of Cloud-Induced Gaps for Land Use and Land Cover Classifications Around Refugee Camps

    NASA Astrophysics Data System (ADS)

    Braun, Andreas; Hagensieker, Ron; Hochschild, Volker

    2016-08-01

Cloud cover is one of the main constraints in the field of optical remote sensing. The use of multispectral imagery in particular is affected by either fully obscured data or parts of the image that remain unusable. This study compares four algorithms for the filling of cloud-induced gaps in classified land cover products, based on Markov Random Field (MRF), Random Forest (RF), and Closest Spectral Fit (CSF) operators. They are tested on a classified Sentinel-2 image in which artificial clouds are filled using information derived from a Sentinel-1 scene. The approaches rely on different mathematical principles and therefore produce results varying in both pattern and quality. Overall accuracies for the filled areas range from 57 to 64%. The best results are achieved by CSF; however, some classes (e.g. sands and grassland) remain critical across all approaches.
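The closest-spectral-fit idea can be illustrated in a few lines: each cloud-covered pixel inherits the land-cover class of the cloud-free pixel whose co-registered SAR value is nearest. The data below are simulated stand-ins and the 1-D layout is a simplification of the actual raster processing.

```python
import numpy as np

# Hedged sketch of the closest-spectral-fit (CSF) gap-filling idea:
# a cloudy pixel takes the class of the clear pixel with the nearest
# SAR (Sentinel-1) backscatter value.
rng = np.random.default_rng(2)
sar = rng.normal(size=100)            # stand-in SAR backscatter per pixel
classes = (sar > 0).astype(int)       # "true" classes tied to backscatter
cloud = np.zeros(100, dtype=bool)
cloud[:20] = True                     # first 20 pixels obscured by cloud

clear_idx = np.flatnonzero(~cloud)
filled = classes.copy()
for i in np.flatnonzero(cloud):
    nearest = clear_idx[np.argmin(np.abs(sar[clear_idx] - sar[i]))]
    filled[i] = classes[nearest]
```

Because the fill decision is purely value-based, CSF ignores spatial context, which is one reason classes with overlapping backscatter (such as the sands and grassland noted above) remain difficult.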

  15. Efficient prediction designs for random fields.

    PubMed

    Müller, Werner G; Pronzato, Luc; Rendas, Joao; Waldl, Helmut

    2015-03-01

For estimation and prediction of random fields, it is increasingly acknowledged that the kriging variance may be a poor representative of the true uncertainty. Experimental designs based on more elaborate criteria that are appropriate for empirical kriging (EK) are then often non-space-filling and very costly to determine. In this paper, we investigate the possibility of using a compound criterion, inspired by an equivalence-theorem type of relation, to build designs that are quasi-optimal for the EK variance when space-filling designs become unsuitable. Two algorithms are proposed: one relies on stochastic optimization to explicitly identify the Pareto front, whereas the second uses the surrogate criteria as a local heuristic to choose the points at which the (costly) true EK variance is effectively computed. We illustrate the performance of the presented algorithms on both a simple simulated example and a real oceanographic dataset. © 2014 The Authors. Applied Stochastic Models in Business and Industry published by John Wiley & Sons, Ltd.
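For reference, the plug-in kriging variance that the abstract argues can understate true uncertainty has a simple closed form for a zero-mean field with known covariance: var(x0) = C(0) − kᵀK⁻¹k, where K is the data covariance matrix and k the covariances between data sites and the prediction location. The covariance model and design points below are assumptions for illustration.

```python
import numpy as np

# Plug-in (simple) kriging variance at a prediction point, for an assumed
# isotropic exponential covariance model on [0, 1].
def cov(h, sill=1.0, corr_range=0.5):
    """Exponential covariance C(h) = sill * exp(-|h| / range) (assumed model)."""
    return sill * np.exp(-np.abs(h) / corr_range)

sites = np.array([0.0, 0.3, 0.7, 1.0])    # design points (assumed)
x0 = 0.5                                  # prediction location

K = cov(sites[:, None] - sites[None, :])  # data covariance matrix
k = cov(sites - x0)                       # data-to-prediction covariances
krig_var = cov(0.0) - k @ np.linalg.solve(K, k)
```

Empirical kriging replaces the known covariance parameters with estimates, and the EK variance additionally accounts for that estimation uncertainty, which is what makes EK-optimal designs costly to compute.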

  16. CMOS integration of high-k/metal gate transistors in diffusion and gate replacement (D&GR) scheme for dynamic random access memory peripheral circuits

    NASA Astrophysics Data System (ADS)

    Dentoni Litta, Eugenio; Ritzenthaler, Romain; Schram, Tom; Spessot, Alessio; O’Sullivan, Barry; Machkaoutsan, Vladimir; Fazan, Pierre; Ji, Yunhyuck; Mannaert, Geert; Lorant, Christophe; Sebaai, Farid; Thiam, Arame; Ercken, Monique; Demuynck, Steven; Horiguchi, Naoto

    2018-04-01

    Integration of high-k/metal gate stacks in peripheral transistors is a major candidate to ensure continued scaling of dynamic random access memory (DRAM) technology. In this paper, the CMOS integration of diffusion and gate replacement (D&GR) high-k/metal gate stacks is investigated, evaluating four different approaches for the critical patterning step of removing the N-type field effect transistor (NFET) effective work function (eWF) shifter stack from the P-type field effect transistor (PFET) area. The effect of plasma exposure during the patterning step is investigated in detail and found to have a strong impact on threshold voltage tunability. A CMOS integration scheme based on an experimental wet-compatible photoresist is developed and the fulfillment of the main device metrics [equivalent oxide thickness (EOT), eWF, gate leakage current density, on/off currents, short channel control] is demonstrated.

  17. Macro-/Micro-Controlled 3D Lithium-Ion Batteries via Additive Manufacturing and Electric Field Processing.

    PubMed

    Li, Jie; Liang, Xinhua; Liou, Frank; Park, Jonghyun

    2018-01-30

This paper presents a new concept for making battery electrodes that can simultaneously control macro-/micro-structures, helping to address current energy storage technology gaps and future energy storage requirements. Modern batteries are fabricated in the form of laminated structures composed of randomly mixed constituent materials. The randomness of conventional methods opens the possibility of developing breakthrough processing techniques that build well-organized structures to improve battery performance. In the proposed processing, an electric field (EF) controls the microstructures of manganese-based electrodes, while additive manufacturing controls macro-3D structures and the integration of both scales. The synergistic control of micro-/macro-structures is a novel concept in energy material processing that has considerable potential for providing unprecedented control of electrode structures, thereby enhancing performance. Electrochemical tests have shown that these new electrodes exhibit superior performance in specific capacity, areal capacity, and cycle life.

  18. Assessing the significance of global and local correlations under spatial autocorrelation: a nonparametric approach.

    PubMed

    Viladomat, Júlia; Mazumder, Rahul; McInturff, Alex; McCauley, Douglas J; Hastie, Trevor

    2014-06-01

We propose a method to test the correlation of two random fields when they are both spatially autocorrelated. In this scenario, the assumption of independence for the pairs of observations in the standard test does not hold, and as a result the test rejects in many cases where there is no effect (the precision of the null distribution is overestimated). Our method recovers the null distribution by taking the autocorrelation into account. It uses Monte Carlo methods: one of the variables is permuted and then smoothed and scaled to destroy its correlation with the other while maintaining its initial autocorrelation. With this simulation model, any test based on the independence of two (or more) random fields can be constructed. This research was motivated by a project in biodiversity and conservation in the Biology Department at Stanford University. © 2014, The International Biometric Society.
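The permute–smooth–scale recipe described above can be sketched directly: permuting one field destroys its correlation with the other, re-smoothing restores spatial autocorrelation, and rescaling matches the original variance; repeating this builds a null distribution for the correlation. The moving-average smoother and field construction below are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

# Monte Carlo null for the correlation of two autocorrelated 1-D fields.
rng = np.random.default_rng(3)
kernel = np.ones(9) / 9.0                   # simple moving-average smoother

def smooth(x):
    return np.convolve(x, kernel, mode="same")

x = smooth(rng.standard_normal(500))        # two autocorrelated fields
y = smooth(rng.standard_normal(500))
r_obs = np.corrcoef(x, y)[0, 1]

null = []
for _ in range(200):
    z = smooth(rng.permutation(y))          # permute, then re-smooth
    z = z * (y.std() / z.std())             # rescale to match the variance
    null.append(np.corrcoef(x, z)[0, 1])
p_value = np.mean(np.abs(null) >= abs(r_obs))
```

Comparing r_obs against this autocorrelation-aware null avoids the inflated rejection rate of the standard independence test.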

  19. High energy X-ray phase and dark-field imaging using a random absorption mask.

    PubMed

    Wang, Hongchang; Kashyap, Yogesh; Cai, Biao; Sawhney, Kawal

    2016-07-28

High energy X-ray imaging has a unique advantage over conventional X-ray imaging, since it enables deeper penetration into materials with significantly reduced radiation damage. However, absorption contrast in the high energy region is considerably low due to the reduced X-ray absorption cross section of most materials. Even though X-ray phase and dark-field imaging techniques can provide substantially increased contrast and complementary information, fabricating dedicated optics for high energies still remains a challenge. To address this issue, we present an alternative X-ray imaging approach that produces transmission, phase and scattering signals at high X-ray energies by using a random absorption mask. Importantly, in addition to the synchrotron radiation source, this approach has been demonstrated for practical imaging applications with a laboratory-based microfocus X-ray source. This new imaging method could be potentially useful for studying thick samples or heavy materials in advanced materials science research.

  20. On-Chip Fluorescence Switching System for Constructing a Rewritable Random Access Data Storage Device.

    PubMed

    Nguyen, Hoang Hiep; Park, Jeho; Hwang, Seungwoo; Kwon, Oh Seok; Lee, Chang-Soo; Shin, Yong-Beom; Ha, Tai Hwan; Kim, Moonil

    2018-01-10

We report the development of an on-chip fluorescence switching system based on DNA strand displacement and DNA hybridization for the construction of a rewritable and randomly accessible data storage device. In this study, the feasibility and potential effectiveness of the proposed system were evaluated with a series of wet experiments involving 40 bits (5 bytes) of data encoding a 5-character text (KRIBB). A flexible data rewriting function was achieved by converting fluorescence signals between "ON" and "OFF" through DNA strand displacement and hybridization events. In addition, the proposed system was successfully validated on a microfluidic chip, which could further facilitate the encoding and decoding of data. To the best of our knowledge, this is the first report on the use of DNA hybridization and DNA strand displacement in the field of data storage devices. Taken together, our results demonstrate that DNA-based fluorescence switching could be applied to construct a rewritable and randomly accessible data storage device through controllable DNA manipulations.

  1. Automated brain tumor segmentation using spatial accuracy-weighted hidden Markov Random Field.

    PubMed

    Nie, Jingxin; Xue, Zhong; Liu, Tianming; Young, Geoffrey S; Setayesh, Kian; Guo, Lei; Wong, Stephen T C

    2009-09-01

A variety of algorithms have been proposed for brain tumor segmentation from multi-channel sequences; however, most of them require isotropic or pseudo-isotropic resolution of the MR images. Although co-registration and interpolation of low-resolution sequences, such as T2-weighted images, onto the space of the high-resolution image, such as the T1-weighted image, can be performed prior to segmentation, the results are usually limited by partial volume effects due to interpolation of the low-resolution images. To improve the quality of tumor segmentation in clinical applications, where low-resolution sequences are commonly used together with high-resolution images, we propose an algorithm based on the Spatial accuracy-weighted Hidden Markov random field and Expectation maximization (SHE) approach for both automated tumor and enhanced-tumor segmentation. SHE incorporates the spatial interpolation accuracy of the low-resolution images into the optimization procedure of the Hidden Markov Random Field (HMRF) to segment tumors using multi-channel MR images with different resolutions, e.g., high-resolution T1-weighted and low-resolution T2-weighted images. In experiments, we evaluated this algorithm using a set of simulated multi-channel brain MR images with known ground-truth tissue segmentation and also applied it to a dataset of MR images obtained during clinical trials of brain tumor chemotherapy. The results show that more accurate tumor segmentation results can be obtained in comparison with conventional multi-channel segmentation algorithms.

  2. Reducing RANS Model Error Using Random Forest

    NASA Astrophysics Data System (ADS)

    Wang, Jian-Xun; Wu, Jin-Long; Xiao, Heng; Ling, Julia

    2016-11-01

Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools in the turbulence modeling of industrial flows. However, the model discrepancy due to the inadequacy of the modeled Reynolds stresses largely diminishes the reliability of simulation results. In this work we use a physics-informed machine learning approach to improve the RANS-modeled Reynolds stresses and propagate them to obtain the mean velocity field. Specifically, the functional forms of the Reynolds stress discrepancies with respect to mean flow features are trained on an offline database of flows with similar characteristics. The random forest model is used to predict Reynolds stress discrepancies in new flows. The improved Reynolds stresses are then propagated to the velocity field via the RANS equations. The effects of expanding the feature space through the use of a complete basis of Galilean tensor invariants are also studied. The flow in a square duct, which is challenging for standard RANS models, is investigated to demonstrate the merit of the proposed approach. The results show that both the Reynolds stresses and the propagated velocity field are improved over the baseline RANS predictions. SAND Number: SAND2016-7437 A

  3. A dissipative random velocity field for fully developed fluid turbulence

    NASA Astrophysics Data System (ADS)

    Chevillard, Laurent; Pereira, Rodrigo; Garban, Christophe

    2016-11-01

We investigate, based on numerical simulations and analytical calculations, the statistical properties of a recently proposed stochastic model for the velocity field of an incompressible, homogeneous, isotropic and fully developed turbulent flow. A key step in the construction of this model is the introduction of some aspects of the vorticity stretching mechanism that governs the dynamics of fluid particles along their trajectories. A further phenomenological step, aimed at including the long-range correlated nature of turbulence, makes this model depend on a single free parameter that can be estimated from experimental measurements. We confirm the realism of the model regarding the geometry of the velocity gradient tensor, the power-law behaviour of the moments of velocity increments, including the intermittent corrections, and the existence of energy transfers across scales. We quantify the dependence of these basic properties of turbulent flows on the free parameter and derive analytically the spectrum of exponents of the structure functions in a simplified non-dissipative case. A perturbative expansion shows that energy transfers do indeed take place, justifying the dissipative nature of this random field.

  4. A novel approach to assess the treatment response using Gaussian random field in PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Mengdie; Guo, Ning; Hu, Guangshu

    2016-02-15

Purpose: The assessment of early therapeutic response to anticancer therapy is vital for treatment planning and patient management in the clinic. With the development of personalized treatment plans, assessing early treatment response, especially before any anatomically apparent changes after treatment, has become an urgent clinical need. Positron emission tomography (PET) imaging plays an important role in clinical oncology for tumor detection, staging, and therapy response assessment. Many studies on therapy response involve interpretation of differences between two PET images, usually in terms of standardized uptake values (SUVs). However, the quantitative accuracy of this measurement is limited. This work proposes a statistically robust approach for therapy response assessment based on Gaussian random fields (GRFs) to provide a statistically more meaningful scale for evaluating therapy effects. Methods: The authors propose a new criterion for therapeutic assessment by incorporating image noise into the traditional SUV method. An analytical method based on approximate expressions of the Fisher information matrix was applied to model the variance of individual pixels in reconstructed images. A zero-mean, unit-variance GRF under the null hypothesis (no response to therapy) was obtained by normalizing each pixel of the post-therapy image with the mean and standard deviation of the pretherapy image. The performance of the proposed method was evaluated by Monte Carlo simulation, where XCAT phantoms (128 × 128 pixels) with lesions of various diameters (2–6 mm), multiple tumor-to-background contrasts (3–10), and different changes in intensity (6.25%–30%) were used. The receiver operating characteristic curves and the corresponding areas under the curve were computed for both the proposed method and the traditional methods, whose figure of merit is the percentage change of SUVs.
The formula for the false positive rate (FPR) estimation was developed for the proposed therapy response assessment utilizing a local average method based on the random field. The accuracy of the estimation was validated in terms of Euclidean distance and correlation coefficient. Results: It is shown that the performance of therapy response assessment is significantly improved by the introduction of variance, with a higher area under the curve (97.3%) than SUVmean (91.4%) and SUVmax (82.0%). In addition, the FPR estimate serves as a good prediction of the specificity of the proposed method, consistent with the simulation outcome with a correlation coefficient of ∼1. Conclusions: In this work, the authors developed a method to evaluate therapy response from PET images, which were modeled as a Gaussian random field. The digital phantom simulations demonstrated that the proposed method achieved a large reduction in statistical variability by incorporating knowledge of the variance of the original Gaussian random field. The proposed method has the potential to enable prediction of early treatment response and shows promise for application to clinical practice. In future work, the authors will report on the robustness of the estimation theory for application to clinical practice of therapy response evaluation, which pertains to binary discrimination tasks at a fixed location in the image, such as detection of small and weak lesions.
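The normalization at the heart of this method can be sketched simply: divide the post-therapy image, shifted by the pre-therapy mean, by the pre-therapy noise level, so that under the null hypothesis (no response) the result is approximately a zero-mean, unit-variance Gaussian random field. The global mean/std used below is a stand-in for the paper's per-pixel Fisher-information variance model, and the images are simulated.

```python
import numpy as np

# Null-hypothesis sketch: pre- and post-therapy images share the same
# underlying uptake (100) and noise level (5), i.e. no treatment response.
rng = np.random.default_rng(4)
pre = 100.0 + 5.0 * rng.standard_normal((128, 128))    # pre-therapy PET
post = 100.0 + 5.0 * rng.standard_normal((128, 128))   # post-therapy PET

# Normalize post-therapy pixels by pre-therapy statistics; under the null
# this is approximately a zero-mean, unit-variance Gaussian random field.
z = (post - pre.mean()) / pre.std()

# Pixels exceeding a z-threshold would be flagged as candidate responses.
responding = np.abs(z) > 3.0
```

A genuine treatment response would appear as a region of the z-map with mean shifted well away from zero, which is what the ROC analysis above quantifies.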

  5. Optimal Symmetric Multimodal Templates and Concatenated Random Forests for Supervised Brain Tumor Segmentation (Simplified) with ANTsR.

    PubMed

    Tustison, Nicholas J; Shrinidhi, K L; Wintermark, Max; Durst, Christopher R; Kandel, Benjamin M; Gee, James C; Grossman, Murray C; Avants, Brian B

    2015-04-01

Segmenting and quantifying gliomas from MRI is an important task for diagnosis, planning intervention, and for tracking tumor changes over time. However, this task is complicated by the lack of prior knowledge concerning tumor location, spatial extent, shape, possible displacement of normal tissue, and intensity signature. To accommodate such complications, we introduce a framework for supervised segmentation based on multiple modality intensity, geometry, and asymmetry feature sets. These features drive a supervised whole-brain and tumor segmentation approach based on random forest-derived probabilities. The asymmetry-related features (based on optimal symmetric multimodal templates) demonstrate excellent discriminative properties within this framework. We also gain performance by generating probability maps from random forest models and using these maps for a refining Markov random field regularized probabilistic segmentation. This strategy allows us to interface the supervised learning capabilities of the random forest model with regularized probabilistic segmentation using the recently developed ANTsR package--a comprehensive statistical and visualization interface between the popular Advanced Normalization Tools (ANTs) and the R statistical project. The reported algorithmic framework was the top-performing entry in the MICCAI 2013 Multimodal Brain Tumor Segmentation challenge. The challenge data were widely varying, consisting of both high-grade and low-grade glioma four-modality MRI from five different institutions. Average Dice overlap measures for the final algorithmic assessment were 0.87, 0.78, and 0.74 for "complete", "core", and "enhanced" tumor components, respectively.

  6. Anomalous diffusion in the evolution of soccer championship scores: Real data, mean-field analysis, and an agent-based model

    NASA Astrophysics Data System (ADS)

    da Silva, Roberto; Vainstein, Mendeli H.; Gonçalves, Sebastián; Paula, Felipe S. F.

    2013-08-01

Statistics of soccer tournament scores based on the double round robin system in several countries are studied. Exploring the dynamics of team scoring during tournament seasons from recent years, we find evidence of superdiffusion. A mean-field analysis results in a drift velocity equal to that of the real data but in a different diffusion coefficient. Along with the analysis of real data, we present the results of simulations of soccer tournaments obtained by an agent-based model which successfully describes the final scoring distribution [da Silva et al., Comput. Phys. Commun. 184, 661 (2013), doi:10.1016/j.cpc.2012.10.030]. This model yields random walks of scores over time with the same anomalous diffusion as observed in the real data.
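The superdiffusion diagnostic used in studies like this one can be illustrated by estimating the exponent alpha in MSD(t) ~ t^alpha from an ensemble of score trajectories: alpha = 1 is ordinary diffusion, alpha > 1 is superdiffusion. The heterogeneous-strength walk below is a toy stand-in (teams with persistent win probabilities produce a ballistic spread of scores), not the paper's agent-based model.

```python
import numpy as np

# Toy ensemble of score trajectories: each "team" wins each round with a
# persistent probability, earning 3 points per win (heterogeneity assumed).
rng = np.random.default_rng(5)
ability = rng.uniform(0.2, 0.8, size=(5000, 1))     # per-team win probability
wins = rng.random((5000, 64)) < ability             # 64 rounds per season
scores = (3.0 * wins).cumsum(axis=1)                # score vs round

# Mean-square displacement of scores about the ensemble mean, and the
# anomalous-diffusion exponent alpha from a log-log fit.
disp = scores - scores.mean(axis=0)
msd = (disp**2).mean(axis=0)
t = np.arange(1, 65)
alpha = np.polyfit(np.log(t[4:]), np.log(msd[4:]), 1)[0]
```

Persistent differences in team strength drive the t^2 (ballistic) component of the spread, so the fitted alpha lands well above 1, mirroring the superdiffusive signature reported for real championship scores.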

  7. A distance limited method for sampling downed coarse woody debris

    Treesearch

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2012-01-01

A new sampling method for downed coarse woody debris is proposed, based on limiting the perpendicular distance from individual pieces to a randomly chosen sample point. Two approaches are presented that allow different protocols to be used to determine field measurements; estimators for each protocol are also developed. Both protocols are compared via simulation against...

  8. School-Based Drug Prevention among At-Risk Adolescents: Effects of ALERT Plus

    ERIC Educational Resources Information Center

    Longshore, Douglas; Ellickson, Phyllis L.; McCaffrey, Daniel F.; St. Clair, Patricia A.

    2007-01-01

    In a recent randomized field trial, Ellickson et al. found the Project ALERT drug prevention curriculum curbed alcohol misuse and tobacco and marijuana use among eighth-grade adolescents. This article reports effects among ninth-grade at-risk adolescents. Comparisons between at-risk girls in ALERT Plus schools (basic curriculum extended to ninth…

  9. Employment Opportunities in Applied Biological and Agricultural Occupations in the Metropolitan Area of Chicago.

    ERIC Educational Resources Information Center

    Thomas, Hollie B.; Neavill, Arthur

    Based on questionnaire data collected from a sample of employers, this phase of a larger research project ascertained employment opportunities in the area of applied biological and agricultural occupations in the metropolitan area of Chicago. Specific fields of business surveyed by stratified random sample were animal care, animal health care,…

  10. Evidence-Based Medicine and Child Mental Health Services: A Broad Approach to Evaluation is Needed.

    ERIC Educational Resources Information Center

    McGuire, Jacqueline Barnes; And Others

    1997-01-01

    Describes quasi-experimental designs to be used as alternatives to randomized controlled trials in decisions concerning clinical practice and policy-making in the child mental health field. Highlights importance of taking a systems-level approach to evaluation, and describes ways in which qualitative outcomes measures can be used to sensitively…

  11. A study of the breast cancer dynamics in North Carolina.

    PubMed

    Christakos, G; Lai, J J

    1997-11-01

    This work is concerned with the study of breast cancer incidence in the State of North Carolina. Methodologically, the current analysis illustrates the importance of spatiotemporal random field modelling and introduces a mode of reasoning that is based on a combination of inductive and deductive processes. The composite space/time analysis utilizes the variability characteristics of incidence and the mathematical features of the random field model to fit it to the data. The analysis is quite general and can efficiently represent non-homogeneous and non-stationary characteristics of breast cancer variation. Incidence predictions are produced using data from the same time period as well as data from other time periods and disease registries. The random field provides a rigorous and systematic method for generating detailed maps, which offer a quantitative description of the incidence variation from place to place and from time to time, together with a measure of the accuracy of the incidence maps. Spatiotemporal mapping accounts for the geographical locations and the time instants of the incidence observations, which is not usually the case with most empirical Bayes methods. It is also more accurate than purely spatial statistics methods, and can offer valuable information about the breast cancer risk and dynamics in North Carolina. Field studies could be initiated in high-rate areas identified by the maps in an effort to uncover environmental or life-style factors that might be responsible for the high risk rates. Also, the incidence maps can help elucidate causal mechanisms, explain disease occurrences at a certain scale, and offer guidance in health management and administration.

  12. Random Interchange of Magnetic Connectivity

    NASA Astrophysics Data System (ADS)

    Matthaeus, W. H.; Ruffolo, D. J.; Servidio, S.; Wan, M.; Rappazzo, A. F.

    2015-12-01

    Magnetic connectivity, the connection between two points along a magnetic field line, has a stochastic character associated with field lines random walking in space due to magnetic fluctuations, but connectivity can also change in time due to dynamical activity [1]. For fluctuations transverse to a strong mean field, this connectivity change can be caused by stochastic interchange due to component reconnection. The process may be understood approximately by formulating a diffusion-like Fokker-Planck coefficient [2] that is asymptotically related to standard field line random walk. Quantitative estimates are provided for transverse magnetic field models and for anisotropic models such as reduced magnetohydrodynamics. In heliospheric applications, these estimates may be useful for understanding mixing between open and closed field line regions near coronal hole boundaries, and large latitude excursions of connectivity associated with turbulence. [1] A. F. Rappazzo, W. H. Matthaeus, D. Ruffolo, S. Servidio & M. Velli, ApJL, 758, L14 (2012) [2] D. Ruffolo & W. Matthaeus, ApJ, 806, 233 (2015)

  13. The ABC (in any D) of logarithmic CFT

    NASA Astrophysics Data System (ADS)

    Hogervorst, Matthijs; Paulos, Miguel; Vichi, Alessandro

    2017-10-01

    Logarithmic conformal field theories have a vast range of applications, from critical percolation to systems with quenched disorder. In this paper we thoroughly examine the structure of these theories based on their symmetry properties. Our analysis is model-independent and holds for any spacetime dimension. Our results include a determination of the general form of correlation functions and conformal block decompositions, clearing the path for future bootstrap applications. Several examples are discussed in detail, including logarithmic generalized free fields, holographic models, self-avoiding random walks and critical percolation.

  14. A weighted belief-propagation algorithm for estimating volume-related properties of random polytopes

    NASA Astrophysics Data System (ADS)

    Font-Clos, Francesc; Massucci, Francesco Alessandro; Pérez Castillo, Isaac

    2012-11-01

    In this work we introduce a novel weighted message-passing algorithm based on the cavity method for estimating volume-related properties of random polytopes, properties which are relevant in various research fields ranging from metabolic networks, to neural networks, to compressed sensing. Rather than adopting the usual approach of approximating the real-valued cavity marginal distributions by a few parameters, we propose an algorithm that faithfully represents the entire marginal distribution. We describe various alternatives for implementing the algorithm and benchmark the theoretical findings with concrete applications to random polytopes. The results obtained with our approach are found to be in very good agreement with the estimates produced by the Hit-and-Run algorithm, known to produce uniform sampling.

  15. Probabilistic Structures Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The basic formulation for probabilistic finite element analysis is described and demonstrated on a few sample problems. This formulation is based on an iterative perturbation scheme that uses the factorized stiffness of the unperturbed system as the iteration preconditioner for obtaining the solution to the perturbed problem. This approach eliminates the need to compute, store, and manipulate explicit partial derivatives of the element matrices and force vector, which not only reduces memory usage considerably, but also greatly simplifies the coding and validation tasks. All aspects of the proposed formulation were combined in a demonstration problem using a simplified model of a curved turbine blade discretized with 48 shell elements, having random pressure and temperature fields with partial correlation, random uniform thickness, and random stiffness at the root.
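
    The iteration described above can be sketched in a few lines. The example below is a hypothetical illustration (not the PSAM code; the 2x2 matrices and perturbation values are made up): it solves (K0 + dK) u = f by repeatedly back-substituting against the unperturbed stiffness K0, so only K0 ever needs to be factorized.

```python
def solve2(K, f):
    """Direct solve of a 2x2 system (stands in for the factorized K0)."""
    det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    return [( K[1][1] * f[0] - K[0][1] * f[1]) / det,
            (-K[1][0] * f[0] + K[0][0] * f[1]) / det]

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def perturbed_solve(K0, dK, f, n_iter=50):
    """Iterative perturbation: u <- K0^{-1} (f - dK u).
    Converges when the perturbation dK is small relative to K0."""
    u = solve2(K0, f)          # unperturbed solution as starting guess
    for _ in range(n_iter):
        rhs = [fi - di for fi, di in zip(f, matvec(dK, u))]
        u = solve2(K0, rhs)
    return u

K0 = [[4.0, 1.0], [1.0, 3.0]]       # unperturbed stiffness (illustrative)
dK = [[0.3, 0.0], [0.0, -0.2]]      # one random perturbation sample
f = [1.0, 2.0]
u = perturbed_solve(K0, dK, f)       # matches the direct perturbed solve
```

    The same loop works for a large FE system: `solve2` is replaced by back-substitution against the factorized K0, and no derivatives of element matrices are ever formed.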

  16. Transport properties of random media: A new effective medium theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busch, K.; Soukoulis, C.M.

    We present a new method for efficient, accurate calculations of transport properties of random media. It is based on the principle that the wave energy density should be uniform when averaged over length scales larger than the size of the scatterers. This scheme captures the effects of resonant scattering of the individual scatterer exactly, as well as the multiple scattering in a mean-field sense. It has been successfully applied to both "scalar" and "vector" classical wave calculations. Results for the energy transport velocity are in agreement with experiment. This approach is of general use and can be easily extended to treat different types of wave propagation in random media.

  17. Fabrication and viscoelastic characteristics of waste tire rubber based magnetorheological elastomer

    NASA Astrophysics Data System (ADS)

    Ubaidillah; Choi, H. J.; Mazlan, S. A.; Imaduddin, F.; Harjana

    2016-11-01

    In this study, waste tire rubber (WTR) was successfully converted into a magnetorheological (MR) elastomer via high-pressure, high-temperature reclamation. The physical and rheological properties of the WTR-based MR elastomers were assessed for performance. The revulcanization process was conducted in the absence of magnetic fields, allowing the magnetizable particles to distribute randomly. Particle dispersion in the MR elastomer matrix was confirmed by scanning electron microscopy. The magnetization saturation and other magnetic properties were obtained with a vibrating sample magnetometer. Rheological properties, including the MR effect, were examined under oscillatory loading in the absence and presence of magnetic fields using a rotational rheometer. The WTR-based MR elastomer exhibited tunable intrinsic properties under applied magnetic fields. The storage and loss moduli, along with the loss factor, changed with increasing frequency and during magnetization. Interestingly, a Payne effect was seen in all samples during dynamic strain sweep testing, and it increased significantly with incremental increases in the magnetic field. This phenomenon is interpreted as the formation-destruction-reformation process undergone by the internal network chains in the MR elastomers.

  18. On Some Methods in Safety Evaluation in Geotechnics

    NASA Astrophysics Data System (ADS)

    Puła, Wojciech; Zaskórski, Łukasz

    2015-06-01

    The paper demonstrates how reliability methods can be utilised to evaluate safety in geotechnics. Special attention is paid to so-called reliability-based design, which can play a useful, complementary role to Eurocode 7. In the first part, a brief review of first- and second-order reliability methods is given. Next, two examples of reliability-based design are demonstrated. The first is focussed on bearing capacity calculation and is dedicated to comparison with EC7 requirements. The second analyses a rigid pile subjected to lateral load and is oriented towards the working stress design method. In the second part, applications of random fields to safety evaluations in geotechnics are addressed. After a short review of the theory, a random finite element algorithm for reliability-based design of a shallow strip foundation is given. Finally, two illustrative examples for cohesive and cohesionless soils are demonstrated.
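
    As a concrete illustration of the first-order reliability machinery reviewed in the paper (a generic textbook sketch, not the paper's algorithm, and with made-up bearing-capacity numbers), the Cornell reliability index for a linear limit state g = R - S with independent normal resistance R and load S, and the corresponding failure probability:

```python
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """Cornell (first-order second-moment) reliability index for
    g = R - S with independent normal R (resistance) and S (load)."""
    return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

def failure_probability(beta):
    """P[g < 0] = Phi(-beta), via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# hypothetical bearing-capacity statistics (kPa), for illustration only
beta = reliability_index(mu_r=1000.0, sigma_r=200.0,
                         mu_s=400.0, sigma_s=100.0)
pf = failure_probability(beta)   # small probability of failure
```

    Eurocode-style checks work with fixed partial factors; a reliability-based design instead targets a required beta (or pf), which is what makes the two approaches complementary.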

  19. A semi-floating gate memory based on van der Waals heterostructures for quasi-non-volatile applications

    NASA Astrophysics Data System (ADS)

    Liu, Chunsen; Yan, Xiao; Song, Xiongfei; Ding, Shijin; Zhang, David Wei; Zhou, Peng

    2018-05-01

    As conventional circuits based on field-effect transistors approach their physical limits due to quantum phenomena, semi-floating gate transistors have emerged as an alternative ultrafast and silicon-compatible technology. Here, we show a quasi-non-volatile memory featuring a semi-floating gate architecture with band-engineered van der Waals heterostructures. This two-dimensional semi-floating gate memory demonstrates a refresh time 156 times longer than that of dynamic random access memory and ultrahigh-speed writing operations on nanosecond timescales. The semi-floating gate architecture greatly enhances the writing operation performance and is approximately 10^6 times faster than other memories based on two-dimensional materials. The demonstrated characteristics suggest that the quasi-non-volatile memory has the potential to bridge the gap between volatile and non-volatile memory technologies and decrease the power consumption required for frequent refresh operations, enabling a high-speed and low-power random access memory.

  20. Optimal spatial sampling techniques for ground truth data in microwave remote sensing of soil moisture

    NASA Technical Reports Server (NTRS)

    Rao, R. G. S.; Ulaby, F. T.

    1977-01-01

    The paper examines optimal sampling techniques for obtaining accurate spatial averages of soil moisture, at various depths and for cell sizes in the range 2.5-40 acres, with a minimum number of samples. Both simple random sampling and stratified sampling procedures are used to reach a set of recommended sample sizes for each depth and cell size. The major conclusions from the statistical sampling tests are that (1) the number of samples required decreases with increasing depth; (2) when the total number of samples cannot be prespecified, or when the moisture in only a single layer is of interest, a simple random sampling procedure should be used, based on the observed mean and SD for data from a single field; (3) when the total number of samples can be prespecified and the objective is to measure the soil moisture profile with depth, stratified random sampling based on optimal allocation should be used; and (4) decreasing the sensor resolution cell size leads to fairly large decreases in sample sizes with stratified sampling procedures, whereas only a moderate decrease is obtained with simple random sampling procedures.
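
    The sample-size logic behind conclusions (2) and (3) can be sketched generically (textbook formulas with illustrative soil-moisture numbers, not the paper's exact procedure): simple random sampling sizes follow n = (z*s/E)^2 for a target margin of error E, while stratified sampling with Neyman allocation splits a total n across strata in proportion to N_h * s_h.

```python
import math

def srs_sample_size(sd, margin, z=1.96):
    """Simple random sampling: n for a target margin of error at
    ~95% confidence, using the observed field SD."""
    return math.ceil((z * sd / margin) ** 2)

def neyman_allocation(strata, n_total):
    """Stratified sampling: allocate n_total across strata in
    proportion to N_h * s_h (Neyman optimal allocation).
    `strata` is a list of (N_h, s_h) pairs."""
    weights = [N * s for N, s in strata]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]

# hypothetical soil-moisture numbers (% volumetric), illustration only
n_srs = srs_sample_size(sd=4.0, margin=1.0)                # one field
alloc = neyman_allocation([(100, 2.0), (50, 6.0)], n_total=30)
```

    The allocation sends more samples to the smaller but more variable stratum, which is exactly why stratification pays off most when cell-size reduction makes strata more homogeneous.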

  1. Field evaluation of a random forest activity classifier for wrist-worn accelerometer data.

    PubMed

    Pavey, Toby G; Gilson, Nicholas D; Gomersall, Sjaan R; Clark, Bronwyn; Trost, Stewart G

    2017-01-01

    Wrist-worn accelerometers are convenient to wear and associated with greater wear-time compliance. Previous work has generally relied on choreographed activity trials to train and test classification models. However, validity in free-living contexts is starting to emerge. Study aims were: (1) train and test a random forest activity classifier for wrist accelerometer data; and (2) determine if models trained on laboratory data perform well under free-living conditions. Twenty-one participants (mean age=27.6±6.2) completed seven lab-based activity trials and a 24h free-living trial (N=16). Participants wore a GENEActiv monitor on the non-dominant wrist. Classification models recognising four activity classes (sedentary, stationary+, walking, and running) were trained using time and frequency domain features extracted from 10-s non-overlapping windows. Model performance was evaluated using leave-one-out cross-validation. Models were implemented using the randomForest package within R. Classifier accuracy during the 24h free-living trial was evaluated by calculating agreement with concurrently worn activPAL monitors. Overall classification accuracy for the random forest algorithm was 92.7%. Recognition accuracy for sedentary, stationary+, walking, and running was 80.1%, 95.7%, 91.7%, and 93.7%, respectively, for the laboratory protocol. Agreement with the activPAL data (stepping vs. non-stepping) during the 24h free-living trial was excellent and, on average, exceeded 90%. The ICC for stepping time was 0.92 (95% CI=0.75-0.97). However, sensitivity and positive predictive values were modest. Mean bias was 10.3min/d (95% LOA=-46.0 to 25.4min/d). 
The random forest classifier for wrist accelerometer data yielded accurate group-level predictions under controlled conditions, but was less accurate at identifying stepping versus non-stepping behaviour in free-living conditions. Future studies should conduct more rigorous field-based evaluations using observation as a criterion measure. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
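
    The windowed feature extraction described above can be sketched as follows (a hypothetical illustration, not the study's R code): each 10-s non-overlapping window of one accelerometer axis yields time-domain features (mean, SD) and a frequency-domain feature (dominant frequency from a discrete Fourier transform).

```python
import cmath
import math

def window_features(signal, fs, window_s=10):
    """Split `signal` (one accelerometer axis, sampled at `fs` Hz) into
    non-overlapping windows and extract simple time- and
    frequency-domain features from each."""
    n = int(window_s * fs)
    features = []
    for start in range(0, len(signal) - n + 1, n):
        w = signal[start:start + n]
        mean = sum(w) / n
        sd = math.sqrt(sum((x - mean) ** 2 for x in w) / n)
        # DFT magnitudes for bins 1..n//2 (skip the DC component)
        mags = [abs(sum(w[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n)))
                for k in range(1, n // 2 + 1)]
        dom_freq = (mags.index(max(mags)) + 1) * fs / n
        features.append({"mean": mean, "sd": sd, "dom_freq": dom_freq})
    return features

# synthetic 3 Hz oscillation sampled at 30 Hz for 10 s (one window)
fs = 30
sig = [math.sin(2 * math.pi * 3.0 * t / fs) for t in range(10 * fs)]
feats = window_features(sig, fs)
```

    Feature vectors like these, stacked per window, are what a random forest would then be trained on; in practice an FFT and additional features (axis correlations, percentiles) would be used.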

  2. Semiautomatic tumor segmentation with multimodal images in a conditional random field framework.

    PubMed

    Hu, Yu-Chi; Grossberg, Michael; Mageras, Gikas

    2016-04-01

    Volumetric medical images of a single subject can be acquired using different imaging modalities, such as computed tomography, magnetic resonance imaging (MRI), and positron emission tomography. In this work, we present a semiautomatic segmentation algorithm that can leverage the synergies between different image modalities while integrating interactive human guidance. The algorithm provides a statistical segmentation framework partly automating the segmentation task while still maintaining critical human oversight. The statistical models presented are trained interactively, using simple brush strokes to indicate tumor and nontumor tissues and using intermediate results within a patient's image study. To accomplish the segmentation, we construct the energy function in the conditional random field (CRF) framework. For each slice, the energy function is set using the estimated probabilities from both user brush stroke data and prior approved segmented slices within a patient study. The progressive segmentation is obtained using a graph-cut-based minimization. Although no similar semiautomated algorithm is currently available, we evaluated our method with an MRI data set from the Medical Image Computing and Computer Assisted Intervention Society multimodal brain segmentation challenge (BRATS 2012 and 2013) against a similar fully automatic method based on CRF and a semiautomatic method based on grow-cut, and our method shows superior performance.
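
    The energy function in such a CRF combines unary terms (e.g., negative log probabilities estimated from brush strokes or prior slices) with pairwise smoothness terms. The sketch below is a simplified stand-in, not the paper's method: it uses a Potts-style energy on a 1-D strip of pixels and minimizes it with iterated conditional modes (ICM) rather than graph cuts, with made-up unary costs.

```python
def energy(labels, unary, beta):
    """Potts-model CRF energy: data terms plus a smoothness penalty
    for every pair of neighbouring pixels with different labels."""
    e = sum(unary[i][lab] for i, lab in enumerate(labels))
    e += beta * sum(1 for a, b in zip(labels, labels[1:]) if a != b)
    return e

def icm(unary, beta, n_sweeps=10):
    """Iterated conditional modes: greedy local label updates on a
    1-D chain (a simple stand-in for graph-cut minimization)."""
    labels = [min((u[k], k) for k in range(len(u)))[1] for u in unary]
    for _ in range(n_sweeps):
        for i in range(len(labels)):
            labels[i] = min(range(len(unary[i])), key=lambda k: (
                unary[i][k]
                + beta * (i > 0 and labels[i - 1] != k)
                + beta * (i < len(labels) - 1 and labels[i + 1] != k)))
    return labels

# unary costs ~ -log p(label | data); pixel 2 is an ambiguous noisy pixel
unary = [[0.1, 2.0], [0.1, 2.0], [1.0, 0.9], [0.1, 2.0], [0.1, 2.0]]
labels = icm(unary, beta=0.5)   # smoothness overrules the noisy pixel
```

    With beta = 0 the noisy middle pixel would take label 1; the pairwise term makes flipping it cheaper than paying two label-discontinuity penalties, which is the regularizing effect the CRF provides.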

  3. Application of the quantum spin glass theory to image restoration.

    PubMed

    Inoue, J I

    2001-04-01

    Quantum fluctuation is introduced into the Markov random-field model for image restoration in the context of a Bayesian approach. We investigate the dependence of the quality of black-and-white image restoration on the quantum fluctuation by making use of statistical mechanics. We find that the maximum posterior marginal (MPM) estimate based on the quantum fluctuation gives a fine restoration in comparison with the maximum a posteriori estimate or the thermal-fluctuation-based MPM estimate.

  4. The space transformation in the simulation of multidimensional random fields

    USGS Publications Warehouse

    Christakos, G.

    1987-01-01

    Space transformations are proposed as a mathematically meaningful and practically comprehensive approach to simulate multidimensional random fields. Within this context the turning bands method of simulation is reconsidered and improved in both the space and frequency domains. ?? 1987.

  5. Small-World Network Spectra in Mean-Field Theory

    NASA Astrophysics Data System (ADS)

    Grabow, Carsten; Grosskinsky, Stefan; Timme, Marc

    2012-05-01

    Collective dynamics on small-world networks emerge in a broad range of systems with their spectra characterizing fundamental asymptotic features. Here we derive analytic mean-field predictions for the spectra of small-world models that systematically interpolate between regular and random topologies by varying their randomness. These theoretical predictions agree well with the actual spectra (obtained by numerical diagonalization) for undirected and directed networks and from fully regular to strongly random topologies. These results may provide analytical insights to empirically found features of dynamics on small-world networks from various research fields, including biology, physics, engineering, and social science.

  6. Random field assessment of nanoscopic inhomogeneity of bone

    PubMed Central

    Dong, X. Neil; Luo, Qing; Sparkman, Daniel M.; Millwater, Harry R.; Wang, Xiaodu

    2010-01-01

    Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as the mean, standard deviation, and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of the elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated by simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of the elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic detail. PMID:20817128
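
    A Gaussian random field with exponential covariance can be sampled by factorizing its covariance matrix. The sketch below is illustrative (a generic textbook construction on a made-up 1-D grid, not the study's code): it builds C_ij = sigma^2 * exp(-|x_i - x_j| / l), where l is the correlation length, and draws a correlated sample via a Cholesky factor.

```python
import math
import random

random.seed(0)

def exp_covariance(points, sigma2, corr_len):
    """C_ij = sigma^2 * exp(-|x_i - x_j| / corr_len)."""
    return [[sigma2 * math.exp(-abs(a - b) / corr_len) for b in points]
            for a in points]

def cholesky(C):
    """Lower-triangular L with L L^T = C (C symmetric positive definite)."""
    n = len(C)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(C[i][i] - s)
            else:
                L[i][j] = (C[i][j] - s) / L[j][j]
    return L

def sample_field(L):
    """Correlated Gaussian sample: x = L z with z ~ N(0, I)."""
    z = [random.gauss(0.0, 1.0) for _ in range(len(L))]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(len(L))]

points = [0.1 * i for i in range(20)]    # 1-D grid of measurement locations
C = exp_covariance(points, sigma2=1.0, corr_len=0.5)
L = cholesky(C)
field = sample_field(L)                  # one realization of the modulus field
```

    A larger correlation length produces smoother modulus maps; comparing simulated maps against AFM/nanoindentation maps is then a matter of matching this single parameter.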

  7. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random functions for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured through just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulent wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
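
    For reference, the original spectral representation (OSR) that the paper starts from can be sketched as follows (generic textbook form with an illustrative, non-physical spectrum, not the paper's wind model): X(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k) with independent uniform phases, whose time-averaged variance should match the integral of S(w).

```python
import math
import random

random.seed(42)

def spectral_sample(S, w_max, n_freq, times):
    """One sample function of a zero-mean stationary process via the
    spectral representation with independent uniform random phases."""
    dw = w_max / n_freq
    freqs = [(k + 0.5) * dw for k in range(n_freq)]
    amps = [math.sqrt(2.0 * S(w) * dw) for w in freqs]
    phases = [random.uniform(0.0, 2.0 * math.pi) for _ in freqs]
    return [sum(a * math.cos(w * t + p)
                for a, w, p in zip(amps, freqs, phases))
            for t in times]

# illustrative one-sided spectrum (not a physical wind spectrum)
S = lambda w: 1.0 / (1.0 + w ** 2)
w_max, n_freq = 20.0, 128
times = [0.01 * i for i in range(20000)]          # 200 s at 100 Hz
x = spectral_sample(S, w_max, n_freq, times)

# the sample variance should approach the discretized spectral area
target_var = sum(S((k + 0.5) * (w_max / n_freq)) * (w_max / n_freq)
                 for k in range(n_freq))
emp_var = sum(v * v for v in x) / len(x)
```

    The paper's contribution is to replace the n_freq independent phases above with random functions of just two elementary random variables, so that a few hundred representative samples carry the full probability information needed by the PDEM.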

  8. Random-Forest Classification of High-Resolution Remote Sensing Images and Ndsm Over Urban Areas

    NASA Astrophysics Data System (ADS)

    Sun, X. F.; Lin, X. G.

    2017-09-01

    As an intermediate step between raw remote sensing data and digital urban maps, remote sensing data classification has been a challenging and long-standing research problem in the remote sensing community. In this work, an effective classification method is proposed for classifying high-resolution remote sensing data over urban areas. Starting from high-resolution multi-spectral images and 3D geometry data, our method proceeds in three main stages: feature extraction, classification, and refinement of the classified result. First, we extract color, vegetation index, and texture features from the multi-spectral image and compute the height, elevation texture, and differential morphological profile (DMP) features from the 3D geometry data. In the classification stage, multiple random forest (RF) classifiers are trained separately and then combined to form an RF ensemble that estimates each sample's category probabilities. Finally, these probabilities, along with the feature importance indicator output by the RF ensemble, are used to construct a fully connected conditional random field (FCCRF) graph model, by which the classification results are refined through mean-field based statistical inference. Experiments on the ISPRS Semantic Labeling Contest dataset show that our proposed three-stage method achieves 86.9% overall accuracy on the test data.

  9. Improved estimates of partial volume coefficients from noisy brain MRI using spatial context.

    PubMed

    Manjón, José V; Tohka, Jussi; Robles, Montserrat

    2010-11-01

    This paper addresses the problem of accurate voxel-level estimation of tissue proportions in the human brain magnetic resonance imaging (MRI). Due to the finite resolution of acquisition systems, MRI voxels can contain contributions from more than a single tissue type. The voxel-level estimation of this fractional content is known as partial volume coefficient estimation. In the present work, two new methods to calculate the partial volume coefficients under noisy conditions are introduced and compared with current similar methods. Concretely, a novel Markov Random Field model allowing sharp transitions between partial volume coefficients of neighbouring voxels and an advanced non-local means filtering technique are proposed to reduce the errors due to random noise in the partial volume coefficient estimation. In addition, a comparison was made to find out how the different methodologies affect the measurement of the brain tissue type volumes. Based on the obtained results, the main conclusions are that (1) both Markov Random Field modelling and non-local means filtering improved the partial volume coefficient estimation results, and (2) non-local means filtering was the better of the two strategies for partial volume coefficient estimation. Copyright 2010 Elsevier Inc. All rights reserved.
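
    The non-local means idea, denoising each sample as a weighted average of samples whose surrounding patches look similar, can be sketched in 1-D (a simplified generic illustration with made-up parameters, not the paper's MRI implementation):

```python
import math
import random

random.seed(7)

def nlm_1d(signal, patch_radius=2, h=0.3):
    """Non-local means: each value is replaced by an average of all
    values, weighted by the similarity of their surrounding patches."""
    n = len(signal)

    def patch(i):
        # patch around i, clamped at the signal boundaries
        return [signal[min(max(i + d, 0), n - 1)]
                for d in range(-patch_radius, patch_radius + 1)]

    out = []
    for i in range(n):
        pi = patch(i)
        total, acc = 0.0, 0.0
        for j in range(n):
            d2 = sum((a - b) ** 2 for a, b in zip(pi, patch(j))) / len(pi)
            w = math.exp(-d2 / (h * h))   # similar patches get high weight
            total += w
            acc += w * signal[j]
        out.append(acc / total)
    return out

# noisy step edge: 0 on the left half, 1 on the right half
clean = [0.0] * 32 + [1.0] * 32
noisy = [v + random.gauss(0.0, 0.2) for v in clean]
denoised = nlm_1d(noisy)

mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
```

    Because patches from opposite sides of the edge are dissimilar, the averaging does not blur across the step, which is the property that makes the filter attractive for partial volume estimation near tissue boundaries.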

  10. Bottom-up driven involuntary auditory evoked field change: constant sound sequencing amplifies but does not sharpen neural activity.

    PubMed

    Okamoto, Hidehiko; Stracke, Henning; Lagemann, Lothar; Pantev, Christo

    2010-01-01

    The capability of involuntarily tracking certain sound signals in the simultaneous presence of noise is essential in human daily life. Previous studies have demonstrated that top-down auditory focused attention can enhance excitatory and inhibitory neural activity, resulting in sharpening of the frequency tuning of auditory neurons. In the present study, we investigated bottom-up driven involuntary neural processing of sound signals in noisy environments by means of magnetoencephalography. We contrasted two sound signal sequencing conditions: "constant sequencing" versus "random sequencing." Based on a pool of 16 different frequencies, either identical (constant sequencing) or pseudorandomly chosen (random sequencing) test frequencies were presented blockwise together with band-eliminated noises to nonattending subjects. The results demonstrated that the auditory evoked fields elicited in the constant sequencing condition were significantly enhanced compared with those in the random sequencing condition. However, the enhancement did not differ significantly between band-eliminated noise conditions. Thus the present study confirms that, under nonattentive listening, constant sound signal sequencing can enhance, but not sharpen, neural activity in human auditory cortex. Our results indicate that bottom-up driven involuntary neural processing may mainly amplify excitatory neural networks, but may not effectively enhance inhibitory neural circuits.

  11. Charged-particle motion in multidimensional magnetic-field turbulence

    NASA Technical Reports Server (NTRS)

    Giacalone, J.; Jokipii, J. R.

    1994-01-01

    We present a new analysis of the fundamental physics of charged-particle motion in a turbulent magnetic field using a numerical simulation. The magnetic field fluctuations are taken to be static and to have a power spectrum which is Kolmogorov. The charged particles are treated as test particles. It is shown that when the field turbulence is independent of one coordinate (i.e., k lies in a plane), the motion of these particles across the magnetic field is essentially zero, as required by theory. Consequently, the only motion across the average magnetic field direction that is allowed is that due to field-line random walk. On the other hand, when a fully three-dimensional realization of the turbulence is considered, the particles readily cross the field. Transport coefficients both along and across the ambient magnetic field are computed. This scheme provides a direct computation of the Fokker-Planck coefficients based on the motions of individual particles, and allows for comparison with analytic theory.

  12. Effect of Time Interval Between Tumescent Local Anesthesia Infiltration and Start of Surgery on Operative Field Visibility in Hand Surgery Without Tourniquet.

    PubMed

    Bashir, Muhammad Mustehsan; Qayyum, Rehan; Saleem, Muhammad Hammad; Siddique, Kashif; Khan, Farid Ahmad

    2015-08-01

    To determine the optimal time interval between tumescent local anesthesia infiltration and the start of hand surgery without a tourniquet for improved operative field visibility. Patients aged 16 to 60 years who needed contracture release and tendon repair in the hand were enrolled from the outpatient clinic. Patients were randomized to 10-, 15-, or 25-minute intervals between infiltration of the tumescent anesthetic solution (0.18% lidocaine and 1:221,000 epinephrine) and the start of surgery. The end point of tumescent anesthetic infiltration was pale and firm skin. The surgical team was blinded to the time of anesthetic infiltration. At the completion of the procedure, the surgeon and the first assistant rated the operative field visibility as excellent, fair, or poor. We used logistic regression models with and without adjustment for confounding variables. Of the 75 patients enrolled in the study, 59 (79%) were males; 7 were randomized to the 10-minute interval (further randomization to this group was stopped after an interim analysis found consistently poor operative field visibility), and 34 were randomized to each of the 15- and 25-minute groups. Patients randomized to the 25-minute delay group had 29 times higher odds of having an excellent operative visual field than those randomized to the 15-minute delay group. After adjusting for age, sex, amount of tumescent solution infiltrated, and duration of operation, the odds ratio remained highly significant. We found that an interval of 25 minutes provides vastly superior operative field visibility; the 10-minute delay had the poorest results. Therapeutic I. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  13. EMR-based medical knowledge representation and inference via Markov random fields and distributed representation learning.

    PubMed

    Zhao, Chao; Jiang, Jingchi; Guan, Yi; Guo, Xitong; He, Bin

    2018-05-01

    Electronic medical records (EMRs) contain medical knowledge that can be used for clinical decision support (CDS). Our objective is to develop a general system that can extract and represent knowledge contained in EMRs to support three CDS tasks (test recommendation, initial diagnosis, and treatment plan recommendation) given the condition of a patient. We extracted four kinds of medical entities from records and constructed an EMR-based medical knowledge network (EMKN), in which nodes are entities and edges reflect their co-occurrence in a record. Three bipartite subgraphs (bigraphs) were extracted from the EMKN, one to support each task. One part of each bigraph was the given condition (e.g., symptoms), and the other was the condition to be inferred (e.g., diseases). Each bigraph was regarded as a Markov random field (MRF) to support the inference. We proposed three graph-based energy functions and three likelihood-based energy functions. Two of these functions are based on knowledge representation learning and can provide distributed representations of medical entities. Two EMR datasets and three metrics were utilized to evaluate the performance. As a whole, the evaluation results indicate that the proposed system outperformed the baseline methods. The distributed representation of medical entities does reflect similarity relationships at the knowledge level. Combining the EMKN and MRFs is an effective approach for general medical knowledge representation and inference. Different tasks, however, require individually designed energy functions. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Superlattices: problems and new opportunities, nanosolids

    PubMed Central

    2011-01-01

    Superlattices were introduced 40 years ago as man-made solids to enrich the class of materials for electronic and optoelectronic applications. The field metamorphosed to quantum wells and quantum dots, with ever-decreasing dimensions dictated by technological advancements in the nanometer regime. In recent years, the field has gone beyond semiconductors to metals and organic solids. A superlattice is simply a way of forming a uniform continuum for whatever purpose is at hand. There are problems with doping, defect-induced random switching, and I/O involving quantum dots. However, new opportunities in component-based nanostructures may lead the field to new heights. The all-important translational symmetry of solids is relaxed, and local symmetry is needed in nanosolids. PMID:21711653

  15. Margins of stability in young adults with traumatic transtibial amputation walking in destabilizing environments

    PubMed Central

    Beltran, Eduardo J.; Dingwell, Jonathan B.; Wilken, Jason M.

    2014-01-01

    Understanding how lower-limb amputation affects walking stability, specifically in destabilizing environments, is essential for developing effective interventions to prevent falls. This study quantified mediolateral margins of stability (MOS) and MOS sub-components in young individuals with traumatic unilateral transtibial amputation (TTA) and young able-bodied individuals (AB). Thirteen AB and nine TTA completed five 3-minute walking trials in a Computer Assisted Rehabilitation ENvironment (CAREN) system under each of three test conditions: no perturbations, pseudo-random mediolateral translations of the platform, and pseudo-random mediolateral translations of the visual field. Compared to the unperturbed trials, TTA exhibited increased mean MOS and MOS variability during platform and visual field perturbations (p < 0.010). Also, AB exhibited increased mean MOS during visual field perturbations and increased MOS variability during both platform and visual field perturbations (p < 0.050). During platform perturbations, TTA exhibited significantly greater values than AB for mean MOS (p < 0.050) and MOS variability (p < 0.050); variability of the lateral distance between the center of mass (COM) and base of support at initial contact (p < 0.005); mean and variability of the range of COM motion (p < 0.010); and variability of COM peak velocity (p < 0.050). As determined by mean MOS and MOS variability, young and otherwise healthy individuals with transtibial amputation achieved stability similar to that of their able-bodied counterparts during unperturbed and visually-perturbed walking. However, based on mean and variability of MOS, unilateral transtibial amputation was shown to have affected walking stability during platform perturbations. PMID:24444777

  16. The Effect of Topical Tranexamic Acid on Bleeding Reduction during Functional Endoscopic Sinus Surgery.

    PubMed

    Baradaranfar, Mohammad Hossein; Dadgarnia, Mohammad Hossein; Mahmoudi, Hossein; Behniafard, Nasim; Atighechi, Saeid; Zand, Vahid; Baradaranfar, Amin; Vaziribozorg, Sedighe

    2017-03-01

    Bleeding is a common concern during functional endoscopic sinus surgery (FESS) that can increase the risk of damage to adjacent vital structures by reducing the surgeon's field of view. This study aimed to explore the efficacy of topical tranexamic acid in reducing intraoperative bleeding. This double-blind, randomized clinical trial was conducted in 60 patients with chronic rhinosinusitis with polyposis (CRSwP) who underwent FESS. Patients were randomly divided into two groups: tranexamic or saline treatment. During surgery, normal saline (400 mL) or tranexamic acid (2 g) in normal saline with a total volume of 400 mL was used in the saline and tranexamic groups, respectively, for irrigation and suctioning. The surgeons' assessment of the field of view during surgery and intraoperative blood loss were recorded. Mean blood loss was 254.13 mL in the saline group and 235.6 mL in the tranexamic group (P=0.31). No statistically significant differences between the two groups were found in terms of other investigated variables, such as surgical field quality based on Boezaart's scale (P=0.30), surgeon satisfaction based on a Likert scale (P=0.54), or duration of surgery (P=0.22). Use of tranexamic acid (2 g in 400 mL normal saline) for washing of the nasal mucosa during FESS did not significantly reduce blood loss or improve the surgical field of view. Further studies with larger sample sizes and higher drug concentrations, and using other methods of administration, such as spraying or applying pledgets soaked in tranexamic acid, are recommended.

  17. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    PubMed

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.
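    The sampling scheme described, random selection proportional to population density with age/sex frequency matching to the case series and no individual-level sampling frame, can be sketched as a quota-filling walk over enumeration areas. All areas, quotas, and candidates below are hypothetical:

```python
import random

random.seed(1)

# Hypothetical enumeration areas with population counts (density proxy);
# no individual-level sampling frame is assumed.
areas = {"A": 5000, "B": 2000, "C": 3000}

# Age/sex quotas frequency-matched to the case series (hypothetical).
quotas = {("M", "15-34"): 2, ("F", "15-34"): 1, ("M", "35+"): 1}

def draw_area():
    # Probability proportional to population, approximating a geographical
    # distribution in proportion to population density.
    names, weights = zip(*areas.items())
    return random.choices(names, weights=weights, k=1)[0]

def select_controls(candidate_in_area):
    """Visit randomly chosen areas, keeping candidates until quotas fill."""
    controls, need = [], dict(quotas)
    while any(n > 0 for n in need.values()):
        area = draw_area()
        sex, age = candidate_in_area(area)
        if need.get((sex, age), 0) > 0:
            need[(sex, age)] -= 1
            controls.append((area, sex, age))
    return controls

# Hypothetical field encounter: each visit yields one random person.
def fake_candidate(area):
    return random.choice(["M", "F"]), random.choice(["15-34", "35+"])

controls = select_controls(fake_candidate)
print(len(controls))  # quotas sum to 4
```

In the field, `fake_candidate` would be replaced by the household-visit procedure; the quota logic is what enforces the frequency matching.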

  18. Systematic review on the health effects of exposure to radiofrequency electromagnetic fields from mobile phone base stations.

    PubMed

    Röösli, Martin; Frei, Patrizia; Mohler, Evelyn; Hug, Kerstin

    2010-12-01

    To review and evaluate the recent literature on the health effects of exposure to mobile phone base station (MPBS) radiation. We performed a systematic review of randomized human trials conducted in laboratory settings and of epidemiological studies that investigated the health effects of MPBS radiation in the everyday environment. We included in the analysis 17 articles that met our basic quality criteria: 5 randomized human laboratory trials and 12 epidemiological studies. The majority of the papers (14) examined self-reported non-specific symptoms of ill-health. Most of the randomized trials did not detect any association between MPBS radiation and the development of acute symptoms during or shortly after exposure. The sporadically observed associations did not show a consistent pattern with regard to symptoms or types of exposure. We also found that the more sophisticated the exposure assessment, the less likely it was that an effect would be reported. Studies on health effects other than non-specific symptoms and studies on MPBS exposure in children were scarce. The evidence for the absence of a relationship between MPBS exposure up to 10 volts per metre and acute symptom development can be considered strong because it is based on randomized, blinded human laboratory trials. At present, there are insufficient data to draw firm conclusions about health effects from long-term low-level exposure typically occurring in the everyday environment.

  19. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics

    NASA Astrophysics Data System (ADS)

    El Koussaifi, R.; Tikan, A.; Toffoli, A.; Randoux, S.; Suret, P.; Onorato, M.

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.
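    A partially coherent wave field of the kind described, a finite-band JONSWAP spectrum with independent uniform random Fourier phases, can be synthesized directly. The parameter values below are illustrative, not those of the wave-tank or fiber experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

def jonswap(f, fp=0.1, alpha=0.01, gamma=3.3, g=9.81):
    """JONSWAP frequency spectrum S(f) (illustrative parameter values)."""
    sigma = np.where(f <= fp, 0.07, 0.09)
    r = np.exp(-((f - fp) ** 2) / (2 * sigma**2 * fp**2))
    return (alpha * g**2 * (2 * np.pi) ** -4 * f**-5
            * np.exp(-1.25 * (fp / f) ** 4) * gamma**r)

# Partially coherent field: finite-band spectrum with random Fourier phases.
f = np.linspace(0.04, 0.4, 512)
df = f[1] - f[0]
amps = np.sqrt(2 * jonswap(f) * df)          # mode amplitudes from S(f)
phases = rng.uniform(0, 2 * np.pi, f.size)   # independent random phases

t = np.linspace(0, 600, 4096)
eta = (amps[:, None] * np.cos(2 * np.pi * f[:, None] * t + phases[:, None])).sum(0)

# Rogue events would appear as heavy tails (excess kurtosis) in eta.
print(eta.std(), eta.max() / eta.std())
```

Repeating this over many random phase draws and histogramming the envelope amplitude is how the heavy-tailed statistics discussed in the record are measured numerically.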

  20. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics.

    PubMed

    El Koussaifi, R; Tikan, A; Toffoli, A; Randoux, S; Suret, P; Onorato, M

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.

  1. Are gay men and lesbians discriminated against when applying for jobs? A four-city, Internet-based field experiment.

    PubMed

    Bailey, John; Wallace, Michael; Wright, Bradley

    2013-01-01

    An Internet-based field experiment was conducted to examine potential hiring discrimination based on sexual orientation; specifically, the "first contact" between job applicants and employers was looked at. In response to Internet job postings on CareerBuilder.com®, more than 4,600 resumes were sent to employers in 4 U.S. cities: Philadelphia, Chicago, Dallas, and San Francisco. The resumes varied randomly with regard to gender, implied sexual orientation, and other characteristics. Two hypotheses were tested: first, that employers' response rates vary by the applicants' assumed sexuality; and second, that employers' response rates by sexuality vary by city. Effects of city were controlled for to hold constant any variation in labor market conditions in the 4 cities. Based on employer responses to the applications, it was concluded that there is no evidence that gay men or lesbians are discriminated against in their first encounter with employers, and no significant variation across cities in these encounters was found. Implications of these results for the literature on hiring discrimination based on sexual orientation, the strengths and limitations of the research, and the potential for the Internet-based field experiment design in future studies of discrimination are discussed.

  2. Statistical characteristics of trajectories of diamagnetic unicellular organisms in a magnetic field.

    PubMed

    Gorobets, Yu I; Gorobets, O Yu

    2015-01-01

    A statistical model is proposed in this paper for describing the orientation of trajectories of unicellular diamagnetic organisms in a magnetic field. A statistical parameter, the effective energy, is calculated on the basis of this model. The resulting effective energy is a statistical characteristic of the trajectories of diamagnetic microorganisms in a magnetic field connected with their metabolism. The statistical model is applicable when the energy of the thermal motion of the bacteria is negligible in comparison with their energy in a magnetic field and the bacteria exhibit significant "active random movement", i.e. randomizing motion of a non-thermal nature, for example, movement by means of flagella. The energy of this randomizing active self-motion is characterized by a new statistical parameter for biological objects, which replaces the energy of randomizing thermal motion in the calculation of the statistical distribution. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Choosing a Transformation in Analyses of Insect Counts from Contagious Distributions with Low Means

    Treesearch

    W.D. Pepper; S.J. Zarnoch; G.L. DeBarr; P. de Groot; C.D. Tangren

    1997-01-01

    Guidelines based on computer simulation are suggested for choosing a transformation of insect counts from negative binomial distributions with low mean counts and high levels of contagion. Typical values and ranges of negative binomial model parameters were determined by fitting the model to data from 19 entomological field studies. Random sampling of negative binomial...
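    The kind of simulation the study describes, drawing counts from a negative binomial (contagious) distribution with a low mean and checking how a transformation stabilizes variance, can be sketched as follows. The parameter values and the log(x + 0.5) transformation are illustrative choices, not the guideline values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def nb_counts(mean, k, size):
    """Negative binomial with mean `mean` and dispersion k (var = mean + mean^2/k);
    small k means high contagion."""
    p = k / (k + mean)
    return rng.negative_binomial(k, p, size)

low, high = nb_counts(2.0, 0.5, 20000), nb_counts(8.0, 0.5, 20000)

# A common variance-stabilizing choice for contagious counts with low means:
t_low, t_high = np.log(low + 0.5), np.log(high + 0.5)

# Raw variances grow rapidly with the mean; transformed ones are much closer.
print(low.var(), high.var())
print(t_low.var(), t_high.var())
```

Comparing how the ratio of variances across means shrinks after transformation is the basic criterion such simulation guidelines are built on.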

  4. Effects of Automobile Commute Characteristics on Affect and Job Candidate Evaluations: A Field Experiment

    ERIC Educational Resources Information Center

    Van Rooy, David L.

    2006-01-01

    The current study assesses the effects of the commuting environment on affective states and hiring decisions. A total of 136 undergraduate females were randomly assigned to one of four conditions based on the length (10 km vs. 30 km) and level of congestion (low vs. high) during a commute. Multivariate analyses of variance indicate that affective…

  5. The Role of Between-Case Effect Size in Conducting, Interpreting, and Summarizing Single-Case Research. NCER 2015-002

    ERIC Educational Resources Information Center

    Shadish, William R.; Hedges, Larry V.; Horner, Robert H.; Odom, Samuel L.

    2015-01-01

    The field of education is increasingly committed to adopting evidence-based practices. Although randomized experimental designs provide strong evidence of the causal effects of interventions, they are not always feasible. For example, depending upon the research question, it may be difficult for researchers to find the number of children necessary…

  6. Subject-Adaptive Real-Time Sleep Stage Classification Based on Conditional Random Field

    PubMed Central

    Luo, Gang; Min, Wanli

    2007-01-01

    Sleep staging is the pattern recognition task of classifying sleep recordings into sleep stages. This task is one of the most important steps in sleep analysis. It is crucial for the diagnosis and treatment of various sleep disorders, and also relates closely to brain-machine interfaces. We report an automatic, online sleep stager using the electroencephalogram (EEG) signal, based on a recently developed statistical pattern recognition method, the conditional random field (CRF), and novel potential functions that have explicit physical meanings. Using sleep recordings from human subjects, we show that the average classification accuracy of our sleep stager approaches the theoretical limit and is about 8% higher than that of existing systems. Moreover, for a new subject s_new with limited training data D_new, we perform subject adaptation to improve classification accuracy. Our idea is to use the knowledge learned from old subjects to obtain from D_new a regulated estimate of the CRF's parameters. Using sleep recordings from human subjects, we show that even without any D_new, our sleep stager can achieve an average classification accuracy of 70% on s_new. This accuracy increases with the size of D_new and eventually becomes close to the theoretical limit. PMID:18693884
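    Decoding the most likely stage sequence in a linear-chain CRF is typically done with the Viterbi algorithm. A minimal sketch with hand-set, hypothetical transition and emission log-potentials (a real stager would learn these parameters from labeled EEG recordings):

```python
import numpy as np

# Hypothetical stage set and hand-set potentials.
stages = ["Wake", "REM", "NREM"]

# transition[i, j]: log-potential for moving from stage i to stage j.
transition = np.log(np.array([
    [0.8, 0.1, 0.1],
    [0.1, 0.7, 0.2],
    [0.05, 0.15, 0.8],
]))

def viterbi(emission_logp):
    """MAP stage sequence for a linear-chain CRF (Viterbi algorithm).
    emission_logp: (T, n_stages) log-potentials from EEG features."""
    T, S = emission_logp.shape
    score = np.zeros((T, S))
    back = np.zeros((T, S), dtype=int)
    score[0] = emission_logp[0]
    for t in range(1, T):
        # cand[i, j]: best score ending in j at t, coming from i at t-1.
        cand = score[t - 1][:, None] + transition + emission_logp[t][None, :]
        back[t] = cand.argmax(0)
        score[t] = cand.max(0)
    path = [int(score[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return [stages[s] for s in reversed(path)]

# Toy emissions: clearly Wake, ambiguous, clearly NREM.
em = np.log(np.array([[0.90, 0.05, 0.05],
                      [0.40, 0.30, 0.30],
                      [0.05, 0.05, 0.90]]))
print(viterbi(em))
```

Note how the sticky transition potentials resolve the ambiguous middle epoch in favor of staying in Wake; this temporal smoothing is the main advantage a CRF has over classifying each epoch independently.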

  7. Capacitorless one-transistor dynamic random-access memory based on asymmetric double-gate Ge/GaAs-heterojunction tunneling field-effect transistor with n-doped boosting layer and drain-underlap structure

    NASA Astrophysics Data System (ADS)

    Yoon, Young Jun; Seo, Jae Hwa; Kang, In Man

    2018-04-01

    In this work, we present a capacitorless one-transistor dynamic random-access memory (1T-DRAM) based on an asymmetric double-gate Ge/GaAs-heterojunction tunneling field-effect transistor (TFET) for DRAM applications. An n-doped boosting layer and a gate2 drain-underlap structure are employed in the device to obtain excellent 1T-DRAM performance. The n-doped layer inserted between the source and channel regions improves the sensing margin because of a high rate of increase in the band-to-band tunneling (BTBT) probability. Furthermore, because the gate2 drain-underlap structure reduces the recombination rate between the gate2 and drain regions, a device with a gate2 drain-underlap length (L_G2_D-underlap) of 10 nm exhibited longer retention. As a result, by applying the n-doped layer and gate2 drain-underlap structure, the proposed device exhibited not only a high sensing margin of 1.11 µA/µm but also a long retention time of greater than 100 ms at a temperature of 358 K (85 °C).

  8. Robust foreground detection: a fusion of masked grey world, probabilistic gradient information and extended conditional random field approach.

    PubMed

    Zulkifley, Mohd Asyraf; Moran, Bill; Rawlinson, David

    2012-01-01

    Foreground detection has been used extensively in many applications such as people counting, traffic monitoring and face recognition. However, most of the existing detectors can only work under limited conditions. This happens because of the inability of the detector to distinguish foreground and background pixels, especially in complex situations. Our aim is to improve the robustness of foreground detection under sudden and gradual illumination change, colour similarity issue, moving background and shadow noise. Since it is hard to achieve robustness using a single model, we have combined several methods into an integrated system. The masked grey world algorithm is introduced to handle sudden illumination change. Colour co-occurrence modelling is then fused with the probabilistic edge-based background modelling. Colour co-occurrence modelling is good in filtering moving background and robust to gradual illumination change, while an edge-based modelling is used for solving a colour similarity problem. Finally, an extended conditional random field approach is used to filter out shadow and afterimage noise. Simulation results show that our algorithm performs better compared to the existing methods, which makes it suitable for higher-level applications.

  9. Distributed memory parallel Markov random fields using graph partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinemann, C.; Perciano, T.; Ushizima, D.

    Markov random field (MRF)-based algorithms have attracted a large amount of interest in image analysis due to their ability to exploit contextual information about data. Image data generated by experimental facilities, though, continues to grow larger and more complex, making it more difficult to analyze in a reasonable amount of time. Applying image processing algorithms to large datasets requires alternative approaches to circumvent performance problems. Aiming to provide scientists with a new tool to recover valuable information from such datasets, we developed a general-purpose distributed memory parallel MRF-based image analysis framework (MPI-PMRF). MPI-PMRF overcomes performance and memory limitations by distributing data and computations across processors. The proposed approach was successfully tested with synthetic and experimental datasets. Additionally, the performance of the MPI-PMRF framework is analyzed through a detailed scalability study. We show that a performance increase is obtained while maintaining an accuracy of the segmentation results higher than 98%. The contributions of this paper are: (a) development of a distributed memory MRF framework; (b) measurement of the performance increase of the proposed approach; (c) verification of segmentation accuracy in both synthetic and experimental, real-world datasets.

  10. Microcredit, family planning programs, and contraceptive behavior: evidence from a field experiment in Ethiopia.

    PubMed

    Desai, Jaikishan; Tarozzi, Alessandro

    2011-05-01

    The impact of community-based family planning programs and access to credit on contraceptive use, fertility, and family size preferences has not been established conclusively in the literature. We provide additional evidence on the possible effect of such programs by describing the results of a randomized field experiment whose main purpose was to increase the use of contraceptive methods in rural areas of Ethiopia. In the experiment, administrative areas were randomly allocated to one of three intervention groups or to a fourth control group. In the first intervention group, both credit and family planning services were provided and the credit officers also provided information on family planning. Only credit or family planning services, but not both, were provided in the other two intervention groups, while areas in the control group received neither type of service. Using pre- and post-intervention surveys, we find that neither type of program, combined or in isolation, led to an increase in contraceptive use that is significantly greater than that observed in the control group. We conjecture that the lack of impact has much to do with the mismatch between women's preferred contraceptive method (injectables) and the contraceptives provided by community-based agents (pills and condoms).

  11. Interferometric synthetic aperture radar phase unwrapping based on sparse Markov random fields by graph cuts

    NASA Astrophysics Data System (ADS)

    Zhou, Lifan; Chai, Dengfeng; Xia, Yu; Ma, Peifeng; Lin, Hui

    2018-01-01

    Phase unwrapping (PU) is one of the key processes in reconstructing the digital elevation model of a scene from its interferometric synthetic aperture radar (InSAR) data. It is known that two-dimensional (2-D) PU problems can be formulated as maximum a posteriori estimation of Markov random fields (MRFs). However, because the traditional MRF algorithm is usually defined on a rectangular grid, it fails easily if large parts of the wrapped data are dominated by noise caused by large low-coherence areas or rapid topography variation. A PU solution based on sparse MRF is presented to extend the traditional MRF algorithm to deal with sparse data, which allows the unwrapping of InSAR data dominated by high phase noise. To speed up the graph cuts algorithm for sparse MRF, we designed dual elementary graphs and merged them to obtain the Delaunay triangle graph, which is used to minimize the energy function efficiently. The experiments on simulated and real data, compared with other existing algorithms, both confirm the effectiveness of the proposed MRF approach, which suffers less from decorrelation effects caused by large low-coherence areas or rapid topography variation.

  12. Performance Enhancement of the RatCAP Awake Rat Brain PET System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaska, P.; Woody, C.

    The first full prototype of the RatCAP PET system, designed to image the brain of a rat while conscious, has been completed. Initial results demonstrated excellent spatial resolution, 1.8 mm FWHM with filtered backprojection and <1.5 mm FWHM with a Monte Carlo based MLEM method. However, noise equivalent count rate studies indicated the need for better timing to mitigate the effect of randoms. Thus, the front-end ASIC has been redesigned to minimize time walk, an accurate coincidence time alignment method has been implemented, and a variance reduction technique for the randoms is being developed. To maximize the quantitative capabilities required for neuroscience, corrections are being implemented and validated for positron range and photon noncollinearity, scatter (including outside the field of view), attenuation, randoms, and detector efficiency (deadtime is negligible). In addition, a more robust and compact PCI-based optical data acquisition system has been built to replace the original VME-based system while retaining the Linux-based data processing and image reconstruction codes. Finally, a number of new animal imaging experiments have been carried out to demonstrate the performance of the RatCAP in real imaging situations, including an F-18 fluoride bone scan, a C-11 raclopride scan, and a dynamic C-11 methamphetamine scan.

  13. Near-Field, On-Chip Optical Brownian Ratchets.

    PubMed

    Wu, Shao-Hua; Huang, Ningfeng; Jaquay, Eric; Povinelli, Michelle L

    2016-08-10

    Nanoparticles in aqueous solution are subject to collisions with solvent molecules, resulting in random, Brownian motion. By breaking the spatiotemporal symmetry of the system, the motion can be rectified. In nature, Brownian ratchets leverage thermal fluctuations to provide directional motion of proteins and enzymes. In man-made systems, Brownian ratchets have been used for nanoparticle sorting and manipulation. Implementations based on optical traps provide a high degree of tunability along with precise spatiotemporal control. Here, we demonstrate an optical Brownian ratchet based on the near-field traps of an asymmetrically patterned photonic crystal. The system yields over 25 times greater trap stiffness than conventional optical tweezers. Our technique opens up new possibilities for particle manipulation in a microfluidic, lab-on-chip environment.

  14. Quantum interference magnetoconductance of polycrystalline germanium films in the variable-range hopping regime

    NASA Astrophysics Data System (ADS)

    Li, Zhaoguo; Peng, Liping; Zhang, Jicheng; Li, Jia; Zeng, Yong; Zhan, Zhiqiang; Wu, Weidong

    2018-06-01

    Direct evidence of quantum interference magnetotransport in polycrystalline germanium films in the variable-range hopping (VRH) regime is reported. The temperature dependence of the conductivity of the germanium films follows the Mott VRH mechanism with the form of ? in the low-temperature regime (?). For the magnetotransport behaviour of our germanium films in the VRH regime, a crossover from negative magnetoconductance at low field to positive magnetoconductance at high field is observed when the zero-field conductivity is higher than the critical value (?). In the regime of ?, the magnetoconductance is positive and quadratic in the field for some germanium films. These features are in agreement with VRH magnetotransport theory based on the quantum interference effect among random paths in the hopping process.

  15. Bayesian estimation of Karhunen–Loève expansions; A random subspace approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chowdhary, Kenny; Najm, Habib N.

    One of the most widely used statistical procedures for dimensionality reduction of high-dimensional random fields is Principal Component Analysis (PCA), which is based on the Karhunen-Loève expansion (KLE) of a stochastic process with finite variance. The KLE is analogous to a Fourier series expansion for a random process, where the goal is to find an orthogonal transformation for the data such that the projection of the data onto this orthogonal subspace is optimal in the L2 sense, i.e., it minimizes the mean square error. In practice, this orthogonal transformation is determined by performing an SVD (Singular Value Decomposition) on the sample covariance matrix or on the data matrix itself. Sampling error is typically ignored when quantifying the principal components, or, equivalently, the basis functions of the KLE. Furthermore, it is exacerbated when the sample size is much smaller than the dimension of the random field. In this paper, we introduce a Bayesian KLE procedure, allowing one to obtain a probabilistic model on the principal components, which can account for inaccuracies due to limited sample size. The probabilistic model is built via Bayesian inference, from which the posterior becomes the matrix Bingham density over the space of orthonormal matrices. We use a modified Gibbs sampling procedure to sample on this space and then build probabilistic Karhunen-Loève expansions over random subspaces to obtain a set of low-dimensional surrogates of the stochastic process. We illustrate this probabilistic procedure with a finite-dimensional stochastic process inspired by Brownian motion.
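    The classical, non-Bayesian baseline the paper builds on (KLE/PCA via an SVD of the centered data matrix, with sampling error ignored) can be sketched in a few lines; the Brownian-motion-like test field is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic realizations of a 1-D random field (Brownian-motion-like);
# rows = samples, columns = field values on a grid.
n_samples, n_grid = 200, 100
X = rng.standard_normal((n_samples, n_grid)).cumsum(axis=1)

# Classical KLE/PCA: SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
eigvals = s**2 / (n_samples - 1)   # eigenvalues of the sample covariance
modes = Vt                         # orthonormal KLE basis functions (rows)

# Truncate: keep the fewest modes explaining 95% of the variance; the
# projection onto this subspace is L2-optimal among linear reductions.
k = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.95)) + 1
X_hat = Xc @ modes[:k].T @ modes[:k] + X.mean(axis=0)

rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(k, rel_err)
```

The Bayesian procedure in the record replaces the point estimate `modes` with a posterior distribution over orthonormal matrices, which matters precisely when `n_samples` is small relative to `n_grid`.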

  16. Bayesian estimation of Karhunen–Loève expansions; A random subspace approach

    DOE PAGES

    Chowdhary, Kenny; Najm, Habib N.

    2016-04-13

    One of the most widely used statistical procedures for dimensionality reduction of high-dimensional random fields is Principal Component Analysis (PCA), which is based on the Karhunen-Loève expansion (KLE) of a stochastic process with finite variance. The KLE is analogous to a Fourier series expansion for a random process, where the goal is to find an orthogonal transformation for the data such that the projection of the data onto this orthogonal subspace is optimal in the L2 sense, i.e., it minimizes the mean square error. In practice, this orthogonal transformation is determined by performing an SVD (Singular Value Decomposition) on the sample covariance matrix or on the data matrix itself. Sampling error is typically ignored when quantifying the principal components, or, equivalently, the basis functions of the KLE. Furthermore, it is exacerbated when the sample size is much smaller than the dimension of the random field. In this paper, we introduce a Bayesian KLE procedure, allowing one to obtain a probabilistic model on the principal components, which can account for inaccuracies due to limited sample size. The probabilistic model is built via Bayesian inference, from which the posterior becomes the matrix Bingham density over the space of orthonormal matrices. We use a modified Gibbs sampling procedure to sample on this space and then build probabilistic Karhunen-Loève expansions over random subspaces to obtain a set of low-dimensional surrogates of the stochastic process. We illustrate this probabilistic procedure with a finite-dimensional stochastic process inspired by Brownian motion.

  17. Diffusion in the presence of a local attracting factor: Theory and interdisciplinary applications.

    PubMed

    Veermäe, Hardi; Patriarca, Marco

    2017-06-01

    In many complex diffusion processes the drift of random walkers is not caused by an external force, as in the case of Brownian motion, but by local variations of fitness perceived by the random walkers. In this paper, a simple but general framework is presented that describes such a type of random motion and may be of relevance in different problems, such as opinion dynamics, cultural spreading, and animal movement. To this aim, we study the problem of a random walker in d dimensions moving in the presence of a local heterogeneous attracting factor expressed in terms of an assigned position-dependent "attractiveness function." At variance with standard Brownian motion, the attractiveness function introduced here regulates both the advection and diffusion of the random walker, thus providing testable predictions for a specific form of fluctuation relations. We discuss the relation between the drift-diffusion equation based on the attractiveness function and that describing standard Brownian motion, and we provide some explicit examples illustrating its relevance in different fields, such as animal movement, chemotactic diffusion, and social dynamics.
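    The core mechanism, a walker whose step preferences follow an assigned attractiveness function rather than an external force, can be illustrated with a 1-D lattice simulation. The Gaussian attractiveness profile and the neighbor-weighted step rule below are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical position-dependent attractiveness on a 1-D lattice,
# peaked at the center; the walker senses it only locally.
n = 101
x = np.arange(n)
A = np.exp(-((x - 50) ** 2) / (2 * 10.0**2))

def step(pos):
    """Move left or right with probability proportional to the local
    attractiveness of each neighbor (reflecting boundaries)."""
    left, right = max(pos - 1, 0), min(pos + 1, n - 1)
    w_left = A[left] / (A[left] + A[right])
    return left if rng.random() < w_left else right

pos, visits = 5, np.zeros(n)
for _ in range(200_000):
    pos = step(pos)
    visits[pos] += 1

# The empirical occupancy concentrates around the attractiveness peak,
# even though no external force acts on the walker.
print(int(visits.argmax()), visits[40:61].sum() / visits.sum())
```

Because the same function A both biases the step direction (advection) and shapes where steps are taken (diffusion), the stationary occupancy differs from that of a Brownian walker in an external potential, which is the distinction the paper formalizes.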

  18. Two-Component Structure in the Entanglement Spectrum of Highly Excited States

    NASA Astrophysics Data System (ADS)

    Yang, Zhi-Cheng; Chamon, Claudio; Hamma, Alioscia; Mucciolo, Eduardo R.

    2015-12-01

    We study the entanglement spectrum of highly excited eigenstates of two known models that exhibit a many-body localization transition, namely the one-dimensional random-field Heisenberg model and the quantum random energy model. Our results indicate that the entanglement spectrum shows a "two-component" structure: a universal part that is associated with random matrix theory, and a nonuniversal part that is model dependent. The nonuniversal part manifests the deviation of the highly excited eigenstate from a true random state even in the thermalized phase where the eigenstate thermalization hypothesis holds. The fraction of the spectrum containing the universal part decreases as one approaches the critical point and vanishes in the localized phase in the thermodynamic limit. We use the universal part fraction to construct an order parameter for measuring the degree of randomness of a generic highly excited state, which is also a promising candidate for studying the many-body localization transition. Two toy models based on Rokhsar-Kivelson type wave functions are constructed and their entanglement spectra are shown to exhibit the same structure.

  19. Creating, generating and comparing random network models with NetworkRandomizer.

    PubMed

    Tosadori, Gabriele; Bestvina, Ivan; Spoto, Fausto; Laudanna, Carlo; Scardoni, Giovanni

    2016-01-01

    Biological networks are becoming a fundamental tool for the investigation of high-throughput data in several fields of biology and biotechnology. With the increasing amount of information, network-based models are gaining more and more interest, and new techniques are required to mine the information and to validate the results. To fill this validation gap we present an app for the Cytoscape platform that creates randomised networks and randomises existing, real networks. Since there is a lack of tools for performing such operations, our app enables researchers to exploit different, well-known random network models as benchmarks for validating real, biological datasets. We also propose a novel methodology for creating random weighted networks, the multiplication algorithm, starting from real, quantitative data. Finally, the app provides a statistical tool that compares real versus randomly computed attributes in order to validate the numerical findings. In summary, our app aims at establishing a standardised methodology for the validation of results in the context of the Cytoscape platform.
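    The validation idea, comparing an observed network statistic against the distribution of the same statistic over an ensemble of randomized networks, can be sketched outside Cytoscape. The Erdős-Rényi null model and the mean-degree statistic below are illustrative choices, not the app's own models or API:

```python
import random

def erdos_renyi(n, p, rng):
    """Generate an Erdos-Renyi G(n, p) random graph as a set of edges."""
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                edges.add((i, j))
    return edges

def mean_degree(n, edges):
    return 2 * len(edges) / n

# Null-model validation: build many randomized networks and record the
# statistic of interest, yielding a benchmark distribution to compare
# a real network's value against.
rng = random.Random(42)
n, p = 60, 0.1
null = [mean_degree(n, erdos_renyi(n, p, rng)) for _ in range(200)]
expected = p * (n - 1)   # analytic mean degree of G(n, p)
empirical = sum(null) / len(null)
```

    A real network whose statistic falls far in the tail of the null distribution is then unlikely to be explained by the random model alone.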

  20. Diffusion in the presence of a local attracting factor: Theory and interdisciplinary applications

    NASA Astrophysics Data System (ADS)

    Veermäe, Hardi; Patriarca, Marco

    2017-06-01

    In many complex diffusion processes the drift of random walkers is not caused by an external force, as in the case of Brownian motion, but by local variations of fitness perceived by the random walkers. In this paper, a simple but general framework is presented that describes this type of random motion and may be of relevance in different problems, such as opinion dynamics, cultural spreading, and animal movement. To this aim, we study the problem of a random walker in d dimensions moving in the presence of a local heterogeneous attracting factor expressed in terms of an assigned position-dependent "attractiveness function." At variance with standard Brownian motion, the attractiveness function introduced here regulates both the advection and diffusion of the random walker, thus providing testable predictions for a specific form of fluctuation relations. We discuss the relation between the drift-diffusion equation based on the attractiveness function and that describing standard Brownian motion, and we provide some explicit examples illustrating its relevance in different fields, such as animal movement, chemotactic diffusion, and social dynamics.

  1. Corrected Mean-Field Model for Random Sequential Adsorption on Random Geometric Graphs

    NASA Astrophysics Data System (ADS)

    Dhara, Souvik; van Leeuwaarden, Johan S. H.; Mukherjee, Debankur

    2018-03-01

    A notorious problem in mathematics and physics is to create a solvable model for random sequential adsorption of non-overlapping congruent spheres in the d-dimensional Euclidean space with d ≥ 2. Spheres arrive sequentially at uniformly chosen locations in space and are accepted only when there is no overlap with previously deposited spheres. Due to spatial correlations, characterizing the fraction of accepted spheres remains largely intractable. We study this fraction by taking a novel approach that compares random sequential adsorption in Euclidean space to the nearest-neighbor blocking on a sequence of clustered random graphs. This random network model can be thought of as a corrected mean-field model for the interaction graph between the attempted spheres. Using functional limit theorems, we characterize the fraction of accepted spheres and its fluctuations.
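    The process being modeled is easy to simulate directly, even though it is analytically intractable: uniform arrivals, rejection on overlap. A brute-force Monte Carlo sketch in d = 2 with assumed parameter values (not from the paper):

```python
import random

def rsa_fraction(n_attempts, radius, rng):
    """Random sequential adsorption of equal disks in the unit square:
    each arriving disk is accepted only if its center is at least one
    diameter away from every previously accepted disk."""
    accepted = []
    for _ in range(n_attempts):
        x, y = rng.random(), rng.random()
        if all((x - a) ** 2 + (y - b) ** 2 >= (2 * radius) ** 2
               for a, b in accepted):
            accepted.append((x, y))
    return len(accepted) / n_attempts

rng = random.Random(1)
frac = rsa_fraction(n_attempts=2000, radius=0.05, rng=rng)
```

    As attempts accumulate, the accepted fraction decays toward zero while the deposit approaches its jammed state; it is exactly this fraction whose law the corrected mean-field model characterizes.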

  2. Multi-field inflation with a random potential

    NASA Astrophysics Data System (ADS)

    Tye, S.-H. Henry; Xu, Jiajun; Zhang, Yang

    2009-04-01

    Motivated by the possibility of inflation in the cosmic landscape, which may be approximated by a complicated potential, we study the density perturbations in multi-field inflation with a random potential. The random potential causes the inflaton to undergo a Brownian-like motion with a drift in the D-dimensional field space, allowing entropic perturbation modes to continuously and randomly feed into the adiabatic mode. To quantify such an effect, we employ a stochastic approach to evaluate the two-point and three-point functions of primordial perturbations. We find that in the weakly random scenario where the stochastic scatterings are frequent but mild, the resulting power spectrum resembles that of the single field slow-roll case, with up to 2% more red tilt. The strongly random scenario, in which the coarse-grained motion of the inflaton is significantly slowed down by the scatterings, leads to rich phenomenologies. The power spectrum exhibits primordial fluctuations on all angular scales. Such features may already be hiding in the error bars of observed CMB TT (as well as TE and EE) power spectrum and have been smoothed out by binning of data points. With more data coming in the future, we expect these features can be detected or falsified. On the other hand the tensor power spectrum itself is free of fluctuations and the tensor to scalar ratio is enhanced by the large ratio of the Brownian-like motion speed over the drift speed. In addition a large negative running of the power spectral index is possible. Non-Gaussianity is generically suppressed by the growth of adiabatic perturbations on super-horizon scales, and is negligible in the weakly random scenario. However, non-Gaussianity can possibly be enhanced by resonant effects in the strongly random scenario or arise from the entropic perturbations during the onset of (p)reheating if the background inflaton trajectory exhibits particular properties. The formalism developed in this paper can be applied to a wide class of multi-field inflation models including, e.g. the N-flation scenario.

  3. Rational group decision making: A random field Ising model at T = 0

    NASA Astrophysics Data System (ADS)

    Galam, Serge

    1997-02-01

    A modified version of a finite random field Ising ferromagnetic model in an external magnetic field at zero temperature is presented to describe group decision making. Fields may have a non-zero average. A postulate of minimum inter-individual conflict is assumed. Interactions then produce a group polarization along one choice, which is, however, randomly selected. A small external social pressure is shown to have a drastic effect on the polarization. Individual biases related to personal backgrounds, cultural values and past experiences are introduced via quenched local competing fields. They are shown to be instrumental in generating a larger spectrum of collective new choices beyond the initial ones. In particular, compromise is found to result from the existence of individual competing biases. Conflict is shown to weaken group polarization. The model yields new psychosociological insights about consensus and compromise in groups.
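    Zero-temperature dynamics of this kind are straightforward to simulate: each agent repeatedly aligns with the sum of neighbor coupling, its quenched individual bias, and the external social pressure until no agent wants to flip. The ring topology, coupling J, bias spread, and pressure H below are assumed for illustration, not taken from the paper:

```python
import random

def relax_rfim(spins, fields, J, H, max_sweeps=200):
    """Zero-temperature random-field Ising dynamics on a ring: each spin
    aligns with its local field (neighbors, quenched bias, external
    pressure) until a fixed point is reached."""
    n = len(spins)
    for _ in range(max_sweeps):
        flipped = False
        for i in range(n):
            local = J * (spins[i - 1] + spins[(i + 1) % n]) + fields[i] + H
            want = 1 if local >= 0 else -1
            if want != spins[i]:
                spins[i] = want
                flipped = True
        if not flipped:          # no spin wants to flip: minimum conflict
            break
    return spins

rng = random.Random(7)
n = 100
spins = [rng.choice([-1, 1]) for _ in range(n)]
fields = [rng.gauss(0.0, 0.2) for _ in range(n)]   # weak quenched biases
final = relax_rfim(spins, fields, J=1.0, H=0.05)
magnetization = sum(final) / n
```

    With weak biases the group polarizes into large aligned domains; stronger competing biases fragment the final state, which is the compromise mechanism the abstract describes.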

  4. Random-anisotropy model: Monotonic dependence of the coercive field on D/J

    NASA Astrophysics Data System (ADS)

    Saslow, W. M.; Koon, N. C.

    1994-02-01

    We present the results of a numerical study of the zero-temperature remanence and coercivity for the random anisotropy model (RAM), showing that, contrary to early calculations for this model, the coercive field increases monotonically with the strength D of the random anisotropy relative to the strength J of the exchange. Local-field adjustments with and without spin flips are considered. Convergence is difficult to obtain for small values of the anisotropy, suggesting that this is the likely source of the nonmonotonic behavior found in earlier studies. For both large and small anisotropy, each spin undergoes about one flip per hysteresis cycle, and about half of the spin flips occur in the vicinity of the coercive field. When only non-spin-flip adjustments are considered, at large anisotropy the coercivity is proportional to the anisotropy. At small anisotropy, the rate of convergence is comparable to that when spin flips are included.

  5. Effect of magnetic helicity upon rectilinear propagation of charged particles in random magnetic fields

    NASA Technical Reports Server (NTRS)

    Earl, James A.

    1992-01-01

    When charged particles spiral along a large constant magnetic field, their trajectories are scattered by any random field components that are superposed on the guiding field. If the random field configuration embodies helicity, the scattering is asymmetrical with respect to a plane perpendicular to the guiding field, for particles moving into the forward hemisphere are scattered at different rates from those moving into the backward hemisphere. This asymmetry gives rise to new terms in the transport equations that describe propagation of charged particles. Helicity has virtually no impact on qualitative features of the diffusive mode of propagation. However, characteristic velocities of the coherent modes that appear after a highly anisotropic injection exhibit an asymmetry related to helicity. Explicit formulas, which embody the effects of helicity, are given for the anisotropies, the diffusion coefficient, and the coherent velocities. Predictions derived from these expressions are in good agreement with Monte Carlo simulations of particle transport, but the simulations reveal certain phenomena whose explanation calls for further analytical work.

  6. The spectral expansion of the elasticity random field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malyarenko, Anatoliy; Ostoja-Starzewski, Martin

    2014-12-10

    We consider a deformable body that occupies a region D in the plane. In our model, the body's elasticity tensor H(x) is the restriction to D of a second-order mean-square continuous random field. Under translation, the expected value and the correlation tensor of the field H(x) do not change. Under the action of an arbitrary element k of the orthogonal group O(2), they transform according to the reducible orthogonal representation k ↦ S²(S²(k)) of the above group. We find the spectral expansion of the correlation tensor R(x) of the elasticity field as well as the expansion of the field itself in terms of stochastic integrals with respect to a family of orthogonal scattered random measures.

  7. Random field assessment of nanoscopic inhomogeneity of bone.

    PubMed

    Dong, X Neil; Luo, Qing; Sparkman, Daniel M; Millwater, Harry R; Wang, Xiaodu

    2010-12-01

    Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic details. Copyright © 2010 Elsevier Inc. All rights reserved.
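    The basic construction used here, a Gaussian random field with an exponential covariance controlled by a correlation length, can be sampled directly by Cholesky factorization of the covariance matrix. The 1-D grid and parameter values below are illustrative, not those fitted to the bone data:

```python
import numpy as np

def exponential_covariance(points, sigma2, corr_len):
    """C(x, x') = sigma^2 * exp(-|x - x'| / corr_len) on a 1-D grid."""
    d = np.abs(points[:, None] - points[None, :])
    return sigma2 * np.exp(-d / corr_len)

def sample_field(points, sigma2, corr_len, rng):
    """Draw one realization of the Gaussian random field via Cholesky
    factorization of the covariance matrix."""
    C = exponential_covariance(points, sigma2, corr_len)
    # Small diagonal jitter keeps the factorization numerically stable.
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(points)))
    return L @ rng.standard_normal(len(points))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
field = sample_field(x, sigma2=1.0, corr_len=2.0, rng=rng)
```

    A short correlation length produces rapid spatial fluctuation of the simulated modulus; a long one produces smooth maps, which is how the correlation length quantifies inhomogeneity in the study.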

  8. Two-component Structure in the Entanglement Spectrum of Highly Excited States

    NASA Astrophysics Data System (ADS)

    Yang, Zhi-Cheng; Chamon, Claudio; Hamma, Alioscia; Mucciolo, Eduardo

    We study the entanglement spectrum of highly excited eigenstates of two known models which exhibit a many-body localization transition, namely the one-dimensional random-field Heisenberg model and the quantum random energy model. Our results indicate that the entanglement spectrum shows a "two-component" structure: a universal part that is associated with random matrix theory, and a non-universal part that is model dependent. The non-universal part manifests the deviation of the highly excited eigenstate from a true random state even in the thermalized phase where the eigenstate thermalization hypothesis holds. The fraction of the spectrum containing the universal part decreases continuously as one approaches the critical point and vanishes in the localized phase in the thermodynamic limit. We use the universal part fraction to construct a new order parameter for the many-body delocalized-to-localized transition. Two toy models based on Rokhsar-Kivelson-type wave functions are constructed and their entanglement spectra are shown to exhibit the same structure.

  9. Random walk to a nonergodic equilibrium concept

    NASA Astrophysics Data System (ADS)

    Bel, G.; Barkai, E.

    2006-01-01

    Random walk models, such as the trap model, continuous time random walks, and comb models, exhibit weak ergodicity breaking, when the average waiting time is infinite. The open question is, what statistical mechanical theory replaces the canonical Boltzmann-Gibbs theory for such systems? In this paper a nonergodic equilibrium concept is investigated, for a continuous time random walk model in a potential field. In particular we show that in the nonergodic phase the distribution of the occupation time of the particle in a finite region of space approaches U- or W-shaped distributions related to the arcsine law. We show that when conditions of detailed balance are applied, these distributions depend on the partition function of the problem, thus establishing a relation between the nonergodic dynamics and canonical statistical mechanics. In the ergodic phase the distribution function of the occupation times approaches a δ function centered on the value predicted based on standard Boltzmann-Gibbs statistics. The relation of our work to single-molecule experiments is briefly discussed.
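    The U- and W-shaped occupation-time distributions mentioned here generalize a classical effect that even a plain symmetric random walk exhibits: by Lévy's arcsine law, the fraction of time spent on one side of the origin concentrates near 0 and 1 rather than near 1/2. A quick simulation (illustrative of the occupation-time statistics only, not of the paper's CTRW model):

```python
import random

def positive_fraction(steps, rng):
    """Fraction of time a symmetric random walk spends above the origin."""
    pos, time_above = 0, 0
    for _ in range(steps):
        pos += rng.choice((-1, 1))
        if pos > 0:
            time_above += 1
    return time_above / steps

rng = random.Random(3)
fractions = [positive_fraction(1000, rng) for _ in range(500)]

# Arcsine law: the occupation-time density is U-shaped, so extreme
# fractions are far more common than balanced ones.
low = sum(f < 0.1 for f in fractions)
mid = sum(0.45 <= f < 0.55 for f in fractions)
high = sum(f >= 0.9 for f in fractions)
```

    In the nonergodic phase studied in the paper, the analogous distributions depend additionally on the partition function through detailed balance.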

  10. Deviation from the law of energy equipartition in a small dynamic-random-access memory

    NASA Astrophysics Data System (ADS)

    Carles, Pierre-Alix; Nishiguchi, Katsuhiko; Fujiwara, Akira

    2015-06-01

    A small dynamic-random-access memory (DRAM) coupled with a high-charge-sensitivity electrometer based on a silicon field-effect transistor is used to study the law of equipartition of energy. By statistically analyzing the movement of single electrons in the DRAM under various temperature and voltage conditions in thermal equilibrium, we are able to observe a behavior that differs from what is predicted by the law of equipartition of energy: when the charging energy of the capacitor of the DRAM is comparable to or smaller than the thermal energy kBT/2, random electron motion is ruled perfectly by thermal energy; on the other hand, when the charging energy becomes higher in relation to the thermal energy kBT/2, random electron motion is suppressed, which indicates a deviation from the law of equipartition of energy. Since the law of equipartition is analyzed using the DRAM, one of the most familiar devices, we believe that our results are perfectly universal among all electronic devices.

  11. Efficient and robust quantum random number generation by photon number detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Applegate, M. J. (Cavendish Laboratory, University of Cambridge, 19 JJ Thomson Avenue, Cambridge CB3 0HE); Thomas, O.

    2015-08-17

    We present an efficient and robust quantum random number generator based upon high-rate room-temperature photon number detection. We employ an electric-field-modulated silicon avalanche photodiode, a type of device particularly suited to high-rate photon number detection with excellent photon number resolution, to detect, without an applied dead-time, up to 4 photons from the optical pulses emitted by a laser. By both measuring and modeling the response of the detector to the incident photons, we are able to determine the illumination conditions that achieve an optimal bit rate, which we show is robust against variation in the photon flux. We extract random bits from the detected photon numbers with an efficiency of 99%, corresponding to 1.97 bits per detected photon number and yielding a bit rate of 143 Mbit/s, and verify that the extracted bits pass stringent statistical tests for randomness. Our scheme is highly scalable and has the potential of multi-Gbit/s bit rates.

  12. Random-hole optical fiber evanescent-wave gas sensing.

    PubMed

    Pickrell, G; Peng, W; Wang, A

    2004-07-01

    Research on development of optical gas sensors based on evanescent-wave absorption in random-hole optical fibers is described. A process to produce random-hole optical fibers was recently developed that uses a novel in situ bubble formation technique. Gas molecules that exhibit characteristic vibrational absorption lines in the near-IR region that correspond to the transmission window for silica optical fiber have been detected through the evanescent field of the guided mode in the pore region. The presence of the gas molecules in the holes of the fiber appears as a loss at wavelengths that are characteristic of the particular gas species present in the holes. An experimental setup was constructed with these holey fibers for detection of acetylene gas. The results clearly demonstrate the characteristic absorptions in the optical spectra that correspond to the narrow-line absorptions of the acetylene gas, and this represents what is to our knowledge the first report of random-hole fiber gas sensing in the literature.

  13. A New Non-gaussian Turbulent Wind Field Generator to Estimate Design-Loads of Wind-Turbines

    NASA Astrophysics Data System (ADS)

    Schaffarczyk, A. P.; Gontier, H.; Kleinhans, D.; Friedrich, R.

    Climate change and finite fossil fuel resources make it urgent to shift electricity generation toward renewable energies. One major part will be played by wind energy, supplied by wind turbines with rated power up to 10 MW. Their design and development rely on wind field models. The standard models are based on empirical spectra, for example those of von Karman or Kaimal. Investigation of measured data makes clear that gusts are underrepresented in such models. Building on fundamental discoveries about the nature of turbulence by Friedrich [1], derived directly from the Navier-Stokes equation, we used the concept of Continuous Time Random Walks to construct three-dimensional wind fields obeying non-Gaussian statistics. These wind fields were used to estimate the critical fatigue loads required within the certification process. Calculations were carried out with an implementation of a beam model (FLEX5) for two types of state-of-the-art wind turbines. The authors considered the edgewise and flapwise blade-root bending moments as well as the tilt moment at tower top under the standard wind field models and under our new non-Gaussian wind field model. Clear differences in the loads were found.

  14. Childhood leukemia and magnetic fields in infant incubators.

    PubMed

    Söderberg, Karin C; Naumburg, Estelle; Anger, Gert; Cnattingius, Sven; Ekbom, Anders; Feychting, Maria

    2002-01-01

    In studies of magnetic field exposure and childhood leukemia, power lines and other electrical installations close to the children's homes constitute the most extensively studied source of exposure. We conducted a study to assess whether exposure to magnetic fields in infant incubators is associated with an increased leukemia risk. We identified all children with leukemia born in Sweden between 1973 and 1989 from the national Cancer Registry and selected at random one control per case, individually matched by sex and time of birth, from the study base. We retrieved information about treatment in infant incubators from medical records. We made measurements of the magnetic fields inside the incubators for each incubator model kept by the hospitals. Exposure assessment was based on measurements of the magnetic field level inside the incubator, as well as on the length of treatment. For acute lymphoblastic leukemia, the risk estimates were close to unity for all exposure definitions. For acute myeloid leukemia, we found a slightly elevated risk, but with wide confidence intervals and with no indication of dose response. Overall, our results give little evidence that exposure to magnetic fields inside infant incubators is associated with an increased risk of childhood leukemia.

  15. Individual analysis of inter and intragrain defects in electrically characterized polycrystalline silicon nanowire TFTs by multicomponent dark-field imaging based on nanobeam electron diffraction two-dimensional mapping

    NASA Astrophysics Data System (ADS)

    Asano, Takanori; Takaishi, Riichiro; Oda, Minoru; Sakuma, Kiwamu; Saitoh, Masumi; Tanaka, Hiroki

    2018-04-01

    We visualize the grain structures of individual, electrically characterized nanosized thin-film transistors (TFTs) with an improved data-processing technique for dark-field image reconstruction from nanobeam electron diffraction maps. Our individual crystal analysis establishes a one-to-one correspondence between TFTs with different grain boundary structures, such as random and coherent boundaries, and the characteristic degradations of ON-current and threshold voltage. Furthermore, the local crystalline uniformity inside a single grain is detected as a difference in diffraction intensity distribution.

  16. Recovery of chemical Estimates by Field Inhomogeneity Neighborhood Error Detection (REFINED): Fat/Water Separation at 7T

    PubMed Central

    Narayan, Sreenath; Kalhan, Satish C.; Wilson, David L.

    2012-01-01

    Purpose To reduce swaps in fat-water separation methods, a particular issue on 7T small animal scanners due to field inhomogeneity, using image postprocessing innovations that detect and correct errors in the B0 field map. Materials and Methods Fat-water decompositions and B0 field maps were computed for images of mice acquired on a 7T Bruker BioSpec scanner, using a computationally efficient method for solving the Markov Random Field formulation of the multi-point Dixon model. The B0 field maps were processed with a novel hole-filling method, based on edge strength between regions, and a novel k-means method, based on field-map intensities, which were iteratively applied to automatically detect and reinitialize error regions in the B0 field maps. Errors were manually assessed in the B0 field maps and chemical parameter maps both before and after error correction. Results Partial swaps were found in 6% of images when processed with FLAWLESS. After REFINED correction, only 0.7% of images contained partial swaps, resulting in an 88% decrease in error rate. Complete swaps were not problematic. Conclusion Ex post facto error correction is a viable supplement to a priori techniques for producing globally smooth B0 field maps, without partial swaps. With our processing pipeline, it is possible to process image volumes rapidly, robustly, and almost automatically. PMID:23023815

  17. Recovery of chemical estimates by field inhomogeneity neighborhood error detection (REFINED): fat/water separation at 7 tesla.

    PubMed

    Narayan, Sreenath; Kalhan, Satish C; Wilson, David L

    2013-05-01

    To reduce swaps in fat-water separation methods, a particular issue on 7 Tesla (T) small animal scanners due to field inhomogeneity, using image postprocessing innovations that detect and correct errors in the B0 field map. Fat-water decompositions and B0 field maps were computed for images of mice acquired on a 7T Bruker BioSpec scanner, using a computationally efficient method for solving the Markov Random Field formulation of the multi-point Dixon model. The B0 field maps were processed with a novel hole-filling method, based on edge strength between regions, and a novel k-means method, based on field-map intensities, which were iteratively applied to automatically detect and reinitialize error regions in the B0 field maps. Errors were manually assessed in the B0 field maps and chemical parameter maps both before and after error correction. Partial swaps were found in 6% of images when processed with FLAWLESS. After REFINED correction, only 0.7% of images contained partial swaps, resulting in an 88% decrease in error rate. Complete swaps were not problematic. Ex post facto error correction is a viable supplement to a priori techniques for producing globally smooth B0 field maps, without partial swaps. With our processing pipeline, it is possible to process image volumes rapidly, robustly, and almost automatically. Copyright © 2012 Wiley Periodicals, Inc.

  18. Effects of nanotechnologies-based devices on postural control in healthy subjects.

    PubMed

    Malchiodi Albedi, Giovanna; Corna, Stefano; Aspesi, Valentina; Clerici, Daniela; Parisio, Cinzia; Seitanidis, Jonathan; Cau, Nicola; Brugliera, Luigia; Capodaglio, Paolo

    2017-09-05

    The aim of the present preliminary randomized controlled study was to ascertain whether the use of newly developed nanotechnologies-based patches can influence posture control of healthy subjects. Thirty healthy female subjects (age 39.4 years, BMI 22.74 kg/m2) were randomly assigned to two groups: one with active patches and a control group with sham patches. Two patches were applied with a tape: one on the subject's sternum and the other on the C7 apophysis. Body sway during quiet upright stance was recorded with a dynamometric platform. Each subject was tested under two visual conditions, eyes open and closed. We used a blocked stratified randomization procedure conducted by a third party. Subjects wearing the sham patches showed a significant increase of the centre of pressure sway area after 4 hours when they performed the habitual moderate-intensity work activities. In the active patch group, a decrease of the sway path was evident, providing evidence of an enhanced balance control. Our preliminary findings on healthy subjects indicate that nanotechnological devices generating ultra-low electromagnetic fields can improve posture control.

  19. SAR Image Change Detection Based on Fuzzy Markov Random Field Model

    NASA Astrophysics Data System (ADS)

    Zhao, J.; Huang, G.; Zhao, Z.

    2018-04-01

    Most existing SAR image change detection algorithms consider only single-pixel information from the different images and ignore the spatial dependencies among image pixels, so the change detection results are susceptible to image noise and the detection performance is not ideal. A Markov Random Field (MRF) can make full use of the spatial dependence of image pixels and improve detection accuracy. When segmenting the difference image, different categories of regions have a high degree of similarity where they adjoin, and it is difficult to clearly distinguish the labels of pixels near these boundaries. In the traditional MRF method, each pixel is given a hard label during iteration; MRF segmentation is thus a hard-decision process, which causes a loss of information. This paper applies a combination of fuzzy theory and MRF to the change detection of SAR images. The experimental results show that the proposed method has a better detection effect than the traditional MRF method.
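    The hard-decision MRF baseline the paper improves on can be sketched as a minimal Iterated Conditional Modes pass over a noisy binary change map. The data and smoothness weights, grid size, and noise level below are assumed for illustration; the fuzzy extension is not shown:

```python
import random

def icm_denoise(obs, beta, sweeps=5):
    """Iterated Conditional Modes for a binary MRF: each pixel label trades
    off agreement with the observation (data term) against agreement with
    its 4-neighborhood (Ising-style spatial prior with weight beta)."""
    h, w = len(obs), len(obs[0])
    labels = [row[:] for row in obs]
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                best, best_cost = labels[i][j], float("inf")
                for cand in (0, 1):
                    cost = float(cand != obs[i][j])  # data term
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w:
                            cost += beta * (cand != labels[ni][nj])
                    if cost < best_cost:
                        best, best_cost = cand, cost
                labels[i][j] = best  # hard label assignment each iteration
    return labels

# A changed block with salt-and-pepper noise flipping ~10% of pixels.
rng = random.Random(0)
truth = [[1 if 8 <= i < 24 and 8 <= j < 24 else 0 for j in range(32)]
         for i in range(32)]
noisy = [[p ^ (rng.random() < 0.1) for p in row] for row in truth]
clean = icm_denoise(noisy, beta=0.8)
```

    The hard label committed at each pixel is exactly the information loss the fuzzy-MRF formulation in the paper is designed to avoid near region boundaries.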

  20. Infinite hidden conditional random fields for human behavior analysis.

    PubMed

    Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja

    2013-01-01

    Hidden conditional random fields (HCRFs) are discriminative latent variable models that have been shown to successfully learn the hidden structure of a given classification problem (provided an appropriate validation of the number of hidden states). In this brief, we present the infinite HCRF (iHCRF), which is a nonparametric model based on hierarchical Dirichlet processes and is capable of automatically learning the optimal number of hidden states for a classification task. We show how we learn the model hyperparameters with an effective Markov-chain Monte Carlo sampling technique, and we explain the process that underlies our iHCRF model with the Restaurant Franchise Rating Agencies analogy. We show that the iHCRF is able to converge to a correct number of represented hidden states, and outperforms the best finite HCRFs (chosen via cross-validation) for the difficult tasks of recognizing instances of agreement, disagreement, and pain. Moreover, the iHCRF manages to achieve this performance in significantly less total training, validation, and testing time.

  1. Smartphone application for multi-phasic interventional trials in psychiatry: Technical design of a smart server.

    PubMed

    Zhang, Melvyn W B; Ho, Roger C M

    2017-01-01

    Smartphones and their accompanying applications are currently widely utilized in various healthcare interventions. Prior to the deployment of these tools for healthcare intervention, proof-of-concept feasibility studies as well as randomized trials are typically conducted to establish that the tools are efficacious before actual implementation. In the field of psychiatry, most current interventions seek to compare smartphone-based intervention against conventional care, and there remains a paucity of research evaluating different forms of intervention delivered through a single smartphone application. In the field of nutrition, recent pioneering research has demonstrated how a multi-phasic randomized controlled trial can be conducted using a single smartphone application. Despite the innovativeness of that conceptualization, little technical information underlying it has been published that would support a multi-phasic interventional trial. The aim of the current technical note is thus to share insights into an innovative server design that enables the delivery of multi-phasic trials.

  2. Relativistic diffusive motion in random electromagnetic fields

    NASA Astrophysics Data System (ADS)

    Haba, Z.

    2011-08-01

    We show that the relativistic dynamics in a Gaussian random electromagnetic field can be approximated by the relativistic diffusion of Schay and Dudley. Lorentz invariant dynamics in the proper time leads to the diffusion in the proper time. The dynamics in the laboratory time gives the diffusive transport equation corresponding to the Jüttner equilibrium at the inverse temperature β⁻¹ = mc². The diffusion constant is expressed in terms of the field-strength correlation function (Kubo's formula).

  3. DNA based random key generation and management for OTP encryption.

    PubMed

    Zhang, Yunpeng; Liu, Xin; Sun, Manhui

    2017-09-01

    One-time pad (OTP) is a principle of key generation applied to the stream ciphering method which offers total privacy. The OTP encryption scheme has proved to be unbreakable in theory, but difficult to realize in practical applications. Because OTP encryption specifically requires the absolute randomness of the key, its development has suffered from dense constraints. DNA cryptography is a new and promising technology in the field of information security. The storage capability of DNA chromosomes can be used to build one-time pad structures with pseudo-random number generation and indexing in order to encrypt plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, using restriction enzymes known only to the sender and receiver to combine the secure key represented by a DNA sequence with the T vector, we generate the DNA bio-hiding secure key and then place the recombinant plasmid in implanted bacteria for secure key transmission. The designed bio experiments and simulation results show that the security of key transmission is further improved and the environmental requirements of key transmission are reduced. Analysis has demonstrated that the proposed DNA-based random key generation and management solutions are marked by high security and usability. Published by Elsevier B.V.
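
    The OTP principle the abstract builds on can be sketched in a few lines. This is a generic illustration of XOR one-time-pad encryption, not the paper's DNA-based key generation or transport:

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each message byte with the corresponding key byte."""
    if len(key) < len(plaintext):
        raise ValueError("OTP key must be at least as long as the message")
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Decryption is the same XOR with the same key; perfect secrecy holds only
# if the key is truly random, as long as the message, and never reused.
message = b"attack at dawn"
key = secrets.token_bytes(len(message))
ciphertext = otp_encrypt(message, key)
recovered = otp_encrypt(ciphertext, key)
```

    The paper's contribution is precisely the hard part this sketch glosses over: generating and transmitting such a key securely.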

  4. Propagation of terahertz pulses in random media.

    PubMed

    Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M

    2004-02-15

    We describe measurements of single-cycle terahertz pulse propagation in a random medium. The unique capabilities of terahertz time-domain spectroscopy permit the characterization of a multiply scattered field with unprecedented spatial and temporal resolution. With these results, we can develop a framework for understanding the statistics of broadband laser speckle. Also, the ability to extract information on the phase of the field opens up new possibilities for characterizing multiply scattered waves. We illustrate this with a simple example, which involves computing a time-windowed temporal correlation between fields measured at different spatial locations. This enables the identification of individual scattering events, and could lead to a new method for imaging in random media.

  5. Random electric field instabilities of relaxor ferroelectrics

    NASA Astrophysics Data System (ADS)

    Arce-Gamboa, José R.; Guzmán-Verri, Gian G.

    2017-06-01

    Relaxor ferroelectrics are complex oxide materials that are uniquely suited to studying the effects of compositional disorder on phase transitions. Here, we study the effects of quenched cubic random electric fields on the lattice instabilities that lead to a ferroelectric transition. Within a microscopic model and a statistical mechanical solution, we show that even weak compositional disorder can prohibit the development of long-range order, and that a random field state with anisotropic, power-law correlations of polarization emerges from the combined effect of the characteristic dipole forces and the inherent charge disorder. We compare with and reproduce several key experimental observations in the well-studied relaxor PbMg1/3Nb2/3O3-PbTiO3.

  6. Random crystal field effects on the integer and half-integer mixed-spin system

    NASA Astrophysics Data System (ADS)

    Yigit, Ali; Albayrak, Erhan

    2018-05-01

    In this work, we have focused on random crystal field effects on the phase diagrams of the mixed spin-1 and spin-5/2 Ising system obtained by utilizing the exact recursion relations (ERR) on the Bethe lattice (BL). The distribution function P(Di) = pδ[Di - D(1 + α)] + (1 - p)δ[Di - D(1 - α)] is used to randomize the crystal field. The phase diagrams are found to exhibit second- and first-order phase transitions depending on the values of α, D and p. It is also observed that the model displays a tricritical point, an isolated point, a critical end point and three compensation temperatures for suitable values of the system parameters.
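
    The quoted two-delta distribution is straightforward to sample. A minimal sketch (the parameter values below are illustrative, not taken from the paper):

```python
import random

def sample_crystal_field(D, alpha, p, n, seed=0):
    """Draw n crystal-field values from
    P(Di) = p*delta[Di - D*(1+alpha)] + (1-p)*delta[Di - D*(1-alpha)]."""
    rng = random.Random(seed)
    return [D * (1 + alpha) if rng.random() < p else D * (1 - alpha)
            for _ in range(n)]

fields = sample_crystal_field(D=-1.0, alpha=0.5, p=0.5, n=10000)
mean = sum(fields) / len(fields)
# The distribution mean is D*(1 + alpha*(2p - 1)); for p = 0.5 it is just D.
```

    On the Bethe lattice the authors average the ERR over this distribution rather than over explicit samples, but the sampler makes the role of p and α concrete.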

  7. Modeling and Compensation of Random Drift of MEMS Gyroscopes Based on Least Squares Support Vector Machine Optimized by Chaotic Particle Swarm Optimization.

    PubMed

    Xing, Haifeng; Hou, Bo; Lin, Zhihui; Guo, Meifeng

    2017-10-13

    MEMS (Micro Electro Mechanical System) gyroscopes have been widely applied in various fields, but MEMS gyroscope random drift has nonlinear and non-stationary characteristics. Modeling and compensating the random drift has attracted much attention because it can improve the precision of inertial devices. This paper proposes using wavelet filtering to reduce noise in the original MEMS gyroscope data, then reconstructing the random drift data with PSR (phase space reconstruction), and establishing a model for the reconstructed data with LSSVM (least squares support vector machine), whose parameters are optimized using CPSO (chaotic particle swarm optimization). Comparing the proposed method with BP-ANN (back propagation artificial neural network) for modeling the MEMS gyroscope random drift showed that the proposed method had better prediction accuracy. After compensation of three groups of MEMS gyroscope random drift data, the standard deviations of the three groups of experimental data dropped from 0.00354°/s, 0.00412°/s, and 0.00328°/s to 0.00065°/s, 0.00072°/s and 0.00061°/s, respectively, demonstrating that the proposed method can reduce the influence of MEMS gyroscope random drift and verifying its effectiveness for modeling MEMS gyroscope random drift.
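
    The PSR step of the pipeline is a standard delay embedding, sketched below on a toy drift signal. The embedding dimension and lag are illustrative, and the paper's wavelet filtering and LSSVM/CPSO stages are omitted:

```python
import numpy as np

def phase_space_reconstruct(x, dim, tau):
    """Delay-embed a 1-D series x into vectors
    [x[t], x[t+tau], ..., x[t+(dim-1)*tau]], one row per time index t."""
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this embedding")
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

# Toy stand-in for a gyroscope random-drift record: a Gaussian random walk.
drift = np.cumsum(np.random.default_rng(0).normal(size=500))
X = phase_space_reconstruct(drift, dim=3, tau=5)
# X has shape (500 - 2*5, 3); each row is one reconstructed state vector,
# which would then be fed to the regression model (LSSVM in the paper).
```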

  8. Texture classification using autoregressive filtering

    NASA Technical Reports Server (NTRS)

    Lawton, W. M.; Lee, M.

    1984-01-01

    A general theory of image texture models is proposed and its applicability to the problem of scene segmentation using texture classification is discussed. An algorithm, based on half-plane autoregressive filtering, which optimally utilizes second order statistics to discriminate between texture classes represented by arbitrary wide sense stationary random fields is described. Empirical results of applying this algorithm to natural and synthesized scenes are presented and future research is outlined.
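
    The core of such a scheme is fitting causal (half-plane) AR coefficients to an image and using the prediction residual as a class statistic. A toy sketch with a three-neighbour causal support (the paper's filter support and optimality criterion are more general):

```python
import numpy as np

def fit_causal_ar(img):
    """Fit a 3-coefficient causal AR model: predict img[r, c] from its
    west (r, c-1), north (r-1, c) and north-west (r-1, c-1) neighbours."""
    west = img[1:, :-1].ravel()
    north = img[:-1, 1:].ravel()
    nw = img[:-1, :-1].ravel()
    target = img[1:, 1:].ravel()
    A = np.stack([west, north, nw], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
    residual = target - A @ coeffs
    return coeffs, residual.var()

# A linear ramp img[r, c] = r + c is perfectly predicted by west + north - nw,
# so the residual variance is essentially zero.
ramp = np.add.outer(np.arange(20.0), np.arange(20.0))
coeffs, res_var = fit_causal_ar(ramp)
```

    For classification, each texture class gets its own fitted filter, and a test region is assigned to the class whose filter yields the smallest residual.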

  9. Bayesian spatial prediction of the site index in the study of the Missouri Ozark Forest Ecosystem Project

    Treesearch

    Xiaoqian Sun; Zhuoqiong He; John Kabrick

    2008-01-01

    This paper presents a Bayesian spatial method for analysing the site index data from the Missouri Ozark Forest Ecosystem Project (MOFEP). Based on ecological background and availability, we select three variables, the aspect class, the soil depth and the land type association as covariates for analysis. To allow great flexibility of the smoothness of the random field,...

  10. Segmentation of anatomical branching structures based on texture features and conditional random field

    NASA Astrophysics Data System (ADS)

    Nuzhnaya, Tatyana; Bakic, Predrag; Kontos, Despina; Megalooikonomou, Vasileios; Ling, Haibin

    2012-02-01

    This work is part of our ongoing study aimed at understanding the relation between the topology of anatomical branching structures and the underlying image texture. Morphological variability of the breast ductal network is associated with subsequent development of abnormalities in patients with nipple discharge, such as papilloma, breast cancer and atypia. In this work, we investigate complex dependence among ductal components to perform segmentation, the first step in analyzing the topology of ductal lobes. Our automated framework is based on incorporating a conditional random field with texture descriptors of skewness, coarseness, contrast, energy and fractal dimension. These features are selected to capture the architectural variability of the enhanced ducts by encoding spatial variations between pixel patches in the galactographic image. The segmentation algorithm was applied to a dataset of 20 x-ray galactograms obtained at the Hospital of the University of Pennsylvania. We compared the performance of the proposed approach with fully and semi-automated segmentation algorithms based on neural network classification, fuzzy-connectedness, vesselness filtering and graph cuts. Global consistency error and confusion matrix analysis were used as accuracy measures. For the proposed approach, the true positive rate was higher and the false negative rate was significantly lower than for the other fully automated methods. This indicates that segmentation based on a CRF incorporating texture descriptors has the potential to efficiently support the analysis of the complex topology of the ducts and aid in the development of realistic breast anatomy phantoms.
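
    Some of the named per-patch features can be sketched with simple first-order statistics. These are simplified stand-ins (the paper's exact definitions, e.g. co-occurrence-based contrast or the fractal-dimension estimator, may differ):

```python
import numpy as np

def patch_descriptors(patch):
    """Simple per-patch texture features: skewness of intensities,
    variance as a contrast proxy, and grey-level histogram energy."""
    p = patch.astype(float).ravel()
    mean, std = p.mean(), p.std()
    skewness = ((p - mean) ** 3).mean() / (std ** 3 + 1e-12)
    contrast = std ** 2
    hist, _ = np.histogram(p, bins=16, range=(0, 256))
    prob = hist / hist.sum()
    energy = (prob ** 2).sum()  # 1.0 for a uniform patch, small for busy ones
    return {"skewness": skewness, "contrast": contrast, "energy": energy}

feats = patch_descriptors(np.full((8, 8), 100))
# A constant patch has zero contrast and maximal histogram energy.
```

    In the framework described above, such feature vectors become the unary observations that the CRF combines with pairwise spatial terms.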

  11. Disease named entity recognition by combining conditional random fields and bidirectional recurrent neural networks.

    PubMed

    Wei, Qikang; Chen, Tao; Xu, Ruifeng; He, Yulan; Gui, Lin

    2016-01-01

    The recognition of disease and chemical named entities in scientific articles is a very important subtask of information extraction in the biomedical domain. Due to the diversity and complexity of disease names, the recognition of disease named entities is more difficult than that of chemical names. Although some remarkable chemical named entity recognition systems are available online, such as ChemSpot and tmChem, publicly available recognition systems for disease named entities are rare. This article presents a system for disease named entity recognition (DNER) and normalization. First, two separate DNER models are developed: one based on a conditional random fields model with a rule-based post-processing module, the other based on bidirectional recurrent neural networks. The named entities recognized by each DNER model are then fed into a support vector machine classifier for combining results. Finally, each recognized disease named entity is normalized to a Medical Subject Headings disease name using a vector space model based method. Experimental results show that, using 1000 PubMed abstracts for training, our proposed system achieves an F1-measure of 0.8428 at the mention level and 0.7804 at the concept level on the testing data of the chemical-disease relation task in BioCreative V. Database URL: http://219.223.252.210:8080/SS/cdr.html. © The Author(s) 2016. Published by Oxford University Press.

  12. Gene expression based mouse brain parcellation using Markov random field regularized non-negative matrix factorization

    NASA Astrophysics Data System (ADS)

    Pathak, Sayan D.; Haynor, David R.; Thompson, Carol L.; Lein, Ed; Hawrylycz, Michael

    2009-02-01

    Understanding the geography of genetic expression in the mouse brain has opened previously unexplored avenues in neuroinformatics. The Allen Brain Atlas (ABA; www.brain-map.org) provides genome-wide colorimetric in situ hybridization (ISH) gene expression images at high spatial resolution, all mapped to a common three-dimensional 200 μm³ spatial framework defined by the Allen Reference Atlas (ARA), and is a unique data set for studying expression-based structural and functional organization of the brain. The goal of this study was to facilitate an unbiased, data-driven structural partitioning of the major structures in the mouse brain. We have developed an algorithm that uses non-negative matrix factorization (NMF) to perform parts-based analysis of ISH gene expression images. The standard NMF approach and its variants are limited in their ability to flexibly integrate prior knowledge in the context of spatial data. In this paper, we introduce spatial connectivity as an additional regularization in the NMF decomposition via the use of Markov Random Fields (mNMF). The mNMF algorithm alternates neighborhood updates with iterations of the standard NMF algorithm to exploit spatial correlations in the data. We present the algorithm and show the subdivisions of the hippocampus and somatosensory cortex obtained via this approach. The results are compared with established neuroanatomic knowledge. We also highlight novel gene-expression-based subdivisions of the hippocampus identified by the mNMF algorithm.
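
    The standard NMF iterations that mNMF alternates with neighborhood updates are the classical multiplicative updates, sketched below. The MRF neighborhood step itself is omitted, and the data here is synthetic:

```python
import numpy as np

def nmf(V, rank, iters=200, seed=0):
    """Standard multiplicative-update NMF: V ≈ W @ H with nonnegative factors."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 0.1
    H = rng.random((rank, m)) + 0.1
    for _ in range(iters):
        # Lee-Seung updates; small epsilon guards against division by zero.
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ (H @ H.T) + 1e-9)
    return W, H

V = np.random.default_rng(1).random((30, 20))  # stand-in for voxel-by-gene data
W, H = nmf(V, rank=5)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

    In mNMF, rows of the factor carrying spatial indices would additionally be smoothed over lattice neighbors between such iterations.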

  13. 3D vector distribution of the electro-magnetic fields on a random gold film

    NASA Astrophysics Data System (ADS)

    Canneson, Damien; Berini, Bruno; Buil, Stéphanie; Hermier, Jean-Pierre; Quélin, Xavier

    2018-05-01

    The 3D vector distribution of the electro-magnetic fields at the very close vicinity of the surface of a random gold film is studied. Such films are well known for their properties of light confinement and large fluctuations of local density of optical states. Using Finite-Difference Time-Domain simulations, we show that it is possible to determine the local orientation of the electro-magnetic fields. This allows us to obtain a complete characterization of the fields. Large fluctuations of their amplitude are observed as previously shown. Here, we demonstrate large variations of their direction depending both on the position on the random gold film, and on the distance to it. Such characterization could be useful for a better understanding of applications like the coupling of point-like dipoles to such films.

  14. On the Concept of Random Orientation in Far-Field Electromagnetic Scattering by Nonspherical Particles

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Yurkin, Maxim A.

    2017-01-01

    Although the model of randomly oriented nonspherical particles has been used in a great variety of applications of far-field electromagnetic scattering, it has never been defined in strict mathematical terms. In this Letter we use the formalism of Euler rigid-body rotations to clarify the concept of statistically random particle orientations and derive its immediate corollaries in the form of most general mathematical properties of the orientation-averaged extinction and scattering matrices. Our results serve to provide a rigorous mathematical foundation for numerous publications in which the notion of randomly oriented particles and its light-scattering implications have been considered intuitively obvious.

  15. A Sustainable City Planning Algorithm Based on TLBO and Local Search

    NASA Astrophysics Data System (ADS)

    Zhang, Ke; Lin, Li; Huang, Xuanxuan; Liu, Yiming; Zhang, Yonggang

    2017-09-01

    Nowadays, how to design a city with more sustainable features has become a central problem in the field of social development, and it provides a broad stage for applying artificial intelligence theories and methods. Because sustainable city design is essentially a constrained optimization problem, the extensively studied swarm intelligence algorithms are natural candidates for solving it. TLBO (Teaching-Learning-Based Optimization) is a recent swarm intelligence algorithm inspired by the “teaching” and “learning” behavior of a classroom: the population evolves by simulating the teacher's “teaching” and the students “learning” from each other. It has few parameters, is efficient and conceptually simple, and is easy to implement. It has been successfully applied to scheduling, planning, configuration and other fields with good results, and has attracted increasing attention from artificial intelligence researchers. Based on the classical TLBO algorithm, we propose TLBO_LS, a TLBO algorithm combined with local search. We design and implement a random generation algorithm and an evaluation model for the urban planning problem. Experiments on small and medium-sized randomly generated problems show that the proposed algorithm has clear advantages over the DE algorithm and the classical TLBO algorithm in terms of convergence speed and solution quality.
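
    The classical TLBO baseline (not the TLBO_LS variant with local search) can be sketched on a continuous toy objective; the urban-planning encoding and evaluation model of the paper are not reproduced:

```python
import numpy as np

def tlbo_minimize(f, bounds, pop=20, iters=100, seed=0):
    """Minimal TLBO: the teacher phase pulls learners toward the best
    solution, the learner phase lets random pairs learn from each other."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(pop, len(lo)))
    fit = np.array([f(x) for x in X])
    for _ in range(iters):
        teacher = X[fit.argmin()]
        mean = X.mean(axis=0)
        TF = rng.integers(1, 3)  # teaching factor, randomly 1 or 2
        for i in range(pop):
            # Teacher phase: move toward the teacher, away from the mean.
            cand = np.clip(X[i] + rng.random(len(lo)) * (teacher - TF * mean), lo, hi)
            fc = f(cand)
            if fc < fit[i]:
                X[i], fit[i] = cand, fc
            # Learner phase: step toward (or away from) a random peer.
            j = rng.integers(pop)
            while j == i:
                j = rng.integers(pop)
            step = X[i] - X[j] if fit[i] < fit[j] else X[j] - X[i]
            cand = np.clip(X[i] + rng.random(len(lo)) * step, lo, hi)
            fc = f(cand)
            if fc < fit[i]:
                X[i], fit[i] = cand, fc
    return X[fit.argmin()], fit.min()

best, val = tlbo_minimize(lambda x: float((x ** 2).sum()),
                          (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
```

    Greedy acceptance in both phases makes the best fitness monotonically non-increasing, which is the property TLBO_LS builds on before applying local search.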

  16. Sign language spotting with a threshold model based on conditional random fields.

    PubMed

    Yang, Hee-Deok; Sclaroff, Stan; Lee, Seong-Whan

    2009-07-01

    Sign language spotting is the task of detecting and recognizing signs in a signed utterance, in a set vocabulary. The difficulty of sign language spotting is that instances of signs vary in both motion and appearance. Moreover, signs appear within a continuous gesture stream, interspersed with transitional movements between signs in a vocabulary and nonsign patterns (which include out-of-vocabulary signs, epentheses, and other movements that do not correspond to signs). In this paper, a novel method for designing threshold models in a conditional random field (CRF) model is proposed which performs an adaptive threshold for distinguishing between signs in a vocabulary and nonsign patterns. A short-sign detector, a hand appearance-based sign verification method, and a subsign reasoning method are included to further improve sign language spotting accuracy. Experiments demonstrate that our system can spot signs from continuous data with an 87.0 percent spotting rate and can recognize signs from isolated data with a 93.5 percent recognition rate versus 73.5 percent and 85.4 percent, respectively, for CRFs without a threshold model, short-sign detection, subsign reasoning, and hand appearance-based sign verification. Our system can also achieve a 15.0 percent sign error rate (SER) from continuous data and a 6.4 percent SER from isolated data versus 76.2 percent and 14.5 percent, respectively, for conventional CRFs.

  17. An innovative large scale integration of silicon nanowire-based field effect transistors

    NASA Astrophysics Data System (ADS)

    Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.

    2018-05-01

    Since the early 2000s, silicon nanowire field effect transistors have been emerging as ultrasensitive biosensors offering label-free, portable and rapid detection. Nevertheless, their large-scale production remains an ongoing challenge due to time-consuming, complex and costly technology. In order to bypass these issues, we report here on the first integration of silicon nanowire networks, called nanonets, into long-channel field effect transistors using a standard microelectronic process. Special attention is paid to the silicidation of the contacts, which involves a large number of SiNWs. The electrical characteristics of these FETs, composed of randomly oriented silicon nanowires, are also studied. Compatible integration on the back-end of CMOS readout and promising electrical performance open new opportunities for sensing applications.

  18. Magnetic field dependence of spin torque switching in nanoscale magnetic tunnel junctions

    NASA Astrophysics Data System (ADS)

    Yang, Liu; Rowlands, Graham; Katine, Jordan; Langer, Juergen; Krivorotov, Ilya

    2012-02-01

    Magnetic random access memory based on the spin transfer torque effect in nanoscale magnetic tunnel junctions (STT-RAM) is emerging as a promising candidate for embedded and stand-alone computer memory. An important performance parameter of STT-RAM is the stability of its free magnetic layer against thermal fluctuations. Measurements of the free-layer switching probability as a function of sub-critical voltage at zero effective magnetic field (read disturb rate, or RDR, measurements) have been proposed as a method for quantitative evaluation of the free-layer thermal stability at zero voltage. In this presentation, we report RDR measurements as a function of external magnetic field, which provide a test of the RDR method's self-consistency and reliability.

  19. Asymptotics of the evolution semigroup associated with a scalar field in the presence of a non-linear electromagnetic field

    NASA Astrophysics Data System (ADS)

    Albeverio, Sergio; Tamura, Hiroshi

    2018-04-01

    We consider a model describing the coupling of a vector-valued and a scalar homogeneous Markovian random field over R^4, interpreted as expressing the interaction between a charged scalar quantum field coupled with a nonlinear quantized electromagnetic field. Expectations of functionals of the random fields are expressed by Brownian bridges. Using this, together with Feynman-Kac-Itô type formulae and estimates on the small time and large time behaviour of Brownian functionals, we prove asymptotic upper and lower bounds on the kernel of the transition semigroup for our model. The upper bound gives faster than exponential decay for large distances of the corresponding resolvent (propagator).

  20. Digital servo control of random sound fields

    NASA Technical Reports Server (NTRS)

    Nakich, R. B.

    1973-01-01

    It is necessary to place a number of sensors at different positions in the sound field to determine the actual sound intensities to which the test object is subjected, making it possible to determine whether the specification is being met adequately or exceeded. Since the excitation is of a random nature, the signals are essentially coherent and it is impossible to obtain a true average.

  1. Random potentials and cosmological attractors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linde, Andrei, E-mail: alinde@stanford.edu

    I show that the problem of realizing inflation in theories with random potentials of a limited number of fields can be solved, and agreement with the observational data can be naturally achieved if at least one of these fields has a non-minimal kinetic term of the type used in the theory of cosmological α-attractors.

  2. Random phase approximation and cluster mean field studies of hard core Bose Hubbard model

    NASA Astrophysics Data System (ADS)

    Alavani, Bhargav K.; Gaude, Pallavi P.; Pai, Ramesh V.

    2018-04-01

    We investigate zero temperature and finite temperature properties of the Bose Hubbard Model in the hard core limit using Random Phase Approximation (RPA) and Cluster Mean Field Theory (CMFT). We show that our RPA calculations are able to capture quantum and thermal fluctuations significantly better than CMFT.

  3. DLA based compressed sensing for high resolution MR microscopy of neuronal tissue

    NASA Astrophysics Data System (ADS)

    Nguyen, Khieu-Van; Li, Jing-Rebecca; Radecki, Guillaume; Ciobanu, Luisa

    2015-10-01

    In this work we present the implementation of compressed sensing (CS) on a high field preclinical scanner (17.2 T) using an undersampling trajectory based on the diffusion limited aggregation (DLA) random growth model. When applied to a library of images this approach performs better than the traditional undersampling based on the polynomial probability density function. In addition, we show that the method is applicable to imaging live neuronal tissues, allowing significantly shorter acquisition times while maintaining the image quality necessary for identifying the majority of neurons via an automatic cell segmentation algorithm.
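
    The DLA random growth model underlying the sampling trajectory can be sketched on a small lattice. This illustrates cluster growth only; how the paper maps the resulting cluster onto a k-space undersampling pattern is not reproduced here, and the lattice size and particle count are illustrative:

```python
import random

def dla_cluster(n_particles=40, radius=12, seed=42):
    """Grow a diffusion-limited-aggregation cluster on a 2-D lattice:
    each particle random-walks from afar until it lands next to the
    cluster, then sticks."""
    rng = random.Random(seed)
    cluster = {(0, 0)}                      # seed site at the origin
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(n_particles):
        x, y = rng.choice(moves)
        x, y = x * radius, y * radius       # launch on an axis, far from the seed
        while True:
            dx, dy = rng.choice(moves)
            x, y = x + dx, y + dy
            if abs(x) > 2 * radius or abs(y) > 2 * radius:
                x, y = rng.choice(moves)    # wandered too far: relaunch
                x, y = x * radius, y * radius
            if any((x + dx, y + dy) in cluster for dx, dy in moves):
                cluster.add((x, y))         # adjacent to the cluster: stick
                break
    return cluster

cluster = dla_cluster()
```

    The branching, dendritic shape of such clusters is what gives the DLA-based trajectory its mix of dense center and sparse periphery in k-space.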

  4. Improving preschoolers' mathematics achievement with tablets: a randomized controlled trial

    NASA Astrophysics Data System (ADS)

    Schacter, John; Jo, Booil

    2017-09-01

    With a randomized field experiment of 433 preschoolers, we tested a tablet mathematics program designed to increase young children's mathematics learning. Intervention students played Math Shelf, a comprehensive iPad preschool and year 1 mathematics app, while comparison children received research-based hands-on mathematics instruction delivered by their classroom teachers. After 22 weeks, there was a large and statistically significant effect on mathematics achievement for Math Shelf students (Cohen's d = .94). Moderator analyses demonstrated an even larger effect for low achieving children (Cohen's d = 1.27). These results suggest that early education teachers can improve their students' mathematics outcomes by integrating experimentally proven tablet software into their daily routines.
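
    The effect sizes quoted above are standardized mean differences; a minimal pooled-SD Cohen's d (the data below are toy numbers, not the study's scores):

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: difference of group means divided by the pooled SD."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)  # sample variances
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

d = cohens_d([5, 6, 7, 8, 9], [3, 4, 5, 6, 7])
# Means differ by 2 with pooled SD = sqrt(2.5) ≈ 1.581, so d ≈ 1.265.
```

    By the usual rule of thumb, d ≈ 0.94 as reported above is a large effect.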

  5. Subwavelength Light Localization in Nanostructured Surfaces

    NASA Astrophysics Data System (ADS)

    Coello, V.; Wang, S.; Siqueiros, J.; Bozhevolnyi, S. I.

    Using a photon scanning tunneling microscope, we studied near-field optical images obtained with a surface plasmon polariton (SPP) being resonantly excited along a surface with randomly introduced roughness. The SPP intensity field distributions showed an optical enhancement in the form of round bright spots up to 5 times larger than the background signal. We also show an artificially fabricated SPP curved micromirror along with the corresponding near-field optical image. The recorded optical signal exhibited an enhancement up to 10 times larger than the background, generated for the first time in a controlled form. A numerical simulation of a parabolic micromirror based on isotropic pointlike scatterers is analyzed and compared with experimental results. The potential of creating microstructures able to control SPP optical field enhancement is shown in a novel numerically simulated microcavity for SPPs.

  6. Differential form representation of stochastic electromagnetic fields

    NASA Astrophysics Data System (ADS)

    Haider, Michael; Russer, Johannes A.

    2017-09-01

    In this work, we revisit the theory of stochastic electromagnetic fields using exterior differential forms. We present a short overview as well as a brief introduction to the application of differential forms in electromagnetic theory. Within the framework of exterior calculus we derive equations for the second-order moments describing stochastic electromagnetic fields. Since the resulting objects are continuous quantities in space, a discretization scheme based on the Method of Moments (MoM) is introduced for numerical treatment. The MoM is applied in such a way that the notation of exterior calculus is maintained while we still arrive at the same set of algebraic equations as obtained when formulating the theory in the traditional notation of vector calculus. We conclude with an analytic calculation of the radiated electric field of two Hertzian dipoles excited by uncorrelated random currents.

  7. Field strategies for the calibration and validation of high-resolution forest carbon maps: Scaling from plots to a three state region MD, DE, & PA, USA.

    NASA Astrophysics Data System (ADS)

    Dolan, K. A.; Huang, W.; Johnson, K. D.; Birdsey, R.; Finley, A. O.; Dubayah, R.; Hurtt, G. C.

    2016-12-01

    In 2010, Congress directed NASA to initiate research towards the development of Carbon Monitoring Systems (CMS). In response, our team has worked to develop a robust, replicable framework to quantify and map aboveground forest biomass at high spatial resolution. Crucial to this framework has been the collection of field-based estimates of aboveground tree biomass, combined with remotely detected canopy and structural attributes, for calibration and validation. Here we evaluate the field-based calibration and validation strategies within this carbon monitoring framework and discuss the implications for local to national monitoring systems. Through project development, the domain of this research has expanded from two counties in MD (2,181 km2) to the entire state of MD (32,133 km2), and most recently to the tri-state region of MD, PA, and DE (157,868 km2), covering forests in four major USDA ecological provinces. While there are approximately 1000 Forest Inventory and Analysis (FIA) plots distributed across the state of MD, 60% fell in areas considered non-forest or had conditions that precluded them from being measured in the last forest inventory. Across the two pilot counties, where population and land-use competition is high, that proportion rose to 70%. Thus, during the initial phases of this project, 850 independent field plots were established for model calibration following a stratified random design to ensure adequate representation of the height and vegetation classes found across the state, while FIA data were used as an independent data source for validation. As the project expanded to cover the larger tri-state domain, the strategy was flipped to base calibration on more than 3,300 measured FIA plots, as they provide a standardized, consistent, and available data source across the nation. An additional 350 stratified random plots were deployed in the Northern Mixed forests of PA and the Coastal Plains forests of DE for validation.

  8. A cluster randomized control field trial of the ABRACADABRA web-based reading technology: replication and extension of basic findings

    PubMed Central

    Piquette, Noella A.; Savage, Robert S.; Abrami, Philip C.

    2014-01-01

    The present paper reports a cluster randomized control trial evaluation of teaching using ABRACADABRA (ABRA), an evidence-based and web-based literacy intervention (http://abralite.concordia.ca), with 107 kindergarten and 96 grade 1 children in 24 classes (12 intervention, 12 control) from all 12 elementary schools in one school district in Canada. Children in the intervention condition received 10–12 h of whole-class instruction using ABRA between pre- and post-test. Hierarchical linear modeling of post-test results showed significant gains in letter-sound knowledge for intervention classrooms over control classrooms. In addition, medium effect sizes favoring the intervention over regular teaching were evident for three of five outcome measures: letter-sound knowledge (d = +0.66), phonological blending (d = +0.52), and word reading (d = +0.52). It is concluded that regular teaching with ABRA technology adds significantly to literacy in the early elementary years. PMID:25538663

  9. Using GIS to generate spatially balanced random survey designs for natural resource applications.

    PubMed

    Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B

    2007-07-01

    Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. Development of long-term monitoring designs demand survey designs that achieve statistical rigor and are efficient but remain flexible to inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design.
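
    The idea of spatial balance can be illustrated with a toy grid-stratified draw. This is only a simplified stand-in: the Reversed Randomized Quadrant-Recursive Raster of the paper uses a hierarchical, randomized quadrant addressing scheme that this sketch does not implement:

```python
import random

def spatially_balanced_sample(points, n_cells=4, per_cell=1, seed=0):
    """Toy spatially balanced design: partition the bounding box into an
    n_cells x n_cells grid and draw per_cell points from each occupied cell,
    so no region of the study area is left unsampled."""
    rng = random.Random(seed)
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x0, x1, y0, y1 = min(xs), max(xs), min(ys), max(ys)
    cells = {}
    for p in points:
        i = min(int((p[0] - x0) / (x1 - x0 + 1e-12) * n_cells), n_cells - 1)
        j = min(int((p[1] - y0) / (y1 - y0 + 1e-12) * n_cells), n_cells - 1)
        cells.setdefault((i, j), []).append(p)
    sample = []
    for members in cells.values():
        sample.extend(rng.sample(members, min(per_cell, len(members))))
    return sample

pts = [(random.Random(i).random(), random.Random(i + 1000).random())
       for i in range(200)]
chosen = spatially_balanced_sample(pts)
```

    Unlike this fixed grid, RRQRR preserves spatial balance for any sample size and supports the flexible intensity modifications (categories, gradients, accessibility) mentioned above.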

  10. The Effect of Topical Tranexamic Acid on Bleeding Reduction during Functional Endoscopic Sinus Surgery

    PubMed Central

    Baradaranfar, Mohammad Hossein; Dadgarnia, Mohammad Hossein; Mahmoudi, Hossein; Behniafard, Nasim; Atighechi, Saeid; Zand, Vahid; Baradaranfar, Amin; Vaziribozorg, Sedighe

    2017-01-01

Introduction: Bleeding is a common concern during functional endoscopic sinus surgery (FESS) that can increase the risk of damage to adjacent vital structures by reducing the surgeon's field of view. This study aimed to explore the efficacy of topical tranexamic acid in reducing intraoperative bleeding. Materials and Methods: This double-blind, randomized clinical trial was conducted in 60 patients with chronic rhinosinusitis with polyposis (CRSwP) who underwent FESS. Patients were randomly divided into two groups: tranexamic or saline treatment. During surgery, normal saline (400 mL) or tranexamic acid (2 g) in normal saline to a total volume of 400 mL was used for irrigation and suctioning in the saline and tranexamic groups, respectively. The surgeons' assessment of the field of view during surgery and intraoperative blood loss were recorded. Results: Mean blood loss was 254.13 mL in the saline group and 235.6 mL in the tranexamic group (P=0.31). No statistically significant differences between the two groups were found in the other investigated variables, such as surgical field quality based on Boezaart's scale (P=0.30), surgeon satisfaction based on a Likert scale (P=0.54), or duration of surgery (P=0.22). Conclusion: Use of tranexamic acid (2 g in 400 mL normal saline) for washing of the nasal mucosa during FESS did not significantly reduce blood loss or improve the surgical field of view. Further studies with larger sample sizes and higher drug concentrations, and using other methods of administration, such as spraying or applying pledgets soaked in tranexamic acid, are recommended. PMID:28393053

  11. Imparting protean behavior to mobile robots accomplishing patrolling tasks in the presence of adversaries.

    PubMed

    Curiac, Daniel-Ioan; Volosencu, Constantin

    2015-10-08

Providing unpredictable trajectories for patrol robots is essential when coping with adversaries. In order to solve this problem we developed an effective approach based on the known protean behavior of individual prey animals: random zig-zag movement. The proposed bio-inspired method modifies the robot's normal path by incorporating sudden and irregular direction changes without jeopardizing the robot's mission. This tactic aims to confuse an adversary (e.g., a sniper), allowing less time to acquire and retain sight alignment and sight picture. The idea is implemented by simulating a series of fictive, temporary obstacles that appear at random in the robot's field of view, tricking the obstacle-avoidance mechanism into reacting. The general methodology is particularized by using Arnold's cat map to obtain the randomly timed appearance and disappearance of the fictive obstacles. The viability of the proposed method is confirmed through an extensive simulation case study.

  12. Central Limit Theorem for Exponentially Quasi-local Statistics of Spin Models on Cayley Graphs

    NASA Astrophysics Data System (ADS)

    Reddy, Tulasi Ram; Vadlamani, Sreekar; Yogeshwaran, D.

    2018-04-01

Central limit theorems for linear statistics of lattice random fields (including spin models) are usually proven under suitable mixing conditions or quasi-associativity. Many interesting examples of spin models do not satisfy mixing conditions, and on the other hand, it does not seem easy to prove central limit theorems for local statistics via quasi-associativity. In this work, we prove general central limit theorems for local statistics and exponentially quasi-local statistics of spin models on discrete Cayley graphs with polynomial growth. Further, we supplement these results by proving similar central limit theorems for random fields on discrete Cayley graphs taking values in a countable space, but under the stronger assumptions of α-mixing (for local statistics) and exponential α-mixing (for exponentially quasi-local statistics). Like many others in the literature, all our central limit theorems assume a suitable variance lower bound. We illustrate our general central limit theorems with specific examples of lattice spin models and statistics arising in computational topology, statistical physics and random networks. Examples of clustering spin models include quasi-associated spin models with fast-decaying covariances such as the off-critical Ising model, level sets of Gaussian random fields with fast-decaying covariances such as the massive Gaussian free field, and determinantal point processes with fast-decaying kernels. Examples of local statistics include intrinsic volumes, face counts, and component counts of random cubical complexes, while exponentially quasi-local statistics include nearest-neighbour distances in spin models and Betti numbers of sub-critical random cubical complexes.

  13. A multi-source precipitation approach to fill gaps over a radar precipitation field

    NASA Astrophysics Data System (ADS)

    Tesfagiorgis, K. B.; Mahani, S. E.; Khanbilvardi, R.

    2012-12-01

Satellite Precipitation Estimates (SPEs) may be the only available source of information for operational hydrologic and flash flood prediction due to spatial limitations of radar and gauge products. The present work develops an approach to seamlessly blend satellite, radar, climatological and gauge precipitation products to fill gaps over ground-based radar precipitation fields. To mix different precipitation products, the bias of each product relative to the others should be removed. For bias correction, the study used an ensemble-based method that estimates spatially varying multiplicative biases in SPEs using a radar rainfall product. Bias factors were calculated for a randomly selected sample of rainy pixels in the study area. Spatial fields of estimated bias were generated taking into account spatial variation and random errors in the sampled values. A weighted Successive Correction Method (SCM) is proposed to merge the error-corrected satellite and radar rainfall estimates. In addition to SCM, we use a Bayesian spatial method for merging the gap-free radar product with rain gauges, climatological rainfall sources and SPEs. We demonstrate the method using the SPE Hydro-Estimator (HE), the radar-based Stage-II product, the climatological product PRISM and a rain gauge dataset for several rain events from 2006 to 2008 over three different geographical locations in the United States. Results show that the SCM method in combination with the Bayesian spatial model produced a precipitation product in good agreement with independent measurements. The study implies that, using the available radar pixels surrounding the gap area together with rain gauge, PRISM and satellite products, a radar-like product is achievable over radar gap areas, to the benefit of the scientific community.
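As a toy illustration of the multiplicative bias-correction step (not the paper's ensemble method), one can sample rainy pixels, compute radar-to-satellite ratios, and interpolate them into a spatially varying bias field. The inverse-distance weighting of log-ratios below is an illustrative stand-in assumption for the paper's spatial bias fields:

```python
import numpy as np

def multiplicative_bias_field(radar, satellite, n_samples=50, wet=0.1, seed=0):
    """Spatially varying multiplicative bias factors for a satellite
    precipitation field, with radar as the reference. Hypothetical
    simplification: log bias ratios at sampled rainy pixels are spread
    over the grid by inverse-distance weighting."""
    rng = np.random.default_rng(seed)
    rainy = np.argwhere((radar > wet) & (satellite > wet))
    pick = rng.choice(len(rainy), size=min(n_samples, len(rainy)), replace=False)
    idx = rainy[pick]
    logf = np.log(radar[idx[:, 0], idx[:, 1]] / satellite[idx[:, 0], idx[:, 1]])
    gy, gx = np.indices(radar.shape)
    d2 = (gy[..., None] - idx[:, 0]) ** 2 + (gx[..., None] - idx[:, 1]) ** 2
    w = 1.0 / (d2 + 1.0)                      # inverse-distance weights
    return np.exp((w * logf).sum(axis=-1) / w.sum(axis=-1))
```

The bias-corrected estimate is then `satellite * bias`, which could feed a subsequent merging step.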

  14. Systematic review on the health effects of exposure to radiofrequency electromagnetic fields from mobile phone base stations

    PubMed Central

    Frei, Patrizia; Mohler, Evelyn; Hug, Kerstin

    2010-01-01

Objective: To review and evaluate the recent literature on the health effects of exposure to mobile phone base station (MPBS) radiation. Methods: We performed a systematic review of randomized human trials conducted in laboratory settings and of epidemiological studies that investigated the health effects of MPBS radiation in the everyday environment. Findings: We included in the analysis 17 articles that met our basic quality criteria: 5 randomized human laboratory trials and 12 epidemiological studies. The majority of the papers (14) examined self-reported non-specific symptoms of ill-health. Most of the randomized trials did not detect any association between MPBS radiation and the development of acute symptoms during or shortly after exposure. The sporadically observed associations did not show a consistent pattern with regard to symptoms or types of exposure. We also found that the more sophisticated the exposure assessment, the less likely it was that an effect would be reported. Studies on health effects other than non-specific symptoms and studies on MPBS exposure in children were scarce. Conclusion: The evidence that MPBS exposure up to 10 volts per metre does not cause acute symptom development can be considered strong because it is based on randomized, blinded human laboratory trials. At present, there are insufficient data to draw firm conclusions about health effects of long-term low-level exposure typically occurring in the everyday environment. PMID:21124713

  15. Evaluation of the Efficacy of Tranexamic Acid on the Surgical Field in Primary Cleft Palate Surgery on Children-A Prospective, Randomized Clinical Study.

    PubMed

    Durga, Padmaja; Raavula, Parvathi; Gurajala, Indira; Gunnam, Poojita; Veerabathula, Prardhana; Reddy, Mukund; Upputuri, Omkar; Ramachandran, Gopinath

    2015-09-01

Objective: To assess the effect of tranexamic acid on the quality of the surgical field. Design: Prospective, randomized, double-blind study. Setting: Institutional, tertiary referral hospital. Patients: American Society of Anesthesiologists physical status class I patients, aged 8 to 60 months, with Group II or III (Balakrishnan's classification) clefts scheduled for cleft palate repair. Interventions: Children were randomized into two groups. The control group received saline, and the tranexamic acid group received tranexamic acid 10 mg/kg as a bolus 15 minutes before incision. Main Outcome Measures: Grade of surgical field on a 10-point scale, surgeon satisfaction, and primary hemorrhage. Results: The tranexamic acid group showed significant improvements over the control group in surgeon satisfaction and in the median grade of the surgical field (4 [interquartile range, 4 to 6] in the control group vs. 3 [interquartile range, 2 to 4] in the tranexamic acid group; P = .003). Conclusions: Preincision administration of 10 mg/kg of tranexamic acid significantly improved the surgical field during cleft palate repair.

  16. Extracting the field-effect mobilities of random semiconducting single-walled carbon nanotube networks: A critical comparison of methods

    NASA Astrophysics Data System (ADS)

    Schießl, Stefan P.; Rother, Marcel; Lüttgens, Jan; Zaumseil, Jana

    2017-11-01

The field-effect mobility is an important figure of merit for semiconductors such as random networks of single-walled carbon nanotubes (SWNTs). However, owing to their network properties and quantum capacitance, the standard models for field-effect transistors cannot be applied without modifications. Several different methods are used to determine the mobility with often very different results. We fabricated and characterized field-effect transistors with different polymer-sorted, semiconducting SWNT network densities ranging from low (≈6 μm⁻¹) to densely packed quasi-monolayers (≈26 μm⁻¹) with a maximum on-conductance of 0.24 μS μm⁻¹ and compared four different techniques to evaluate the field-effect mobility. We demonstrate the limits and requirements for each method with regard to device layout and carrier accumulation. We find that techniques that take into account the measured capacitance on the active device give the most reliable mobility values. Finally, we compare our experimental results to a random-resistor-network model.
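One standard technique of the kind compared here extracts the linear-regime mobility from the transconductance of the transfer curve using a measured gate capacitance. A textbook sketch of that calculation (the paper's exact procedures may differ, and the variable names are illustrative):

```python
import numpy as np

def linear_mobility(vg, i_d, c_total, length, vds):
    """Linear-regime field-effect mobility from a measured transfer curve
    (drain current i_d vs. gate voltage vg), using the device's total
    measured gate capacitance: mu = g_m * L^2 / (C_total * V_DS)."""
    gm = np.gradient(i_d, vg)              # transconductance dI_D/dV_G
    return gm.max() * length ** 2 / (c_total * vds)
```

Using the capacitance actually measured on the active device, rather than a parallel-plate estimate, is what the abstract identifies as giving the most reliable values.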

  17. Seven lessons from manyfield inflation in random potentials

    NASA Astrophysics Data System (ADS)

    Dias, Mafalda; Frazer, Jonathan; Marsh, M. C. David

    2018-01-01

We study inflation in models with many interacting fields subject to randomly generated scalar potentials. We use methods from non-equilibrium random matrix theory to construct the potentials and an adaptation of the `transport method' to evolve the two-point correlators during inflation. This construction allows, for the first time, for an explicit study of models with up to 100 interacting fields supporting a period of `approximately saddle-point' inflation. We determine the statistical predictions for observables by generating over 30,000 models with 2–100 fields supporting at least 60 efolds of inflation. These studies lead us to seven lessons: i) manyfield inflation is not single-field inflation; ii) the larger the number of fields, the simpler and sharper the predictions; iii) Planck compatibility is not rare, but future experiments may rule out this class of models; iv) the smoother the potentials, the sharper the predictions; v) hyperparameters can transition from stiff to sloppy; vi) despite tachyons, isocurvature can decay; vii) eigenvalue repulsion drives the predictions. We conclude that many of the `generic predictions' of single-field inflation can be emergent features of complex inflation models.

  18. Quincke random walkers

    NASA Astrophysics Data System (ADS)

    Pradillo, Gerardo; Heintz, Aneesh; Vlahovska, Petia

    2017-11-01

The spontaneous rotation of a sphere in an applied uniform DC electric field (Quincke effect) has been utilized to engineer self-propelled particles: if the sphere is initially resting on a surface, it rolls. Quincke rollers have been widely used as a model system to study collective behavior in "active" suspensions. If the applied field is DC, an isolated Quincke roller follows a straight-line trajectory. In this talk, we discuss the design of a Quincke roller that executes random-walk-like behavior. We utilize an AC field: upon reversal of the field direction, a fluctuation in the axis of rotation (which is degenerate in the plane perpendicular to the field and parallel to the surface) introduces randomness in the direction of motion. The MSD of an isolated Quincke walker depends on the frequency, amplitude, and waveform of the electric field. Experiment and theory are compared. We also investigate the collective behavior of Quincke walkers, the transport of inert particles in a bath of Quincke walkers, and the spontaneous motion of a drop containing Quincke active particles. Supported by NSF Grant CBET 1437545.
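The mean-squared displacement (MSD) mentioned above is computed from a recorded trajectory in the usual way, averaging squared displacements over all start times at each lag. A minimal sketch for a 2D track:

```python
import numpy as np

def msd(traj):
    """Mean-squared displacement vs. lag for a trajectory of shape (N, 2):
    MSD(k) = <|r(t+k) - r(t)|^2>, averaged over start times t."""
    n = len(traj)
    return np.array([np.mean(np.sum((traj[k:] - traj[:-k]) ** 2, axis=1))
                     for k in range(1, n)])
```

For a diffusive walker the MSD grows linearly with lag; for a ballistic roller (straight-line trajectory) it grows quadratically, which is one way the DC and AC regimes can be distinguished.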

  19. Vision Sensor-Based Road Detection for Field Robot Navigation

    PubMed Central

    Lu, Keyu; Li, Jian; An, Xiangjing; He, Hangen

    2015-01-01

    Road detection is an essential component of field robot navigation systems. Vision sensors play an important role in road detection for their great potential in environmental perception. In this paper, we propose a hierarchical vision sensor-based method for robust road detection in challenging road scenes. More specifically, for a given road image captured by an on-board vision sensor, we introduce a multiple population genetic algorithm (MPGA)-based approach for efficient road vanishing point detection. Superpixel-level seeds are then selected in an unsupervised way using a clustering strategy. Then, according to the GrowCut framework, the seeds proliferate and iteratively try to occupy their neighbors. After convergence, the initial road segment is obtained. Finally, in order to achieve a globally-consistent road segment, the initial road segment is refined using the conditional random field (CRF) framework, which integrates high-level information into road detection. We perform several experiments to evaluate the common performance, scale sensitivity and noise sensitivity of the proposed method. The experimental results demonstrate that the proposed method exhibits high robustness compared to the state of the art. PMID:26610514

  20. The standard mean-field treatment of inter-particle attraction in classical DFT is better than one might expect

    NASA Astrophysics Data System (ADS)

    Archer, Andrew J.; Chacko, Blesson; Evans, Robert

    2017-07-01

    In classical density functional theory (DFT), the part of the Helmholtz free energy functional arising from attractive inter-particle interactions is often treated in a mean-field or van der Waals approximation. On the face of it, this is a somewhat crude treatment as the resulting functional generates the simple random phase approximation (RPA) for the bulk fluid pair direct correlation function. We explain why using standard mean-field DFT to describe inhomogeneous fluid structure and thermodynamics is more accurate than one might expect based on this observation. By considering the pair correlation function g(x) and structure factor S(k) of a one-dimensional model fluid, for which exact results are available, we show that the mean-field DFT, employed within the test-particle procedure, yields results much superior to those from the RPA closure of the bulk Ornstein-Zernike equation. We argue that one should not judge the quality of a DFT based solely on the approximation it generates for the bulk pair direct correlation function.

  1. Machine Learning Classification of Heterogeneous Fields to Estimate Physical Responses

    NASA Astrophysics Data System (ADS)

    McKenna, S. A.; Akhriev, A.; Alzate, C.; Zhuk, S.

    2017-12-01

    The promise of machine learning to enhance physics-based simulation is examined here using the transient pressure response to a pumping well in a heterogeneous aquifer. 10,000 random fields of log10 hydraulic conductivity (K) are created and conditioned on a single K measurement at the pumping well. Each K-field is used as input to a forward simulation of drawdown (pressure decline). The differential equations governing groundwater flow to the well serve as a non-linear transform of the input K-field to an output drawdown field. The results are stored and the data set is split into training and testing sets for classification. A Euclidean distance measure between any two fields is calculated and the resulting distances between all pairs of fields define a similarity matrix. Similarity matrices are calculated for both input K-fields and the resulting drawdown fields at the end of the simulation. The similarity matrices are then used as input to spectral clustering to determine groupings of similar input and output fields. Additionally, the similarity matrix is used as input to multi-dimensional scaling to visualize the clustering of fields in lower dimensional spaces. We examine the ability to cluster both input K-fields and output drawdown fields separately with the goal of identifying K-fields that create similar drawdowns and, conversely, given a set of simulated drawdown fields, identify meaningful clusters of input K-fields. Feature extraction based on statistical parametric mapping provides insight into what features of the fields drive the classification results. The final goal is to successfully classify input K-fields into the correct output class, and also, given an output drawdown field, be able to infer the correct class of input field that created it.
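The pipeline sketched in the abstract (Euclidean distances between fields, a similarity matrix, then spectral methods) can be illustrated as follows; the Gaussian affinity and symmetric normalization below are common choices in spectral clustering, not necessarily the authors' exact construction:

```python
import numpy as np

def pairwise_distances(fields):
    """Euclidean distances between flattened fields, input shape (n_fields, n_cells)."""
    sq = np.sum(fields ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * fields @ fields.T
    return np.sqrt(np.maximum(d2, 0.0))

def spectral_embedding(dist, sigma, dim=2):
    """Coordinates from the top eigenvectors of a symmetrically normalized
    Gaussian affinity matrix; these feed k-means in standard spectral
    clustering, and also serve as a low-dimensional visualization."""
    a = np.exp(-dist ** 2 / (2.0 * sigma ** 2))   # similarity matrix
    d = a.sum(axis=1)
    norm = a / np.sqrt(np.outer(d, d))            # symmetric normalization
    vals, vecs = np.linalg.eigh(norm)             # eigenvalues ascending
    return vecs[:, -dim:]
```

The same embedding can be built twice, once from K-field distances and once from drawdown-field distances, to compare the two cluster structures.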

  2. Genetic distances between popcorn populations based on molecular markers and correlations with heterosis estimates made by diallel analysis of hybrids.

    PubMed

    Munhoz, R E F; Prioli, A J; Amaral, A T; Scapim, C A; Simon, G A

    2009-08-11

Diallel analysis was used to obtain information on combining ability, heterosis, estimates of genetic distances by random amplified polymorphic DNA (RAPD), and their correlations with heterosis, for the popcorn varieties RS 20, UNB2, CMS 43, CMS 42, Zélia, UEM J1, UEM M2, Beija-Flor, and Viçosa, which were crossed to obtain all possible combinations, without reciprocals. The parents and the 36 F1 hybrids were evaluated in field trials in Maringá during two growing seasons in a randomized complete block design with three replications. Based on the results, strategies for further studies were developed, including the construction of composites by joining varieties with high general combining ability for grain yield (UNB2 and CMS 42) with those with high general combining ability for popping expansion (Zélia, RS 20 and UEM M2). Based on the RAPD markers, UEM J1 and Zélia were the most genetically distant, and RS 20 and UNB2 were the most similar. The low correlation between heterosis and genetic distances may be explained by the random dispersion of the RAPD markers, which were insufficient to cover the popcorn genome. We conclude that an association between genetic dissimilarity and heterosis cannot be expected from genetic distance alone, without considering the effect of dominant loci.

  3. Axion-photon propagation in magnetized universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Chen; Lai, Dong, E-mail: wangchen@nao.cas.cn, E-mail: dong@astro.cornell.edu

Oscillations between photons and axion-like particles (ALPs) travelling in intergalactic magnetic fields have been invoked to explain a number of astrophysical phenomena, or used to constrain ALP properties from observations. One example is the anomalous transparency of the universe to TeV gamma rays. The intergalactic magnetic field is usually modeled as patches of coherent domains, each with a uniform magnetic field, but with the field orientation changing randomly from one domain to the next (the 'discrete-φ model'). We show in this paper that in more realistic situations, when the magnetic field direction varies continuously along the propagation path, the photon-to-ALP conversion probability P can be significantly different from that of the discrete-φ model. In particular, P has a distinct dependence on the photon energy and ALP mass, and can be as large as 100%. This result can affect previous constraints on ALP properties based on ALP-photon propagation in intergalactic magnetic fields, such as those from TeV photons from distant active galactic nuclei.

  4. Sonic Boom Pressure Signature Uncertainty Calculation and Propagation to Ground Noise

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Bretl, Katherine N.; Walker, Eric L.; Pinier, Jeremy T.

    2015-01-01

The objective of this study was to outline an approach for the quantification of uncertainty in sonic boom measurements and to investigate the effect of various near-field uncertainty representation approaches on ground noise predictions. These approaches included a symmetric versus asymmetric uncertainty band representation and a dispersion technique based on a partial-sum Fourier series that allows for the inclusion of random error sources in the uncertainty. The near-field uncertainty was propagated to the ground level, along with additional uncertainty in the propagation modeling. Estimates of perceived loudness were obtained for the various types of near-field uncertainty representation. Analyses were performed on three configurations of interest to the sonic boom community: the SEEB-ALR, the 69° Delta Wing, and the LM 1021-01. Results showed that representation of the near-field uncertainty plays a key role in ground noise predictions. Using a Fourier-series-based dispersion approach can double the amount of uncertainty in the ground noise compared to a pure bias representation. Compared to previous computational fluid dynamics results, uncertainty in ground noise predictions was greater when considering the near-field experimental uncertainty.

  5. Studies of the field-of-view resolution tradeoff in virtual-reality systems

    NASA Technical Reports Server (NTRS)

    Piantanida, Thomas P.; Boman, Duane; Larimer, James; Gille, Jennifer; Reed, Charles

    1992-01-01

    Most virtual-reality systems use LCD-based displays that achieve a large field-of-view at the expense of resolution. A typical display will consist of approximately 86,000 pixels uniformly distributed over an 80-degree by 60-degree image. Thus, each pixel subtends about 13 minutes of arc at the retina; about the same as the resolvable features of the 20/200 line of a Snellen Eye Chart. The low resolution of LCD-based systems limits task performance in some applications. We have examined target-detection performance in a low-resolution virtual world. Our synthesized three-dimensional virtual worlds consisted of target objects that could be positioned at a fixed distance from the viewer, but at random azimuth and constrained elevation. A virtual world could be bounded by chromatic walls or by wire-frame, or it could be unbounded. Viewers scanned these worlds and indicated by appropriate gestures when they had detected the target object. By manipulating the viewer's field size and the chromatic and luminance contrast of annuli surrounding the field-of-view, we were able to assess the effect of field size on the detection of virtual objects in low-resolution synthetic worlds.

  6. Controlling dispersion forces between small particles with artificially created random light fields

    PubMed Central

    Brügger, Georges; Froufe-Pérez, Luis S.; Scheffold, Frank; José Sáenz, Juan

    2015-01-01

    Appropriate combinations of laser beams can be used to trap and manipulate small particles with optical tweezers as well as to induce significant optical binding forces between particles. These interaction forces are usually strongly anisotropic depending on the interference landscape of the external fields. This is in contrast with the familiar isotropic, translationally invariant, van der Waals and, in general, Casimir–Lifshitz interactions between neutral bodies arising from random electromagnetic waves generated by equilibrium quantum and thermal fluctuations. Here we show, both theoretically and experimentally, that dispersion forces between small colloidal particles can also be induced and controlled using artificially created fluctuating light fields. Using optical tweezers as a gauge, we present experimental evidence for the predicted isotropic attractive interactions between dielectric microspheres induced by laser-generated, random light fields. These light-induced interactions open a path towards the control of translationally invariant interactions with tuneable strength and range in colloidal systems. PMID:26096622

  7. Driving a Superconductor to Insulator Transition with Random Gauge Fields.

    PubMed

    Nguyen, H Q; Hollen, S M; Shainline, J; Xu, J M; Valles, J M

    2016-11-30

Typically the disorder that alters the interference of particle waves to produce Anderson localization is potential scattering from randomly placed impurities. Here we show that disorder in the form of random gauge fields that act directly on particle phases can also drive localization. We present evidence of a superfluid Bose glass to insulator transition at a critical level of this gauge field disorder in a nano-patterned array of amorphous Bi islands. This transition shows signs of metallic transport near the critical point, characterized by a critical resistance indicative of a quantum phase transition. The critical disorder depends on interisland coupling, in agreement with recent quantum Monte Carlo simulations. We discuss how this disorder-tuned SIT differs from the common frustration-tuned SIT that also occurs in magnetic fields. Its discovery enables new high-fidelity comparisons between theoretical and experimental studies of disorder effects on quantum critical systems.

  8. Mindfulness Training and Reductions in Teacher Stress and Burnout: Results from Two Randomized, Waitlist-Control Field Trials

    ERIC Educational Resources Information Center

    Roeser, Robert W.; Schonert-Reichl, Kimberly A.; Jha, Amishi; Cullen, Margaret; Wallace, Linda; Wilensky, Rona; Oberle, Eva; Thomson, Kimberly; Taylor, Cynthia; Harrison, Jessica

    2013-01-01

    The effects of randomization to mindfulness training (MT) or to a waitlist-control condition on psychological and physiological indicators of teachers' occupational stress and burnout were examined in 2 field trials. The sample included 113 elementary and secondary school teachers (89% female) from Canada and the United States. Measures were…

  9. Preparing Beginning Reading Teachers: An Experimental Comparison of Initial Early Literacy Field Experiences

    ERIC Educational Resources Information Center

    Al Otaiba, Stephanie; Lake, Vickie E.; Greulich, Luana; Folsom, Jessica S.; Guidry, Lisa

    2012-01-01

    This randomized-control trial examined the learning of preservice teachers taking an initial Early Literacy course in an early childhood education program and of the kindergarten or first grade students they tutored in their field experience. Preservice teachers were randomly assigned to one of two tutoring programs: Book Buddies and Tutor…

  10. A Multisite Cluster Randomized Field Trial of Open Court Reading

    ERIC Educational Resources Information Center

    Borman, Geoffrey D.; Dowling, N. Maritza; Schneck, Carrie

    2008-01-01

    In this article, the authors report achievement outcomes of a multisite cluster randomized field trial of Open Court Reading 2005 (OCR), a K-6 literacy curriculum published by SRA/McGraw-Hill. The participants are 49 first-grade through fifth-grade classrooms from predominantly minority and poor contexts across the nation. Blocking by grade level…

  11. Characterizing individual scattering events by measuring the amplitude and phase of the electric field diffusing through a random medium.

    PubMed

    Jian, Zhongping; Pearce, Jeremy; Mittleman, Daniel M

    2003-07-18

    We describe observations of the amplitude and phase of an electric field diffusing through a three-dimensional random medium, using terahertz time-domain spectroscopy. These measurements are spatially resolved with a resolution smaller than the speckle spot size and temporally resolved with a resolution better than one optical cycle. By computing correlation functions between fields measured at different positions and with different temporal delays, it is possible to obtain information about individual scattering events experienced by the diffusing field. This represents a new method for characterizing a multiply scattered wave.

  12. Escalated convergent artificial bee colony

    NASA Astrophysics Data System (ADS)

    Jadon, Shimpi Singh; Bansal, Jagdish Chand; Tiwari, Ritu

    2016-03-01

Artificial bee colony (ABC) optimisation is a recent, fast and easy-to-implement population-based metaheuristic for optimisation. ABC has proven competitive with popular swarm intelligence-based algorithms such as particle swarm optimisation, the firefly algorithm and ant colony optimisation. The solution search equation of ABC is influenced by a random quantity which helps its search process in exploration, at the cost of exploitation. In order to obtain fast convergent behaviour in ABC while maintaining its exploitation capability, in this paper the basic ABC is modified in two ways. First, to improve exploitation capability, two local search strategies, namely classical unidimensional local search and Lévy-flight random-walk-based local search, are incorporated into ABC. Furthermore, a new solution search strategy, namely stochastic diffusion scout search, is proposed and incorporated into the scout bee phase to give abandoned solutions more chances to improve. The efficiency of the proposed algorithm is tested on 20 benchmark test functions of different complexities and characteristics. Results are very promising and prove it to be a competitive algorithm in the field of swarm intelligence-based algorithms.
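Lévy-flight random-walk local search of the kind incorporated here is typically built on Mantegna's algorithm for generating Lévy-stable step lengths: mostly small moves punctuated by occasional long jumps. The greedy search below is a generic sketch of the technique, not the paper's exact scheme (parameter names and the clamping rule are illustrative):

```python
import math
import random

def levy_step(beta=1.5, rng=random):
    """One Lévy-flight step length via Mantegna's algorithm."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = rng.gauss(0, sigma)
    v = rng.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def levy_local_search(f, x, lower, upper, iters=200, scale=0.01, seed=1):
    """Greedy Lévy-flight local search around the best-so-far solution:
    perturb every dimension by a scaled Lévy step, clamp to bounds, and
    keep the candidate only if it improves the objective."""
    rng = random.Random(seed)
    best, fbest = list(x), f(x)
    for _ in range(iters):
        cand = [min(upper, max(lower, xi + scale * levy_step(rng=rng)))
                for xi in best]
        fc = f(cand)
        if fc < fbest:
            best, fbest = cand, fc
    return best, fbest
```

The heavy-tailed step distribution lets the search exploit the neighbourhood of a good solution while still escaping shallow local basins.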

  13. Self-excitation of a nonlinear scalar field in a random medium

    PubMed Central

    Zeldovich, Ya. B.; Molchanov, S. A.; Ruzmaikin, A. A.; Sokoloff, D. D.

    1987-01-01

We discuss the evolution in time of a scalar field under the influence of a random potential and diffusion. The cases of a potential with short correlation time and of a stationary potential are considered. In a linear approximation and for sufficiently weak diffusion, the statistical moments of the field grow exponentially in time, at growth rates that progressively increase with the order of the moment; this indicates the intermittent nature of the field. Nonlinearity halts this growth and in some cases can destroy the intermittency. However, in many nonlinear situations the intermittency is preserved: high, persistent peaks of the field exist against the background of a smooth field distribution. These widely spaced peaks may make a major contribution to the average characteristics of the field. PMID:16593872

  14. Waterbodies Extraction from LANDSAT8-OLI Imagery Using a Water Index-Guided Stochastic Fully-Connected Conditional Random Field Model and the Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Wang, X.; Xu, L.

    2018-04-01

    One of the most important applications of remote sensing classification is water extraction. The water index (WI) based on Landsat images is one of the most common ways to distinguish water bodies from other land surface features. However, conventional WI methods take into account spectral information only from a limited number of bands, so their accuracy may be constrained in areas covered with snow/ice, clouds, etc. An accurate and robust water extraction method is therefore key. The support vector machine (SVM), which uses the spectral information of all bands, can reduce these classification errors to some extent. Nevertheless, the SVM barely considers spatial information and is relatively sensitive to noise in local regions. The conditional random field (CRF), which considers both spatial and spectral information, has proven able to compensate for these limitations. Hence, in this paper, we develop a systematic water extraction method that exploits the complementarity between the SVM and a water index-guided stochastic fully-connected conditional random field (SVM-WIGSFCRF) to address the above issues. In addition, we comprehensively evaluate the reliability and accuracy of the proposed method using Landsat-8 operational land imager (OLI) images of one test site. We assess the method's performance with the following accuracy metrics: omission errors (OE), commission errors (CE), the Kappa coefficient (KP) and total error (TE). Experimental results show that the new method can improve target detection accuracy under complex and changeable environments.
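As a concrete example of the water-index ingredient, here is a sketch of McFeeters' NDWI, a common WI for Landsat-8 OLI (Green is band 3, NIR is band 5), with a crude threshold mask. This only illustrates the WI; in the paper the WI instead guides a stochastic fully-connected CRF applied on top of SVM outputs.

```python
import numpy as np

def ndwi(green, nir, eps=1e-12):
    """McFeeters' NDWI = (Green - NIR) / (Green + NIR); for Landsat-8
    OLI, Green is band 3 and NIR is band 5. Values near +1 suggest
    open water, which reflects strongly in green and absorbs in NIR."""
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (green - nir) / (green + nir + eps)

def water_mask(green, nir, threshold=0.0):
    """Crude water/non-water mask from a fixed WI threshold; this is the
    simple baseline whose errors (snow/ice, clouds) motivate the paper's
    SVM-WIGSFCRF combination."""
    return ndwi(green, nir) > threshold
```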

  15. Experiences of being a control group: lessons from a UK-based randomized controlled trial of group singing as a health promotion initiative for older people.

    PubMed

    Skingley, Ann; Bungay, Hilary; Clift, Stephen; Warden, June

    2014-12-01

    Existing randomized controlled trials within the health field suggest that the concept of randomization is not always well understood and that feelings of disappointment may occur when participants are not placed in their preferred arm. This may affect a study's rigour and ethical integrity if not addressed. We aimed to test whether these issues apply to a healthy volunteer sample within a health promotion trial of singing for older people. Written comments from control group participants at two points during the trial were analysed, together with individual semi-structured interviews with a small sample (n = 11) of this group. We found that motivation to participate in the trial was largely due to the appeal of singing and disappointment resulted from allocation to the control group. Understanding of randomization was generally good and feelings of disappointment lessened over time and with a post-research opportunity to sing. Findings suggest that measures should be put in place to minimize the potential negative impacts of randomized controlled trials in health promotion research. © The Author (2013). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. Outcomes of a School-Based Intervention (RESCATE) to Improve Physical Activity Patterns in Mexican Children Aged 8-10 Years

    ERIC Educational Resources Information Center

    Colin-Ramirez, E.; Castillo-Martinez, L.; Orea-Tejeda, A.; Vergara-Castaneda, A.; Keirns-Davis, C.; Villa-Romero, A.

    2010-01-01

    The aim of this study was to evaluate the impact of an intervention program on the patterns of physical activity in 8- to 10-year-old Mexican children from lower socioeconomic status. This study performed a randomized controlled field trial in 498 children aged 8-10 years from 10 public schools of low socioeconomic status in Mexico City. Schools…

  17. Punch Response of Gels at Different Loading Rates

    DTIC Science & Technology

    2014-03-01

    calibration (4, 6). While similar in density, neither clay nor gelatin simulates the tissue structure of the human body accurately. Danelson et al. (7...the load response of human tissue. 2 Recent work on gelatins has shown promise in robotics, sensors, and microfluidics (9). Hydrogels (water-based...images of a high-contrast, random pattern of speckles and a sophisticated optimization program to measure full-field deformation. Figure 1 shows an

  18. Northwest Forest Plan—the first 10 years (1994–2003): preliminary assessment of the condition of watersheds.

    Treesearch

    Kirsten Gallo; Steven H. Lanigan; Peter Eldred; Sean N. Gordon; Chris Moyer

    2005-01-01

    We aggregated road, vegetation, and in-channel data to assess the condition of sixth-field watersheds and describe the distribution of the condition of watersheds in the Northwest Forest Plan (the Plan) area. The assessment is based on 250 watersheds selected at random within the Plan area. The distributions of conditions are presented for watersheds and for many of the...

  19. Ultrafast random-access scanning in two-photon microscopy using acousto-optic deflectors.

    PubMed

    Salomé, R; Kremer, Y; Dieudonné, S; Léger, J-F; Krichevsky, O; Wyart, C; Chatenay, D; Bourdieu, L

    2006-06-30

    Two-photon scanning microscopy (TPSM) is a powerful tool for imaging deep inside living tissues with sub-cellular resolution. The temporal resolution of TPSM is, however, strongly limited by the galvanometric mirrors used to steer the laser beam. Fast physiological events can therefore only be followed by repeatedly scanning a single line within the field of view. Because acousto-optic deflectors (AODs) are non-mechanical devices, they allow access to any point within the field of view on a microsecond time scale and are therefore excellent candidates to improve the temporal resolution of TPSM. However, the use of AOD-based scanners with femtosecond pulses raises several technical difficulties. In this paper, we describe an all-digital TPSM setup based on two crossed AODs. In particular, it includes an acousto-optic modulator (AOM) placed at 45 degrees with respect to the AODs to pre-compensate for the large spatial distortions of femtosecond pulses occurring in the AODs, in order to optimize the spatial resolution and the fluorescence excitation. Our setup allows recording from freely selectable points of interest at high speed (1 kHz). By maximizing the time spent on points of interest, random-access TPSM (RA-TPSM) constitutes a promising method for multiunit recordings with millisecond resolution in biological tissues.

  20. Conditional Random Field (CRF)-Boosting: Constructing a Robust Online Hybrid Boosting Multiple Object Tracker Facilitated by CRF Learning

    PubMed Central

    Yang, Ehwa; Gwak, Jeonghwan; Jeon, Moongu

    2017-01-01

    Due to the reasonably acceptable performance of state-of-the-art object detectors, tracking-by-detection is a standard strategy for visual multi-object tracking (MOT). Online MOT in particular is more demanding due to its diverse applications in time-critical situations. A main issue in realizing online MOT is how to associate noisy object detection results on a new frame with previously tracked objects. In this work, we propose a multi-object tracking method called CRF-boosting, which utilizes a hybrid data association method based on online hybrid boosting facilitated by a conditional random field (CRF). For data association, the learned CRF is used to generate reliable low-level tracklets, which are then used as the input of the hybrid boosting. While existing data association methods based on boosting algorithms require training data with ground truth information to improve robustness, CRF-boosting ensures sufficient robustness without such information thanks to its synergetic cascaded learning procedure. Further, a hierarchical feature association framework is adopted to further improve MOT accuracy. From experimental results on public datasets, we conclude that the benefit of the proposed hybrid approach over other competitive MOT systems is noticeable. PMID:28304366
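To make the data association problem concrete, here is a generic greedy IoU-based association of a frame's noisy detections to existing tracks. This is a simple baseline, not the paper's CRF-boosting: CRF-boosting replaces exactly this kind of hand-tuned matching with learned tracklet affinities.

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def greedy_associate(tracks, detections, min_iou=0.3):
    """Greedily match each existing track to its best unused detection by
    IoU; unmatched detections would seed new tracklets. Returns a list of
    (track_index, detection_index) pairs."""
    pairs, used = [], set()
    for ti, t in enumerate(tracks):
        scored = [(iou(t, d), di) for di, d in enumerate(detections) if di not in used]
        if scored:
            s, di = max(scored)
            if s >= min_iou:
                pairs.append((ti, di))
                used.add(di)
    return pairs
```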

  1. Optimal Bayesian Adaptive Design for Test-Item Calibration.

    PubMed

    van der Linden, Wim J; Ren, Hao

    2015-06-01

    An optimal adaptive design for test-item calibration based on Bayesian optimality criteria is presented. The design adapts the choice of field-test items to the examinees taking an operational adaptive test using both the information in the posterior distributions of their ability parameters and the current posterior distributions of the field-test parameters. Different criteria of optimality based on the two types of posterior distributions are possible. The design can be implemented using an MCMC scheme with alternating stages of sampling from the posterior distributions of the test takers' ability parameters and the parameters of the field-test items while reusing samples from earlier posterior distributions of the other parameters. Results from a simulation study demonstrated the feasibility of the proposed MCMC implementation for operational item calibration. A comparison of performances for different optimality criteria showed faster calibration of substantial numbers of items for the criterion of D-optimality relative to A-optimality, a special case of c-optimality, and random assignment of items to the test takers.
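A minimal sketch of the D-optimality idea for item calibration, under simplifying assumptions: 2PL items, point estimates of item parameters, and a handful of posterior draws of the examinee's ability standing in for the full MCMC scheme. The function names and the items data structure are illustrative, not the paper's.

```python
import numpy as np

def twopl_info(a, b, theta):
    """Fisher information matrix of a 2PL item's (a, b) parameters at
    ability theta: p(1-p) * [[(t-b)^2, -a(t-b)], [-a(t-b), a^2]]."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    w = p * (1 - p)
    d = theta - b
    return w * np.array([[d * d, -a * d], [-a * d, a * a]])

def pick_item_d_optimal(items, theta_draws):
    """D-optimality: assign the field-test item whose updated information
    matrix (accumulated info plus the expected contribution, averaged over
    posterior draws of the examinee's ability) has the largest determinant.
    items is a list of (info_so_far, a_hat, b_hat) tuples."""
    best, best_score = None, -np.inf
    for idx, (info, a, b) in enumerate(items):
        contrib = np.mean([twopl_info(a, b, t) for t in theta_draws], axis=0)
        score = np.linalg.det(info + contrib)
        if score > best_score:
            best, best_score = idx, score
    return best
```

An item whose difficulty matches the examinee's ability contributes most information, so it is preferred over a badly mismatched one.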

  2. Defect detection around rebars in concrete using focused ultrasound and reverse time migration.

    PubMed

    Beniwal, Surendra; Ganguli, Abhijit

    2015-09-01

    Experimental and numerical investigations have been performed to assess the feasibility of damage detection around rebars in concrete using focused ultrasound and a Reverse Time Migration (RTM) based subsurface imaging algorithm. Since concrete is heterogeneous, an unfocused ultrasonic field will be randomly scattered by the aggregates, thereby masking information about damage(s). A focused ultrasonic field, on the other hand, increases the possibility of detection of an anomaly due to enhanced amplitude of the incident field in the focal region. Further, the RTM based reconstruction using scattered focused field data is capable of creating clear images of the inspected region of interest. Since scattering of a focused field by a damaged rebar differs qualitatively from that of an undamaged rebar, distinct images of damaged and undamaged situations are obtained in the RTM generated images. This is demonstrated with both numerical and experimental investigations. The total scattered field, acquired on the surface of the concrete medium, is used as input for the RTM algorithm to generate the subsurface image that helps to identify the damage. The proposed technique, therefore, has some advantage since knowledge about the undamaged scenario for the concrete medium is not necessary to assess its integrity. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Small-scale filament eruptions as the driver of X-ray jets in solar coronal holes.

    PubMed

    Sterling, Alphonse C; Moore, Ronald L; Falconer, David A; Adams, Mitzi

    2015-07-23

    Solar X-ray jets are thought to be made by a burst of reconnection of closed magnetic field at the base of a jet with ambient open field. In the accepted version of the 'emerging-flux' model, such a reconnection occurs at a plasma current sheet between the open field and the emerging closed field, and also forms a localized X-ray brightening that is usually observed at the edge of the jet's base. Here we report high-resolution X-ray and extreme-ultraviolet observations of 20 randomly selected X-ray jets that form in coronal holes at the Sun's poles. In each jet, contrary to the emerging-flux model, a miniature version of the filament eruptions that initiate coronal mass ejections drives the jet-producing reconnection. The X-ray bright point occurs by reconnection of the 'legs' of the minifilament-carrying erupting closed field, analogous to the formation of solar flares in larger-scale eruptions. Previous observations have found that some jets are driven by base-field eruptions, but only one such study, of only one jet, provisionally questioned the emerging-flux model. Our observations support the view that solar filament eruptions are formed by a fundamental explosive magnetic process that occurs on a vast range of scales, from the biggest mass ejections and flare eruptions down to X-ray jets, and perhaps even down to smaller jets that may power coronal heating. A similar scenario has previously been suggested, but was inferred from different observations and based on a different origin of the erupting minifilament.

  4. Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling

    PubMed Central

    Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David

    2016-01-01

    Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging. PMID:27555464
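The sampling scheme described above can be sketched directly: pick a uniformly random center pixel, then include nearby pixels with probability decaying in their distance from the center. The Gaussian falloff and its width are illustrative assumptions; the paper's exact distance-dependent probability may differ.

```python
import numpy as np

def localized_random_samples(shape, n_centers, radius, rng=None):
    """Localized random sampling sketch: for each of n_centers uniformly
    random center pixels, include each pixel within `radius` with a
    probability that decays (Gaussian here) with distance from the center.
    Returns a boolean sample mask over the image grid."""
    rng = np.random.default_rng(rng)
    h, w = shape
    mask = np.zeros(shape, dtype=bool)
    ys, xs = np.mgrid[0:h, 0:w]
    for _ in range(n_centers):
        cy, cx = rng.integers(0, h), rng.integers(0, w)
        d2 = (ys - cy) ** 2 + (xs - cx) ** 2
        prob = np.exp(-d2 / (2.0 * (radius / 2.0) ** 2)) * (d2 <= radius ** 2)
        mask |= rng.random(shape) < prob
    return mask
```

Each measurement set is thus local (like a receptive field) yet randomly placed, in contrast to uniformly-random sampling over the whole image.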

  5. Overview of emerging nonvolatile memory technologies

    PubMed Central

    2014-01-01

    Nonvolatile memory technologies in Si-based electronics date back to the 1990s. The ferroelectric field-effect transistor (FeFET) was one of the most promising devices for replacing conventional Flash memory, which was already facing physical scaling limitations at that time. Flash memory, a variant of charge storage memory, is widely used in consumer electronic products such as cell phones and music players, while NAND Flash-based solid-state disks (SSDs) are increasingly displacing hard disk drives as the primary storage device in laptops, desktops, and even data centers. The integration limit of Flash memories is approaching, and many new types of memory have been proposed to replace them. Emerging memory technologies promise to store more data at less cost than the expensive-to-build silicon chips used by popular consumer gadgets, including digital cameras, cell phones and portable music players, and are being investigated as potential alternatives to existing memories in future computing systems. Emerging nonvolatile memory technologies such as magnetic random-access memory (MRAM), spin-transfer torque random-access memory (STT-RAM), ferroelectric random-access memory (FeRAM), phase-change memory (PCM), and resistive random-access memory (RRAM) combine the speed of static random-access memory (SRAM), the density of dynamic random-access memory (DRAM), and the nonvolatility of Flash memory, and so become very attractive candidates for future memory hierarchies. Many other new classes of emerging memory technologies, such as transparent and plastic, three-dimensional (3-D), and quantum dot memory technologies, have also gained tremendous popularity in recent years. It is no exaggeration to say that computer memory could soon earn the ultimate commercial validation of scale-up and mass production: the cheap plastic knockoff. Therefore, this review is devoted to this rapidly developing new class of memory technologies and their scaling, based on an investigation of recent progress in advanced Flash memory devices. PMID:25278820

  6. Overview of emerging nonvolatile memory technologies.

    PubMed

    Meena, Jagan Singh; Sze, Simon Min; Chand, Umesh; Tseng, Tseung-Yuen

    2014-01-01

    Nonvolatile memory technologies in Si-based electronics date back to the 1990s. The ferroelectric field-effect transistor (FeFET) was one of the most promising devices for replacing conventional Flash memory, which was already facing physical scaling limitations at that time. Flash memory, a variant of charge storage memory, is widely used in consumer electronic products such as cell phones and music players, while NAND Flash-based solid-state disks (SSDs) are increasingly displacing hard disk drives as the primary storage device in laptops, desktops, and even data centers. The integration limit of Flash memories is approaching, and many new types of memory have been proposed to replace them. Emerging memory technologies promise to store more data at less cost than the expensive-to-build silicon chips used by popular consumer gadgets, including digital cameras, cell phones and portable music players, and are being investigated as potential alternatives to existing memories in future computing systems. Emerging nonvolatile memory technologies such as magnetic random-access memory (MRAM), spin-transfer torque random-access memory (STT-RAM), ferroelectric random-access memory (FeRAM), phase-change memory (PCM), and resistive random-access memory (RRAM) combine the speed of static random-access memory (SRAM), the density of dynamic random-access memory (DRAM), and the nonvolatility of Flash memory, and so become very attractive candidates for future memory hierarchies. Many other new classes of emerging memory technologies, such as transparent and plastic, three-dimensional (3-D), and quantum dot memory technologies, have also gained tremendous popularity in recent years. It is no exaggeration to say that computer memory could soon earn the ultimate commercial validation of scale-up and mass production: the cheap plastic knockoff. Therefore, this review is devoted to this rapidly developing new class of memory technologies and their scaling, based on an investigation of recent progress in advanced Flash memory devices.

  7. Box-Cox Mixed Logit Model for Travel Behavior Analysis

    NASA Astrophysics Data System (ADS)

    Orro, Alfonso; Novales, Margarita; Benitez, Francisco G.

    2010-09-01

    To represent the behavior of travelers deciding how to get to their destination, discrete choice models based on random utility theory have become one of the most widely used tools. The field in which these models were developed was halfway between econometrics and transport engineering, although the latter now constitutes one of their principal areas of application. In the transport field, they have mainly been applied to mode choice, but also to the selection of destination, route, and other important decisions such as vehicle ownership. In usual practice, the most frequently employed discrete choice models implement a fixed-coefficient utility function that is linear in the parameters. The principal aim of this paper is to demonstrate the viability of specifying utility functions with random coefficients that are nonlinear in the parameters in applications of discrete choice models to transport. Nonlinear specifications in the parameters were present in discrete choice theory at its outset, although they have seldom been used in practice until recently. The specification of random coefficients, however, began with the probit and the hedonic models in the 1970s and, after a period of apparently little practical interest, has burgeoned into a field of intense activity in recent years with the new generation of mixed logit models. In this communication, we present an original Box-Cox mixed logit model, which includes the estimation of the Box-Cox exponents in addition to the parameters of the random coefficient distributions. The probability of choosing an alternative is an integral that is calculated by simulation. The model is estimated by maximizing the simulated log-likelihood of a sample of observed individual choices between alternatives. The differences between the predictions yielded by models that are inconsistent with real behavior have been studied with simulation experiments.
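A minimal one-attribute sketch of the simulation step: the coefficient on a Box-Cox-transformed attribute is random (normal here, as an assumption), and the choice probability integral is approximated by averaging logit probabilities over draws. The paper estimates the Box-Cox exponent jointly with the coefficient distribution by maximum simulated likelihood; that outer optimization is omitted.

```python
import numpy as np

def box_cox(x, lam):
    """Box-Cox transform (x^lam - 1)/lam, with the log limit at lam = 0."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if abs(lam) < 1e-10 else (x ** lam - 1.0) / lam

def simulated_choice_probs(attrs, lam, beta_mean, beta_sd, n_draws=500, rng=0):
    """Mixed logit choice probabilities by simulation: draw the random
    coefficient, form utilities beta * box_cox(attr), apply the logit
    formula per draw, and average. attrs holds one attribute value per
    alternative (e.g. travel time)."""
    rng = np.random.default_rng(rng)
    z = box_cox(attrs, lam)                      # transformed attribute per alternative
    betas = rng.normal(beta_mean, beta_sd, n_draws)
    u = betas[:, None] * z[None, :]              # draws x alternatives utility matrix
    expu = np.exp(u - u.max(axis=1, keepdims=True))
    probs = expu / expu.sum(axis=1, keepdims=True)
    return probs.mean(axis=0)                    # simulated P(choose each alternative)
```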

  8. Some Metric Properties of Planar Gaussian Free Field

    NASA Astrophysics Data System (ADS)

    Goswami, Subhajit

    In this thesis we study the properties of some metrics arising from the two-dimensional Gaussian free field (GFF), namely the Liouville first-passage percolation (Liouville FPP), the Liouville graph distance and an effective resistance metric. In Chapter 1, we define these metrics and discuss the motivations for studying them. Roughly speaking, Liouville FPP is the shortest-path metric in a planar domain D where the length of a path P is given by ∫_P e^{γh(z)} |dz|, where h is the GFF on D and γ > 0. In Chapter 2, we present an upper bound on the expected Liouville FPP distance between two typical points for small values of γ (the near-Euclidean regime). A similar upper bound is derived in Chapter 3 for the Liouville graph distance, which is, roughly, the minimal number of Euclidean balls with comparable Liouville quantum gravity (LQG) measure whose union contains a continuous path between two endpoints. Our bounds seem to be in disagreement with Watabiki's prediction (1993) on the random metric of Liouville quantum gravity in this regime. The contents of these two chapters are based on joint work with Jian Ding. In Chapter 4, we derive some asymptotic estimates for effective resistances on a random network defined as follows. Given any γ > 0 and for η = {η_v}_{v∈Z²} denoting a sample of the two-dimensional discrete Gaussian free field on Z² pinned at the origin, we equip the edge (u, v) with conductance e^{γ(η_u + η_v)}. The metric structure of effective resistance plays a crucial role in our proof of the main result in Chapter 4. The primary motivation behind this metric is to understand the random walk on Z² where the edge (u, v) has weight e^{γ(η_u + η_v)}. Using the estimates from Chapter 4 we show in Chapter 5 that for almost every η, this random walk is recurrent and that, with probability tending to 1 as T → ∞, the return probability at time 2T decays as T^{-1+o(1)}. In addition, we prove a version of subdiffusive behavior by showing that the expected exit time from a ball of radius N scales as N^{ψ(γ)+o(1)} with ψ(γ) > 2 for all γ > 0. The contents of these chapters are based on joint work with Marek Biskup and Jian Ding.

  9. Assessment Data-Informed Guidance to Individualize Kindergarten Reading Instruction: Findings from a Cluster-Randomized Control Field Trial

    ERIC Educational Resources Information Center

    Al Otaiba, Stephanie; Connor, Carol M.; Folsom, Jessica S.; Greulich, Luana; Meadows, Jane; Li, Zhi

    2011-01-01

    The purpose of this cluster-randomized control field trial was to examine whether kindergarten teachers could learn to differentiate classroom reading instruction using Individualized Student Instruction for Kindergarten (ISI-K) and to test the efficacy of differentiation on reading outcomes. The study involved 14 schools, 23 ISI-K (n = 305…

  10. Group field theory and tensor networks: towards a Ryu–Takayanagi formula in full quantum gravity

    NASA Astrophysics Data System (ADS)

    Chirco, Goffredo; Oriti, Daniele; Zhang, Mingyi

    2018-06-01

    We establish a dictionary between group field theory (thus, spin networks and random tensors) states and generalized random tensor networks. Then, we use this dictionary to compute the Rényi entropy of such states and recover the Ryu–Takayanagi formula, in two different cases corresponding to two different truncations/approximations, suggested by the established correspondence.

  11. The Effect of Teacher-Family Communication on Student Engagement: Evidence from a Randomized Field Experiment

    ERIC Educational Resources Information Center

    Kraft, Matthew A.; Dougherty, Shaun M.

    2013-01-01

    In this study, we evaluate the efficacy of teacher communication with parents and students as a means of increasing student engagement. We estimate the causal effect of teacher communication by conducting a randomized field experiment in which sixth- and ninth-grade students were assigned to receive a daily phone call home and a text/written…

  12. Sound Source Localization Using Non-Conformal Surface Sound Field Transformation Based on Spherical Harmonic Wave Decomposition

    PubMed Central

    Zhang, Lanyue; Ding, Dandan; Yang, Desen; Wang, Jia; Shi, Jie

    2017-01-01

    Spherical microphone arrays have received increasing attention for their ability to locate a sound source at an arbitrary incident angle in three-dimensional space. Low-frequency sound sources are usually located using spherical near-field acoustic holography. In the conventional sound field transformation based on the generalized Fourier transform, the reconstruction surface and holography surface are conformal. When the sound source lies on a cylindrical surface, it is difficult to locate using a spherical-surface conformal transform. This paper proposes a non-conformal sound field transformation that builds a transfer matrix based on spherical harmonic wave decomposition, which can transform a spherical surface into a cylindrical surface using spherical array data. The theoretical expressions of the proposed method are deduced, and its performance is simulated. Moreover, a sound source localization experiment using a spherical array with randomly and uniformly distributed elements is carried out. Results show that the non-conformal surface sound field transformation from a spherical surface to a cylindrical surface is realized by the proposed method. The localization deviation is around 0.01 m, and the resolution is around 0.3 m. The application of the spherical array is extended, and its localization ability is improved. PMID:28489065

  13. The citation merit of scientific publications.

    PubMed

    Crespo, Juan A; Ortuño-Ortín, Ignacio; Ruiz-Castillo, Javier

    2012-01-01

    We propose a new method to assess the merit of any set of scientific papers in a given field based on the citations they receive. Given a field and a citation impact indicator, such as the mean citation or the [Formula: see text]-index, the merit of a given set of [Formula: see text] articles is identified with the probability that a randomly drawn set of [Formula: see text] articles from a given pool of articles in that field has a lower citation impact according to the indicator in question. The method allows for comparisons between sets of articles of different sizes and fields. Using a dataset acquired from Thomson Scientific that contains the articles published in the periodical literature in the period 1998-2007, we show that the novel approach yields rankings of research units different from those obtained by a direct application of the mean citation or the [Formula: see text]-index.
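The merit measure described above can be estimated with a short Monte Carlo sketch: draw many same-size sets from the field's pool of articles and count how often their citation impact falls below that of the set being evaluated. The function names and the mean-citation indicator are illustrative; any indicator (e.g. an h-type index) can be plugged in.

```python
import random

def citation_merit(my_set, pool, indicator, trials=10000, seed=0):
    """Merit of a set of n articles = estimated probability that a
    randomly drawn set of the same size n from the field's pool has a
    LOWER citation impact under the chosen indicator."""
    rng = random.Random(seed)
    n = len(my_set)
    target = indicator(my_set)
    lower = sum(indicator(rng.sample(pool, n)) < target for _ in range(trials))
    return lower / trials

# One possible indicator: the mean citation count of the set.
mean_citation = lambda cites: sum(cites) / len(cites)
```

Because the result is a probability, it is directly comparable across sets of different sizes and across fields with different citation cultures, which is the point of the method.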

  14. Far field beam pattern of one MW combined beam of laser diode array amplifiers for space power transmission

    NASA Technical Reports Server (NTRS)

    Kwon, Jin H.; Lee, Ja H.

    1989-01-01

    The far-field beam pattern and the power-collection efficiency are calculated for a multistage laser-diode-array amplifier consisting of about 200,000 5-W laser diode arrays with random distributions of phase and orientation errors and random diode failures. The numerical calculation shows that the far-field beam pattern is little affected by random failures of up to 20 percent of the laser diodes, taking 80 percent receiving efficiency in the center spot as the reference. The random phase differences among laser diodes due to probable manufacturing errors can be tolerated up to about 0.2 times the wavelength. The maximum allowable orientation error is about 20 percent of the diffraction angle of a single laser diode aperture (about 1 cm). The preliminary results indicate that the amplifier could be used for space beam-power transmission with an efficiency of about 80 percent for a moderate-size (3-m-diameter) receiver placed at a distance of less than 50,000 km.
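A toy scalar model of the effect being quantified: the on-axis intensity of a coherently combined array is the squared magnitude of a sum of unit phasors, degraded by random phase errors and random failures. This is a sketch under stated assumptions (Gaussian phase errors, scalar phasors), not the paper's full diffraction calculation.

```python
import numpy as np

def on_axis_intensity(n_diodes, phase_err_wavelengths, failure_frac, rng=0):
    """Relative on-axis (center-spot) intensity of a combined diode array:
    sum unit phasors with Gaussian phase errors (std given as a fraction
    of the wavelength) and a random fraction of failed diodes, normalised
    by the error-free intensity n_diodes**2."""
    rng = np.random.default_rng(rng)
    alive = rng.random(n_diodes) >= failure_frac
    phases = rng.normal(0.0, 2 * np.pi * phase_err_wavelengths, n_diodes)
    field = np.sum(alive * np.exp(1j * phases))
    return float(np.abs(field) ** 2 / n_diodes ** 2)
```

In this model, 20 percent random failures reduce the center-spot intensity to roughly the square of the surviving fraction, and a 0.2-wavelength phase spread costs considerably more, consistent with the trade-offs the abstract reports.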

  15. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors.

    PubMed

    El-Diasty, Mohammed; Pagiatakis, Spiros

    2009-01-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at -40 °C, -20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.
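A minimal sketch of the AR-based Gauss-Markov idea for a single sensor channel: fit a first-order AR coefficient to static data and convert it to a GM correlation time via phi = exp(-dt/tau). This first-order, single-temperature version is an illustration; the paper fits AR-based GM models per temperature point and uses them inside a UKF.

```python
import numpy as np

def ar1_coefficient(x):
    """Least-squares estimate of phi in the AR(1) model
    x_k = phi * x_{k-1} + w_k (data are mean-centred first)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1]))

def gm_correlation_time(phi, dt):
    """First-order Gauss-Markov correlation time: since the discretised
    GM process has phi = exp(-dt/tau), tau = -dt / ln(phi)."""
    return -dt / np.log(phi)
```

The temperature dependence reported in the paper would appear here as a different estimated tau for static data collected at each chamber temperature.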

  16. Quenched Large Deviations for Simple Random Walks on Percolation Clusters Including Long-Range Correlations

    NASA Astrophysics Data System (ADS)

    Berger, Noam; Mukherjee, Chiranjib; Okamura, Kazuki

    2018-03-01

    We prove a quenched large deviation principle (LDP) for a simple random walk on a supercritical percolation cluster (SRWPC) on {Z^d} ({d ≥ 2}). The models under interest include classical Bernoulli bond and site percolation as well as models that exhibit long range correlations, like the random cluster model, the random interlacement and the vacant set of random interlacements (for {d ≥ 3}) and the level sets of the Gaussian free field ({d≥ 3}). Inspired by the methods developed by Kosygina et al. (Commun Pure Appl Math 59:1489-1521, 2006) for proving quenched LDP for elliptic diffusions with a random drift, and by Yilmaz (Commun Pure Appl Math 62(8):1033-1075, 2009) and Rosenbluth (Quenched large deviations for multidimensional random walks in a random environment: a variational formula. Ph.D. thesis, NYU, arXiv:0804.1444v1) for similar results regarding elliptic random walks in random environment, we take the point of view of the moving particle and prove a large deviation principle for the quenched distribution of the pair empirical measures of the environment Markov chain in the non-elliptic case of SRWPC. Via a contraction principle, this reduces easily to a quenched LDP for the distribution of the mean velocity of the random walk and both rate functions admit explicit variational formulas. The main difficulty in our set up lies in the inherent non-ellipticity as well as the lack of translation-invariance stemming from conditioning on the fact that the origin belongs to the infinite cluster. We develop a unifying approach for proving quenched large deviations for SRWPC based on exploiting coercivity properties of the relative entropies in the context of convex variational analysis, combined with input from ergodic theory and invoking geometric properties of the supercritical percolation cluster.

  18. Randomized, Prospective Study of the Impact of a Sleep Health Program on Firefighter Injury and Disability.

    PubMed

    Sullivan, Jason P; O'Brien, Conor S; Barger, Laura K; Rajaratnam, Shantha M W; Czeisler, Charles A; Lockley, Steven W

    2017-01-01

    Firefighters' schedules include extended shifts and long work weeks which cause sleep deficiency and circadian rhythm disruption. Many firefighters also suffer from undiagnosed sleep disorders, exacerbating fatigue. We tested the hypothesis that a workplace-based Sleep Health Program (SHP) incorporating sleep health education and sleep disorders screening would improve firefighter health and safety compared to standard practice. Prospective station-level randomized, field-based intervention. US fire department. 1189 firefighters. Sleep health education, questionnaire-based sleep disorders screening, and sleep clinic referrals for respondents who screened positive for a sleep disorder. Firefighters were randomized by station. Using departmental records, in an intention-to-treat analysis, firefighters assigned to intervention stations which participated in education sessions and had the opportunity to complete sleep disorders screening reported 46% fewer disability days than those assigned to control stations (1.4 ± 5.9 vs. 2.6 ± 8.5 days/firefighter, respectively; p = .003). There were no significant differences in departmental injury or motor vehicle crash rates between the groups. In post hoc analysis accounting for intervention exposure, firefighters who attended education sessions were 24% less likely to file at least one injury report during the study than those who did not attend, regardless of randomization (OR [95% CI] 0.76 [0.60, 0.98]; χ2 = 4.56; p = .033). There were no significant changes pre- versus post-study in self-reported sleep or sleepiness in those who participated in the intervention. A firefighter workplace-based SHP providing sleep health education and sleep disorders screening opportunity can reduce injuries and work loss due to disability in firefighters.

  19. Random electric field instabilities of relaxor ferroelectrics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arce-Gamboa, Jose R.; Guzman-Verri, Gian G.

    Relaxor ferroelectrics are complex oxide materials that offer a rather unique setting in which to study the effects of compositional disorder on phase transitions. Here, we study the effects of quenched cubic random electric fields on the lattice instabilities that lead to a ferroelectric transition and show that, within a microscopic model and a statistical mechanical solution, even weak compositional disorder can prohibit the development of long-range order, and that a random field state with anisotropic and power-law correlations of polarization emerges from the combined effect of their characteristic dipole forces and their inherent charge disorder. We compare with and reproduce several key experimental observations in the well-studied relaxor PbMg1/3Nb2/3O3–PbTiO3.

  1. Possible Statistics of Two Coupled Random Fields: Application to Passive Scalar

    NASA Technical Reports Server (NTRS)

    Dubrulle, B.; He, Guo-Wei; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    We use the relativity postulate of scale invariance to derive the similarity transformations between two coupled scale-invariant random fields at different scales. We find the equations leading to the scaling exponents. This formulation is applied to the case of passive scalars advected i) by a random Gaussian velocity field; and ii) by a turbulent velocity field. In the Gaussian case, we show that the passive scalar increments follow a log-Levy distribution generalizing Kraichnan's solution and, in an appropriate limit, a log-normal distribution. In the turbulent case, we show that when the velocity increments follow log-Poisson statistics, the passive scalar increments follow statistics close to log-Poisson. This result explains the experimental observations of Ruiz et al. regarding temperature increments.

  2. Sustainability of transport structures - some aspects of the nonlinear reliability assessment

    NASA Astrophysics Data System (ADS)

    Pukl, Radomír; Sajdlová, Tereza; Strauss, Alfred; Lehký, David; Novák, Drahomír

    2017-09-01

    Efficient techniques for the nonlinear numerical analysis of concrete structures and advanced stochastic simulation methods have been combined in order to offer an advanced tool for the assessment of realistic behaviour, failure and safety of transport structures. The utilized approach is based on randomization of the nonlinear finite element analysis of the structural models. Degradation aspects such as carbonation of concrete can be accounted for in order to predict the durability and sustainability of the investigated structure. Results can serve as a rational basis for performance and sustainability assessment based on advanced nonlinear computer analysis of structures of the transport infrastructure, such as bridges or tunnels. In the stochastic simulation, the input material parameters obtained from material tests, including their randomness and uncertainty, are represented as random variables or random fields. Appropriate identification of material parameters is crucial for the virtual failure modelling of structures and structural elements. An inverse analysis approach using artificial neural networks and virtual stochastic simulations is applied to determine the fracture-mechanical parameters of the structural material and its numerical model. Structural response, reliability and sustainability have been investigated for different types of transport structures made from various materials using the above-mentioned methodology and tools.

  3. Correlated Fluctuations in Strongly Coupled Binary Networks Beyond Equilibrium

    NASA Astrophysics Data System (ADS)

    Dahmen, David; Bos, Hannah; Helias, Moritz

    2016-07-01

    Randomly coupled Ising spins constitute the classical model of collective phenomena in disordered systems, with applications covering glassy magnetism and frustration, combinatorial optimization, protein folding, stock market dynamics, and social dynamics. The phase diagram of these systems is obtained in the thermodynamic limit by averaging over the quenched randomness of the couplings. However, many applications require the statistics of activity for a single realization of the possibly asymmetric couplings in finite-sized networks. Examples include reconstruction of couplings from the observed dynamics, representation of probability distributions for sampling-based inference, and learning in the central nervous system based on the dynamic and correlation-dependent modification of synaptic connections. The systematic cumulant expansion for kinetic binary (Ising) threshold units with strong, random, and asymmetric couplings presented here goes beyond mean-field theory and is applicable outside thermodynamic equilibrium; a system of approximate nonlinear equations predicts average activities and pairwise covariances in quantitative agreement with full simulations down to hundreds of units. The linearized theory yields an expansion of the correlation and response functions in collective eigenmodes, leads to an efficient algorithm solving the inverse problem, and shows that correlations are invariant under scaling of the interaction strengths.

  4. A Markov random field based approach to the identification of meat and bone meal in feed by near-infrared spectroscopic imaging.

    PubMed

    Jiang, Xunpeng; Yang, Zengling; Han, Lujia

    2014-07-01

    Contaminated meat and bone meal (MBM) in animal feedstuff has been the source of bovine spongiform encephalopathy (BSE) disease in cattle, leading to a ban in its use, so methods for its detection are essential. In this study, five pure feed and five pure MBM samples were used to prepare two sets of sample arrangements: set A for investigating the discrimination of individual feed/MBM particles and set B for larger numbers of overlapping particles. The two sets were used to test a Markov random field (MRF)-based approach. A Fourier transform infrared (FT-IR) imaging system was used for data acquisition. The spatial resolution of the near-infrared (NIR) spectroscopic image was 25 μm × 25 μm. Each spectrum was the average of 16 scans across the wavenumber range 7,000–4,000 cm⁻¹, at intervals of 8 cm⁻¹. This study introduces an innovative approach to analyzing NIR spectroscopic images: an MRF-based approach has been developed using the iterated conditional mode (ICM) algorithm, integrating initial labeling-derived results from support vector machine discriminant analysis (SVMDA) and observation data derived from the results of principal component analysis (PCA). The results showed that MBM covered by feed could be successfully recognized with an overall accuracy of 86.59% and a Kappa coefficient of 0.68. Compared with conventional methods, the MRF-based approach is capable of extracting spectral information combined with spatial information from NIR spectroscopic images. This new approach enhances the identification of MBM using NIR spectroscopic imaging.
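
    The ICM labeling step described above can be sketched as follows. This is a minimal sketch, not the paper's implementation: the grid size, cost values and smoothness weight `beta` are hypothetical, and per-pixel class costs stand in for the SVMDA/PCA-derived terms.

```python
def icm_segment(unary, beta, n_labels, n_iter=10):
    """Iterated conditional modes for a Potts-style MRF: repeatedly relabel
    each pixel to the class minimizing its unary cost plus a smoothness
    penalty for disagreeing with its 4-neighbours."""
    h, w = len(unary), len(unary[0])
    # initial labeling from the per-pixel costs (standing in for SVMDA output)
    labels = [[min(range(n_labels), key=lambda k: unary[i][j][k])
               for j in range(w)] for i in range(h)]
    for _ in range(n_iter):
        changed = False
        for i in range(h):
            for j in range(w):
                nbrs = [labels[i + di][j + dj]
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= i + di < h and 0 <= j + dj < w]
                def local_energy(k):
                    return unary[i][j][k] + beta * sum(k != lab for lab in nbrs)
                best = min(range(n_labels), key=local_energy)
                if best != labels[i][j]:
                    labels[i][j], changed = best, True
        if not changed:        # converged: no pixel changed in a full sweep
            break
    return labels

# Toy 4x4 "image": costs favour class 0 on the left half and class 1 on the
# right, with one noisy pixel that the smoothness prior should correct.
unary = [[[0.2, 0.8] if j < 2 else [0.8, 0.2] for j in range(4)] for i in range(4)]
unary[1][0] = [0.9, 0.1]
labels = icm_segment(unary, beta=0.5, n_labels=2)
```

    With this smoothness weight, the noisy pixel's label is overruled by its three agreeing neighbours, which is the spatial regularization the abstract credits for the accuracy gain.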

  5. Spatial-Temporal Data Collection with Compressive Sensing in Mobile Sensor Networks

    PubMed Central

    Li, Jiayin; Guo, Wenzhong; Chen, Zhonghui; Xiong, Neal

    2017-01-01

    Compressive sensing (CS) provides an energy-efficient paradigm for data gathering in wireless sensor networks (WSNs). However, the existing work on spatial-temporal data gathering using compressive sensing only considers either multi-hop relaying based or multiple random walks based approaches. In this paper, we exploit the mobility pattern for spatial-temporal data collection and propose a novel mobile data gathering scheme by employing the Metropolis-Hastings algorithm with delayed acceptance, an improved random walk algorithm for a mobile collector to collect data from a sensing field. The proposed scheme exploits Kronecker compressive sensing (KCS) for spatial-temporal correlation of sensory data by allowing the mobile collector to gather temporal compressive measurements from a small subset of randomly selected nodes along a random routing path. More importantly, from the theoretical perspective we prove that the equivalent sensing matrix constructed from the proposed scheme for spatial-temporal compressible signal can satisfy the property of KCS models. The simulation results demonstrate that the proposed scheme can not only significantly reduce communication cost but also improve recovery accuracy for mobile data gathering compared to the other existing schemes. In particular, we also show that the proposed scheme is robust in unreliable wireless environment under various packet losses. All this indicates that the proposed scheme can be an efficient alternative for data gathering application in WSNs. PMID:29117152
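
    The Metropolis-Hastings random-walk idea behind the collector's route can be sketched in a toy setting. The grid topology, step count and acceptance rule below are illustrative assumptions, not the paper's network model or its delayed-acceptance variant: proposing a uniform neighbour and accepting with probability min(1, deg(u)/deg(v)) makes the walk's stationary distribution uniform over nodes.

```python
import random

def mh_walk(adj, start, n_steps, rng):
    """Metropolis-Hastings random walk: propose a uniform neighbour v of the
    current node u and accept with min(1, deg(u)/deg(v)).  The stationary
    distribution is then uniform over nodes, so no sensor region is
    over-visited the way a plain random walk over-visits high-degree nodes."""
    path, u = [start], start
    for _ in range(n_steps):
        v = rng.choice(adj[u])
        if rng.random() < min(1.0, len(adj[u]) / len(adj[v])):
            u = v                     # accept the move
        path.append(u)                # a rejection re-samples the current node
    return path

def grid_adj(n):
    """4-connected n x n grid graph standing in for the sensing field."""
    return {(i, j): [(i + di, j + dj)
                     for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if 0 <= i + di < n and 0 <= j + dj < n]
            for i in range(n) for j in range(n)}

rng = random.Random(0)
path = mh_walk(grid_adj(5), start=(2, 2), n_steps=200, rng=rng)
```

    Each visited node would contribute one compressive measurement along the route; the KCS recovery stage is beyond this sketch.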

  7. Dynamical eigenfunction decomposition of turbulent channel flow

    NASA Technical Reports Server (NTRS)

    Ball, K. S.; Sirovich, L.; Keefe, L. R.

    1991-01-01

    The results of an analysis of low-Reynolds-number turbulent channel flow based on the Karhunen-Loeve (K-L) expansion are presented. The turbulent flow field is generated by a direct numerical simulation of the Navier-Stokes equations at a Reynolds number Re(tau) = 80 (based on the wall shear velocity and channel half-width). The K-L procedure is then applied to determine the eigenvalues and eigenfunctions for this flow. The random coefficients of the K-L expansion are subsequently found by projecting the numerical flow field onto these eigenfunctions. The resulting expansion captures 90 percent of the turbulent energy with significantly fewer modes than the original trigonometric expansion. The eigenfunctions, which appear either as rolls or shearing motions, possess viscous boundary layers at the walls and are much richer in harmonics than the original basis functions.
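
    The snapshot form of the K-L (proper orthogonal) decomposition can be sketched on synthetic data. The two planted spatial modes, amplitudes and noise level are hypothetical stand-ins for the simulated channel flow:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "flow": 200 snapshots that are random mixtures of two planted
# spatial modes plus weak noise.
x = np.linspace(0.0, 2.0 * np.pi, 64)
modes_true = np.stack([np.sin(x), np.cos(2.0 * x)])        # (2, 64)
amps = rng.normal(size=(200, 2)) * np.array([3.0, 1.5])    # random amplitudes
snapshots = amps @ modes_true + 0.01 * rng.normal(size=(200, 64))

# Karhunen-Loeve / POD: SVD of the mean-subtracted snapshot matrix gives the
# empirical eigenfunctions (rows of vt) ordered by captured energy.
fluct = snapshots - snapshots.mean(axis=0)
u, s, vt = np.linalg.svd(fluct, full_matrices=False)
energy = s**2 / np.sum(s**2)          # fraction of energy per mode

# Random K-L coefficients: projection of each snapshot onto the eigenfunctions.
kl_coeffs = fluct @ vt.T

# Number of modes needed to capture 90% of the energy.
n90 = int(np.searchsorted(np.cumsum(energy), 0.90)) + 1
```

    Here two modes capture essentially all of the energy, mirroring (in miniature) the abstract's point that the K-L basis needs far fewer modes than the original trigonometric expansion.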

  8. Unsupervised segmentation of lungs from chest radiographs

    NASA Astrophysics Data System (ADS)

    Ghosh, Payel; Antani, Sameer K.; Long, L. Rodney; Thoma, George R.

    2012-03-01

    This paper describes our preliminary investigations for deriving and characterizing coarse-level textural regions present in the lung field on chest radiographs using unsupervised grow-cut (UGC), a cellular automaton based unsupervised segmentation technique. The segmentation has been performed on a publicly available data set of chest radiographs. The algorithm is useful for this application because it automatically converges to a natural segmentation of the image from random seed points using low-level image features such as pixel intensity values and texture features. Our goal is to develop a portable screening system for early detection of lung diseases for use in remote areas in developing countries. This involves developing automated algorithms for screening x-rays as normal/abnormal with a high degree of sensitivity, and identifying lung disease patterns on chest x-rays. Automatically deriving and quantitatively characterizing abnormal regions present in the lung field is the first step toward this goal. Therefore, region-based features such as geometrical and pixel-value measurements were derived from the segmented lung fields. In the future, feature selection and classification will be performed to identify pathological conditions such as pulmonary tuberculosis on chest radiographs. Shape-based features will also be incorporated to account for occlusions of the lung field and by other anatomical structures such as the heart and diaphragm.

  9. A model for bacterial colonization of sinking aggregates.

    PubMed

    Bearon, R N

    2007-01-01

    Sinking aggregates provide important nutrient-rich environments for marine bacteria. Quantifying the rate at which motile bacteria colonize such aggregations is important in understanding the microbial loop in the pelagic food web. In this paper, a simple analytical model is presented to predict the rate at which bacteria undergoing a random walk encounter a sinking aggregate. The model incorporates the flow field generated by the sinking aggregate, the swimming behavior of the bacteria, and the interaction of the flow with the swimming behavior. An expression for the encounter rate is computed in the limit of large Péclet number when the random walk can be approximated by a diffusion process. Comparison with an individual-based numerical simulation is also given.

  10. Rayleigh approximation to ground state of the Bose and Coulomb glasses

    PubMed Central

    Ryan, S. D.; Mityushev, V.; Vinokur, V. M.; Berlyand, L.

    2015-01-01

    Glasses are rigid systems in which competing interactions prevent simultaneous minimization of local energies. This leads to frustration and highly degenerate ground states the nature and properties of which are still far from being thoroughly understood. We report an analytical approach based on the method of functional equations that allows us to construct the Rayleigh approximation to the ground state of a two-dimensional (2D) random Coulomb system with logarithmic interactions. We realize a model for 2D Coulomb glass as a cylindrical type II superconductor containing randomly located columnar defects (CD) which trap superconducting vortices induced by applied magnetic field. Our findings break ground for analytical studies of glassy systems, marking an important step towards understanding their properties. PMID:25592417

  11. Nonparametric estimation of plant density by the distance method

    USGS Publications Warehouse

    Patil, S.A.; Burnham, K.P.; Kovner, J.L.

    1979-01-01

    A relation between the plant density and the probability density function of the nearest neighbor distance (squared) from a random point is established under fairly broad conditions. Based upon this relationship, a nonparametric estimator for the plant density is developed and presented in terms of order statistics. Consistency and asymptotic normality of the estimator are discussed. An interval estimator for the density is obtained. The modifications of this estimator and its variance are given when the distribution is truncated. Simulation results are presented for regular, random and aggregated populations to illustrate the nonparametric estimator and its variance. A numerical example from field data is given. Merits and deficiencies of the estimator are discussed with regard to its robustness and variance.
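
    For orientation, the classical parametric baseline that such distance methods build on can be sketched as follows: under a homogeneous Poisson pattern, π R² for the point-to-nearest-plant distance R is exponentially distributed with mean 1/density, giving a simple estimator. The point counts and sample size below are hypothetical, and the paper's nonparametric order-statistics estimator replaces exactly this parametric assumption.

```python
import math
import random

rng = random.Random(42)

# A homogeneous "plant" pattern on the unit torus (wrap-around distances
# avoid edge effects in this toy check).
n_plants = 300
plants = [(rng.random(), rng.random()) for _ in range(n_plants)]

def torus_d2(p, q):
    """Squared distance on the unit torus."""
    dx = min(abs(p[0] - q[0]), 1.0 - abs(p[0] - q[0]))
    dy = min(abs(p[1] - q[1]), 1.0 - abs(p[1] - q[1]))
    return dx * dx + dy * dy

# Squared nearest-neighbour distance from each of m random sample points.
m = 400
r2 = []
for _ in range(m):
    p = (rng.random(), rng.random())
    r2.append(min(torus_d2(p, q) for q in plants))

# Classical density estimator from the squared distances.
density_hat = (m - 1) / (math.pi * sum(r2))
```

    The estimate lands close to the true density of 300 plants per unit area; non-Poisson (regular or aggregated) patterns are where this parametric form breaks down and the nonparametric approach pays off.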

  12. Multilayer Markov Random Field models for change detection in optical remote sensing images

    NASA Astrophysics Data System (ADS)

    Benedek, Csaba; Shadaydeh, Maha; Kato, Zoltan; Szirányi, Tamás; Zerubia, Josiane

    2015-09-01

    In this paper, we give a comparative study of three multilayer Markov Random Field (MRF) based solutions proposed for change detection in optical remote sensing images, called Multicue MRF, Conditional Mixed Markov model, and Fusion MRF. Our purposes are twofold. On one hand, we highlight the significance of the focused model family and set it against various state-of-the-art approaches through a thematic analysis and quantitative tests. We discuss the advantages and drawbacks of class-comparison vs. direct approaches, the usage of training data, various targeted application fields and different ways of Ground Truth generation, while informing the reader of the roles in which multilayer MRFs can be applied efficiently. On the other hand, we emphasize the differences between the three focused models at various levels, considering the model structures, feature extraction, layer interpretation, change concept definition, parameter tuning and performance. We provide qualitative and quantitative comparison results, principally using a publicly available change detection database which contains aerial image pairs and Ground Truth change masks. We conclude that the discussed models are competitive against alternative state-of-the-art solutions if one uses them as pre-processing filters in multitemporal optical image analysis. In addition, together they cover a large range of applications, considering the different usage options of the three approaches.

  13. Inter-observer agreement on a checklist to evaluate scientific publications in the field of animal reproduction.

    PubMed

    Simoneit, Céline; Heuwieser, Wolfgang; Arlt, Sebastian P

    2012-01-01

    This study's objective was to determine respondents' inter-observer agreement on a detailed checklist to evaluate three exemplars (one case report, one randomized controlled study without blinding, and one blinded, randomized controlled study) of the scientific literature in the field of bovine reproduction. Fourteen international scientists in the field of animal reproduction were provided with the three articles, three copies of the checklist, and a supplementary explanation. Overall, 13 responded to more than 90% of the items. Overall repeatability between respondents using Fleiss's κ was 0.35 (fair agreement). Combining the "strongly agree" and "agree" responses and the "strongly disagree" and "disagree" responses increased κ to 0.49 (moderate agreement). Evaluation of information given in the three articles on housing of the animals (35% identical answers) and preconditions or pretreatments (42%) varied widely. Even though the overall repeatability was fair, repeatability concerning the important categories was high (e.g., level of agreement=98%). Our data show that the checklist is a reasonable and practical supporting tool to assess the quality of publications. Therefore, it may be used in teaching and practicing evidence-based veterinary medicine. It can support training in systematic and critical appraisal of information and in clinical decision making.
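
    Fleiss's κ reported above can be computed from a table of per-item category counts. A minimal sketch follows; the two toy rating tables are hypothetical, not the study's data:

```python
def fleiss_kappa(counts):
    """Fleiss' kappa from counts[item][category] = number of raters who chose
    that category; every item must be rated by the same number of raters."""
    n_items = len(counts)
    n_raters = sum(counts[0])
    # observed agreement: fraction of rater pairs agreeing, averaged over items
    p_bar = sum((sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
                for row in counts) / n_items
    # chance agreement from the marginal category proportions
    p_cat = [sum(row[j] for row in counts) / (n_items * n_raters)
             for j in range(len(counts[0]))]
    p_e = sum(p * p for p in p_cat)
    return (p_bar - p_e) / (1.0 - p_e)

# Hypothetical tables: 3 raters, 2 response categories, 2 checklist items.
perfect = [[3, 0], [0, 3]]    # unanimous on both items -> kappa = 1
split = [[2, 1], [1, 2]]      # 2-vs-1 split on both items
```

    Collapsing "strongly agree"/"agree" into one category, as done in the study, simply merges columns of this table before computing κ, which is why the reported agreement rose from 0.35 to 0.49.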

  14. Binary pseudo-random patterned structures for modulation transfer function calibration and resolution characterization of a full-field transmission soft x-ray microscope

    DOE PAGES

    Yashchuk, V. V.; Fischer, P. J.; Chan, E. R.; ...

    2015-12-09

    We present a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) one-dimensional sequences and two-dimensional arrays as an effective method for spectral characterization in the spatial frequency domain of a broad variety of metrology instrumentation, including interferometric microscopes, scatterometers, phase shifting Fizeau interferometers, scanning and transmission electron microscopes, and, at this time, x-ray microscopes. The inherent power spectral density of BPR gratings and arrays, which has a deterministic white-noise-like character, allows a direct determination of the MTF with a uniform sensitivity over the entire spatial frequency range and field of view of an instrument. We demonstrate the MTF calibration and resolution characterization over the full field of a transmission soft x-ray microscope using a BPR multilayer (ML) test sample with 2.8 nm fundamental layer thickness. We show that beyond providing a direct measurement of the microscope's MTF, tests with the BPRML sample can be used to fine tune the instrument's focal distance. Finally, our results confirm the universality of the method that makes it applicable to a large variety of metrology instrumentation with spatial wavelength bandwidths from a few nanometers to hundreds of millimeters.
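
    The "deterministic white-noise-like" property of binary pseudo-random sequences can be checked directly: a maximal-length LFSR sequence has a two-valued circular autocorrelation, hence a flat power spectrum. The register length and tap set below are illustrative assumptions, not the fabricated test pattern:

```python
def lfsr_msequence(taps, nbits):
    """Fibonacci LFSR: with a primitive feedback tap set it emits a binary
    pseudo-random (maximum-length) sequence of period 2**nbits - 1."""
    state = [1] * nbits
    out = []
    for _ in range(2 ** nbits - 1):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return out

seq = lfsr_msequence(taps=[5, 2], nbits=5)   # assumed primitive tap set
n = len(seq)                                 # period 31
s = [2 * b - 1 for b in seq]                 # map {0, 1} -> {-1, +1}

# Circular autocorrelation: n at lag 0 and exactly -1 at every other lag,
# i.e. an essentially flat power spectrum -- the property that lets a BPR
# target probe the MTF uniformly across all spatial frequencies.
autocorr = [sum(s[i] * s[(i + k) % n] for i in range(n)) for k in range(n)]
```

    A BPR grating is this sequence rendered as line widths; the 2D arrays mentioned in the abstract generalize the same spectral flatness to two dimensions.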

  15. Multiple scattering and stop band characteristics of flexural waves on a thin plate with circular holes

    NASA Astrophysics Data System (ADS)

    Wang, Zuowei; Biwa, Shiro

    2018-03-01

    A numerical procedure is proposed for the multiple scattering analysis of flexural waves on a thin plate with circular holes based on the Kirchhoff plate theory. The numerical procedure utilizes the wave function expansion of the exciting as well as scattered fields, and the boundary conditions at the periphery of holes are incorporated as the relations between the expansion coefficients of exciting and scattered fields. A set of linear algebraic equations with respect to the wave expansion coefficients of the exciting field alone is established by the numerical collocation method. To demonstrate the applicability of the procedure, the stop band characteristics of flexural waves are analyzed for different arrangements and concentrations of circular holes on a steel plate. The energy transmission spectra of flexural waves are shown to capture the detailed features of the stop band formation of regular and random arrangements of holes. The increase of the concentration of holes is found to shift the dips of the energy transmission spectra toward higher frequencies as well as deepen them. The hexagonal hole arrangement can form a much broader stop band than the square hole arrangement for flexural wave transmission. It is also demonstrated that random arrangements of holes make the transmission spectrum more complicated.

  16. DLA based compressed sensing for high resolution MR microscopy of neuronal tissue.

    PubMed

    Nguyen, Khieu-Van; Li, Jing-Rebecca; Radecki, Guillaume; Ciobanu, Luisa

    2015-10-01

    In this work we present the implementation of compressed sensing (CS) on a high field preclinical scanner (17.2 T) using an undersampling trajectory based on the diffusion limited aggregation (DLA) random growth model. When applied to a library of images this approach performs better than the traditional undersampling based on the polynomial probability density function. In addition, we show that the method is applicable to imaging live neuronal tissues, allowing significantly shorter acquisition times while maintaining the image quality necessary for identifying the majority of neurons via an automatic cell segmentation algorithm.
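
    A minimal sketch of DLA-style mask growth on a lattice follows. The grid size, cluster size and walk cap are hypothetical, and the scanner's actual k-space trajectory design is more involved; the point is only that DLA naturally yields a variable-density pattern, dense near the centre and branching outwards.

```python
import math
import random

def dla_mask(n, n_sites, seed=0):
    """On-lattice diffusion-limited aggregation: random walkers launched from
    a ring around the cluster stick on first contact; the occupied sites form
    a variable-density sampling mask."""
    rng = random.Random(seed)
    c = n // 2
    cluster = {(c, c)}                       # seed at the k-space centre
    moves = ((1, 0), (-1, 0), (0, 1), (0, -1))
    while len(cluster) < n_sites:
        # launch a walker just outside the cluster's current radius
        r = max(abs(i - c) + abs(j - c) for i, j in cluster) + 3
        ang = rng.uniform(0.0, 2.0 * math.pi)
        x = min(max(c + int(r * math.cos(ang)), 0), n - 1)
        y = min(max(c + int(r * math.sin(ang)), 0), n - 1)
        for _ in range(5000):                # cap the walk length
            if any((x + dx, y + dy) in cluster for dx, dy in moves):
                cluster.add((x, y))          # stick next to the cluster
                break
            dx, dy = moves[rng.randrange(4)]
            x = min(max(x + dx, 0), n - 1)   # stay on the grid
            y = min(max(y + dy, 0), n - 1)
    return cluster

mask = dla_mask(n=32, n_sites=60)
```

    The resulting set of sites would mark which k-space samples to acquire; CS reconstruction from the undersampled data is a separate step.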

  17. A new test statistic for climate models that includes field and spatial dependencies using Gaussian Markov random fields

    DOE PAGES

    Nosedal-Sanchez, Alvaro; Jackson, Charles S.; Huerta, Gabriel

    2016-07-20

    A new test statistic for climate model evaluation has been developed that potentially mitigates some of the limitations that exist for observing and representing field and space dependencies of climate phenomena. Traditionally such dependencies have been ignored when climate models have been evaluated against observational data, which makes it difficult to assess whether any given model is simulating observed climate for the right reasons. The new statistic uses Gaussian Markov random fields for estimating field and space dependencies within a first-order grid point neighborhood structure. We illustrate the ability of Gaussian Markov random fields to represent empirical estimates of field and space covariances using "witch hat" graphs. We further use the new statistic to evaluate the tropical response of a climate model (CAM3.1) to changes in two parameters important to its representation of cloud and precipitation physics. Overall, the inclusion of dependency information did not alter significantly the recognition of those regions of parameter space that best approximated observations. However, there were some qualitative differences in the shape of the response surface that suggest how such a measure could affect estimates of model uncertainty.
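
    The first-order neighborhood structure can be made concrete with a small sketch: a GMRF on a grid has a sparse precision matrix coupling each cell only to its immediate neighbours, and a model-data misfit field is scored by the quadratic form eᵀQe. The 3×3 grid and the parameters `kappa` and `tau` are hypothetical, not the paper's fitted values:

```python
def grid_precision(h, w, kappa, tau):
    """Precision matrix of a first-order GMRF on an h x w grid:
    Q = tau * (kappa * I + L), with L the 4-neighbour graph Laplacian, so
    nonzero entries link a cell only to its immediate neighbours."""
    n = h * w
    Q = [[0.0] * n for _ in range(n)]
    for i in range(h):
        for j in range(w):
            a = i * w + j
            nbrs = [(i + di, j + dj)
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= i + di < h and 0 <= j + dj < w]
            Q[a][a] = tau * (kappa + len(nbrs))
            for p, q in nbrs:
                Q[a][p * w + q] = -tau
    return Q

def quadratic_form(Q, e):
    """Dependency-aware misfit e^T Q e for a flattened error field e."""
    n = len(e)
    return sum(e[a] * sum(Q[a][b] * e[b] for b in range(n)) for a in range(n))

Q = grid_precision(3, 3, kappa=0.5, tau=1.0)
smooth = [1.0] * 9                        # spatially coherent misfit
rough = [(-1.0) ** k for k in range(9)]   # checkerboard misfit, same magnitude
```

    The smooth field scores far lower than the checkerboard of equal magnitude, which is the sense in which the statistic rewards spatially coherent errors relative to an independence assumption.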

  19. Spin-orbit torque induced magnetization switching in Ta/Co{sub 20}Fe{sub 60}B{sub 20}/MgO structures under small in-plane magnetic fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Jiangwei, E-mail: caojw@lzu.edu.cn; Zheng, Yuqiang; Su, Xianpeng

    2016-04-25

    Spin-orbit torque (SOT)-induced magnetization switching under small in-plane magnetic fields in as-deposited and annealed Ta/CoFeB/MgO structures is studied. For the as-deposited samples, partial SOT-induced switching behavior is observed under an in-plane field of less than 100 Oe. Conversely, for the annealed samples, an in-plane field of 10 Oe is large enough to achieve full deterministic magnetization switching. The Dzyaloshinskii-Moriya interaction at the Ta/CoFeB interface is believed to be the main reason for the discrepancy in the requisite in-plane magnetic fields for switching between the as-deposited and annealed samples. In addition, asymmetric field dependence of SOT-induced magnetization switching is observed in the annealed samples. Deterministic magnetization switching in the absence of an external magnetic field is obtained in the annealed samples, which is extremely important for developing SOT-based magnetoresistive random access memory.

  20. Landslide Inventory Mapping from Bitemporal 10 m SENTINEL-2 Images Using Change Detection Based Markov Random Field

    NASA Astrophysics Data System (ADS)

    Qin, Y.; Lu, P.; Li, Z.

    2018-04-01

    Landslide inventory mapping is essential for hazard assessment and mitigation. In most previous studies, landslide mapping was achieved by visual interpretation of aerial photos and remote sensing images. However, such methods are labor-intensive and time-consuming, especially over large areas. Although a number of semi-automatic landslide mapping methods have been proposed over the past few years, limitations remain in their applicability to different study areas and data, and there is considerable room for improvement in accuracy and degree of automation. For these reasons, we developed a change detection-based Markov Random Field (CDMRF) method for landslide inventory mapping. The proposed method mainly includes two steps: 1) change detection-based multi-thresholding to generate training samples and 2) MRF for landslide inventory mapping. Compared with previous methods, the proposed method has three advantages: 1) it combines multiple image difference techniques with a multi-threshold method to generate reliable training samples; 2) it takes the spectral characteristics of landslides into account; and 3) it is highly automatic, with little parameter tuning. The proposed method was applied to regional landslide mapping from 10 m Sentinel-2 images in Western China. Results corroborated the effectiveness and applicability of the proposed method, especially its capability for rapid landslide mapping. Some directions for future research are offered. To our knowledge, this study is the first attempt to map landslides from free, medium-resolution satellite (i.e., Sentinel-2) images in China.
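
A minimal sketch of the first step, generating training samples by thresholding a bitemporal image difference, is shown below. The mean-plus-k-sigma thresholding rule and all parameter values are illustrative assumptions, not the authors' exact CDMRF formulation.

```python
import numpy as np

def cd_training_samples(pre, post, k_pos=1.5, k_neg=0.5):
    """Label high-confidence change pixels (1) and high-confidence unchanged
    pixels (-1) from a bitemporal difference image; intermediate pixels stay
    unlabelled (0) and would be classified later by the MRF step."""
    diff = post.astype(float) - pre.astype(float)
    mu, sigma = diff.mean(), diff.std()
    labels = np.zeros(diff.shape, dtype=int)
    labels[diff > mu + k_pos * sigma] = 1            # likely landslide scar
    labels[np.abs(diff - mu) < k_neg * sigma] = -1   # likely unchanged background
    return labels

rng = np.random.default_rng(0)
pre = rng.normal(100.0, 5.0, (64, 64))               # pre-event reflectance (toy)
post = pre + rng.normal(0.0, 5.0, (64, 64))
post[20:30, 20:30] += 40.0                           # simulated landslide brightening
labels = cd_training_samples(pre, post)
```

In the full method, several difference measures would each contribute such labels, and the MRF then regularizes the labeling spatially.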

  1. Propagation of finite amplitude sound through turbulence: Modeling with geometrical acoustics and the parabolic approximation

    NASA Astrophysics Data System (ADS)

    Blanc-Benon, Philippe; Lipkens, Bart; Dallois, Laurent; Hamilton, Mark F.; Blackstock, David T.

    2002-01-01

    Sonic boom propagation can be affected by atmospheric turbulence. It has been shown that turbulence affects the perceived loudness of sonic booms, mainly by changing its peak pressure and rise time. The models reported here describe the nonlinear propagation of sound through turbulence. Turbulence is modeled as a set of individual realizations of a random temperature or velocity field. In the first model, linear geometrical acoustics is used to trace rays through each realization of the turbulent field. A nonlinear transport equation is then derived along each eigenray connecting the source and receiver. The transport equation is solved by a Pestorius algorithm. In the second model, the KZK equation is modified to account for the effect of a random temperature field and it is then solved numerically. Results from numerical experiments that simulate the propagation of spark-produced N waves through turbulence are presented. It is observed that turbulence decreases, on average, the peak pressure of the N waves and increases the rise time. Nonlinear distortion is less when turbulence is present than without it. The effects of random vector fields are stronger than those of random temperature fields. The location of the caustics and the deformation of the wave front are also presented. These observations confirm the results from the model experiment in which spark-produced N waves are used to simulate sonic boom propagation through a turbulent atmosphere.

  2. Propagation of finite amplitude sound through turbulence: modeling with geometrical acoustics and the parabolic approximation.

    PubMed

    Blanc-Benon, Philippe; Lipkens, Bart; Dallois, Laurent; Hamilton, Mark F; Blackstock, David T

    2002-01-01

    Sonic boom propagation can be affected by atmospheric turbulence. It has been shown that turbulence affects the perceived loudness of sonic booms, mainly by changing its peak pressure and rise time. The models reported here describe the nonlinear propagation of sound through turbulence. Turbulence is modeled as a set of individual realizations of a random temperature or velocity field. In the first model, linear geometrical acoustics is used to trace rays through each realization of the turbulent field. A nonlinear transport equation is then derived along each eigenray connecting the source and receiver. The transport equation is solved by a Pestorius algorithm. In the second model, the KZK equation is modified to account for the effect of a random temperature field and it is then solved numerically. Results from numerical experiments that simulate the propagation of spark-produced N waves through turbulence are presented. It is observed that turbulence decreases, on average, the peak pressure of the N waves and increases the rise time. Nonlinear distortion is less when turbulence is present than without it. The effects of random vector fields are stronger than those of random temperature fields. The location of the caustics and the deformation of the wave front are also presented. These observations confirm the results from the model experiment in which spark-produced N waves are used to simulate sonic boom propagation through a turbulent atmosphere.

  3. Flight-path estimation in passive low-altitude flight by visual cues

    NASA Technical Reports Server (NTRS)

    Grunwald, Arthur J.; Kohn, S.

    1993-01-01

    A series of experiments was conducted, in which subjects had to estimate the flight path while passively being flown in straight or in curved motion over several types of nominally flat, textured terrain. Three computer-generated terrain types were investigated: (1) a random 'pole' field, (2) a flat field consisting of random rectangular patches, and (3) a field of random parallelepipeds. Experimental parameters were the velocity-to-height (V/h) ratio, the viewing distance, and the terrain type. Furthermore, the effect of obscuring parts of the visual field was investigated. Assumptions were made about the basic visual-field information by analyzing the pattern of line-of-sight (LOS) rate vectors in the visual field. The experimental results support these assumptions and show that, for both a straight as well as a curved flight path, the estimation accuracy and estimation times improve with the V/h ratio. Error scores for the curved flight path are found to be about 3 deg in visual angle higher than for the straight flight path, and the sensitivity to the V/h ratio is found to be considerably larger. For the straight motion, the flight path could be estimated successfully from local areas in the far field. Curved flight-path estimates have to rely on the entire LOS rate pattern.

  4. Intermittency and random matrices

    NASA Astrophysics Data System (ADS)

    Sokoloff, Dmitry; Illarionov, E. A.

    2015-08-01

    The spectacular phenomenon of intermittency, i.e. the progressive growth of the higher statistical moments of a physical field excited by an instability in a random medium, attracted the attention of Zeldovich in the last years of his life. At that time, the mathematical aspects underlying the physical description of this phenomenon were still under development, and the relations between various findings in the field remained obscure. Contemporary results from the theory of products of independent random matrices (the Furstenberg theory) allow the phenomenon of intermittency to be elaborated in a systematic way. We consider applications of the Furstenberg theory to some problems in cosmology and dynamo theory.
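
The Furstenberg theory concerns the almost-sure exponential growth rate (the top Lyapunov exponent) of a product of independent random matrices; a positive exponent is the regime in which intermittent growth of moments arises. Below is a minimal numerical sketch for an illustrative ensemble of random rotations composed with a fixed stretch, chosen because its exponent can be computed analytically as ln(5/4) ≈ 0.223.

```python
import numpy as np

def top_lyapunov(matrix_sampler, n_steps=20000, seed=1):
    """Estimate the top Lyapunov exponent of a product of i.i.d. random
    matrices by applying them to a vector and renormalizing each step."""
    rng = np.random.default_rng(seed)
    v = np.array([1.0, 0.0])
    log_growth = 0.0
    for _ in range(n_steps):
        v = matrix_sampler(rng) @ v
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm
    return log_growth / n_steps

def sampler(rng):
    """Illustrative ensemble: a uniformly random rotation composed with a
    fixed stretch diag(2, 1/2)."""
    theta = rng.uniform(0.0, 2.0 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]]) @ np.diag([2.0, 0.5])

lam = top_lyapunov(sampler)   # positive => exponential (intermittent) growth
```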

  5. Biases encountered in long-term monitoring studies of invertebrates and microflora: Australian examples of protocols, personnel, tools and site location.

    PubMed

    Greenslade, Penelope; Florentine, Singarayer K; Hansen, Brigita D; Gell, Peter A

    2016-08-01

    Monitoring forms the basis for understanding ecological change. It relies on repeatability of methods to ensure detected changes accurately reflect the effect of environmental drivers. However, operator bias can influence the repeatability of field and laboratory work. We tested this for invertebrates and diatoms in three trials: (1) two operators swept invertebrates from heath vegetation, (2) four operators picked invertebrates from pyrethrum knockdown samples from tree trunks and (3) eight operators in three laboratories identified diatoms. In each trial, operators were working simultaneously and their training in the field and laboratory was identical. No variation in catch efficiency was found between the two operators of differing experience using a random number of net sweeps to catch invertebrates when sequence, location and size of sweeps were random. The number of individuals and higher taxa collected by four operators from tree trunks varied significantly between operators and with their 'experience ranking'. Diatom identifications made by eight operators were clustered together according to which of three laboratories they belonged. These three tests demonstrated significant potential operator bias in both field and laboratory. This is the first documented case demonstrating the significant influence of observer bias on results from invertebrate field-based studies. Examples of two long-term trials are also given that illustrate further operator bias. Our results suggest that long-term ecological studies using invertebrates need to be rigorously audited to ensure that operator bias is accounted for during analysis and interpretation. Further, taxonomic harmonisation remains an important step in merging field and laboratory data collected by different operators.

  6. Gaussian Random Fields Methods for Fork-Join Network with Synchronization Constraints

    DTIC Science & Technology

    2014-12-22

    substantial efforts were dedicated to the study of the max-plus recursions [21, 3, 12]. More recently, Atar et al. [2] have studied a fork-join...feedback and NES, Atar et al. [2] show that a dynamic priority discipline achieves throughput optimal- ity asymptotically in the conventional heavy...2011) Patient flow in hospitals: a data-based queueing-science perspective. Submitted to Stochastic Systems, 20. [2] R. Atar , A. Mandelbaum and A

  7. Volumetric Light-field Encryption at the Microscopic Scale

    PubMed Central

    Li, Haoyu; Guo, Changliang; Muniraj, Inbarasan; Schroeder, Bryce C.; Sheridan, John T.; Jia, Shu

    2017-01-01

    We report a light-field based method that allows the optical encryption of three-dimensional (3D) volumetric information at the microscopic scale in a single 2D light-field image. The system consists of a microlens array and an array of random phase/amplitude masks. The method utilizes a wave optics model to account for the dominant diffraction effect at this new scale, and the system point-spread function (PSF) serves as the key for encryption and decryption. We successfully developed and demonstrated a deconvolution algorithm to retrieve both spatially multiplexed discrete data and continuous volumetric data from 2D light-field images. Showing that the method is practical for data transmission and storage, we obtained a faithful reconstruction of the 3D volumetric information from a digital copy of the encrypted light-field image. The method represents a new level of optical encryption, paving the way for broad industrial and biomedical applications in processing and securing 3D data at the microscopic scale. PMID:28059149
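
The PSF-as-key idea can be sketched with ordinary Fourier optics in 2D: encryption is (circular) convolution of the scene with a random-mask PSF, and decryption is Wiener-style deconvolution using the same PSF. This is a toy stand-in for the paper's volumetric light-field model; the mask, image, and regularization constant are all invented for illustration, and real use would have to handle noise and the 3D PSF.

```python
import numpy as np

def encrypt(img, psf):
    """Encrypt the scene by circular convolution with the system PSF."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))

def decrypt(cipher, psf, eps=1e-6):
    """Wiener-style deconvolution; knowing the PSF acts as the key."""
    H = np.fft.fft2(psf)
    W = np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.real(np.fft.ifft2(np.fft.fft2(cipher) * W))

rng = np.random.default_rng(42)
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0                  # toy "scene"
psf = rng.random((32, 32))             # random mask standing in for the optics
psf /= psf.sum()
cipher = encrypt(img, psf)
recovered = decrypt(cipher, psf)
```

Without the PSF, the cipher image looks unstructured; with it, deconvolution recovers the scene nearly exactly in this noiseless toy setting.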

  8. Volumetric Light-field Encryption at the Microscopic Scale

    NASA Astrophysics Data System (ADS)

    Li, Haoyu; Guo, Changliang; Muniraj, Inbarasan; Schroeder, Bryce C.; Sheridan, John T.; Jia, Shu

    2017-01-01

    We report a light-field based method that allows the optical encryption of three-dimensional (3D) volumetric information at the microscopic scale in a single 2D light-field image. The system consists of a microlens array and an array of random phase/amplitude masks. The method utilizes a wave optics model to account for the dominant diffraction effect at this new scale, and the system point-spread function (PSF) serves as the key for encryption and decryption. We successfully developed and demonstrated a deconvolution algorithm to retrieve both spatially multiplexed discrete data and continuous volumetric data from 2D light-field images. Showing that the method is practical for data transmission and storage, we obtained a faithful reconstruction of the 3D volumetric information from a digital copy of the encrypted light-field image. The method represents a new level of optical encryption, paving the way for broad industrial and biomedical applications in processing and securing 3D data at the microscopic scale.

  9. Predicting stem total and assortment volumes in an industrial Pinus taeda L. forest plantation using airborne laser scanning data and random forest

    Treesearch

    Carlos Alberto Silva; Carine Klauberg; Andrew Thomas Hudak; Lee Alexander Vierling; Wan Shafrina Wan Mohd Jaafar; Midhun Mohan; Mariano Garcia; Antonio Ferraz; Adrian Cardil; Sassan Saatchi

    2017-01-01

    Improvements in the management of pine plantations result in multiple industrial and environmental benefits. Remote sensing techniques can dramatically increase the efficiency of plantation management by reducing or replacing time-consuming field sampling. We tested the utility and accuracy of combining field and airborne lidar data with Random Forest, a supervised...

  10. Rippled graphene in an in-plane magnetic field: effects of a random vector potential.

    PubMed

    Lundeberg, Mark B; Folk, Joshua A

    2010-10-01

    We report measurements of the effects of a random vector potential generated by applying an in-plane magnetic field to a graphene flake. Magnetic flux through the ripples causes orbital effects: phase-coherent weak localization is suppressed, while quasirandom Lorentz forces lead to anisotropic magnetoresistance. Distinct signatures of these two effects enable the ripple size to be characterized.

  11. Non-random distribution of DNA double-strand breaks induced by particle irradiation

    NASA Technical Reports Server (NTRS)

    Lobrich, M.; Cooper, P. K.; Rydberg, B.; Chatterjee, A. (Principal Investigator)

    1996-01-01

    Induction of DNA double-strand breaks (dsbs) in mammalian cells is dependent on the spatial distribution of energy deposition from the ionizing radiation. For high LET particle radiations the primary ionization sites occur in a correlated manner along the track of the particles, while for X-rays these sites are much more randomly distributed throughout the volume of the cell. It can therefore be expected that the distribution of dsbs linearly along the DNA molecule also varies with the type of radiation and the ionization density. Using pulsed-field gel and conventional gel techniques, we measured the size distribution of DNA molecules from irradiated human fibroblasts in the total range of 0.1 kbp-10 Mbp for X-rays and high LET particles (N ions, 97 keV/microns and Fe ions, 150 keV/microns). On a megabase-pair scale we applied conventional pulsed-field gel electrophoresis techniques such as measurement of the fraction of DNA released from the well (FAR) and measurement of breakage within a specific NotI restriction fragment (hybridization assay). The induction rate for widely spaced breaks was found to decrease with LET. However, when the entire distribution of radiation-induced fragments was analysed, we detected an excess of fragments with sizes below about 200 kbp for the particles compared with X-irradiation. X-rays are thus more effective than high LET radiations in producing large DNA fragments but less effective in the production of smaller fragments. We determined the total induction rate of dsbs for the three radiations based on a quantitative analysis of all the measured radiation-induced fragments and found that the high LET particles were more efficient than X-rays at inducing dsbs, indicating an increasing total efficiency with LET. Conventional assays that are based only on the measurement of large fragments are therefore misleading when determining total dsb induction rates of high LET particles. The possible biological significance of this non-randomness for dsb induction is discussed.

  12. Conditional random field modelling of interactions between findings in mammography

    NASA Astrophysics Data System (ADS)

    Kooi, Thijs; Mordang, Jan-Jurre; Karssemeijer, Nico

    2017-03-01

    Recent breakthroughs in training deep neural network architectures, in particular deep Convolutional Neural Networks (CNNs), made a big impact on vision research and are increasingly responsible for advances in Computer Aided Diagnosis (CAD). Since many natural scenes and medical images vary in size and are too large to feed to the networks as a whole, two-stage systems are typically employed, where in the first stage, small regions of interest in the image are located and presented to the network as training and test data. These systems allow us to harness accurate region based annotations, making the problem easier to learn. However, information is processed purely locally and context is not taken into account. In this paper, we present preliminary work on the employment of a Conditional Random Field (CRF) that is trained on top of the CNN to model contextual interactions, such as the presence of other suspicious regions, for mammography CAD. The model can easily be extended to incorporate other sources of information, such as symmetry, temporal change and various patient covariates and is general in the sense that it can have application in other CAD problems.

  13. Cluster mass inference via random field theory.

    PubMed

    Zhang, Hui; Nichols, Thomas E; Johnson, Timothy D

    2009-01-01

    Cluster extent and voxel intensity are two widely used statistics in neuroimaging inference. Cluster extent is sensitive to spatially extended signals while voxel intensity is better for intense but focal signals. In order to leverage strength from both statistics, several nonparametric permutation methods have been proposed to combine the two methods. Simulation studies have shown that of the different cluster permutation methods, the cluster mass statistic is generally the best. However, to date, there is no parametric cluster mass inference available. In this paper, we propose a cluster mass inference method based on random field theory (RFT). We develop this method for Gaussian images, evaluate it on Gaussian and Gaussianized t-statistic images and investigate its statistical properties via simulation studies and real data. Simulation results show that the method is valid under the null hypothesis and demonstrate that it can be more powerful than the cluster extent inference method. Further, analyses with a single subject and a group fMRI dataset demonstrate better power than traditional cluster size inference, and good accuracy relative to a gold-standard permutation test.
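
For intuition, the cluster mass statistic can be computed directly on a statistic image: threshold, find connected suprathreshold clusters, and sum the suprathreshold signal in each. Definitions vary slightly in the literature (sum of values versus sum of the excess over threshold); this sketch uses the excess form, and the toy image is illustrative.

```python
import numpy as np
from scipy import ndimage

def cluster_masses(stat_img, threshold):
    """Cluster mass = summed suprathreshold excess within each connected
    cluster; cluster extent would instead count the voxels."""
    supra = stat_img > threshold
    labels, n = ndimage.label(supra)   # default: 4-connectivity in 2D
    return sorted((stat_img[labels == k] - threshold).sum()
                  for k in range(1, n + 1))

stat = np.zeros((10, 10))
stat[2:4, 2:4] = 3.0      # spatially extended, moderate-intensity cluster
stat[7, 7:9] = 5.0        # focal, intense cluster
masses = cluster_masses(stat, threshold=2.0)
```

The extended cluster has more voxels but the focal one has higher excess, so mass balances the two sensitivities that extent and peak intensity capture separately.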

  14. Multiple scattering of waves in random media: Application to the study of the city-site effect in Mexico City area.

    NASA Astrophysics Data System (ADS)

    Ishizawa, O. A.; Clouteau, D.

    2007-12-01

    The long duration, amplification, and spatial variability of the seismic records registered in Mexico City during the September 1985 earthquake cannot be explained by the soil velocity model alone. We try to explain these phenomena by studying the extent of the effect of wave fields diffracted by buildings during an earthquake. The main question is whether the presence of a large number of buildings can significantly modify the seismic wave field. We are interested in the interaction between the incident wave field propagating in a stratified half-space and a large number of structures at the free surface, i.e., the coupled city-site effect. We study and characterize the seismic wave propagation regimes in a city using the theory of wave propagation in random media. In the coupled city-site system, the buildings are modeled as resonant scatterers uniformly distributed at the surface of a deterministic, horizontally layered elastic half-space representing the soil. Based on the mean-field and the field correlation equations, we build a theoretical model which takes into account the multiple scattering of seismic waves and allows us to describe the coupled city-site system behavior in a simple and rapid way. The results obtained for the configurationally averaged field quantities are validated by means of 3D results for the seismic response of a deterministic model. The numerical simulations of this model are computed with the MISS3D code based on classical Soil-Structure Interaction techniques and on a variational coupling between Boundary Integral Equations for a layered soil and a modal Finite Element approach for the buildings. This work proposes a detailed numerical and theoretical analysis of the city-site interaction (CSI) in the Mexico City area. 
The principal parameters in the study of the CSI are the buildings resonant frequency distribution, the soil characteristics of the site, the urban density and position of the buildings in the city, as well as the type of incident wave. The main results of the theoretical and numerical models allow us to characterize the seismic movement in urban areas.

  15. Does an outcome-based approach to continuing medical education improve physicians' competences in rational prescribing?

    PubMed

    Esmaily, Hamideh M; Savage, Carl; Vahidi, Rezagoli; Amini, Abolghasem; Dastgiri, Saeed; Hult, Hakan; Dahlgren, Lars Owe; Wahlstrom, Rolf

    2009-11-01

    Continuing medical education (CME) is compulsory in Iran, and traditionally it is lecture-based, an approach that is mostly unsuccessful. Outcome-based education has been proposed for CME programs. The aim was to evaluate the effectiveness of an outcome-based educational intervention, with a new approach based on outcomes and aligned teaching methods, on the knowledge and skills of general physicians (GPs) working in primary care, compared with a concurrent CME program in the field of "rational prescribing". The study used a cluster randomized controlled design. All GPs working in six cities in one province in Iran were invited to participate. The cities were matched and randomly divided into an intervention arm, for education on rational prescribing with an outcome-based approach, and a control arm, for a traditional program on the same topic. Knowledge and skills were assessed using a pre- and post-test, including case scenarios. In total, 112 GPs participated. There were significant improvements in knowledge and prescribing skills after the training in the intervention arm, as well as in comparison with the changes in the control arm. The overall intervention effect was 26 percentage points. The introduction of an outcome-based approach in CME appears to be effective when creating programs to improve GPs' knowledge and skills.

  16. Towards a high-speed quantum random number generator

    NASA Astrophysics Data System (ADS)

    Stucki, Damien; Burri, Samuel; Charbon, Edoardo; Chunnilall, Christopher; Meneghetti, Alessio; Regazzoni, Francesco

    2013-10-01

    Randomness is of fundamental importance in various fields, such as cryptography, numerical simulations, or the gaming industry. Quantum physics, which is fundamentally probabilistic, is the best option for a physical random number generator. In this article, we present the work carried out in various projects in the context of the development of a commercial and certified high-speed random number generator.
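
Certification of a random number generator typically involves statistical test batteries such as NIST SP 800-22. The simplest member, the frequency (monobit) test, is sketched below; a classical PRNG stands in for the quantum device here, and the bias level is invented for illustration.

```python
import numpy as np
from math import erfc, sqrt

def monobit_pvalue(bits):
    """NIST SP 800-22 frequency (monobit) test: p-value under the null
    hypothesis that the bits are i.i.d. and unbiased."""
    n = len(bits)
    s_obs = abs(2.0 * int(np.sum(bits)) - n) / np.sqrt(n)
    return erfc(s_obs / sqrt(2.0))

rng = np.random.default_rng(7)            # classical PRNG standing in for the QRNG
good = rng.integers(0, 2, 100_000)
biased = (rng.random(100_000) < 0.52).astype(int)  # 2% bias toward ones
```

A certified generator must pass many such tests at once; a single small p-value on one test is expected occasionally even for ideal randomness.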

  17. Investigation of Biotransport in a Tumor With Uncertain Material Properties Using a Nonintrusive Spectral Uncertainty Quantification Method.

    PubMed

    Alexanderian, Alen; Zhu, Liang; Salloum, Maher; Ma, Ronghui; Yu, Meilin

    2017-09-01

    In this study, statistical models are developed for modeling uncertain heterogeneous permeability and porosity in tumors, and the resulting uncertainties in pressure and velocity fields during an intratumoral injection are quantified using a nonintrusive spectral uncertainty quantification (UQ) method. Specifically, the uncertain permeability is modeled as a log-Gaussian random field, represented using a truncated Karhunen-Loève (KL) expansion, and the uncertain porosity is modeled as a log-normal random variable. The efficacy of the developed statistical models is validated by simulating the concentration fields with permeability and porosity of different uncertainty levels. The irregularity in the concentration field bears reasonable visual agreement with that in MicroCT images from experiments. The pressure and velocity fields are represented using polynomial chaos (PC) expansions to enable efficient computation of their statistical properties. The coefficients in the PC expansion are computed using a nonintrusive spectral projection method with the Smolyak sparse quadrature. The developed UQ approach is then used to quantify the uncertainties in the random pressure and velocity fields. A global sensitivity analysis is also performed to assess the contribution of individual KL modes of the log-permeability field to the total variance of the pressure field. It is demonstrated that the developed UQ approach can effectively quantify the flow uncertainties induced by uncertain material properties of the tumor.
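
The truncated KL representation of a log-Gaussian field can be sketched in one dimension by eigendecomposing a discretized covariance matrix; the exponential covariance, correlation length, and mode count below are illustrative assumptions, not the tumor model's actual parameters.

```python
import numpy as np

def kl_log_gaussian_field(x, corr_len=0.2, sigma=0.5, mean_log_k=-12.0,
                          n_modes=10, seed=0):
    """Sample a 1D log-Gaussian field via a truncated Karhunen-Loeve
    expansion of a discretized exponential covariance."""
    C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    vals, vecs = np.linalg.eigh(C)             # discrete KL = eigendecomposition
    order = np.argsort(vals)[::-1][:n_modes]   # keep the dominant modes
    vals, vecs = vals[order], vecs[:, order]
    xi = np.random.default_rng(seed).standard_normal(n_modes)  # KL coefficients
    log_k = mean_log_k + vecs @ (np.sqrt(np.maximum(vals, 0.0)) * xi)
    return np.exp(log_k)                       # exponentiate: log-Gaussian field

x = np.linspace(0.0, 1.0, 100)
k = kl_log_gaussian_field(x)
```

Truncation keeps only the dominant covariance modes, which is what makes the downstream polynomial chaos surrogate tractable in a small number of random variables.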

  18. Probabilistic Modeling of Settlement Risk at Land Disposal Facilities - 12304

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foye, Kevin C.; Soong, Te-Yang

    2012-07-01

    The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass (caused by inconsistent compaction, void space distribution, debris-soil mix ratio, waste material stiffness, time-dependent primary compression of the fine-grained soil matrix, long-term creep settlement of the soil matrix and the debris, etc.) at most land disposal facilities. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties which control differential settlement. An alternative, probabilistic solution is to use random fields to model the waste and sub-grade properties. The modeling effort informs the design, construction, operation, and maintenance of land disposal facilities. A probabilistic method to establish design criteria for waste placement and compaction is introduced using the model. Random fields are ideally suited to problems of differential settlement modeling of highly heterogeneous foundations, such as waste. Random fields model the seemingly random spatial distribution of a design parameter, such as compressibility. When used for design, the use of these models prompts the need for probabilistic design criteria. It also allows for a statistical approach to waste placement acceptance criteria. An example design evaluation was performed, illustrating the use of the probabilistic differential settlement simulation methodology to assemble a design guidance chart. The purpose of this design evaluation is to enable the designer to select optimal initial combinations of design slopes and quality control acceptance criteria that yield an acceptable proportion of post-settlement slopes meeting some design minimum. 
For this specific example, relative density, which can be determined through field measurements, was selected as the field quality control parameter for waste placement. This technique can be extended to include a rigorous performance-based methodology using other parameters (void space criteria, debris-soil mix ratio, pre-loading, etc.). As shown in this example, each parameter range, or sets of parameter ranges, can be selected such that they result in an acceptable long-term differential settlement according to the probabilistic model. The methodology can also be used to re-evaluate the long-term differential settlement behavior at closed land disposal facilities to identify, if any, problematic facilities so that remedial action (e.g., reinforcement of upper and intermediate waste layers) can be implemented. Considering the inherent spatial variability in waste and earth materials and the need for engineers to apply sound quantitative practices to engineering analysis, it is important to apply the available probabilistic techniques to problems of differential settlement. One such method to implement probability-based differential settlement analyses for the design of landfill final covers has been presented. The design evaluation technique presented is one tool to bridge the gap from deterministic practice to probabilistic practice.
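
The probabilistic design-chart idea can be sketched as a small Monte Carlo: sample spatially correlated settlement profiles along a cover section and report the fraction of post-settlement segment slopes that still meet a design minimum. Every parameter below (settlement statistics, correlation length, slopes) is invented for illustration.

```python
import numpy as np

def fraction_meeting_slope(n_real=500, n_pts=50, spacing=10.0,
                           mean_settle=0.5, sigma=0.15, corr_len=30.0,
                           design_slope=0.04, min_slope=0.02, seed=0):
    """Monte Carlo sketch: sample correlated random-field settlement
    profiles and count post-settlement segment slopes >= min_slope."""
    rng = np.random.default_rng(seed)
    x = np.arange(n_pts) * spacing
    cov = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_pts))  # field sampler
    n_ok = n_total = 0
    for _ in range(n_real):
        settlement = mean_settle + L @ rng.standard_normal(n_pts)
        elevation = design_slope * x - settlement        # initial grade minus settlement
        slopes = np.diff(elevation) / spacing
        n_ok += int((slopes >= min_slope).sum())
        n_total += slopes.size
    return n_ok / n_total

frac = fraction_meeting_slope()
```

Re-running the function over a grid of design slopes and quality-control parameters would assemble the kind of design guidance chart described above.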

  19. Spatio-temporal modelling of wind speed variations and extremes in the Caribbean and the Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Rychlik, Igor; Mao, Wengang

    2018-02-01

    The wind speed variability in the North Atlantic has been successfully modelled using a spatio-temporal transformed Gaussian field. However, this type of model does not correctly describe the extreme wind speeds attributed to tropical storms and hurricanes. In this study, the transformed Gaussian model is further developed to include the occurrence of severe storms. In this new model, random components are added to the transformed Gaussian field to model rare events with extreme wind speeds. The resulting random field is locally stationary and homogeneous. The localized dependence structure is described by time- and space-dependent parameters. The parameters have a natural physical interpretation. To exemplify its application, the model is fitted to the ECMWF ERA-Interim reanalysis data set. The model is applied to compute long-term wind speed distributions and return values, e.g., 100- or 1000-year extreme wind speeds, and to simulate random wind speed time series at a fixed location or spatio-temporal wind fields around that location.
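
The transformed-Gaussian construction can be sketched as follows: simulate a stationary Gaussian series, map it to uniforms with the normal CDF, and push it through a target marginal quantile function. Here an AR(1) series and a Weibull wind speed marginal are used, with illustrative parameters that are not fitted to ERA-Interim.

```python
import numpy as np
from scipy.stats import norm

def simulate_wind(n=5000, phi=0.95, shape=2.0, scale=8.0, seed=0):
    """Transformed-Gaussian sketch: a stationary AR(1) Gaussian series is
    mapped through the probability integral transform onto a Weibull
    marginal, giving autocorrelated wind speeds."""
    rng = np.random.default_rng(seed)
    z = np.empty(n)
    z[0] = rng.standard_normal()
    sd = np.sqrt(1.0 - phi**2)                 # keeps Var(z) = 1
    for t in range(1, n):
        z[t] = phi * z[t - 1] + sd * rng.standard_normal()
    u = norm.cdf(z)                            # Gaussian -> uniform
    return scale * (-np.log1p(-u)) ** (1.0 / shape)  # uniform -> Weibull

wind = simulate_wind()
```

Extreme-wind extensions like the paper's add separate random components for storms on top of such a background field, since the transformed-Gaussian tail alone underestimates hurricane speeds.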

  20. Clustering, randomness, and regularity in cloud fields. 4. Stratocumulus cloud fields

    NASA Astrophysics Data System (ADS)

    Lee, J.; Chou, J.; Weger, R. C.; Welch, R. M.

    1994-07-01

    To complete the analysis of the spatial distribution of boundary layer cloudiness, the present study focuses on nine stratocumulus Landsat scenes. The results indicate many similarities between stratocumulus and cumulus spatial distributions. Most notably, at full spatial resolution all scenes exhibit a decidedly clustered distribution. The strength of the clustering signal decreases with increasing cloud size; the clusters themselves consist of a few clouds (less than 10), occupy a small percentage of the cloud field area (less than 5%), contain between 20% and 60% of the cloud field population, and are randomly located within the scene. In contrast, stratocumulus in almost every respect are more strongly clustered than are cumulus cloud fields. For instance, stratocumulus clusters contain more clouds per cluster, occupy a larger percentage of the total area, and have a larger percentage of clouds participating in clusters than the corresponding cumulus examples. To investigate clustering at intermediate spatial scales, the local dimensionality statistic is introduced. Results obtained from this statistic provide the first direct evidence for regularity among large (>900 m in diameter) clouds in stratocumulus and cumulus cloud fields, in support of the inhibition hypothesis of Ramirez and Bras (1990). Also, the size compensated point-to-cloud cumulative distribution function statistic is found to be necessary to obtain a consistent description of stratocumulus cloud distributions. A hypothesis regarding the underlying physical mechanisms responsible for cloud clustering is presented. It is suggested that cloud clusters often arise from 4 to 10 triggering events localized within regions less than 2 km in diameter and randomly distributed within the cloud field. As the size of the cloud surpasses the scale of the triggering region, the clustering signal weakens and the larger cloud locations become more random.
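
One standard way to score a point pattern as clustered, random, or regular is the Clark-Evans nearest-neighbour ratio (R < 1 clustered, R ≈ 1 Poisson-random, R > 1 regular). The sketch below ignores edge corrections and is not the size-compensated point-to-cloud statistic used in the study; the cloud-like synthetic patterns are purely illustrative.

```python
import numpy as np

def clark_evans(points, area):
    """Clark-Evans index: mean nearest-neighbour distance divided by its
    expectation 0.5/sqrt(density) under complete spatial randomness."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                # exclude self-distances
    mean_nn = d.min(axis=1).mean()
    density = len(points) / area
    return mean_nn / (0.5 / np.sqrt(density))

rng = np.random.default_rng(0)
random_pts = rng.uniform(0.0, 1.0, (400, 2))   # CSR pattern
clustered_pts = (rng.uniform(0.0, 1.0, (20, 1, 2)) +
                 rng.normal(0.0, 0.01, (20, 20, 2))).reshape(-1, 2)
r_random = clark_evans(random_pts, 1.0)
r_clustered = clark_evans(clustered_pts, 1.0)
```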

  1. Clustering, randomness, and regularity in cloud fields. 4: Stratocumulus cloud fields

    NASA Technical Reports Server (NTRS)

    Lee, J.; Chou, J.; Weger, R. C.; Welch, R. M.

    1994-01-01

    To complete the analysis of the spatial distribution of boundary layer cloudiness, the present study focuses on nine stratocumulus Landsat scenes. The results indicate many similarities between stratocumulus and cumulus spatial distributions. Most notably, at full spatial resolution all scenes exhibit a decidedly clustered distribution. The strength of the clustering signal decreases with increasing cloud size; the clusters themselves consist of a few clouds (less than 10), occupy a small percentage of the cloud field area (less than 5%), contain between 20% and 60% of the cloud field population, and are randomly located within the scene. In contrast, stratocumulus in almost every respect are more strongly clustered than are cumulus cloud fields. For instance, stratocumulus clusters contain more clouds per cluster, occupy a larger percentage of the total area, and have a larger percentage of clouds participating in clusters than the corresponding cumulus examples. To investigate clustering at intermediate spatial scales, the local dimensionality statistic is introduced. Results obtained from this statistic provide the first direct evidence for regularity among large (more than 900 m in diameter) clouds in stratocumulus and cumulus cloud fields, in support of the inhibition hypothesis of Ramirez and Bras (1990). Also, the size compensated point-to-cloud cumulative distribution function statistic is found to be necessary to obtain a consistent description of stratocumulus cloud distributions. A hypothesis regarding the underlying physical mechanisms responsible for cloud clustering is presented. It is suggested that cloud clusters often arise from 4 to 10 triggering events localized within regions less than 2 km in diameter and randomly distributed within the cloud field. As the size of the cloud surpasses the scale of the triggering region, the clustering signal weakens and the larger cloud locations become more random.
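A minimal numerical sketch of the kind of clustering diagnostic discussed above: a nearest-neighbour statistic compared against a Monte-Carlo random (Poisson) reference. This is an illustrative analogue, not the paper's point-to-cloud cumulative distribution function or local dimensionality statistic; all names and parameter values are hypothetical.

```python
import numpy as np

def nn_distances(points):
    """Nearest-neighbour distance for each point in a 2-D point set."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1)

def clustering_index(points, n_trials=200, seed=0):
    """Ratio of the mean nearest-neighbour distance to that of a random
    (Poisson) field on the same bounding box, averaged over Monte-Carlo
    trials; values well below 1 suggest clustering, above 1 regularity."""
    rng = np.random.default_rng(seed)
    obs = nn_distances(points).mean()
    lo, hi = points.min(0), points.max(0)
    ref = np.mean([nn_distances(rng.uniform(lo, hi, points.shape)).mean()
                   for _ in range(n_trials)])
    return obs / ref

# clustered toy "cloud field": points bunched around a few centres
rng = np.random.default_rng(1)
centres = rng.uniform(0, 100, (5, 2))
clouds = np.vstack([c + rng.normal(0, 1.5, (10, 2)) for c in centres])
print(clustering_index(clouds))
```

Ratios well below 1 indicate clustering and values near 1 randomness, mirroring the clustered/random/regular trichotomy examined in the abstract.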

  2. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    NASA Astrophysics Data System (ADS)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields and hence can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. 
In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the surface of a M-dimensional, unit radius hyper-sphere, (ii) relocating the N points on a representative set of N hyper-spheres of different radii, and (iii) transforming the coordinates of those points to lie on N different hyper-ellipsoids spanning the multivariate Gaussian distribution. The above method is applied in a dimensionality reduction context by defining flow-controlling points over which representative sampling of hydraulic conductivity is performed, thus also accounting for the sensitivity of the flow and transport model to the input hydraulic conductivity field. The performance of the various stratified sampling methods, LH, SL, and ME, is compared to that of SR sampling in terms of reproduction of ensemble statistics of hydraulic conductivity and solute concentration for different sample sizes N (numbers of realizations). The results indicate that ME sampling constitutes an equally if not more efficient simulation method than LH and SL sampling, as it can reproduce to a similar extent statistics of the conductivity and concentration fields, yet with smaller sampling variability than SR sampling. References [1] Gutjahr A.L. and Bras R.L. Spatial variability in subsurface flow and transport: A review. Reliability Engineering & System Safety, 42, 293-316, (1993). [2] Helton J.C. and Davis F.J. Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems. Reliability Engineering & System Safety, 81, 23-69, (2003). [3] Switzer P. Multiple simulation of spatial fields. 
In: Heuvelink G, Lemmens M (eds) Proceedings of the 4th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, Coronet Books Inc., pp 629-635 (2000).
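Since the abstract contrasts simple random (SR) with Latin hypercube (LH) sampling, a minimal sketch of LH sampling of standard-normal variates (which can then be transformed into lognormal conductivity values) may help. This is the generic textbook construction, not the SL or ME method proposed in the work; the ln-K mean and standard deviation below are made-up values.

```python
import numpy as np
from statistics import NormalDist

def latin_hypercube_normal(n, m, seed=0):
    """Latin hypercube sample of n realisations of m independent
    standard-normal variates: each margin is stratified into n
    equal-probability bins, with one draw per bin, randomly permuted
    across realisations."""
    rng = np.random.default_rng(seed)
    inv = np.vectorize(NormalDist().inv_cdf)
    # one uniform draw inside each of the n strata, per dimension,
    # with the stratum order shuffled independently per dimension
    strata = rng.permuted(np.tile(np.arange(n), (m, 1)), axis=1).T
    u = (strata + rng.uniform(size=(n, m))) / n
    return inv(u)

z = latin_hypercube_normal(20, 3)
# lognormal conductivity realisations from the stratified normals
K = np.exp(1.0 + 0.5 * z)   # assumed mean and std of ln K
```

Because every stratum is hit exactly once per margin, even 20 realizations cover the tails of the distribution, which is the "representative" property the abstract refers to.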

  3. Transport of Charged Particles in Turbulent Magnetic Fields

    NASA Astrophysics Data System (ADS)

    Parashar, T.; Subedi, P.; Sonsrettee, W.; Blasi, P.; Ruffolo, D. J.; Matthaeus, W. H.; Montgomery, D.; Chuychai, P.; Dmitruk, P.; Wan, M.; Chhiber, R.

    2017-12-01

Magnetic fields permeate the Universe. They are found in planets, stars, galaxies, and the intergalactic medium. The magnetic fields found in these astrophysical systems are usually chaotic, disordered, and turbulent. The investigation of the transport of cosmic rays in magnetic turbulence is a subject of considerable interest. One of the important aspects of cosmic ray transport is to understand their diffusive behavior and to calculate the diffusion coefficient in the presence of these turbulent fields. Research has most frequently concentrated on determining the diffusion coefficient in the presence of a mean magnetic field. Here, we will particularly focus on calculating diffusion coefficients of charged particles and magnetic field lines in a fully three-dimensional isotropic turbulent magnetic field with no mean field, which may be pertinent to many astrophysical situations. For charged particles in isotropic turbulence we identify different ranges of particle energy depending upon the ratio of the Larmor radius of the charged particle to the characteristic outer length scale of the turbulence. Different theoretical models are proposed to calculate the diffusion coefficient, each applicable to a distinct range of particle energies. The theoretical ideas are tested against results of detailed numerical experiments using Monte-Carlo simulations of particle propagation in stochastic magnetic fields. We also discuss two different methods of generating random magnetic fields to study charged particle propagation using numerical simulation. One method is the usual way of generating random fields with a specified power law in wavenumber space, using Gaussian random variables. Turbulence, however, is non-Gaussian, with variability that comes in bursts called intermittency. We therefore devise a way to generate synthetic intermittent fields which have many properties of realistic turbulence. 
Possible applications of such synthetically generated intermittent fields are discussed.
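As a toy illustration of how a diffusion coefficient is extracted from Monte-Carlo particle trajectories, one can compute the running diffusion coefficient from the ensemble mean squared displacement. This sketch uses plain Gaussian random walks, not the charged-particle integrations in turbulent fields described above; all parameter values are assumptions.

```python
import numpy as np

def running_diffusion_coefficient(steps=2000, walkers=500, dt=1.0,
                                  step_std=1.0, seed=0):
    """Running diffusion coefficient D(t) = <|x(t) - x(0)|^2> / (6 t)
    for an ensemble of 3-D random walks; the large-t plateau is the
    diffusion coefficient."""
    rng = np.random.default_rng(seed)
    # cumulative sum of Gaussian steps -> walker trajectories
    x = np.cumsum(rng.normal(0, step_std, (walkers, steps, 3)), axis=1)
    t = dt * np.arange(1, steps + 1)
    msd = (x ** 2).sum(axis=2).mean(axis=0)   # ensemble mean squared displacement
    return t, msd / (6 * t)

t, D = running_diffusion_coefficient()
# for Gaussian steps of variance step_std^2 per component and unit dt,
# D should plateau near step_std^2 / (2 dt) = 0.5
```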

  4. Nature of magnetization and lateral spin-orbit interaction in gated semiconductor nanowires.

    PubMed

    Karlsson, H; Yakimenko, I I; Berggren, K-F

    2018-05-31

Semiconductor nanowires are interesting candidates for the realization of spintronics devices. In this paper we study electronic states and effects of lateral spin-orbit coupling (LSOC) in a one-dimensional asymmetrically biased nanowire using the Hartree-Fock method with Dirac interaction. We have shown that spin polarization can be triggered by LSOC at finite source-drain bias, as a result of numerical noise representing a random magnetic field due to wiring or a random background magnetic field from the Earth's magnetic field, for instance. The electrons spontaneously arrange into spin rows in the wire due to electron interactions, leading to a finite spin polarization. The direction of polarization is, however, random at zero source-drain bias. We have found that LSOC has an effect on the orientation of spin rows only in the case when a source-drain bias is applied.

  5. Nature of magnetization and lateral spin–orbit interaction in gated semiconductor nanowires

    NASA Astrophysics Data System (ADS)

    Karlsson, H.; Yakimenko, I. I.; Berggren, K.-F.

    2018-05-01

Semiconductor nanowires are interesting candidates for the realization of spintronics devices. In this paper we study electronic states and effects of lateral spin–orbit coupling (LSOC) in a one-dimensional asymmetrically biased nanowire using the Hartree–Fock method with Dirac interaction. We have shown that spin polarization can be triggered by LSOC at finite source-drain bias, as a result of numerical noise representing a random magnetic field due to wiring or a random background magnetic field from the Earth's magnetic field, for instance. The electrons spontaneously arrange into spin rows in the wire due to electron interactions, leading to a finite spin polarization. The direction of polarization is, however, random at zero source-drain bias. We have found that LSOC has an effect on the orientation of spin rows only in the case when a source-drain bias is applied.

  6. A multiple scattering theory for EM wave propagation in a dense random medium

    NASA Technical Reports Server (NTRS)

    Karam, M. A.; Fung, A. K.; Wong, K. W.

    1985-01-01

For a dense medium of randomly distributed scatterers an integral formulation for the total coherent field has been developed. This formulation accounts for the multiple scattering of electromagnetic waves including both the two- and three-particle terms. It is shown that under the Markovian assumption the total coherent field and the effective field have the same effective wave number. As an illustration of this theory, the effective wave number and the extinction coefficient are derived in terms of the polarizability tensor and the pair distribution function for randomly distributed small spherical scatterers. It is found that the contribution of the three-particle term increases with the particle size, the volume fraction, the frequency and the permittivity of the particle. This increase is more significant with frequency and particle size than with the other parameters.
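For orientation, the classical lowest-order (independent-scattering) relation that multi-particle corrections of the kind described above refine is Foldy's effective wavenumber. This is a standard textbook result, not the two- and three-particle expression derived in the paper:

```latex
K_{\mathrm{eff}}^{2} = k^{2} + 4\pi n_{0} f(0), \qquad
\kappa_{e} = 2\,\operatorname{Im} K_{\mathrm{eff}},
```

where k is the background wavenumber, n_0 the number density of scatterers, f(0) the forward-scattering amplitude, and kappa_e the resulting extinction coefficient. The pair distribution function enters only in the higher-order terms the paper evaluates.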

  7. Accelerated lifetime test of vibration isolator made of Metal Rubber material

    NASA Astrophysics Data System (ADS)

    Ao, Hongrui; Ma, Yong; Wang, Xianbiao; Chen, Jianye; Jiang, Hongyuan

    2017-01-01

Metal Rubber (MR) is a material with nonlinear damping characteristics used in aerospace, the petrochemical industry and other fields. The study of the lifetime of MR material is essential to its application in engineering. Based on the dynamic characteristics of MR, accelerated lifetime experiments were conducted on vibration isolators made of MR working under random vibration loads. The effects of the structural parameters of MR components on the lifetime of the isolators were studied and modelled with fitting curves of the degradation data. Lifetime prediction methods were proposed based on these models.

  8. De-identification of clinical notes via recurrent neural network and conditional random field.

    PubMed

    Liu, Zengjian; Tang, Buzhou; Wang, Xiaolong; Chen, Qingcai

    2017-11-01

De-identification, the identification and removal of identifying information such as protected health information (PHI) from clinical data, is a critical step to enable data to be shared or published. The 2016 Centers of Excellence in Genomic Science (CEGS) Neuropsychiatric Genome-scale and RDOC Individualized Domains (N-GRID) clinical natural language processing (NLP) challenge contains a de-identification track on de-identifying electronic medical records (EMRs) (i.e., track 1). The challenge organizers provide 1000 annotated mental health records for this track, 600 of which are used as a training set and 400 as a test set. We develop a hybrid system for the de-identification task on the training set. Firstly, four individual subsystems, that is, a subsystem based on bidirectional LSTM (long short-term memory, a variant of recurrent neural network), a subsystem based on bidirectional LSTM with features, a subsystem based on conditional random field (CRF) and a rule-based subsystem, are used to identify PHI instances. Then, an ensemble learning-based classifier is deployed to combine the PHI instances predicted by the three machine learning-based subsystems. Finally, the results of the ensemble learning-based classifier and the rule-based subsystem are merged together. Experiments conducted on the official test set show that our system achieves the highest micro F1-scores of 93.07%, 91.43% and 95.23% under the "token", "strict" and "binary token" criteria respectively, ranking first in the 2016 CEGS N-GRID NLP challenge. In addition, on the dataset of the 2014 i2b2 NLP challenge, our system achieves the highest micro F1-scores of 96.98%, 95.11% and 98.28% under the "token", "strict" and "binary token" criteria respectively, outperforming other state-of-the-art systems. All these experiments prove the effectiveness of our proposed method. Copyright © 2017. Published by Elsevier Inc.
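As a toy analogue of the rule-based subsystem described above, PHI can be flagged with regular-expression sieves. The patterns below are hypothetical and far from exhaustive; they are not the rules used by the authors' system.

```python
import re

# minimal rule-based PHI sieve; each category maps to one pattern
PHI_PATTERNS = {
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "PHONE": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "MRN": re.compile(r"\bMRN:?\s*\d{6,8}\b"),
}

def find_phi(text):
    """Return (category, span, surface form) for each rule match,
    ordered by position in the text."""
    hits = []
    for cat, pat in PHI_PATTERNS.items():
        hits += [(cat, m.span(), m.group()) for m in pat.finditer(text)]
    return sorted(hits, key=lambda h: h[1])

note = "Seen on 03/14/2016, MRN: 1234567, callback 555-867-5309."
print(find_phi(note))
```

In a sieve architecture, matches from rules like these would be merged with the machine-learning subsystems' predictions rather than used alone.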

  9. Computed narrow-band azimuthal time-reversing array retrofocusing in shallow water.

    PubMed

    Dungan, M R; Dowling, D R

    2001-10-01

    The process of acoustic time reversal sends sound waves back to their point of origin in reciprocal acoustic environments even when the acoustic environment is unknown. The properties of the time-reversed field commonly depend on the frequency of the original signal, the characteristics of the acoustic environment, and the configuration of the time-reversing transducer array (TRA). In particular, vertical TRAs are predicted to produce horizontally confined foci in environments containing random volume refraction. This article validates and extends this prediction to shallow water environments via monochromatic Monte Carlo propagation simulations (based on parabolic equation computations using RAM). The computational results determine the azimuthal extent of a TRA's retrofocus in shallow-water sound channels either having random bottom roughness or containing random internal-wave-induced sound speed fluctuations. In both cases, randomness in the environment may reduce the predicted azimuthal angular width of the vertical TRA retrofocus to as little as several degrees (compared to 360 degrees for uniform environments) for source-array ranges from 5 to 20 km at frequencies from 500 Hz to 2 kHz. For both types of randomness, power law scalings are found to collapse the calculated azimuthal retrofocus widths for shallow sources over a variety of acoustic frequencies, source-array ranges, water column depths, and random fluctuation amplitudes and correlation scales. Comparisons are made between retrofocusing on shallow and deep sources, and in strongly and mildly absorbing environments.

  10. Seismic noise attenuation using an online subspace tracking algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, Yatong; Li, Shuhua; Zhang, Dong; Chen, Yangkang

    2018-02-01

We propose a new low-rank based noise attenuation method using an efficient algorithm for tracking subspaces from highly corrupted seismic observations. The subspace tracking algorithm requires only basic linear algebraic manipulations and is derived by analysing incremental gradient descent on the Grassmannian manifold of subspaces. When the multidimensional seismic data are mapped to a low-rank space, the subspace tracking algorithm can be directly applied to the input low-rank matrix to estimate the useful signals. Since the subspace tracking algorithm is an online algorithm, it is more robust to random noise than the traditional truncated singular value decomposition (TSVD) based subspace tracking algorithm. Compared with state-of-the-art algorithms, the proposed denoising method obtains better performance. More specifically, the proposed method outperforms the TSVD-based singular spectrum analysis method, leaving less residual noise while saving half of the computational cost. Several synthetic and field data examples with different levels of complexity demonstrate the effectiveness and robustness of the presented algorithm in rejecting different types of noise, including random noise, spiky noise, blending noise, and coherent noise.
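The TSVD baseline mentioned above reduces, in its simplest batch form, to keeping the leading singular components of the data matrix. The sketch below shows that baseline on synthetic data; it is not the authors' online Grassmannian subspace tracker, and the signal model is made up.

```python
import numpy as np

def tsvd_denoise(data, rank):
    """Truncated-SVD denoising: keep the top-`rank` singular components,
    which capture the coherent (low-rank) signal, and discard the rest."""
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    return (u[:, :rank] * s[:rank]) @ vt[:rank]

# toy example: a rank-2 "signal" buried in random noise
rng = np.random.default_rng(0)
signal = np.outer(np.sin(np.linspace(0, 6, 200)), np.ones(50)) \
       + np.outer(np.linspace(-1, 1, 200), np.cos(np.arange(50)))
noisy = signal + 0.5 * rng.standard_normal(signal.shape)
denoised = tsvd_denoise(noisy, rank=2)
```

An online tracker replaces the full SVD with incremental rank-r updates as new columns arrive, which is where the claimed robustness and cost savings come from.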

  11. Application of random seismic inversion method based on tectonic model in thin sand body research

    NASA Astrophysics Data System (ADS)

    Dianju, W.; Jianghai, L.; Qingkai, F.

    2017-12-01

Oil and gas exploitation in the Songliao Basin, Northeast China, has already progressed to the period of high water production. Previous detailed reservoir descriptions based on seismic images, sediment cores and borehole logging have great limitations in small-scale structural interpretation and thin sand body characterization; precise guidance for petroleum exploration therefore badly needs a more advanced method. To this end, we derived a method of random seismic inversion constrained by a tectonic model. Combined with numerical simulation techniques, it can effectively improve the ability to depict thin sand bodies, credibly reducing the blindness of reservoir analysis from the whole to the local and from the macroscopic to the microscopic. At the same time, it can reduce the limitations of the study under the constraints of different geological conditions of the reservoir and achieve a reasonably exact estimation of the effective reservoir. Based on this research, and combined with practical exploration and development in the Aonan oil field, this paper optimizes the evaluation of regional effective reservoirs and the adjustment of production locations.

  12. Nonpoint Source Solute Transport Normal to Aquifer Bedding in Heterogeneous, Markov Chain Random Fields

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Harter, T.; Sivakumar, B.

    2005-12-01

Facies-based geostatistical models have become important tools for the stochastic analysis of flow and transport processes in heterogeneous aquifers. However, little is known about the dependency of these processes on the parameters of facies-based geostatistical models. This study examines the nonpoint source solute transport normal to the major bedding plane in the presence of interconnected high conductivity (coarse-textured) facies in the aquifer medium and the dependence of the transport behavior upon the parameters of the constitutive facies model. A facies-based Markov chain geostatistical model is used to quantify the spatial variability of the aquifer system hydrostratigraphy. It is integrated with a groundwater flow model and a random walk particle transport model to estimate the solute travel time probability distribution functions (pdfs) for solute flux from the water table to the bottom boundary (production horizon) of the aquifer. The cases examined include two-, three-, and four-facies models with horizontal to vertical facies mean length anisotropy ratios, ek, from 25:1 to 300:1, and with a wide range of facies volume proportions (e.g., from 5% to 95% coarse-textured facies). Predictions of travel time pdfs are found to be significantly affected by the number of hydrostratigraphic facies identified in the aquifer, the proportions of coarse-textured sediments, the mean length of the facies (particularly the ratio of length to thickness of coarse materials), and - to a lesser degree - the juxtapositional preference among the hydrostratigraphic facies. In transport normal to the sedimentary bedding plane, travel time pdfs are not lognormally distributed as is often assumed. Also, macrodispersive behavior (variance of the travel time pdf) was found to not be a unique function of the conductivity variance. The skewness of the travel time pdf varied from negatively skewed to strongly positively skewed within the parameter range examined. 
We also show that the Markov chain approach may give significantly different travel time pdfs when compared to the more commonly used Gaussian random field approach even though the first and second order moments in the geostatistical distribution of the lnK field are identical. The choice of the appropriate geostatistical model is therefore critical in the assessment of nonpoint source transport.
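In one dimension, the Markov chain facies idea reduces to drawing a facies sequence from a transition probability matrix, whose diagonal entries set the mean facies lengths. The sketch below is a minimal illustration with an assumed two-facies matrix, not the paper's calibrated multi-facies model.

```python
import numpy as np

def simulate_facies_column(P, n, start=0, seed=0):
    """Generate a 1-D (vertical) facies sequence from a Markov chain
    with transition matrix P (rows sum to 1)."""
    rng = np.random.default_rng(seed)
    states = [start]
    for _ in range(n - 1):
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return np.array(states)

# hypothetical two-facies model: 0 = coarse (sand), 1 = fine (mud);
# large diagonal entries give long mean facies runs
P = np.array([[0.90, 0.10],
              [0.05, 0.95]])
column = simulate_facies_column(P, 2000)
# stationary proportion of facies 0 is p10/(p01+p10) = 0.05/0.15 = 1/3
print(column.mean())  # fraction of fine facies, near 2/3
```

Multi-facies models add more states and encode juxtapositional preferences (which facies tends to follow which) in the off-diagonal entries.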

  13. Nonpoint source solute transport normal to aquifer bedding in heterogeneous, Markov chain random fields

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Harter, Thomas; Sivakumar, Bellie

    2006-06-01

    Facies-based geostatistical models have become important tools for analyzing flow and mass transport processes in heterogeneous aquifers. Yet little is known about the relationship between these latter processes and the parameters of facies-based geostatistical models. In this study, we examine the transport of a nonpoint source solute normal (perpendicular) to the major bedding plane of an alluvial aquifer medium that contains multiple geologic facies, including interconnected, high-conductivity (coarse textured) facies. We also evaluate the dependence of the transport behavior on the parameters of the constitutive facies model. A facies-based Markov chain geostatistical model is used to quantify the spatial variability of the aquifer system's hydrostratigraphy. It is integrated with a groundwater flow model and a random walk particle transport model to estimate the solute traveltime probability density function (pdf) for solute flux from the water table to the bottom boundary (the production horizon) of the aquifer. The cases examined include two-, three-, and four-facies models, with mean length anisotropy ratios for horizontal to vertical facies, ek, from 25:1 to 300:1 and with a wide range of facies volume proportions (e.g., from 5 to 95% coarse-textured facies). Predictions of traveltime pdfs are found to be significantly affected by the number of hydrostratigraphic facies identified in the aquifer. Those predictions of traveltime pdfs also are affected by the proportions of coarse-textured sediments, the mean length of the facies (particularly the ratio of length to thickness of coarse materials), and, to a lesser degree, the juxtapositional preference among the hydrostratigraphic facies. In transport normal to the sedimentary bedding plane, traveltime is not lognormally distributed as is often assumed. Also, macrodispersive behavior (variance of the traveltime) is found not to be a unique function of the conductivity variance. 
For the parameter range examined, the third moment of the traveltime pdf varies from negatively skewed to strongly positively skewed. We also show that the Markov chain approach may give significantly different traveltime distributions when compared to the more commonly used Gaussian random field approach, even when the first- and second-order moments in the geostatistical distribution of the lnK field are identical. The choice of the appropriate geostatistical model is therefore critical in the assessment of nonpoint source transport, and uncertainty about that choice must be considered in evaluating the results.

  14. Sieve-based relation extraction of gene regulatory networks from biological literature

    PubMed Central

    2015-01-01

    Background Relation extraction is an essential procedure in literature mining. It focuses on extracting semantic relations between parts of text, called mentions. Biomedical literature includes an enormous amount of textual descriptions of biological entities, their interactions and results of related experiments. To extract them in an explicit, computer readable format, these relations were at first extracted manually from databases. Manual curation was later replaced with automatic or semi-automatic tools with natural language processing capabilities. The current challenge is the development of information extraction procedures that can directly infer more complex relational structures, such as gene regulatory networks. Results We develop a computational approach for extraction of gene regulatory networks from textual data. Our method is designed as a sieve-based system and uses linear-chain conditional random fields and rules for relation extraction. With this method we successfully extracted the sporulation gene regulation network in the bacterium Bacillus subtilis for the information extraction challenge at the BioNLP 2013 conference. To enable extraction of distant relations using first-order models, we transform the data into skip-mention sequences. We infer multiple models, each of which is able to extract different relationship types. Following the shared task, we conducted additional analysis using different system settings that resulted in reducing the reconstruction error of bacterial sporulation network from 0.73 to 0.68, measured as the slot error rate between the predicted and the reference network. We observe that all relation extraction sieves contribute to the predictive performance of the proposed approach. Also, features constructed by considering mention words and their prefixes and suffixes are the most important features for higher accuracy of extraction. 
Analysis of distances between different mention types in the text shows that our choice of transforming data into skip-mention sequences is appropriate for detecting relations between distant mentions. Conclusions Linear-chain conditional random fields, along with appropriate data transformations, can be efficiently used to extract relations. The sieve-based architecture simplifies the system as new sieves can be easily added or removed and each sieve can utilize the results of previous ones. Furthermore, sieves with conditional random fields can be trained on arbitrary text data and hence are applicable to broad range of relation extraction tasks and data domains. PMID:26551454

  15. Sieve-based relation extraction of gene regulatory networks from biological literature.

    PubMed

    Žitnik, Slavko; Žitnik, Marinka; Zupan, Blaž; Bajec, Marko

    2015-01-01

    Relation extraction is an essential procedure in literature mining. It focuses on extracting semantic relations between parts of text, called mentions. Biomedical literature includes an enormous amount of textual descriptions of biological entities, their interactions and results of related experiments. To extract them in an explicit, computer readable format, these relations were at first extracted manually from databases. Manual curation was later replaced with automatic or semi-automatic tools with natural language processing capabilities. The current challenge is the development of information extraction procedures that can directly infer more complex relational structures, such as gene regulatory networks. We develop a computational approach for extraction of gene regulatory networks from textual data. Our method is designed as a sieve-based system and uses linear-chain conditional random fields and rules for relation extraction. With this method we successfully extracted the sporulation gene regulation network in the bacterium Bacillus subtilis for the information extraction challenge at the BioNLP 2013 conference. To enable extraction of distant relations using first-order models, we transform the data into skip-mention sequences. We infer multiple models, each of which is able to extract different relationship types. Following the shared task, we conducted additional analysis using different system settings that resulted in reducing the reconstruction error of bacterial sporulation network from 0.73 to 0.68, measured as the slot error rate between the predicted and the reference network. We observe that all relation extraction sieves contribute to the predictive performance of the proposed approach. Also, features constructed by considering mention words and their prefixes and suffixes are the most important features for higher accuracy of extraction. 
Analysis of distances between different mention types in the text shows that our choice of transforming data into skip-mention sequences is appropriate for detecting relations between distant mentions. Linear-chain conditional random fields, along with appropriate data transformations, can be efficiently used to extract relations. The sieve-based architecture simplifies the system as new sieves can be easily added or removed and each sieve can utilize the results of previous ones. Furthermore, sieves with conditional random fields can be trained on arbitrary text data and hence are applicable to broad range of relation extraction tasks and data domains.
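Inference in the linear-chain models described above reduces to Viterbi decoding: finding the label path that maximizes the sum of emission and transition scores. The sketch below uses made-up scores for a three-label O/B/I-style scheme; it is not the authors' trained CRF.

```python
import numpy as np

def viterbi(emission, transition):
    """Most-likely label path for a linear-chain model.
    emission[t, y]    : score of label y at position t
    transition[y, y2] : score of moving from label y to y2."""
    T, L = emission.shape
    score = emission[0].copy()
    back = np.zeros((T, L), dtype=int)
    for t in range(1, T):
        # cand[i, j] = best score ending in i, then moving to j at step t
        cand = score[:, None] + transition + emission[t]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# toy 3-label example (0=O, 1=B, 2=I); all scores are invented
em = np.array([[2., 0., -1.], [0., 2., -1.], [0., -1., 2.], [2., 0., 0.]])
tr = np.array([[0.5, 0., -2.],   # I is unlikely except after B or I
               [-1., -1., 1.],
               [-1., -1., 1.]])
print(viterbi(em, tr))  # most-likely path: [0, 1, 2, 0]
```

Skip-mention sequences change only which tokens appear in the chain; the same decoding applies once the sequence is formed.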

  16. Scanning gate microscopy of quantum rings: effects of an external magnetic field and of charged defects.

    PubMed

    Pala, M G; Baltazar, S; Martins, F; Hackens, B; Sellier, H; Ouisse, T; Bayot, V; Huant, S

    2009-07-01

    We study scanning gate microscopy (SGM) in open quantum rings obtained from buried semiconductor InGaAs/InAlAs heterostructures. By performing a theoretical analysis based on the Keldysh-Green function approach we interpret the radial fringes observed in experiments as the effect of randomly distributed charged defects. We associate SGM conductance images with the local density of states (LDOS) of the system. We show that such an association cannot be made with the current density distribution. By varying an external magnetic field we are able to reproduce recursive quasi-classical orbits in LDOS and conductance images, which bear the same periodicity as the Aharonov-Bohm effect.

  17. The application of mean field theory to image motion estimation.

    PubMed

    Zhang, J; Hanauer, G G

    1995-01-01

Previously, Markov random field (MRF) model-based techniques have been proposed for image motion estimation. Since motion estimation is usually an ill-posed problem, various constraints are needed to obtain a unique and stable solution. The main advantage of the MRF approach is its capacity to incorporate such constraints, for instance, motion continuity within an object and motion discontinuity at the boundaries between objects. In the MRF approach, motion estimation is often formulated as an optimization problem, and two frequently used optimization methods are simulated annealing (SA) and iterated conditional modes (ICM). Although the SA is theoretically optimal in the sense of finding the global optimum, it usually takes many iterations to converge. The ICM, on the other hand, converges quickly, but its results are often unsatisfactory due to its "hard decision" nature. Previously, the authors have applied the mean field theory to image segmentation and image restoration problems. It provides results nearly as good as SA but with much faster convergence. The present paper shows how the mean field theory can be applied to MRF model-based motion estimation. This approach is demonstrated on both synthetic and real-world images, where it produced good motion estimates.
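The mean field idea sketched above can be illustrated on the simplest MRF, a binary Ising-type model for image denoising: each pixel's neighbours are replaced by their current mean values, a "soft" alternative to ICM's hard decisions. The coupling and fidelity weights below are assumed values, not those of the paper.

```python
import numpy as np

def mean_field_denoise(y, coupling=1.5, fidelity=1.0, iters=30):
    """Mean-field approximation for a binary (+-1) Ising-type MRF:
    each pixel's magnetisation m is updated from the current mean of
    its 4-connected neighbours (periodic boundary via np.roll) plus a
    data-fidelity term anchoring it to the observation y."""
    m = y.astype(float).copy()          # initial magnetisation = data
    for _ in range(iters):
        nb = (np.roll(m, 1, 0) + np.roll(m, -1, 0) +
              np.roll(m, 1, 1) + np.roll(m, -1, 1))
        m = np.tanh(coupling * nb + fidelity * y)
    return np.sign(m)

# toy example: a +-1 square corrupted by flipping ~15% of the pixels
rng = np.random.default_rng(0)
clean = -np.ones((32, 32))
clean[8:24, 8:24] = 1
noisy = clean * np.where(rng.random(clean.shape) < 0.15, -1, 1)
restored = mean_field_denoise(noisy)
```

The same update structure, applied to motion vectors with continuity and discontinuity constraints, is the essence of the approach the abstract describes.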

  18. Acute nonlymphocytic leukemia and residential exposure to power frequency magnetic fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Severson, R.K.

    1986-01-01

    A population-based case-control study of adult acute nonlymphocytic leukemia (ANLL) and residential exposure to power frequency magnetic fields was conducted in King, Pierce and Snohomish Counties in Washington state. Of 164 cases who were diagnosed from January 1, 1981 through December 31, 1984, 114 were interviewed. Controls were selected from the study area on the basis of random digit dialing and frequency matched to the cases by age and sex. Analyses were undertaken to evaluate whether exposure to high levels of power frequency magnetic fields in the residence was associated with an increased risk of ANLL. Neither the directly measured magnetic fields nor the surrogate values based on the wiring configurations were associated with ANLL. Additional analyses suggested that persons with prior allergies were at decreased risk of acute myelocytic leukemia (AML). Also, persons with prior autoimmune diseases were at increased risk of AML. The increase in AML risk in rheumatoid arthritics was of borderline statistical significance. Finally, cigarette smoking was associated with an increased risk of AML. The risk of AML increased significantly with the number of years of cigarette smoking.

  19. Response of space shuttle insulation panels to acoustic noise pressure

    NASA Technical Reports Server (NTRS)

    Vaicaitis, R.

    1976-01-01

    The response of reusable space shuttle insulation panels to random acoustic pressure fields is studied. The basic analytical approach in formulating the governing equations of motion uses a Rayleigh-Ritz technique. The input pressure field is modeled as a stationary Gaussian random process for which the cross-spectral density function is known empirically from experimental measurements. The response calculations are performed in both the frequency and time domains.

  20. Vector solution for the mean electromagnetic fields in a layer of random particles

    NASA Technical Reports Server (NTRS)

    Lang, R. H.; Seker, S. S.; Levine, D. M.

    1986-01-01

    The mean electromagnetic fields are found in a layer of randomly oriented particles lying over a half space. A matrix-dyadic formulation of Maxwell's equations is employed in conjunction with the Foldy-Lax approximation to obtain equations for the mean fields. A two variable perturbation procedure, valid in the limit of small fractional volume, is then used to derive uncoupled equations for the slowly varying amplitudes of the mean wave. These equations are solved to obtain explicit expressions for the mean electromagnetic fields in the slab region in the general case of arbitrarily oriented particles and arbitrary polarization of the incident radiation. Numerical examples are given for the application to remote sensing of vegetation.

  1. Digital-Analog Hybrid Scheme and Its Application to Chaotic Random Number Generators

    NASA Astrophysics Data System (ADS)

    Yuan, Zeshi; Li, Hongtao; Miao, Yunchi; Hu, Wen; Zhu, Xiaohua

    2017-12-01

    Practical random number generation (RNG) circuits are typically achieved with analog devices or digital approaches. Digital techniques, which use field-programmable gate arrays (FPGAs), graphics processing units (GPUs), and similar hardware, usually perform better than analog methods as they are programmable, efficient and robust. However, digital realizations suffer from the effect of finite precision. Accordingly, the generated random numbers (RNs) are actually periodic rather than truly random. To tackle this limitation, in this paper we propose a novel digital-analog hybrid scheme that employs the digital unit as the main body and minimal analog devices to generate physical RNs. Moreover, the possibility of realizing the proposed scheme with only one memory element is discussed. Without loss of generality, we use the capacitor and the memristor along with an FPGA to construct the proposed hybrid system, and a chaotic true random number generator (TRNG) circuit is realized, producing physical RNs at a throughput of Gbit/s scale. These RNs successfully pass all the tests in the NIST SP800-22 package, confirming the significance of the scheme in practical applications. In addition, the use of this new scheme is not restricted to RNGs, and it also provides a strategy to mitigate the effect of finite precision in other digital systems.
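    As an illustration of the finite-precision problem and the hybrid remedy, the sketch below iterates a chaotic logistic map digitally and periodically injects fresh entropy to break the orbit's eventual periodicity; `os.urandom` merely stands in for the paper's analog capacitor/memristor source, and the map and reseed interval are assumptions, not the authors' circuit:

```python
import os

def chaotic_bits(n, reseed_every=64):
    """Generate n bits from the logistic map x -> 4x(1-x).  In finite
    precision the orbit is eventually periodic, so fresh 'analog'
    entropy (here: os.urandom as a stand-in) is injected periodically."""
    x = int.from_bytes(os.urandom(4), 'big') / 2**32
    bits = []
    for i in range(n):
        if i % reseed_every == 0:
            # perturb the state with the stand-in analog source
            x = (x + int.from_bytes(os.urandom(2), 'big') / 2**16) % 1.0
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits
```

Without the reseeding step, a fixed-precision implementation of the same map would cycle, which is precisely the limitation the hybrid scheme targets.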

  2. Magnetic-field-induced irreversible antiferromagnetic-ferromagnetic phase transition around room temperature in as-cast Sm-Co based SmCo7-xSix alloys

    NASA Astrophysics Data System (ADS)

    Feng, D. Y.; Zhao, L. Z.; Liu, Z. W.

    2016-04-01

    A magnetic-field-induced irreversible metamagnetic phase transition from antiferro- to ferromagnetism, which leads to an anomalous initial-magnetization curve lying outside the magnetic hysteresis loop, is reported in arc-melted SmCo7-xSix alloys. The transition temperatures are near room temperature, much higher than other compounds with similar initial curves. Detailed investigation shows that this phenomenon is dependent on temperature, magnetic field and Si content and shows some interesting characteristics. It is suggested that varying interactions between the Sm and Co layers in the crystal are responsible for the formation of a metastable AFM structure, which induces the anomalous phenomenon in as-cast alloys. The random occupation of 3g sites by Si and Co atoms also has an effect on this phenomenon.

  3. Residential exposure to radiofrequency fields from mobile phone base stations, and broadcast transmitters: a population-based survey with personal meter.

    PubMed

    Viel, J F; Clerc, S; Barrera, C; Rymzhanova, R; Moissonnier, M; Hours, M; Cardis, E

    2009-08-01

    Both public perception and most published epidemiologic studies rely on the assumption that the distance of a particular residence from a base station or a broadcast transmitter is an appropriate surrogate for exposure to radiofrequency fields, although complex propagation characteristics affect the beams from antennas. The main goal of this study was to characterise the distribution of residential exposure from antennas using personal exposure meters. A total of 200 randomly selected people were enrolled. Each participant was supplied with a personal exposure meter for 24 h measurements, and kept a time-location-activity diary. Two exposure metrics for each radiofrequency were then calculated: the proportion of measurements above the detection limit (0.05 V/m), and the maximum electric field strength. Residential address was geocoded, and distance from each antenna was calculated. Much of the time, the recorded field strength was below the detection level (0.05 V/m), the FM band standing apart with a proportion above the detection threshold of 12.3%. The maximum electric field strength was always lower than 1.5 V/m. Exposure to GSM and DCS waves peaked around 280 m and 1000 m from the antennas. A downward trend was found within a 10 km range for FM. Conversely, UMTS, TV 3, and TV 4&5 signals did not vary with distance. Despite numerous limiting factors entailing a high variability in radiofrequency exposure assessment, but owing to a sound statistical technique, we found that exposures from GSM and DCS base stations increase with distance in the near source zone, to a maximum where the main beam intersects the ground. We believe these results will contribute to the ongoing public debate over the location of base stations and their associated emissions.

  4. Scale-invariant puddles in graphene: Geometric properties of electron-hole distribution at the Dirac point.

    PubMed

    Najafi, M N; Nezhadhaghighi, M Ghasemi

    2017-03-01

    We characterize the carrier density profile of the ground state of graphene in the presence of particle-particle interaction and random charged impurity in zero gate voltage. We provide detailed analysis on the resulting spatially inhomogeneous electron gas, taking into account the particle-particle interaction and the remote Coulomb disorder on an equal footing within the Thomas-Fermi-Dirac theory. We present some general features of the carrier density probability measure of the graphene sheet. We also show that, when viewed as a random surface, the electron-hole puddles at zero chemical potential show peculiar self-similar statistical properties. Although the disorder potential is chosen to be Gaussian, we show that the charge field is non-Gaussian with unusual Kondev relations, which can be regarded as a new class of two-dimensional random-field surfaces. Using Schramm-Loewner evolution (SLE), we numerically demonstrate that the ungated graphene has conformal invariance and the random zero-charge density contours are SLE_{κ} with κ=1.8±0.2, consistent with c=-3 conformal field theory.

  5. Practicing Field Hockey Skills Along the Contextual Interference Continuum: A Comparison of Five Practice Schedules

    PubMed Central

    Cheong, Jadeera Phaik Geok; Lay, Brendan; Grove, J. Robert; Medic, Nikola; Razman, Rizal

    2012-01-01

    To overcome the weakness of the contextual interference (CI) effect within applied settings, Brady (2008) recommended that the amount of interference be manipulated. This study investigated the effect of five practice schedules on the learning of three field hockey skills. Fifty-five pre-university students performed a total of 90 trials for each skill under blocked, mixed or random practice orders. Results showed a significant time effect, with all five practice conditions leading to improvements in acquisition and learning of the skills. No significant differences were found between the groups. The findings of the present study did not support the CI effect and suggest that either blocked, mixed, or random practice schedules can be used effectively when structuring practice for beginners. Key points: The contextual interference effect did not surface when using sport skills. There appears to be no difference between blocked and random practice schedules in the learning of field hockey skills. Low (blocked), moderate (mixed) or high (random) interference practice schedules can be used effectively when conducting a multiple skill practice session for beginners. PMID:24149204

  6. Practicing field hockey skills along the contextual interference continuum: a comparison of five practice schedules.

    PubMed

    Cheong, Jadeera Phaik Geok; Lay, Brendan; Grove, J Robert; Medic, Nikola; Razman, Rizal

    2012-01-01

    To overcome the weakness of the contextual interference (CI) effect within applied settings, Brady (2008) recommended that the amount of interference be manipulated. This study investigated the effect of five practice schedules on the learning of three field hockey skills. Fifty-five pre-university students performed a total of 90 trials for each skill under blocked, mixed or random practice orders. Results showed a significant time effect, with all five practice conditions leading to improvements in acquisition and learning of the skills. No significant differences were found between the groups. The findings of the present study did not support the CI effect and suggest that either blocked, mixed, or random practice schedules can be used effectively when structuring practice for beginners. Key points: The contextual interference effect did not surface when using sport skills. There appears to be no difference between blocked and random practice schedules in the learning of field hockey skills. Low (blocked), moderate (mixed) or high (random) interference practice schedules can be used effectively when conducting a multiple skill practice session for beginners.

  7. Cavity master equation for the continuous time dynamics of discrete-spin models.

    PubMed

    Aurell, E; Del Ferraro, G; Domínguez, E; Mulet, R

    2017-05-01

    We present an alternate method to close the master equation representing the continuous time dynamics of interacting Ising spins. The method makes use of the theory of random point processes to derive a master equation for local conditional probabilities. We analytically test our solution studying two known cases, the dynamics of the mean-field ferromagnet and the dynamics of the one-dimensional Ising system. We present numerical results comparing our predictions with Monte Carlo simulations in three different models on random graphs with finite connectivity: the Ising ferromagnet, the random field Ising model, and the Viana-Bray spin-glass model.

  8. Cavity master equation for the continuous time dynamics of discrete-spin models

    NASA Astrophysics Data System (ADS)

    Aurell, E.; Del Ferraro, G.; Domínguez, E.; Mulet, R.

    2017-05-01

    We present an alternate method to close the master equation representing the continuous time dynamics of interacting Ising spins. The method makes use of the theory of random point processes to derive a master equation for local conditional probabilities. We analytically test our solution studying two known cases, the dynamics of the mean-field ferromagnet and the dynamics of the one-dimensional Ising system. We present numerical results comparing our predictions with Monte Carlo simulations in three different models on random graphs with finite connectivity: the Ising ferromagnet, the random field Ising model, and the Viana-Bray spin-glass model.

  9. Segmentation of Large Unstructured Point Clouds Using Octree-Based Region Growing and Conditional Random Fields

    NASA Astrophysics Data System (ADS)

    Bassier, M.; Bonduel, M.; Van Genechten, B.; Vergauwen, M.

    2017-11-01

    Point cloud segmentation is a crucial step in scene understanding and interpretation. The goal is to decompose the initial data into sets of workable clusters with similar properties. Additionally, it is a key aspect in the automated procedure from point cloud data to BIM. Current approaches typically only segment a single type of primitive such as planes or cylinders. Also, current algorithms suffer from oversegmenting the data and are often sensor or scene dependent. In this work, a method is presented to automatically segment large unstructured point clouds of buildings. More specifically, the segmentation is formulated as a graph optimisation problem. First, the data is oversegmented with a greedy octree-based region growing method. The growing is conditioned on the segmentation of planes as well as smooth surfaces. Next, the candidate clusters are represented by a Conditional Random Field, after which the most likely configuration of candidate clusters is computed given a set of local and contextual features. The experiments show that the method is a fast and reliable framework for unstructured point cloud segmentation. Processing speeds of up to 40,000 points per second are recorded for the region growing. Additionally, the recall and precision of the graph clustering are approximately 80%. Overall, clustering reduces oversegmentation by nearly 22%. These clusters will be classified and used as a basis for the reconstruction of BIM models.
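    A minimal sketch of the greedy region-growing stage is given below; it uses only a proximity criterion and brute-force distances, whereas the paper additionally conditions growth on plane/smooth-surface fits and accelerates the search with an octree, so the function and its `radius` parameter are illustrative assumptions:

```python
import numpy as np

def region_grow(points, radius=1.0):
    """Greedy region growing: start from an unvisited seed and absorb
    every point within `radius` of the cluster front.  Returns an
    integer cluster label per point."""
    n = len(points)
    labels = -np.ones(n, dtype=int)   # -1 = unassigned
    cluster = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        labels[seed] = cluster
        front = [seed]
        while front:
            i = front.pop()
            d = np.linalg.norm(points - points[i], axis=1)
            for j in np.where((d <= radius) & (labels == -1))[0]:
                labels[j] = cluster
                front.append(j)
        cluster += 1
    return labels
```

On two well-separated groups of points this produces two clusters, which are exactly the candidate segments a CRF stage would then merge or relabel.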

  10. Laser-induced speckle scatter patterns in Bacillus colonies

    PubMed Central

    Kim, Huisung; Singh, Atul K.; Bhunia, Arun K.; Bae, Euiwon

    2014-01-01

    Label-free bacterial colony phenotyping technology called BARDOT (Bacterial Rapid Detection using Optical scattering Technology) provided successful classification of several different bacteria at the genus, species, and serovar level. Recent experiments with colonies of Bacillus species provided strikingly different characteristics of elastic light scatter (ELS) patterns, which were comprised of random speckles compared to other bacteria, which are dominated by concentric rings and spokes. Since this laser-based optical sensor interrogates the whole volume of the colony, 3-D information of micro- and macro-structures are all encoded in the far-field scatter patterns. Here, we present a theoretical model explaining the underlying mechanism of the speckle formation by the colonies from Bacillus species. Except for Bacillus polymyxa, all Bacillus spp. produced random bright spots on the imaging plane, which presumably depend on the cellular and molecular organization and content within the colony. Our scatter model-based analysis revealed that colony spread resulting in variable surface roughness can modify the wavefront of the scatter field. As the center diameter of the Bacillus spp. colony grew from 500 to 900 μm, the average speckle area decreased two-fold and the number of small speckles increased seven-fold. In conclusion, as a Bacillus colony grows, the average speckle size in the scatter pattern decreases and the number of smaller speckles increases due to the swarming growth characteristics of bacteria within the colony. PMID:25352840

  11. OBSERVATIONAL STUDIES VS. RANDOMIZED CONTROLLED TRIALS: AVENUES TO CAUSAL INFERENCE IN NEPHROLOGY

    PubMed Central

    Kovesdy, Csaba P; Kalantar-Zadeh, Kamyar

    2011-01-01

    A common frustration for practicing Nephrologists is the adage that the lack of randomized controlled trials (RCTs) does not allow us to establish causality, but merely associations. The field of Nephrology, like many other disciplines, has been suffering from a lack of RCTs. The view that short of RCTs there is no reliable evidence has hampered our ability to ascertain the best course of action for our patients. However, many clinically important questions in medicine and public health such as the association of smoking and lung cancer are not amenable to RCTs due to ethical or other considerations. Whereas RCTs unquestionably hold many advantages over observational studies, it should be recognized that they also have many flaws that render them fallible under certain circumstances. We provide a description of the various pros and cons of RCTs and of observational studies using examples from the Nephrology literature, and argue that it is simplistic to rank them solely based on pre-conceived notions about the superiority of one over the other. We also discuss methods whereby observational studies can become acceptable tools for causal inferences. Such approaches are especially important in a field like Nephrology where there are myriads of potential interventions based on complex pathophysiologic states, but where properly designed and conducted RCTs for all of these will probably never materialize. PMID:22364796

  12. Radio polarization and magnetic field structure in M 101

    NASA Astrophysics Data System (ADS)

    Berkhuijsen, E. M.; Urbanik, M.; Beck, R.; Han, J. L.

    2016-04-01

    We observed total and polarized radio continuum emission from the spiral galaxy M 101 at λλ 6.2 cm and 11.1 cm with the Effelsberg telescope. The angular resolutions are 2.5′ (=5.4 kpc) and 4.4′ (=9.5 kpc), respectively. We use these data to study various emission components in M 101 and properties of the magnetic field. Separation of thermal and non-thermal emission shows that the thermal emission is closely correlated with the spiral arms, while the non-thermal emission is more smoothly distributed, indicating diffusion of cosmic ray electrons away from their places of origin. The radial distribution of both emissions has a break near R = 16 kpc (=7.4′), where it steepens to an exponential scale length of L ≃ 5 kpc, which is about 2.5 times smaller than at R < 16 kpc. The distribution of the polarized emission has a broad maximum near R = 12 kpc and beyond R = 16 kpc also decreases with L ≃ 5 kpc. It seems that near R = 16 kpc a major change in the structure of M 101 takes place, which also affects the distributions of the strength of the random and ordered magnetic field. Beyond R = 16 kpc the radial scale length of both fields is about 20 kpc, which implies that they decrease to about 0.3 μG at R = 70 kpc, the largest optical extent. The equipartition strength of the total field ranges from nearly 10 μG at R < 2 kpc to 4 μG at R = 22-24 kpc. As the random field dominates in M 101 (B_ran/B_ord ≃ 2.4), wavelength-independent polarization is the main polarization mechanism. We show that energetic events causing H I shells of mean diameter < 625 pc could partly be responsible for this. At radii < 24 kpc, the random magnetic field depends on the star formation rate per unit area, ΣSFR, with a power-law exponent of b = 0.28 ± 0.02. The ordered magnetic field is generally aligned with the spiral arms, with pitch angles that are about 8° larger than those of H I filaments.
Based on observations with the 100 m telescope of the MPIfR at Effelsberg.FITS files of the images are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/588/A114

  13. SU-E-T-571: Prostate IMRT QA: Prediction of the Range of Rectal NTCP Using a 2D Field Approach Based on Variations of the Rectal Wall Motion and Thickness.

    PubMed

    Grigorov, G; Chow, J; Foster, K

    2012-06-01

    The aims of this study are to (1) introduce a 2D field of possible rectal normal tissue complication probability (NTCP) in the prostate intensity modulated radiotherapy (IMRT) plan, so that for a given prescribed dose the rectal NTCP is merely a function of the rectal wall thickness and rectal motion; and (2) separate the 2D field of rectal NTCP into an area of low risk and an area of high risk for rectal toxicity < Grade II, based on the threshold rectal NTCP. The 2D field of NTCP model was developed using ten randomly selected prostate IMRT plans. The clinical rectal geometry was initially represented by a cylindrical contour in the treatment planning system. Different combinations of rectal motions, rectal wall thicknesses, planning target volume margins and prescribed doses were used to determine the NTCP in prostate IMRT plans. It was found that the functions bordering the 2D field for the given AP, LR and SI directions can be described as exponential, quadratic and linear equations, respectively. A ratio of the area of the 2D field containing data of the low-risk NTCP to the entire area of the field was introduced and calculated. Although our method is based on Kutcher's dose-response model and published tissue parameters, other mathematical models can be used in our approach. The 2D field of rectal NTCP is useful to estimate the rectal NTCP range in prostate pre-treatment and treatment QA. Our method can determine the patient's immobilization threshold for a given rectal wall thickness so that the prescribed dose can be delivered to the prostate while avoiding rectal complication. Our method is also applicable to multi-phase prostate IMRT, and can be adapted to any treatment planning system. © 2012 American Association of Physicists in Medicine.

  14. Zero-field random-field effect in diluted triangular lattice antiferromagnet CuFe1-xAlxO2

    NASA Astrophysics Data System (ADS)

    Nakajima, T.; Mitsuda, S.; Kitagawa, K.; Terada, N.; Komiya, T.; Noda, Y.

    2007-04-01

    We performed neutron scattering experiments on a diluted triangular lattice antiferromagnet (TLA), CuFe1-xAlxO2 with x = 0.10. The detailed analysis of the scattering profiles revealed that the scattering function of magnetic reflection is described as the sum of a Lorentzian term and a Lorentzian-squared term with anisotropic width. The Lorentzian-squared term dominating at low temperature is indicative of the domain state in the prototypical random-field Ising model. Taking account of the sinusoidally amplitude-modulated magnetic structure with incommensurate wavenumber in CuFe1-xAlxO2 with x = 0.10, we conclude that the effective random field arises even at zero field, owing to the combination of site-random magnetic vacancies and the sinusoidal structure that is regarded as a partially disordered (PD) structure in a wide sense, as reported in the typical three-sublattice PD phase of a diluted Ising TLA, CsCo0.83Mg0.17Br3 (van Duijn et al 2004 Phys. Rev. Lett. 92 077202). While the previous study revealed the existence of a domain state in CsCo0.83Mg0.17Br3 by detecting magnetic reflections specific to the spin configuration near the domain walls, our present study revealed the existence of a domain state in CuFe1-xAlxO2 (x = 0.10) by determination of the functional form of the scattering function.
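    The line shape described above, a Lorentzian term plus a Lorentzian-squared term, can be written down directly; the helper below is purely illustrative (not the authors' fitting code), with `kappa` denoting an inverse correlation length and isotropic widths assumed for simplicity:

```python
import numpy as np

def scattering_profile(q, A, B, kappa):
    """S(q) as the sum of a Lorentzian (critical-fluctuation) term and a
    Lorentzian-squared (static domain) term, the standard form for
    random-field systems."""
    lor = 1.0 / (q**2 + kappa**2)
    return A * lor + B * lor**2
```

Near q = 0 the Lorentzian-squared term dominates for small kappa, which is why its growth at low temperature signals the formation of a domain state.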

  15. Model of a thin film optical fiber fluorosensor

    NASA Technical Reports Server (NTRS)

    Egalon, Claudio O.; Rogowski, Robert S.

    1991-01-01

    The efficiency of core-light injection from sources in the cladding of an optical fiber is modeled analytically by means of the exact field solution of a step-profile fiber. The analysis is based on the techniques by Marcuse (1988) in which the sources are treated as infinitesimal electric currents with random phase and orientation that excite radiation fields and bound modes. Expressions are developed based on an infinite cladding approximation which yield the power efficiency for a fiber coated with fluorescent sources in the core/cladding interface. Marcuse's results are confirmed for the case of a weakly guiding cylindrical fiber with fluorescent sources uniformly distributed in the cladding, and the power efficiency is shown to be practically constant for variable wavelengths and core radii. The most efficient fibers have the thin film located at the core/cladding boundary, and fibers with larger differences in the indices of refraction are shown to be the most efficient.

  16. Data and results of a laboratory investigation of microprocessor upset caused by simulated lightning-induced analog transients

    NASA Technical Reports Server (NTRS)

    Belcastro, C. M.

    1984-01-01

    Advanced composite aircraft designs include fault-tolerant computer-based digital control systems with high reliability requirements for adverse as well as optimum operating environments. Since aircraft penetrate intense electromagnetic fields during thunderstorms, onboard computer systems may be subjected to field-induced transient voltages and currents resulting in functional error modes which are collectively referred to as digital system upset. A methodology was developed for assessing the upset susceptibility of a computer system onboard an aircraft flying through a lightning environment. Upset error modes in a general-purpose microprocessor were studied via tests which involved the random input of analog transients, which model lightning-induced signals, onto interface lines of an 8080-based microcomputer from which upset error data were recorded. The application of Markov modeling to upset susceptibility estimation is discussed and a stochastic model is developed.

  17. The role of educational trainings in the diffusion of smart metering platforms: An agent-based modeling approach

    NASA Astrophysics Data System (ADS)

    Weron, Tomasz; Kowalska-Pyzalska, Anna; Weron, Rafał

    2018-09-01

    Using an agent-based modeling approach we examine the impact of educational programs and trainings on the diffusion of smart metering platforms (SMPs). We also investigate how social responses, like conformity or independence, mass-media advertising as well as opinion stability impact the transition from predecisional and preactional behavioral stages (opinion formation) to actional and postactional stages (decision-making) of individual electricity consumers. We find that mass-media advertising (i.e., a global external field) and educational trainings (i.e., a local external field) lead to similar, though not identical adoption rates. Secondly, that spatially concentrated 'group' trainings are never worse than randomly scattered ones, and for a certain range of parameters are significantly better. Finally, that by manipulating the time required by an agent to make a decision, e.g., through promotions, we can speed up or slow down the diffusion of SMPs.
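    A toy version of the adoption dynamics, combining a global advertising field with local conformity on a ring, can be sketched as follows; the agent count, rates, and ring topology are illustrative assumptions, not the paper's calibrated model:

```python
import random

def diffuse(n=100, p_ad=0.1, q_conf=0.5, steps=2000, seed=1):
    """Toy diffusion of adoption: each step a random agent either responds
    to global advertising (prob p_ad) or conforms to a random ring
    neighbour (prob q_conf); otherwise it keeps its opinion.
    Returns the final fraction of adopters."""
    rng = random.Random(seed)
    opinion = [0] * n            # 0 = non-adopter, 1 = adopter
    opinion[0] = 1               # one seed adopter
    for _ in range(steps):
        i = rng.randrange(n)
        if rng.random() < p_ad:
            opinion[i] = 1                       # global external field
        elif rng.random() < q_conf:
            j = (i + rng.choice([-1, 1])) % n    # local external field
            opinion[i] = opinion[j]
    return sum(opinion) / n
```

Because advertising only pushes agents toward adoption while conformity merely redistributes opinions, the adoption fraction drifts upward, mirroring the qualitative finding that global and local fields both accelerate diffusion.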

  18. The effective propagation constants of SH wave in composites reinforced by dispersive parallel nanofibers

    NASA Astrophysics Data System (ADS)

    Qiang, FangWei; Wei, PeiJun; Li, Li

    2012-07-01

    In the present paper, the effective propagation constants of elastic SH waves in composites with randomly distributed parallel cylindrical nanofibers are studied. The surface stress effects are considered based on the surface elasticity theory and non-classical interfacial conditions between the nanofiber and the host are derived. The scattering waves from individual nanofibers embedded in an infinite elastic host are obtained by the plane wave expansion method. The scattering waves from all fibers are summed up to obtain the multiple scattering waves. The interactions among random dispersive nanofibers are taken into account by the effective field approximation. The effective propagation constants are obtained by the configurational average of the multiple scattering waves. The effective speed and attenuation of the averaged wave and the associated dynamical effective shear modulus of composites are numerically calculated. Based on the numerical results, the size effects of the nanofibers on the effective propagation constants and the effective modulus are discussed.

  19. Evidentiary Pluralism as a Strategy for Research and Evidence-Based Practice in Rehabilitation Psychology

    PubMed Central

    Tucker, Jalie A.; Reed, Geoffrey M.

    2008-01-01

    This paper examines the utility of evidentiary pluralism, a research strategy that selects methods in service of content questions, in the context of rehabilitation psychology. Hierarchical views that favor randomized controlled clinical trials (RCTs) over other evidence are discussed, and RCTs are considered as they intersect with issues in the field. RCTs are vital for establishing treatment efficacy, but whether they are uniformly the best evidence to inform practice is critically evaluated. We argue that because treatment is only one of several variables that influence functioning, disability, and participation over time, an expanded set of conceptual and data analytic approaches should be selected in an informed way to support an expanded research agenda that investigates therapeutic and extra-therapeutic influences on rehabilitation processes and outcomes. The benefits of evidentiary pluralism are considered, including helping close the gap between the narrower clinical rehabilitation model and a public health disability model. KEY WORDS: evidence-based practice, evidentiary pluralism, rehabilitation psychology, randomized controlled trials PMID:19649150

  20. A sparse reconstruction method for the estimation of multi-resolution emission fields via atmospheric inversion

    DOE PAGES

    Ray, J.; Lee, J.; Yadav, V.; ...

    2015-04-29

    Atmospheric inversions are frequently used to estimate fluxes of atmospheric greenhouse gases (e.g., biospheric CO 2 flux fields) at Earth's surface. These inversions typically assume that flux departures from a prior model are spatially smoothly varying, which are then modeled using a multi-variate Gaussian. When the field being estimated is spatially rough, multi-variate Gaussian models are difficult to construct and a wavelet-based field model may be more suitable. Unfortunately, such models are very high dimensional and are most conveniently used when the estimation method can simultaneously perform data-driven model simplification (removal of model parameters that cannot be reliably estimated) and fitting. Such sparse reconstruction methods are typically not used in atmospheric inversions. In this work, we devise a sparse reconstruction method, and illustrate it in an idealized atmospheric inversion problem for the estimation of fossil fuel CO 2 (ffCO 2) emissions in the lower 48 states of the USA. Our new method is based on stagewise orthogonal matching pursuit (StOMP), a method used to reconstruct compressively sensed images. Our adaptations bestow three properties to the sparse reconstruction procedure which are useful in atmospheric inversions. We have modified StOMP to incorporate prior information on the emission field being estimated and to enforce non-negativity on the estimated field. Finally, though based on wavelets, our method allows for the estimation of fields in non-rectangular geometries, e.g., emission fields inside geographical and political boundaries. Our idealized inversions use a recently developed multi-resolution (i.e., wavelet-based) random field model developed for ffCO 2 emissions and synthetic observations of ffCO 2 concentrations from a limited set of measurement sites. We find that our method for limiting the estimated field within an irregularly shaped region is about a factor of 10 faster than conventional approaches.
It also reduces the overall computational cost by a factor of 2. Further, the sparse reconstruction scheme imposes non-negativity without introducing strong nonlinearities, such as those introduced by employing log-transformed fields, and thus reaps the benefits of simplicity and computational speed that are characteristic of linear inverse problems.
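    The greedy matching-pursuit idea underlying StOMP, extended with a crude non-negativity clip, can be sketched as follows. StOMP proper selects all columns whose correlations exceed a threshold at each stage; this simplified variant picks one column at a time, and the function name and parameters are illustrative:

```python
import numpy as np

def nn_omp(A, y, n_iter=10, tol=1e-8):
    """Greedy matching-pursuit sketch: repeatedly add the column of A
    most positively correlated with the residual, least-squares refit
    on the selected columns, and clip coefficients to be non-negative
    (a crude stand-in for the paper's non-negativity enforcement)."""
    m, n = A.shape
    support, x = [], np.zeros(n)
    r = y.copy()
    for _ in range(n_iter):
        c = A.T @ r
        j = int(np.argmax(c))              # best positively-correlated column
        if c[j] <= tol or j in support:
            break                          # residual explained: stop
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        coef = np.clip(coef, 0, None)      # enforce non-negativity
        x[:] = 0
        x[support] = coef
        r = y - A @ x
    return x
```

On a trivially sparse problem the greedy loop recovers the non-negative signal exactly, which conveys why such schemes can simplify the model (prune unneeded parameters) and fit it in the same pass.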
