Markov Random Fields, Stochastic Quantization and Image Analysis
1990-01-01
Markov random fields based on the lattice Z² have been extensively used in image analysis in a Bayesian framework as a priori models for the...of Image Analysis can be given some fundamental justification, then there is a remarkable connection between Probabilistic Image Analysis, Statistical Mechanics and Lattice-based Euclidean Quantum Field Theory.
NASA Astrophysics Data System (ADS)
Vanmarcke, Erik
1983-03-01
Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.
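The spectral (wave number-domain) description of homogeneous random fields summarized above lends itself to direct simulation. Below is a minimal Python sketch, not taken from the book, that synthesizes a two-dimensional homogeneous Gaussian random field from an assumed Gaussian-shaped wavenumber spectrum via the FFT; the grid size, spacing, and correlation length are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dx, corr_len = 256, 1.0, 8.0              # grid size, spacing, correlation length (assumed)

k = np.fft.fftfreq(n, d=dx) * 2 * np.pi
KX, KY = np.meshgrid(k, k, indexing="ij")
k2 = KX**2 + KY**2

# Isotropic spectral density S(k) ~ exp(-k^2 l^2 / 4), i.e. a Gaussian covariance model
S = np.exp(-k2 * corr_len**2 / 4.0)

# Complex white noise shaped by sqrt(S); the real part of the inverse FFT is a
# real-valued homogeneous Gaussian field with the target spectrum (up to scale)
noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
field = np.fft.ifft2(noise * np.sqrt(S)).real
field /= field.std()                          # rescale to unit variance

print("mean %.3f, std %.3f" % (field.mean(), field.std()))
```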
NASA Technical Reports Server (NTRS)
Weger, R. C.; Lee, J.; Zhu, Tianri; Welch, R. M.
1992-01-01
The current controversy over regularity vs. clustering in cloud fields is examined by means of analysis and simulation studies based upon nearest-neighbor cumulative distribution statistics. It is shown that the Poisson representation of random point processes is superior to pseudorandom-number-generated models, which bias the observed nearest-neighbor statistics towards regularity. Interpretation of these nearest-neighbor statistics is discussed for many cases of superpositions of clustering, randomness, and regularity. A detailed analysis is carried out of cumulus cloud field spatial distributions based upon Landsat, AVHRR, and Skylab data, showing that, when both large and small clouds are included in the cloud field distributions, the cloud field always has a strong clustering signal.
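A minimal sketch of the nearest-neighbor cumulative-distribution diagnostic discussed above, comparing a homogeneous Poisson point field with a clustered parent-daughter field; the intensities and cluster parameters below are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

def nn_distances(points):
    """Distance from each point to its nearest neighbor."""
    d, _ = cKDTree(points).query(points, k=2)   # k=1 is the point itself
    return d[:, 1]

# Poisson field (complete spatial randomness) on the unit square
lam = 200.0
poisson_pts = rng.uniform(0.0, 1.0, size=(rng.poisson(lam), 2))

# Clustered field: Poisson parents with Gaussian-scattered daughters
parents = rng.uniform(0.0, 1.0, size=(20, 2))
daughters = parents.repeat(10, axis=0) + rng.normal(0.0, 0.02, size=(200, 2))

for name, pts in [("Poisson", poisson_pts), ("clustered", daughters)]:
    d = nn_distances(pts)
    density = len(pts)                           # points per unit area
    print(f"{name:9s}: mean NN distance {d.mean():.4f} "
          f"(Poisson expectation {1.0 / (2.0 * np.sqrt(density)):.4f})")
# A clustering signal shows up as a nearest-neighbor CDF shifted towards shorter
# distances than the Poisson expectation; regularity shifts it the other way.
```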
Surface plasmon enhanced cell microscopy with blocked random spatial activation
NASA Astrophysics Data System (ADS)
Son, Taehwang; Oh, Youngjin; Lee, Wonju; Yang, Heejin; Kim, Donghyun
2016-03-01
We present surface plasmon enhanced fluorescence microscopy with random spatial sampling using a patterned block of silver nanoislands. Rigorous coupled wave analysis was performed to confirm near-field localization on nanoislands. Random nanoislands were fabricated in silver by thermal annealing. By analyzing the random near-field distribution, the average size of the localized fields was found to be on the order of 135 nm. Randomly localized near-fields were used to spatially sample F-actin of J774 cells (mouse macrophage cell-line). An image deconvolution algorithm based on linear imaging theory was established for stochastic estimation of fluorescent molecular distribution. The alignment between the near-field distribution and the raw image was performed by the patterned block. The achieved resolution is dependent upon factors including the size of the localized fields and is estimated to be 100-150 nm.
Modeling and statistical analysis of non-Gaussian random fields with heavy-tailed distributions.
Nezhadhaghighi, Mohsen Ghasemi; Nakhlband, Abbas
2017-04-01
In this paper, we investigate and develop an alternative approach to the numerical analysis and characterization of random fluctuations with a heavy-tailed probability distribution function (PDF), such as turbulent heat flow and solar flare fluctuations. We identify the heavy-tailed random fluctuations based on the scaling properties of the tail exponent of the PDF, the power-law growth of the qth-order correlation function, and the self-similar properties of the contour lines in two-dimensional random fields. Moreover, this work leads to a substitute for the fractional Edwards-Wilkinson (EW) equation that works in the presence of α-stable Lévy noise. Our proposed model explains the configuration dynamics of systems with heavy-tailed correlated random fluctuations. We also present an alternative solution to the fractional EW equation in the presence of α-stable Lévy noise in the steady state, which is implemented numerically using α-stable fractional Lévy motion. Based on the analysis of the self-similar properties of contour loops, we numerically show that the scaling properties of contour loop ensembles can qualitatively and quantitatively distinguish non-Gaussian random fields from Gaussian random fluctuations.
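As a hedged illustration of the heavy-tail diagnostics mentioned above (not the authors' code), the sketch below draws symmetric α-stable Lévy noise with SciPy and recovers the tail exponent from the empirical survival function; the value α = 1.5, the sample size, and the fitting range are arbitrary choices.

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(2)
alpha = 1.5                                       # assumed stability index
x = levy_stable.rvs(alpha, 0.0, size=200_000, random_state=rng)

# For alpha-stable noise the survival function P(|X| > t) decays like t**(-alpha)
t = np.logspace(1.0, 2.0, 15)
surv = np.array([(np.abs(x) > ti).mean() for ti in t])
slope, _ = np.polyfit(np.log(t), np.log(surv), 1)
print(f"fitted tail exponent ~ {-slope:.2f} (target {alpha})")
```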
NASA Astrophysics Data System (ADS)
Andresen, Juan Carlos; Katzgraber, Helmut G.; Schechter, Moshe
2017-12-01
Random fields disorder Ising ferromagnets by aligning single spins in the direction of the random field in three space dimensions, or by flipping large ferromagnetic domains at dimensions two and below. While the former requires random fields of typical magnitude similar to the interaction strength, the latter Imry-Ma mechanism only requires infinitesimal random fields. Recently, it has been shown that for dilute anisotropic dipolar systems a third mechanism exists, where the ferromagnetic phase is disordered by finite-size glassy domains at a random field of finite magnitude that is considerably smaller than the typical interaction strength. Using large-scale Monte Carlo simulations and zero-temperature numerical approaches, we show that this mechanism applies to disordered ferromagnets with competing short-range ferromagnetic and antiferromagnetic interactions, suggesting its generality in ferromagnetic systems with competing interactions and an underlying spin-glass phase. A finite-size-scaling analysis of the magnetization distribution suggests that the transition might be first order.
Statistical analysis of loopy belief propagation in random fields
NASA Astrophysics Data System (ADS)
Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki
2015-10-01
Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.
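For readers unfamiliar with the baseline algorithm whose quenched average is analyzed above, the following is a minimal sum-product loopy belief propagation sketch on a small binary pairwise MRF with quenched Gaussian random fields. The grid size, coupling J, and field strength are illustrative assumptions; the code does not implement the replica cluster variation analysis itself.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4                                   # 4x4 grid of +/-1 spins
J = 0.3                                 # pairwise coupling (assumed)
h = rng.normal(0.0, 0.5, size=(n, n))   # quenched Gaussian random fields

nodes = [(i, j) for i in range(n) for j in range(n)]
edges = [((i, j), (i, j + 1)) for i in range(n) for j in range(n - 1)] + \
        [((i, j), (i + 1, j)) for i in range(n - 1) for j in range(n)]

states = np.array([-1.0, 1.0])
# messages m[(u, v)] live on directed edges and are distributions over the state of v
msgs = {}
for u, v in edges:
    msgs[(u, v)] = np.full(2, 0.5)
    msgs[(v, u)] = np.full(2, 0.5)

def neighbors(v):
    return [u for u in nodes if (u, v) in msgs]

pair = np.exp(J * np.outer(states, states))          # pairwise factor psi(s_u, s_v)

for _ in range(100):                                 # synchronous message updates
    new = {}
    for (u, v) in msgs:
        incoming = np.ones(2)
        for w in neighbors(u):
            if w != v:
                incoming *= msgs[(w, u)]
        local = np.exp(h[u] * states) * incoming     # phi_u(s_u) * incoming messages
        m = pair.T @ local                           # sum over s_u
        new[(u, v)] = m / m.sum()
    msgs = new

# beliefs = approximate single-site marginals
for v in nodes[:4]:
    b = np.exp(h[v] * states)
    for w in neighbors(v):
        b *= msgs[(w, v)]
    b /= b.sum()
    print(v, "P(s=+1) ~ %.3f" % b[1])
```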
Testing for a Signal with Unknown Location and Scale in a Stationary Gaussian Random Field
1994-01-07
Secondary 60D05, 52A22. Key words and phrases: Euler characteristic, integral geometry, image analysis, Gaussian fields, volume of tubes.
Restoration of dimensional reduction in the random-field Ising model at five dimensions
NASA Astrophysics Data System (ADS)
Fytas, Nikolaos G.; Martín-Mayor, Víctor; Picco, Marco; Sourlas, Nicolas
2017-04-01
The random-field Ising model is one of the few disordered systems where the perturbative renormalization group can be carried out to all orders of perturbation theory. This analysis predicts dimensional reduction, i.e., that the critical properties of the random-field Ising model in D dimensions are identical to those of the pure Ising ferromagnet in D - 2 dimensions. It is well known that dimensional reduction is not true in three dimensions, thus invalidating the perturbative renormalization group prediction. Here, we report high-precision numerical simulations of the 5D random-field Ising model at zero temperature. We illustrate universality by comparing different probability distributions for the random fields. We compute all the relevant critical exponents (including the critical slowing down exponent for the ground-state finding algorithm), as well as several other renormalization-group invariants. The estimated values of the critical exponents of the 5D random-field Ising model are statistically compatible with those of the pure 3D Ising ferromagnet. These results support the restoration of dimensional reduction at D = 5. We thus conclude that the failure of the perturbative renormalization group is a low-dimensional phenomenon. We close our contribution by comparing universal quantities for the random-field problem at dimensions 3 ≤ D < 6 to their values in the pure Ising model at D - 2 dimensions, and we provide a clear verification of the Rushbrooke equality at all studied dimensions.
Simulating and mapping spatial complexity using multi-scale techniques
De Cola, L.
1994-01-01
A central problem in spatial analysis is the mapping of data for complex spatial fields using relatively simple data structures, such as those of a conventional GIS. This complexity can be measured using such indices as multi-scale variance, which reflects spatial autocorrelation, and multi-fractal dimension, which characterizes the values of fields. These indices are computed for three spatial processes: Gaussian noise, a simple mathematical function, and data for a random walk. Fractal analysis is then used to produce a vegetation map of the central region of California based on a satellite image. This analysis suggests that real world data lie on a continuum between the simple and the random, and that a major GIS challenge is the scientific representation and understanding of rapidly changing multi-scale fields. -Author
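As a hedged illustration of the kind of fractal index mentioned above (not the author's software), the sketch below estimates a box-counting dimension for the excursion set of a synthetic 2D Gaussian-noise field; the field, threshold, and box sizes are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 256
field = rng.normal(size=(n, n))
mask = field > field.mean()                      # binary excursion set of the field

sizes, counts = [], []
for s in (2, 4, 8, 16, 32, 64):
    trimmed = mask[: n - n % s, : n - n % s]     # make the grid divisible by s
    blocks = trimmed.reshape(n // s, s, n // s, s)
    counts.append(blocks.any(axis=(1, 3)).sum()) # boxes containing at least one "on" cell
    sizes.append(s)

slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
print("box-counting dimension ~ %.2f" % -slope)  # close to 2 for dense random noise
```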
Magnetic field line random walk in models and simulations of reduced magnetohydrodynamic turbulence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snodin, A. P.; Ruffolo, D.; Oughton, S.
2013-12-10
The random walk of magnetic field lines is examined numerically and analytically in the context of reduced magnetohydrodynamic (RMHD) turbulence, which provides a useful description of plasmas dominated by a strong mean field, such as in the solar corona. A recently developed non-perturbative theory of magnetic field line diffusion is compared with the diffusion coefficients obtained by accurate numerical tracing of magnetic field lines for both synthetic models and direct numerical simulations of RMHD. Statistical analysis of an ensemble of trajectories confirms the applicability of the theory, which very closely matches the numerical field line diffusion coefficient as a function of distance z along the mean magnetic field for a wide range of the Kubo number R. This theory employs Corrsin's independence hypothesis, sometimes thought to be valid only at low R. However, the results demonstrate that it works well up to R = 10, both for a synthetic RMHD model and an RMHD simulation. The numerical results from the RMHD simulation are compared with and without phase randomization, demonstrating a clear effect of coherent structures on the field line random walk for a very low Kubo number.
Probabilistic finite elements for transient analysis in nonlinear continua
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Mani, A.
1985-01-01
The probabilistic finite element method (PFEM), which is a combination of finite element methods and second-moment analysis, is formulated for linear and nonlinear continua with inhomogeneous random fields. Analogous to the discretization of the displacement field in finite element methods, the random field is also discretized. The formulation is simplified by transforming the correlated variables to a set of uncorrelated variables through an eigenvalue orthogonalization. Furthermore, it is shown that a reduced set of the uncorrelated variables is sufficient for the second-moment analysis. Based on the linear formulation of the PFEM, the method is then extended to transient analysis in nonlinear continua. The accuracy and efficiency of the method are demonstrated by application to a one-dimensional, elastic/plastic wave propagation problem. The moments calculated compare favorably with those obtained by Monte Carlo simulation. Also, the procedure is amenable to implementation in deterministic FEM-based computer programs.
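A minimal sketch of the eigenvalue-orthogonalization step described above, under assumed values: nodal values of a 1D random field with an exponential covariance are transformed into uncorrelated variables, and a truncated set of them is used for a second-moment calculation through an illustrative linear response operator.

```python
import numpy as np

# Exponential covariance between nodal values of a 1D random field (assumed model)
x = np.linspace(0.0, 10.0, 50)
corr_len, sigma = 2.0, 0.1
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Eigenvalue orthogonalization: y = Phi^T (r - mean) are uncorrelated with variances lam
lam, Phi = np.linalg.eigh(C)
order = np.argsort(lam)[::-1]
lam, Phi = lam[order], Phi[:, order]

m = 5                                    # reduced set of uncorrelated variables
print("variance captured by %d modes: %.1f%%" % (m, 100 * lam[:m].sum() / lam.sum()))

# Second-moment propagation through an illustrative linear response operator A
A = np.diag(np.linspace(1.0, 2.0, 50))
cov_full = A @ C @ A.T
B = A @ Phi[:, :m]
cov_reduced = (B * lam[:m]) @ B.T        # A Phi_m Lam_m Phi_m^T A^T
err = np.abs(cov_full - cov_reduced).max()
print("max covariance error with the reduced set: %.2e" % err)
```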
Lin, Susie; McKenna, Samuel J; Yao, Chuan-Fong; Chen, Yu-Ray; Chen, Chit
2017-01-01
The objective of this study was to evaluate the efficacy of hypotensive anesthesia in reducing intraoperative blood loss, decreasing operation time, and improving the quality of the surgical field during orthognathic surgery. A systematic review and meta-analysis of randomized controlled trials addressing these issues were carried out. An electronic database search was performed. The risk of bias was evaluated with the Jadad Scale and Delphi List. The inverse variance statistical method and a random-effects model were used. Ten randomized controlled trials were included for analysis. Our meta-analysis indicated that hypotensive anesthesia reduced intraoperative blood loss by a mean of about 169 mL. Hypotensive anesthesia was not shown to reduce the operation time for orthognathic surgery, but it did improve the quality of the surgical field. Subgroup analysis indicated that for blood loss in double-jaw surgery, the weighted mean difference favored the hypotensive group, with a reduction in blood loss of 175 mL, but no statistically significant reduction in blood loss was found for anterior maxillary osteotomy. If local anesthesia with epinephrine was used in conjunction with hypotensive anesthesia, the reduction in intraoperative blood loss was increased to 254.93 mL. Hypotensive anesthesia was effective in reducing blood loss and improving the quality of the surgical field, but it did not reduce the operation time for orthognathic surgery. The use of local anesthesia in conjunction with hypotensive general anesthesia further reduced the amount of intraoperative blood loss for orthognathic surgery. Copyright © 2016 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zi, Bin; Zhou, Bin
2016-07-01
For the prediction of the dynamic response field of the luffing system of an automobile crane (LSOAAC) with random and interval parameters, a hybrid uncertain model is introduced. In the hybrid uncertain model, the parameters with certain probability distribution are modeled as random variables, whereas the parameters with lower and upper bounds are modeled as interval variables instead of given precise values. Based on the hybrid uncertain model, the hybrid uncertain dynamic response equilibrium equation, in which different random and interval parameters are simultaneously included in input and output terms, is constructed. Then a modified hybrid uncertain analysis method (MHUAM) is proposed. In the MHUAM, based on the random interval perturbation method, the first-order Taylor series expansion and the first-order Neumann series, the dynamic response expression of the LSOAAC is developed. Moreover, the mathematical characteristics of extrema of bounds of dynamic response are determined by the random interval moment method and a monotonic analysis technique. Compared with the hybrid Monte Carlo method (HMCM) and interval perturbation method (IPM), numerical results show the feasibility and efficiency of the MHUAM for solving the hybrid LSOAAC problems. The effects of different uncertain models and parameters on the LSOAAC response field are also investigated deeply, and numerical results indicate that the impact made by the randomness in the thrust of the luffing cylinder F is larger than that made by the gravity of the weight in suspension Q. In addition, the impact made by the uncertainty in the displacement between the lower end of the lifting arm and the luffing cylinder a is larger than that made by the length of the lifting arm L.
Patent citation network in nanotechnology (1976-2004)
NASA Astrophysics Data System (ADS)
Li, Xin; Chen, Hsinchun; Huang, Zan; Roco, Mihail C.
2007-06-01
The patent citation networks are described using critical node, core network, and network topological analysis. The main objective is to understand the knowledge transfer processes between technical fields, institutions, and countries. This includes identifying key influential players and subfields, the knowledge transfer patterns among them, and the overall knowledge transfer efficiency. The proposed framework is applied to the field of nanoscale science and engineering (NSE), including the citation networks of patent documents, submitting institutions, technology fields, and countries. The NSE patents were identified by full-text keyword searching of patents at the United States Patent and Trademark Office (USPTO). The analysis shows that the United States is the most important citation center in NSE research. The institution citation network illustrates a more efficient knowledge transfer between institutions than a random network. The country citation network displays a knowledge transfer capability as efficient as a random network. The technology field citation network and the patent document citation network exhibit a less efficient knowledge diffusion capability than a random network. All four citation networks show a tendency to form local citation clusters.
This report is a description of field work and data analysis results comparing a design comparable to systematic site selection with one based on random selection of sites. The report is expected to validate the use of random site selection in the bioassessment program for the O...
Supernova-regulated ISM. V. Space and Time Correlations
NASA Astrophysics Data System (ADS)
Hollins, J. F.; Sarson, G. R.; Shukurov, A.; Fletcher, A.; Gent, F. A.
2017-11-01
We apply correlation analysis to random fields in numerical simulations of the supernova-driven interstellar medium (ISM) with the magnetic field produced by dynamo action. We solve the magnetohydrodynamic (MHD) equations in a shearing Cartesian box representing a local region of the ISM, subject to thermal and kinetic energy injection by supernova explosions, and parameterized, optically thin radiative cooling. We consider the cold, warm, and hot phases of the ISM separately; the analysis mostly considers the warm gas, which occupies the bulk of the domain. Various physical variables have different correlation lengths in the warm phase: 40, 50, and 60 pc for the random magnetic field, density, and velocity, respectively, in the midplane. The correlation time of the random velocity is comparable to the eddy turnover time, about 10^7 years, although it may be shorter in regions with a higher star formation rate. The random magnetic field is anisotropic, with the standard deviations of its components b_x/b_y/b_z having approximate ratios 0.5/0.6/0.6 in the midplane. The anisotropy is attributed to the global velocity shear from galactic differential rotation and locally inhomogeneous outflow to the galactic halo. The correlation length of Faraday depth along the z axis, 120 pc, is greater than for electron density, 60-90 pc, and the vertical magnetic field, 60 pc. Such comparisons may be sensitive to the orientation of the line of sight. Uncertainties of the structure functions of synchrotron intensity rapidly increase with the scale. This feature is hidden in a power spectrum analysis, which can undermine the usefulness of power spectra for detailed studies of interstellar turbulence.
Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine
2017-09-01
According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Lauterbach, S.; Fina, M.; Wagner, W.
2018-04-01
Since structural engineering requires highly developed and optimized structures, the thickness dependency is one of the most controversially debated topics. This paper deals with stability analysis of lightweight thin structures combined with arbitrary geometrical imperfections. Generally known design guidelines only consider imperfections for simple shapes and loading, whereas for complex structures the lower-bound design philosophy still holds. Herein, uncertainties are considered with an empirical knockdown factor representing a lower bound of existing measurements. To fully understand and predict expected bearable loads, numerical investigations are essential, including geometrical imperfections. These are implemented into a stand-alone program code with a stochastic approach to compute random fields as geometric imperfections that are applied to nodes of the finite element mesh of selected structural examples. The stochastic approach uses the Karhunen-Loève expansion for the random field discretization. For this approach, the so-called correlation length l_c controls the random field in a powerful way. This parameter has a major influence on the buckling shape, and also on the stability load. First, the impact of the correlation length is studied for simple structures. Second, since most structures for engineering devices are more complex and combined structures, these are intensively discussed with the focus on constrained random fields for e.g. flange-web-intersections. Specific constraints for those random fields are pointed out with regard to the finite element model. Further, geometrical imperfections vanish where the structure is supported.
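A hedged sketch of the random-field discretization step described above: a Gaussian imperfection field with an assumed exponential covariance is expanded in a truncated Karhunen-Loève basis over the nodes of an illustrative rectangular mesh, showing how the correlation length l_c controls the smoothness of the generated imperfections. The mesh, kernel, amplitude, and mode count are assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Nodes of a simple rectangular shell mesh (x along length, y along width)
nx, ny = 30, 10
X, Y = np.meshgrid(np.linspace(0, 3.0, nx), np.linspace(0, 1.0, ny), indexing="ij")
nodes = np.column_stack([X.ravel(), Y.ravel()])

def kl_imperfection(l_c, amplitude=0.001, n_modes=20):
    """One realization of an out-of-plane imperfection field with correlation length l_c."""
    d = np.linalg.norm(nodes[:, None, :] - nodes[None, :, :], axis=-1)
    C = amplitude**2 * np.exp(-d / l_c)                  # exponential covariance kernel
    lam, phi = np.linalg.eigh(C)
    lam, phi = lam[::-1][:n_modes], phi[:, ::-1][:, :n_modes]
    xi = rng.normal(size=n_modes)                        # independent standard normals
    return phi @ (np.sqrt(np.clip(lam, 0.0, None)) * xi)

for l_c in (0.1, 1.0, 5.0):
    w = kl_imperfection(l_c).reshape(nx, ny)
    roughness = np.abs(np.diff(w, axis=0)).mean()        # node-to-node variation along x
    print(f"l_c = {l_c:4.1f}: std = {w.std():.2e}, node-to-node roughness = {roughness:.2e}")
```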
Modal energy analysis for mechanical systems excited by spatially correlated loads
NASA Astrophysics Data System (ADS)
Zhang, Peng; Fei, Qingguo; Li, Yanbin; Wu, Shaoqing; Chen, Qiang
2018-10-01
MODal ENergy Analysis (MODENA) is an energy-based method proposed to deal with vibroacoustic problems. The performance of MODENA on the energy analysis of a mechanical system under spatially correlated excitation is investigated. A plate/cavity coupling system excited by a pressure field is studied in a numerical example, in which four kinds of pressure fields are involved: the purely random pressure field, the perfectly correlated pressure field, the incident diffuse field, and the turbulent boundary layer pressure fluctuation. The total energies of the subsystems differ from the reference solution only in the case of the purely random pressure field, and only for the non-excited subsystem (the cavity). A deeper analysis at the scale of modal energy is then conducted via another numerical example, in which two structural modes excited by correlated forces are coupled with one acoustic mode. A dimensionless correlation strength factor is proposed to determine the correlation strength between modal forces. Results show that the error on modal energy increases as the correlation strength factor increases. A criterion is proposed to establish a link between the error and the correlation strength factor. According to the criterion, the error is negligible when the correlation strength is weak, i.e., when the correlation strength factor is less than a critical value.
Three-Level Models for Indirect Effects in School- and Class-Randomized Experiments in Education
ERIC Educational Resources Information Center
Pituch, Keenan A.; Murphy, Daniel L.; Tate, Richard L.
2009-01-01
Due to the clustered nature of field data, multi-level modeling has become commonly used to analyze data arising from educational field experiments. While recent methodological literature has focused on multi-level mediation analysis, relatively little attention has been devoted to mediation analysis when three levels (e.g., student, class,…
ERIC Educational Resources Information Center
Hafdahl, Adam R.; Williams, Michelle A.
2009-01-01
In 2 Monte Carlo studies of fixed- and random-effects meta-analysis for correlations, A. P. Field (2001) ostensibly evaluated Hedges-Olkin-Vevea Fisher-[zeta] and Schmidt-Hunter Pearson-r estimators and tests in 120 conditions. Some authors have cited those results as evidence not to meta-analyze Fisher-[zeta] correlations, especially with…
On Nonlinear Functionals of Random Spherical Eigenfunctions
NASA Astrophysics Data System (ADS)
Marinucci, Domenico; Wigman, Igor
2014-05-01
We prove central limit theorems and Stein-like bounds for the asymptotic behaviour of nonlinear functionals of spherical Gaussian eigenfunctions. Our investigation combines asymptotic analysis of higher order moments for Legendre polynomials and, in addition, recent results on Malliavin calculus and total variation bounds for Gaussian subordinated fields. We discuss applications to geometric functionals like the defect and invariant statistics, e.g., polyspectra of isotropic spherical random fields. Both of these have relevance for applications, especially in an astrophysical environment.
Xu, Jia; Li, Chao; Li, Yiran; Lim, Chee Wah; Zhu, Zhiwen
2018-05-04
In this paper, a nonlinear model of single-walled carbon nanotubes is developed and the strongly nonlinear dynamic characteristics of such carbon nanotubes subjected to a random magnetic field are studied. The nonlocal effect of the microstructure is considered based on Eringen's differential constitutive model. The natural frequency of the strongly nonlinear dynamic system is obtained by the energy function method, and the drift and diffusion coefficients are verified. The stationary probability density function of the system dynamic response is given and the fractal boundary of the safe basin is provided. Theoretical analysis and numerical simulation show that stochastic resonance occurs when varying the random magnetic field intensity. The boundary of the safe basin has fractal characteristics, and the area of the safe basin decreases when the intensity of the magnetic field permeability increases.
Variance of discharge estimates sampled using acoustic Doppler current profilers from moving boats
Garcia, Carlos M.; Tarrab, Leticia; Oberg, Kevin; Szupiany, Ricardo; Cantero, Mariano I.
2012-01-01
This paper presents a model for quantifying the random errors (i.e., variance) of acoustic Doppler current profiler (ADCP) discharge measurements from moving boats for different sampling times. The model focuses on the random processes in the sampled flow field and has been developed using statistical methods currently available for uncertainty analysis of velocity time series. Analysis of field data collected using ADCPs from moving boats on three natural rivers of varying sizes and flow conditions shows that, even though the estimate of the integral time scale of the actual turbulent flow field is larger than the sampling interval, the integral time scale of the sampled flow field is on the order of the sampling interval. Thus, an equation for computing the variance error in discharge measurements associated with different sampling times, assuming uncorrelated flow fields, is appropriate. The approach is used to help define optimal sampling strategies by choosing the exposure time required for ADCPs to accurately measure flow discharge.
Random harmonic analysis program, L221 (TEV156). Volume 1: Engineering and usage
NASA Technical Reports Server (NTRS)
Miller, R. D.; Graham, M. L.
1979-01-01
A digital computer program capable of calculating steady state solutions for linear second order differential equations due to sinusoidal forcing functions is described. The field of application of the program, the analysis of airplane response and loads due to continuous random air turbulence, is discussed. Optional capabilities including frequency dependent input matrices, feedback damping, gradual gust penetration, multiple excitation forcing functions, and a static elastic solution are described. Program usage and a description of the analysis used are presented.
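A hedged sketch of the kind of computation such a program performs: the steady-state frequency response of a single linear second-order equation under sinusoidal forcing, and the response PSD and RMS under a random-turbulence input via |H(ω)|² S_in(ω). The mass, damping, stiffness, and gust spectrum below are illustrative assumptions, not parameters of the L221 program.

```python
import numpy as np

# Illustrative single-degree-of-freedom system: m x'' + c x' + k x = F(t)
m, c, k = 1.0, 0.05, (2 * np.pi * 2.0) ** 2        # ~2 Hz natural frequency (assumed)
omega = 2 * np.pi * np.linspace(0.1, 10.0, 500)
d_omega = omega[1] - omega[0]

# Steady-state transfer function x/F for sinusoidal forcing F e^{i w t}
H = 1.0 / (k - m * omega**2 + 1j * c * omega)

# Continuous random turbulence: response PSD = |H|^2 * input PSD
S_in = 1.0 / (1.0 + (omega / (2 * np.pi)) ** 2) ** (5.0 / 6.0)   # illustrative gust spectrum
S_out = np.abs(H) ** 2 * S_in

# Response variance for a two-sided spectrum evaluated over omega >= 0
var = S_out.sum() * d_omega / np.pi
peak_hz = omega[np.argmax(np.abs(H))] / (2 * np.pi)
print(f"|H| peaks near {peak_hz:.2f} Hz; response RMS ~ {np.sqrt(var):.3f}")
```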
The influence of an uncertain force environment on reshaping trial-to-trial motor variability.
Izawa, Jun; Yoshioka, Toshinori; Osu, Rieko
2014-09-10
Motor memory is updated to generate ideal movements in a novel environment. When the environment changes randomly from trial to trial, how does the brain incorporate this uncertainty into motor memory? To investigate how the brain adapts to an uncertain environment, we considered a reach adaptation protocol in which individuals practiced moving in a force field into which noise was injected. After they had adapted, we measured the trial-to-trial variability in the temporal profiles of the produced hand force. We found that the motor variability was significantly magnified by the adaptation to the random force field. Temporal profiles of the motor variance were significantly dissociable between the two different types of random force fields experienced. A model-based analysis suggests that the variability is generated by noise in the gains of the internal model. It further suggests that the trial-to-trial motor variability magnified by the adaptation in a random force field is generated by the uncertainty of the internal model formed in the brain as a result of the adaptation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roach, Mack; De Silvio, Michelle; Valicenti, Richard
2006-11-01
Purpose: Radiation Therapy Oncology Group (RTOG) 9413 trial demonstrated a better progression-free survival (PFS) with whole-pelvis (WP) radiotherapy (RT) compared with prostate-only (PO) RT. This secondary analysis was undertaken to determine whether 'mini-pelvis' (MP; defined as ≥10 x 11 cm but <11 x 11 cm) RT resulted in progression-free survival (PFS) comparable to that of WP RT. To avoid a timing bias, this analysis was limited to patients receiving neoadjuvant and concurrent hormonal therapy (N and CHT) in Arms 1 and 2 of the study. Methods and Materials: Eligible patients had a risk of lymph node (LN) involvement >15%. Neoadjuvant and concurrent hormonal therapy (N and CHT) was administered 2 months before and during RT for 4 months. From April 1, 1995, to June 1, 1999, a group of 325 patients were randomized to WP RT + N and CHT and another group of 324 patients were randomized to receive PO RT + N and CHT. Patients randomized to PO RT were dichotomized by median field size (10 x 11 cm), with the larger field considered an 'MP' field and the smaller a PO field. Results: The median PFS was 5.2, 3.7, and 2.9 years for WP, MP, and PO fields, respectively (p = 0.02). The 7-year PFS was 40%, 35%, and 27% for patients treated to WP, MP, and PO fields, respectively. There was no association between field size and late Grade 3+ genitourinary toxicity but late Grade 3+ gastrointestinal RT complications correlated with increasing field size. Conclusions: This subset analysis demonstrates that RT field size has a major impact on PFS, and the findings support comprehensive nodal treatment in patients with a risk of LN involvement of >15%.
Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D
2016-06-01
Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity, such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is an approach for EIT images of neural activity.
Bayesian network meta-analysis for cluster randomized trials with binary outcomes.
Uhlmann, Lorenz; Jensen, Katrin; Kieser, Meinhard
2017-06-01
Network meta-analysis is becoming a common approach to combine direct and indirect comparisons of several treatment arms. In recent research, there have been various developments and extensions of the standard methodology. Simultaneously, cluster randomized trials are experiencing an increased popularity, especially in the field of health services research, where, for example, medical practices are the units of randomization but the outcome is measured at the patient level. Combination of the results of cluster randomized trials is challenging. In this tutorial, we examine and compare different approaches for the incorporation of cluster randomized trials in a (network) meta-analysis. Furthermore, we provide practical insight on the implementation of the models. In simulation studies, it is shown that some of the examined approaches lead to unsatisfying results. However, there are alternatives which are suitable to combine cluster randomized trials in a network meta-analysis as they are unbiased and reach accurate coverage rates. In conclusion, the methodology can be extended in such a way that an adequate inclusion of the results obtained in cluster randomized trials becomes feasible. Copyright © 2016 John Wiley & Sons, Ltd.
Bashir, Muhammad Mustehsan; Qayyum, Rehan; Saleem, Muhammad Hammad; Siddique, Kashif; Khan, Farid Ahmad
2015-08-01
To determine the optimal time interval between tumescent local anesthesia infiltration and the start of hand surgery without a tourniquet for improved operative field visibility. Patients aged 16 to 60 years who needed contracture release and tendon repair in the hand were enrolled from the outpatient clinic. Patients were randomized to 10-, 15-, or 25-minute intervals between tumescent anesthetic solution infiltration (0.18% lidocaine and 1:221,000 epinephrine) and the start of surgery. The end point of tumescent anesthetic infiltration was pale and firm skin. The surgical team was blinded to the time of anesthetic infiltration. At the completion of the procedure, the surgeon and the first assistant rated the operative field visibility as excellent, fair, or poor. We used logistic regression models without and with adjustment for confounding variables. Of the 75 patients enrolled in the study, 59 (79%) were males, 7 were randomized to 10-minute time intervals (further randomization was stopped after interim analysis found consistently poor operative field visibility), and 34 were randomized to each of the 15- and 25-minute groups. Patients who were randomized to the 25-minute delay group had 29 times higher odds of having an excellent operative visual field than those randomized to the 15-minute delay group. After adjusting for age, sex, amount of tumescent solution infiltration, and duration of operation, the odds ratio remained highly significant. We found that an interval of 25 minutes provides vastly superior operative field visibility; the 10-minute delay had the poorest results. Therapeutic I. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
Power-law exponent of the Bouchaud-Mézard model on regular random networks
NASA Astrophysics Data System (ADS)
Ichinomiya, Takashi
2013-07-01
We study the Bouchaud-Mézard model on a regular random network. By assuming adiabaticity and independency, and utilizing the generalized central limit theorem and the Tauberian theorem, we derive an equation that determines the exponent of the probability distribution function of the wealth as x→∞. The analysis shows that the exponent can be smaller than 2, while a mean-field analysis always gives the exponent as being larger than 2. The results of our analysis are shown to be in good agreement with those of the numerical simulations.
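A hedged numerical companion to the analysis above, under assumed parameter values: the Bouchaud-Mézard wealth dynamics integrated by Euler-Maruyama on a regular random network generated with networkx, with the tail exponent of the wealth distribution read off from the empirical CCDF. The degree, coupling, noise level, and time step are illustrative, and no particular relation between the measured exponent and the parameters is asserted here.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(6)
n, degree = 2000, 4
J, sigma, dt, steps = 0.05, 0.3, 0.01, 20000       # assumed parameters

G = nx.random_regular_graph(degree, n, seed=1)
nbrs = np.array([list(G.neighbors(i)) for i in range(n)])   # (n, degree) neighbor indices
W = np.ones(n)

for _ in range(steps):
    exchange = J * (W[nbrs].sum(axis=1) - degree * W)        # sum_j A_ij (W_j - W_i)
    W = W + exchange * dt + sigma * W * np.sqrt(dt) * rng.normal(size=n)
    W = np.maximum(W, 1e-12)
    W *= n / W.sum()                                         # work with relative wealth (mean 1)

# Tail exponent from the empirical CCDF, P(W > w) ~ w^(-mu), using the top 5%
w_tail = np.sort(W)[::-1][: n // 20]
ccdf = np.arange(1, w_tail.size + 1) / n
slope, _ = np.polyfit(np.log(w_tail), np.log(ccdf), 1)
print(f"estimated wealth-distribution tail exponent mu ~ {-slope:.2f}")
```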
Drew, L.J.; Attanasi, E.D.; Schuenemeyer, J.H.
1988-01-01
If observed oil and gas field size distributions are obtained by random sampling, the fitted distributions should approximate that of the parent population of oil and gas fields. However, empirical evidence strongly suggests that larger fields tend to be discovered earlier in the discovery process than they would be by random sampling. Economic factors also can limit the number of small fields that are developed and reported. This paper examines observed size distributions in state and federal waters of offshore Texas. Results of the analysis demonstrate how the shape of the observable size distributions changes with significant hydrocarbon price changes. Comparison of state and federal observed size distributions in the offshore area shows how production cost differences also affect the shape of the observed size distribution. Methods for modifying the discovery rate estimation procedures when economic factors significantly affect the discovery sequence are presented. A primary conclusion of the analysis is that, because hydrocarbon price changes can significantly affect the observed discovery size distribution, one should not be confident about inferring the form and specific parameters of the parent field size distribution from the observed distributions. © 1988 International Association for Mathematical Geology.
Simulation of wave propagation in three-dimensional random media
NASA Technical Reports Server (NTRS)
Coles, William A.; Filice, J. P.; Frehlich, R. G.; Yadlowsky, M.
1993-01-01
A quantitative error analysis for the simulation of wave propagation in three-dimensional random media, assuming narrow angular scattering, is presented for the plane wave and spherical wave geometries. This includes the errors resulting from finite grid size, finite simulation dimensions, and the separation of the two-dimensional screens along the propagation direction. Simple error scalings are determined for power-law spectra of the random refractive index of the media. The effects of a finite inner scale are also considered. The spatial spectra of the intensity errors are calculated and compared to the spatial spectra of intensity. The numerical requirements for a simulation of given accuracy are determined for realizations of the field. The numerical requirements for accurate estimation of higher moments of the field are less stringent.
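A hedged, minimal version of the multiple-phase-screen technique whose errors the work analyzes: a unit plane wave is passed through thin random phase screens with a Kolmogorov-like spectrum, separated by paraxial (Fresnel) vacuum propagation steps, and a scintillation index is computed at the output. Grid, wavelength, screen strength, and screen spacing are arbitrary illustrative values, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(7)
n, dx = 256, 0.01                 # transverse grid points and spacing (m), assumed
wavelength = 1.0e-6               # assumed wavelength (m)
dz, n_screens = 50.0, 10          # screen separation (m) and number of screens

f = np.fft.fftfreq(n, d=dx)
FX, FY = np.meshgrid(f, f, indexing="ij")
f2 = FX**2 + FY**2

# Paraxial (Fresnel) vacuum propagator over distance dz in the spatial-frequency domain
propagator = np.exp(-1j * np.pi * wavelength * dz * f2)

def phase_screen(rms_rad=0.5, exponent=-11.0 / 3.0):
    """Random phase screen (in radians) with a power-law, Kolmogorov-like spectrum."""
    amp = np.where(f2 > 0, f2, np.inf) ** (exponent / 4.0)   # |f|^(exponent/2) amplitude
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    screen = np.fft.ifft2(noise * amp).real
    return rms_rad * screen / screen.std()

field = np.ones((n, n), dtype=complex)        # unit-amplitude plane wave
for _ in range(n_screens):
    field *= np.exp(1j * phase_screen())      # thin-screen phase kick
    field = np.fft.ifft2(np.fft.fft2(field) * propagator)

intensity = np.abs(field) ** 2
scint = intensity.var() / intensity.mean() ** 2
print(f"scintillation index after {n_screens} screens: {scint:.3f}")
```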
Najafi, M N; Nezhadhaghighi, M Ghasemi
2017-03-01
We characterize the carrier density profile of the ground state of graphene in the presence of particle-particle interaction and random charged impurities at zero gate voltage. We provide a detailed analysis of the resulting spatially inhomogeneous electron gas, taking into account the particle-particle interaction and the remote Coulomb disorder on an equal footing within the Thomas-Fermi-Dirac theory. We present some general features of the carrier density probability measure of the graphene sheet. We also show that, when viewed as a random surface, the electron-hole puddles at zero chemical potential show peculiar self-similar statistical properties. Although the disorder potential is chosen to be Gaussian, we show that the charge field is non-Gaussian with unusual Kondev relations, which can be regarded as a new class of two-dimensional random-field surfaces. Using Schramm-Loewner evolution (SLE), we numerically demonstrate that ungated graphene has conformal invariance and that the random zero-charge-density contours are SLE_κ with κ = 1.8 ± 0.2, consistent with c = -3 conformal field theory.
Random vectors and spatial analysis by geostatistics for geotechnical applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, D.S.
1987-08-01
Geostatistics is extended to the spatial analysis of vector variables by defining the estimation variance and vector variogram in terms of the magnitude of difference vectors. Many random variables in geotechnology are vectorial rather than scalar, and their structural analysis requires interpolation of sample variables to construct and characterize structural models. A better local estimator will result in greater quality of input models; geostatistics can provide such estimators: kriging estimators. The efficiency of geostatistics for vector variables is demonstrated in a case study of rock joint orientations in geological formations. The positive cross-validation encourages application of geostatistics to spatial analysis of random vectors in geoscience as well as various geotechnical fields including optimum site characterization, rock mechanics for mining and civil structures, cavability analysis of block cavings, petroleum engineering, and hydrologic and hydraulic modeling.
A study of the breast cancer dynamics in North Carolina.
Christakos, G; Lai, J J
1997-11-01
This work is concerned with the study of breast cancer incidence in the State of North Carolina. Methodologically, the current analysis illustrates the importance of spatiotemporal random field modelling and introduces a mode of reasoning that is based on a combination of inductive and deductive processes. The composite space/time analysis utilizes the variability characteristics of incidence and the mathematical features of the random field model to fit it to the data. The analysis is significantly general and can efficiently represent non-homogeneous and non-stationary characteristics of breast cancer variation. Incidence predictions are produced using data at the same time period as well as data from other time periods and disease registries. The random field provides a rigorous and systematic method for generating detailed maps, which offer a quantitative description of the incidence variation from place to place and from time to time, together with a measure of the accuracy of the incidence maps. Spatiotemporal mapping accounts for the geographical locations and the time instants of the incidence observations, which is not usually the case with most empirical Bayes methods. It is also more accurate than purely spatial statistics methods, and can offer valuable information about the breast cancer risk and dynamics in North Carolina. Field studies could be initialized in high-rate areas identified by the maps in an effort to uncover environmental or life-style factors that might be responsible for the high risk rates. Also, the incidence maps can help elucidate causal mechanisms, explain disease occurrences at a certain scale, and offer guidance in health management and administration.
Social patterns revealed through random matrix theory
NASA Astrophysics Data System (ADS)
Sarkar, Camellia; Jalan, Sarika
2014-11-01
Despite the tremendous advancements in the field of network theory, very few studies have taken into consideration the interaction weights that emerge naturally in all real-world systems. Using random matrix analysis of a weighted social network, we demonstrate the profound impact of weights in interactions on emerging structural properties. The analysis reveals that randomness existing in a particular time frame affects the decisions of individuals, allowing them more freedom of choice in situations of financial security. While the structural organization of networks remains the same throughout all datasets, random matrix theory provides insight into the interaction pattern of individuals of the society in situations of crisis. It has also been contemplated that individual accountability in terms of weighted interactions remains a key to success unless segregation of tasks comes into play.
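A hedged sketch of the core random-matrix diagnostic used in such analyses: eigenvalues of a synthetic symmetric random matrix (standing in here for a weighted adjacency matrix) are unfolded and their nearest-neighbor spacing statistics compared with the GOE Wigner surmise. The matrix ensemble, unfolding polynomial degree, and size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 500
M = rng.normal(size=(n, n))
M = (M + M.T) / np.sqrt(2.0 * n)              # symmetric, GOE-like normalization

eig = np.sort(np.linalg.eigvalsh(M))

# Unfolding: fit a smooth curve to the cumulative level count so the mean spacing is ~1
idx = np.arange(1, n + 1)
coeffs = np.polyfit(eig, idx, deg=7)
spacings = np.diff(np.polyval(coeffs, eig))
spacings = spacings[spacings > 0]

# GOE Wigner surmise p(s) = (pi/2) s exp(-pi s^2 / 4): mean 1, variance 4/pi - 1
print(f"mean spacing {spacings.mean():.2f}, variance {spacings.var():.3f} "
      f"(Wigner surmise variance ~ {4.0 / np.pi - 1.0:.3f})")
```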
Boundary element analyses for sound transmission loss of panels.
Zhou, Ran; Crocker, Malcolm J
2010-02-01
The sound transmission characteristics of an aluminum panel and two composite sandwich panels were investigated by using two boundary element analyses. The effect of air loading on the structural behavior of the panels is included in one boundary element analysis, by using a light-fluid approximation for the eigenmode series to evaluate the structural response. In the other boundary element analysis, the air loading is treated as an added mass. The effect of the modal energy loss factor on the sound transmission loss of the panels was investigated. Both boundary element analyses were used to study the sound transmission loss of symmetric sandwich panels excited by a random incidence acoustic field. A classical wave impedance analysis was also used to make sound transmission loss predictions for the two foam-filled honeycomb sandwich panels. Comparisons between predictions of sound transmission loss for the two foam-filled honeycomb sandwich panels excited by a random incidence acoustic field obtained from the wave impedance analysis, the two boundary element analyses, and experimental measurements are presented.
Visibility graphs of random scalar fields and spatial data
NASA Astrophysics Data System (ADS)
Lacasa, Lucas; Iacovacci, Jacopo
2017-07-01
We extend the family of visibility algorithms to map scalar fields of arbitrary dimension into graphs, enabling the analysis of spatially extended data structures as networks. We introduce several possible extensions and provide analytical results on the topological properties of the graphs associated to different types of real-valued matrices, which can be understood as the high and low disorder limits of real-valued scalar fields. In particular, we find a closed expression for the degree distribution of these graphs associated to uncorrelated random fields of generic dimension. This result holds independently of the field's marginal distribution and it directly yields a statistical randomness test, applicable in any dimension. We showcase its usefulness by discriminating spatial snapshots of two-dimensional white noise from snapshots of a two-dimensional lattice of diffusively coupled chaotic maps, a system that generates high dimensional spatiotemporal chaos. The range of potential applications of this combinatorial framework includes image processing in engineering, the description of surface growth in material science, soft matter or medicine, and the characterization of potential energy surfaces in chemistry, disordered systems, and high energy physics. An illustration on the applicability of this method for the classification of the different stages involved in carcinogenesis is briefly discussed.
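As a one-dimensional, hedged illustration of the randomness test that the work above generalizes to d-dimensional fields: the horizontal visibility graph of an i.i.d. series has the closed-form degree distribution P(k) = (1/3)(2/3)^(k-2), which the sketch below reproduces empirically. The series length is arbitrary and the O(n²) construction is a plain reference implementation, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.random(5000)                       # i.i.d. uniform series (illustrative)

def hvg_degrees(series):
    """Node degrees of the horizontal visibility graph (simple O(n^2) reference version)."""
    n = len(series)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        top = -np.inf                      # running max of values strictly between i and j
        for j in range(i + 1, n):
            if series[i] > top and series[j] > top:
                deg[i] += 1                # i and j see each other horizontally
                deg[j] += 1
            top = max(top, series[j])
            if top >= series[i]:
                break                      # no node beyond j can see i
    return deg

deg = hvg_degrees(x)
for k in range(2, 7):
    emp = (deg == k).mean()
    theo = (1.0 / 3.0) * (2.0 / 3.0) ** (k - 2)
    print(f"k={k}: empirical {emp:.3f}  vs  i.i.d. theory {theo:.3f}")
```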
Alexanderian, Alen; Zhu, Liang; Salloum, Maher; Ma, Ronghui; Yu, Meilin
2017-09-01
In this study, statistical models are developed for modeling uncertain heterogeneous permeability and porosity in tumors, and the resulting uncertainties in pressure and velocity fields during an intratumoral injection are quantified using a nonintrusive spectral uncertainty quantification (UQ) method. Specifically, the uncertain permeability is modeled as a log-Gaussian random field, represented using a truncated Karhunen-Loève (KL) expansion, and the uncertain porosity is modeled as a log-normal random variable. The efficacy of the developed statistical models is validated by simulating the concentration fields with permeability and porosity of different uncertainty levels. The irregularity in the concentration field bears reasonable visual agreement with that in MicroCT images from experiments. The pressure and velocity fields are represented using polynomial chaos (PC) expansions to enable efficient computation of their statistical properties. The coefficients in the PC expansion are computed using a nonintrusive spectral projection method with the Smolyak sparse quadrature. The developed UQ approach is then used to quantify the uncertainties in the random pressure and velocity fields. A global sensitivity analysis is also performed to assess the contribution of individual KL modes of the log-permeability field to the total variance of the pressure field. It is demonstrated that the developed UQ approach can effectively quantify the flow uncertainties induced by uncertain material properties of the tumor.
NASA Astrophysics Data System (ADS)
Liu, Zhangjun; Liu, Zenghui
2018-06-01
This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. Based on this, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured by just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach can be used for dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies of the probability density evolution analysis of the wind-excited structure have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
Semantic segmentation of 3D textured meshes for urban scene analysis
NASA Astrophysics Data System (ADS)
Rouhani, Mohammad; Lafarge, Florent; Alliez, Pierre
2017-01-01
Classifying 3D measurement data has become a core problem in photogrammetry and 3D computer vision, since the rise of modern multiview geometry techniques, combined with affordable range sensors. We introduce a Markov Random Field-based approach for segmenting textured meshes generated via multi-view stereo into urban classes of interest. The input mesh is first partitioned into small clusters, referred to as superfacets, from which geometric and photometric features are computed. A random forest is then trained to predict the class of each superfacet as well as its similarity with the neighboring superfacets. Similarity is used to assign the weights of the Markov Random Field pairwise-potential and to account for contextual information between the classes. The experimental results illustrate the efficacy and accuracy of the proposed framework.
Resonant spin tunneling in randomly oriented nanospheres of Mn12 acetate
Lendínez, S.; Zarzuela, R.; Tejada, J.; ...
2015-01-06
We report measurements and theoretical analysis of resonant spin tunneling in randomly oriented nanospheres of a molecular magnet. Amorphous nanospheres of Mn₁₂ acetate have been fabricated and characterized by chemical, infrared, TEM, X-ray, and magnetic methods. Magnetic measurements have revealed sharp tunneling peaks in the field derivative of the magnetization that occur at the typical resonant field values for the Mn₁₂ acetate crystal in the field parallel to the easy axis. Theoretical analysis is provided that explains these observations. We argue that resonant spin tunneling in a molecular magnet can be established in a powder sample, without the need for a single crystal and without aligning the easy magnetization axes of the molecules. This is confirmed by re-analyzing the old data on a powdered sample of non-oriented micron-size crystals of Mn₁₂ acetate. In conclusion, our findings can greatly simplify the selection of candidates for quantum spin tunneling among newly synthesized molecular magnets.
Clustering, randomness, and regularity in cloud fields: 2. Cumulus cloud fields
NASA Astrophysics Data System (ADS)
Zhu, T.; Lee, J.; Weger, R. C.; Welch, R. M.
1992-12-01
During the last decade a major controversy has been brewing concerning the proper characterization of cumulus convection. The prevailing view has been that cumulus clouds form in clusters, in which cloud spacing is closer than that found for the overall cloud field and which maintains its identity over many cloud lifetimes. This "mutual protection hypothesis" of Randall and Huffman (1980) has been challenged by the "inhibition hypothesis" of Ramirez et al. (1990) which strongly suggests that the spatial distribution of cumuli must tend toward a regular distribution. A dilemma has resulted because observations have been reported to support both hypotheses. The present work reports a detailed analysis of cumulus cloud field spatial distributions based upon Landsat, Advanced Very High Resolution Radiometer, and Skylab data. Both nearest-neighbor and point-to-cloud cumulative distribution function statistics are investigated. The results show unequivocally that when both large and small clouds are included in the cloud field distribution, the cloud field always has a strong clustering signal. The strength of clustering is largest at cloud diameters of about 200-300 m, diminishing with increasing cloud diameter. In many cases, clusters of small clouds are found which are not closely associated with large clouds. As the small clouds are eliminated from consideration, the cloud field typically tends towards regularity. Thus it would appear that the "inhibition hypothesis" of Ramirez and Bras (1990) has been verified for the large clouds. However, these results are based upon the analysis of point processes. A more exact analysis also is made which takes into account the cloud size distributions. Since distinct clouds are by definition nonoverlapping, cloud size effects place a restriction upon the possible locations of clouds in the cloud field. The net effect of this analysis is that the large clouds appear to be randomly distributed, with only weak tendencies towards regularity. For clouds less than 1 km in diameter, the average nearest-neighbor distance is equal to 3-7 cloud diameters. For larger clouds, the ratio of cloud nearest-neighbor distance to cloud diameter increases sharply with increasing cloud diameter. This demonstrates that large clouds inhibit the growth of other large clouds in their vicinity. Nevertheless, this leads to random distributions of large clouds, not regularity.
NASA Astrophysics Data System (ADS)
Ivliev, S. V.
2017-12-01
For the calculation of short laser pulse absorption in metals, the imaginary part of the permittivity, which is simply related to the conductivity, is required. Currently, the Kubo-Greenwood formula is most commonly used to find the static and dynamic conductivity; it describes electromagnetic energy absorption in the one-electron approach. In the present study, this formula is derived directly from the expression for the permittivity in the random phase approximation, which is in fact equivalent to the mean-field method. A detailed analysis of the role of electron-electron interaction in the calculation of the matrix elements of the velocity operator is given. It is shown that in the one-electron random phase approximation the single-particle wave functions of conduction electrons in the field of fixed ions should be used. The possibility of accounting for exchange and correlation effects by means of a local-field correction is discussed.
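For reference, one common statement of the Kubo-Greenwood expression for the real part of the frequency-dependent conductivity, quoted from the general literature rather than from this paper (prefactor and matrix-element conventions vary between references), is

$$ \sigma_1(\omega) \;=\; \frac{2\pi e^2}{3\, m_e^2\, \omega\, \Omega} \sum_{i,j} \left(f_i - f_j\right)\, \bigl|\langle \psi_j \,|\, \hat{\mathbf{p}} \,|\, \psi_i \rangle\bigr|^2\, \delta\!\left(E_j - E_i - \hbar\omega\right), $$

where Ω is the cell volume, f_i are occupation numbers, and the one-electron states ψ_i are, per the abstract, to be computed in the field of fixed ions.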
NASA Astrophysics Data System (ADS)
Wang, Tao; Zhou, Guoqing; Wang, Jianzhou; Zhou, Lei
2018-03-01
The artificial ground freezing (AGF) method is widely used in civil and mining engineering, and the thermal regime of the frozen soil around the freezing pipe affects the safety of design and construction. The thermal parameters can be truly random due to the heterogeneity of soil properties, which leads to randomness in the thermal regime of the frozen soil around the freezing pipe. The purpose of this paper is to study the one-dimensional (1D) random thermal regime problem on the basis of a stochastic analysis model and the Monte Carlo (MC) method. Modeling the uncertain thermal parameters of frozen soil as random variables, stochastic processes and random fields, the corresponding stochastic thermal regimes of the frozen soil around a single freezing pipe are obtained and analyzed. By varying each stochastic parameter individually, the influence of each stochastic thermal parameter on the stochastic thermal regime is investigated. The results show that the mean temperatures of the frozen soil around the single freezing pipe are the same for the three modeling approaches, while the standard deviations differ. The standard deviation distributions differ considerably at different radial locations, and the largest standard deviations occur mainly in the phase-change region. The results computed with the random variable and stochastic process models differ substantially from the measured data, whereas the results computed with the random field model agree well with the measurements. Each uncertain thermal parameter has a different effect on the standard deviation of the frozen soil temperature around the single freezing pipe. These results can provide a theoretical basis for the design and construction of AGF.
NASA Astrophysics Data System (ADS)
Raupov, Dmitry S.; Myakinin, Oleg O.; Bratchenko, Ivan A.; Zakharov, Valery P.; Khramov, Alexander G.
2016-10-01
In this paper, we examine the validity of OCT for identifying tissue changes using a skin cancer texture analysis based on Haralick texture features, fractal dimension, the Markov random field method and complex directional features computed from different tissues. These features have been used to detect specific spatial characteristics that can differentiate healthy tissue from diverse skin cancers in cross-sectional OCT images (B- and/or C-scans). In this work, we used an interval type-II fuzzy anisotropic diffusion algorithm for speckle noise reduction in OCT images. The Haralick texture features contrast, correlation, energy, and homogeneity have been calculated in various directions. A box-counting method is used to evaluate the fractal dimension of skin probes. A Markov random field has been used to improve the quality of the classification. Additionally, we used the complex directional field, calculated with a local gradient methodology, to increase the assessment quality of the diagnostic method. Our results demonstrate that these texture features may provide helpful information to discriminate tumor from healthy tissue. The experimental data set contains 488 OCT images of normal skin and tumors, namely Basal Cell Carcinoma (BCC), Malignant Melanoma (MM) and Nevus. All images were acquired with our laboratory SD-OCT setup based on a broadband light source delivering an output power of 20 mW at a central wavelength of 840 nm with a bandwidth of 25 nm. We obtained a sensitivity of about 97% and a specificity of about 73% for the task of discriminating between MM and Nevus.
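As a rough illustration of one of the listed features, the sketch below (not the authors' pipeline; the box sizes and the placeholder binary image are assumptions) estimates a box-counting fractal dimension of a segmented image such as a thresholded OCT B-scan.

```python
# A minimal box-counting fractal dimension sketch (illustrative assumptions only).
import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate fractal dimension as the slope of log N(s) versus log(1/s)."""
    counts = []
    h, w = mask.shape
    for s in box_sizes:
        n = 0
        for i in range(0, h, s):
            for j in range(0, w, s):
                # Count boxes of side s containing at least one foreground pixel
                if mask[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(0)
demo = rng.random((128, 128)) > 0.7        # placeholder binary image (assumed)
print(box_counting_dimension(demo))
```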
Zero-field random-field effect in diluted triangular lattice antiferromagnet CuFe1-xAlxO2
NASA Astrophysics Data System (ADS)
Nakajima, T.; Mitsuda, S.; Kitagawa, K.; Terada, N.; Komiya, T.; Noda, Y.
2007-04-01
We performed neutron scattering experiments on a diluted triangular lattice antiferromagnet (TLA), CuFe1-xAlxO2 with x = 0.10. The detailed analysis of the scattering profiles revealed that the scattering function of magnetic reflection is described as the sum of a Lorentzian term and a Lorentzian-squared term with anisotropic width. The Lorentzian-squared term dominating at low temperature is indicative of the domain state in the prototypical random-field Ising model. Taking account of the sinusoidally amplitude-modulated magnetic structure with incommensurate wavenumber in CuFe1-xAlxO2 with x = 0.10, we conclude that the effective random field arises even at zero field, owing to the combination of site-random magnetic vacancies and the sinusoidal structure that is regarded as a partially disordered (PD) structure in a wide sense, as reported in the typical three-sublattice PD phase of a diluted Ising TLA, CsCo0.83Mg0.17Br3 (van Duijn et al 2004 Phys. Rev. Lett. 92 077202). While the previous study revealed the existence of a domain state in CsCo0.83Mg0.17Br3 by detecting magnetic reflections specific to the spin configuration near the domain walls, our present study revealed the existence of a domain state in CuFe1-xAlxO2 (x = 0.10) by determination of the functional form of the scattering function.
Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barajas-Solano, David A.; Tartakovsky, Alexandre M.
We present a cumulative density function (CDF) method for the probabilistic analysis of $d$-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty on the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a $(d + 1)$-dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.
Clustering, randomness, and regularity in cloud fields. 4. Stratocumulus cloud fields
NASA Astrophysics Data System (ADS)
Lee, J.; Chou, J.; Weger, R. C.; Welch, R. M.
1994-07-01
To complete the analysis of the spatial distribution of boundary layer cloudiness, the present study focuses on nine stratocumulus Landsat scenes. The results indicate many similarities between stratocumulus and cumulus spatial distributions. Most notably, at full spatial resolution all scenes exhibit a decidedly clustered distribution. The strength of the clustering signal decreases with increasing cloud size; the clusters themselves consist of a few clouds (less than 10), occupy a small percentage of the cloud field area (less than 5%), contain between 20% and 60% of the cloud field population, and are randomly located within the scene. In contrast, stratocumulus in almost every respect are more strongly clustered than are cumulus cloud fields. For instance, stratocumulus clusters contain more clouds per cluster, occupy a larger percentage of the total area, and have a larger percentage of clouds participating in clusters than the corresponding cumulus examples. To investigate clustering at intermediate spatial scales, the local dimensionality statistic is introduced. Results obtained from this statistic provide the first direct evidence for regularity among large (>900 m in diameter) clouds in stratocumulus and cumulus cloud fields, in support of the inhibition hypothesis of Ramirez and Bras (1990). Also, the size compensated point-to-cloud cumulative distribution function statistic is found to be necessary to obtain a consistent description of stratocumulus cloud distributions. A hypothesis regarding the underlying physical mechanisms responsible for cloud clustering is presented. It is suggested that cloud clusters often arise from 4 to 10 triggering events localized within regions less than 2 km in diameter and randomly distributed within the cloud field. As the size of the cloud surpasses the scale of the triggering region, the clustering signal weakens and the larger cloud locations become more random.
Load Balancing in Stochastic Networks: Algorithms, Analysis, and Game Theory
2014-04-16
The classic randomized load balancing model is the so-called supermarket model, which describes a system in which customers arrive to a service center with n parallel servers. Keywords: mean-field limits, supermarket model, thresholds, game, randomized load balancing.
Assessing the significance of pedobarographic signals using random field theory.
Pataky, Todd C
2008-08-07
Traditional pedobarographic statistical analyses are conducted over discrete regions. Recent studies have demonstrated that regionalization can corrupt pedobarographic field data through conflation when arbitrary dividing lines inappropriately delineate smooth field processes. An alternative is to register images such that homologous structures optimally overlap and then conduct statistical tests at each pixel to generate statistical parametric maps (SPMs). The significance of SPM processes may be assessed within the framework of random field theory (RFT). RFT is ideally suited to pedobarographic image analysis because its fundamental data unit is a lattice sampling of a smooth and continuous spatial field. To correct for the vast number of multiple comparisons inherent in such data, recent pedobarographic studies have employed a Bonferroni correction to retain a constant family-wise error rate. This approach unfortunately neglects the spatial correlation of neighbouring pixels, so provides an overly conservative (albeit valid) statistical threshold. RFT generally relaxes the threshold depending on field smoothness and on the geometry of the search area, but it also provides a framework for assigning p values to suprathreshold clusters based on their spatial extent. The current paper provides an overview of basic RFT concepts and uses simulated and experimental data to validate both RFT-relevant field smoothness estimations and RFT predictions regarding the topological characteristics of random pedobarographic fields. Finally, previously published experimental data are re-analysed using RFT inference procedures to demonstrate how RFT yields easily understandable statistical results that may be incorporated into routine clinical and laboratory analyses.
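For orientation, one commonly quoted RFT result, stated here from the general neuroimaging/RFT literature rather than from this paper, approximates the expected Euler characteristic of the excursion set of a smooth two-dimensional Gaussian field above a threshold u by its leading resel-based term,

$$ \mathbb{E}\bigl[\chi(A_u)\bigr] \;\approx\; R_2\, \frac{4\ln 2}{(2\pi)^{3/2}}\, u\, e^{-u^2/2}, \qquad R_2 = \frac{S}{\mathrm{FWHM}_x\, \mathrm{FWHM}_y}, $$

so the corrected threshold depends on the search area S and the estimated field smoothness rather than on the raw pixel count used by a Bonferroni correction.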
The Effects of ABRACADABRA on Reading Outcomes: A Meta-Analysis of Applied Field Research
ERIC Educational Resources Information Center
Abrami, Philip; Borohkovski, Eugene; Lysenko, Larysa
2015-01-01
This meta-analysis summarizes research on the effects of a comprehensive, interactive web-based software (AXXX) on the development of reading competencies among kindergarteners and elementary students. Findings from seven randomized control trials and quasi-experimental studies undertaken in a variety of contexts across Canada, Australia and Kenya…
NASA Astrophysics Data System (ADS)
Bakhtiar, Nurizatul Syarfinas Ahmad; Abdullah, Farah Aini; Hasan, Yahya Abu
2017-08-01
In this paper, we consider the dynamical behaviour of a random field imposed on the pulsating and snaking solitons in a dissipative system described by the one-dimensional cubic-quintic complex Ginzburg-Landau equation (cqCGLE). The dynamical behaviour was simulated by adding a random field to the initial pulse. We then solve the equation numerically, fixing the initial amplitude profile for the pulsating and snaking solitons without losing any generality. To create the random field, we choose 0 ≤ ɛ ≤ 1.0. As a result, multiple soliton trains are formed when the random field is applied to a pulse-like initial profile for the parameters of the pulsating and snaking solitons. The results also show the effects of varying the random field on the transient energy peaks of the pulsating and snaking solitons.
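The construction of the perturbed initial condition can be sketched as follows; the sech-shaped profile, grid and random-field strength are assumptions for illustration and not the paper's exact setup.

```python
# A minimal sketch: adding a random field of strength eps to a fixed initial pulse.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-20.0, 20.0, 512)            # spatial grid (assumed)
eps = 0.5                                    # random-field strength, 0 <= eps <= 1 (assumed)
A0 = 1.0 / np.cosh(x)                        # fixed initial amplitude profile (assumed shape)
A_perturbed = A0 + eps * rng.random(x.size)  # random field added to the initial pulse
```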
Statistical analysis of vibration in tyres
NASA Astrophysics Data System (ADS)
Le Bot, Alain; Bazari, Zakia; Klein, Philippe; Lelong, Joël
2017-03-01
The vibration in tyres submitted to random forces in the contact zone is investigated with the model of prestressed orthotropic plate on visco-elastic foundation. It is shown that beyond a cut-on frequency a single wave propagates whose speed is directional-dependent. A systematic numerical exploration of the governing equation solutions shows that three regimes may exist in such plates. These are modal field, diffuse field and free field. For actual tyres which present a high level of damping, the passage from low to high frequencies generally explores the modal and free field regimes but not the diffuse field regime.
Probabilistic Structures Analysis Methods (PSAM) for select space propulsion system components
NASA Technical Reports Server (NTRS)
1991-01-01
The basic formulation for probabilistic finite element analysis is described and demonstrated on a few sample problems. This formulation is based on iterative perturbation that uses the factorized stiffness of the unperturbed system as the iteration preconditioner for obtaining the solution to the perturbed problem. This approach eliminates the need to compute, store and manipulate explicit partial derivatives of the element matrices and force vector, which not only reduces memory usage considerably, but also greatly simplifies the coding and validation tasks. All aspects of the proposed formulation were combined in a demonstration problem using a simplified model of a curved turbine blade discretized with 48 shell elements, and having random pressure and temperature fields with partial correlation, random uniform thickness, and random stiffness at the root.
Kouritzin, Michael A; Newton, Fraser; Wu, Biao
2013-04-01
Herein, we propose generating CAPTCHAs through random field simulation and give a novel, effective and efficient algorithm to do so. Indeed, we demonstrate that sufficient information about word tests for easy human recognition is contained in the site marginal probabilities and the site-to-nearby-site covariances and that these quantities can be embedded directly into certain conditional probabilities, designed for effective simulation. The CAPTCHAs are then partial random realizations of the random CAPTCHA word. We start with an initial random field (e.g., randomly scattered letter pieces) and use Gibbs resampling to re-simulate portions of the field repeatedly using these conditional probabilities until the word becomes human-readable. The residual randomness from the initial random field together with the random implementation of the CAPTCHA word provide significant resistance to attack. This results in a CAPTCHA, which is unrecognizable to modern optical character recognition but is recognized about 95% of the time in a human readability study.
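The core operation described above, repeatedly re-simulating sites of a random field from conditional probabilities given their neighbours, can be illustrated with a generic Gibbs sweep over a binary Markov random field; this is an illustrative Ising-type example, not the authors' CAPTCHA scheme, and the coupling strength and field below are assumptions.

```python
# A minimal Gibbs-resampling sketch for a binary Markov random field.
import numpy as np

rng = np.random.default_rng(1)
n, beta, h = 64, 0.8, 0.0                  # lattice size, coupling, external field (assumed)
spins = rng.choice([-1, 1], size=(n, n))   # initial random field

def gibbs_sweep(s):
    for i in range(n):
        for j in range(n):
            nb = s[(i - 1) % n, j] + s[(i + 1) % n, j] + s[i, (j - 1) % n] + s[i, (j + 1) % n]
            # Conditional probability of s[i, j] = +1 given its four neighbours
            p_up = 1.0 / (1.0 + np.exp(-2.0 * (beta * nb + h)))
            s[i, j] = 1 if rng.random() < p_up else -1
    return s

for _ in range(20):                        # a few sweeps until spatial structure emerges
    spins = gibbs_sweep(spins)
```

In the CAPTCHA setting the conditional probabilities would instead encode the word's site marginals and near-site covariances, and only portions of the field would be resampled.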
Karyotype analysis of anuran trypanosomes by pulsed-field gradient gel electrophoresis.
Lun, Z R; Desser, S S
1995-12-01
The chromosomes of 12 species and isolates of anuran trypanosomes were investigated by pulsed-field gradient gel electrophoresis. Twelve to 16 chromosomes ranging from 0.45 to 2.2 megabase pairs were found in each of these trypanosomes. Minichromosomes were not observed in any of the examined isolates. Results indicate that different species of anuran trypanosomes display distinct karyotype patterns, and that isolates from the same region are similar. Our findings also reveal that most chromosome profiles of these trypanosomes are in accordance with isoenzyme and random amplified polymorphic DNA analysis data.
Large-scale structure of randomly jammed spheres
NASA Astrophysics Data System (ADS)
Ikeda, Atsushi; Berthier, Ludovic; Parisi, Giorgio
2017-05-01
We numerically analyze the density field of three-dimensional randomly jammed packings of monodisperse soft frictionless spherical particles, paying special attention to fluctuations occurring at large length scales. We study in detail the two-point static structure factor at low wave vectors in Fourier space. We also analyze the nature of the density field in real space by studying the large-distance behavior of the two-point pair correlation function, of density fluctuations in subsystems of increasing sizes, and of the direct correlation function. We show that such real space analysis can be greatly improved by introducing a coarse-grained density field to disentangle genuine large-scale correlations from purely local effects. Our results confirm that both Fourier and real space signatures of vanishing density fluctuations at large scale are absent, indicating that randomly jammed packings are not hyperuniform. In addition, we establish that the pair correlation function displays a surprisingly complex structure at large distances, which is however not compatible with the long-range negative correlation of hyperuniform systems but fully compatible with an analytic form for the structure factor. This implies that the direct correlation function is short ranged, as we also demonstrate directly. Our results reveal that density fluctuations in jammed packings do not follow the behavior expected for random hyperuniform materials, but display instead a more complex behavior.
Kernel-Correlated Levy Field Driven Forward Rate and Application to Derivative Pricing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bo Lijun; Wang Yongjin; Yang Xuewei, E-mail: xwyangnk@yahoo.com.cn
2013-08-01
We propose a term structure of forward rates driven by a kernel-correlated Levy random field under the HJM framework. The kernel-correlated Levy random field is composed of a kernel-correlated Gaussian random field and a centered Poisson random measure. We shall give a criterion to preclude arbitrage under the risk-neutral pricing measure. As applications, an interest rate derivative with general payoff functional is priced under this pricing measure.
NASA Astrophysics Data System (ADS)
Hong, Liang
2013-10-01
The availability of high spatial resolution remote sensing data provides new opportunities for urban land-cover classification. More geometric detail can be observed in high resolution remote sensing images, ground objects display rich texture, structure, shape and hierarchical semantic characteristics, and more landscape elements are represented by small groups of pixels. In recent years, object-based remote sensing analysis methodology has been widely accepted and applied in high resolution remote sensing image processing. A classification method based on geo-ontology and conditional random fields is presented in this paper. The proposed method consists of four blocks: (1) a hierarchical ground-object semantic framework is constructed based on geo-ontology; (2) image objects are generated by mean-shift segmentation, which yields boundary-preserving and spectrally homogeneous over-segmented regions; (3) the relations between the hierarchical ground-object semantics and the over-segmented regions are defined within a conditional random field framework; (4) hierarchical classification results are obtained based on the geo-ontology and conditional random fields. Finally, high-resolution remotely sensed image data (GeoEye) are used to test the performance of the presented method. The experimental results show the superiority of this method over the eCognition method in both effectiveness and accuracy, which implies that it is suitable for the classification of high resolution remote sensing images.
Is the Non-Dipole Magnetic Field Random?
NASA Technical Reports Server (NTRS)
Walker, Andrew D.; Backus, George E.
1996-01-01
Statistical modelling of the Earth's magnetic field B has a long history. In particular, the spherical harmonic coefficients of scalar fields derived from B can be treated as Gaussian random variables. In this paper, we give examples of highly organized fields whose spherical harmonic coefficients pass tests for independent Gaussian random variables. The fact that coefficients at some depth may be usefully summarized as independent samples from a normal distribution need not imply that there really is some physical, random process at that depth. In fact, the field can be extremely structured and still be regarded for some purposes as random. In this paper, we examined the radial magnetic field B(sub r) produced by the core, but the results apply to any scalar field on the core-mantle boundary (CMB) which determines B outside the CMB.
Adaptive Kalman filtering for real-time mapping of the visual field
Ward, B. Douglas; Janik, John; Mazaheri, Yousef; Ma, Yan; DeYoe, Edgar A.
2013-01-01
This paper demonstrates the feasibility of real-time mapping of the visual field for clinical applications. Specifically, three aspects of this problem were considered: (1) experimental design, (2) statistical analysis, and (3) display of results. Proper experimental design is essential to achieving a successful outcome, particularly for real-time applications. A random-block experimental design was shown to have less sensitivity to measurement noise, as well as greater robustness to error in modeling of the hemodynamic impulse response function (IRF) and greater flexibility than common alternatives. In addition, random encoding of the visual field allows for the detection of voxels that are responsive to multiple, not necessarily contiguous, regions of the visual field. Due to its recursive nature, the Kalman filter is ideally suited for real-time statistical analysis of visual field mapping data. An important feature of the Kalman filter is that it can be used for nonstationary time series analysis. The capability of the Kalman filter to adapt, in real time, to abrupt changes in the baseline arising from subject motion inside the scanner and other external system disturbances is important for the success of clinical applications. The clinician needs real-time information to evaluate the success or failure of the imaging run and to decide whether to extend, modify, or terminate the run. Accordingly, the analytical software provides real-time displays of (1) brain activation maps for each stimulus segment, (2) voxel-wise spatial tuning profiles, (3) time plots of the variability of response parameters, and (4) time plots of activated volume. PMID:22100663
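The recursive structure that makes the Kalman filter suitable for real-time analysis can be illustrated with a scalar example; this is a generic sketch under assumed noise variances, not the authors' fMRI estimator, and the abrupt baseline shift mimics the kind of disturbance the abstract describes.

```python
# A minimal scalar Kalman filter sketch (illustrative assumptions only).
import numpy as np

def kalman_1d(measurements, q=1e-4, r=1e-2, x0=0.0, p0=1.0):
    """Recursively track a slowly varying level from noisy measurements."""
    x, p, estimates = x0, p0, []
    for z in measurements:
        p = p + q                       # predict: state assumed constant, variance grows by q
        k = p / (p + r)                 # Kalman gain
        x = x + k * (z - x)             # update with the innovation z - x
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
truth = np.concatenate([np.zeros(100), np.ones(100)])      # abrupt baseline shift (assumed)
z = truth + 0.1 * rng.standard_normal(truth.size)
xhat = kalman_1d(z)                                        # adapts to the shift sample by sample
```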
Random waves in the brain: Symmetries and defect generation in the visual cortex
NASA Astrophysics Data System (ADS)
Schnabel, M.; Kaschube, M.; Löwel, S.; Wolf, F.
2007-06-01
How orientation maps in the visual cortex of the brain develop is a matter of long-standing debate. Experimental and theoretical evidence suggests that their development represents an activity-dependent self-organization process. Theoretical analysis [1] exploring this hypothesis predicted that maps at an early developmental stage are realizations of Gaussian random fields exhibiting a rigorous lower bound for their densities of topological defects, called pinwheels. As a consequence, lower pinwheel densities, if observed in adult animals, are predicted to develop through the motion and annihilation of pinwheel pairs. Despite being valid for a large class of developmental models, this result depends on the symmetries of the models and thus of the predicted random field ensembles. In [1], invariance of the orientation map's statistical properties under independent space rotations and orientation shifts was assumed. However, full rotation symmetry appears to be broken by interactions of cortical neurons, e.g. selective couplings between groups of neurons with collinear orientation preferences [2]. A recently proposed new symmetry, called shift-twist symmetry [3], stating that spatial rotations have to occur together with orientation shifts in order to be an appropriate symmetry transformation, is more consistent with this organization. Here we generalize our random field approach to this important symmetry class. We propose a new class of shift-twist symmetric Gaussian random fields and derive the general correlation functions of this ensemble. It turns out that despite strong effects of the shift-twist symmetry on the structure of the correlation functions and on the map layout, the lower bound on the pinwheel densities remains unaffected, predicting pinwheel annihilation in systems with low pinwheel densities.
Influence of Averaging Preprocessing on Image Analysis with a Markov Random Field Model
NASA Astrophysics Data System (ADS)
Sakamoto, Hirotaka; Nakanishi-Ohno, Yoshinori; Okada, Masato
2018-02-01
This paper describes our investigations into the influence of averaging preprocessing on the performance of image analysis. Averaging preprocessing involves a trade-off: image averaging is often undertaken to reduce noise while the number of image data available for image analysis is decreased. We formulated a process of generating image data by using a Markov random field (MRF) model to achieve image analysis tasks such as image restoration and hyper-parameter estimation by a Bayesian approach. According to the notions of Bayesian inference, posterior distributions were analyzed to evaluate the influence of averaging. There are three main results. First, we found that the performance of image restoration with a predetermined value for hyper-parameters is invariant regardless of whether averaging is conducted. We then found that the performance of hyper-parameter estimation deteriorates due to averaging. Our analysis of the negative logarithm of the posterior probability, which is called the free energy based on an analogy with statistical mechanics, indicated that the confidence of hyper-parameter estimation remains higher without averaging. Finally, we found that when the hyper-parameters are estimated from the data, the performance of image restoration worsens as averaging is undertaken. We conclude that averaging adversely influences the performance of image analysis through hyper-parameter estimation.
Tynkkynen, Soile; Satokari, Reetta; Saarela, Maria; Mattila-Sandholm, Tiina; Saxelin, Maija
1999-01-01
A total of 24 strains, biochemically identified as members of the Lactobacillus casei group, were identified by PCR with species-specific primers. The same set of strains was typed by randomly amplified polymorphic DNA (RAPD) analysis, ribotyping, and pulsed-field gel electrophoresis (PFGE) in order to compare the discriminatory power of the methods. Species-specific primers for L. rhamnosus and L. casei identified the type strain L. rhamnosus ATCC 7469 and the neotype strain L. casei ATCC 334, respectively, but did not give any signal with the recently revived species L. zeae, which contains the type strain ATCC 15820 and the strain ATCC 393, which was previously classified as L. casei. Our results are in accordance with the suggested new classification of the L. casei group. Altogether, 21 of the 24 strains studied were identified with the species-specific primers. In strain typing, PFGE was the most discriminatory method, revealing 17 genotypes for the 24 strains studied. Ribotyping and RAPD analysis yielded 15 and 12 genotypes, respectively. PMID:10473394
The deterministic chaos and random noise in turbulent jet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yao, Tian-Liang; Shanghai Institute of Space Propulsion, Shanghai 201112; Shanghai Engineering Research Center of Space Engine, Shanghai Institute of Space Propulsion, Shanghai 201112
2014-06-01
A turbulent flow is usually treated as a superposition of coherent structure and incoherent turbulence. In this paper, the largest Lyapunov exponent and the random noise in the near field of round jet and plane jet are estimated with our previously proposed method of chaotic time series analysis [T. L. Yao, et al., Chaos 22, 033102 (2012)]. The results show that the largest Lyapunov exponents of the round jet and plane jet are in direct proportion to the reciprocal of the integral time scale of turbulence, which is in accordance with the results of the dimensional analysis, and the proportionality coefficients are equal. In addition, the random noise of the round jet and plane jet has the same linear relation with the Kolmogorov velocity scale of turbulence. As a result, the random noise may well be from the incoherent disturbance in turbulence, and the coherent structure in turbulence may well follow the rule of chaotic motion.
Taillibert, Sophie; Kanner, Andrew; Read, William; Steinberg, David M.; Lhermitte, Benoit; Toms, Steven; Idbaih, Ahmed; Ahluwalia, Manmeet S.; Fink, Karen; Di Meco, Francesco; Lieberman, Frank; Zhu, Jay-Jiguang; Stragliotto, Giuseppe; Tran, David D.; Brem, Steven; Hottinger, Andreas F.; Kirson, Eilon D.; Lavy-Shahaf, Gitit; Weinberg, Uri; Kim, Chae-Yong; Paek, Sun-Ha; Nicholas, Garth; Burna, Jordi; Hirte, Hal; Weller, Michael; Palti, Yoram; Hegi, Monika E.; Ram, Zvi
2017-01-01
Importance Tumor-treating fields (TTFields) is an antimitotic treatment modality that interferes with glioblastoma cell division and organelle assembly by delivering low-intensity alternating electric fields to the tumor. Objective To investigate whether TTFields improves progression-free and overall survival of patients with glioblastoma, a fatal disease that commonly recurs at the initial tumor site or in the central nervous system. Design, Setting, and Participants In this randomized, open-label trial, 695 patients with glioblastoma whose tumor was resected or biopsied and had completed concomitant radiochemotherapy (median time from diagnosis to randomization, 3.8 months) were enrolled at 83 centers (July 2009-2014) and followed up through December 2016. A preliminary report from this trial was published in 2015; this report describes the final analysis. Interventions Patients were randomized 2:1 to TTFields plus maintenance temozolomide chemotherapy (n = 466) or temozolomide alone (n = 229). The TTFields, consisting of low-intensity, 200 kHz frequency, alternating electric fields, was delivered (≥ 18 hours/d) via 4 transducer arrays on the shaved scalp and connected to a portable device. Temozolomide was administered to both groups (150-200 mg/m2) for 5 days per 28-day cycle (6-12 cycles). Main Outcomes and Measures Progression-free survival (tested at α = .046). The secondary end point was overall survival (tested hierarchically at α = .048). Analyses were performed for the intent-to-treat population. Adverse events were compared by group. Results Of the 695 randomized patients (median age, 56 years; IQR, 48-63; 473 men [68%]), 637 (92%) completed the trial. Median progression-free survival from randomization was 6.7 months in the TTFields-temozolomide group and 4.0 months in the temozolomide-alone group (HR, 0.63; 95% CI, 0.52-0.76; P < .001). Median overall survival was 20.9 months in the TTFields-temozolomide group vs 16.0 months in the temozolomide-alone group (HR, 0.63; 95% CI, 0.53-0.76; P < .001). Systemic adverse event frequency was 48% in the TTFields-temozolomide group and 44% in the temozolomide-alone group. Mild to moderate skin toxicity underneath the transducer arrays occurred in 52% of patients who received TTFields-temozolomide vs no patients who received temozolomide alone. Conclusions and Relevance In the final analysis of this randomized clinical trial of patients with glioblastoma who had received standard radiochemotherapy, the addition of TTFields to maintenance temozolomide chemotherapy vs maintenance temozolomide alone, resulted in statistically significant improvement in progression-free survival and overall survival. These results are consistent with the previous interim analysis. Trial Registration clinicaltrials.gov Identifier: NCT00916409 PMID:29260225
Random analysis of bearing capacity of square footing using the LAS procedure
NASA Astrophysics Data System (ADS)
Kawa, Marek; Puła, Wojciech; Suska, Michał
2016-09-01
In the present paper, a three-dimensional problem of the bearing capacity of a square footing on a random soil medium is analyzed. The random fields of the strength parameters c and φ are generated using the LAS procedure (Local Average Subdivision, Fenton and Vanmarcke 1990). The procedure has been re-implemented by the authors in the Mathematica environment in order to combine it with a commercial program. Since the procedure is still being tested, the random field has been assumed to be one-dimensional: the strength properties of the soil are random in the vertical direction only. Individual realizations of the bearing-capacity boundary problem, with the strength parameters of the medium defined by the above procedure, are solved using the FLAC3D software. The analysis is performed for two qualitatively different cases, namely purely cohesive and cohesive-frictional soils. For the latter case the friction angle and cohesion have been assumed to be independent random variables. For these two cases the random bearing capacity of the square footing has been obtained for fluctuation scales ranging from 0.5 m to 10 m. Each time 1000 Monte Carlo realizations have been performed. The obtained results allow not only the mean and variance but also the probability density function to be estimated. An example of the application of this function to reliability calculations is presented in the final part of the paper.
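The overall Monte Carlo workflow can be sketched as follows. This is illustrative only: the paper generates fields with LAS and solves each realization in FLAC3D, whereas the sketch uses a simple Cholesky-based correlated lognormal cohesion field and replaces the finite-difference solver with the crude surrogate q_u = N_c · c_avg; all parameter values are assumptions.

```python
# A minimal Monte Carlo sketch for bearing-capacity statistics on a 1D random field.
import numpy as np

rng = np.random.default_rng(0)
n_z, dz = 20, 0.5                     # vertical discretization (assumed)
mu_c, cov_c, theta = 50.0, 0.3, 2.0   # mean cohesion [kPa], COV, fluctuation scale [m] (assumed)

z = np.arange(n_z) * dz
corr = np.exp(-2.0 * np.abs(z[:, None] - z[None, :]) / theta)   # Markovian correlation (assumed)
sig_ln = np.sqrt(np.log(1.0 + cov_c**2))                        # lognormal parameters from mean/COV
mu_ln = np.log(mu_c) - 0.5 * sig_ln**2
L = np.linalg.cholesky(corr)

q_u = []
for _ in range(1000):                 # Monte Carlo realizations
    c = np.exp(mu_ln + sig_ln * (L @ rng.standard_normal(n_z)))  # lognormal cohesion field
    q_u.append(5.14 * c.mean())       # surrogate bearing capacity (N_c = 5.14 assumed)
q_u = np.array(q_u)
print(q_u.mean(), q_u.std())          # basis for fitting a probability density function
```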
NASA Astrophysics Data System (ADS)
Asano, Takanori; Takaishi, Riichiro; Oda, Minoru; Sakuma, Kiwamu; Saitoh, Masumi; Tanaka, Hiroki
2018-04-01
We visualize the grain structures for individual nanosized thin film transistors (TFTs), which are electrically characterized, with an improved data processing technique for the dark-field image reconstruction of nanobeam electron diffraction maps. Our individual crystal analysis gives the one-to-one correspondence of TFTs with different grain boundary structures, such as random and coherent boundaries, to the characteristic degradations of ON-current and threshold voltage. Furthermore, the local crystalline uniformity inside a single grain is detected as the difference in diffraction intensity distribution.
On Pfaffian Random Point Fields
NASA Astrophysics Data System (ADS)
Kargin, V.
2014-02-01
We study Pfaffian random point fields by using the Moore-Dyson quaternion determinants. First, we give sufficient conditions that ensure that a self-dual quaternion kernel defines a valid random point field, and then we prove a CLT for Pfaffian point fields. The proofs are based on a new quaternion extension of the Cauchy-Binet determinantal identity. In addition, we derive the Fredholm determinantal formulas for the Pfaffian point fields which use the quaternion determinant.
An Arctic Ice/Ocean Coupled Model with Wave Interactions
2015-09-30
seas within and in the waters adjoining MIZs, using a conservative, multiple wave scattering approach in a medium with random geometrical properties...relating to wave-ice interactions have been collected since the MIZEX campaign of the 1980s, aside from a small number of ad hoc field experiments. This...from the better technology and analysis tools now available, including those related to the field experiments supported by an intensive remote sensing
Percolation Analysis of a Wiener Reconstruction of the IRAS 1.2 Jy Redshift Catalog
NASA Astrophysics Data System (ADS)
Yess, Capp; Shandarin, Sergei F.; Fisher, Karl B.
1997-01-01
We present percolation analyses of Wiener reconstructions of the IRAS 1.2 Jy redshift survey. There are 10 reconstructions of galaxy density fields in real space spanning the range β = 0.1-1.0, where β = Ω^0.6/b, Ω is the present dimensionless density, and b is the bias factor. Our method uses the growth of the largest cluster statistic to characterize the topology of a density field, where Gaussian randomized versions of the reconstructions are used as standards for analysis. For the reconstruction volume of radius R ~ 100 h^-1 Mpc, percolation analysis reveals a slight "meatball" topology for the real-space galaxy distribution of the IRAS survey.
Energy parasites trigger oncogene mutation.
Pokorný, Jiří; Pokorný, Jan; Jandová, Anna; Kobilková, Jitka; Vrba, Jan; Vrba, Jan
2016-10-01
Cancer initiation can be explained as a result of parasitic virus energy consumption leading to randomized genome chemical bonding. Analysis of experimental data on cell-mediated immunity (CMI), comprising about 12,000 cases of healthy humans, cancer patients and patients with precancerous cervical lesions, disclosed that the specific cancer antigen and the non-specific lactate dehydrogenase-elevating (LDH) virus antigen elicit similar responses. The specific antigen is effective only in the cancer type of its origin, but the non-specific antigen is effective in all examined cancers. CMI results of CIN patients display both the healthy and the cancer state. The ribonucleic acid (RNA) of the LDH virus, parasitizing on energy, reduces the ratio of coherent to random oscillations. A decreased effect of the coherent cellular electromagnetic field on bonding electrons in biological macromolecules leads to an elevated probability of random genome reactions. Overlapping of wave functions in biological macromolecules depends on the energy of the cellular electromagnetic field, which supplies energy to bonding electrons for selective chemical bonds. CMI responses to the cancer and LDH virus antigens in all examined healthy, precancerous and cancer cases point to an energy mechanism in cancer initiation. The dependence of the rate of biochemical reactions on the biological electromagnetic field explains a yet unknown mechanism of genome mutation.
Fractal planetary rings: Energy inequalities and random field model
NASA Astrophysics Data System (ADS)
Malyarenko, Anatoliy; Ostoja-Starzewski, Martin
2017-12-01
This study is motivated by a recent observation, based on photographs from the Cassini mission, that Saturn’s rings have a fractal structure in radial direction. Accordingly, two questions are considered: (1) What Newtonian mechanics argument in support of such a fractal structure of planetary rings is possible? (2) What kinematics model of such fractal rings can be formulated? Both challenges are based on taking planetary rings’ spatial structure as being statistically stationary in time and statistically isotropic in space, but statistically nonstationary in space. An answer to the first challenge is given through an energy analysis of circular rings having a self-generated, noninteger-dimensional mass distribution [V. E. Tarasov, Int. J. Mod Phys. B 19, 4103 (2005)]. The second issue is approached by taking the random field of angular velocity vector of a rotating particle of the ring as a random section of a special vector bundle. Using the theory of group representations, we prove that such a field is completely determined by a sequence of continuous positive-definite matrix-valued functions defined on the Cartesian square F2 of the radial cross-section F of the rings, where F is a fat fractal.
Development of a Random Field Model for Gas Plume Detection in Multiple LWIR Images.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heasler, Patrick G.
This report develops a random field model that describes gas plumes in LWIR remote sensing images. The random field model serves as a prior distribution that can be combined with LWIR data to produce a posterior that determines the probability that a gas plume exists in the scene and also maps the most probable location of any plume. The random field model is intended to work with a single pixel regression estimator--a regression model that estimates gas concentration on an individual pixel basis.
Tensor Minkowski Functionals for random fields on the sphere
NASA Astrophysics Data System (ADS)
Chingangbam, Pravabati; Yogendran, K. P.; Joby, P. K.; Ganesan, Vidhya; Appleby, Stephen; Park, Changbom
2017-12-01
We generalize the translation invariant tensor-valued Minkowski Functionals which are defined on two-dimensional flat space to the unit sphere. We apply them to level sets of random fields. The contours enclosing boundaries of level sets of random fields give a spatial distribution of random smooth closed curves. We outline a method to compute the tensor-valued Minkowski Functionals numerically for any random field on the sphere. Then we obtain analytic expressions for the ensemble expectation values of the matrix elements for isotropic Gaussian and Rayleigh fields. The results hold on flat as well as any curved space with affine connection. We elucidate the way in which the matrix elements encode information about the Gaussian nature and statistical isotropy (or departure from isotropy) of the field. Finally, we apply the method to maps of the Galactic foreground emissions from the 2015 PLANCK data and demonstrate their high level of statistical anisotropy and departure from Gaussianity.
Anomalous diffusion on a random comblike structure
NASA Astrophysics Data System (ADS)
Havlin, Shlomo; Kiefer, James E.; Weiss, George H.
1987-08-01
We have recently studied a random walk on a comblike structure as an analog of diffusion on a fractal structure. In our earlier work, the comb was assumed to have a deterministic structure, the comb having teeth of infinite length. In the present paper we study diffusion on a one-dimensional random comb, the lengths of whose teeth are random variables with an asymptotic stable-law distribution φ(L) ~ L^-(1+γ), where 0 < γ ≤ 1. Two mean-field methods are used for the analysis, one based on the continuous-time random walk, and the second a self-consistent scaling theory. Both lead to the same conclusions. We find that the diffusion exponent characterizing the mean-square displacement along the backbone of the comb is d_w = 4/(1+γ) for γ < 1 and d_w = 2 for γ ≥ 1. The probability of being at the origin at time t is P_0(t) ~ t^(-d_s/2) for large t, with d_s = (3-γ)/2 for γ < 1 and d_s = 1 for γ > 1. When a field is applied along the backbone of the comb the diffusion exponent is d_w = 2/(1+γ) for γ < 1 and d_w = 1 for γ ≥ 1. The theoretical results are confirmed using the exact enumeration method.
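A direct simulation of this setup is easy to sketch and is useful for checking the field-free scaling; the sketch below is illustrative (lattice size, walker count, step count and the capped inverse-transform tooth sampler are assumptions), and convergence to the predicted exponent (1+γ)/2 requires longer runs than shown.

```python
# A minimal sketch: random walk on a 1D random comb with heavy-tailed teeth.
import numpy as np

rng = np.random.default_rng(0)
gamma, n_sites, n_steps, n_walkers = 0.5, 4001, 10000, 100   # assumed parameters

# Heavy-tailed integer tooth lengths, P(L) ~ L^-(1+gamma), capped for numerical safety
u = rng.random(n_sites)
teeth = np.minimum(np.floor((1.0 - u) ** (-1.0 / gamma)), 1e6).astype(int)

x0 = n_sites // 2
msd = np.zeros(n_steps)
for _ in range(n_walkers):
    x, y = x0, 0                                      # backbone position x, height y in a tooth
    for t in range(n_steps):
        tooth_len = teeth[x % n_sites]
        if y == 0:                                    # on the backbone
            moves = [(-1, 0), (1, 0)] + ([(0, 1)] if tooth_len > 0 else [])
        elif y == tooth_len:                          # at the tip of a tooth
            moves = [(0, -1)]
        else:                                         # inside a tooth
            moves = [(0, 1), (0, -1)]
        dx, dy = moves[rng.integers(len(moves))]
        x, y = x + dx, y + dy
        msd[t] += (x - x0) ** 2
msd /= n_walkers

# Effective exponent 2/d_w from a late-time log-log fit; expected (1+gamma)/2 here
t = np.arange(1, n_steps + 1)
slope, _ = np.polyfit(np.log(t[n_steps // 2:]), np.log(msd[n_steps // 2:] + 1e-12), 1)
print(slope)
```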
Hannemann, P F W; Mommers, E H H; Schots, J P M; Brink, P R G; Poeze, M
2014-08-01
The aim of this systematic review and meta-analysis was to evaluate the best currently available evidence from randomized controlled trials comparing pulsed electromagnetic fields (PEMF) or low-intensity pulsed ultrasound (LIPUS) bone growth stimulation with placebo for acute fractures. We performed a systematic literature search of the medical literature from 1980 to 2013 for randomized clinical trials concerning acute fractures in adults treated with PEMF or LIPUS. Two reviewers independently determined the strength of the included studies by assessing the risk of bias according to the criteria in the Cochrane Handbook for Systematic Reviews of Interventions. Seven hundred and thirty-seven patients from 13 trials were included. Pooled results from 13 trials reporting proportion of nonunion showed no significant difference between PEMF or LIPUS and control. With regard to time to radiological union, we found heterogeneous results that significantly favoured PEMF or LIPUS bone growth stimulation only in non-operatively treated fractures or fractures of the upper limb. Furthermore, we found significant results that suggest that the use of PEMF or LIPUS in acute diaphyseal fractures may accelerate the time to clinical union. Current evidence from randomized trials is insufficient to conclude a benefit of PEMF or LIPUS bone growth stimulation in reducing the incidence of nonunions when used for treatment in acute fractures. However, our systematic review and meta-analysis suggest that PEMF or LIPUS can be beneficial in the treatment of acute fractures regarding time to radiological and clinical union. PEMF and LIPUS significantly shorten time to radiological union for acute fractures undergoing non-operative treatment and acute fractures of the upper limb. Furthermore, PEMF or LIPUS bone growth stimulation accelerates the time to clinical union for acute diaphyseal fractures.
Optimizing the LSST Dither Pattern for Survey Uniformity
NASA Astrophysics Data System (ADS)
Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter; Carroll, Christopher M.; LSST Dark Energy Science Collaboration
2015-01-01
The Large Synoptic Survey Telescope (LSST) will gather detailed data of the southern sky, enabling unprecedented study of Baryonic Acoustic Oscillations, which are an important probe of dark energy. These studies require a survey with highly uniform depth, and we aim to find an observation strategy that optimizes this uniformity. We have shown that in the absence of dithering (large telescope-pointing offsets), the LSST survey will vary significantly in depth. Hence, we implemented various dithering strategies, including random and repulsive random pointing offsets and spiral patterns with the spiral reaching completion in either a few months or the entire ten-year run. We employed three different implementations of dithering strategies: a single offset assigned to all fields observed on each night, offsets assigned to each field independently whenever the field is observed, and offsets assigned to each field only when the field is observed on a new night. Our analysis reveals that large dithers are crucial to guarantee survey uniformity and that assigning dithers to each field independently whenever the field is observed significantly increases this uniformity. These results suggest paths towards an optimal observation strategy that will enable LSST to achieve its science goals.We gratefully acknowledge support from the National Science Foundation REU program at Rutgers, PHY-1263280, and the Department of Energy, DE-SC0011636.
Safety assessment of a shallow foundation using the random finite element method
NASA Astrophysics Data System (ADS)
Zaskórski, Łukasz; Puła, Wojciech
2015-04-01
A complex structure of soil and its random character make soil modeling a cumbersome task. The heterogeneity of soil has to be considered even within a homogeneous soil layer; therefore estimating the shear strength parameters of soil for the purposes of a geotechnical analysis causes many problems. The applicable standards (Eurocode 7) do not present any explicit method for evaluating characteristic values of soil parameters; only general guidelines are given on how these values should be estimated. Hence many approaches to assessing characteristic values of soil parameters are presented in the literature and can be applied in practice. In this paper, the reliability assessment of a shallow strip footing was conducted using a reliability index β. Several approaches to estimating characteristic values of soil properties were therefore compared by evaluating the values of the reliability index β that each of them achieves. The method of Orr and Breysse, Duncan's method, Schneider's method, Schneider's method accounting for the influence of fluctuation scales, and the method included in Eurocode 7 were examined. Design values of the bearing capacity based on these approaches were compared with the stochastic bearing capacity estimated by the random finite element method (RFEM). Design values of the bearing capacity were computed for various widths and depths of the foundation in conjunction with the design approaches (DA) defined in Eurocode. RFEM was presented by Griffiths and Fenton (1993); it combines the deterministic finite element method, random field theory and Monte Carlo simulations. Random field theory allows the random character of soil parameters to be considered within a homogeneous soil layer; for this purpose a soil property is treated as a separate random variable in every element of the finite element mesh, with a proper correlation structure between points of the given area. RFEM was applied to estimate which theoretical probability distribution fits the empirical probability distribution of the bearing capacity, based on 3000 realizations. The assessed probability distribution was then applied to compute design values of the bearing capacity and the related reliability indices β. The analyses were carried out for a cohesive soil; hence the friction angle and cohesion were defined as random parameters and characterized by two-dimensional random fields. The friction angle was described by a bounded distribution, as it varies within a limited range, while a lognormal distribution was applied for the cohesion. The other properties (Young's modulus, Poisson's ratio and unit weight) were assumed to be deterministic, because they have negligible influence on the stochastic bearing capacity. Griffiths D. V., & Fenton G. A. (1993). Seepage beneath water retaining structures founded on spatially random soil. Géotechnique, 43(6), 577-587.
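The last step, turning a fitted capacity distribution and a design value into a reliability index, can be sketched as follows; the stand-in samples, the design value and the lognormal fit are assumptions for illustration, not the paper's RFEM output.

```python
# A minimal sketch: reliability index from a fitted bearing-capacity distribution.
import numpy as np
from scipy.stats import lognorm, norm

rng = np.random.default_rng(0)
samples = rng.lognormal(mean=6.0, sigma=0.25, size=3000)   # stand-in for 3000 RFEM realizations [kPa]
q_d = 250.0                                                # design bearing capacity (assumed)

shape, loc, scale = lognorm.fit(samples, floc=0.0)         # fitted probability distribution
p_f = lognorm.cdf(q_d, shape, loc=loc, scale=scale)        # probability that capacity < design value
beta = -norm.ppf(p_f)                                      # corresponding reliability index
print(p_f, beta)
```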
Probabilistic models for reactive behaviour in heterogeneous condensed phase media
NASA Astrophysics Data System (ADS)
Baer, M. R.; Gartling, D. K.; DesJardin, P. E.
2012-02-01
This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.
Global mean-field phase diagram of the spin-1 Ising ferromagnet in a random crystal field
NASA Astrophysics Data System (ADS)
Borelli, M. E. S.; Carneiro, C. E. I.
1996-02-01
We study the phase diagram of the mean-field spin-1 Ising ferromagnet in a uniform magnetic field H and a random crystal field Δi, with probability distribution P(Δi) = pδ(Δi − Δ) + (1 − p)δ(Δi). We analyse the effects of randomness on the first-order surfaces of the Δ-T-H phase diagram for different values of the concentration p and show how these surfaces are affected by the dilution of the crystal field.
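A small sketch of the mean-field self-consistency underlying such phase diagrams, assuming the standard Blume-Capel single-site Hamiltonian with zJ = 1 and a crystal field drawn from P(Δi) = pδ(Δi − Δ) + (1 − p)δ(Δi); a plain fixed-point iteration only tracks one solution branch, so it does not by itself resolve the first-order surfaces discussed in the paper.

```python
import numpy as np

def magnetization(T, H, Delta, p, J=1.0, z=1.0, iters=5000, tol=1e-10):
    """Fixed-point iteration for the disorder-averaged mean-field magnetization of the
    spin-1 model with crystal field Delta_i = Delta (prob. p) or 0 (prob. 1-p)."""
    beta = 1.0 / T
    def m_site(m, D):                              # <S> for a single site with crystal field D
        h = J * z * m + H
        return 2 * np.sinh(beta * h) / (2 * np.cosh(beta * h) + np.exp(beta * D))
    m = 0.9                                        # start from an ordered guess
    for _ in range(iters):
        m_new = p * m_site(m, Delta) + (1 - p) * m_site(m, 0.0)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

# magnetization vs temperature for an undiluted and a diluted crystal field (illustrative values)
for p in (1.0, 0.5):
    ms = [round(magnetization(T, H=0.0, Delta=0.4, p=p), 4) for T in np.linspace(0.05, 1.0, 6)]
    print(p, ms)
```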
Analysis of speckle and material properties in Laider Tracer
NASA Astrophysics Data System (ADS)
Ross, Jacob W.; Rigling, Brian D.; Watson, Edward A.
2017-04-01
The SAL simulation tool Laider Tracer models speckle: the random variation in intensity of an incident light beam across a rough surface. Within Laider Tracer, the speckle field is modeled as a 2-D array of jointly Gaussian random variables projected via ray tracing onto the scene of interest. Originally, all materials in Laider Tracer were treated as ideal diffuse scatterers, for which the far-field return is computed using the Lambertian Bidirectional Reflectance Distribution Function (BRDF). As presented here, we implement material properties into Laider Tracer via the Non-conventional Exploitation Factors Data System: a database of properties for thousands of different materials sampled at various wavelengths and incident angles. We verify the intensity behavior as a function of incident angle after material properties are added to the simulation.
Connectivity ranking of heterogeneous random conductivity models
NASA Astrophysics Data System (ADS)
Rizzo, C. B.; de Barros, F.
2017-12-01
To overcome the challenges associated with hydrogeological data scarcity, the hydraulic conductivity (K) field is often represented by a spatial random process. The state of the art provides several methods to generate 2D or 3D random K-fields, such as the classic multi-Gaussian fields, non-Gaussian fields, training-image-based fields and object-based fields. We provide a systematic comparison of these models based on their connectivity. We use the minimum hydraulic resistance as a connectivity measure, which has been found to be strongly correlated with the early-time arrival of dissolved contaminants. A computationally efficient graph-based algorithm is employed, allowing a stochastic treatment of the minimum hydraulic resistance through a Monte Carlo approach and therefore enabling the computation of its uncertainty. The results show the impact of geostatistical parameters on the connectivity for each group of random fields and allow the fields to be ranked according to their minimum hydraulic resistance.
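A sketch of the graph-based connectivity measure described above: the minimum hydraulic resistance between two opposite faces of a 2D K-field computed with Dijkstra's algorithm, wrapped in a small Monte Carlo loop. For brevity the realizations below are uncorrelated lnK fields; a real comparison would plug in the multi-Gaussian, training-image or object-based generators discussed in the abstract.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

def min_hydraulic_resistance(lnK, dx=1.0):
    """Least total resistance (sum of 1/K along a path) from the left to the right boundary
    of a 2D conductivity field, found with Dijkstra on the cell-adjacency graph."""
    K = np.exp(lnK)
    nz, nx_ = K.shape
    G = nx.Graph()
    for i in range(nz):
        for j in range(nx_):
            for di, dj in ((0, 1), (1, 0)):
                ii, jj = i + di, j + dj
                if ii < nz and jj < nx_:
                    # resistance of the link between neighbouring cells
                    G.add_edge((i, j), (ii, jj), weight=0.5 * dx * (1.0 / K[i, j] + 1.0 / K[ii, jj]))
    for i in range(nz):                       # virtual source/sink spanning the two faces
        G.add_edge("source", (i, 0), weight=0.0)
        G.add_edge((i, nx_ - 1), "sink", weight=0.0)
    return nx.dijkstra_path_length(G, "source", "sink")

# Monte Carlo over (here uncorrelated) lnK realizations to get the statistics of the measure
samples = [min_hydraulic_resistance(rng.normal(0.0, 1.0, size=(32, 32))) for _ in range(50)]
print("mean:", np.mean(samples), " std:", np.std(samples))
```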
Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao
2016-04-01
The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.
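The abstract mentions a randomness evaluation of the generated bit stream; the snippet below shows only a few elementary sanity checks (bias, lag-1 correlation, byte entropy) one might run on captured output, and is in no way a substitute for the statistical test suites typically used to qualify such generators.

```python
import numpy as np

def basic_randomness_checks(bits):
    """Very rough sanity checks on a bit stream: bias, serial correlation, and Shannon entropy
    of 8-bit blocks. Not a substitute for the NIST SP 800-22 or AIS-31 test suites."""
    bits = np.asarray(bits, dtype=np.uint8)
    bias = bits.mean()                                    # should be close to 0.5
    serial = np.corrcoef(bits[:-1], bits[1:])[0, 1]       # lag-1 correlation, close to 0
    blocks = np.packbits(bits[: len(bits) // 8 * 8].reshape(-1, 8))
    p = np.bincount(blocks, minlength=256) / len(blocks)
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))       # ideally close to 8 bits per byte
    return bias, serial, entropy

# demo on a software PRNG stream (a captured hardware stream would be used in practice)
print(basic_randomness_checks(np.random.default_rng(0).integers(0, 2, 1_000_000)))
```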
NASA Astrophysics Data System (ADS)
Daniels, Marcus G.; Farmer, J. Doyne; Gillemot, László; Iori, Giulia; Smith, Eric
2003-03-01
We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.
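A toy, discrete-time rendition of the zero-intelligence order flow described above: limit orders, market orders and cancellations are selected with probabilities proportional to assumed rates, and the mean squared displacement of the mid-price is inspected for (anomalous) diffusion. The rates, band width and seeding of the book are illustrative choices; the paper's model is formulated in continuous time on a semi-infinite price axis.

```python
import numpy as np

rng = np.random.default_rng(0)

alpha, mu, delta = 1.0, 0.1, 0.02        # limit-order, market-order and per-order cancellation rates
band, steps = 20, 50_000                 # limit orders land within `band` ticks of the opposite best

bids = [100 - k for k in range(1, 6)]    # seed the book with a few outstanding orders (prices in ticks)
asks = [100 + k for k in range(1, 6)]
mid = []

for _ in range(steps):
    n_orders = len(bids) + len(asks)
    rates = np.array([alpha, mu, delta * n_orders])
    event = rng.choice(3, p=rates / rates.sum())
    buy = rng.random() < 0.5
    if event == 0:                                        # new limit order on a random side
        if buy:
            bids.append(min(asks) - 1 - rng.integers(band))
        else:
            asks.append(max(bids) + 1 + rng.integers(band))
    elif event == 1 and len(bids) > 1 and len(asks) > 1:  # market order removes the best opposite quote
        if buy:
            asks.remove(min(asks))
        else:
            bids.remove(max(bids))
    elif event == 2 and n_orders > 2:                     # cancel a randomly chosen outstanding order
        side = bids if rng.random() < len(bids) / n_orders else asks
        if len(side) > 1:
            side.pop(rng.integers(len(side)))
    mid.append(0.5 * (max(bids) + min(asks)))

mid = np.array(mid)
for lag in (1, 10, 100, 1000):                            # mean squared mid-price displacement vs lag
    print(lag, np.mean((mid[lag:] - mid[:-lag]) ** 2))
```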
NASA Astrophysics Data System (ADS)
Moran, Steve E.; Lugannani, Robert; Craig, Peter N.; Law, Robert L.
1989-02-01
An analysis is made of the performance of an optically phase-locked electronic speckle pattern interferometer in the presence of random noise displacements. Expressions for the phase-locked speckle contrast for single-frame imagery and the composite rms exposure for two sequentially subtracted frames are obtained in terms of the phase-locked composite and single-frame fringe functions. The noise fringe functions are evaluated for stationary, coherence-separable noise displacements obeying Gauss-Markov temporal statistics. The theoretical findings presented here are qualitatively supported by experimental results.
Kiluk, Brian D.; Sugarman, Dawn E.; Nich, Charla; Gibbons, Carly J.; Martino, Steve; Rounsaville, Bruce J.; Carroll, Kathleen M.
2013-01-01
Objective Computer-assisted therapies offer a novel, cost-effective strategy for providing evidence-based therapies to a broad range of individuals with psychiatric disorders. However, the extent to which the growing body of randomized trials evaluating computer-assisted therapies meets current standards of methodological rigor for evidence-based interventions is not clear. Method A methodological analysis of randomized clinical trials of computer-assisted therapies for adult psychiatric disorders, published between January 1990 and January 2010, was conducted. Seventy-five studies that examined computer-assisted therapies for a range of axis I disorders were evaluated using a 14-item methodological quality index. Results Results indicated marked heterogeneity in study quality. No study met all 14 basic quality standards, and three met 13 criteria. Consistent weaknesses were noted in evaluation of treatment exposure and adherence, rates of follow-up assessment, and conformity to intention-to-treat principles. Studies utilizing weaker comparison conditions (e.g., wait-list controls) had poorer methodological quality scores and were more likely to report effects favoring the computer-assisted condition. Conclusions While several well-conducted studies have indicated promising results for computer-assisted therapies, this emerging field has not yet achieved a level of methodological quality equivalent to those required for other evidence-based behavioral therapies or pharmacotherapies. Adoption of more consistent standards for methodological quality in this field, with greater attention to potential adverse events, is needed before computer-assisted therapies are widely disseminated or marketed as evidence based. PMID:21536689
NASA Astrophysics Data System (ADS)
Wu, Jinglai; Luo, Zhen; Zhang, Nong; Zhang, Yunqing; Walker, Paul D.
2017-02-01
This paper proposes an uncertain modelling and computational method to analyze the dynamic responses of rigid-flexible multibody systems (or mechanisms) with random geometry and material properties. Firstly, the deterministic model of the rigid-flexible multibody system is built with the absolute nodal coordinate formulation (ANCF), in which the flexible parts are modeled using ANCF elements, while the rigid parts are described by ANCF reference nodes (ANCF-RNs). Secondly, uncertainty in the geometry of the rigid parts is expressed as uniform random variables, while the uncertainty in the material properties of the flexible parts is modeled as a continuous random field, which is further discretized into Gaussian random variables using a series expansion method. Finally, a non-intrusive numerical method is developed to solve the dynamic equations of systems involving both types of random variables, which systematically integrates the deterministic generalized-α solver with Latin Hypercube sampling (LHS) and Polynomial Chaos (PC) expansion. The benchmark slider-crank mechanism is used as a numerical example to demonstrate the characteristics of the proposed method.
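A compact sketch of the non-intrusive step described above (Latin Hypercube sampling plus a Polynomial Chaos surrogate fitted by regression), with a stand-in scalar response in place of the generalized-α multibody solve; the sample size, polynomial degree and test function are assumptions for illustration only.

```python
import numpy as np
from math import factorial
from scipy.stats import qmc
from scipy.special import ndtri
from numpy.polynomial.hermite_e import hermevander

def model(xi):
    """Stand-in for one deterministic solve; xi are standardized Gaussian inputs (hypothetical)."""
    return np.sin(xi[:, 0]) + 0.3 * xi[:, 1] ** 2

n_samples, dim, degree = 200, 2, 3
u = qmc.LatinHypercube(d=dim, seed=1).random(n_samples)   # LHS points in (0,1)^dim
xi = ndtri(u)                                             # map to standard normal variables

# non-intrusive PCE: regression on products of probabilists' Hermite polynomials
H0 = hermevander(xi[:, 0], degree)                        # He_0..He_3 in the first variable
H1 = hermevander(xi[:, 1], degree)
idx = [(i, j) for i in range(degree + 1) for j in range(degree + 1) if i + j <= degree]
Phi = np.column_stack([H0[:, i] * H1[:, j] for i, j in idx])
coef, *_ = np.linalg.lstsq(Phi, model(xi), rcond=None)

# orthogonality of He_i w.r.t. the standard normal (norm i!) gives mean and variance directly
norms = np.array([factorial(i) * factorial(j) for (i, j) in idx], dtype=float)
print("PCE mean:", coef[0], " PCE variance:", np.sum(coef[1:] ** 2 * norms[1:]))
```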
Inflation with a graceful exit in a random landscape
NASA Astrophysics Data System (ADS)
Pedro, F. G.; Westphal, A.
2017-03-01
We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N ≪ 10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.
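The exponential bias against fully stable critical points that drives this result can be illustrated numerically: sample Hessians from a shifted Gaussian Orthogonal Ensemble and count how often all eigenvalues are positive. The shift m0 and scale are illustrative; the paper obtains the relaxation probability analytically via Dyson Brownian motion rather than by brute-force sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

def goe(N, sigma=1.0):
    """Draw an N x N matrix from the Gaussian Orthogonal Ensemble."""
    A = rng.normal(0.0, sigma, size=(N, N))
    return (A + A.T) / np.sqrt(2)

def fraction_positive_definite(N, m0, trials=2000):
    """Fraction of random Hessians (GOE shifted by a mean curvature m0) that are positive
    definite, i.e. that would correspond to a metastable minimum rather than a saddle."""
    count = 0
    for _ in range(trials):
        eig = np.linalg.eigvalsh(goe(N) + m0 * np.eye(N))
        count += np.all(eig > 0)
    return count / trials

# the fraction drops roughly like exp(-const * N^2) as the number of fields grows
for N in (2, 4, 6, 8):
    print(N, fraction_positive_definite(N, m0=1.0))
```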
NASA Astrophysics Data System (ADS)
Graham, Wendy D.; Tankersley, Claude D.
1994-05-01
Stochastic methods are used to analyze two-dimensional steady groundwater flow subject to spatially variable recharge and transmissivity. Approximate partial differential equations are developed for the covariances and cross-covariances between the random head, transmissivity and recharge fields. Closed-form solutions of these equations are obtained using Fourier transform techniques. The resulting covariances and cross-covariances can be incorporated into a Bayesian conditioning procedure which provides optimal estimates of the recharge, transmissivity and head fields given available measurements of any or all of these random fields. Results show that head measurements contain valuable information for estimating the random recharge field. However, when recharge is treated as a spatially variable random field, the value of head measurements for estimating the transmissivity field can be reduced considerably. In a companion paper, the method is applied to a case study of the Upper Floridan Aquifer in NE Florida.
The random field Blume-Capel model revisited
NASA Astrophysics Data System (ADS)
Santos, P. V.; da Costa, F. A.; de Araújo, J. M.
2018-04-01
We have revisited the mean-field treatment of the Blume-Capel model in the presence of a discrete random magnetic field as introduced by Kaufman and Kanner (1990). The magnetic field (H) versus temperature (T) phase diagrams for given values of the crystal field D were recovered in accordance with Kaufman and Kanner's original work. However, our main goal in the present work was to investigate the distinct structures of the crystal field versus temperature phase diagrams as the random magnetic field is varied, because similar models have presented reentrant phenomena due to randomness. Following previous works we have classified the distinct phase diagrams according to five different topologies. The topological structure of the phase diagrams is maintained for both the H-T and D-T cases. Although the phase diagrams exhibit a richness of multicritical phenomena, we did not find any reentrant effect such as has been seen in similar models.
Chaotic behavior in the locomotion of Amoeba proteus.
Miyoshi, H; Kagawa, Y; Tsuchiya, Y
2001-01-01
The locomotion of Amoeba proteus has been investigated with algorithms for evaluating the correlation dimension and the Lyapunov spectrum developed in the field of nonlinear science. These parameters indicate whether the random behavior of the system is stochastic or deterministic. For the analysis of the nonlinear parameters, n-dimensional time-delayed vectors were reconstructed from a time series of the periphery and area of A. proteus images captured with a charge-coupled-device camera, which characterize its random motion. The correlation dimension analysis has shown that the random motion of A. proteus is governed by only 3-4 macrovariables, even though the system is a complex system composed of many degrees of freedom. Furthermore, the analysis of the Lyapunov spectrum has shown that its largest exponent takes positive values. These results indicate that the random behavior of A. proteus is chaotic, deterministic motion on a low-dimensional attractor. It may be important for the elucidation of cell locomotion to take account of nonlinear interactions among a small number of dynamics such as the sol-gel transformation, the cytoplasmic streaming, and the related chemical reactions occurring in the cell.
Network meta-analysis, electrical networks and graph theory.
Rücker, Gerta
2012-12-01
Network meta-analysis is an active field of research in clinical biostatistics. It aims to combine information from all randomized comparisons among a set of treatments for a given medical condition. We show how graph-theoretical methods can be applied to network meta-analysis. A meta-analytic graph consists of vertices (treatments) and edges (randomized comparisons). We illustrate the correspondence between meta-analytic networks and electrical networks, where variance corresponds to resistance, treatment effects to voltage, and weighted treatment effects to current flows. Based thereon, we then show that graph-theoretical methods that have been routinely applied to electrical networks also work well in network meta-analysis. In more detail, the resulting consistent treatment effects induced in the edges can be estimated via the Moore-Penrose pseudoinverse of the Laplacian matrix. Moreover, the variances of the treatment effects are estimated in analogy to electrical effective resistances. It is shown that this method, being computationally simple, leads to the usual fixed effect model estimate when applied to pairwise meta-analysis and is consistent with published results when applied to network meta-analysis examples from the literature. Moreover, problems of heterogeneity and inconsistency, random effects modeling and including multi-armed trials are addressed. Copyright © 2012 John Wiley & Sons, Ltd.
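A minimal numerical rendition of the electrical-network estimate described above, for a hypothetical four-treatment network with made-up effects and variances: the weighted graph Laplacian is inverted with the Moore-Penrose pseudoinverse to obtain consistent treatment effects, and the variance of a contrast is read off as an effective resistance.

```python
import numpy as np

# toy network: 4 treatments (A, B, C, D); direct randomized comparisons with observed
# effects y (e.g. log odds ratios) and variances v -- all numbers illustrative
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
y = np.array([0.5, 0.8, 0.2, 0.4, 0.1])
v = np.array([0.04, 0.09, 0.05, 0.08, 0.06])
W = np.diag(1.0 / v)                           # weights = inverse variances ("conductances")

n = 4
B = np.zeros((len(edges), n))                  # edge-vertex incidence matrix
for k, (i, j) in enumerate(edges):
    B[k, i], B[k, j] = 1.0, -1.0

L = B.T @ W @ B                                # weighted graph Laplacian
Lplus = np.linalg.pinv(L)                      # Moore-Penrose pseudoinverse

theta = Lplus @ B.T @ W @ y                    # treatment-level "potentials"
consistent = B @ theta                         # consistent (network) effects on each edge

# variance of the A-vs-D contrast = effective resistance between nodes 0 and 3
e = np.zeros(n); e[0], e[3] = 1.0, -1.0
print("A vs D effect:", theta[0] - theta[3], " variance:", e @ Lplus @ e)
```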
EEG and MEG data analysis in SPM8.
Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl
2011-01-01
SPM is a free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools.
Microstructure of the IMF turbulences at 2.5 AU
NASA Technical Reports Server (NTRS)
Mavromichalaki, H.; Vassilaki, A.; Marmatsouri, L.; Moussas, X.; Quenby, J. J.; Smith, E. J.
1995-01-01
A detailed analysis of small-period (15-900 sec) magnetohydrodynamic (MHD) turbulences of the interplanetary magnetic field (IMF) has been made using Pioneer-11 high time resolution data (0.75 sec) inside a Corotating Interaction Region (CIR) at a heliocentric distance of 2.5 AU in 1973. The methods used are hodogram analysis, minimum variance matrix analysis and coherence analysis. The minimum variance analysis gives evidence of linearly polarized wave modes. Coherence analysis has shown that the field fluctuations are dominated by magnetosonic fast modes with periods of 15 sec to 15 min. However, it is also shown that some small-amplitude Alfven waves are present in the trailing edge of this region with characteristic periods of 15-200 sec. The observed wave modes are locally generated and possibly attributed to the scattering of Alfven wave energy into random magnetosonic waves.
A New Algorithm with Plane Waves and Wavelets for Random Velocity Fields with Many Spatial Scales
NASA Astrophysics Data System (ADS)
Elliott, Frank W.; Majda, Andrew J.
1995-03-01
A new Monte Carlo algorithm for constructing and sampling stationary isotropic Gaussian random fields with power-law energy spectrum, infrared divergence, and fractal self-similar scaling is developed here. The theoretical basis for this algorithm is the fact that such a random field is well approximated by a superposition of random one-dimensional plane waves involving a fixed finite number of directions. In general each one-dimensional plane wave is the sum of a random shear layer and a random acoustical wave. These one-dimensional random plane waves are then simulated by a wavelet Monte Carlo method for a single space variable developed recently by the authors. The computational results reported in this paper demonstrate remarkably low variance and economical representation of such Gaussian random fields through this new algorithm. In particular, the velocity structure function for an incompressible isotropic Gaussian random field in two space dimensions with the Kolmogoroff spectrum can be simulated accurately over 12 decades with only 100 realizations of the algorithm, with the scaling exponent accurate to 1.1% and the constant prefactor accurate to 6%; in fact, the exponent of the velocity structure function can be computed over 12 decades within 3.3% with only 10 realizations. Furthermore, only 46,592 active computational elements are utilized in each realization to achieve these results for 12 decades of scaling behavior.
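A simplified sketch of the plane-wave superposition idea described above: a 2D field is assembled from 1D random profiles with a power-law spectrum along randomly chosen directions. The 1D profiles here are synthesized with a plain random-phase FFT rather than the authors' wavelet Monte Carlo method, so the sketch does not reproduce the low-variance, 12-decade scaling results of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def plane_wave_field(n=256, n_dir=64, slope=-5/3, kmin=1.0):
    """Approximate a 2D isotropic random field with a power-law spectrum as a superposition
    of 1D random 'plane waves' along n_dir randomly chosen directions."""
    x = np.arange(n)
    X, Y = np.meshgrid(x, x, indexing="ij")
    field = np.zeros((n, n))
    m = 4 * n                                            # length of each periodic 1D profile
    k = np.fft.rfftfreq(m, d=1.0) * m                    # 1D wavenumbers
    amp = np.where(k >= kmin, k ** (slope / 2.0), 0.0)   # sqrt of the 1D power spectrum
    for theta in np.pi * rng.random(n_dir):              # random directions in [0, pi)
        phases = np.exp(2j * np.pi * rng.random(len(k)))
        profile = np.fft.irfft(amp * phases, n=m)        # one random 1D plane-wave profile
        s = X * np.cos(theta) + Y * np.sin(theta)        # coordinate along the wave direction
        field += np.interp(s.ravel() % m, np.arange(m), profile).reshape(n, n)
    return field / np.sqrt(n_dir)

F = plane_wave_field()
print(F.mean(), F.std())
```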
SMERFS: Stochastic Markov Evaluation of Random Fields on the Sphere
NASA Astrophysics Data System (ADS)
Creasey, Peter; Lang, Annika
2018-04-01
SMERFS (Stochastic Markov Evaluation of Random Fields on the Sphere) creates large realizations of random fields on the sphere. It uses a fast algorithm based on Markov properties and fast Fourier transforms in 1D that generates samples on an n × n grid in O(n² log n) and efficiently derives the necessary conditional covariance matrices.
Nonadiabatic Berry phase in nanocrystalline magnets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skomski, R.; Sellmyer, D. J.
2016-12-20
In this study, it is investigated how a Berry phase is created in polycrystalline nanomagnets and how the phase translates into an emergent magnetic field and into a topological Hall-effect contribution. The analysis starts directly from the spin of the conduction electrons and does not involve any adiabatic Hamiltonian. Completely random spin alignment in the nanocrystallites does not lead to a nonzero emergent field, but a modulation of the local magnetization does. As an explicit example, we consider a wire with a modulated cone angle.
NASA Astrophysics Data System (ADS)
Lamb, Derek A.
2016-10-01
While sunspots follow a well-defined pattern of emergence in space and time, small-scale flux emergence is assumed to occur randomly at all times in the quiet Sun. HMI's full-disk coverage, high cadence, spatial resolution, and duty cycle allow us to probe that basic assumption. Some case studies of emergence suggest that temporal clustering on spatial scales of 50-150 Mm may occur. If clustering is present, it could serve as a diagnostic of large-scale subsurface magnetic field structures. We present the results of a manual survey of small-scale flux emergence events over a short time period, and a statistical analysis addressing the question of whether these events show spatio-temporal behavior that is anything other than random.
Multilayer Markov Random Field models for change detection in optical remote sensing images
NASA Astrophysics Data System (ADS)
Benedek, Csaba; Shadaydeh, Maha; Kato, Zoltan; Szirányi, Tamás; Zerubia, Josiane
2015-09-01
In this paper, we give a comparative study of three multilayer Markov Random Field (MRF) based solutions proposed for change detection in optical remote sensing images, called the Multicue MRF, the Conditional Mixed Markov model, and the Fusion MRF. Our purposes are twofold. On one hand, we highlight the significance of the focused model family and we set them against various state-of-the-art approaches through a thematic analysis and quantitative tests. We discuss the advantages and drawbacks of class-comparison vs. direct approaches, the usage of training data, various targeted application fields and different ways of Ground Truth generation, while informing the reader of the roles in which the multilayer MRFs can be efficiently applied. On the other hand we also emphasize the differences between the three focused models at various levels, considering the model structures, feature extraction, layer interpretation, change concept definition, parameter tuning and performance. We provide qualitative and quantitative comparison results using principally a publicly available change detection database which contains aerial image pairs and Ground Truth change masks. We conclude that the discussed models are competitive against alternative state-of-the-art solutions, if one uses them as pre-processing filters in multitemporal optical image analysis. In addition, they cover together a large range of applications, considering the different usage options of the three approaches.
Time Correlations of Lightning Flash Sequences in Thunderstorms Revealed by Fractal Analysis
NASA Astrophysics Data System (ADS)
Gou, Xueqiang; Chen, Mingli; Zhang, Guangshu
2018-01-01
By using the data of lightning detection and ranging system at the Kennedy Space Center, the temporal fractal and correlation of interevent time series of lightning flash sequences in thunderstorms have been investigated with Allan factor (AF), Fano factor (FF), and detrended fluctuation analysis (DFA) methods. AF, FF, and DFA methods are powerful tools to detect the time-scaling structures and correlations in point processes. Totally 40 thunderstorms with distinguishing features of a single-cell storm and apparent increase and decrease in the total flash rate were selected for the analysis. It is found that the time-scaling exponents for AF (
Sustainability of transport structures - some aspects of the nonlinear reliability assessment
NASA Astrophysics Data System (ADS)
Pukl, Radomír; Sajdlová, Tereza; Strauss, Alfred; Lehký, David; Novák, Drahomír
2017-09-01
Efficient techniques for both nonlinear numerical analysis of concrete structures and advanced stochastic simulation methods have been combined in order to offer an advanced tool for the assessment of realistic behaviour, failure and safety of transport structures. The utilized approach is based on randomization of the non-linear finite element analysis of the structural models. Degradation aspects such as carbonation of concrete can be accounted for in order to predict the durability of the investigated structure and its sustainability. The results can serve as a rational basis for performance and sustainability assessment based on advanced nonlinear computer analysis of structures of the transport infrastructure, such as bridges or tunnels. In the stochastic simulation, the input material parameters obtained from material tests, including their randomness and uncertainty, are represented as random variables or fields. Appropriate identification of material parameters is crucial for the virtual failure modelling of structures and structural elements. Inverse analysis using artificial neural networks and virtual stochastic simulations is applied to determine the fracture-mechanical parameters of the structural material and its numerical model. Structural response, reliability and sustainability have been investigated for different types of transport structures made from various materials using the above-mentioned methodology and tools.
Analysis of foliage effects on mobile propagation in dense urban environments
NASA Astrophysics Data System (ADS)
Bronshtein, Alexander; Mazar, Reuven; Lu, I.-Tai
2000-07-01
Attempts to reduce the interference level and to increase the spectral efficiency of cellular radio communication systems operating in dense urban and suburban areas lead to the microcellular approach, with a consequent requirement to lower antenna heights. In large metropolitan areas with high buildings this requirement causes a situation where the transmitting and receiving antennas are both located below the rooftops, and the city street acts as a type of waveguiding channel for the propagating signal. In this work, the city street is modeled as a random multislit waveguide with randomly distributed regions of foliage parallel to the building boundaries. The statistical propagation characteristics are expressed in terms of multiple ray-fields approaching the observer. Algorithms for predicting the path loss along the waveguide and for computing the transverse field structure are presented.
Risk perception in epidemic modeling
NASA Astrophysics Data System (ADS)
Bagnoli, Franco; Liò, Pietro; Sguanci, Luca
2007-12-01
We investigate the effects of risk perception in a simple model of epidemic spreading. We assume that the perception of the risk of being infected depends on the fraction of neighbors that are ill. The effect of this factor is to decrease the infectivity, which therefore becomes a dynamical component of the model. We study the problem in the mean-field approximation and by numerical simulations for regular, random, and scale-free networks. We show that for homogeneous and random networks, there is always a value of perception that stops the epidemics. In the “worst-case” scenario of a scale-free network with diverging input connectivity, a linear perception cannot stop the epidemics; however, we show that a nonlinear increase of the perception risk may lead to the extinction of the disease. This transition is discontinuous, and is not predicted by the mean-field analysis.
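A small mean-field iteration in the spirit of the model described above, assuming a homogeneous network of degree k, recovery after one time step, and a per-contact infectivity reduced exponentially in the fraction of infected neighbours; these specific choices are assumptions for illustration, not necessarily the authors' exact update rule.

```python
import numpy as np
from math import comb

def mean_field_prevalence(tau, J, k=6, c0=0.2, steps=2000):
    """Discrete-time mean-field SIS iteration on a homogeneous network of degree k.
    A susceptible with s infected neighbours is infected with prob. 1 - (1 - tau*exp(-J*s/k))**s;
    infected sites recover after one step (simplifying assumptions)."""
    c = c0
    for _ in range(steps):
        p_inf = sum(comb(k, s) * c**s * (1 - c)**(k - s)
                    * (1 - (1 - tau * np.exp(-J * s / k)) ** s)
                    for s in range(k + 1))
        c = (1 - c) * p_inf            # newly infected; previously infected have recovered
    return c

# stronger risk perception (larger J) suppresses, and eventually extinguishes, the endemic state
for J in (0.0, 1.0, 3.0, 6.0):
    print(J, mean_field_prevalence(tau=0.3, J=J))
```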
Phase unwrapping using region-based markov random field model.
Dong, Ying; Ji, Jim
2010-01-01
Phase unwrapping is a classical problem in Magnetic Resonance Imaging (MRI), Interferometric Synthetic Aperture Radar and Sonar (InSAR/InSAS), fringe pattern analysis, and spectroscopy. Although many methods have been proposed to address this problem, robust and effective phase unwrapping remains a challenge. This paper presents a novel phase unwrapping method using a region-based Markov Random Field (MRF) model. Specifically, the phase image is segmented into regions within which the phase is not wrapped. Then, the phase image is unwrapped between different regions using an improved Highest Confidence First (HCF) algorithm to optimize the MRF model. The proposed method has desirable theoretical properties as well as an efficient implementation. Simulations and experimental results on MRI images show that the proposed method provides similar or improved phase unwrapping compared with the Phase Unwrapping MAx-flow/min-cut (PUMA) and ZpM methods.
Bayesian approach to non-Gaussian field statistics for diffusive broadband terahertz pulses.
Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M
2005-11-01
We develop a closed-form expression for the probability distribution function for the field components of a diffusive broadband wave propagating through a random medium. We consider each spectral component to provide an individual observation of a random variable, the configurationally averaged spectral intensity. Since the intensity determines the variance of the field distribution at each frequency, this random variable serves as the Bayesian prior that determines the form of the non-Gaussian field statistics. This model agrees well with experimental results.
Field Quality and Fabrication Analysis of HQ02 Reconstructed Nb3Sn Coil Cross Sections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holik, Eddie Frank; Ambrosio, Giorgio; Carbonara, Andrea
2017-01-23
The US LHC Accelerator Research Program (LARP) quadrupole HQ02 was designed and fully tested as part of the low-beta quad development for Hi-Lumi LHC. HQ02’s design is well documented with full fabrication accounting along with full field analysis at low and high current. With this history, HQ02 is an excellent test bed for developing a methodology for measuring turn locations from magnet cross sections and comparing with CAD models and measured field. All 4 coils of HQ02 were cut in identical locations along the magnetic length corresponding to magnetic field measurement and coil metrology. A real-time camera and coordinate measuring equipment was used to plot turn corners. Measurements include systematic and random displacements of winding blocks and individual turns along the magnetic length. The range of cable shifts and the field harmonic range along the length are in agreement, although correlating turn locations and measured harmonics in each cross section is challenging.
Nanometer scale composition study of MBE grown BGaN performed by atom probe tomography
NASA Astrophysics Data System (ADS)
Bonef, Bastien; Cramer, Richard; Speck, James S.
2017-06-01
Laser assisted atom probe tomography is used to characterize the alloy distribution in BGaN. The effect of the evaporation conditions applied on the atom probe specimens on the mass spectrum and the quantification of the III site atoms is first evaluated. The evolution of the Ga++/Ga+ charge state ratio is used to monitor the strength of the applied field. Experiments revealed that applying high electric fields on the specimen results in the loss of gallium atoms, leading to the over-estimation of boron concentration. Moreover, spatial analysis of the surface field revealed a significant loss of atoms at the center of the specimen where high fields are applied. A good agreement between X-ray diffraction and atom probe tomography concentration measurements is obtained when low fields are applied on the tip. A random distribution of boron in the BGaN layer grown by molecular beam epitaxy is obtained by performing accurate and site specific statistical distribution analysis.
Power Analysis for Models of Change in Cluster Randomized Designs
ERIC Educational Resources Information Center
Li, Wei; Konstantopoulos, Spyros
2017-01-01
Field experiments in education frequently assign entire groups such as schools to treatment or control conditions. These experiments incorporate sometimes a longitudinal component where for example students are followed over time to assess differences in the average rate of linear change, or rate of acceleration. In this study, we provide methods…
Vegada, Bhavisha; Shukla, Apexa; Khilnani, Ajeetkumar; Charan, Jaykaran; Desai, Chetna
2016-01-01
Most academic teachers use four or five options per item in multiple choice question (MCQ) tests for formative and summative assessment. The optimal number of options per MCQ item is a matter of considerable debate among academic teachers of various educational fields, and there is a scarcity of published literature regarding the optimum number of options per MCQ item in the field of medical education. The aim was to compare three-option, four-option, and five-option MCQ tests on the quality parameters of reliability, validity, item analysis, distracter analysis, and time analysis. Participants were 3rd semester M.B.B.S. students. Students were divided randomly into three groups. Each group was randomly given one of three sets of the MCQ test, with three, four, or five options per item. Following the marking of the multiple choice tests, the participants' option selections were analyzed, and the mean marks, mean time, validity, reliability, facility value, discrimination index, point biserial value, and distracter analysis of the three option formats were compared. Students scored more (P = 0.000) and took less time (P = 0.009) for the completion of the three-option test as compared to the four-option and five-option groups. The facility value was higher (P = 0.004) in the three-option group as compared to the four-option and five-option groups. There was no significant difference between the three groups in validity, reliability, and item discrimination. Nonfunctioning distracters were more frequent in the four-option and five-option groups as compared to the three-option group. Assessment based on three-option MCQs can be preferred over four-option and five-option MCQs.
Mödden, Claudia; Behrens, Marion; Damke, Iris; Eilers, Norbert; Kastrup, Andreas; Hildebrandt, Helmut
2012-06-01
Compensatory and restorative treatments have been developed to improve visual field defects after stroke. However, no controlled trials have compared these interventions with standard occupational therapy (OT). A total of 45 stroke participants with visual field defect admitted for inpatient rehabilitation were randomized to restorative computerized training (RT) using computer-based stimulation of border areas of their visual field defects or to a computer-based compensatory therapy (CT) teaching a visual search strategy. OT, in which different compensation strategies were used to train for activities of daily living, served as standard treatment for the active control group. Each treatment group received 15 single sessions of 30 minutes distributed over 3 weeks. The primary outcome measures were visual field expansion for RT, visual search performance for CT, and reading performance for both treatments. Visual conjunction search, alertness, and the Barthel Index were secondary outcomes. Compared with OT, CT resulted in a better visual search performance, and RT did not result in a larger expansion of the visual field. Intragroup pre-post comparisons demonstrated that CT improved all defined outcome parameters and RT several, whereas OT only improved one. CT improved functional deficits after visual field loss compared with standard OT and may be the intervention of choice during inpatient rehabilitation. A larger trial that includes lesion location in the analysis is recommended.
Random walk study of electron motion in helium in crossed electromagnetic fields
NASA Technical Reports Server (NTRS)
Englert, G. W.
1972-01-01
Random walk theory, previously adapted to electron motion in the presence of an electric field, is extended to include a transverse magnetic field. In principle, the random walk approach avoids mathematical complexity and concomitant simplifying assumptions and permits determination of energy distributions and transport coefficients within the accuracy of available collisional cross section data. Application is made to a weakly ionized helium gas. Time of relaxation of electron energy distribution, determined by the random walk, is described by simple expressions based on energy exchange between the electron and an effective electric field. The restrictive effect of the magnetic field on electron motion, which increases the required number of collisions per walk to reach a terminal steady state condition, as well as the effect of the magnetic field on electron transport coefficients and mean energy can be quite adequately described by expressions involving only the Hall parameter.
Persistence and Lifelong Fidelity of Phase Singularities in Optical Random Waves.
De Angelis, L; Alpeggiani, F; Di Falco, A; Kuipers, L
2017-11-17
Phase singularities are locations where light is twisted like a corkscrew, with positive or negative topological charge depending on the twisting direction. Among the multitude of singularities arising in random wave fields, some can be found at the same location, but only when they exhibit opposite topological charge, which results in their mutual annihilation. New pairs can be created as well. With near-field experiments supported by theory and numerical simulations, we study the persistence and pairing statistics of phase singularities in random optical fields as a function of the excitation wavelength. We demonstrate how such entities can encrypt fundamental properties of the random fields in which they arise.
Random scalar fields and hyperuniformity
NASA Astrophysics Data System (ADS)
Ma, Zheng; Torquato, Salvatore
2017-06-01
Disordered many-particle hyperuniform systems are exotic amorphous states of matter that lie between crystals and liquids. Hyperuniform systems have attracted recent attention because they are endowed with novel transport and optical properties. Recently, the hyperuniformity concept has been generalized to characterize two-phase media, scalar fields, and random vector fields. In this paper, we devise methods to explicitly construct hyperuniform scalar fields. Specifically, we analyze spatial patterns generated from Gaussian random fields, which have been used to model the microwave background radiation and heterogeneous materials, the Cahn-Hilliard equation for spinodal decomposition, and Swift-Hohenberg equations that have been used to model emergent pattern formation, including Rayleigh-Bénard convection. We show that the Gaussian random scalar fields can be constructed to be hyperuniform. We also numerically study the time evolution of spinodal decomposition patterns and demonstrate that they are hyperuniform in the scaling regime. Moreover, we find that labyrinth-like patterns generated by the Swift-Hohenberg equation are effectively hyperuniform. We show that thresholding (level-cutting) a hyperuniform Gaussian random field to produce a two-phase random medium tends to destroy the hyperuniformity of the progenitor scalar field. We then propose guidelines to achieve effectively hyperuniform two-phase media derived from thresholded non-Gaussian fields. Our investigation paves the way for new research directions to characterize the large-structure spatial patterns that arise in physics, chemistry, biology, and ecology. Moreover, our theoretical results are expected to guide experimentalists to synthesize new classes of hyperuniform materials with novel physical properties via coarsening processes and using state-of-the-art techniques, such as stereolithography and 3D printing.
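A short numerical illustration of the thresholding point made above: a Gaussian random field synthesized with a spectral density that vanishes as k → 0 (hence hyperuniform) is level-cut at its median, and the small-wavenumber spectral density of the resulting two-phase medium is compared with that of the progenitor field. The particular spectrum used is an arbitrary choice, not one of the models (Cahn-Hilliard, Swift-Hohenberg) analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512

# spectral synthesis of a 2D Gaussian random field with a prescribed power spectrum
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k = np.sqrt(kx**2 + ky**2)
spectrum = np.where(k > 0, k**4 * np.exp(-(k / 0.05) ** 2), 0.0)   # vanishes as k -> 0: hyperuniform
noise = np.fft.fft2(rng.standard_normal((n, n)))
field = np.real(np.fft.ifft2(noise * np.sqrt(spectrum)))

# level cut at the median -> two-phase (indicator) medium
phase = (field > np.median(field)).astype(float)

def radial_spectral_density(f):
    """Radially averaged spectral density of a zero-mean version of f."""
    F = np.fft.fft2(f - f.mean())
    S = np.abs(F) ** 2 / f.size
    kr = np.round(k * n).astype(int)                    # integer radial bins
    counts = np.bincount(kr.ravel())
    return np.bincount(kr.ravel(), weights=S.ravel()) / np.maximum(counts, 1)

S_field, S_phase = radial_spectral_density(field), radial_spectral_density(phase)
print("smallest-k spectral density, field vs thresholded medium:", S_field[1], S_phase[1])
```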
Feldon, Steven E
2004-01-01
ABSTRACT Purpose To validate a computerized expert system evaluating visual fields in a prospective clinical trial, the Ischemic Optic Neuropathy Decompression Trial (IONDT). To identify the pattern and within-pattern severity of field defects for study eyes at baseline and 6-month follow-up. Design Humphrey visual field (HVF) change was used as the outcome measure for a prospective, randomized, multi-center trial to test the null hypothesis that optic nerve sheath decompression was ineffective in treating nonarteritic anterior ischemic optic neuropathy and to ascertain the natural history of the disease. Methods An expert panel established criteria for the type and severity of visual field defects. Using these criteria, a rule-based computerized expert system interpreted HVF from baseline and 6-month visits for patients randomized to surgery or careful follow-up and for patients who were not randomized. Results A computerized expert system was devised and validated. The system was then used to analyze HVFs. The pattern of defects found at baseline for patients randomized to surgery did not differ from that of patients randomized to careful follow-up. The most common pattern of defect was a superior and inferior arcuate with central scotoma for randomized eyes (19.2%) and a superior and inferior arcuate for nonrandomized eyes (30.6%). Field patterns at 6 months and baseline were not different. For randomized study eyes, the superior altitudinal defects improved (P = .03), as did the inferior altitudinal defects (P = .01). For nonrandomized study eyes, only the inferior altitudinal defects improved (P = .02). No treatment effect was noted. Conclusions A novel rule-based expert system successfully interpreted visual field defects at baseline of eyes enrolled in the IONDT. PMID:15747764
Random Assignment: Practical Considerations from Field Experiments.
ERIC Educational Resources Information Center
Dunford, Franklyn W.
1990-01-01
Seven qualitative issues associated with randomization that have the potential to weaken or destroy otherwise sound experimental designs are reviewed and illustrated via actual field experiments. Issue areas include ethics and legality, liability risks, manipulation of randomized outcomes, hidden bias, design intrusiveness, case flow, and…
NASA Astrophysics Data System (ADS)
da Silva, Roberto; Vainstein, Mendeli H.; Gonçalves, Sebastián; Paula, Felipe S. F.
2013-08-01
Statistics of soccer tournament scores based on the double round robin system of several countries are studied. Exploring the dynamics of team scoring during tournament seasons from recent years, we find evidence of superdiffusion. A mean-field analysis results in a drift velocity equal to that of real data but in a different diffusion coefficient. Along with the analysis of real data we present the results of simulations of soccer tournaments obtained by an agent-based model which successfully describes the final scoring distribution [da Silva et al., Comput. Phys. Commun. 184, 661 (2013), doi:10.1016/j.cpc.2012.10.030]. Such a model yields random walks of scores over time with the same anomalous diffusion as observed in real data.
Asiimwe, Stephen; Oloya, James; Song, Xiao; Whalen, Christopher C
2014-12-01
Unsupervised HIV self-testing (HST) has the potential to increase knowledge of HIV status; however, its accuracy is unknown. To estimate the accuracy of unsupervised HST in field settings in Uganda, we performed a non-blinded, randomized controlled, non-inferiority trial of unsupervised compared with supervised HST among selected high HIV-risk fisherfolk (22.1 % HIV prevalence) in three fishing villages in Uganda between July and September 2013. The study enrolled 246 participants and randomized them in a 1:1 ratio to unsupervised HST or provider-supervised HST. In an intent-to-treat analysis, the HST sensitivity was 90 % in the unsupervised arm and 100 % among the provider-supervised, yielding a difference of -10 % (90 % CI -21, 1 %); non-inferiority was not shown. In a per-protocol analysis, the difference in sensitivity was -5.6 % (90 % CI -14.4, 3.3 %) and did show non-inferiority. We conclude that unsupervised HST is feasible in rural Africa and may be non-inferior to provider-supervised HST.
New constraints on modelling the random magnetic field of the MW
NASA Astrophysics Data System (ADS)
Beck, Marcus C.; Beck, Alexander M.; Beck, Rainer; Dolag, Klaus; Strong, Andrew W.; Nielaba, Peter
2016-05-01
We extend the description of the isotropic and anisotropic random components of the small-scale magnetic field within the existing magnetic field model of the Milky Way from Jansson & Farrar, by including random realizations of the small-scale component. Using a magnetic-field power spectrum with Gaussian random fields, the NE2001 model for the thermal electrons and the Galactic cosmic-ray electron distribution from the current GALPROP model, we derive full-sky maps for the total and polarized synchrotron intensity as well as the Faraday rotation-measure distribution. While previous work assumed that small-scale fluctuations average out along the line of sight, or only computed ensemble averages of random fields, we show that these fluctuations need to be carefully taken into account. Comparing with observational data we obtain not only good agreement with the 408 MHz total and WMAP7 22 GHz polarized intensity emission maps, but also an improved agreement with Galactic foreground rotation-measure maps and power spectra, whose amplitude and shape strongly depend on the parameters of the random field. We demonstrate that a correlation length of ≈22 pc (5 pc being a 5σ lower limit) is needed to match the slope of the observed power spectrum of Galactic foreground rotation-measure maps. Using multiple realizations also allows us to infer errors on individual observables. We find that previously used amplitudes for the random and anisotropic random magnetic field components need to be rescaled by factors of ≈0.3 and 0.6 to account for the new small-scale contributions. Our model predicts a rotation measure of -2.8±7.1 rad/m2 and 4.4±11 rad/m2 for the north and south Galactic poles, respectively, in good agreement with observations. Applying our model to deflections of ultra-high-energy cosmic rays we infer a mean deflection of ≈3.5±1.1 degrees for 60 EeV protons arriving from CenA.
Strand-seq: a unifying tool for studies of chromosome segregation
Falconer, Ester; Lansdorp, Peter M.
2013-01-01
Non-random segregation of sister chromatids has been implicated in helping specify daughter cell fate (the Silent Sister Hypothesis [1]) or in protecting the genome of long-lived stem cells (the Immortal Strand Hypothesis [2]). The idea that sister chromatids are non-randomly segregated into specific daughter cells is only marginally supported by data from sporadic and often contradictory studies. As a result, the field has moved forward rather slowly. The advent of being able to directly label and differentiate sister chromatids in vivo using fluorescence in situ hybridization [3] was a significant advance for such studies. However, this approach is limited by the need for large tracts of unidirectional repeats on chromosomes and the reliance on quantitative imaging of fluorescent probes and rigorous statistical analysis to discern between the two competing hypotheses. A novel method called Strand-seq, which uses next-generation sequencing to assay sister chromatid inheritance patterns independently for each chromosome [4], offers a comprehensive approach to test for non-random segregation. In addition, Strand-seq enables studies of the deposition of chromatin marks in relation to DNA replication. This method is expected to help unify the field by testing previous claims of non-random segregation in an unbiased way in many model systems in vitro and in vivo. PMID:23665005
NASA Astrophysics Data System (ADS)
Voronin, A. A.; Panchenko, V. Ya; Zheltikov, A. M.
2016-06-01
High-intensity ultrashort laser pulses propagating in gas media or in condensed matter undergo complex nonlinear spatiotemporal evolution where temporal transformations of optical field waveforms are strongly coupled to an intricate beam dynamics and ultrafast field-induced ionization processes. At the level of laser peak powers orders of magnitude above the critical power of self-focusing, the beam exhibits modulation instabilities, producing random field hot spots and breaking up into multiple noise-seeded filaments. This problem is described by a (3 + 1)-dimensional nonlinear field evolution equation, which needs to be solved jointly with the equation for ultrafast ionization of a medium. Analysis of this problem, which is equivalent to solving a billion-dimensional evolution problem, is only possible by means of supercomputer simulations augmented with coordinated big-data processing of large volumes of information acquired through theory-guiding experiments and supercomputations. Here, we review the main challenges of supercomputations and big-data processing encountered in strong-field ultrafast optical physics and discuss strategies to confront these challenges.
Fatigue failure of pb-free electronic packages under random vibration loads
NASA Astrophysics Data System (ADS)
Saravanan, S.; Prabhu, S.; Muthukumar, R.; Gowtham Raj, S.; Arun Veerabagu, S.
2018-03-01
Electronic equipment is used in several fields, such as automotive, aerospace and consumer goods, where it is subjected to vibration loads that lead to failure of the solder joints used in this equipment. This paper presents a methodology to predict the fatigue life of Pb-free surface-mounted BGA packages subjected to random vibrations. The dynamic characteristics of the PCB, such as the natural frequencies, mode shapes and damping ratios, were determined. Spectrum analysis was used to determine the stress response of the critical solder joint, and the cumulative fatigue damage accumulated by the solder joint over a specific duration was determined.
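Downstream of the spectrum analysis mentioned above, random-vibration fatigue life is commonly estimated with a three-band (Steinberg) approximation combined with Miner's cumulative damage rule; the sketch below shows that step only, with illustrative natural frequency, RMS stress and S-N constants that are not taken from the paper.

```python
import numpy as np

# Three-band (Steinberg) approximation of fatigue damage under Gaussian random vibration:
# cycles are assumed to occur at the natural frequency, with amplitudes of 1, 2 and 3 sigma
# occurring 68.3 %, 27.1 % and 4.33 % of the time; life follows a Basquin S-N curve.
fn = 250.0             # natural frequency of the board [Hz] (illustrative)
sigma_stress = 8.0     # RMS solder-joint stress from the spectrum analysis [MPa] (illustrative)
duration = 4 * 3600.0  # exposure time [s]

# Basquin curve N = C * S**(-b); constants chosen purely for illustration, not from the paper
b, C = 4.0, 1e12

def cycles_to_failure(S):
    return C * S ** (-b)

total_cycles = fn * duration
damage = 0.0
for multiple, fraction in ((1, 0.683), (2, 0.271), (3, 0.0433)):
    n_i = fraction * total_cycles                 # cycles experienced at this stress level
    damage += n_i / cycles_to_failure(multiple * sigma_stress)

print("Miner damage index:", damage, "-> predicted life [h]:", duration / 3600 / damage)
```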
Multisource passive acoustic tracking: an application of random finite set data fusion
NASA Astrophysics Data System (ADS)
Ali, Andreas M.; Hudson, Ralph E.; Lorenzelli, Flavio; Yao, Kung
2010-04-01
Multisource passive acoustic tracking is useful in animal bio-behavioral studies, replacing or enhancing human involvement during and after field data collection. Multiple simultaneous vocalizations are a common occurrence in a forest or a jungle, where many species are encountered. Given a set of nodes that are capable of producing multiple direction-of-arrival (DOA) estimates, such data need to be combined into meaningful estimates. Random finite set theory provides the mathematical probabilistic model that is suitable for analysis and for the synthesis of an optimal estimation algorithm. The proposed algorithm has been verified using a simulation and a controlled test experiment.
Phase retrieval in generalized optical interferometry systems.
Farriss, Wesley E; Fienup, James R; Malhotra, Tanya; Vamivakas, A Nick
2018-02-05
Modal analysis of an optical field via generalized interferometry (GI) is a novel technique that treats the field as a linear superposition of transverse modes and recovers the amplitudes of the modal weighting coefficients. We use phase retrieval by nonlinear optimization to recover the phase of these modal weighting coefficients. Information diversity increases the robustness of the algorithm by better constraining the solution. Additionally, multiple sets of random starting phase values assist the algorithm in overcoming local minima. The algorithm was able to recover nearly all coefficient phases for simulated fields consisting of up to 21 superposed Hermite-Gaussian modes from simulated data and proved to be resilient to shot noise.
NASA Astrophysics Data System (ADS)
Weng, Jingmeng; Wen, Weidong; Cui, Haitao; Chen, Bo
2018-06-01
A new method to generate a random distribution of fibers in the transverse cross-section of fiber reinforced composites with high fiber volume fraction is presented in this paper. Based on microscopy observation of the transverse cross-sections of unidirectional composite laminates, a hexagonal arrangement is set as the initial configuration, each fiber is assigned an arbitrary initial velocity in an arbitrary direction, and the micro-scale representative volume element (RVE) is established by simulating perfectly elastic collisions. Combined with the proposed periodic boundary conditions, which are suitable for multi-axial loading, the effective elastic properties of composite materials can be predicted. The predicted properties show reasonable agreement with experimental results. Comparison of the stress fields of an RVE with randomly distributed fibers and an RVE with periodically distributed fibers shows that the predicted elastic modulus of the former is greater than that of the latter.
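The sketch below is not the collision-based generator of the paper but a much simpler random sequential placement, shown only to convey the basic ingredients (non-overlapping fibres in a periodic RVE with a minimum-image distance check); such a generator typically jams well below the volume fractions the collision scheme reaches, and all dimensions are illustrative.

```python
# Illustrative only: random sequential placement of non-overlapping fibre
# centres in a square RVE with periodic wrap-around (not the paper's method).
import numpy as np

def place_fibres(rve_size=50.0, radius=3.5, target_vf=0.3, max_tries=200000, seed=1):
    rng = np.random.default_rng(seed)
    centres = []
    needed = int(target_vf * rve_size ** 2 / (np.pi * radius ** 2))
    for _ in range(max_tries):
        if len(centres) >= needed:
            break
        cand = rng.uniform(0.0, rve_size, 2)
        # minimum-image distance accounts for periodic boundaries
        if all(np.linalg.norm((cand - c + rve_size / 2) % rve_size - rve_size / 2) >= 2 * radius
               for c in centres):
            centres.append(cand)
    return np.array(centres)

centres = place_fibres()
print(len(centres), "fibres placed")
```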
Spatial Distribution of Phase Singularities in Optical Random Vector Waves.
De Angelis, L; Alpeggiani, F; Di Falco, A; Kuipers, L
2016-08-26
Phase singularities are dislocations widely studied in optical fields as well as in other areas of physics. With experiment and theory we show that the vectorial nature of light affects the spatial distribution of phase singularities in random light fields. While in scalar random waves phase singularities exhibit spatial distributions reminiscent of particles in isotropic liquids, in vector fields their distribution for the different vector components becomes anisotropic due to the direct relation between propagation and field direction. By incorporating this relation in the theory for scalar fields by Berry and Dennis [Proc. R. Soc. A 456, 2059 (2000)], we quantitatively describe our experiments.
Analysis And Validation of the Field Coupled Through an Aperture in an Avionics Enclosure
NASA Astrophysics Data System (ADS)
Bakore, Rahul
This work focused on accurately predicting the current response of an equipment under test (EUT) to a random electromagnetic field representing a threat source, in order to model radio frequency directed energy weapons (RFDEWs). The modeled EUT consists of a single wire attached to the interior wall of a shielding enclosure that includes an aperture on one face. An in-house computational electromagnetic (CEM) code based on the method of moments (MOM) and accelerated by the multi-level fast multipole algorithm (MLFMA) was enhanced through the implementation of first-order vector basis functions that approximate the EUT surface current. The electric field integral equation (EFIE) is solved using MOM/MLFMA. Use of first-order basis functions yields large savings in computational time over the previous implementation with zero-order Rao-Wilton-Glisson basis functions. A sample EUT was fabricated and tested within an anechoic chamber and a reverberation chamber over a wide frequency band. In the anechoic chamber measurements, the current response on the wire within the EUT due to a single uniform plane wave was found and compared with the numerical simulations. In the reverberation chamber measurements, the mean current magnitude excited on the wire within the EUT by a mechanically stirred random field was measured and compared with the numerical simulations. The measured scattering parameter between the source antenna and the EUT measurement port was used to derive the current response on the wire in both chambers. The numerically simulated currents agree very well with the measurements in both the anechoic and reverberation chambers over the measured frequency band, confirming the validity of the numerical approach for calculating the EUT response due to a random field. An artificial neural network (ANN) was trained that can rapidly provide the mean induced current response of an EUT due to a random field under different aperture configurations arbitrarily placed on one face of the EUT. However, the ANN proved no better than simple linear interpolation in approximating the induced currents on EUTs that exhibit strong resonances and nulls in the response.
Transport properties and pinning analysis for Co-doped BaFe2As2 thin films on metal tapes
NASA Astrophysics Data System (ADS)
Xu, Zhongtang; Yuan, Pusheng; Fan, Fan; Chen, Yimin; Ma, Yanwei
2018-05-01
We report on the transport properties and pinning analysis of BaFe1.84Co0.16As2 (Ba122:Co) thin films on metal tapes by pulsed laser deposition. The thin films exhibit a large in-plane misorientation of 5.6°, close to that of the buffer layer SrTiO3 (5.9°). Activation energy U_0(H) analysis reveals a power law relationship with field, having three different exponents at different field regions, indicative of variation from single-vortex pinning to a collective flux creep regime. The Ba122:Co coated conductors present T_c^onset = 20.2 K and T_c^zero = 19.0 K along with a self-field J_c of 1.14 MA cm^-2 and an in-field J_c as high as 0.98 and 0.86 MA cm^-2 up to 9 T at 4.2 K for both major crystallographic directions of the applied field, promising for high field applications. Pinning force analysis indicates a significant enhancement compared with similar Ba122:Co coated conductors. By using the anisotropic scaling approach, intrinsic pinning associated with coupling between superconducting blocks can be identified as the pinning source in the vicinity of H//ab, while for H//c random point defects are likely to play a role but correlated defects start to be active at high temperatures.
NASA Astrophysics Data System (ADS)
Yüksel, Yusuf
2018-05-01
We propose an atomistic model and present Monte Carlo simulation results regarding the influence of the FM/AF interface structure on the hysteresis mechanism and exchange bias behavior for a spin valve type FM/FM/AF magnetic junction. We simulate perfectly flat and roughened interface structures, both with uncompensated interfacial AF moments. In order to simulate the rough-interface effect, we introduce the concept of a random exchange anisotropy field induced at the interface and acting on the interfacial AF spins. Our results show that different types of random field distribution for the anisotropy may lead to different exchange bias behavior.
Analysis of dependent scattering mechanism in hard-sphere Yukawa random media
NASA Astrophysics Data System (ADS)
Wang, B. X.; Zhao, C. Y.
2018-06-01
The structural correlations in the microscopic structures of random media can induce the dependent scattering mechanism and thus influence the optical scattering properties. Based on our recent theory on the dependent scattering mechanism in random media composed of discrete dipolar scatterers [B. X. Wang and C. Y. Zhao, Phys. Rev. A 97, 023836 (2018)], in this paper, we study the hard-sphere Yukawa random media, in order to further elucidate the role of structural correlations in the dependent scattering mechanism and hence optical scattering properties. Here, we consider charged colloidal suspensions, whose effective pair interaction between colloids is described by a screened Coulomb (Yukawa) potential. By means of adding salt ions, the pair interaction between the charged particles can be flexibly tailored and therefore the structural correlations are modified. It is shown that this strategy can affect the optical properties significantly. For colloidal TiO2 suspensions, the modification of electric and magnetic dipole excitations induced by the structural correlations can substantially influence the optical scattering properties, in addition to the far-field interference effect described by the structure factor. However, this modification is only slightly altered by different salt concentrations and is mainly because of the packing-density-dependent screening effect. On the other hand, for low refractive index colloidal polystyrene suspensions, the dependent scattering mechanism mainly involves the far-field interference effect, and the effective exciting field amplitude for the electric dipole almost remains unchanged under different structural correlations. The present study has profound implications for understanding the role of structural correlations in the dependent scattering mechanism.
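As a rough numerical companion to the screening argument above, the short sketch below evaluates a generic screened Coulomb (Yukawa) pair potential and the standard rule-of-thumb Debye length in water at 25 °C, showing how added salt shortens the screening range; the contact value, particle diameter and ionic strengths are assumptions for illustration, not parameters from the paper.

```python
# Minimal sketch: Yukawa pair interaction between charged colloids and the
# salt dependence of the Debye screening length. Values are illustrative.
import numpy as np

def debye_length_nm(ionic_strength_mol_per_L):
    """Rule of thumb for water at 25 C: kappa^-1 ~ 0.304 nm / sqrt(I [mol/L])."""
    return 0.304 / np.sqrt(ionic_strength_mol_per_L)

def yukawa_potential(r_nm, contact_value_kT=50.0, kappa_inv_nm=10.0, diameter_nm=200.0):
    """Pair potential in units of kT outside contact: U_c * exp(-kappa (r - d)) * d / r."""
    kappa = 1.0 / kappa_inv_nm
    return contact_value_kT * np.exp(-kappa * (r_nm - diameter_nm)) * diameter_nm / r_nm

for salt in (1e-4, 1e-3, 1e-2):  # mol/L of added 1:1 salt
    print(f"I = {salt:.0e} M  ->  Debye length ~ {debye_length_nm(salt):.1f} nm")
print("U(250 nm) ~", round(yukawa_potential(250.0), 2), "kT")
```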
Summer School Effects in a Randomized Field Trial
ERIC Educational Resources Information Center
Zvoch, Keith; Stevens, Joseph J.
2013-01-01
This field-based randomized trial examined the effect of assignment to and participation in summer school for two moderately at-risk samples of struggling readers. Application of multiple regression models to difference scores capturing the change in summer reading fluency revealed that kindergarten students randomly assigned to summer school…
Quasi-experimental evaluation without regression analysis.
Rohrer, James E
2009-01-01
Evaluators of public health programs in field settings cannot always randomize subjects into experimental or control groups. By default, they may choose to employ the weakest study design available: the pretest, posttest approach without a comparison group. This essay argues that natural experiments involving comparison groups are within reach of public health program managers. Methods for analyzing natural experiments are discussed.
ERIC Educational Resources Information Center
Smith, David Arthur
2010-01-01
Much recent work in natural language processing treats linguistic analysis as an inference problem over graphs. This development opens up useful connections between machine learning, graph theory, and linguistics. The first part of this dissertation formulates syntactic dependency parsing as a dynamic Markov random field with the novel…
ERIC Educational Resources Information Center
Azar, Ali Sorayyaei; Hashim, Azirah
2014-01-01
The classes, purposes and characteristics associated with the review article in the field of applied linguistics were analyzed. The data were collected from a randomly selected corpus of thirty-two review articles from a discipline-related key journal in applied linguistics. The findings revealed that different sub-genres can be identified within…
Research Agendas and Pedagogical Applications: What "Public Relations Review" Tells Us.
ERIC Educational Resources Information Center
Thomsen, Steven R.
A study explored the research agenda of "Public Relations Review," the oldest scholarly journal in the public relations field. To provide a descriptive and inferential analysis of the content of the journal from 1985 to 1994, four volumes were selected at random (1985, 1987, 1991, and 1993) and all the articles in them were analyzed.…
Spin dynamics of random Ising chain in coexisting transverse and longitudinal magnetic fields
NASA Astrophysics Data System (ADS)
Liu, Zhong-Qiang; Jiang, Su-Rong; Kong, Xiang-Mu; Xu, Yu-Liang
2017-05-01
The dynamics of the random Ising spin chain in coexisting transverse and longitudinal magnetic fields is studied by the recursion method. Both the spin autocorrelation function and its spectral density are investigated by numerical calculations. It is found that the system's dynamical behaviors depend on the deviation σ_J of the random exchange coupling between nearest-neighbor spins and the ratio r_lt of the longitudinal and the transverse fields: (i) For r_lt = 0, the system undergoes two crossovers, from N independent spins precessing about the transverse magnetic field to a collective-mode behavior, and then to a central-peak behavior as σ_J increases. (ii) For r_lt ≠ 0, the system may exhibit a coexistence of collective-mode and central-peak behavior. When σ_J is small (or large enough), the system undergoes a crossover from a coexistence behavior (or a disordered behavior) to a central-peak behavior as r_lt increases. (iii) Increasing σ_J depresses the effects of both the transverse and the longitudinal magnetic fields. (iv) The quantum random Ising chain in coexisting magnetic fields may exhibit under-damping and critical-damping characteristics simultaneously. These results indicate that changing the external magnetic fields may control and manipulate the dynamics of the random Ising chain.
Lee, Jiwon; Kim, Won Ho; Ryu, Ho-Geol; Lee, Hyung-Chul; Chung, Eun-Jin; Yang, Seong-Mi; Jung, Chul-Woo
2017-08-01
We previously demonstrated the usefulness of milrinone for living donor hepatectomy. However, a less-invasive alternative to central venous catheterization and the perioperative contributors to good surgical outcomes remain undetermined. The current study evaluated whether the stroke volume variation (SVV)-guided method can substitute for central venous catheterization during milrinone-induced profound vasodilation. We randomly assigned 42 living liver donors to receive either SVV guidance or central venous pressure (CVP) guidance to obtain milrinone-induced low CVP. A target SVV of 9% was used as a substitute for a CVP of 5 mm Hg. The surgical field grade evaluated by 2 attending surgeons on a 4-point scale was compared between the CVP- and SVV-guided groups (n = 19, total number of scores = 38 per group) as the primary outcome variable. Multivariable analysis was performed to identify independent factors associated with the best surgical field as a post hoc analysis. Surgical field grades, which were either 1 or 2, were not found to be different between the 2 groups via the Mann-Whitney U test (P = .358). There was a very weak correlation between SVV and CVP during profound vasodilation such as CVP ≤ 5 mm Hg (R = -0.06; 95% confidence interval, -0.09 to -0.04; P < .001). Additional post hoc analysis suggested that younger age, lower baseline CVP, and longer duration of milrinone infusion might be helpful in providing the best surgical field. Milrinone-induced vasodilation resulted in a favorable surgical environment regardless of the method used to guide low CVP during living donor hepatectomy. However, SVV was not a useful indicator of low CVP because of the very weak correlation between SVV and CVP during profound vasodilation. In addition, factors contributing to the best surgical field, such as donor age, proactive fasting, and proper dosing of milrinone, need to be investigated further, ideally through prospective studies.
NASA Astrophysics Data System (ADS)
Kosmidis, Kosmas; Kalampokis, Alkiviadis; Argyrakis, Panos
2006-10-01
We use the detrended fluctuation analysis (DFA) and Grassberger-Procaccia (GP) methods to study language characteristics. Although we construct our signals using only word lengths or word frequencies, thereby excluding a huge amount of linguistic information, the application of GP analysis indicates that linguistic signals may be considered the manifestation of a complex system of high dimensionality, different from random signals or from low-dimensionality systems such as the Earth's climate. The DFA method is additionally able to distinguish a natural language signal from a computer code signal. This last result may be useful in the field of cryptography.
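A compact sketch of first-order DFA of the kind used here is shown below, applied to a synthetic stand-in for a word-length series; the window sizes and the test signal are illustrative assumptions, not the paper's corpus.

```python
# Minimal DFA-1 sketch: integrate the series, detrend fixed-size windows with a
# linear fit, and read the scaling exponent from the log-log slope.
import numpy as np

def dfa(signal, scales):
    profile = np.cumsum(signal - np.mean(signal))            # integrated series
    fluct = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[: n_seg * s].reshape(n_seg, s)
        x = np.arange(s)
        rms = []
        for seg in segs:                                      # detrend each segment
            coeffs = np.polyfit(x, seg, 1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, x)) ** 2)))
        fluct.append(np.mean(rms))
    # scaling exponent alpha (alpha ~ 0.5 for an uncorrelated signal)
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(0)
word_lengths = rng.integers(1, 12, size=10000).astype(float)  # stand-in signal
print("DFA exponent:", round(dfa(word_lengths, scales=[16, 32, 64, 128, 256]), 2))
```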
The ABC (in any D) of logarithmic CFT
NASA Astrophysics Data System (ADS)
Hogervorst, Matthijs; Paulos, Miguel; Vichi, Alessandro
2017-10-01
Logarithmic conformal field theories have a vast range of applications, from critical percolation to systems with quenched disorder. In this paper we thoroughly examine the structure of these theories based on their symmetry properties. Our analysis is model-independent and holds for any spacetime dimension. Our results include a determination of the general form of correlation functions and conformal block decompositions, clearing the path for future bootstrap applications. Several examples are discussed in detail, including logarithmic generalized free fields, holographic models, self-avoiding random walks and critical percolation.
Cartoon music in a candy store: a field experiment.
Le Guellec, Hélène; Guéguen, Nicolas; Jacob, Céline; Pascual, Alexandre
2007-06-01
An experiment on consumers' behavior was carried out in a new field context. According to a random assignment, 60 customers from ages 12 to 14 years who entered a candy store were exposed to Top Forty music which was usually played in this store, music from cartoons (Captain Flame, Candy, Olive & Tom, etc.), or no music. Analysis showed that customers spent significantly more time in the store when cartoon music was played, but the two styles of music were not related to the amount of money spent.
Cosmic Rays in Intermittent Magnetic Fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shukurov, Anvar; Seta, Amit; Bushby, Paul J.
The propagation of cosmic rays in turbulent magnetic fields is a diffusive process driven by the scattering of the charged particles by random magnetic fluctuations. Such fields are usually highly intermittent, consisting of intense magnetic filaments and ribbons surrounded by weaker, unstructured fluctuations. Studies of cosmic-ray propagation have largely overlooked intermittency, instead adopting Gaussian random magnetic fields. Using test particle simulations, we calculate cosmic-ray diffusivity in intermittent, dynamo-generated magnetic fields. The results are compared with those obtained from non-intermittent magnetic fields having identical power spectra. The presence of magnetic intermittency significantly enhances cosmic-ray diffusion over a wide range of particle energies. We demonstrate that the results can be interpreted in terms of a correlated random walk.
Introduction to the Special Issue.
ERIC Educational Resources Information Center
Petrosino, Anthony
2003-01-01
Introduces the articles of this special issue focusing on randomized field trials in criminology. In spite of the overall lack of randomized field trials in criminology, some agencies and individuals are able to mount an impressive number of field trials, and these articles focus on their experiences. (SLD)
Ideas for a pattern-oriented approach towards a VERA analysis ensemble
NASA Astrophysics Data System (ADS)
Gorgas, T.; Dorninger, M.
2010-09-01
For many applications in meteorology, and especially for verification purposes, it is important to have information about the uncertainties of observation and analysis data. A high quality of these "reference data" is an absolute necessity, as the uncertainties are reflected in verification measures. The VERA (Vienna Enhanced Resolution Analysis) scheme includes a sophisticated quality control tool which accounts for the correction of observational data and provides an estimation of the observation uncertainty. It is crucial for meteorologically and physically reliable analysis fields. VERA is based on a variational principle and does not need any first guess fields. It is therefore NWP model independent and can also be used as an unbiased reference for real time model verification. For downscaling purposes VERA uses a priori knowledge of small-scale physical processes over complex terrain, the so-called "fingerprint technique", which transfers information from data-rich to data-sparse regions. The enhanced joint D-PHASE and COPS data set forms the data base for the analysis ensemble study. For the WWRP projects D-PHASE and COPS a joint activity has been started to collect GTS and non-GTS data from the national and regional meteorological services in Central Europe for 2007. Data from more than 11,000 stations are available for high resolution analyses. The use of random numbers as perturbations for ensemble experiments is a common approach in meteorology. In most implementations, such as NWP-model ensemble systems, the focus lies on error growth and propagation on the spatial and temporal scale. When defining errors in analysis fields we have to consider the fact that analyses are not time dependent and that no perturbation method aimed at temporal evolution is possible. Further, the method applied should respect two major sources of analysis errors: observation errors and analysis (interpolation) errors. With the concept of an analysis ensemble we hope to get a more detailed view of both sources of analysis errors. For the computation of the VERA ensemble members a sample of Gaussian random perturbations is produced for each station and parameter. The deviation of the perturbations is based on the correction proposals of the VERA QC scheme, which provides "natural" limits for the ensemble. In order to put more emphasis on the weather situation we aim to integrate the main synoptic field structures as weighting factors for the perturbations. Two widely used approaches are employed for the definition of these main field structures: principal component analysis and a 2D discrete wavelet transform. The results of tests concerning the implementation of this pattern-supported analysis ensemble system and a comparison of the different approaches are given in the presentation.
BIGEL analysis of gene expression in HL60 cells exposed to X rays or 60 Hz magnetic fields
NASA Technical Reports Server (NTRS)
Balcer-Kubiczek, E. K.; Zhang, X. F.; Han, L. H.; Harrison, G. H.; Davis, C. C.; Zhou, X. J.; Ioffe, V.; McCready, W. A.; Abraham, J. M.; Meltzer, S. J.
1998-01-01
We screened a panel of 1,920 randomly selected cDNAs to discover genes that are differentially expressed in HL60 cells exposed to 60 Hz magnetic fields (2 mT) or X rays (5 Gy) compared to unexposed cells. Identification of these clones was accomplished using our two-gel cDNA library screening method (BIGEL). Eighteen cDNAs differentially expressed in X-irradiated compared to control HL60 cells were recovered from a panel of 1,920 clones. Differential expression in experimental compared to control cells was confirmed independently by Northern blotting of paired total RNA samples hybridized to each of the 18 clone-specific cDNA probes. DNA sequencing revealed that 15 of the 18 cDNA clones produced matches with the database for genes related to cell growth, protein synthesis, energy metabolism, oxidative stress or apoptosis (including MYC, neuroleukin, copper zinc-dependent superoxide dismutase, TC4 RAS-like protein, peptide elongation factor 1alpha, BNIP3, GATA3, NF45, cytochrome c oxidase II and triosephosphate isomerase mRNAs). In contrast, BIGEL analysis of the same 1,920 cDNAs revealed no differences greater than 1.5-fold in expression levels in magnetic-field compared to sham-exposed cells. Magnetic-field-exposed and control samples were analyzed further for the presence of mRNA encoding X-ray-responsive genes by hybridization of the 18 specific cDNA probes to RNA from exposed and control HL60 cells. Our results suggest that differential gene expression is induced in approximately 1% of a random pool of cDNAs by ionizing radiation but not by 60 Hz magnetic fields under the present experimental conditions.
Methodological reporting of randomized trials in five leading Chinese nursing journals.
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting to CONSORT and to explore associated trial-level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of their randomization methods. The quality of methodological reporting was measured against the methods section of the CONSORT checklist, and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (mean ± SD). No RCT reported descriptions of and changes to the "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods for "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with the CONSORT score. The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zentner, I.; Ferré, G., E-mail: gregoire.ferre@ponts.org; Poirion, F.
2016-06-01
In this paper, a new method for the identification and simulation of non-Gaussian and non-stationary stochastic fields given a database is proposed. It is based on two successive biorthogonal decompositions aimed at representing spatio-temporal stochastic fields. The proposed double expansion allows the model to be built even in the case of large-size problems by separating the time, space and random parts of the field. A Gaussian kernel estimator is used to simulate the high dimensional set of random variables appearing in the decomposition. The capability of the method to reproduce the non-stationary and non-Gaussian features of random phenomena is illustrated by applications to earthquakes (seismic ground motion) and sea states (wave heights).
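The final simulation step described above can be sketched very briefly, assuming the field has already been reduced to a small set of random expansion coefficients: a Gaussian kernel density estimator is fitted to "observed" coefficients and then resampled to generate new realizations. The two-dimensional coefficient data below are synthetic.

```python
# Hedged sketch: fit a Gaussian kernel density estimator to observed expansion
# coefficients and resample it to produce new coefficient sets for simulation.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# pretend these are 500 observed realizations of 2 correlated, non-Gaussian coefficients
coeffs = np.vstack([rng.gamma(2.0, 1.0, 500), rng.standard_normal(500) ** 2])

kde = gaussian_kde(coeffs)        # kernel estimator over the observed coefficients
new_coeffs = kde.resample(1000)   # draw new coefficient sets for field simulation
print(new_coeffs.shape)           # (2, 1000)
```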
[Visual field progression in glaucoma: cluster analysis].
Bresson-Dumont, H; Hatton, J; Foucher, J; Fonteneau, M
2012-11-01
Visual field progression analysis is one of the key points in glaucoma monitoring, but distinguishing true progression from random fluctuation is sometimes difficult. There are several different algorithms but no real consensus for detecting visual field progression. The trend analysis of global indices (MD, sLV) may miss localized deficits or be affected by media opacities. Conversely, point-by-point analysis makes progression difficult to differentiate from physiological variability, particularly when the sensitivity of a point is already low. The goal of our study was to analyse visual field progression with the EyeSuite™ Octopus Perimetry Clusters algorithm in patients with no significant changes in global indices or worsening on pointwise linear regression analysis. We analyzed the visual fields of 162 eyes (100 patients - 58 women, 42 men, average age 66.8 ± 10.91) with ocular hypertension or glaucoma. For inclusion, at least six reliable visual fields per eye were required, and the trend analysis (EyeSuite™ Perimetry) of visual field global indices (MD and sLV) could show no significant progression. The analysis of changes in cluster mode was then performed. In a second step, eyes with statistically significant worsening of at least one of their clusters were analyzed point-by-point with the Octopus Field Analysis (OFA). Fifty-four eyes (33.33%) had a significant worsening in some clusters, while their global indices remained stable over time. In this group of patients, more advanced glaucoma was present than in the stable group (MD 6.41 dB vs. 2.87); 64.82% (35/54) of those eyes in which the clusters progressed, however, had no statistically significant change in the trend analysis by pointwise linear regression. Most software algorithms for analyzing visual field progression are essentially trend analyses of global indices, or point-by-point linear regression. This study shows the potential role of analysis by cluster trends. However, for best results, it is preferable to compare the analyses of several tests in combination with a morphologic exam.
NASA Astrophysics Data System (ADS)
Subjects related to electromagnetic compatibility (EMC) analysis are discussed, taking into account forcing terms of line equations for externally excited transmission lines, E-fields over ground, electromagnetic near fields as a function of electrical size, a program for experimental verification of EMC analysis models, random susceptibility of an IC 7400 TTL NAND gate, and a comparison of IEMCAP and SEMCAP. Other topics explored are concerned with EMC measurements, spectrum management, the electromagnetic pulse (EMP), a Navy EMC program, measurement systems, filters, EMC design, electromagnetic vulnerability (EMV) assessment of weapon systems, FCC rules and regulations, shielding, and electromagnetic interference (EMI) in communication systems. Attention is also given to nonsinusoidal functions in radar and communications, transients/electrostatic discharge, open field testing, cables and connectors, interference effects of induced and conducted earth current at dc and ELF, test cells, and cable coupling.
Subcritical Multiplicative Chaos for Regularized Counting Statistics from Random Matrix Theory
NASA Astrophysics Data System (ADS)
Lambert, Gaultier; Ostrovsky, Dmitry; Simm, Nick
2018-05-01
For an N × N Haar distributed random unitary matrix U_N, we consider the random field defined by counting the number of eigenvalues of U_N in a mesoscopic arc centered at the point u on the unit circle. We prove that after regularizing at a small scale ε_N > 0, the renormalized exponential of this field converges as N → ∞ to a Gaussian multiplicative chaos measure in the whole subcritical phase. We discuss implications of this result for obtaining a lower bound on the maximum of the field. We also show that the moments of the total mass converge to a Selberg-like integral and by taking a further limit as the size of the arc diverges, we establish part of the conjectures in Ostrovsky (Nonlinearity 29(2):426-464, 2016). By an analogous construction, we prove that the multiplicative chaos measure coming from the sine process has the same distribution, which strongly suggests that this limiting object should be universal. Our approach to the L^1-phase is based on a generalization of the construction in Berestycki (Electron Commun Probab 22(27):12, 2017) to random fields which are only asymptotically Gaussian. In particular, our method could have applications to other random fields coming from either random matrix theory or a different context.
Correcting Biases in a lower resolution global circulation model with data assimilation
NASA Astrophysics Data System (ADS)
Canter, Martin; Barth, Alexander
2016-04-01
With this work, we aim at developing a new method of bias correction using data assimilation. This method is based on the stochastic forcing of a model to correct bias. First, through a preliminary run, we estimate the bias of the model and its possible sources. Then, we establish a forcing term which is directly added inside the model's equations. We create an ensemble of runs and consider the forcing term as a control variable during the assimilation of observations. We then use this analysed forcing term to correct the bias of the model. Since the forcing is added inside the model, it acts as a source term, unlike external forcings such as wind. This procedure has been developed and successfully tested with a twin experiment on a Lorenz 95 model. It is currently being applied and tested on the sea ice ocean NEMO LIM model, which is used in the PredAntar project. NEMO LIM is a global and low resolution (2 degrees) coupled model (hydrodynamic model and sea ice model) with long time steps allowing simulations over several decades. Due to its low resolution, the model is subject to bias in areas where strong currents are present. We aim at correcting this bias by using perturbed current fields from higher resolution models and randomly generated perturbations. The random perturbations need to be constrained in order to respect the physical properties of the ocean, and not create unwanted phenomena. To construct those random perturbations, we first create a random field with the Diva tool (Data-Interpolating Variational Analysis). Using a cost function, this tool penalizes abrupt variations in the field, while using a custom correlation length. It also decouples disconnected areas based on topography. Then, we filter the field to smooth it and remove small scale variations. We use this field as a random stream function, and take its derivatives to get zonal and meridional velocity fields. We also constrain the stream function along the coasts in order not to have currents perpendicular to the coast. The randomly generated stochastic forcings are then directly injected into the NEMO LIM model's equations in order to force the model at each timestep, and not only during the assimilation step. Results from a twin experiment will be presented. This method is being applied to a real case, with observations of the sea surface height available from the mean dynamic topography of CNES (Centre national d'études spatiales). The model, the bias correction, and more extensive forcings, in particular with a three dimensional structure and a time-varying component, will also be presented.
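A minimal sketch of the stream-function recipe described above is given below: a smooth random field stands in for the Diva-generated stream function (a Gaussian filter replaces the correlation-length penalty), it is pinned to a constant along the domain boundary so that no flow crosses it, and its derivatives give a non-divergent velocity perturbation. Grid size, spacing and amplitudes are arbitrary illustrative values.

```python
# Illustrative sketch: smooth random stream function -> divergence-free velocity
# perturbation (u = -dpsi/dy, v = dpsi/dx), with psi held constant on the boundary.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)
ny, nx, dx = 100, 200, 10e3                       # grid and spacing (m), illustrative
psi = gaussian_filter(rng.standard_normal((ny, nx)), sigma=8) * 1e4   # stream function (m^2/s)
psi[:, 0] = psi[:, -1] = psi[0, :] = psi[-1, :] = 0.0  # no flow across the domain edges

dpsi_dy, dpsi_dx = np.gradient(psi, dx)           # axis 0 is y, axis 1 is x
u, v = -dpsi_dy, dpsi_dx                          # non-divergent velocity perturbation (m/s)
print("max |u|, |v| (m/s):", np.abs(u).max(), np.abs(v).max())
```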
Power calculator for instrumental variable analysis in pharmacoepidemiology
Walker, Venexia M; Davies, Neil M; Windmeijer, Frank; Burgess, Stephen; Martin, Richard M
2017-01-01
Background: Instrumental variable analysis, for example with physicians’ prescribing preferences as an instrument for medications issued in primary care, is an increasingly popular method in the field of pharmacoepidemiology. Existing power calculators for studies using instrumental variable analysis, such as Mendelian randomization power calculators, do not allow for the structure of research questions in this field. This is because the analysis in pharmacoepidemiology will typically have stronger instruments and detect larger causal effects than in other fields. Consequently, there is a need for dedicated power calculators for pharmacoepidemiological research. Methods and Results: The formula for calculating the power of a study using instrumental variable analysis in the context of pharmacoepidemiology is derived before being validated by a simulation study. The formula is applicable for studies using a single binary instrument to analyse the causal effect of a binary exposure on a continuous outcome. An online calculator, as well as packages in both R and Stata, are provided for the implementation of the formula by others. Conclusions: The statistical power of instrumental variable analysis in pharmacoepidemiological studies to detect a clinically meaningful treatment effect is an important consideration. Research questions in this field have distinct structures that must be accounted for when calculating power. The formula presented differs from existing instrumental variable power formulae due to its parametrization, which is designed specifically for ease of use by pharmacoepidemiologists. PMID:28575313
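The sketch below is not the paper's closed-form formula but a hedged simulation-based approximation for the same setting (single binary instrument, binary exposure, continuous outcome): power is approximated by the rejection rate of the reduced-form test of the instrument-outcome association, which with a single valid instrument provides a test of the causal null. All parameter names and values are invented for illustration.

```python
# Simulation-based power approximation for a single-binary-instrument IV design.
# The causal null is tested through the reduced-form association between the
# instrument and the outcome; all parameters below are illustrative.
import numpy as np
from scipy import stats

def simulated_power(n=5000, p_instrument=0.5, instrument_strength=0.2,
                    causal_effect=0.1, n_sim=500, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sim):
        z = rng.binomial(1, p_instrument, n)                   # binary instrument
        confounder = rng.standard_normal(n)
        p_x = np.clip(0.3 + instrument_strength * z + 0.1 * confounder, 0.01, 0.99)
        x = rng.binomial(1, p_x)                               # binary exposure
        y = causal_effect * x + 0.5 * confounder + rng.standard_normal(n)
        _, p = stats.ttest_ind(y[z == 1], y[z == 0])           # reduced-form test
        rejections += p < alpha
    return rejections / n_sim

print("approximate power:", simulated_power())
```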
NASA Astrophysics Data System (ADS)
Hadjiagapiou, Ioannis A.; Velonakis, Ioannis N.
2018-07-01
The Sherrington-Kirkpatrick Ising spin glass model, in the presence of a random magnetic field, is investigated within the framework of one-step replica symmetry breaking. The two random variables (the exchange integral interaction J_ij and the random magnetic field h_i) are drawn from a joint Gaussian probability density function characterized by a correlation coefficient ρ, assuming positive and negative values. The thermodynamic properties, the three different phase diagrams and the system's parameters are computed with respect to the natural parameters of the joint Gaussian probability density function at non-zero and zero temperatures. The low temperature negative entropy controversy, a result of the replica symmetry approach, has been partly remedied in the current study, leading to a less negative result. In addition, the present system possesses two successive spin glass phase transitions with characteristic temperatures.
Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore
2014-04-01
Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype by environment interaction by adding random variety effects, and includes repeated measures in time following a constant, linear or quadratic pattern in time, possibly with some form of autocorrelation. The model also allows a set of reference varieties to be added to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
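A small sketch in the spirit of that framework is shown below, assuming a randomized block layout with two treatments and zero-inflated negative binomial counts; the parameter names and values are illustrative and are not those of the published model.

```python
# Illustrative generator: zero-inflated negative binomial counts for a
# randomized block field trial comparing a GM plant with its comparator.
import numpy as np

def simulate_counts(n_blocks=4, reps_per_block=2, mean_gmo=8.0, mean_comparator=10.0,
                    dispersion=1.5, p_extra_zero=0.2, block_sd=0.3, seed=0):
    rng = np.random.default_rng(seed)
    rows = []
    for b in range(n_blocks):
        block_effect = np.exp(rng.normal(0.0, block_sd))       # multiplicative block effect
        for treatment, mu in (("GMO", mean_gmo), ("comparator", mean_comparator)):
            for _ in range(reps_per_block):
                m = mu * block_effect
                # negative binomial with mean m and dispersion k via p = k / (k + m)
                count = rng.negative_binomial(dispersion, dispersion / (dispersion + m))
                if rng.random() < p_extra_zero:                 # excess zeros
                    count = 0
                rows.append((b, treatment, count))
    return rows

for row in simulate_counts()[:6]:
    print(row)
```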
Five-Year-Old Cottonwood Plantation on a Clay Site: Growth, Yield, and Soil Properties
R. M. Krinard; H. E. Kennedy
1980-01-01
A random sample of Stoneville select cottonwood (Populus deltoides Bartr.) clones planted on recent old-field clay soils at 12- by 12-foot spacing averaged 75-percent survival after five years. The growth and yield were about half those expected from planted cottonwood on medium-textured soils. Soil moisture analysis showed more height growth in years...
Robinson, Sean; Nevalainen, Jaakko; Pinna, Guillaume; Campalans, Anna; Radicella, J. Pablo; Guyon, Laurent
2017-01-01
Motivation: Incorporating gene interaction data into the identification of ‘hit’ genes in genomic experiments is a well-established approach leveraging the ‘guilt by association’ assumption to obtain a network based hit list of functionally related genes. We aim to develop a method to allow for multivariate gene scores and multiple hit labels in order to extend the analysis of genomic screening data within such an approach. Results: We propose a Markov random field-based method to achieve our aim and show that the particular advantages of our method compared with those currently used lead to new insights in previously analysed data as well as for our own motivating data. Our method additionally achieves the best performance in an independent simulation experiment. The real data applications we consider comprise a survival analysis and differential expression experiment and a cell-based RNA interference functional screen. Availability and implementation: We provide all of the data and code related to the results in the paper. Contact: sean.j.robinson@utu.fi or laurent.guyon@cea.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28881978
Distributed memory parallel Markov random fields using graph partitioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heinemann, C.; Perciano, T.; Ushizima, D.
Markov random fields (MRF) based algorithms have attracted a large amount of interest in image analysis due to their ability to exploit contextual information about data. Image data generated by experimental facilities, though, continues to grow larger and more complex, making it more difficult to analyze in a reasonable amount of time. Applying image processing algorithms to large datasets requires alternative approaches to circumvent performance problems. Aiming to provide scientists with a new tool to recover valuable information from such datasets, we developed a general purpose distributed memory parallel MRF-based image analysis framework (MPI-PMRF). MPI-PMRF overcomes performance and memory limitations by distributing data and computations across processors. The proposed approach was successfully tested with synthetic and experimental datasets. Additionally, the performance of the MPI-PMRF framework is analyzed through a detailed scalability study. We show that a performance increase is obtained while maintaining an accuracy of the segmentation results higher than 98%. The contributions of this paper are: (a) development of a distributed memory MRF framework; (b) measurement of the performance increase of the proposed approach; (c) verification of segmentation accuracy in both synthetic and experimental, real-world datasets.
Conditional random fields for pattern recognition applied to structured data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burr, Tom; Skurikhin, Alexei
In order to predict labels from an output domain, Y, pattern recognition is used to gather measurements from an input domain, X. Image analysis is one setting where one might want to infer whether a pixel patch contains an object that is “manmade” (such as a building) or “natural” (such as a tree). Suppose the label for a pixel patch is “manmade”; if the label for a nearby pixel patch is then more likely to be “manmade” there is structure in the output domain that can be exploited to improve pattern recognition performance. Modeling P(X) is difficult because features between parts of the model are often correlated. Thus, conditional random fields (CRFs) model structured data using the conditional distribution P(Y|X = x), without specifying a model for P(X), and are well suited for applications with dependent features. Our paper has two parts. First, we overview CRFs and their application to pattern recognition in structured problems. Our primary examples are image analysis applications in which there is dependence among samples (pixel patches) in the output domain. Second, we identify research topics and present numerical examples.
Pécot, Thierry; Bouthemy, Patrick; Boulanger, Jérôme; Chessel, Anatole; Bardin, Sabine; Salamero, Jean; Kervrann, Charles
2015-02-01
Image analysis applied to fluorescence live cell microscopy has become a key tool in molecular biology, since it enables biological processes to be characterized in space and time at the subcellular level. In fluorescence microscopy imaging, the moving tagged structures of interest, such as vesicles, appear as bright spots over a static or nonstatic background. In this paper, we consider the problem of vesicle segmentation and time-varying background estimation at the cellular scale. The main idea is to formulate the joint segmentation-estimation problem in the general conditional random field framework. Furthermore, segmentation of vesicles and background estimation are alternately performed by energy minimization using a min cut-max flow algorithm. The proposed approach relies on a detection measure computed from intensity contrasts between neighboring blocks in fluorescence microscopy images. This approach permits analysis of either 2D + time or 3D + time data. We demonstrate the performance of the so-called C-CRAFT through an experimental comparison with the state-of-the-art methods in fluorescence video-microscopy. We also use this method to characterize the spatial and temporal distribution of Rab6 transport carriers at the cell periphery for two different specific adhesion geometries.
Charged-particle motion in multidimensional magnetic-field turbulence
NASA Technical Reports Server (NTRS)
Giacalone, J.; Jokipii, J. R.
1994-01-01
We present a new analysis of the fundamental physics of charged-particle motion in a turbulent magnetic field using a numerical simulation. The magnetic field fluctuations are taken to be static and to have a power spectrum which is Kolmogorov. The charged particles are treated as test particles. It is shown that when the field turbulence is independent of one coordinate (i.e., k lies in a plane), the motion of these particles across the magnetic field is essentially zero, as required by theory. Consequently, the only motion across the average magnetic field direction that is allowed is that due to field-line random walk. On the other hand, when a fully three-dimensional realization of the turbulence is considered, the particles readily cross the field. Transport coefficients both along and across the ambient magnetic field are computed. This scheme provides a direct computation of the Fokker-Planck coefficients based on the motions of individual particles, and allows for comparison with analytic theory.
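A minimal test-particle sketch in the spirit of such simulations is given below, assuming a mean field along z plus a single static transverse fluctuation and a Boris-type rotation step; the field amplitudes, charge-to-mass ratio and step size are arbitrary illustrative values rather than those used in the paper.

```python
# Illustrative test-particle integrator: Lorentz force in a static magnetic
# field using the Boris rotation (no electric field), which conserves |v|.
import numpy as np

def magnetic_field(pos, b0=1.0, db=0.3, k=2.0 * np.pi / 10.0):
    # mean field along z plus one static sinusoidal fluctuation
    # (a stand-in for a full turbulent spectrum)
    return np.array([db * np.sin(k * pos[2]), 0.0, b0])

def boris_push(pos, vel, q_over_m=1.0, dt=0.01, n_steps=20000):
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        t = 0.5 * q_over_m * dt * magnetic_field(pos)
        s = 2.0 * t / (1.0 + np.dot(t, t))
        v_prime = vel + np.cross(vel, t)
        vel = vel + np.cross(v_prime, s)        # pure rotation of the velocity
        pos = pos + vel * dt
        traj[i] = pos
    return traj

traj = boris_push(np.zeros(3), np.array([1.0, 0.0, 0.5]))
print("cross-field displacement:", traj[-1, :2])
```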
Evaluation of random errors in Williams’ series coefficients obtained with digital image correlation
NASA Astrophysics Data System (ADS)
Lychak, Oleh V.; Holyns'kiy, Ivan S.
2016-03-01
The use of the Williams’ series parameters for fracture analysis requires valid information about their error values. The aim of this investigation is the development of a method for estimating the standard deviation of the random errors of the Williams’ series parameters obtained from the measured components of the stress field. A criterion for choosing the optimal number of terms in the truncated Williams’ series, allowing derivation of the parameters with minimal errors, is also proposed. The method was used to evaluate the Williams’ parameters obtained from data measured by the digital image correlation technique in a three-point bending test.
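The error-propagation step can be illustrated generically, as in the hedged sketch below: a truncated series of basis terms is fitted to noisy field data by linear least squares, and the standard deviation of each coefficient is read from the parameter covariance matrix. The polynomial-type basis stands in for the actual Williams crack-tip terms, and all numbers are invented.

```python
# Generic sketch: least-squares fit of a truncated series and the standard
# deviations of its coefficients from the parameter covariance matrix.
import numpy as np

rng = np.random.default_rng(3)
r = np.linspace(0.1, 1.0, 400)                        # measurement locations
true_coeffs = np.array([2.0, -0.8, 0.3])
basis = np.column_stack([r ** (k / 2.0) for k in range(1, len(true_coeffs) + 1)])
data = basis @ true_coeffs + rng.normal(0.0, 0.05, r.size)   # "measured" field + noise

coeffs, residuals, *_ = np.linalg.lstsq(basis, data, rcond=None)
dof = r.size - len(coeffs)
sigma2 = residuals[0] / dof                           # residual variance estimate
cov = sigma2 * np.linalg.inv(basis.T @ basis)         # parameter covariance matrix
print("coefficients:     ", np.round(coeffs, 3))
print("std deviations:   ", np.round(np.sqrt(np.diag(cov)), 4))
```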
Trojak, Benoit; Sauvaget, Anne; Fecteau, Shirley; Lalanne, Laurence; Chauvet-Gelinier, Jean-Christophe; Koch, Sonja; Bulteau, Samuel; Zullino, Daniele; Achab, Sophia
2017-01-01
Non-invasive brain stimulation (NIBS) might be a new approach to treat substance use disorders (SUD). A systematic review and critical analysis was performed to identify potential therapeutic effects of NIBS on addictions. A search of the Medline database was conducted for randomized sham-controlled trials using NIBS in the field of addiction and published until August 2016. Twenty-six studies in various SUD met the inclusion criteria. Converging evidence indicates that NIBS might be a promising means to treat patients with alcohol and tobacco use disorders, by acting on craving reduction and other mechanisms such as improvement in cognitive dysfunctions.
The Multi-Orientable Random Tensor Model, a Review
NASA Astrophysics Data System (ADS)
Tanasa, Adrian
2016-06-01
After its introduction (initially within a group field theory framework) in [Tanasa A., J. Phys. A: Math. Theor. 45 (2012), 165401, 19 pages, arXiv:1109.0694], the multi-orientable (MO) tensor model grew over the last years into a solid alternative of the celebrated colored (and colored-like) random tensor model. In this paper we review the most important results of the study of this MO model: the implementation of the 1/N expansion and of the large N limit (N being the size of the tensor), the combinatorial analysis of the various terms of this expansion and finally, the recent implementation of a double scaling limit.
Theoretical and observational analysis of spacecraft fields
NASA Technical Reports Server (NTRS)
Neubauer, F. M.; Schatten, K. H.
1972-01-01
In order to investigate the nondipolar contributions of spacecraft magnetic fields a simple magnetic field model is proposed. This model consists of randomly oriented dipoles in a given volume. Two sets of formulas are presented which give the rms-multipole field components, for isotropic orientations of the dipoles at given positions and for isotropic orientations of the dipoles distributed uniformly throughout a cube or sphere. The statistical results for an 8 cu m cube together with individual examples computed numerically show the following features: Beyond about 2 to 3 m distance from the center of the cube, the field is dominated by an equivalent dipole. The magnitude of the magnetic moment of the dipolar part is approximated by an expression for equal magnetic moments or generally by the Pythagorean sum of the dipole moments. The radial component is generally greater than either of the transverse components for the dipole portion as well as for the nondipolar field contributions.
Enhancement of Electron Acceleration in Laser Wakefields by Random Fields
NASA Astrophysics Data System (ADS)
Tataronis, J. A.; Petržílka, V.
1999-11-01
There is increasing evidence that intense laser pulses can accelerate electrons to high energies. The energy appears to increase with the distance over which the electrons are accelerated. This is difficult to explain by electron trapping in a single wakefield wave.^1 We demonstrate that enhanced electron acceleration can arise in inhomogeneous laser wakefields through the effects of spontaneously excited random fields. This acceleration mechanism is analogous to fast electron production by random fields near rf antennae in fusion devices and helicon plasma sources.^2 Electron acceleration in a transverse laser wave due to random field effects was recently found.^3 In the present study we solve numerically the governing equations of an ensemble of test electrons in a longitudinal electric wakefield perturbed by random fields. Supported by the Czech grant IGA A1043701 and the U.S. DOE under grant No. DE-FG02-97ER54398. 1. A. Pukhov and J. Meyer-ter-Vehn, in Superstrong Fields in Plasmas, AIP Conf. Proc. 426, p. 93 (1997). 2. V. Petržílka, J. A. Tataronis, et al., in Proc. Varenna - Lausanne Fusion Theory Workshop, p. 95 (1998). 3. J. Meyer-ter-Vehn and Z. M. Sheng, Phys. Plasmas 6, 641 (1999).
Effective pore-scale dispersion upscaling with a correlated continuous time random walk approach
NASA Astrophysics Data System (ADS)
Le Borgne, T.; Bolster, D.; Dentz, M.; de Anna, P.; Tartakovsky, A.
2011-12-01
We investigate the upscaling of dispersion from a pore-scale analysis of Lagrangian velocities. A key challenge in the upscaling procedure is to relate the temporal evolution of spreading to the pore-scale velocity field properties. We test the hypothesis that one can represent Lagrangian velocities at the pore scale as a Markov process in space. The resulting effective transport model is a continuous time random walk (CTRW) characterized by a correlated random time increment, here denoted as correlated CTRW. We consider a simplified sinusoidal wavy channel model as well as a more complex heterogeneous pore space. For both systems, the predictions of the correlated CTRW model, with parameters defined from the velocity field properties (both distribution and correlation), are found to be in good agreement with results from direct pore-scale simulations over preasymptotic and asymptotic times. In this framework, the nontrivial dependence of dispersion on the pore boundary fluctuations is shown to be related to the competition between distribution and correlation effects. In particular, explicit inclusion of spatial velocity correlation in the effective CTRW model is found to be important to represent incomplete mixing in the pore throats.
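A compact, hypothetical sketch of the correlated CTRW idea is given below: particle velocities evolve as a Markov chain over discrete velocity classes along fixed-length spatial steps, and each step contributes a random time increment dx/v. The velocity classes and transition matrix are invented for illustration, not derived from a pore-scale flow field.

```python
# Illustrative correlated CTRW: a spatial Markov chain over velocity classes.
import numpy as np

rng = np.random.default_rng(0)
v_classes = np.array([0.05, 0.2, 1.0, 3.0])   # velocity classes (arbitrary units)
# Transition probabilities between classes per spatial step (rows sum to 1);
# the heavy diagonal encodes spatial velocity correlation.
P = np.array([[0.70, 0.20, 0.08, 0.02],
              [0.20, 0.60, 0.15, 0.05],
              [0.05, 0.15, 0.60, 0.20],
              [0.02, 0.08, 0.20, 0.70]])
dx, n_steps, n_particles = 1.0, 200, 2000

arrival_times = np.zeros(n_particles)
for p in range(n_particles):
    state = rng.integers(len(v_classes))
    t = 0.0
    for _ in range(n_steps):
        t += dx / v_classes[state]             # correlated random time increment
        state = rng.choice(len(v_classes), p=P[state])
    arrival_times[p] = t

print("mean and variance of arrival time:", arrival_times.mean(), arrival_times.var())
```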
Strand-seq: a unifying tool for studies of chromosome segregation.
Falconer, Ester; Lansdorp, Peter M
2013-01-01
Non-random segregation of sister chromatids has been implicated to help specify daughter cell fate (the Silent Sister Hypothesis [1]) or to protect the genome of long-lived stem cells (the Immortal Strand Hypothesis [2]). The idea that sister chromatids are non-randomly segregated into specific daughter cells is only marginally supported by data from sporadic and often contradictory studies. As a result, the field has moved forward rather slowly. The advent of being able to directly label and differentiate sister chromatids in vivo using fluorescence in situ hybridization [3] was a significant advance for such studies. However, this approach is limited by the need for large tracts of unidirectional repeats on chromosomes and the reliance on quantitative imaging of fluorescent probes and rigorous statistical analysis to discern between the two competing hypotheses. A novel method called Strand-seq, which uses next-generation sequencing to assay sister chromatid inheritance patterns independently for each chromosome [4], offers a comprehensive approach to test for non-random segregation. In addition, Strand-seq enables studies on the deposition of chromatin marks in relation to DNA replication. This method is expected to help unify the field by testing previous claims of non-random segregation in an unbiased way in many model systems in vitro and in vivo.
Topological signatures of interstellar magnetic fields - I. Betti numbers and persistence diagrams
NASA Astrophysics Data System (ADS)
Makarenko, Irina; Shukurov, Anvar; Henderson, Robin; Rodrigues, Luiz F. S.; Bushby, Paul; Fletcher, Andrew
2018-04-01
The interstellar medium (ISM) is a magnetized system in which transonic or supersonic turbulence is driven by supernova explosions. This leads to the production of intermittent, filamentary structures in the ISM gas density, whilst the associated dynamo action also produces intermittent magnetic fields. The traditional theory of random functions, restricted to second-order statistical moments (or power spectra), does not adequately describe such systems. We apply topological data analysis (TDA), sensitive to all statistical moments and independent of the assumption of Gaussian statistics, to the gas density fluctuations in a magnetohydrodynamic simulation of the multiphase ISM. This simulation admits dynamo action, so produces physically realistic magnetic fields. The topology of the gas distribution, with and without magnetic fields, is quantified in terms of Betti numbers and persistence diagrams. Like the more standard correlation analysis, TDA shows that the ISM gas density is sensitive to the presence of magnetic fields. However, TDA gives us important additional information that cannot be obtained from correlation functions. In particular, the Betti numbers per correlation cell are shown to be physically informative. Magnetic fields make the ISM more homogeneous, reducing the abundance of both isolated gas clouds and cavities, with a stronger effect on the cavities. Remarkably, the modification of the gas distribution by magnetic fields is captured by the Betti numbers even in regions more than 300 pc from the mid-plane, where the magnetic field is weaker and correlation analysis fails to detect any signatures of magnetic effects.
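As a toy illustration of how Betti-number summaries respond to a density field, the sketch below counts connected clumps (a beta_0 proxy) and enclosed cavities (a beta_1 proxy) of superlevel sets of a smoothed random field using connected-component labelling; the field and thresholds are assumptions, and a full TDA treatment would use persistent homology rather than this shortcut.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)

# Assumed stand-in for a 2-D gas-density slice: smoothed Gaussian noise.
field = ndimage.gaussian_filter(rng.normal(size=(256, 256)), sigma=4)

def betti_proxies(field, level):
    """Crude Betti-number proxies for the superlevel set {field >= level}:
    b0 = number of connected clumps; b1 ~ number of cavities, estimated as
    components of the complement that do not touch the image border."""
    above = field >= level
    b0 = ndimage.label(above)[1]
    below_lab, n_below = ndimage.label(~above)
    border = np.unique(np.concatenate([below_lab[0, :], below_lab[-1, :],
                                       below_lab[:, 0], below_lab[:, -1]]))
    b1 = n_below - np.count_nonzero(border)
    return b0, b1

for level in np.linspace(field.min(), field.max(), 6)[1:-1]:
    b0, b1 = betti_proxies(field, level)
    print(f"level = {level:+.2f}   clumps (b0) = {b0:4d}   cavities (b1) = {b1:4d}")
```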
A Novel Weighted Kernel PCA-Based Method for Optimization and Uncertainty Quantification
NASA Astrophysics Data System (ADS)
Thimmisetty, C.; Talbot, C.; Chen, X.; Tong, C. H.
2016-12-01
It has been demonstrated that machine learning methods can be successfully applied to uncertainty quantification for geophysical systems through the use of the adjoint method coupled with kernel PCA-based optimization. In addition, it has been shown through weighted linear PCA how optimization with respect to both observation weights and feature space control variables can accelerate convergence of such methods. Linear machine learning methods, however, are inherently limited in their ability to represent features of non-Gaussian stochastic random fields, as they are based on only the first two statistical moments of the original data. Nonlinear spatial relationships and multipoint statistics leading to the tortuosity characteristic of channelized media, for example, are captured only to a limited extent by linear PCA. With the aim of coupling the kernel-based and weighted methods discussed, we present a novel mathematical formulation of kernel PCA, Weighted Kernel Principal Component Analysis (WKPCA), that both captures nonlinear relationships and incorporates the attribution of significance levels to different realizations of the stochastic random field of interest. We also demonstrate how new instantiations retaining defining characteristics of the random field can be generated using Bayesian methods. In particular, we present a novel WKPCA-based optimization method that minimizes a given objective function with respect to both feature space random variables and observation weights through which optimal snapshot significance levels and optimal features are learned. We showcase how WKPCA can be applied to nonlinear optimal control problems involving channelized media, and in particular demonstrate an application of the method to learning the spatial distribution of material parameter values in the context of linear elasticity, and discuss further extensions of the method to stochastic inversion.
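A minimal numpy sketch of one way to weight a kernel PCA by snapshot significance levels is given below; the RBF kernel, the symmetric weighting of the kernel matrix, and the toy ensemble are assumptions chosen for illustration, not the WKPCA formulation of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed toy ensemble: n realizations of a 1-D "random field" on d grid points.
n, d = 50, 100
X = np.cumsum(rng.normal(size=(n, d)), axis=1)   # crude stand-in for correlated fields
w = rng.dirichlet(np.ones(n)) * n                # snapshot significance weights (assumed)

def rbf_kernel(X, gamma):
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

K = rbf_kernel(X, gamma=1.0 / d)

# One simple way to attach observation weights: symmetric scaling of the kernel.
W_sqrt = np.diag(np.sqrt(w))
Kw = W_sqrt @ K @ W_sqrt

# Centre in feature space and extract the leading weighted kernel principal components.
ones = np.full((n, n), 1.0 / n)
Kc = Kw - ones @ Kw - Kw @ ones + ones @ Kw @ ones
eigval, eigvec = np.linalg.eigh(Kc)
order = np.argsort(eigval)[::-1]
print("leading weighted kernel-PCA eigenvalues:", eigval[order][:5])
```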
Recognition and processing of randomly fluctuating electric signals by Na,K-ATPase.
Xie, T. D.; Marszalek, P.; Chen, Y. D.; Tsong, T. Y.
1994-01-01
Previous work has shown that the Na,K-ATPase of human erythrocytes can extract free energy from sinusoidal electric fields to pump cations up their respective concentration gradients. Because a regularly oscillating waveform is not a feature of the transmembrane electric potential of cells, questions have been raised as to whether these observed effects are biologically relevant. Here we show that a random-telegraph fluctuating electric field (RTF) consisting of alternating square electric pulses with random lifetimes can also stimulate the Rb(+)-pumping mode of the Na,K-ATPase. The net RTF-stimulated, ouabain-sensitive Rb+ pumping was monitored with 86Rb+. The tracer-measured Rb+ influx exhibited frequency and amplitude dependencies that peaked at a mean frequency of 1.0 kHz and an amplitude of 20 V/cm. At 4 degrees C, the maximal pumping activity under these optimal conditions was 28 Rb+/RBC-hr, which is approximately 50% higher than that obtained with the sinusoidal electric field. These findings indicate that the Na,K-ATPase can recognize an electric signal, either regularly oscillatory or randomly fluctuating, for energy coupling, with high fidelity. The use of RTF for activation also allowed a quantitative theoretical analysis of the kinetics of a membrane transport model of any complexity according to the theory of electroconformational coupling (ECC) by the diagram methods. A four-state ECC model was shown to produce the amplitude and frequency windows of the Rb(+)-pumping if the free energy of interaction of the transporter with the membrane potential included a nonlinear quadratic term. Kinetic constants for the ECC model have been derived. These results indicate that ECC is a plausible mechanism for the recognition and processing of electric signals by proteins of the cell membrane. PMID:7811939
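The following sketch generates a random-telegraph fluctuating field of the kind described: alternating square pulses of fixed amplitude with exponentially distributed lifetimes. The relation assumed between mean lifetime and "mean frequency" is one common convention and may differ from the paper's definition.

```python
import numpy as np

rng = np.random.default_rng(3)

def random_telegraph_field(mean_freq_hz, amplitude, duration_s, dt):
    """Alternating +/- amplitude square pulses with exponentially distributed
    lifetimes (mean lifetime assumed to be 1 / (2 * mean_freq))."""
    t = np.arange(0.0, duration_s, dt)
    field = np.empty_like(t)
    sign, t_next, i = 1.0, 0.0, 0
    mean_lifetime = 1.0 / (2.0 * mean_freq_hz)
    while i < t.size:
        t_next += rng.exponential(mean_lifetime)
        j = min(t.size, int(np.ceil(t_next / dt)))
        field[i:j] = sign * amplitude
        sign, i = -sign, j
    return t, field

# Parameters echoing the reported optimum: 1 kHz mean frequency, 20 V/cm amplitude.
t, E = random_telegraph_field(1.0e3, 20.0, duration_s=0.05, dt=1.0e-6)
print("fraction of time at +20 V/cm:", np.mean(E > 0))
```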
Random isotropic one-dimensional XY-model
NASA Astrophysics Data System (ADS)
Gonçalves, L. L.; Vieira, A. P.
1998-01-01
The 1D isotropic s = ½ XY model (N sites), with random exchange interactions in a transverse random field, is considered. The random variables satisfy bimodal quenched distributions. The solution is obtained by using the Jordan-Wigner fermionization and a canonical transformation, reducing the problem to diagonalizing an N × N matrix, corresponding to a system of N noninteracting fermions. The calculations are performed numerically for N = 1000, and the field-induced magnetization at T = 0 is obtained by averaging the results over the different samples. For the dilute case, in the uniform field limit, the magnetization exhibits various discontinuities, which are a consequence of the existence of disconnected finite clusters distributed along the chain. Also in this limit, for finite exchange constants J_A and J_B, as the probability of J_A varies from one to zero, the saturation field is seen to vary from Γ_A to Γ_B, where Γ_A (Γ_B) is the value of the saturation field for the pure case with exchange constant equal to J_A (J_B).
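A hedged numerical sketch of the free-fermion calculation implied above (Jordan-Wigner fermionization reducing the random isotropic XY chain to an N × N single-particle matrix) is shown below; the bimodal parameter values and the sign/normalization conventions are assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(4)

N = 1000
# Bimodal quenched disorder (illustrative values, not those of the paper):
J = rng.choice([0.5, 1.0], size=N - 1)        # exchange couplings
Gamma = rng.choice([0.0, 1.0], size=N)        # transverse fields

# Single-particle matrix of the free-fermion problem obtained after the
# Jordan-Wigner transformation (sign and normalization conventions assumed).
M = np.diag(-Gamma) + np.diag(J / 2.0, 1) + np.diag(J / 2.0, -1)
eps = np.linalg.eigvalsh(M)

# At T = 0 all negative-energy modes are occupied; with S^z_i = n_i - 1/2 the
# field-induced magnetization per site follows from the occupation fraction.
m_z = np.count_nonzero(eps < 0.0) / N - 0.5
print("transverse magnetization per site (one sample):", m_z)
```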
Njenga, M. Kariuki; Njagi, Leonard; Thumbi, S. Mwangi; Kahariri, Samuel; Githinji, Jane; Omondi, Eunice; Baden, Amy; Murithi, Mbabu; Paweska, Janusz; Ithondeka, Peter M.; Ngeiywa, Kisa J.; Dungu, Baptiste; Donadeu, Meritxell; Munyua, Peninah M.
2015-01-01
Background Although livestock vaccination is effective in preventing Rift Valley fever (RVF) epidemics, there are concerns about the safety and effectiveness of the only commercially available RVF Smithburn vaccine. We conducted a randomized controlled field trial to evaluate the immunogenicity and safety of the new RVF Clone 13 vaccine, recently registered in South Africa. Methods In a blinded randomized controlled field trial, 404 animals (85 cattle, 168 sheep, and 151 goats) on three farms in Kenya were divided into three groups. Group A included males and non-pregnant females that were randomized and assigned to two groups, one vaccinated with RVF Clone 13 and the other given placebo. Group B included animals in the 1st half of pregnancy, and group C animals in the 2nd half of pregnancy; these were also randomized and either vaccinated or given placebo. Animals were monitored for one year and virus antibody titers assessed on days 14, 28, 56, 183 and 365. Results In vaccinated goats (N = 72), 72% developed anti-RVF virus IgM antibodies and 97% neutralizing IgG antibodies. In vaccinated sheep (N = 77), 84% developed IgM and 91% neutralizing IgG antibodies. Vaccinated cattle (N = 42) did not develop IgM antibodies but 67% developed neutralizing IgG antibodies. At day 14 post-vaccination, the odds of being seropositive for IgG in the vaccine group were 3.6 (95% CI, 1.5 – 9.2) in cattle, 90.0 (95% CI, 25.1 – 579.2) in goats, and 40.0 (95% CI, 16.5 – 110.5) in sheep. Abortion was observed in one vaccinated goat, but histopathologic analysis did not indicate RVF virus infection. There was no evidence of teratogenicity in vaccinated or placebo animals. Conclusions The results suggest the RVF Clone 13 vaccine is safe to use and has high (>90%) immunogenicity in sheep and goats but moderate (>65%) immunogenicity in cattle. PMID:25756501
NASA Astrophysics Data System (ADS)
Mandolesi, E.; Moorkamp, M.; Jones, A. G.
2014-12-01
Most electromagnetic (EM) geophysical methods focus on the electrical conductivity of rocks and sediments to determine the geological structure of the subsurface. Electrical conductivity itself is measured in the laboratory with a wide range of instruments and techniques, and these measurements seldom return compatible results. The presence of partially-interconnected random pathways of electrically conductive materials in resistive hosts has been studied for decades, and recently with increasing interest. To understand which conduction mechanism scales from the microstructure up to field electrical conductivity measurements, two main branches of study have been pursued: the statistical probability of having a conductive pathway, and mixing laws. Several numerical approaches have been tested to understand the effects of interconnected pathways of conductors at the field scale. Usually these studies were restricted in two ways: the sources were considered constant in time (i.e., DC) and the domain was, with few exceptions, two-dimensional. We simulated the effects of time-varying EM sources on the conductivity measured on the surface of a three-dimensional randomly generated body embedded in a uniform host by using the electromagnetic induction equations. We modelled a two-phase mixture of resistive and conductive elements with the goal of comparing the conductivity measured at field scale with that of the elements constituting the random rock, and of testing how the internal structures influence the directionality of the responses. Moreover, we modelled data from randomly generated bodies characterized by coherent internal structures, to check the effect of these structures on the anisotropy of the effective conductivity. We compared these values with the electrical conductivity limits predicted by the Hashin-Shtrikman bounds and with the effective conductivity predicted by Archie's law, both in its classic form and in an updated form that takes two materials into account. The same analysis was done for both the resistive and the conductive conductivity values in the anisotropic case.
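For reference, the two benchmarks named above can be sketched directly; the snippet below evaluates the three-dimensional Hashin-Shtrikman conductivity bounds and the classic form of Archie's law for an assumed two-phase mixture (all numerical values are illustrative).

```python
import numpy as np

def hashin_shtrikman_bounds(sigma1, sigma2, f2):
    """Hashin-Shtrikman bounds on the effective conductivity of an isotropic
    two-phase mixture in 3-D (sigma1 <= sigma2, f2 = volume fraction of phase 2)."""
    f1 = 1.0 - f2
    lower = sigma1 + f2 / (1.0 / (sigma2 - sigma1) + f1 / (3.0 * sigma1))
    upper = sigma2 + f1 / (1.0 / (sigma1 - sigma2) + f2 / (3.0 * sigma2))
    return lower, upper

def archie(sigma_fluid, porosity, m=2.0):
    """Classic Archie's law, sigma_eff = sigma_fluid * phi**m (exponent m assumed)."""
    return sigma_fluid * porosity**m

sigma_host, sigma_cond = 1.0e-4, 1.0   # resistive host, conductive phase (S/m, assumed)
for f in (0.05, 0.10, 0.20):
    lo, hi = hashin_shtrikman_bounds(sigma_host, sigma_cond, f)
    print(f"f = {f:.2f}   HS bounds: [{lo:.3e}, {hi:.3e}]   Archie: {archie(sigma_cond, f):.3e}")
```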
NASA Technical Reports Server (NTRS)
Pope, L. D.; Rennison, D. C.; Wilby, E. G.
1980-01-01
The basic theoretical work required to understand sound transmission into an enclosed space (that is, one closed by the transmitting structure) is developed for random pressure fields and for harmonic (tonal) excitation. The analysis is used to predict the noise reduction of an unpressurized, unstiffened cylinder, and also the interior response of the cylinder given a tonal (plane wave) excitation. Predictions and measurements are compared and the transmission is analyzed. In addition, results for tonal (harmonic) mechanical excitation are considered.
NASA Astrophysics Data System (ADS)
Reynders, Edwin P. B.; Langley, Robin S.
2018-08-01
The hybrid deterministic-statistical energy analysis method has proven to be a versatile framework for modeling built-up vibro-acoustic systems. The stiff system components are modeled deterministically, e.g., using the finite element method, while the wave fields in the flexible components are modeled as diffuse. In the present paper, the hybrid method is extended such that not only the ensemble mean and variance of the harmonic system response can be computed, but also of the band-averaged system response. This variance represents the uncertainty that is due to the assumption of a diffuse field in the flexible components of the hybrid system. The developments start with a cross-frequency generalization of the reciprocity relationship between the total energy in a diffuse field and the cross spectrum of the blocked reverberant loading at the boundaries of that field. By making extensive use of this generalization in a first-order perturbation analysis, explicit expressions are derived for the cross-frequency and band-averaged variance of the vibrational energies in the diffuse components and for the cross-frequency and band-averaged variance of the cross spectrum of the vibro-acoustic field response of the deterministic components. These expressions are extensively validated against detailed Monte Carlo analyses of coupled plate systems in which diffuse fields are simulated by randomly distributing small point masses across the flexible components, and good agreement is found.
NASA Technical Reports Server (NTRS)
Crowe, D. R.; Henricks, W.
1983-01-01
The combined load statistics are developed by taking the acoustically induced load to be a random population, assumed to be stationary. Each element of this ensemble of acoustically induced loads is assumed to have the same power spectral density (PSD), obtained previously from a random response analysis employing the given acoustic field in the STS cargo bay as a stationary random excitation. The mechanically induced load is treated as either (1) a known deterministic transient, or (2) a nonstationary random variable of known first and second statistical moments which vary with time. A method is then shown for determining the probability that the combined load would, at any time, have a value equal to or less than a certain level. Having obtained a statistical representation of how the acoustic and mechanical loads are expected to combine, an analytical approximation for defining design levels for these loads is presented using the First Passage failure criterion.
Bayesian estimation of Karhunen–Loève expansions; A random subspace approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chowdhary, Kenny; Najm, Habib N.
2016-04-13
One of the most widely-used statistical procedures for dimensionality reduction of high dimensional random fields is Principal Component Analysis (PCA), which is based on the Karhunen-Loève expansion (KLE) of a stochastic process with finite variance. The KLE is analogous to a Fourier series expansion for a random process, where the goal is to find an orthogonal transformation for the data such that the projection of the data onto this orthogonal subspace is optimal in the L2 sense, i.e., which minimizes the mean square error. In practice, this orthogonal transformation is determined by performing an SVD (Singular Value Decomposition) on the sample covariance matrix or on the data matrix itself. Sampling error is typically ignored when quantifying the principal components, or, equivalently, basis functions of the KLE. Furthermore, it is exacerbated when the sample size is much smaller than the dimension of the random field. In this paper, we introduce a Bayesian KLE procedure, allowing one to obtain a probabilistic model on the principal components, which can account for inaccuracies due to limited sample size. The probabilistic model is built via Bayesian inference, from which the posterior becomes the matrix Bingham density over the space of orthonormal matrices. We use a modified Gibbs sampling procedure to sample on this space and then build probabilistic Karhunen-Loève expansions over random subspaces to obtain a set of low-dimensional surrogates of the stochastic process. We illustrate this probabilistic procedure with a finite dimensional stochastic process inspired by Brownian motion.
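The non-Bayesian baseline that this work builds on, an empirical KLE/PCA obtained from the SVD of the data matrix, can be sketched in a few lines; the Brownian-motion-like ensemble and the truncation level below are assumptions, and the matrix-Bingham posterior and Gibbs sampler of the paper are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed example: n sample paths of a Brownian-motion-like process on d points,
# echoing the finite-dimensional illustration mentioned in the abstract.
n, d = 30, 200
X = np.cumsum(rng.normal(scale=np.sqrt(1.0 / d), size=(n, d)), axis=1)

# Empirical KLE / PCA: centre the data and take the SVD of the data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

eigvals = s**2 / (n - 1)                  # eigenvalues of the sample covariance
print("variance captured by first 3 modes:", eigvals[:3].sum() / eigvals.sum())

# Low-dimensional surrogate: truncate to k modes and reconstruct.
k = 3
X_hat = X.mean(axis=0) + (Xc @ Vt[:k].T) @ Vt[:k]
print("relative reconstruction error:",
      np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```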
Towards rigorous analysis of the Levitov-Mirlin-Evers recursion
NASA Astrophysics Data System (ADS)
Fyodorov, Y. V.; Kupiainen, A.; Webb, C.
2016-12-01
This paper aims to develop a rigorous asymptotic analysis of an approximate renormalization group recursion for the inverse participation ratios P_q of critical power-law random band matrices. The recursion goes back to the work by Mirlin and Evers (2000 Phys. Rev. B 62 7920) and earlier works by Levitov (1990 Phys. Rev. Lett. 64 547, 1999 Ann. Phys. 8 697-706) and aims to describe the ensuing multifractality of the eigenvectors of such matrices. We point out both similarities and dissimilarities between the LME recursion and those appearing in the theory of multiplicative cascades and branching random walks, and show that the methods developed in those fields can be adapted to the present case. In particular, the LME recursion is shown to exhibit a phase transition, which we expect is a freezing transition, where the role of temperature is played by the exponent q. However, the LME recursion has features that make its rigorous analysis considerably harder, and we point out several open problems for further study.
Engineering Design Handbook. Army Weapon Systems Analysis. Part 2
1979-10-01
[Table-of-contents fragments: Experimental Design; Results of the ASARS lIX Simulations; Latin...] ...sciences and human factors engineering fields utilizing experimental methodology and multi-variable statistical techniques drawn from experimental... randomly to grenades for the test design. The nine experimental types of hand grenades (first nine in Table 33-2) had a "pip" on their spherical...
ERIC Educational Resources Information Center
Narli, Serkan; Yorek, Nurettin; Sahin, Mehmet; Usak, Muhammet
2010-01-01
This study investigates the possibility of analyzing educational data using the theory of rough sets which is mostly employed in the fields of data analysis and data mining. Data were collected using an open-ended conceptual understanding test of the living things administered to first-year high school students. The responses of randomly selected…
ERIC Educational Resources Information Center
Stamovlasis, Dimitrios; Tsitsipis, Georgios; Papageorgiou, George
2010-01-01
This work uses the concepts and tools of complexity theory to examine the effect of logical thinking and two cognitive styles, such as, the degree of field dependence/independence and the convergent/divergent thinking on students' understanding of the structure of matter. Students were categorized according to the model they adopted for the…
Xiaoqian Sun; Zhuoqiong He; John Kabrick
2008-01-01
This paper presents a Bayesian spatial method for analysing the site index data from the Missouri Ozark Forest Ecosystem Project (MOFEP). Based on ecological background and availability, we select three variables, the aspect class, the soil depth and the land type association as covariates for analysis. To allow great flexibility of the smoothness of the random field,...
Predictive uncertainty analysis of a saltwater intrusion model using null-space Monte Carlo
Herckenrath, Daan; Langevin, Christian D.; Doherty, John
2011-01-01
Because of the extensive computational burden and perhaps a lack of awareness of existing methods, rigorous uncertainty analyses are rarely conducted for variable-density flow and transport models. For this reason, a recently developed null-space Monte Carlo (NSMC) method for quantifying prediction uncertainty was tested for a synthetic saltwater intrusion model patterned after the Henry problem. Saltwater intrusion caused by a reduction in fresh groundwater discharge was simulated for 1000 randomly generated hydraulic conductivity distributions, representing a mildly heterogeneous aquifer. From these 1000 simulations, the hydraulic conductivity distribution giving rise to the most extreme case of saltwater intrusion was selected and was assumed to represent the "true" system. Head and salinity values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. The NSMC method was used to calculate 1000 calibration-constrained parameter fields. If the dimensionality of the solution space was set appropriately, the estimated uncertainty range from the NSMC analysis encompassed the truth. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. Reducing the dimensionality of the null-space for the processing of the random parameter sets did not result in any significant gains in efficiency and compromised the ability of the NSMC method to encompass the true prediction value. The addition of intrapilot point heterogeneity to the NSMC process was also tested. According to a variogram comparison, this provided the same scale of heterogeneity that was used to generate the truth. However, incorporation of intrapilot point variability did not make a noticeable difference to the uncertainty of the prediction. With this higher level of heterogeneity, however, the computational burden of generating calibration-constrained parameter fields approximately doubled. Predictive uncertainty variance computed through the NSMC method was compared with that computed through linear analysis. The results were in good agreement, with the NSMC method estimate showing a slightly smaller range of prediction uncertainty than was calculated by the linear method. Copyright 2011 by the American Geophysical Union.
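A schematic of the null-space projection step at the heart of NSMC, in a linearized toy setting with an assumed Jacobian and calibrated parameter field, is sketched below; it is not the implementation used in the study.

```python
import numpy as np

rng = np.random.default_rng(6)

# Assumed toy setting: a linear(ized) model d = J @ p with more parameters than
# observations, a calibrated parameter field p_cal, and random candidate fields.
n_obs, n_par = 20, 100
J = rng.normal(size=(n_obs, n_par))          # sensitivity (Jacobian) matrix
p_cal = rng.normal(size=n_par)               # calibrated parameter field

# Null-space basis from the SVD of J: directions the observations cannot see.
U, s, Vt = np.linalg.svd(J, full_matrices=True)
n_solution = np.count_nonzero(s > 1e-8)      # dimensionality of the solution space
V_null = Vt[n_solution:].T                   # columns span the null space

def nsmc_field(p_random):
    """Project the difference between a random field and the calibrated field
    onto the null space, so the result honours the calibration data (to first
    order) while still varying in the unconstrained directions."""
    delta = p_random - p_cal
    return p_cal + V_null @ (V_null.T @ delta)

fields = [nsmc_field(rng.normal(size=n_par)) for _ in range(1000)]
misfits = [np.linalg.norm(J @ (f - p_cal)) for f in fields]
print("max data misfit over 1000 NSMC fields:", max(misfits))  # ~0 by construction
```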
NASA Astrophysics Data System (ADS)
Musenge, Eustasius; Chirwa, Tobias Freeman; Kahn, Kathleen; Vounatsou, Penelope
2013-06-01
Longitudinal mortality data with few deaths usually have problems of zero-inflation. This paper presents and applies two Bayesian models which cater for zero-inflation, spatial and temporal random effects. To reduce the computational burden experienced when a large number of geo-locations are treated as a Gaussian field (GF) we transformed the field to a Gaussian Markov Random Fields (GMRF) by triangulation. We then modelled the spatial random effects using the Stochastic Partial Differential Equations (SPDEs). Inference was done using a computationally efficient alternative to Markov chain Monte Carlo (MCMC) called Integrated Nested Laplace Approximation (INLA) suited for GMRF. The models were applied to data from 71,057 children aged 0 to under 10 years from rural north-east South Africa living in 15,703 households over the years 1992-2010. We found protective effects on HIV/TB mortality due to greater birth weight, older age and more antenatal clinic visits during pregnancy (adjusted RR (95% CI)): 0.73(0.53;0.99), 0.18(0.14;0.22) and 0.96(0.94;0.97) respectively. Therefore childhood HIV/TB mortality could be reduced if mothers are better catered for during pregnancy as this can reduce mother-to-child transmissions and contribute to improved birth weights. The INLA and SPDE approaches are computationally good alternatives in modelling large multilevel spatiotemporal GMRF data structures.
Random Interchange of Magnetic Connectivity
NASA Astrophysics Data System (ADS)
Matthaeus, W. H.; Ruffolo, D. J.; Servidio, S.; Wan, M.; Rappazzo, A. F.
2015-12-01
Magnetic connectivity, the connection between two points along a magnetic field line, has a stochastic character associated with field lines random walking in space due to magnetic fluctuations, but connectivity can also change in time due to dynamical activity [1]. For fluctuations transverse to a strong mean field, this connectivity change can be caused by stochastic interchange due to component reconnection. The process may be understood approximately by formulating a diffusion-like Fokker-Planck coefficient [2] that is asymptotically related to the standard field line random walk. Quantitative estimates are provided for transverse magnetic field models and anisotropic models such as reduced magnetohydrodynamics. In heliospheric applications, these estimates may be useful for understanding mixing between open and closed field line regions near coronal hole boundaries, and large latitude excursions of connectivity associated with turbulence. [1] A. F. Rappazzo, W. H. Matthaeus, D. Ruffolo, S. Servidio & M. Velli, ApJL, 758, L14 (2012) [2] D. Ruffolo & W. Matthaeus, ApJ, 806, 233 (2015)
The magnetic field of the Milky Way
NASA Astrophysics Data System (ADS)
Jansson, Ronnie
The magnetic field of the Milky Way is a significant component of our Galaxy, and impacts a great variety of Galactic processes. For example, it regulates star formation, accelerates cosmic rays, transports energy and momentum, acts as a source of pressure, and obfuscates the arrival directions of ultrahigh energy cosmic rays (UHECRs). This thesis is mainly concerned with the large-scale Galactic magnetic field (GMF) and the effect it has on UHECRs. In Chapter 1 we review what is known about Galactic and extragalactic magnetic fields, their origin, the different observables of the GMF, and the ancillary data necessary to constrain astrophysical magnetic fields. Chapter 2 introduces a method to quantify the quality-of-fit between data and observables sensitive to the large-scale Galactic magnetic field. We combine WMAP5 polarized synchrotron data and rotation measures of extragalactic sources in a joint analysis to obtain best-fit parameters and confidence levels for GMF models common in the literature. None of the existing models provide a good fit in both the disk and halo regions, and in many instances the best-fit parameters are quite different from the original values. We introduce a simple model of the magnetic field in the halo that provides a much improved fit to the data. We show that some characteristics of the electron densities can already be constrained using our method, and with future data it may be possible to carry out a self-consistent analysis in which models of the GMF and electron densities are simultaneously optimized. Chapter 3 investigates the observed excess of UHECRs in the region of the sky close to the nearby radio galaxy Centaurus A. We constrain the large-scale Galactic magnetic field and the small-scale random magnetic field in the direction of Cen A, estimate the deflection of the observed UHECRs, and predict their source positions on the sky. We find that the deflections due to random fields are small compared to the deflections due to the regular field. Assuming the UHECRs are protons, we find that 4 of the published Auger events above 57 EeV are consistent with coming from Cen A. We conclude that the proposed scenarios in which most of the events within approximately 20° of Cen A come from it are unlikely, regardless of the composition of the UHECRs.
Mock, U; Dieckmann, K; Wolff, U; Knocke, T H; Pötter, R
1999-08-01
Geometrical accuracy in patient positioning can vary substantially during external radiotherapy. This study estimated the set-up accuracy during pelvic irradiation for gynecological malignancies for determination of safety margins (planning target volume, PTV). Based on electronic portal imaging devices (EPID), 25 patients undergoing 4-field pelvic irradiation for gynecological malignancies were analyzed with regard to set-up accuracy during the treatment course. Regularly acquired EPID images were used to systematically assess the systematic and random components of set-up displacements. Anatomical matching of verification and simulation images was followed by measuring corresponding distances between the central axis and anatomical features. Data analysis of set-up errors referred to the x-, y-, and z-axes. Additionally, cumulative frequencies were evaluated. A total of 50 simulation films and 313 verification images were analyzed. For the anterior-posterior (AP) beam direction, mean deviations along the x- and z-axes were 1.5 mm and -1.9 mm, respectively. Moreover, random errors of 4.8 mm (x-axis) and 3.0 mm (z-axis) were determined. Concerning the latero-lateral treatment fields, the systematic errors along the two axes were 2.9 mm (y-axis) and -2.0 mm (z-axis), and random errors of 3.8 mm and 3.5 mm were found, respectively. The cumulative frequency of misalignments ≤5 mm showed values of 75% (AP fields) and 72% (latero-lateral fields). With regard to cumulative frequencies ≤10 mm, quantification revealed values of 97% for both beam directions. During external pelvic irradiation for gynecological malignancies, EPID images acquired on a regular basis revealed acceptable set-up inaccuracies. Safety margins (PTV) of 1 cm appear to be sufficient, accounting for more than 95% of all deviations.
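The split into systematic and random set-up error components can be illustrated with one common convention (systematic = SD of per-patient mean displacements, random = RMS of per-patient SDs); the synthetic displacement data and this exact convention are assumptions, not the paper's measured values.

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed synthetic data: per-patient daily set-up displacements (mm) along one axis.
n_patients, n_fractions = 25, 12
true_offsets = rng.normal(0.0, 2.5, n_patients)            # per-patient systematic offsets
shifts = true_offsets[:, None] + rng.normal(0.0, 3.5, (n_patients, n_fractions))

patient_means = shifts.mean(axis=1)
overall_mean = patient_means.mean()                        # group mean deviation
Sigma = patient_means.std(ddof=1)                          # systematic component
sigma = np.sqrt((shifts.std(axis=1, ddof=1) ** 2).mean())  # random component

print(f"mean deviation   : {overall_mean:+.1f} mm")
print(f"systematic error : {Sigma:.1f} mm")
print(f"random error     : {sigma:.1f} mm")
print("fraction of deviations <= 5 mm:", np.mean(np.abs(shifts) <= 5.0))
```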
Zhang, Renduo; Wood, A Lynn; Enfield, Carl G; Jeong, Seung-Woo
2003-01-01
Stochastic analysis was performed to assess the effect of soil spatial variability and heterogeneity on the recovery of denser-than-water nonaqueous phase liquids (DNAPL) during surfactant-enhanced remediation. UTCHEM, a three-dimensional, multicomponent, multiphase, compositional model, was used to simulate water flow and chemical transport processes in heterogeneous soils. Soil spatial variability and heterogeneity were accounted for by considering the soil permeability as a spatial random variable, and a geostatistical method was used to generate random distributions of the permeability. The randomly generated permeability fields were incorporated into UTCHEM to simulate DNAPL transport in heterogeneous media, and stochastic analysis was conducted based on the simulated results. From the analysis, an exponential relationship between average DNAPL recovery and soil heterogeneity (defined as the standard deviation of the log of permeability) was established with a coefficient of determination (r2) of 0.991, which indicated that DNAPL recovery decreased exponentially with increasing soil heterogeneity. Temporal and spatial distributions of relative saturations in the water phase, DNAPL, and microemulsion in heterogeneous soils were compared with those in homogeneous soils and related to soil heterogeneity. Cleanup time and the uncertainty in determining DNAPL distributions in heterogeneous soils were also quantified. The study provides useful information for designing strategies for the characterization and remediation of nonaqueous phase liquid-contaminated soils with spatial variability and heterogeneity.
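Fitting an exponential recovery-heterogeneity relationship of the reported form can be sketched with scipy; the data points below are invented for illustration, and only the functional form R = r0 * exp(-b * sigma_lnK) follows the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed illustrative data: average DNAPL recovery (fraction) versus soil
# heterogeneity, expressed as the standard deviation of ln(permeability).
sigma_lnk = np.array([0.25, 0.5, 1.0, 1.5, 2.0, 2.5])
recovery = np.array([0.95, 0.88, 0.72, 0.58, 0.47, 0.38])

def model(sigma, r0, b):
    """Exponential decrease of recovery with heterogeneity: R = r0 * exp(-b * sigma)."""
    return r0 * np.exp(-b * sigma)

(r0, b), _ = curve_fit(model, sigma_lnk, recovery, p0=(1.0, 0.3))
resid = recovery - model(sigma_lnk, r0, b)
r2 = 1.0 - resid.var() / recovery.var()
print(f"fit: R = {r0:.2f} * exp(-{b:.2f} * sigma_lnK),  r^2 = {r2:.3f}")
```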
Infinite hidden conditional random fields for human behavior analysis.
Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja
2013-01-01
Hidden conditional random fields (HCRFs) are discriminative latent variable models that have been shown to successfully learn the hidden structure of a given classification problem (provided an appropriate validation of the number of hidden states). In this brief, we present the infinite HCRF (iHCRF), which is a nonparametric model based on hierarchical Dirichlet processes and is capable of automatically learning the optimal number of hidden states for a classification task. We show how we learn the model hyperparameters with an effective Markov-chain Monte Carlo sampling technique, and we explain the process that underlies our iHCRF model with the Restaurant Franchise Rating Agencies analogy. We show that the iHCRF is able to converge to a correct number of represented hidden states, and outperforms the best finite HCRFs--chosen via cross-validation--for the difficult tasks of recognizing instances of agreement, disagreement, and pain. Moreover, the iHCRF manages to achieve this performance in significantly less total training, validation, and testing time.
NASA Astrophysics Data System (ADS)
Tega, Naoki; Miki, Hiroshi; Mine, Toshiyuki; Ohmori, Kenji; Yamada, Keisaku
2014-03-01
It is demonstrated from a statistical perspective that the generation of random telegraph noise (RTN) changes before and after the application of negative-bias temperature instability (NBTI) stress. The NBTI stress generates a large number of permanent interface traps and, at the same time, a large number of RTN traps causing temporary RTN and one-time RTN. The interface traps and the RTN traps show different features in the recovery process. That is, re-passivation of interface states is a minor cause of the recovery after NBTI stress, whereas rapid disappearance of the temporary RTN and the one-time RTN is the main cause of the recovery. The RTN traps are less likely to become permanent. This two-trap model, comprising the interface trap and the RTN trap, simply explains NBTI degradation and recovery in scaled p-channel metal-oxide-semiconductor field-effect transistors.
NASA Astrophysics Data System (ADS)
Maekawa, Keiichi; Makiyama, Hideki; Yamamoto, Yoshiki; Hasegawa, Takumi; Okanishi, Shinobu; Sonoda, Kenichiro; Shinkawata, Hiroki; Yamashita, Tomohiro; Kamohara, Shiro; Yamaguchi, Yasuo
2018-04-01
The low-frequency noise (LFN) variability in bulk and fully depleted silicon-on-insulator (FDSOI) metal-oxide-semiconductor field-effect transistors (MOSFETs) with silicon on thin BOX (SOTB) technology was investigated. LFN typically shows a flicker noise component and a Lorentzian component due to random telegraph noise (RTN). In the weak inversion state, random dopant fluctuation (RDF) in the channel strongly affects not only the RTN variability but also the flicker noise variability in the bulk MOSFET compared with the SOTB MOSFET, because of local carrier number fluctuation in the channel. On the other hand, the typical level of LFN in the SOTB MOSFET is slightly larger than that in the bulk MOSFET because of the additional interface on the buried oxide layer. However, considering the tailing characteristics of the LFN variability, LFN in the SOTB MOSFET can be regarded as smaller than that in the bulk MOSFET, which enables the low-voltage operation of analog circuits.
Solar wind driving and substorm triggering
NASA Astrophysics Data System (ADS)
Newell, Patrick T.; Liou, Kan
2011-03-01
We compare solar wind driving and its changes for three data sets: (1) 4861 identifications of substorm onsets from satellite global imagers (Polar UVI and IMAGE FUV); (2) a similar number of otherwise random times chosen with a similar solar wind distribution (slightly elevated driving); (3) completely random times. Multiple measures of solar wind driving were used, including interplanetary magnetic field (IMF) Bz, the Kan-Lee electric field, the Borovsky function, and dΦMP/dt (all of which estimate dayside merging). Superposed epoch analysis verifies that the mean Bz has a northward turning (or at least averages less southward) starting 20 min before onset. We argue that the delay between IMF impact on the magnetopause and tail effects appearing in the ionosphere is about that long. The northward turning is not the effect of a few extreme events. The median field shows the same result, as do all other measures of solar wind driving. We compare the rate of northward turning to that observed after random times with slightly elevated driving. The subsequent reversion to mean is essentially the same between random elevations and substorms. To further verify this, we consider in detail the distribution of changes from the statistical peak (20 min prior to onset) to onset. For Bz, the mean change after onset is +0.14 nT (i.e., IMF becomes more northward), but the standard deviation is σ = 2.8 nT. Thus large changes in either direction are common. For EKL, the change is -15 nT km/s ± 830 nT km/s. Thus either a hypothesis predicting northward turnings or one predicting southward turnings would find abundant yet random confirming examples. Indeed, applying the Lyons et al. (1997) trigger criteria (excluding only the prior requirement of 22/30 min Bz < 0, which is often not valid for actual substorms) to these three sets of data shows that "northward turning triggers" occur in 23% of the random data, 24% of the actual substorms, and after 27% of the random elevations. These results strongly support the idea of Morley and Freeman (2007), that substorms require initial elevated solar wind driving, but that there is no evidence for external triggering. Finally dynamic pressure, p, and velocity, v, show no meaningful variation around onset (although p averages 10% above an 11 year mean).
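A superposed epoch analysis of the kind used above can be sketched in a few lines of Python; the synthetic Bz series, the imprinted signal, and the window length are assumptions, and the point is only the stacking and averaging mechanics.

```python
import numpy as np

rng = np.random.default_rng(8)

# Assumed synthetic series: 1-min IMF Bz (nT) with a weak dip-then-recovery
# imprinted around "onset" times, to illustrate the analysis, not the data.
n_minutes = 200_000
bz = rng.normal(-0.5, 2.8, n_minutes)
onsets = rng.integers(100, n_minutes - 100, size=500)
for t0 in onsets:
    bz[t0 - 60:t0] -= 0.5
    bz[t0:t0 + 60] += 0.3

def superposed_epoch(series, keytimes, window):
    """Stack fixed-length windows centred on each key time and average them."""
    segs = np.array([series[t - window:t + window] for t in keytimes])
    lags = np.arange(-window, window)
    return lags, segs.mean(axis=0), np.median(segs, axis=0)

lags, mean_bz, median_bz = superposed_epoch(bz, onsets, window=90)
print("mean Bz 20 min before onset:", mean_bz[lags == -20][0])
print("mean Bz at onset           :", mean_bz[lags == 0][0])
```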
Mean-field equations for neuronal networks with arbitrary degree distributions.
Nykamp, Duane Q; Friedman, Daniel; Shaker, Sammy; Shinn, Maxwell; Vella, Michael; Compte, Albert; Roxin, Alex
2017-04-01
The emergent dynamics in networks of recurrently coupled spiking neurons depends on the interplay between single-cell dynamics and network topology. Most theoretical studies on network dynamics have assumed simple topologies, such as connections that are made randomly and independently with a fixed probability (Erdös-Rényi network) (ER) or all-to-all connected networks. However, recent findings from slice experiments suggest that the actual patterns of connectivity between cortical neurons are more structured than in the ER random network. Here we explore how introducing additional higher-order statistical structure into the connectivity can affect the dynamics in neuronal networks. Specifically, we consider networks in which the number of presynaptic and postsynaptic contacts for each neuron, the degrees, are drawn from a joint degree distribution. We derive mean-field equations for a single population of homogeneous neurons and for a network of excitatory and inhibitory neurons, where the neurons can have arbitrary degree distributions. Through analysis of the mean-field equations and simulation of networks of integrate-and-fire neurons, we show that such networks have potentially much richer dynamics than an equivalent ER network. Finally, we relate the degree distributions to so-called cortical motifs.
NASA Technical Reports Server (NTRS)
Lassiter, Leslie W; Hess, Robert W
1958-01-01
Flat 2024-t3 aluminum panels measuring 11 inches by 13 inches were tested in the near noise fields of a 4-inch air jet and turbojet engine. The stresses which were developed in the panels are compared with those calculated by generalized harmonic analysis. The calculated and measured stresses were found to be in good agreement. In order to make the stress calculations, supplementary data relating to the transfer characteristics, damping, and static response of flat and curved panels under periodic loading are necessary and were determined experimentally. In addition, an appendix containing detailed data on the near pressure field of the turbojet engine is included.
Quenched bond randomness: Superfluidity in porous media and the strong violation of universality
NASA Astrophysics Data System (ADS)
Falicov, Alexis; Berker, A. Nihat
1997-04-01
The effects of quenched bond randomness are most readily studied with superfluidity immersed in a porous medium. A lattice model for ³He-⁴He mixtures and incomplete ⁴He fillings in aerogel yields the signature effect of bond randomness, namely the conversion of symmetry-breaking first-order phase transitions into second-order phase transitions, the λ-line reaching zero temperature, and the elimination of non-symmetry-breaking first-order phase transitions. The model recognizes the importance of the connected nature of aerogel randomness and thereby yields superfluidity at very low ⁴He concentrations, a phase separation entirely within the superfluid phase, and the order-parameter contrast between mixtures and incomplete fillings, all in agreement with experiments. The special properties of the helium mixture/aerogel system are distinctly linked to the aerogel properties of connectivity, randomness, and tenuousness, via the additional study of a regularized "jungle-gym" aerogel. Renormalization-group calculations indicate that a strong violation of the empirical universality principle of critical phenomena occurs under quenched bond randomness. It is argued that helium/aerogel critical properties reflect this violation and further experiments are suggested. Renormalization-group analysis also shows that, adjoining the strong universality violation (which hinges on the occurrence or non-occurrence of asymptotic strong coupling and strong randomness under rescaling), there is a new "hyperuniversality" at phase transitions with asymptotic strong-coupling, strong-randomness behavior, for example assigning the same critical exponents to random-bond tricriticality and random-field criticality.
The space transformation in the simulation of multidimensional random fields
Christakos, G.
1987-01-01
Space transformations are proposed as a mathematically meaningful and practically comprehensive approach to simulate multidimensional random fields. Within this context the turning bands method of simulation is reconsidered and improved in both the space and frequency domains. © 1987.
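A spectral turning-bands-style sketch is given below: a 2-D field is synthesized by summing 1-D random-cosine processes along random directions. The line-process spectrum (targeting roughly a squared-exponential covariance) and all parameters are assumptions, not the improved scheme proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(9)

def turning_bands_2d(nx, ny, n_lines=64, n_harmonics=32, corr_len=10.0):
    """Build a 2-D Gaussian-like random field by summing 1-D random-cosine
    processes along random directions (a spectral turning-bands variant)."""
    xs, ys = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    field = np.zeros((nx, ny))
    for _ in range(n_lines):
        theta = rng.uniform(0.0, np.pi)                    # random line direction
        proj = xs * np.cos(theta) + ys * np.sin(theta)     # project grid onto the line
        omega = rng.normal(0.0, 1.0 / corr_len, n_harmonics)
        phase = rng.uniform(0.0, 2.0 * np.pi, n_harmonics)
        line = np.zeros_like(field)
        for w, p in zip(omega, phase):
            line += np.cos(w * proj + p)
        field += np.sqrt(2.0 / n_harmonics) * line
    return field / np.sqrt(n_lines)

Z = turning_bands_2d(128, 128)
print("sample mean / variance:", Z.mean(), Z.var())   # roughly 0 and 1
```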
Small-World Network Spectra in Mean-Field Theory
NASA Astrophysics Data System (ADS)
Grabow, Carsten; Grosskinsky, Stefan; Timme, Marc
2012-05-01
Collective dynamics on small-world networks emerge in a broad range of systems with their spectra characterizing fundamental asymptotic features. Here we derive analytic mean-field predictions for the spectra of small-world models that systematically interpolate between regular and random topologies by varying their randomness. These theoretical predictions agree well with the actual spectra (obtained by numerical diagonalization) for undirected and directed networks and from fully regular to strongly random topologies. These results may provide analytical insights to empirically found features of dynamics on small-world networks from various research fields, including biology, physics, engineering, and social science.
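The numerical diagonalization that such mean-field predictions are compared against can be sketched directly: build Watts-Strogatz networks at several rewiring probabilities and compute their Laplacian spectra. The network size and degree below are assumed values.

```python
import numpy as np
import networkx as nx

# Sweep the rewiring probability of a Watts-Strogatz small-world model and
# compute the Laplacian spectrum numerically.
n, k = 500, 10
for p in (0.0, 0.01, 0.1, 1.0):
    G = nx.watts_strogatz_graph(n, k, p, seed=42)
    L = nx.laplacian_matrix(G).toarray().astype(float)
    eigs = np.sort(np.linalg.eigvalsh(L))
    # The smallest non-zero Laplacian eigenvalue controls the slowest
    # relaxation of diffusive/consensus-type collective dynamics.
    print(f"p = {p:<5}  lambda_2 = {eigs[1]:.4f}  lambda_max = {eigs[-1]:.2f}")
```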
Random field assessment of nanoscopic inhomogeneity of bone
Dong, X. Neil; Luo, Qing; Sparkman, Daniel M.; Millwater, Harry R.; Wang, Xiaodu
2010-01-01
Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to present the spatial uncertainty of elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic details. PMID:20817128
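A small sketch of the random-field construction described above: a Gaussian field with exponential covariance is generated via a Cholesky factorization, and the correlation length is then re-estimated from the sample autocovariance. All numerical values are illustrative assumptions, not measured bone properties.

```python
import numpy as np

rng = np.random.default_rng(10)

# 1-D stand-in for a modulus profile across a lamella: a Gaussian random field
# with exponential covariance C(h) = s^2 * exp(-|h| / ell).
n, dx = 400, 0.05                       # points and spacing (illustrative units)
mean_E, std_E, ell = 20.0, 3.0, 0.6     # mean modulus, SD, correlation length
x = np.arange(n) * dx
C = std_E**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)
L = np.linalg.cholesky(C + 1e-10 * np.eye(n))
E = mean_E + L @ rng.normal(size=n)

# Estimate the correlation length back from the sample autocovariance:
# the first lag at which it drops below 1/e of its zero-lag value.
Ec = E - E.mean()
acov = np.correlate(Ec, Ec, mode="full")[n - 1:] / n
lag_1e = np.argmax(acov < acov[0] / np.e) * dx
print("input correlation length:", ell, "  estimated:", round(lag_1e, 2))
```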
Galuschka, Katharina; Ise, Elena; Krick, Kathrin; Schulte-Körne, Gerd
2014-01-01
Children and adolescents with reading disabilities experience a significant impairment in the acquisition of reading and spelling skills. Given the emotional and academic consequences for children with persistent reading disorders, evidence-based interventions are critically needed. The present meta-analysis extracts the results of all available randomized controlled trials. The aims were to determine the effectiveness of different treatment approaches and the impact of various factors on the efficacy of interventions. The literature search for published randomized-controlled trials comprised an electronic search in the databases ERIC, PsycINFO, PubMed, and Cochrane, and an examination of bibliographical references. To check for unpublished trials, we searched the websites clinicaltrials.com and ProQuest, and contacted experts in the field. Twenty-two randomized controlled trials with a total of 49 comparisons of experimental and control groups could be included. The comparisons evaluated five reading fluency trainings, three phonemic awareness instructions, three reading comprehension trainings, 29 phonics instructions, three auditory trainings, two medical treatments, and four interventions with coloured overlays or lenses. One trial evaluated the effectiveness of sunflower therapy and another investigated the effectiveness of motor exercises. The results revealed that phonics instruction is not only the most frequently investigated treatment approach, but also the only approach whose efficacy on reading and spelling performance in children and adolescents with reading disabilities is statistically confirmed. The mean effect sizes of the remaining treatment approaches did not reach statistical significance. The present meta-analysis demonstrates that severe reading and spelling difficulties can be ameliorated with appropriate treatment. In order to be better able to provide evidence-based interventions to children and adolescent with reading disabilities, research should intensify the application of blinded randomized controlled trials. PMID:24587110
NASA Astrophysics Data System (ADS)
Tampubolon, Togi; Hutahaean, Juniar; Siregar, Suryani N. J.
2018-03-01
Underwater research often uses the geomagnetic method, one of the geophysical methods for measuring magnetic field variations. This research was done to identify the subsurface rock structure and to determine the kinds of rock, based on their susceptibility values, in the Siogung-ogung geothermal area, Pangururan, Samosir District. The total magnetic field was measured with a Proton Precession Magnetometer, positioning was done using a Global Positioning System, and the north axis was determined using a geological compass. Data collection was done randomly, with a total of 51 measuring points obtained. Data analysis started with the International Geomagnetic Reference Field correction to obtain the total magnetic field anomaly. The total magnetic anomaly data were then analyzed using the Surfer 12 program, and a magnetic anomaly cross section was obtained with the Magdc For Windows program. The magnetic measurements indicated variation of the magnetic field strength at each point, with the lowest magnetic intensity value of 41785.67 nano tesla and the highest of 43140.33 nano tesla. From the qualitative interpretation, the magnetic anomaly value ranges from -200.92 to 1154.45, whereas the quantitative interpretation of the model shows the existence of degradation and andesitic rocks, with the value of susceptibility
Hafdahl, Adam R; Williams, Michelle A
2009-03-01
In 2 Monte Carlo studies of fixed- and random-effects meta-analysis for correlations, A. P. Field (2001) ostensibly evaluated Hedges-Olkin-Vevea Fisher-z and Schmidt-Hunter Pearson-r estimators and tests in 120 conditions. Some authors have cited those results as evidence not to meta-analyze Fisher-z correlations, especially with heterogeneous correlation parameters. The present attempt to replicate Field's simulations included comparisons with analytic values as well as results for efficiency and confidence-interval coverage. Field's results under homogeneity were mostly replicable, but those under heterogeneity were not: The latter exhibited up to over .17 more bias than ours and, for tests of the mean correlation and homogeneity, respectively, nonnull rejection rates up to .60 lower and .65 higher. Changes to Field's observations and conclusions are recommended, and practical guidance is offered regarding simulation evidence and choices among methods. Most cautions about poor performance of Fisher-z methods are largely unfounded, especially with a more appropriate z-to-r transformation. The Appendix gives a computer program for obtaining Pearson-r moments from a normal Fisher-z distribution, which is used to demonstrate distortion due to direct z-to-r transformation of a mean Fisher-z correlation.
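The Fisher-z synthesis and the z-to-r back-transform issue discussed above can be sketched as follows; the simulated studies are assumptions, and the averaging over a normal z distribution is shown here for the sampling distribution of the mean (the paper's appendix treats the analogous calculation for a heterogeneous z distribution).

```python
import numpy as np

rng = np.random.default_rng(11)

# Assumed toy meta-analysis: k studies of a common true correlation rho.
k, rho = 30, 0.3
n_i = rng.integers(20, 200, size=k)
r_i = np.array([np.corrcoef(rng.multivariate_normal(
    [0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n).T)[0, 1] for n in n_i])

# Hedges-Olkin-style fixed-effect synthesis on the Fisher-z scale.
z_i = np.arctanh(r_i)
w_i = n_i - 3.0                                  # inverse sampling variance of Fisher z
z_bar = np.sum(w_i * z_i) / np.sum(w_i)
se = 1.0 / np.sqrt(np.sum(w_i))

# Direct back-transform versus averaging the z -> r transform over a normal
# distribution of z, illustrating the distortion a direct transform of a mean
# z can introduce when z has nonzero spread.
z_draws = rng.normal(z_bar, se, 100_000)
print("direct tanh(z_bar):", np.tanh(z_bar))
print("averaged E[r]     :", np.tanh(z_draws).mean())
print("95% CI for rho    :", np.tanh([z_bar - 1.96 * se, z_bar + 1.96 * se]))
```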
Global enhancement and structure formation of the magnetic field in spiral galaxies
NASA Astrophysics Data System (ADS)
Khoperskov, Sergey A.; Khrapov, Sergey S.
2018-01-01
In this paper we study numerically the evolution and enhancement of the large-scale magnetic field in the gaseous disks of spiral galaxies. We consider a set of models with various spiral pattern parameters and initial magnetic field strengths, taking into account gas self-gravity and cooling and heating processes. In agreement with previous studies we find that the galactic magnetic field is mostly aligned with gaseous structures, although small-scale gaseous structures (spurs and clumps) are more chaotic than the magnetic field structure. In the spiral arms the magnetic field often coexists with the gas distribution; in the inter-arm region we see a filamentary magnetic field structure. These filaments connect several isolated gaseous clumps. Simulations reveal the presence of small-scale irregularities of the magnetic field as well as a reversal of the magnetic field at the outer edge of the large-scale spurs. We provide evidence that the magnetic field in the spiral arms has a stronger mean-field component, with a clear inverse correlation between gas density and the plasma-beta parameter, compared to the rest of the disk, which shows a more turbulent field component and no correlation between gas density and plasma-beta. We show that the mean field grows up to ∼3-10 μG in the cold gas during several rotation periods (∼500-800 Myr), with the ratio between the azimuthal and radial field components of about 4:1. We find an enhancement of both the random and ordered components of the magnetic field. The mean field strength increases by a factor of ∼1.5-2.5 for models with various spiral pattern parameters. The random magnetic field component can reach up to 25% of the total strength. By analyzing the time-dependent evolution of the radial Poynting flux, we point out that the magnetic field strength is enhanced more strongly at the galactic outskirts, which is due to the radial transfer of magnetic energy by the spiral arms pushing the magnetic field outward. Our results also support the presence of sufficient conditions for the development of magnetorotational instability at distances >11 kpc after >300 Myr of evolution.
Coscia Requena, C; Muriel, A; Peñuelas, O
2018-02-28
Random allocation of treatment or intervention is the key feature of clinical trials; it divides patients into treatment groups that are approximately balanced for baseline covariates, and therefore comparable except for the treatment under study. However, in observational studies, where treatment allocation is not random, patients in the treatment and control groups often differ in covariates that are related to the intervention variables. These imbalances in covariates can lead to biased estimates of the treatment effect. Randomized clinical trials, however, are sometimes not feasible for ethical, logistical, economic or other reasons. To address these situations, interest has grown in the field of clinical research in designing studies that resemble randomized experiments as closely as possible while using observational (i.e. non-random) data. Observational studies using propensity score analysis methods have become increasingly common in the intensive care literature. Propensity score analyses attempt to control for confounding in non-experimental studies by adjusting for the likelihood that a given patient is exposed. However, propensity score studies may be confusing, and intensivists who are not familiar with this methodology may not fully understand the importance of the technique. The objectives of this review are: to describe the fundamentals of propensity score methods; to present the techniques for adequately evaluating propensity score models; and to discuss the advantages and disadvantages of these techniques. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.
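One standard propensity-score workflow (a logistic-regression propensity model followed by inverse-probability-of-treatment weighting) is sketched below on synthetic confounded data; the data-generating model and the effect size are assumptions, and matching or stratification would be equally valid alternatives.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(12)

# Assumed synthetic observational data: two covariates drive both treatment
# assignment and outcome, so the naive comparison is confounded.
n = 5000
age = rng.normal(60, 10, n)
severity = rng.normal(0, 1, n)
p_treat = 1.0 / (1.0 + np.exp(-(-0.05 * (age - 60) + 0.8 * severity)))
treated = rng.binomial(1, p_treat)
outcome = 2.0 * treated - 0.1 * (age - 60) - 1.5 * severity + rng.normal(0, 1, n)

X = np.column_stack([age, severity])
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]   # propensity score

# Inverse-probability-of-treatment weighting (normalized, Hajek-style estimate).
w = treated / ps + (1 - treated) / (1 - ps)
ate = (np.average(outcome[treated == 1], weights=w[treated == 1])
       - np.average(outcome[treated == 0], weights=w[treated == 0]))
naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()
print(f"naive difference: {naive:.2f}   IPW-adjusted estimate: {ate:.2f}   (truth 2.0)")
```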
Multiscale hidden Markov models for photon-limited imaging
NASA Astrophysics Data System (ADS)
Nowak, Robert D.
1999-06-01
Photon-limited image analysis is often hindered by low signal-to-noise ratios. A novel Bayesian multiscale modeling and analysis method is developed in this paper to assist in these challenging situations. In addition to providing a very natural and useful framework for modeling and processing images, Bayesian multiscale analysis is often much less computationally demanding than classical Markov random field models. This paper focuses on a probabilistic graph model called the multiscale hidden Markov model (MHMM), which captures the key inter-scale dependencies present in natural image intensities. The MHMM framework presented here is specifically designed for photon-limited imaging applications involving Poisson statistics, and applications to image intensity analysis are examined.
The importance of replication in wildlife research
Johnson, D.H.
2002-01-01
Wildlife ecology and management studies have been widely criticized for deficiencies in design or analysis. Manipulative experiments--with controls, randomization, and replication in space and time--provide powerful ways of learning about natural systems and establishing causal relationships, but such studies are rare in our field. Observational studies and sample surveys are more common; they also require appropriate design and analysis. More important than the design and analysis of individual studies is metareplication: replication of entire studies. Similar conclusions obtained from studies of the same phenomenon conducted under widely differing conditions will give us greater confidence in the generality of those findings than would any single study, however well designed and executed.
Gorobets, Yu I; Gorobets, O Yu
2015-01-01
A statistical model is proposed in this paper for describing the orientation of the trajectories of unicellular diamagnetic organisms in a magnetic field. A statistical parameter, the effective energy, is calculated on the basis of this model. The resulting effective energy is a statistical characteristic of the trajectories of diamagnetic microorganisms in a magnetic field, connected with their metabolism. The statistical model is applicable when the energy of the thermal motion of the bacteria is negligible in comparison with their energy in the magnetic field and the bacteria exhibit significant "active random movement", i.e. a randomizing motion of non-thermal nature, for example movement by means of flagella. The energy of this randomizing active self-motion of the bacteria is characterized by a new statistical parameter for biological objects, which replaces the energy of randomizing thermal motion in the calculation of the statistical distribution. Copyright © 2014 Elsevier Ltd. All rights reserved.
Approximate ground states of the random-field Potts model from graph cuts
NASA Astrophysics Data System (ADS)
Kumar, Manoj; Kumar, Ravinder; Weigel, Martin; Banerjee, Varsha; Janke, Wolfhard; Puri, Sanjay
2018-05-01
While the ground-state problem for the random-field Ising model is polynomial, and can be solved using a number of well-known algorithms for maximum flow or graph cut, the analog random-field Potts model corresponds to a multiterminal flow problem that is known to be NP-hard. Hence an efficient exact algorithm is very unlikely to exist. As we show here, it is nevertheless possible to use an embedding of binary degrees of freedom into the Potts spins in combination with graph-cut methods to solve the corresponding ground-state problem approximately in polynomial time. We benchmark this heuristic algorithm using a set of quasiexact ground states found for small systems from long parallel tempering runs. For a not-too-large number q of Potts states, the method based on graph cuts finds the same solutions in a fraction of the time. We employ the new technique to analyze the breakup length of the random-field Potts model in two dimensions.
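The polynomial Ising case referred to above can be made concrete with a small sketch: the ground state of a ferromagnetic random-field Ising model on a grid follows from a single s-t minimum cut. This is a minimal illustration of that binary building block, not the authors' Potts embedding; the lattice size, coupling and field distribution are arbitrary, and networkx is assumed to be available for the max-flow computation.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
L, J = 8, 1.0                      # linear size, ferromagnetic coupling
h = rng.normal(0.0, 1.0, (L, L))   # quenched random fields

# Energy in 0/1 variables x (s = 2x - 1): pairwise disagreement costs 2J,
# unary costs become max(0, -2h) and max(0, +2h) after shifting by a constant.
G = nx.DiGraph()
s, t = "s", "t"
for i in range(L):
    for j in range(L):
        v = (i, j)
        G.add_edge(s, v, capacity=max(0.0, -2.0 * h[i, j]))  # paid if s_v = +1
        G.add_edge(v, t, capacity=max(0.0, +2.0 * h[i, j]))  # paid if s_v = -1
        for di, dj in ((1, 0), (0, 1)):                      # grid neighbours
            if i + di < L and j + dj < L:
                w = (i + di, j + dj)
                G.add_edge(v, w, capacity=2.0 * J)
                G.add_edge(w, v, capacity=2.0 * J)

cut_value, (source_side, sink_side) = nx.minimum_cut(G, s, t)
spin = np.full((L, L), -1)
for v in sink_side:
    if v != t:
        spin[v] = +1          # sink side corresponds to s = +1 in this convention
print("ground-state magnetization per spin:", spin.mean())
```

The cut value equals the minimum of the shifted energy, which is why the Ising problem is polynomial; the multiterminal generalization needed for q > 2 Potts states is exactly what makes the Potts case NP-hard.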
Methodological Reporting of Randomized Trials in Five Leading Chinese Nursing Journals
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Background: Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting to CONSORT and to explore associated trial-level variables in the Chinese nursing care field. Methods: In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist, and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. Results: In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34±0.97 (mean ± SD). No RCT reported descriptions and changes in “trial design,” changes in “outcomes” and “implementation,” or descriptions of the similarity of interventions for “blinding.” Poor reporting was found in detailing the “settings of participants” (13.1%), “type of randomization sequence generation” (1.8%), calculation methods of “sample size” (0.4%), explanation of any interim analyses and stopping guidelines for “sample size” (0.3%), “allocation concealment mechanism” (0.3%), additional analyses in “statistical methods” (2.1%), and targeted subjects and methods of “blinding” (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of “participants,” “interventions,” and definitions of the “outcomes” and “statistical methods.” The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. Conclusions: The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods. PMID:25415382
Heisenberg spin-glass behaviour in Ga0.99Yb0.01FeO3
NASA Astrophysics Data System (ADS)
Neacsa, Daniela Maria; Gruener, Gisèle; Hebert, Sylvie; Soret, Jean-Claude
2017-06-01
The dynamic and static magnetic properties of Ga0.99Yb0.01FeO3 are studied in detail using ac susceptibility and dc magnetization measurements. The study shows that the compound undergoes a spin-glass freezing at Tg ≈ 213 K. The dynamic scaling analysis of ac susceptibility data reveals typical features characteristic of canonical spin glasses, i.e., relaxation time τ* ∼ 10⁻¹⁴ s, critical exponent νz = 4.1 ± 0.2, and frequency sensitivity parameter δf ∼ 10⁻³ within three frequency decades. The analysis of the critical behaviour of the static nonlinear susceptibility yields the critical exponents γ = 4.3 ± 0.1, β = 1.0 ± 0.1, and δ = 5.5 ± 0.5, which lie between those typical of three-dimensional (3D) weakly anisotropic Heisenberg and Ising spin glasses. The analysis of the field-cooled and zero-field-cooled magnetization data allows two characteristic temperatures to be defined, depending on the applied magnetic field. The upper one, Tirr(H), is the threshold temperature corresponding to the appearance of weak irreversibility, whereas the lower one, Ts(H), marks the onset of strong irreversibility. The resulting field-temperature phase diagram turns out to be in good quantitative agreement with the mean-field predictions for a 3D Heisenberg spin glass with random magnetic anisotropy, and appears consistent with the chirality-driven freezing scenario.
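For readers unfamiliar with the dynamic scaling step mentioned above, the sketch below fits frequency-dependent freezing temperatures to the usual critical slowing-down law τ = τ*(Tf/Tg − 1)^(−zν), with τ = 1/(2πf). The data are synthetic and the numbers are only meant to mimic the quoted values.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit log(tau) rather than tau to tame the huge dynamic range of relaxation times.
def log_tau(Tf, log_tau0, Tg, znu):
    return log_tau0 - znu * np.log(Tf / Tg - 1.0)

rng = np.random.default_rng(2)
Tg_true, znu_true, tau0_true = 213.0, 4.1, 1e-14
Tf = np.linspace(216.0, 224.0, 12)                       # fake freezing temperatures
tau = tau0_true * (Tf / Tg_true - 1.0) ** (-znu_true) * rng.lognormal(0.0, 0.05, Tf.size)

popt, _ = curve_fit(log_tau, Tf, np.log(tau),
                    p0=(np.log(1e-13), 211.0, 4.0),
                    bounds=([-40.0, 200.0, 1.0], [-20.0, 215.5, 10.0]))
print("fitted tau0 = %.2e s, Tg = %.1f K, z*nu = %.2f"
      % (np.exp(popt[0]), popt[1], popt[2]))
```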
Corrected Mean-Field Model for Random Sequential Adsorption on Random Geometric Graphs
NASA Astrophysics Data System (ADS)
Dhara, Souvik; van Leeuwaarden, Johan S. H.; Mukherjee, Debankur
2018-03-01
A notorious problem in mathematics and physics is to create a solvable model for random sequential adsorption of non-overlapping congruent spheres in the d-dimensional Euclidean space with d≥ 2 . Spheres arrive sequentially at uniformly chosen locations in space and are accepted only when there is no overlap with previously deposited spheres. Due to spatial correlations, characterizing the fraction of accepted spheres remains largely intractable. We study this fraction by taking a novel approach that compares random sequential adsorption in Euclidean space to the nearest-neighbor blocking on a sequence of clustered random graphs. This random network model can be thought of as a corrected mean-field model for the interaction graph between the attempted spheres. Using functional limit theorems, we characterize the fraction of accepted spheres and its fluctuations.
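A toy version of the adsorption process itself (in two dimensions, for brevity) can be simulated directly; the sketch below estimates the fraction of accepted attempts for equal disks in the unit square. The radius and number of attempts are arbitrary, and boundary effects are ignored.

```python
import numpy as np

# Minimal random sequential adsorption of equal disks in the unit square:
# centers arrive uniformly at random and are kept only if they do not
# overlap any previously deposited disk.
rng = np.random.default_rng(0)
radius, attempts = 0.02, 20000
accepted = np.empty((0, 2))

for _ in range(attempts):
    p = rng.random(2)
    if accepted.size == 0 or np.all(np.linalg.norm(accepted - p, axis=1) >= 2 * radius):
        accepted = np.vstack([accepted, p])

fraction = len(accepted) / attempts            # fraction of accepted attempts
coverage = len(accepted) * np.pi * radius**2   # covered area (disks may overhang edges)
print(f"accepted fraction: {fraction:.3f}, area coverage: {coverage:.3f}")
```

The spatial correlations that make the accepted fraction hard to characterize analytically are exactly the blocking constraints enforced in the inner test.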
Analysis of reliable sub-ns spin-torque switching under transverse bias magnetic fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
D'Aquino, M., E-mail: daquino@uniparthenope.it; Perna, S.; Serpico, C.
2015-05-07
The switching process of a magnetic spin-valve nanosystem subject to spin-polarized current pulses is considered. The dependence of the switching probability on the current pulse duration is investigated. The further application of a transverse field along the intermediate anisotropy axis of the particle is used to control the quasi-random relaxation of magnetization to the reversed magnetization state. The critical current amplitudes to realize the switching are determined by studying the phase portrait of the Landau-Lifshitz-Slonczewski dynamics. Macrospin numerical simulations are in good agreement with the theoretical prediction and demonstrate reliable switching even for very short (below 100 ps) current pulses.
Multi-field inflation with a random potential
NASA Astrophysics Data System (ADS)
Tye, S.-H. Henry; Xu, Jiajun; Zhang, Yang
2009-04-01
Motivated by the possibility of inflation in the cosmic landscape, which may be approximated by a complicated potential, we study the density perturbations in multi-field inflation with a random potential. The random potential causes the inflaton to undergo a Brownian-like motion with a drift in the D-dimensional field space, allowing entropic perturbation modes to continuously and randomly feed into the adiabatic mode. To quantify such an effect, we employ a stochastic approach to evaluate the two-point and three-point functions of primordial perturbations. We find that in the weakly random scenario where the stochastic scatterings are frequent but mild, the resulting power spectrum resembles that of the single field slow-roll case, with up to 2% more red tilt. The strongly random scenario, in which the coarse-grained motion of the inflaton is significantly slowed down by the scatterings, leads to rich phenomenologies. The power spectrum exhibits primordial fluctuations on all angular scales. Such features may already be hiding in the error bars of observed CMB TT (as well as TE and EE) power spectrum and have been smoothed out by binning of data points. With more data coming in the future, we expect these features can be detected or falsified. On the other hand the tensor power spectrum itself is free of fluctuations and the tensor to scalar ratio is enhanced by the large ratio of the Brownian-like motion speed over the drift speed. In addition a large negative running of the power spectral index is possible. Non-Gaussianity is generically suppressed by the growth of adiabatic perturbations on super-horizon scales, and is negligible in the weakly random scenario. However, non-Gaussianity can possibly be enhanced by resonant effects in the strongly random scenario or arise from the entropic perturbations during the onset of (p)reheating if the background inflaton trajectory exhibits particular properties. The formalism developed in this paper can be applied to a wide class of multi-field inflation models including, e.g. the N-flation scenario.
NASA Astrophysics Data System (ADS)
Ding, Jian; Li, Li
2018-05-01
We initiate the study on chemical distances of percolation clusters for level sets of two-dimensional discrete Gaussian free fields as well as loop clusters generated by two-dimensional random walk loop soups. One of our results states that the chemical distance between two macroscopic annuli away from the boundary for the random walk loop soup at the critical intensity is of dimension 1 with positive probability. Our proof method is based on an interesting combination of a theorem of Makarov, isomorphism theory, and an entropic repulsion estimate for Gaussian free fields in the presence of a hard wall.
Rational group decision making: A random field Ising model at T = 0
NASA Astrophysics Data System (ADS)
Galam, Serge
1997-02-01
A modified version of a finite random field Ising ferromagnetic model in an external magnetic field at zero temperature is presented to describe group decision making. Fields may have a non-zero average. A postulate of minimum inter-individual conflict is assumed. Interactions then produce a group polarization along one particular choice, which is however randomly selected. A small external social pressure is shown to have a drastic effect on the polarization. Individual biases related to personal backgrounds, cultural values and past experiences are introduced via quenched local competing fields. They are shown to be instrumental in generating a larger spectrum of collective new choices beyond the initial ones. In particular, compromise is found to result from the existence of individual competing biases. Conflict is shown to weaken group polarization. The model yields new psychosociological insights about consensus and compromise in groups.
Random-anisotropy model: Monotonic dependence of the coercive field on D/J
NASA Astrophysics Data System (ADS)
Saslow, W. M.; Koon, N. C.
1994-02-01
We present the results of a numerical study of the zero-temperature remanence and coercivity for the random anisotropy model (RAM), showing that, contrary to early calculations for this model, the coercive field increases monotonically with increases in the strength D of the random anisotropy relative to the strength J of the exchange field. Local-field adjustments with and without spin flips are considered. Convergence is difficult to obtain for small values of the anisotropy, suggesting that this is the likely source of the nonmonotonic behavior found in earlier studies. For both large and small anisotropy, each spin undergoes about one flip per hysteresis cycle, and about half of the spin flips occur in the vicinity of the coercive field. When only non-spin-flip adjustments are considered, at large anisotropy the coercivity is proportional to the anisotropy. At small anisotropy, the rate of convergence is comparable to that when spin flips are included.
Ablation effects of noninvasive radiofrequency field-induced hyperthermia on liver cancer cells.
Chen, Kaiyun; Zhu, Shuguang; Xiang, Guoan; Duan, Xiaopeng; He, Jiwen; Chen, Guihua
2016-05-01
To analyze in depth the clinical ablation effect of noninvasive radiofrequency field-induced hyperthermia on liver cancer cells, this paper collected treatment information on liver cancer patients from 10 hospitals between January 2010 and December 2011. From these records, 1050 patients were randomly selected as the observation group, who underwent noninvasive radiofrequency field-induced hyperthermia; in addition, 500 liver cancer patients were randomly selected as the control group, who underwent conventional surgical treatment. After treatment was completed, patients were followed up for three years, the survival rates of the two groups after 1, 2, and 3 years were compared, and the clinical effect of radiofrequency ablation of liver cancer was evaluated. The results show that the two groups are similar in terms of survival rate, with no statistically significant difference. In the observation group, 125 patients had adverse reactions of varying degrees, whereas 253 patients in the control group had adverse reactions; this difference between groups was statistically significant (P < 0.05). It can be concluded that radiofrequency ablation of liver cancer is the safer option. The results of this study therefore demonstrate that treatment of liver cancer with noninvasive radiofrequency field-induced hyperthermia is safe, achieves a satisfactory survival rate, and thus has relatively high value in clinical practice.
New description of charged particle propagation in random magnetic fields
NASA Technical Reports Server (NTRS)
Earl, James A.
1994-01-01
When charged particles spiral along a large constant magnetic field, their trajectories are scattered by random components that are superposed on the guiding field. In the simplest analysis of this situation, scattering causes the particles to diffuse parallel to the guiding field. At the next level of approximation, moving pulses that correspond to a coherent mode of propagation are present, but they are represented by delta-functions whose infinitely narrow width makes no sense physically and is inconsistent with the finite duration of coherent pulses observed in solar energetic particle events. To derive a more realistic description, the transport problem is formulated in terms of 4 x 4 matrices, which derive from a representation of the particle distribution function in terms of eigenfunctions of the scattering operator, and which lead to useful approximations that give explicit predictions of the detailed evolution not only of the coherent pulses, but also of the diffusive wake. More specifically, the new description embodies a simple convolution of a narrow Gaussian with the solutions above that involve delta-functions, but with a slightly reduced coherent velocity. The validity of these approximations, which can easily be calculated on a desktop computer, has been exhaustively confirmed by comparison with results of Monte Carlo simulations which kept track of 50 million particles and which were carried out on the Maspar computer at Goddard Space Flight Center.
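The underlying transport picture can be illustrated with a toy Monte Carlo (not the 4 x 4 matrix formalism of the paper): particles stream along the guiding field at v·μ and their pitch-angle cosine μ is redrawn at a fixed scattering rate, so an initially beamed population develops a coherent pulse plus a diffusive wake. All parameters below are illustrative.

```python
import numpy as np

# Toy model of field-aligned transport with pitch-angle scattering.
rng = np.random.default_rng(0)
n, v, nu, dt, steps = 100_000, 1.0, 2.0, 0.01, 500

z = np.zeros(n)            # distance along the guiding field
mu = np.ones(n)            # start as a forward beam (mu = +1)

for _ in range(steps):
    z += v * mu * dt
    scatter = rng.random(n) < nu * dt              # which particles scatter this step
    mu[scatter] = rng.uniform(-1.0, 1.0, scatter.sum())

t = steps * dt
front = (z > 0.9 * v * t).mean()                   # survivors of the coherent pulse
print(f"t = {t:.1f}: mean z = {z.mean():.2f}, spread = {z.std():.2f}, "
      f"fraction still near the front: {front:.3f}")
```

The particles that have not yet scattered form the coherent pulse near z = v t, while the scattered majority spreads diffusively behind it.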
In Defense of the Randomized Controlled Trial for Health Promotion Research
Rosen, Laura; Manor, Orly; Engelhard, Dan; Zucker, David
2006-01-01
The overwhelming evidence about the role lifestyle plays in mortality, morbidity, and quality of life has pushed the young field of modern health promotion to center stage. The field is beset with intense debate about appropriate evaluation methodologies. Increasingly, randomized designs are considered inappropriate for health promotion research. We have reviewed criticisms against randomized trials that raise philosophical and practical issues, and we will show how most of these criticisms can be overcome with minor design modifications. By providing rebuttal to arguments against randomized trials, our work contributes to building a sound methodological base for health promotion research. PMID:16735622
Note: The design of thin gap chamber simulation signal source based on field programmable gate array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Kun; Wang, Xu; Li, Feng
The Thin Gap Chamber (TGC) is an important part of ATLAS detector and LHC accelerator. Targeting the feature of the output signal of TGC detector, we have designed a simulation signal source. The core of the design is based on field programmable gate array, randomly outputting 256-channel simulation signals. The signal is generated by true random number generator. The source of randomness originates from the timing jitter in ring oscillators. The experimental results show that the random number is uniform in histogram, and the whole system has high reliability.
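The uniformity check mentioned above can be sketched as a histogram plus a chi-square test. Here numpy stands in for the ring-oscillator-based true random number generator on the FPGA, so the sketch shows only the statistical test, not the hardware.

```python
import numpy as np
from scipy.stats import chisquare

# Histogram 8-bit random samples and test them against a flat distribution.
rng = np.random.default_rng(0)
samples = rng.integers(0, 256, size=100_000)   # stand-in for TRNG output bytes

counts = np.bincount(samples, minlength=256)
stat, p = chisquare(counts)                    # default: uniform expected frequencies
print(f"chi-square = {stat:.1f}, p-value = {p:.3f}")
```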
NASA Technical Reports Server (NTRS)
Earl, James A.
1992-01-01
When charged particles spiral along a large constant magnetic field, their trajectories are scattered by any random field components that are superposed on the guiding field. If the random field configuration embodies helicity, the scattering is asymmetrical with respect to a plane perpendicular to the guiding field, for particles moving into the forward hemisphere are scattered at different rates from those moving into the backward hemisphere. This asymmetry gives rise to new terms in the transport equations that describe propagation of charged particles. Helicity has virtually no impact on qualitative features of the diffusive mode of propagation. However, characteristic velocities of the coherent modes that appear after a highly anisotropic injection exhibit an asymmetry related to helicity. Explicit formulas, which embody the effects of helicity, are given for the anisotropies, the diffusion coefficient, and the coherent velocities. Predictions derived from these expressions are in good agreement with Monte Carlo simulations of particle transport, but the simulations reveal certain phenomena whose explanation calls for further analytical work.
The spectral expansion of the elasticity random field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malyarenko, Anatoliy; Ostoja-Starzewski, Martin
2014-12-10
We consider a deformable body that occupies a region D in the plane. In our model, the body's elasticity tensor H(x) is the restriction to D of a second-order mean-square continuous random field. Under translation, the expected value and the correlation tensor of the field H(x) do not change. Under the action of an arbitrary element k of the orthogonal group O(2), they transform according to the reducible orthogonal representation k ⟼ S²(S²(k)) of that group. We find the spectral expansion of the correlation tensor R(x) of the elasticity field as well as the expansion of the field itself in terms of stochastic integrals with respect to a family of orthogonal scattered random measures.
Random field assessment of nanoscopic inhomogeneity of bone.
Dong, X Neil; Luo, Qing; Sparkman, Daniel M; Millwater, Harry R; Wang, Xiaodu
2010-12-01
Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic details. Copyright © 2010 Elsevier Inc. All rights reserved.
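A minimal sketch of the random-field representation described above: a one-dimensional Gaussian field with exponential covariance C(d) = σ² exp(−d/ℓ) is generated by Cholesky factorization of the covariance matrix. The modulus values, spacing and correlation length below are hypothetical, not the paper's measurements.

```python
import numpy as np

# Gaussian random field for elastic modulus along a lamella, with
# exponential covariance C(d) = sigma^2 * exp(-d / ell).
rng = np.random.default_rng(0)
n, dx = 200, 0.05                    # 200 points, 0.05 um spacing (hypothetical)
mean_E, sigma, ell = 20.0, 3.0, 0.5  # GPa, GPa, um (hypothetical)

x = np.arange(n) * dx
dist = np.abs(x[:, None] - x[None, :])
cov = sigma**2 * np.exp(-dist / ell)

# Cholesky factor turns i.i.d. standard normals into correlated ones.
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))   # jitter for numerical stability
E = mean_E + L @ rng.standard_normal(n)

# Rough check: the neighbour correlation should be close to exp(-dx / ell).
print("target neighbour correlation:", np.exp(-dx / ell))
print("sample neighbour correlation:", np.corrcoef(E[:-1], E[1:])[0, 1])
```

The correlation length ℓ plays the role described in the abstract: it sets how quickly the simulated modulus fluctuates from point to point within a lamella.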
NASA Astrophysics Data System (ADS)
Tsukanov, A. A.; Gorbatnikov, A. V.
2018-01-01
Study of the statistical parameters of the Earth's random microseismic field makes it possible to obtain estimates of the properties and structure of the Earth's crust and upper mantle. Different approaches are used to observe and process the microseismic records, which are divided into several groups of passive seismology methods. Among them are the well-known methods of surface-wave tomography, the spectral H/V ratio of the components in the surface wave, and microseismic sounding, currently under development, which uses the spectral ratio V/V0 of the vertical components between pairs of spatially separated stations. In the course of previous experiments, it became clear that these ratios are stable statistical parameters of the random field that do not depend on the properties of microseism sources. This paper proposes to expand the mentioned approach and study the possibilities for using the ratio of the horizontal components H1/H2 of the microseismic field. Numerical simulation was used to study the influence of an embedded velocity inhomogeneity on the spectral ratio of the horizontal components of the random field of fundamental Rayleigh modes, based on the concept that the Earth's microseismic field is represented by these waves in a significant part of the frequency spectrum.
Quenched bond randomness: Superfluidity in porous media and the strong violation of universality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Falicov, A.; Berker, A.N.
1997-04-01
The effects of quenched bond randomness are most readily studied with superfluidity immersed in a porous medium. A lattice model for ³He-⁴He mixtures and incomplete ⁴He fillings in aerogel yields the signature effect of bond randomness, namely the conversion of symmetry-breaking first-order phase transitions into second-order phase transitions, the λ-line reaching zero temperature, and the elimination of non-symmetry-breaking first-order phase transitions. The model recognizes the importance of the connected nature of aerogel randomness and thereby yields superfluidity at very low ⁴He concentrations, a phase separation entirely within the superfluid phase, and the order-parameter contrast between mixtures and incomplete fillings, all in agreement with experiments. The special properties of the helium mixture/aerogel system are distinctly linked to the aerogel properties of connectivity, randomness, and tenuousness, via the additional study of a regularized "jungle-gym" aerogel. Renormalization-group calculations indicate that a strong violation of the empirical universality principle of critical phenomena occurs under quenched bond randomness. It is argued that helium/aerogel critical properties reflect this violation, and further experiments are suggested. Renormalization-group analysis also shows that, adjoining the strong universality violation (which hinges on the occurrence or non-occurrence of asymptotic strong coupling-strong randomness under rescaling), there is a new "hyperuniversality" at phase transitions with asymptotic strong coupling-strong randomness behavior, for example assigning the same critical exponents to random-bond tricriticality and random-field criticality.
NASA Technical Reports Server (NTRS)
Wang, J. R.; Shiue, J.; Chuang, S. L.; Dombrowski, M.
1980-01-01
Radiometric measurements over a bare field and fields covered with grass, soybean, corn, and alfalfa were made with 1.4 GHz and 5 GHz microwave radiometers during August-October 1978. The measured results are compared with radiative transfer theory treating the vegetated fields as a two-layer random medium. It is found that the presence of a vegetation cover generally gives a higher brightness temperature T(B) than that expected from bare soil. The amount of this T(B) excess increases with the vegetation biomass and with the frequency of the observed radiation. The results of the radiative transfer calculations generally match the experimental data well; however, a detailed analysis also strongly suggests the need to incorporate soil surface roughness effects into the radiative transfer theory in order to better interpret the experimental data.
On the extensive unification of digital-to-analog converters and kernels
NASA Astrophysics Data System (ADS)
Liao, Yanchu
2012-09-01
System administrators agree that scalable communication is an interesting new topic in the field of steganography, and leading analysts concur. After years of unfortunate research into context-free grammar, we argue the intuitive unification of fiber-optic cables and context-free grammar. Our focus here is not on whether sensor networks and randomized algorithms can collaborate to accomplish this aim, but rather on introducing an analysis of DHTs [2] (Soupy Coil).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pingenot, J; Rieben, R; White, D
2004-12-06
We present a computational study of signal propagation and attenuation of a 200 MHz dipole antenna in a cave environment. The cave is modeled as a straight and lossy random rough wall. To simulate a broad frequency band, the full wave Maxwell equations are solved directly in the time domain via a high order vector finite element discretization using the massively parallel CEM code EMSolve. The simulation is performed for a series of random meshes in order to generate statistical data for the propagation and attenuation properties of the cave environment. Results for the power spectral density and phase of the electric field vector components are presented and discussed.
Noise reduction of a composite cylinder subjected to random acoustic excitation
NASA Technical Reports Server (NTRS)
Grosveld, Ferdinand W.; Beyer, T.
1989-01-01
Interior and exterior noise measurements were conducted on a stiffened composite floor-equipped cylinder, with and without an interior trim installed. Noise reduction was obtained for the case of random acoustic excitation in a diffuse field; the frequency range of interest was 100-800-Hz one-third octave bands. The measured data were compared with noise reduction predictions from the Propeller Aircraft Interior Noise (PAIN) program and from a statistical energy analysis. Structural model parameters were not predicted well by the PAIN program for the given input parameters; this resulted in incorrect noise reduction predictions for the lower one-third octave bands where the power flow into the interior of the cylinder was predicted on a mode-per-mode basis.
Lun, Z R; Desser, S S
1996-01-01
The patterns of random amplified fragments and molecular karyotypes of 12 isolates of anuran trypanosomes continuously cultured in vitro were compared by random amplified polymorphic DNA (RAPD) analysis and pulsed field gradient gel electrophoresis (PFGE). The time interval between preparation of two series of samples was one year. Changes were not observed in the number and size of sharp, amplified fragments of DNA samples from both series examined with the ten primers used. Likewise, changes in the molecular karyotypes were not detected between the two samples of these isolates. These results suggest that the molecular karyotype and the RAPD patterns of the anuran trypanosomes remain stable after being cultured continuously in vitro for one year.
1993-11-01
field X(t) at time t. T_ij is the set of all times when both p_i and p_j have been observed, and n_ij is the number of elements in T_ij. Definition Eq. (22) is ... termed contour analysis, for melding of oceanic data and for space-time interpolation of gappy frontal data sets. The key elements of contour analysis ... plane and let Ω(F) be the set of all straight lines intersecting F. Directly measuring the number of intersections between a random element W ∈ Ω(F) and
Discrimination surfaces with application to region-specific brain asymmetry analysis.
Martos, Gabriel; de Carvalho, Miguel
2018-05-20
Discrimination surfaces are here introduced as a diagnostic tool for localizing brain regions where discrimination between diseased and nondiseased participants is higher. To estimate discrimination surfaces, we introduce a Mann-Whitney type of statistic for random fields and present large-sample results characterizing its asymptotic behavior. Simulation results demonstrate that our estimator accurately recovers the true surface and corresponding interval of maximal discrimination. The empirical analysis suggests that in the anterior region of the brain, schizophrenic patients tend to present lower local asymmetry scores in comparison with participants in the control group. Copyright © 2018 John Wiley & Sons, Ltd.
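A crude pointwise analogue of the idea (not the authors' random-field estimator) is sketched below: a Mann-Whitney statistic is computed at every location of a grid of asymmetry scores for the two groups, and its rescaled value acts as a local discrimination score. The data are simulated.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Simulated "asymmetry scores" on a 1-d grid of 50 brain locations for
# 30 controls and 30 patients; the groups differ only at locations 10-19.
rng = np.random.default_rng(0)
controls = rng.normal(0.0, 1.0, (30, 50))
patients = rng.normal(0.0, 1.0, (30, 50))
patients[:, 10:20] -= 0.8

# Pointwise Mann-Whitney statistic, rescaled to [0, 1] (an AUC-like measure
# of how well that location separates the two groups).
auc = np.empty(50)
for k in range(50):
    u, _ = mannwhitneyu(controls[:, k], patients[:, k])
    auc[k] = u / (controls.shape[0] * patients.shape[0])

print("most discriminative locations:", np.argsort(np.abs(auc - 0.5))[-5:])
```

The locations whose score departs most from 0.5 mark the region of maximal discrimination, which is the role the estimated surface plays in the paper.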
Relativistic diffusive motion in random electromagnetic fields
NASA Astrophysics Data System (ADS)
Haba, Z.
2011-08-01
We show that the relativistic dynamics in a Gaussian random electromagnetic field can be approximated by the relativistic diffusion of Schay and Dudley. Lorentz invariant dynamics in the proper time leads to the diffusion in the proper time. The dynamics in the laboratory time gives the diffusive transport equation corresponding to the Jüttner equilibrium at the inverse temperature β⁻¹ = mc². The diffusion constant is expressed by the field strength correlation function (Kubo's formula).
Propagation of terahertz pulses in random media.
Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M
2004-02-15
We describe measurements of single-cycle terahertz pulse propagation in a random medium. The unique capabilities of terahertz time-domain spectroscopy permit the characterization of a multiply scattered field with unprecedented spatial and temporal resolution. With these results, we can develop a framework for understanding the statistics of broadband laser speckle. Also, the ability to extract information on the phase of the field opens up new possibilities for characterizing multiply scattered waves. We illustrate this with a simple example, which involves computing a time-windowed temporal correlation between fields measured at different spatial locations. This enables the identification of individual scattering events, and could lead to a new method for imaging in random media.
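The time-windowed correlation mentioned at the end can be sketched directly; the two field traces below are synthetic stand-ins for waveforms measured at different positions, and the window width is arbitrary.

```python
import numpy as np

# Synthetic terahertz field traces at two positions: a common early arrival
# plus one independent later scattering event in each trace, plus noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 50.0, 2000)                                  # ps
pulse = lambda t0: np.exp(-((t - t0) / 0.5) ** 2) * np.sin(6.0 * (t - t0))
E1 = pulse(10.0) + 0.4 * pulse(22.0) + 0.02 * rng.standard_normal(t.size)
E2 = pulse(10.05) + 0.4 * pulse(30.0) + 0.02 * rng.standard_normal(t.size)

def windowed_correlation(a, b, center, width):
    """Normalized correlation of a and b restricted to |t - center| < width/2."""
    m = np.abs(t - center) < width / 2
    a, b = a[m] - a[m].mean(), b[m] - b[m].mean()
    return np.dot(a, b) / np.sqrt(np.dot(a, a) * np.dot(b, b))

for c in (10.0, 22.0, 30.0):
    print(f"window at {c:4.1f} ps: correlation = {windowed_correlation(E1, E2, c, 4.0):+.2f}")
```

The shared early arrival correlates strongly, while the later, position-specific events do not, which is how individual scattering contributions can be identified.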
Random electric field instabilities of relaxor ferroelectrics
NASA Astrophysics Data System (ADS)
Arce-Gamboa, José R.; Guzmán-Verri, Gian G.
2017-06-01
Relaxor ferroelectrics are complex oxide materials which offer a rather unique setting in which to study the effects of compositional disorder on phase transitions. Here, we study the effects of quenched cubic random electric fields on the lattice instabilities that lead to a ferroelectric transition and show that, within a microscopic model and a statistical mechanical solution, even weak compositional disorder can prohibit the development of long-range order, and that a random field state with anisotropic and power-law correlations of polarization emerges from the combined effect of their characteristic dipole forces and their inherent charge disorder. We compare and reproduce several key experimental observations in the well-studied relaxor PbMg1/3Nb2/3O3-PbTiO3.
Random crystal field effects on the integer and half-integer mixed-spin system
NASA Astrophysics Data System (ADS)
Yigit, Ali; Albayrak, Erhan
2018-05-01
In this work, we have focused on the random crystal field effects on the phase diagrams of the mixed spin-1 and spin-5/2 Ising system obtained by utilizing the exact recursion relations (ERR) on the Bethe lattice (BL). The distribution function P(D_i) = pδ[D_i − D(1 + α)] + (1 − p)δ[D_i − D(1 − α)] is used to randomize the crystal field. The phase diagrams are found to exhibit second- and first-order phase transitions depending on the values of α, D and p. It is also observed that the model displays a tricritical point, an isolated point, a critical end point and three compensation temperatures for suitable values of the system parameters.
Zhu, Jay-Jiguang; Demireva, Petya; Kanner, Andrew A; Pannullo, Susan; Mehdorn, Maximilian; Avgeropoulos, Nicholas; Salmaggi, Andrea; Silvani, Antonio; Goldlust, Samuel; David, Carlos; Benouaich-Amiel, Alexandra
2017-12-01
We characterized health-related quality of life (HRQoL), cognitive, and functional status in newly diagnosed glioblastoma (GBM) patients receiving Tumor treating fields (TTFields) with temozolomide (TMZ) versus TMZ alone in a planned interim analysis of a randomized phase III trial [NCT00916409], which showed significant improvement in progression-free and overall survival with TTFields/TMZ. After radiotherapy with concomitant TMZ, newly diagnosed GBM patients were randomized (2:1) to TTFields/TMZ (n = 210) or TMZ (n = 105). Interim analysis was performed in 315 patients with ≥18 months of follow-up. HRQoL, a secondary endpoint, was evaluated in per-protocol patient population and expressed as change from baseline (CFB) at 3, 6, and 9 months for each subscale in the EORTC QLQ-C30/BN20. Karnofsky performance scores (KPS) and Mini-Mental State Examination scores (MMSE) were assessed. CFB in HRQoL was balanced in treatment groups at the 12-month time point. Initially, HRQoL improved in patients treated with TTFields/TMZ (CFB3: 24% and CFB6: 13%) versus TMZ (CFB3: -7% and CFB6: -17%), though this difference was no longer evident at the 9-month point. General scales, including physical and social functioning, showed no difference at 9 and 12 months. TTFields/TMZ group reported higher concerns of "itchy skin". KPS over 12 months was just below 90 in both groups. Cognitive status (MMSE) was stable over time. HRQoL, KPS, and MMSE were balanced in both groups over time. There was no preliminary evidence that HRQoL, cognitive, and functional status is adversely affected by the continuous use of TTFields.
Industry Bias in Randomized Controlled Trials in General and Abdominal Surgery: An Empirical Study.
Probst, Pascal; Knebel, Phillip; Grummich, Kathrin; Tenckhoff, Solveig; Ulrich, Alexis; Büchler, Markus W; Diener, Markus K
2016-07-01
Industry sponsorship has been identified as a source of bias in several fields of medical science. To date, the influence of industry sponsorship in the field of general and abdominal surgery has not been evaluated. A systematic literature search (1985-2014) was performed in the Cochrane Library, MEDLINE, and EMBASE to identify randomized controlled trials in general and abdominal surgery. Information on funding source, outcome, and methodological quality was extracted. Association of industry sponsorship and positive outcome was expressed as odds ratio (OR) with 95% confidence interval (CI). A χ² test and a multivariate logistic regression analysis with study characteristics and known sources of bias were performed. A total of 7934 articles were screened and 165 randomized controlled trials were included. No difference in methodological quality was found. Industry-funded trials more often presented statistically significant results for the primary endpoint (OR, 2.44; CI, 1.04-5.71; P = 0.04). Eighty-eight of 115 (76.5%) industry-funded trials and 19 of 50 (38.0%) non-industry-funded trials reported a positive outcome (OR, 5.32; CI, 2.60-10.88; P < 0.001). Industry-funded trials more often reported a positive outcome without statistical justification (OR, 5.79; CI, 2.13-15.68; P < 0.001). In a multivariate analysis, funding source remained significantly associated with reporting of positive outcome (P < 0.001). Industry funding of surgical trials leads to exaggerated positive reporting of outcomes. This study emphasizes the necessity for declaration of funding source. Industry involvement in surgical research has to ensure scientific integrity and independence and has to be based on full transparency.
Hierarchical Bayesian spatial models for alcohol availability, drug "hot spots" and violent crime.
Zhu, Li; Gorman, Dennis M; Horel, Scott
2006-12-07
Ecologic studies have shown a relationship between alcohol outlet densities, illicit drug use and violence. The present study examined this relationship in the City of Houston, Texas, using a sample of 439 census tracts. Neighborhood sociostructural covariates, alcohol outlet density, drug crime density and violent crime data were collected for the year 2000, and analyzed using hierarchical Bayesian models. Model selection was accomplished by applying the Deviance Information Criterion. The counts of violent crime in each census tract were modelled as having a conditional Poisson distribution. Four neighbourhood explanatory variables were identified using principal component analysis. The best-fitting model was the one including both unstructured and spatially structured (dependence) random effects. The results showed that drug-law violation explained a greater amount of variance in violent crime rates than alcohol outlet densities. The relative risk for drug-law violation was 2.49 and that for alcohol outlet density was 1.16. Of the neighbourhood sociostructural covariates, males of age 15 to 24 showed an effect on violence, with a 16% decrease in relative risk for each one-standard-deviation increase in that covariate. Both the unstructured heterogeneity random effect and the spatial dependence random effect need to be included in the model. The analysis presented suggests that activity around illicit drug markets is more strongly associated with violent crime than is alcohol outlet density. Unique among the ecological studies in this field, the present study not only shows the direction and magnitude of the impact of neighbourhood sociostructural covariates as well as alcohol and illicit drug activities in a neighbourhood, it also reveals the importance of applying hierarchical Bayesian models in this research field, as both spatial dependence and heterogeneity random effects need to be considered simultaneously.
Scattering of electromagnetic wave by the layer with one-dimensional random inhomogeneities
NASA Astrophysics Data System (ADS)
Kogan, Lev; Zaboronkova, Tatiana; Grigoriev, Gennadii., IV.
A great deal of attention has been paid to the study of the probability characteristics of electromagnetic waves scattered by one-dimensional fluctuations of the medium dielectric permittivity. However, the problem of determining the probability density and the average intensity of the field inside a stochastically inhomogeneous medium with arbitrary extension of the fluctuations has not been considered yet. The purpose of the present report is to find and analyze these functions for a plane electromagnetic wave scattered by a layer with one-dimensional fluctuations of permittivity. We assume that the length and the amplitude of individual fluctuations, as well as the interval between them, are random quantities. All of the indicated fluctuation parameters are treated as independent random values with Gaussian distributions. We consider the stationary-in-time case for both small-scale and large-scale rarefied inhomogeneities. Mathematically, the problem can be reduced to the solution of a Fredholm integral equation of the second kind for the Hertz potential (U). Using the decomposition of the field into a series of multiply scattered waves, we obtain an expression for the probability density of the field of the plane wave and determine the moments of the scattered field. We show that all odd moments of the centered field (U − <U>) are equal to zero and the even moments depend on the intensity. The probability density of the field is found to possess a Gaussian distribution. The average field is small compared with the standard deviation of the scattered-field fluctuations for all considered cases of inhomogeneities. The average intensity of the field is of the order of the standard deviation of the field-intensity fluctuations and decreases as the inhomogeneity length increases in the case of small-scale inhomogeneities. The behavior of the average intensity is more complicated in the case of large-scale medium inhomogeneities: it is an oscillating function of the average fluctuation length if the standard deviation of the fluctuation length is greater than the wavelength, whereas when the standard deviation of the inhomogeneity extension is smaller than the wavelength, the average intensity depends only weakly on the average fluctuation extension. The obtained results may be used for the analysis of electromagnetic wave propagation in media with fluctuating parameters caused by such factors as tree leaves, cumulus clouds, internal gravity waves with a chaotic phase, etc. Acknowledgment: This work was supported by the Russian Foundation for Basic Research (projects 08-02-97026 and 09-05-00450).
Fradin, Cécile
2013-01-01
Magnetotactic bacteria possess organelles called magnetosomes that confer a magnetic moment on the cells, resulting in their partial alignment with external magnetic fields. Here we show that analysis of the trajectories of cells exposed to an external magnetic field can be used to measure the average magnetic dipole moment of a cell population in at least five different ways. We apply this analysis to movies of Magnetospirillum magneticum AMB-1 cells, and compare the values of the magnetic moment obtained in this way to that obtained by direct measurements of magnetosome dimension from electron micrographs. We find that methods relying on the viscous relaxation of the cell orientation give results comparable to that obtained by magnetosome measurements, whereas methods relying on statistical mechanics assumptions give systematically lower values of the magnetic moment. Since the observed distribution of magnetic moments in the population is not sufficient to explain this discrepancy, our results suggest that non-thermal random noise is present in the system, implying that a magnetotactic bacterial population should not be considered as similar to a paramagnetic material. PMID:24349185
NASA Astrophysics Data System (ADS)
Zaim, N.; Zaim, A.; Kerouad, M.
2017-02-01
In this work, the magnetic behavior of a cylindrical nanowire, consisting of a ferromagnetic core of spin-1 atoms surrounded by a ferromagnetic shell of spin-1 atoms, is studied in the presence of a random crystal field interaction. Based on the Metropolis algorithm, Monte Carlo simulation has been used to investigate the effects of the concentration of the random crystal field p, the crystal field D and the shell exchange interaction Js on the phase diagrams and the hysteresis behavior of the system. Some characteristic behaviors have been found, such as first- and second-order phase transitions joined by a tricritical point for appropriate values of the system parameters; triple and isolated critical points can also be found. Depending on the Hamiltonian parameters, single, double and para hysteresis regions are explicitly determined.
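A minimal Metropolis sketch of this kind of simulation is given below for a spin-1 chain in which each site carries crystal field D with probability p and zero otherwise (a common choice of bimodal random crystal field; the paper's exact distribution and its core-shell cylindrical geometry are not reproduced here). The one-dimensional lattice and all parameter values are illustrative.

```python
import numpy as np

# Metropolis sampling of a spin-1 chain with Hamiltonian
#   H = -J * sum_<ij> S_i S_j + sum_i D_i S_i^2,   S_i in {-1, 0, +1},
# where D_i = D with probability p and D_i = 0 otherwise (random crystal field).
rng = np.random.default_rng(0)
N, J, D, p, T, sweeps = 200, 1.0, 0.5, 0.5, 1.0, 500

Di = np.where(rng.random(N) < p, D, 0.0)
S = rng.choice([-1, 0, 1], size=N)

def local_energy(i, s):
    left, right = S[(i - 1) % N], S[(i + 1) % N]   # periodic boundary conditions
    return -J * s * (left + right) + Di[i] * s * s

for _ in range(sweeps * N):
    i = rng.integers(N)
    new = rng.choice([-1, 0, 1])
    dE = local_energy(i, new) - local_energy(i, S[i])
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        S[i] = new

print("magnetization per site:", S.mean(), " quadrupole moment:", (S**2).mean())
```

Phase diagrams of the kind described above are obtained by repeating such runs over grids of (p, D, T) and locating where the order parameters change character.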
3D vector distribution of the electro-magnetic fields on a random gold film
NASA Astrophysics Data System (ADS)
Canneson, Damien; Berini, Bruno; Buil, Stéphanie; Hermier, Jean-Pierre; Quélin, Xavier
2018-05-01
The 3D vector distribution of the electro-magnetic fields at the very close vicinity of the surface of a random gold film is studied. Such films are well known for their properties of light confinement and large fluctuations of local density of optical states. Using Finite-Difference Time-Domain simulations, we show that it is possible to determine the local orientation of the electro-magnetic fields. This allows us to obtain a complete characterization of the fields. Large fluctuations of their amplitude are observed as previously shown. Here, we demonstrate large variations of their direction depending both on the position on the random gold film, and on the distance to it. Such characterization could be useful for a better understanding of applications like the coupling of point-like dipoles to such films.
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Yurkin, Maxim A.
2017-01-01
Although the model of randomly oriented nonspherical particles has been used in a great variety of applications of far-field electromagnetic scattering, it has never been defined in strict mathematical terms. In this Letter we use the formalism of Euler rigid-body rotations to clarify the concept of statistically random particle orientations and derive its immediate corollaries in the form of most general mathematical properties of the orientation-averaged extinction and scattering matrices. Our results serve to provide a rigorous mathematical foundation for numerous publications in which the notion of randomly oriented particles and its light-scattering implications have been considered intuitively obvious.
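The notion of statistically random orientations can be made concrete with a small sketch: Euler angles drawn with α and γ uniform and cos β uniform give the rotation-invariant (uniform) measure, and orientation averages then follow by Monte Carlo. The averaged quantity below is only an illustrative example, not one of the scattering matrices discussed in the Letter.

```python
import numpy as np

# Uniformly random orientations via Euler angles: alpha, gamma uniform on
# [0, 2*pi) and cos(beta) uniform on [-1, 1] reproduce the invariant measure.
rng = np.random.default_rng(0)
n = 100_000
alpha = rng.uniform(0.0, 2 * np.pi, n)
beta = np.arccos(rng.uniform(-1.0, 1.0, n))
gamma = rng.uniform(0.0, 2 * np.pi, n)

# Orientation-average a simple direction-dependent quantity: the squared
# laboratory z-component of the body-fixed z-axis, whose uniform average is 1/3.
u_z = np.cos(beta)
print("orientation average of u_z^2:", np.mean(u_z**2))   # ~ 1/3
```

Sampling the Euler angles uniformly instead of weighting β by sin β would bias such averages, which is one practical consequence of defining "random orientation" rigorously.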
Box-Cox Mixed Logit Model for Travel Behavior Analysis
NASA Astrophysics Data System (ADS)
Orro, Alfonso; Novales, Margarita; Benitez, Francisco G.
2010-09-01
To represent the behavior of travelers when they are deciding how they are going to get to their destination, discrete choice models, based on random utility theory, have become one of the most widely used tools. The field in which these models were developed was halfway between econometrics and transport engineering, although the latter now constitutes one of their principal areas of application. In the transport field, they have mainly been applied to mode choice, but also to the selection of destination, route, and other important decisions such as vehicle ownership. In usual practice, the most frequently employed discrete choice models implement a fixed-coefficient utility function that is linear in the parameters. The principal aim of this paper is to present the viability of specifying utility functions with random coefficients that are nonlinear in the parameters, in applications of discrete choice models to transport. Nonlinear specifications in the parameters were present in discrete choice theory at its outset, although they have seldom been used in practice until recently. The specification of random coefficients, however, began with the probit and the hedonic models in the 1970s, and, after a period of apparently little practical interest, has burgeoned into a field of intense activity in recent years with the new generation of mixed logit models. In this communication, we present a Box-Cox mixed logit model, original to the authors. It includes the estimation of the Box-Cox exponents in addition to the parameters of the random coefficients distribution. The probability of choosing an alternative is an integral that is calculated by simulation. The estimation of the model is carried out by maximizing the simulated log-likelihood of a sample of observed individual choices between alternatives. The differences between the predictions yielded by models that are inconsistent with real behavior have been studied with simulation experiments.
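A compact sketch of the simulated probabilities behind such a model is given below: each attribute enters through a Box-Cox transform x^(λ) = (x^λ − 1)/λ, one coefficient is random across travelers, and the choice probability is the logit formula averaged over coefficient draws. The alternatives, coefficients and λ value are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def box_cox(x, lam):
    """Box-Cox transform of a positive attribute (lam = 0 gives log(x))."""
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

# Two alternatives (e.g. car vs transit) described by travel time and cost.
time = np.array([30.0, 45.0])      # minutes
cost = np.array([5.0, 2.0])        # monetary units

lam_time = 0.5                     # Box-Cox exponent for time (illustrative)
beta_cost = -0.4                   # fixed cost coefficient
mu_time, sigma_time = -0.15, 0.05  # normally distributed random time coefficient

# Simulated mixed-logit probability: average the logit formula over R draws
# of the random coefficient.
R = 1000
beta_time = rng.normal(mu_time, sigma_time, size=(R, 1))
V = beta_time * box_cox(time, lam_time) + beta_cost * cost   # (R, 2) utilities
P = np.exp(V) / np.exp(V).sum(axis=1, keepdims=True)
print("simulated choice probabilities:", P.mean(axis=0))
```

In estimation, the simulated log-likelihood is the sum over observed choices of the log of such averaged probabilities, maximized over the coefficient-distribution parameters and the Box-Cox exponents.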
Ensemble habitat mapping of invasive plant species
Stohlgren, T.J.; Ma, P.; Kumar, S.; Rocca, M.; Morisette, J.T.; Jarnevich, C.S.; Benson, N.
2010-01-01
Ensemble species distribution models combine the strengths of several species environmental matching models, while minimizing the weakness of any one model. Ensemble models may be particularly useful in risk analysis of recently arrived, harmful invasive species because species may not yet have spread to all suitable habitats, leaving species-environment relationships difficult to determine. We tested five individual models (logistic regression, boosted regression trees, random forest, multivariate adaptive regression splines (MARS), and maximum entropy model or Maxent) and ensemble modeling for selected nonnative plant species in Yellowstone and Grand Teton National Parks, Wyoming; Sequoia and Kings Canyon National Parks, California, and areas of interior Alaska. The models are based on field data provided by the park staffs, combined with topographic, climatic, and vegetation predictors derived from satellite data. For the four invasive plant species tested, ensemble models were the only models that ranked in the top three models for both field validation and test data. Ensemble models may be more robust than individual species-environment matching models for risk analysis. © 2010 Society for Risk Analysis.
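A schematic of the ensemble idea, with synthetic data standing in for the park inventories and three scikit-learn classifiers standing in for the five models listed, is sketched below: each model is fitted to presence/absence data and the predicted habitat-suitability probabilities are averaged.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# Synthetic stand-in for field data: environmental predictors (e.g. elevation,
# temperature, NDVI) and presence/absence of an invasive plant.
rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))
presence = (1.5 * X[:, 0] - X[:, 1] + 0.5 * rng.normal(size=n) > 0).astype(int)

models = [
    LogisticRegression(max_iter=1000),
    RandomForestClassifier(n_estimators=200, random_state=0),
    GradientBoostingClassifier(random_state=0),
]

# Ensemble suitability for new locations: average of each model's probability.
X_new = rng.normal(size=(10, 3))
probs = np.mean(
    [m.fit(X, presence).predict_proba(X_new)[:, 1] for m in models], axis=0
)
print("ensemble habitat-suitability scores:", np.round(probs, 2))
```

Averaging over models dampens the idiosyncratic errors of any single species-environment matching method, which is the robustness argument made above.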
Lead Determination and Heterogeneity Analysis in Soil from a Former Firing Range
NASA Astrophysics Data System (ADS)
Urrutia-Goyes, Ricardo; Argyraki, Ariadne; Ornelas-Soto, Nancy
2017-07-01
Public places can have an unknown history of pollutant deposition, and exposure to such contaminants can create environmental and health issues. The characterization of a former firing range in Athens, Greece, will allow its monitoring and encourage its remediation. This study focuses on Pb contamination at the site due to its presence in ammunition. A dense sampling design with 91 locations (10 m apart) was used to determine the spatial distribution of the element in the surface soil of the study area. Duplicate samples were also collected one meter apart at 8 random locations to estimate the heterogeneity of the site. Elemental concentrations were measured using a portable XRF device after simple sample homogenization in the field. Robust analysis of variance showed that the contributions to the total variance were 11% from sampling, 1% analytical, and 88% geochemical, reflecting the suitability of the technique. Moreover, the extended random uncertainty relative to the mean concentration was 91.5%, confirming the high heterogeneity of the site. Statistical analysis indicated very high contamination in the area, suggesting the need for more in-depth analysis of other contaminants and possible health risks.
NASA Astrophysics Data System (ADS)
Albeverio, Sergio; Tamura, Hiroshi
2018-04-01
We consider a model describing the coupling of a vector-valued and a scalar homogeneous Markovian random field over R⁴, interpreted as expressing the interaction between a charged scalar quantum field coupled with a nonlinear quantized electromagnetic field. Expectations of functionals of the random fields are expressed by Brownian bridges. Using this, together with Feynman-Kac-Itô type formulae and estimates on the small time and large time behaviour of Brownian functionals, we prove asymptotic upper and lower bounds on the kernel of the transition semigroup for our model. The upper bound gives faster than exponential decay for large distances of the corresponding resolvent (propagator).
Antonov, N V; Gulitskiy, N M; Kostenko, M M; Malyshev, A V
2018-03-01
In this paper we consider the model of incompressible fluid described by the stochastic Navier-Stokes equation with finite correlation time of a random force. Inertial-range asymptotic behavior of fully developed turbulence is studied by means of the field theoretic renormalization group within the one-loop approximation. It is corroborated that regardless of the values of model parameters and initial data the inertial-range behavior of the model is described by the limiting case of vanishing correlation time. This indicates that the Galilean symmetry of the model violated by the "colored" random force is restored in the inertial range. This regime corresponds to the only nontrivial fixed point of the renormalization group equation. The stability of this point depends on the relation between the exponents in the energy spectrum E∝k^{1-y} and the dispersion law ω∝k^{2-η}. The second analyzed problem is the passive advection of a scalar field by this velocity ensemble. Correlation functions of the scalar field exhibit anomalous scaling behavior in the inertial-convective range. We demonstrate that in accordance with Kolmogorov's hypothesis of the local symmetry restoration the main contribution to the operator product expansion is given by the isotropic operator, while anisotropic terms should be considered only as corrections.
NASA Astrophysics Data System (ADS)
Wang, Zuowei; Biwa, Shiro
2018-03-01
A numerical procedure is proposed for the multiple scattering analysis of flexural waves on a thin plate with circular holes based on the Kirchhoff plate theory. The numerical procedure utilizes the wave function expansion of the exciting as well as scattered fields, and the boundary conditions at the periphery of holes are incorporated as the relations between the expansion coefficients of exciting and scattered fields. A set of linear algebraic equations with respect to the wave expansion coefficients of the exciting field alone is established by the numerical collocation method. To demonstrate the applicability of the procedure, the stop band characteristics of flexural waves are analyzed for different arrangements and concentrations of circular holes on a steel plate. The energy transmission spectra of flexural waves are shown to capture the detailed features of the stop band formation of regular and random arrangements of holes. The increase of the concentration of holes is found to shift the dips of the energy transmission spectra toward higher frequencies as well as deepen them. The hexagonal hole arrangement can form a much broader stop band than the square hole arrangement for flexural wave transmission. It is also demonstrated that random arrangements of holes make the transmission spectrum more complicated.
Digital servo control of random sound fields
NASA Technical Reports Server (NTRS)
Nakich, R. B.
1973-01-01
It is necessary to place a number of sensors at different positions in the sound field to determine the actual sound intensities to which the test object is subjected, and thereby whether the specification is being met adequately or exceeded. Since the excitation is of a random nature, the signals are essentially incoherent and it is impossible to obtain a true average.
Random potentials and cosmological attractors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linde, Andrei, E-mail: alinde@stanford.edu
I show that the problem of realizing inflation in theories with random potentials of a limited number of fields can be solved, and agreement with the observational data can be naturally achieved if at least one of these fields has a non-minimal kinetic term of the type used in the theory of cosmological α-attractors.
Random phase approximation and cluster mean field studies of hard core Bose Hubbard model
NASA Astrophysics Data System (ADS)
Alavani, Bhargav K.; Gaude, Pallavi P.; Pai, Ramesh V.
2018-04-01
We investigate zero temperature and finite temperature properties of the Bose Hubbard Model in the hard core limit using Random Phase Approximation (RPA) and Cluster Mean Field Theory (CMFT). We show that our RPA calculations are able to capture quantum and thermal fluctuations significantly better than CMFT.
NASA Astrophysics Data System (ADS)
Berger, Noam; Mukherjee, Chiranjib; Okamura, Kazuki
2018-03-01
We prove a quenched large deviation principle (LDP) for a simple random walk on a supercritical percolation cluster (SRWPC) on Z^d (d ≥ 2). The models of interest include classical Bernoulli bond and site percolation as well as models that exhibit long range correlations, like the random cluster model, the random interlacement and the vacant set of random interlacements (for d ≥ 3) and the level sets of the Gaussian free field (d ≥ 3). Inspired by the methods developed by Kosygina et al. (Commun Pure Appl Math 59:1489-1521, 2006) for proving quenched LDP for elliptic diffusions with a random drift, and by Yilmaz (Commun Pure Appl Math 62(8):1033-1075, 2009) and Rosenbluth (Quenched large deviations for multidimensional random walks in a random environment: a variational formula. Ph.D. thesis, NYU, arXiv:0804.1444v1) for similar results regarding elliptic random walks in random environment, we take the point of view of the moving particle and prove a large deviation principle for the quenched distribution of the pair empirical measures of the environment Markov chain in the non-elliptic case of SRWPC. Via a contraction principle, this reduces easily to a quenched LDP for the distribution of the mean velocity of the random walk and both rate functions admit explicit variational formulas. The main difficulty in our setup lies in the inherent non-ellipticity as well as the lack of translation-invariance stemming from conditioning on the fact that the origin belongs to the infinite cluster. We develop a unifying approach for proving quenched large deviations for SRWPC based on exploiting coercivity properties of the relative entropies in the context of convex variational analysis, combined with input from ergodic theory and invoking geometric properties of the supercritical percolation cluster.
Structure of receptive fields in a computational model of area 3b of primary sensory cortex.
Detorakis, Georgios Is; Rougier, Nicolas P
2014-01-01
In a previous work, we introduced a computational model of area 3b which is built upon neural field theory and receives input from a simplified model of the index distal finger pad populated by a random set of touch receptors (Merkel cells). This model has been shown to be able to self-organize following random stimulation of the finger pad model and to cope, to some extent, with cortical or skin lesions. The main hypothesis of the model is that learning of skin representations occurs at the thalamo-cortical level, while cortico-cortical connections serve a stereotyped competition mechanism that shapes the receptive fields. To further assess this hypothesis and the validity of the model, we reproduced in this article the exact experimental protocol of DiCarlo et al. that has been used to examine the structure of receptive fields in area 3b of the primary somatosensory cortex. Using the same analysis toolset, the model yields consistent results, with most of the receptive fields containing a single region of excitation and one to several regions of inhibition. We further extended our study with a dynamic competition that deeply influences the formation of the receptive fields. We hypothesized this dynamic competition to correspond to some form of somatosensory attention that may help to precisely shape the receptive fields. To test this hypothesis, we designed a protocol where an arbitrary region of interest is delineated on the index distal finger pad and we either (1) explicitly instructed the model to attend to this region (simulating an attentional signal), (2) preferentially trained the model on this region, or (3) combined the two aforementioned protocols simultaneously. Results tend to confirm that dynamic competition leads to shrunken receptive fields and that its joint interaction with intensive training promotes massive receptive field migration and shrinkage.
Localized surface plasmon enhanced cellular imaging using random metallic structures
NASA Astrophysics Data System (ADS)
Son, Taehwang; Lee, Wonju; Kim, Donghyun
2017-02-01
We have studied fluorescence cellular imaging with randomly distributed localized near-fields induced by silver nano-islands. For the fabrication of the nano-islands, a 10-nm silver thin film was evaporated on a BK7 glass substrate with a 2-nm chromium adhesion layer. A micrometer-sized silver square pattern was defined using e-beam lithography, and the film was then annealed at 200°C. Raw images were restored using the electric field distribution produced on the surface of the random nano-islands. The nano-islands were modeled from SEM images. A 488-nm p-polarized light source was set to be incident at 60°. Simulation results show that localized electric fields were created among the nano-islands and that their average size was 135 nm. The feasibility was tested using conventional total internal reflection fluorescence microscopy while the angle of incidence was adjusted to maximize field enhancement. Mouse macrophage cells were cultured on the nano-islands, and actin filaments were selectively stained with FITC-conjugated phalloidin. Acquired images were deconvolved based on linear imaging theory, in which the molecular distribution is sampled by the randomly distributed localized near-fields and blurred by the point spread function of the far-field optics. The optimum fluorophore distribution was probabilistically estimated by repeatedly matching the raw image. The deconvolved images are estimated to have a resolution in the range of 100-150 nm, largely determined by the size of the localized near-fields. We also discuss and compare the results with images acquired with periodic nano-aperture arrays in various optical configurations to excite localized plasmonic fields and produce super-resolved molecular images.
Nanoscale live cell optical imaging of the dynamics of intracellular microvesicles in neural cells.
Lee, Sohee; Heo, Chaejeong; Suh, Minah; Lee, Young Hee
2013-11-01
Recent advances in biotechnology and imaging technology have provided great opportunities to investigate cellular dynamics. Conventional imaging methods such as transmission electron microscopy, scanning electron microscopy, and atomic force microscopy are powerful techniques for cellular imaging, even at the nanoscale level. However, these techniques have limited applicability in live cell imaging because of the experimental preparation required, namely cell fixation, and the innately small field of view. In this study, we developed a nanoscale optical imaging (NOI) system that combines a conventional optical microscope with a high-resolution dark-field condenser (Cytoviva, Inc.) and halogen illuminator. The NOI system's maximum resolution for live cell imaging is around 100 nm. We utilized NOI to investigate the dynamics of intracellular microvesicles of neural cells without immunocytological analysis. In particular, we studied direct, active random, and moderate random dynamic motions of intracellular microvesicles and visualized lysosomal vesicle changes after treatment of cells with a lysosomal inhibitor (NH4Cl). Our results indicate that the NOI system is a feasible, high-resolution optical imaging system for small organelles in live cells that does not require complicated optics or immunocytological staining processes.
NASA Astrophysics Data System (ADS)
Liu, Lian; Yang, Xiukun; Zhong, Mingliang; Liu, Yao; Jing, Xiaojun; Yang, Qin
2018-04-01
The discrete fractional Brownian incremental random (DFBIR) field is used to describe the irregular, random, and highly complex shapes of natural objects such as coastlines and biological tissues, for which traditional Euclidean geometry cannot be used. In this paper, an anisotropic variable window (AVW) directional operator based on the DFBIR field model is proposed for extracting spatial characteristics of Fourier transform infrared spectroscopy (FTIR) microscopic imaging. Probabilistic principal component analysis first extracts spectral features, and then the spatial features of the proposed AVW directional operator are combined with the former to construct a spatial-spectral structure, which increases feature-related information and helps a support vector machine classifier to obtain more efficient distribution-related information. Compared to Haralick’s grey-level co-occurrence matrix, Gabor filters, and local binary patterns (e.g. uniform LBPs, rotation-invariant LBPs, uniform rotation-invariant LBPs), experiments on three FTIR spectroscopy microscopic imaging datasets show that the proposed AVW directional operator is more advantageous in terms of classification accuracy, particularly for low-dimensional spaces of spatial characteristics.
Central Limit Theorem for Exponentially Quasi-local Statistics of Spin Models on Cayley Graphs
NASA Astrophysics Data System (ADS)
Reddy, Tulasi Ram; Vadlamani, Sreekar; Yogeshwaran, D.
2018-04-01
Central limit theorems for linear statistics of lattice random fields (including spin models) are usually proven under suitable mixing conditions or quasi-associativity. Many interesting examples of spin models do not satisfy mixing conditions, and on the other hand, it does not seem easy to show central limit theorem for local statistics via quasi-associativity. In this work, we prove general central limit theorems for local statistics and exponentially quasi-local statistics of spin models on discrete Cayley graphs with polynomial growth. Further, we supplement these results by proving similar central limit theorems for random fields on discrete Cayley graphs taking values in a countable space, but under the stronger assumptions of α -mixing (for local statistics) and exponential α -mixing (for exponentially quasi-local statistics). All our central limit theorems assume a suitable variance lower bound like many others in the literature. We illustrate our general central limit theorem with specific examples of lattice spin models and statistics arising in computational topology, statistical physics and random networks. Examples of clustering spin models include quasi-associated spin models with fast decaying covariances like the off-critical Ising model, level sets of Gaussian random fields with fast decaying covariances like the massive Gaussian free field and determinantal point processes with fast decaying kernels. Examples of local statistics include intrinsic volumes, face counts, component counts of random cubical complexes while exponentially quasi-local statistics include nearest neighbour distances in spin models and Betti numbers of sub-critical random cubical complexes.
Greenslade, Penelope; Florentine, Singarayer K; Hansen, Brigita D; Gell, Peter A
2016-08-01
Monitoring forms the basis for understanding ecological change. It relies on repeatability of methods to ensure detected changes accurately reflect the effect of environmental drivers. However, operator bias can influence the repeatability of field and laboratory work. We tested this for invertebrates and diatoms in three trials: (1) two operators swept invertebrates from heath vegetation, (2) four operators picked invertebrates from pyrethrum knockdown samples from tree trunk and (3) diatom identifications by eight operators in three laboratories. In each trial, operators were working simultaneously and their training in the field and laboratory was identical. No variation in catch efficiency was found between the two operators of differing experience using a random number of net sweeps to catch invertebrates when sequence, location and size of sweeps were random. Number of individuals and higher taxa collected by four operators from tree trunks varied significantly between operators and with their 'experience ranking'. Diatom identifications made by eight operators were clustered together according to which of three laboratories they belonged. These three tests demonstrated significant potential bias of operators in both field and laboratory. This is the first documented case demonstrating the significant influence of observer bias on results from invertebrate field-based studies. Examples of two long-term trials are also given that illustrate further operator bias. Our results suggest that long-term ecological studies using invertebrates need to be rigorously audited to ensure that operator bias is accounted for during analysis and interpretation. Further, taxonomic harmonisation remains an important step in merging field and laboratory data collected by different operators.
Durga, Padmaja; Raavula, Parvathi; Gurajala, Indira; Gunnam, Poojita; Veerabathula, Prardhana; Reddy, Mukund; Upputuri, Omkar; Ramachandran, Gopinath
2015-09-01
To assess the effect of tranexamic acid on the quality of the surgical field. Prospective, randomized, double-blind study. Institutional, tertiary referral hospital. American Society of Anesthesiologists physical status class I patients, aged 8 to 60 months with Group II or III (Balakrishnan's classification) clefts scheduled for cleft palate repair. Children were randomized into two groups. The control group received saline, and the tranexamic acid group received tranexamic acid 10 mg/kg as a bolus, 15 minutes before incision. Grade of surgical field on a 10-point scale, surgeon satisfaction, and primary hemorrhage. Significant improvements were noted in surgeon satisfaction and median grade of assessment of the surgical field (4 [interquartile range, 4 to 6] in the control group vs. 3 [interquartile range, 2 to 4] in the test group; P = .003) in the tranexamic acid group compared to the control group. Preincision administration of 10 mg/kg of tranexamic acid significantly improved the surgical field during cleft palate repair.
NASA Astrophysics Data System (ADS)
Schießl, Stefan P.; Rother, Marcel; Lüttgens, Jan; Zaumseil, Jana
2017-11-01
The field-effect mobility is an important figure of merit for semiconductors such as random networks of single-walled carbon nanotubes (SWNTs). However, owing to their network properties and quantum capacitance, the standard models for field-effect transistors cannot be applied without modifications. Several different methods are used to determine the mobility with often very different results. We fabricated and characterized field-effect transistors with different polymer-sorted, semiconducting SWNT network densities ranging from low (≈6 μm-1) to densely packed quasi-monolayers (≈26 μm-1) with a maximum on-conductance of 0.24 μS μm-1 and compared four different techniques to evaluate the field-effect mobility. We demonstrate the limits and requirements for each method with regard to device layout and carrier accumulation. We find that techniques that take into account the measured capacitance on the active device give the most reliable mobility values. Finally, we compare our experimental results to a random-resistor-network model.
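As a point of reference only (not the authors' evaluation code), a linear-regime field-effect mobility extraction from a measured transfer curve might be sketched as follows; the relation mu = L^2 g_m / (C_tot V_DS) uses the gate capacitance measured on the active device, and the channel length, capacitance and bias values below are hypothetical.

```python
import numpy as np

def linear_regime_mobility(v_g, i_d, c_tot, length, v_ds):
    """Peak linear-regime field-effect mobility (cm^2/Vs) from a transfer curve.

    v_g    : gate voltage sweep (V)
    i_d    : drain current (A)
    c_tot  : total gate capacitance of the channel area (F), ideally measured
             on the active device rather than estimated from geometry
    length : channel length (cm)
    v_ds   : small drain-source bias (V), so the device stays in the linear regime
    """
    g_m = np.gradient(i_d, v_g)                          # transconductance dI_D/dV_G
    mu = np.abs(g_m) * length**2 / (c_tot * abs(v_ds))   # mu = L^2 |g_m| / (C_tot V_DS)
    return mu.max()                                      # peak value is commonly quoted

# purely hypothetical numbers: 20 um channel (in cm), 5 pF measured gate capacitance
v_g = np.linspace(-5, 5, 201)
i_d = 1e-6 * np.clip(1.0 - v_g, 0.0, None)               # crude p-type transfer curve
print(linear_regime_mobility(v_g, i_d, c_tot=5e-12, length=20e-4, v_ds=-0.1))
```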
Seven lessons from manyfield inflation in random potentials
NASA Astrophysics Data System (ADS)
Dias, Mafalda; Frazer, Jonathan; Marsh, M. C. David
2018-01-01
We study inflation in models with many interacting fields subject to randomly generated scalar potentials. We use methods from non-equilibrium random matrix theory to construct the potentials and an adaption of the `transport method' to evolve the two-point correlators during inflation. This construction allows, for the first time, for an explicit study of models with up to 100 interacting fields supporting a period of `approximately saddle-point' inflation. We determine the statistical predictions for observables by generating over 30,000 models with 2–100 fields supporting at least 60 efolds of inflation. These studies lead us to seven lessons: i) Manyfield inflation is not single-field inflation, ii) The larger the number of fields, the simpler and sharper the predictions, iii) Planck compatibility is not rare, but future experiments may rule out this class of models, iv) The smoother the potentials, the sharper the predictions, v) Hyperparameters can transition from stiff to sloppy, vi) Despite tachyons, isocurvature can decay, vii) Eigenvalue repulsion drives the predictions. We conclude that many of the `generic predictions' of single-field inflation can be emergent features of complex inflation models.
NASA Astrophysics Data System (ADS)
Pradillo, Gerardo; Heintz, Aneesh; Vlahovska, Petia
2017-11-01
The spontaneous rotation of a sphere in an applied uniform DC electric field (Quincke effect) has been utilized to engineer self-propelled particles: if the sphere is initially resting on a surface, it rolls. The Quincke rollers have been widely used as a model system to study collective behavior in "active" suspensions. If the applied field is DC, an isolated Quincke roller follows a straight-line trajectory. In this talk, we discuss the design of a Quincke roller that executes a random-walk-like behavior. We utilize an AC field: upon reversal of the field direction, a fluctuation in the axis of rotation (which is degenerate in the plane perpendicular to the field and parallel to the surface) introduces randomness in the direction of motion. The MSD of an isolated Quincke walker depends on the frequency, amplitude, and waveform of the electric field. Experiment and theory are compared. We also investigate the collective behavior of Quincke walkers, the transport of inert particles in a bath of Quincke walkers, and the spontaneous motion of a drop containing Quincke active particles. Supported by NSF Grant CBET 1437545.
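For context on the MSD quantity mentioned above, a time-origin-averaged mean-squared displacement can be estimated from a recorded 2D trajectory as in the following sketch; the trajectory generated here is synthetic (straight runs with random reorientations), not experimental data.

```python
import numpy as np

def mean_squared_displacement(xy, max_lag):
    """MSD of a 2D trajectory xy of shape (n_frames, 2), averaged over time origins."""
    msd = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        disp = xy[lag:] - xy[:-lag]
        msd[lag - 1] = np.mean(np.sum(disp**2, axis=1))
    return msd

# synthetic run-and-reorient trajectory: heading changes every 50 frames
rng = np.random.default_rng(0)
headings = np.repeat(rng.uniform(0, 2 * np.pi, 100), 50)
xy = np.cumsum(np.column_stack([np.cos(headings), np.sin(headings)]), axis=0)
print(mean_squared_displacement(xy, max_lag=200)[:5])
```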
Experimental nonlinear dynamical studies in cesium magneto-optical trap using time-series analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anwar, M., E-mail: mamalik2000@gmail.com; Islam, R.; Faisal, M.
2015-03-30
A magneto-optical trap of neutral atoms is essentially a dissipative quantum system. The fast thermal atoms continuously dissipate their energy to the environment via spontaneous emissions during the cooling. The atoms are, therefore, strongly coupled with the vacuum reservoir and the laser field. The vacuum fluctuations as well as the field fluctuations are imparted to the atoms as random photon recoils. Consequently, the external and internal dynamics of atoms becomes stochastic. In this paper, we have investigated the stochastic dynamics of the atoms in a magneto-optical trap during the loading process. The time series analysis of the fluorescence signal shows that the dynamics of the atoms evolves, like all dissipative systems, from deterministic to the chaotic regime. The subsequent disappearance and revival of chaos was attributed to chaos synchronization between spatially different atoms in the magneto-optical trap.
NASA Astrophysics Data System (ADS)
Staroń, Waldemar; Herbowski, Leszek; Gurgul, Henryk
2007-04-01
The goal of this work was to determine the values of cumulative parameters of the cerebrospinal fluid. These parameters characterise, in a statistical sense, cerebrospinal fluid obtained by puncture from patients examined for suspected normotensive hydrocephalus. Cerebrospinal fluid taken by puncture during routine examinations of patients suspected of normotensive hydrocephalus was analysed. The paper presents the results of examinations of several dozen puncture samples of cerebrospinal fluid from various patients. Each sample was examined under the microscope and photographed in 20 randomly chosen places. On the basis of analysis of images covering an area of 100 × 100 μm, selected cumulative parameters such as count, numerical density, field area and field perimeter were determined for each sample, and the average value of each parameter was then computed.
Random numbers from vacuum fluctuations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com; Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543
2016-07-25
We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
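The abstract does not specify the register, so the following sketch only illustrates the mechanics of a Fibonacci linear feedback shift register; in the cited scheme the register is used to hash the digitized vacuum-noise samples rather than as a stand-alone generator, and the 16-bit length and tap positions shown here are illustrative, not those of the actual device.

```python
def lfsr_bits(seed, taps, nbits):
    """Generate nbits output bits from a Fibonacci linear feedback shift register.

    seed : initial register contents as a non-zero integer
    taps : bit positions (0-indexed from the LSB) XORed to form the feedback bit
    """
    state = seed
    width = max(taps) + 1
    out = []
    for _ in range(nbits):
        feedback = 0
        for t in taps:
            feedback ^= (state >> t) & 1
        out.append(state & 1)                          # emit the least significant bit
        state = (state >> 1) | (feedback << (width - 1))
    return out

# illustrative 16-bit register with taps giving a maximal-length sequence
print(lfsr_bits(seed=0xACE1, taps=[15, 13, 12, 10], nbits=16))
```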
Souza, Roberto; Lucena, Oeslle; Garrafa, Julia; Gobbi, David; Saluzzi, Marina; Appenzeller, Simone; Rittner, Letícia; Frayne, Richard; Lotufo, Roberto
2018-04-15
This paper presents an open, multi-vendor, multi-field strength magnetic resonance (MR) T1-weighted volumetric brain imaging dataset, named Calgary-Campinas-359 (CC-359). The dataset is composed of images of older healthy adults (29-80 years) acquired on scanners from three vendors (Siemens, Philips and General Electric) at both 1.5 T and 3 T. CC-359 comprises 359 datasets, approximately 60 subjects per vendor and magnetic field strength. The dataset is approximately age and gender balanced, subject to the constraints of the available images. It provides consensus brain extraction masks for all volumes generated using supervised classification. Manual segmentation results, performed by an expert for twelve randomly selected subjects, are also provided. The CC-359 dataset allows investigation of 1) the influences of both vendor and magnetic field strength on quantitative analysis of brain MR; 2) parameter optimization for automatic segmentation methods; and potentially 3) machine learning classifiers with big data, specifically those based on deep learning methods, as these approaches require a large amount of data. To illustrate the utility of this dataset, we compared the results of eight publicly available skull stripping methods and one publicly available consensus algorithm to the results of a supervised classifier. A linear mixed effects model analysis indicated that vendor (p-value<0.001) and magnetic field strength (p-value<0.001) have statistically significant impacts on skull stripping results. Copyright © 2017 Elsevier Inc. All rights reserved.
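The mixed-model analysis itself is not given in code in the abstract; as a rough sketch of how such an analysis might be set up with statsmodels (synthetic data, hypothetical column names, not the authors' pipeline):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "vendor":  rng.choice(["Siemens", "Philips", "GE"], n),
    "field":   rng.choice(["1.5T", "3T"], n),
    "subject": rng.integers(0, 30, n),          # hypothetical grouping factor
})
# hypothetical skull-stripping quality score (e.g. Dice coefficient)
df["dice"] = 0.93 + 0.01 * (df["field"] == "3T") + rng.normal(0, 0.02, n)

# fixed effects for vendor and field strength, random intercept per subject
result = smf.mixedlm("dice ~ vendor + field", data=df, groups=df["subject"]).fit()
print(result.summary())
```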
Prognostic interaction patterns in diabetes mellitus II: A random-matrix-theory relation
NASA Astrophysics Data System (ADS)
Rai, Aparna; Pawar, Amit Kumar; Jalan, Sarika
2015-08-01
We analyze protein-protein interactions in diabetes mellitus II and its normal counterpart under the combined framework of random matrix theory and network biology. This disease is the fifth-leading cause of death in high-income countries and an epidemic in developing countries, affecting around 8 % of the total adult population in the world. Treatment at the advanced stage is difficult and challenging, making early detection a high priority in the cure of the disease. Our investigation reveals specific structural patterns important for the occurrence of the disease. In addition to the structural parameters, the spectral properties reveal the top contributing nodes from localized eigenvectors, which turn out to be significant for the occurrence of the disease. Our analysis is time-efficient and cost-effective, bringing a new horizon in the field of medicine by highlighting major pathways involved in the disease. The analysis provides a direction for the development of novel drugs and therapies in curing the disease by targeting specific interaction patterns instead of a single protein.
Pala, M G; Baltazar, S; Martins, F; Hackens, B; Sellier, H; Ouisse, T; Bayot, V; Huant, S
2009-07-01
We study scanning gate microscopy (SGM) in open quantum rings obtained from buried semiconductor InGaAs/InAlAs heterostructures. By performing a theoretical analysis based on the Keldysh-Green function approach we interpret the radial fringes observed in experiments as the effect of randomly distributed charged defects. We associate SGM conductance images with the local density of states (LDOS) of the system. We show that such an association cannot be made with the current density distribution. By varying an external magnetic field we are able to reproduce recursive quasi-classical orbits in LDOS and conductance images, which bear the same periodicity as the Aharonov-Bohm effect.
A statistical model for radar images of agricultural scenes
NASA Technical Reports Server (NTRS)
Frost, V. S.; Shanmugan, K. S.; Holtzman, J. C.; Stiles, J. A.
1982-01-01
The presently derived and validated statistical model for radar images containing many different homogeneous fields predicts the probability density functions of radar images of entire agricultural scenes, thereby allowing histograms of large scenes composed of a variety of crops to be described. Seasat-A SAR images of agricultural scenes are accurately predicted by the model on the basis of three assumptions: each field has the same SNR, all target classes cover approximately the same area, and the true reflectivity characterizing each individual target class is a uniformly distributed random variable. The model is expected to be useful in the design of data processing algorithms and for scene analysis using radar images.
Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien
2018-01-01
We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.
Application of Methods of Numerical Analysis to Physical and Engineering Data.
1980-10-15
directed algorithm would seem to be called for. However, 1(0) is itself a random process, making its gradient too unreliable for such a sensitive algorithm... radiation energy on the detector. Active laser systems, on the other hand, have now created the possibility for extremely narrow path band systems... emitted by the earth and its atmosphere. The broad spectral range was selected so that the field of view of the detector could be narrowed to obtain
Jeong, Chan-Seok; Kim, Dongsup
2016-02-24
Elucidating the cooperative mechanism of interconnected residues is an important component toward understanding the biological function of a protein. Coevolution analysis has been developed to model the coevolutionary information reflecting structural and functional constraints. Recently, several methods have been developed based on a probabilistic graphical model called the Markov random field (MRF), which have led to significant improvements for coevolution analysis; however, thus far, the performance of these models has mainly been assessed by focusing on the aspect of protein structure. In this study, we built an MRF model whose graphical topology is determined by the residue proximity in the protein structure, and derived a novel positional coevolution estimate utilizing the node weight of the MRF model. This structure-based MRF method was evaluated for three data sets, each of which annotates catalytic site, allosteric site, and comprehensively determined functional site information. We demonstrate that the structure-based MRF architecture can encode the evolutionary information associated with biological function. Furthermore, we show that the node weight can more accurately represent positional coevolution information compared to the edge weight. Lastly, we demonstrate that the structure-based MRF model can be reliably built with only a few aligned sequences in linear time. The results show that adoption of a structure-based architecture could be an acceptable approximation for coevolution modeling with efficient computation complexity.
Robinson, Sean; Guyon, Laurent; Nevalainen, Jaakko; Toriseva, Mervi; Åkerfelt, Malin; Nees, Matthias
2015-01-01
Organotypic, three dimensional (3D) cell culture models of epithelial tumour types such as prostate cancer recapitulate key aspects of the architecture and histology of solid cancers. Morphometric analysis of multicellular 3D organoids is particularly important when additional components such as the extracellular matrix and tumour microenvironment are included in the model. The complexity of such models has so far limited their successful implementation. There is a great need for automatic, accurate and robust image segmentation tools to facilitate the analysis of such biologically relevant 3D cell culture models. We present a segmentation method based on Markov random fields (MRFs) and illustrate our method using 3D stack image data from an organotypic 3D model of prostate cancer cells co-cultured with cancer-associated fibroblasts (CAFs). The 3D segmentation output suggests that these cell types are in physical contact with each other within the model, which has important implications for tumour biology. Segmentation performance is quantified using ground truth labels and we show how each step of our method increases segmentation accuracy. We provide the ground truth labels along with the image data and code. Using independent image data we show that our segmentation method is also more generally applicable to other types of cellular microscopy and not only limited to fluorescence microscopy.
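The paper's own 3D segmentation pipeline is not reproduced here; the following is only a minimal generic sketch of the MRF idea it builds on, smoothing a noisy two-class label field with a Potts prior via iterated conditional modes (ICM) on a small synthetic 2D image.

```python
import numpy as np

def icm_potts(unary, beta=1.0, n_iter=5):
    """Smooth a label field with a Potts-prior MRF via iterated conditional modes.

    unary : array (H, W, K) of per-pixel costs for each of K labels
            (e.g. negative log-likelihoods from an intensity model)
    beta  : strength of the pairwise smoothness term
    """
    labels = unary.argmin(axis=2)
    H, W, K = unary.shape
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                costs = unary[i, j].copy()
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):   # 4-neighbourhood
                    ni, nj = i + di, j + dj
                    if 0 <= ni < H and 0 <= nj < W:
                        # Potts penalty: pay beta for disagreeing with each neighbour
                        costs += beta * (np.arange(K) != labels[ni, nj])
                labels[i, j] = costs.argmin()
    return labels

# toy example: two-class segmentation of a noisy square
rng = np.random.default_rng(1)
img = np.zeros((40, 40)); img[10:30, 10:30] = 1.0
noisy = img + rng.normal(0, 0.6, img.shape)
unary = np.stack([(noisy - 0.0)**2, (noisy - 1.0)**2], axis=2)   # squared-error data term
seg = icm_potts(unary, beta=1.5)
print(seg.sum(), "foreground pixels")
```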
Controlling dispersion forces between small particles with artificially created random light fields
Brügger, Georges; Froufe-Pérez, Luis S.; Scheffold, Frank; José Sáenz, Juan
2015-01-01
Appropriate combinations of laser beams can be used to trap and manipulate small particles with optical tweezers as well as to induce significant optical binding forces between particles. These interaction forces are usually strongly anisotropic depending on the interference landscape of the external fields. This is in contrast with the familiar isotropic, translationally invariant, van der Waals and, in general, Casimir–Lifshitz interactions between neutral bodies arising from random electromagnetic waves generated by equilibrium quantum and thermal fluctuations. Here we show, both theoretically and experimentally, that dispersion forces between small colloidal particles can also be induced and controlled using artificially created fluctuating light fields. Using optical tweezers as a gauge, we present experimental evidence for the predicted isotropic attractive interactions between dielectric microspheres induced by laser-generated, random light fields. These light-induced interactions open a path towards the control of translationally invariant interactions with tuneable strength and range in colloidal systems. PMID:26096622
Driving a Superconductor to Insulator Transition with Random Gauge Fields.
Nguyen, H Q; Hollen, S M; Shainline, J; Xu, J M; Valles, J M
2016-11-30
Typically the disorder that alters the interference of particle waves to produce Anderson localization is potential scattering from randomly placed impurities. Here we show that disorder in the form of random gauge fields that act directly on particle phases can also drive localization. We present evidence of a superfluid Bose glass to insulator transition at a critical level of this gauge field disorder in a nano-patterned array of amorphous Bi islands. This transition shows signs of metallic transport near the critical point characterized by a resistance, indicative of a quantum phase transition. The critical disorder depends on interisland coupling in agreement with recent Quantum Monte Carlo simulations. We discuss how this disorder-tuned SIT differs from the common frustration-tuned SIT that also occurs in magnetic fields. Its discovery enables new high-fidelity comparisons between theoretical and experimental studies of disorder effects on quantum critical systems.
ERIC Educational Resources Information Center
Roeser, Robert W.; Schonert-Reichl, Kimberly A.; Jha, Amishi; Cullen, Margaret; Wallace, Linda; Wilensky, Rona; Oberle, Eva; Thomson, Kimberly; Taylor, Cynthia; Harrison, Jessica
2013-01-01
The effects of randomization to mindfulness training (MT) or to a waitlist-control condition on psychological and physiological indicators of teachers' occupational stress and burnout were examined in 2 field trials. The sample included 113 elementary and secondary school teachers (89% female) from Canada and the United States. Measures were…
ERIC Educational Resources Information Center
Al Otaiba, Stephanie; Lake, Vickie E.; Greulich, Luana; Folsom, Jessica S.; Guidry, Lisa
2012-01-01
This randomized-control trial examined the learning of preservice teachers taking an initial Early Literacy course in an early childhood education program and of the kindergarten or first grade students they tutored in their field experience. Preservice teachers were randomly assigned to one of two tutoring programs: Book Buddies and Tutor…
A Multisite Cluster Randomized Field Trial of Open Court Reading
ERIC Educational Resources Information Center
Borman, Geoffrey D.; Dowling, N. Maritza; Schneck, Carrie
2008-01-01
In this article, the authors report achievement outcomes of a multisite cluster randomized field trial of Open Court Reading 2005 (OCR), a K-6 literacy curriculum published by SRA/McGraw-Hill. The participants are 49 first-grade through fifth-grade classrooms from predominantly minority and poor contexts across the nation. Blocking by grade level…
Jian, Zhongping; Pearce, Jeremy; Mittleman, Daniel M
2003-07-18
We describe observations of the amplitude and phase of an electric field diffusing through a three-dimensional random medium, using terahertz time-domain spectroscopy. These measurements are spatially resolved with a resolution smaller than the speckle spot size and temporally resolved with a resolution better than one optical cycle. By computing correlation functions between fields measured at different positions and with different temporal delays, it is possible to obtain information about individual scattering events experienced by the diffusing field. This represents a new method for characterizing a multiply scattered wave.
Seabed mapping and characterization of sediment variability using the usSEABED data base
Goff, J.A.; Jenkins, C.J.; Jeffress, Williams S.
2008-01-01
We present a methodology for statistical analysis of randomly located marine sediment point data, and apply it to the US continental shelf portions of usSEABED mean grain size records. The usSEABED database, like many modern, large environmental datasets, is heterogeneous and interdisciplinary. We statistically test the database as a source of mean grain size data, and from it provide a first examination of regional seafloor sediment variability across the entire US continental shelf. Data derived from laboratory analyses ("extracted") and from word-based descriptions ("parsed") are treated separately, and they are compared statistically and deterministically. Data records are selected for spatial analysis by their location within sample regions: polygonal areas defined in ArcGIS chosen by geography, water depth, and data sufficiency. We derive isotropic, binned semivariograms from the data, and invert these for estimates of noise variance, field variance, and decorrelation distance. The highly erratic nature of the semivariograms is a result both of the random locations of the data and of the high level of data uncertainty (noise). This decorrelates the data covariance matrix for the inversion, and largely prevents robust estimation of the fractal dimension. Our comparison of the extracted and parsed mean grain size data demonstrates important differences between the two. In particular, extracted measurements generally produce finer mean grain sizes, lower noise variance, and lower field variance than parsed values. Such relationships can be used to derive a regionally dependent conversion factor between the two. Our analysis of sample regions on the US continental shelf revealed considerable geographic variability in the estimated statistical parameters of field variance and decorrelation distance. Some regional relationships are evident, and overall there is a tendency for field variance to be higher where the average mean grain size is finer grained. Surprisingly, parsed and extracted noise magnitudes correlate with each other, which may indicate that some portion of the data variability that we identify as "noise" is caused by real grain size variability at very short scales. Our analyses demonstrate that by applying a bias-correction proxy, usSEABED data can be used to generate reliable interpolated maps of regional mean grain size and sediment character.
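For orientation only (not the authors' code), a binned empirical semivariogram of the kind inverted in the study for noise (nugget) variance, field variance (sill) and decorrelation distance can be computed for scattered point data roughly as follows; the synthetic coordinates and values below are placeholders.

```python
import numpy as np

def binned_semivariogram(coords, values, bin_edges):
    """Isotropic binned empirical semivariogram for irregularly located point data.

    coords    : (n, 2) array of sample locations
    values    : (n,) array of the attribute (e.g. mean grain size)
    bin_edges : lag-distance bin edges
    """
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    g = 0.5 * (values[:, None] - values[None, :])**2      # pairwise semivariances
    iu = np.triu_indices(len(values), k=1)                 # count each pair once
    dist, gamma = d[iu], g[iu]
    which = np.digitize(dist, bin_edges)
    sv = [gamma[which == b].mean() if np.any(which == b) else np.nan
          for b in range(1, len(bin_edges))]
    centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])
    return centers, np.array(sv)

# a simple model gamma(h) = n + (s - n) * (1 - exp(-h / r)) could then be fitted to
# (centers, sv), with n the noise variance, s the field variance and r the
# decorrelation distance

rng = np.random.default_rng(2)
xy = rng.uniform(0, 100, (200, 2))
z = np.sin(xy[:, 0] / 20) + rng.normal(0, 0.3, 200)
print(binned_semivariogram(xy, z, np.linspace(0, 50, 11))[1])
```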
NASA Technical Reports Server (NTRS)
Welch, R. M.; Sengupta, S. K.; Chen, D. W.
1990-01-01
Stratocumulus cloud fields in the FIRE IFO region are analyzed using LANDSAT Thematic Mapper imagery. Structural properties such as cloud cell size distribution, cell horizontal aspect ratio, fractional coverage and fractal dimension are determined. It is found that stratocumulus cloud number densities are represented by a power law. Cell horizontal aspect ratio has a tendency to increase at large cell sizes, and cells are bi-fractal in nature. Using LANDSAT Multispectral Scanner imagery for twelve selected stratocumulus scenes acquired during previous years, similar structural characteristics are obtained. Cloud field spatial organization also is analyzed. Nearest-neighbor spacings are fit with a number of functions, with Weibull and Gamma distributions providing the best fits. Poisson tests show that the spatial separations are not random. Second order statistics are used to examine clustering.
Pollina, Dean A; Dollins, Andrew B; Senter, Stuart M; Krapohl, Donald J; Ryan, Andrew H
2004-12-01
In a preliminary attempt to determine the generalizability of data from laboratory mock-crime studies, the authors examined the similarities and differences among the cardiovascular, electrodermal, and respiration responses of deceptive and nondeceptive individuals elicited to crime-relevant and crime-irrelevant questions. Participants in the laboratory group were randomly assigned to nondeceptive (n = 28) or deceptive (n = 27) treatment groups, and a mock-crime scenario was used. The field participants were confirmed nondeceptive (n = 28) or deceptive (n = 39) criminal suspects who underwent polygraph examinations between 1993 and 1997. The results indicated that there were salient differences between field and similarly obtained laboratory polygraph response measures. However, accuracy of laboratory participants' classifications using logistic regression analysis was not significantly different from field participants' classification accuracy. 2004 APA, all rights reserved
Self-excitation of a nonlinear scalar field in a random medium
Zeldovich, Ya. B.; Molchanov, S. A.; Ruzmaikin, A. A.; Sokoloff, D. D.
1987-01-01
We discuss the evolution in time of a scalar field under the influence of a random potential and diffusion. The cases of a short-correlation in time and of stationary potentials are considered. In a linear approximation and for sufficiently weak diffusion, the statistical moments of the field grow exponentially in time at growth rates that progressively increase with the order of the moment; this indicates the intermittent nature of the field. Nonlinearity halts this growth and in some cases can destroy the intermittency. However, in many nonlinear situations the intermittency is preserved: high, persistent peaks of the field exist against the background of a smooth field distribution. These widely spaced peaks may make a major contribution to the average characteristics of the field. PMID:16593872
Monte Carlo calibration of avalanches described as Coulomb fluid flows.
Ancey, Christophe
2005-07-15
The idea that snow avalanches might behave as granular flows, and thus be described as Coulomb fluid flows, came up very early in the scientific study of avalanches, but it is not until recently that field evidence has been provided that demonstrates the reliability of this idea. This paper aims to specify the bulk frictional behaviour of snow avalanches by seeking a universal friction law. Since the bulk friction coefficient cannot be measured directly in the field, the friction coefficient must be calibrated by adjusting the model outputs to closely match the recorded data. Field data are readily available but are of poor quality and accuracy. We used Bayesian inference techniques to specify the model uncertainty relative to data uncertainty and to robustly and efficiently solve the inverse problem. A sample of 173 events taken from seven paths in the French Alps was used. The first analysis showed that the friction coefficient behaved as a random variable with a smooth and bell-shaped empirical distribution function. Evidence was provided that the friction coefficient varied with the avalanche volume, but any attempt to adjust a one-to-one relationship relating friction to volume produced residual errors that could be as large as three times the maximum uncertainty of field data. A tentative universal friction law is proposed: the friction coefficient is a random variable, the distribution of which can be approximated by a normal distribution with a volume-dependent mean.
Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems
NASA Astrophysics Data System (ADS)
Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros
2015-04-01
In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields; hence, it can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the surface of an M-dimensional, unit radius hyper-sphere, (ii) relocating the N points on a representative set of N hyper-spheres of different radii, and (iii) transforming the coordinates of those points to lie on N different hyper-ellipsoids spanning the multivariate Gaussian distribution. The above method is applied in a dimensionality reduction context by defining flow-controlling points over which representative sampling of hydraulic conductivity is performed, thus also accounting for the sensitivity of the flow and transport model to the input hydraulic conductivity field. The performance of the various stratified sampling methods, LH, SL, and ME, is compared to that of SR sampling in terms of reproduction of ensemble statistics of hydraulic conductivity and solute concentration for different sample sizes N (numbers of realizations). The results indicate that ME sampling constitutes an equally if not more efficient simulation method than LH and SL sampling, as it can reproduce to a similar extent statistics of the conductivity and concentration fields, yet with smaller sampling variability than SR sampling. References [1] Gutjahr A.L. and Bras R.L. Spatial variability in subsurface flow and transport: A review. Reliability Engineering & System Safety, 42, 293-316, (1993). [2] Helton J.C. and Davis F.J. Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems. Reliability Engineering & System Safety, 81, 23-69, (2003). [3] Switzer P. Multiple simulation of spatial fields.
In: Heuvelink G, Lemmens M (eds) Proceedings of the 4th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, Coronet Books Inc., pp 629-635 (2000).
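As background for the SR versus LH comparison (a generic sketch, not the authors' implementation), Latin hypercube sampling of independent standard normal deviates stratifies the probability scale into N equal bins and draws one value per bin per variable; spatial correlation among conductivity values would be imposed in a separate step, for example via a covariance factorization.

```python
import numpy as np
from scipy.stats import norm

def latin_hypercube_normal(n_realizations, n_vars, rng=None):
    """LH sample of shape (n_realizations, n_vars) from independent standard normals."""
    rng = np.random.default_rng(rng)
    u = np.empty((n_realizations, n_vars))
    for j in range(n_vars):
        # one uniform draw inside each of the N equal-probability strata, then shuffled
        strata = (np.arange(n_realizations) + rng.uniform(size=n_realizations)) / n_realizations
        u[:, j] = rng.permutation(strata)
    return norm.ppf(u)          # map back through the inverse normal CDF

# e.g. 20 realizations of 5 (independent) log-conductivity values
samples = latin_hypercube_normal(20, 5, rng=0)
print(samples.mean(axis=0))
```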
Causal inference from observational data.
Listl, Stefan; Jürges, Hendrik; Watt, Richard G
2016-10-01
Randomized controlled trials have long been considered the 'gold standard' for causal inference in clinical research. In the absence of randomized experiments, identification of reliable intervention points to improve oral health is often perceived as a challenge. But other fields of science, such as social science, have always been challenged by ethical constraints to conducting randomized controlled trials. Methods have been established to make causal inference using observational data, and these methods are becoming increasingly relevant in clinical medicine, health policy and public health research. This study provides an overview of state-of-the-art methods specifically designed for causal inference in observational data, including difference-in-differences (DiD) analyses, instrumental variables (IV), regression discontinuity designs (RDD) and fixed-effects panel data analysis. The described methods may be particularly useful in dental research, not least because of the increasing availability of routinely collected administrative data and electronic health records ('big data'). © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
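As a generic illustration of one of the designs listed above (difference-in-differences), not drawn from the article itself, the canonical two-period specification regresses the outcome on group, period and their interaction, and the interaction coefficient is the DiD estimate; the data and column names below are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # exposed-group indicator
    "post":    rng.integers(0, 2, n),   # after-intervention indicator
})
# synthetic outcome with a true treatment effect of 1.0 in the treated-post cell
df["y"] = (2 + 0.5 * df["treated"] + 0.3 * df["post"]
           + 1.0 * df["treated"] * df["post"] + rng.normal(0, 1, n))

did = smf.ols("y ~ treated * post", data=df).fit()
print(did.params["treated:post"])       # difference-in-differences estimate
```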
Tarrab, Leticia; Garcia, Carlos M.; Cantero, Mariano I.; Oberg, Kevin
2012-01-01
This work presents a systematic analysis quantifying the effect of turbulence fluctuations on the uncertainties (random errors) of acoustic Doppler current profiler (ADCP) discharge measurements from moving platforms. Data sets of three-dimensional flow velocities with high temporal and spatial resolution were generated from direct numerical simulation (DNS) of turbulent open channel flow. Dimensionless functions relating parameters quantifying the uncertainty in discharge measurements due to flow turbulence (relative variance and relative maximum random error) to sampling configuration were developed from the DNS simulations and then validated with field-scale discharge measurements. The validated functions were used to evaluate the contribution of flow turbulence fluctuations to uncertainties in ADCP discharge measurements. The results of this work indicate that random errors due to flow turbulence are significant when: (a) a small number of transects is used for a discharge measurement, and (b) measurements are made in shallow rivers at high boat velocity (short time for the boat to cross a flow turbulence structure).
Probability Distributions for Random Quantum Operations
NASA Astrophysics Data System (ADS)
Schultz, Kevin
Motivated by uncertainty quantification and inference of quantum information systems, in this work we draw connections between the notions of random quantum states and operations in quantum information with probability distributions commonly encountered in the field of orientation statistics. This approach identifies natural sample spaces and probability distributions upon these spaces that can be used in the analysis, simulation, and inference of quantum information systems. The theory of exponential families on Stiefel manifolds provides the appropriate generalization to the classical case. Furthermore, this viewpoint motivates a number of additional questions into the convex geometry of quantum operations relative to both the differential geometry of Stiefel manifolds as well as the information geometry of exponential families defined upon them. In particular, we draw on results from convex geometry to characterize which quantum operations can be represented as the average of a random quantum operation. This project was supported by the Intelligence Advanced Research Projects Activity via Department of Interior National Business Center Contract Number 2012-12050800010.
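For concreteness, one standard textbook construction in this spirit (not taken from the talk) draws a Haar-random unitary, i.e. a uniformly distributed point on the corresponding Stiefel manifold, by QR-decomposing a complex Ginibre matrix and fixing the phases of the diagonal of R.

```python
import numpy as np

def haar_random_unitary(dim, rng=None):
    """Sample a dim x dim unitary from the Haar measure (Mezzadri's construction)."""
    rng = np.random.default_rng(rng)
    z = (rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    # normalise the phases of R's diagonal so that Q is Haar distributed
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases

u = haar_random_unitary(4, rng=42)
print(np.allclose(u.conj().T @ u, np.eye(4)))   # unitarity check
```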
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, David B.; Gibbons, Steven J.; Rodgers, Arthur J.
2012-05-01
In this approach, small scale-length medium perturbations not modeled in the tomographic inversion might be described as random fields, characterized by particular distribution functions (e.g., normal with specified spatial covariance). Conceivably, random field parameters (scatterer density or scale length) might themselves be the targets of tomographic inversions of the scattered wave field. As a result, such augmented models may provide processing gain through the use of probabilistic signal subspaces rather than deterministic waveforms.
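To make the notion of a random field with a specified spatial covariance concrete, the following sketch draws one realization of a Gaussian random field with an exponential covariance on a small 2-D grid via a Cholesky factorization; the covariance form, correlation length and grid size are illustrative choices, not parameters from the work above.

```python
# Sketch: Gaussian random field with a specified (exponential) spatial covariance.
import numpy as np

n, ell, sigma2 = 30, 5.0, 1.0                          # grid size, correlation length, variance
x, y = np.meshgrid(np.arange(n), np.arange(n))
pts = np.column_stack([x.ravel(), y.ravel()])
dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
cov = sigma2 * np.exp(-dists / ell) + 1e-10 * np.eye(n * n)   # jitter for numerical stability

L = np.linalg.cholesky(cov)
field = (L @ np.random.default_rng(1).standard_normal(n * n)).reshape(n, n)
print(field.shape, round(field.std(), 3))
```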
Exploring diversity in ensemble classification: Applications in large area land cover mapping
NASA Astrophysics Data System (ADS)
Mellor, Andrew; Boukir, Samia
2017-07-01
Ensemble classifiers, such as random forests, are now commonly applied in the field of remote sensing and have been shown to perform better than single-classifier systems, resulting in reduced generalisation error. Diversity across the members of ensemble classifiers is known to have a strong influence on classification performance, whereby classifier errors are uncorrelated and more uniformly distributed across ensemble members. The relationship between ensemble diversity and classification performance has not yet been fully explored in the fields of information science and machine learning, and has never been examined in the field of remote sensing. This study is a novel exploration of ensemble diversity and its link to classification performance, applied to a multi-class canopy cover classification problem using random forests and multisource remote sensing and ancillary GIS data, across seven million hectares of diverse dry-sclerophyll dominated public forests in Victoria, Australia. A particular emphasis is placed on analysing the relationship between ensemble diversity and ensemble margin, two key concepts in ensemble learning. The main novelty of our work is in boosting diversity by emphasising the contribution of lower-margin instances used in the learning process. Exploring the influence of tree pruning on diversity is also a new empirical analysis that contributes to a better understanding of ensemble performance. Results reveal insights into the trade-off between ensemble classification accuracy and diversity and, through the ensemble margin, demonstrate how inducing diversity by targeting lower-margin training samples is a means of achieving better classifier performance for more difficult or rarer classes and reducing information redundancy in classification problems. Our findings inform strategies for collecting training data and for designing and parameterising ensemble classifiers, such as random forests. This is particularly important in large-area remote sensing applications, for which training data are costly and resource-intensive to collect.
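For readers unfamiliar with the ensemble margin, the sketch below estimates per-sample margins from a random forest using the averaged class probabilities as a proxy for the trees' vote fractions; the dataset, forest size and margin threshold are illustrative.

```python
# Sketch: per-sample ensemble margins from a random forest (probability-based proxy).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_classes=3, n_informative=5, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

proba = rf.predict_proba(X)                  # shape (n_samples, n_classes)
p_true = proba[np.arange(len(y)), y]         # support for the true class
p_other = proba.copy()
p_other[np.arange(len(y)), y] = -np.inf
margin = p_true - p_other.max(axis=1)        # in [-1, 1]; low values mark "hard" samples

print("fraction of low-margin samples:", round(float(np.mean(margin < 0.2)), 3))
```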
Microseismic response characteristics modeling and locating of underground water supply pipe leak
NASA Astrophysics Data System (ADS)
Wang, J.; Liu, J.
2015-12-01
In traditional methods of pipeline leak location, geophones must be located on the pipe wall. If the exact location of the pipeline is unknown, the leaks cannot be identified accurately. To solve this problem, taking into account the characteristics of pipeline leaks, we propose a continuous random seismic source model and construct geological models to investigate the proposed method for locating underground pipeline leaks. Based on two-dimensional (2D) viscoacoustic equations and the staggered-grid finite-difference (FD) algorithm, the microseismic wave field generated by a leaking pipe is modeled. Cross-correlation analysis and the simulated annealing (SA) algorithm were utilized to obtain the time difference and the leak location. We also analyze and discuss the effect of the number of recorded traces, the survey layout, and the offset and interval of the traces on the accuracy of the estimated location. The preliminary results of the simulation and the field experiment indicate that (1) a continuous random source can realistically represent the microseismic wave field of a leak in a simulation using 2D viscoacoustic equations and a staggered-grid FD algorithm; (2) the cross-correlation method is effective for calculating the time difference of the direct wave relative to the reference trace, although outside the refraction blind zone the accuracy of the time difference is reduced by the effects of the refracted wave; and (3) the time-difference estimation method based on microseismic theory and the SA algorithm has great potential for locating leaks in underground pipelines from an array located on the ground surface. Keywords: Viscoacoustic finite-difference simulation; continuous random source; simulated annealing algorithm; pipeline leak location
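The cross-correlation step described above can be illustrated with two synthetic traces; the sample interval, pulse shape and lag below are made up.

```python
# Sketch: time-difference estimation between two traces by cross-correlation.
import numpy as np
from scipy.signal import correlate

dt = 1e-3                                   # sample interval (s), illustrative
t = np.arange(0, 1, dt)
true_lag = 0.037                            # seconds
sig = np.exp(-((t - 0.3) / 0.01) ** 2)      # reference trace: a simple pulse
rng = np.random.default_rng(2)
trace = np.roll(sig, int(true_lag / dt)) + 0.05 * rng.standard_normal(t.size)

xcorr = correlate(trace, sig, mode="full")
lag_samples = np.argmax(xcorr) - (sig.size - 1)
print("estimated time difference:", lag_samples * dt, "s")   # ~0.037 s
```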
Effect of increasing disorder on domains of the 2d Coulomb glass.
Bhandari, Preeti; Malik, Vikas
2017-12-06
We have studied a two-dimensional lattice model of the Coulomb glass over a wide range of disorders at [Formula: see text]. The system was first annealed using Monte Carlo simulation. Further minimization of the total energy of the system was done using an algorithm developed by Baranovskii et al., followed by cluster flipping to obtain the pseudo-ground states. We have shown that the energy required to create a domain of linear size L in d dimensions is proportional to [Formula: see text]. Using the Imry-Ma arguments given for the random field Ising model, one obtains a critical dimension of [Formula: see text] for the Coulomb glass. The investigation of domains in the transition region shows a discontinuity in the staggered magnetization, which is an indication of a first-order-type transition from the charge-ordered phase to the disordered phase. The structure and nature of the random field fluctuations of the second largest domain in the Coulomb glass are inconsistent with the assumptions of Imry and Ma, as was also reported for the random field Ising model. The study of domains showed that in the transition region there were mostly two large domains, and that as disorder was increased the two large domains remained but a large number of small domains also opened up. We have also studied the properties of the second largest domain as a function of disorder. We furthermore analysed the effect of disorder on the density of states, and showed a transition from a hard gap at low disorders to a soft gap at higher disorders. At [Formula: see text], we analysed the soft gap in detail and found that the density of states deviates slightly ([Formula: see text]) from the linear behaviour expected in two dimensions. Analysis of local minima shows that the pseudo-ground states have similar structure.
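For reference, the textbook Imry-Ma comparison for the random-field Ising model, which the abstract invokes, balances the domain-wall cost against the random-field energy gain; the exponents below are the standard RFIM ones and may differ from the Coulomb-glass values reported above.

```latex
% Standard Imry-Ma estimate for the random-field Ising model (for orientation only).
\[
  \Delta E(L) \sim \underbrace{J\,L^{\,d-1}}_{\text{domain-wall cost}}
              \;-\; \underbrace{h\,L^{\,d/2}}_{\text{random-field gain}},
  \qquad
  \tfrac{d}{2} > d-1 \;\Longleftrightarrow\; d < 2
  \;\;\Rightarrow\;\; d_c = 2 .
\]
```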
Henschel, Volkmar; Engel, Jutta; Hölzel, Dieter; Mansmann, Ulrich
2009-02-10
Multivariate analysis of interval-censored event data based on classical likelihood methods is notoriously cumbersome, and likelihood inference for models which additionally include random effects is not available at all. Existing algorithms pose problems for practical users, such as matrix inversion, slow convergence, and no assessment of statistical uncertainty. MCMC procedures combined with imputation are used to implement hierarchical models for interval-censored data within a Bayesian framework. Two examples from clinical practice demonstrate the handling of clustered interval-censored event times as well as multilayer random effects for inter-institutional quality assessment. The software developed is called survBayes and is freely available on CRAN. The proposed software supports the solution of complex analyses in many fields of clinical epidemiology as well as health services research.
Workshop on Incomplete Network Data Held at Sandia National Labs – Livermore
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soundarajan, Sucheta; Wendt, Jeremy D.
2016-06-01
While network analysis is applied in a broad variety of scientific fields (including physics, computer science, biology, and the social sciences), how networks are constructed and the resulting bias and incompleteness have drawn more limited attention. For example, in biology, gene networks are typically developed via experiment; many actual interactions are likely yet to be discovered. In addition to this incompleteness, the data-collection processes can introduce significant bias into the observed network datasets. For instance, if you observe part of the World Wide Web network through a classic random walk, then high-degree nodes are more likely to be found than if you had selected nodes at random. Unfortunately, such incomplete and biasing data-collection methods must often be used.
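The degree bias of random-walk sampling mentioned above can be demonstrated in a few lines; the graph model, its size and the walk length are arbitrary choices.

```python
# Sketch: degree bias of random-walk sampling versus uniform node sampling.
import random
import networkx as nx

random.seed(0)
G = nx.barabasi_albert_graph(5000, 3)       # scale-free graph (illustrative choice)

def random_walk_sample(G, steps=2000):
    node = random.choice(list(G.nodes))
    visited = []
    for _ in range(steps):
        node = random.choice(list(G.neighbors(node)))   # step to a random neighbour
        visited.append(node)
    return visited

walk_nodes = random_walk_sample(G)
uniform_nodes = random.sample(list(G.nodes), 2000)
mean_degree = lambda nodes: sum(G.degree(v) for v in nodes) / len(nodes)

print("random-walk sample mean degree:", round(mean_degree(walk_nodes), 2))     # biased high
print("uniform sample mean degree:   ", round(mean_degree(uniform_nodes), 2))
```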
NASA Astrophysics Data System (ADS)
Posnansky, Oleg P.
2018-05-01
The measurement of dynamic magnetic susceptibility by nuclear magnetic resonance is used to reveal information about the internal structure of various magnetoactive composites. The response of such materials to applied external static and time-varying magnetic fields encodes intrinsic dynamic correlations and depends on the links between the macroscopic effective susceptibility and the structure on the microscopic scale. In the current work we carried out a computational analysis of the frequency-dependent dynamic magnetic susceptibility and demonstrated its dependence on the microscopic architectural elements while also considering the Euclidean dimensionality. The proposed numerical method is efficient for simulating nuclear magnetic resonance experiments in two- and three-dimensional random magnetic media, by choosing and modeling the influence of the concentration of components and the internal hierarchical characteristics of the physical parameters.
Travensolo, Cristiane; Goessler, Karla; Poton, Roberto; Pinto, Roberta Ramos; Polito, Marcos Doederlein
2018-04-13
The literature concerning the effects of cardiac rehabilitation (CR) on field test results is inconsistent. The aim was to perform a systematic review with meta-analysis of field test results after CR programs. Studies published in the PubMed and Web of Science databases up to May 2016 were analyzed. The standardized difference in means corrected for bias (Hedges' g) was used as the effect size (g) to measure the amount of change in field test performance after the CR period. Potential differences between subgroups were analyzed by a Q-test based on ANOVA. Fifteen studies published between 1996 and 2016 were included in the review, comprising 932 patients with ages ranging from 54.4 to 75.3 years. Fourteen studies used the six-minute walk test to evaluate exercise capacity and one study used the Shuttle Walk Test. The random-effects Hedges' g was 0.617 (P<0.001), representing an improvement of approximately 20% in field test performance after CR. The meta-regression showed a significant association (P=0.01) with aerobic exercise duration, i.e., for each 1-min increase in aerobic exercise duration, there is a 0.02 increase in the effect size for field test performance. Field tests can detect physical change after CR, and a longer duration of aerobic exercise during CR was associated with a better result. Copyright © 2018 Sociedade Portuguesa de Cardiologia. Publicado por Elsevier España, S.L.U. All rights reserved.
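The effect size used above can be reproduced with a short helper; the input means, standard deviations and sample sizes below are hypothetical.

```python
# Sketch: bias-corrected standardized mean difference (Hedges' g) for one study.
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))  # pooled SD
    d = (m1 - m2) / sp                                                        # Cohen's d
    j = 1.0 - 3.0 / (4.0 * (n1 + n2 - 2) - 1.0)                               # small-sample bias correction
    return j * d

# Hypothetical six-minute walk distances (m) after vs. before rehabilitation
print(round(hedges_g(m1=420, sd1=80, n1=30, m2=370, sd2=85, n2=30), 3))
```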
2000-10-01
To investigate the association between control of intraocular pressure after surgical intervention for glaucoma and visual field deterioration. In the Advanced Glaucoma Intervention Study, eyes were randomly assigned to one of two sequences of glaucoma surgery, one beginning with argon laser trabeculoplasty and the other trabeculectomy. In the present article we examine the relationship between intraocular pressure and progression of visual field damage over 6 or more years of follow-up. In the first analysis, designated Predictive Analysis, we categorize 738 eyes into three groups based on intraocular pressure determinations over the first three 6-month follow-up visits. In the second analysis, designated Associative Analysis, we categorize 586 eyes into four groups based on the percent of 6-month visits over the first 6 follow-up years in which eyes presented with intraocular pressure less than 18 mm Hg. The outcome measure in both analyses is change from baseline in follow-up visual field defect score (range, 0 to 20 units). In the Predictive Analysis, eyes with early average intraocular pressure greater than 17.5 mm Hg had an estimated worsening during subsequent follow-up that was 1 unit of visual field defect score greater than eyes with average intraocular pressure less than 14 mm Hg (P =.002). This amount of worsening was greater at 7 years (1.89 units; P <.001) than at 2 years (0.64 units; P =.071). In the Associative Analysis, eyes with 100% of visits with intraocular pressure less than 18 mm Hg over 6 years had mean changes from baseline in visual field defect score close to zero during follow-up, whereas eyes with less than 50% of visits with intraocular pressure less than 18 mm Hg had an estimated worsening over follow-up of 0.63 units of visual field defect score (P =.083). This amount of worsening was greater at 7 years (1.93 units; P <.001) than at 2 years (0.25 units; P =.572). In both analyses low intraocular pressure is associated with reduced progression of visual field defect, supporting evidence from earlier studies of a protective role for low intraocular pressure in visual field deterioration.
Fretheim, Atle; Zhang, Fang; Ross-Degnan, Dennis; Oxman, Andrew D; Cheyne, Helen; Foy, Robbie; Goodacre, Steve; Herrin, Jeph; Kerse, Ngaire; McKinlay, R James; Wright, Adam; Soumerai, Stephen B
2015-03-01
There is often substantial uncertainty about the impacts of health system and policy interventions. Despite that, randomized controlled trials (RCTs) are uncommon in this field, partly because experiments can be difficult to carry out. An alternative method for impact evaluation is the interrupted time-series (ITS) design. Little is known, however, about how results from the two methods compare. Our aim was to explore whether ITS studies yield results that differ from those of randomized trials. We conducted single-arm ITS analyses (segmented regression) based on data from the intervention arm of cluster randomized trials (C-RCTs), that is, discarding control arm data. Secondarily, we included the control group data in the analyses, by subtracting control group data points from intervention group data points, thereby constructing a time series representing the difference between the intervention and control groups. We compared the results from the single-arm and controlled ITS analyses with results based on conventional aggregated analyses of trial data. The findings were largely concordant, yielding effect estimates with overlapping 95% confidence intervals (CI) across different analytical methods. However, our analyses revealed the importance of a concurrent control group and of taking baseline and follow-up trends into account in the analysis of C-RCTs. The ITS design is valuable for evaluation of health systems interventions, both when RCTs are not feasible and in the analysis and interpretation of data from C-RCTs. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
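A single-arm segmented-regression (ITS) analysis of the kind described above can be sketched as follows; the monthly series, the intervention month and all coefficients are simulated and purely illustrative.

```python
# Sketch: interrupted time-series (segmented regression) on simulated monthly data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
months = np.arange(48)
intervention = 24
df = pd.DataFrame({"time": months, "post": (months >= intervention).astype(int)})
df["time_after"] = np.maximum(0, df["time"] - intervention)      # post-intervention trend

df["rate"] = (50 + 0.2 * df["time"]          # baseline level and trend
              - 5.0 * df["post"]             # level change at the intervention
              - 0.3 * df["time_after"]       # slope change after the intervention
              + rng.normal(0, 1.5, len(df)))

fit = smf.ols("rate ~ time + post + time_after", data=df).fit()
print(fit.params)    # pre-intervention trend, level change, trend change
```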
Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling
Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David
2016-01-01
Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging. PMID:27555464
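The localized random sampling protocol can be sketched by constructing one measurement mask at a time: pick a random center pixel, then include nearby pixels with a probability that decays with distance. The Gaussian decay and its width below are illustrative assumptions, not the paper's exact parameterization.

```python
# Sketch: one localized random sampling mask per measurement.
import numpy as np

def localized_mask(shape, radius=5.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    h, w = shape
    cy, cx = rng.integers(0, h), rng.integers(0, w)           # random center pixel
    yy, xx = np.mgrid[0:h, 0:w]
    dist2 = (yy - cy) ** 2 + (xx - cx) ** 2
    p_include = np.exp(-dist2 / (2.0 * radius ** 2))          # closer pixels more likely
    return rng.random((h, w)) < p_include                     # boolean sampling mask

rng = np.random.default_rng(4)
masks = [localized_mask((64, 64), rng=rng) for _ in range(100)]   # one mask per measurement
print(round(sum(m.sum() for m in masks) / len(masks), 1), "pixels sampled per measurement")
```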
Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu
2018-05-09
The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, current methods for DNA analysis remain dependent on fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS), in which we identify the composition of cytosine and adenine within single strands of DNA. This approach exploits the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. The work represents an innovative way to detect the nucleotide composition of DNA strands without the need for attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.
JOURNAL SCOPE GUIDELINES: Paper classification scheme
NASA Astrophysics Data System (ADS)
2005-06-01
This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org).
1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory.
2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models.
3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods.
4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics.
5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials.
6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.
Mindfulness meditation for insomnia: A meta-analysis of randomized controlled trials.
Gong, Hong; Ni, Chen-Xu; Liu, Yun-Zi; Zhang, Yi; Su, Wen-Jun; Lian, Yong-Jie; Peng, Wei; Jiang, Chun-Lei
2016-10-01
Insomnia is a widespread and debilitating condition that affects sleep quality and daily productivity. Although mindfulness meditation (MM) has been suggested as a potentially effective supplement to medical treatment for insomnia, no comprehensive quantitative research has been conducted in this field. Therefore, we performed a meta-analysis of the findings of related randomized controlled trials (RCTs) to evaluate the effects of MM on insomnia. Related publications in PubMed, EMBASE, the Cochrane Library and PsycINFO were searched up to July 2015. To calculate the standardized mean differences (SMDs) and 95% confidence intervals (CIs), we used a fixed-effect model when heterogeneity was negligible and a random-effects model when heterogeneity was significant. A total of 330 participants in 6 RCTs that met the selection criteria were included in this meta-analysis. Analysis of the overall effect revealed that MM significantly improved total wake time and sleep quality, but had no significant effects on sleep onset latency, total sleep time, wake after sleep onset, sleep efficiency, ISI, PSQI or DBAS scores. Subgroup analyses showed that although there were no significant differences between MM and control groups in terms of total sleep time, significant effects were found in total wake time, sleep onset latency, sleep quality, sleep efficiency, and PSQI global score (absolute value of SMD range: 0.44-1.09, all p<0.05). The results suggest that MM may mildly improve some sleep parameters in patients with insomnia. MM can serve as an auxiliary treatment to medication for sleep complaints. Copyright © 2016 Elsevier Inc. All rights reserved.
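The random-effects pooling referred to above is commonly done with the DerSimonian-Laird estimator; the sketch below uses made-up effect sizes and variances purely for illustration.

```python
# Sketch: DerSimonian-Laird random-effects pooling of standardized mean differences.
import numpy as np

g = np.array([0.4, 0.9, 0.6, 1.1, 0.3, 0.7])        # per-study SMDs (hypothetical)
v = np.array([0.05, 0.08, 0.06, 0.10, 0.04, 0.07])  # per-study variances (hypothetical)

w_fixed = 1.0 / v
mean_fixed = np.sum(w_fixed * g) / np.sum(w_fixed)
q = np.sum(w_fixed * (g - mean_fixed) ** 2)                    # heterogeneity statistic Q
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(g) - 1)) / c)                        # between-study variance

w = 1.0 / (v + tau2)
pooled = np.sum(w * g) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled SMD = {pooled:.3f}, 95% CI = ({pooled - 1.96*se:.3f}, {pooled + 1.96*se:.3f})")
```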
Ratchadaporn, Janthasri; Sureeporn, Katengam; Khumcha, U
2007-09-15
The experiment was carried out at the Department of Horticulture, Ubon Ratchathani University, Ubon Ratchathani province, Northeast Thailand, from June 2002 to May 2003, with the aim of identifying DNA fingerprints of thirty papaya cultivars using the Amplified Fragment Length Polymorphism (AFLP) technique. Papaya cultivars were collected from six different research centers in Thailand. Papaya plants of each cultivar were grown under field conditions for up to four months, then leaves number 2 and 3 of each cultivar (counted from the top) were chosen for DNA extraction and the samples were used for AFLP analysis. Of the 64 random primer pairs used, 55 pairs gave an increase in DNA bands, but only 12 pairs of random primers were randomly chosen for the final analysis of the experiment. The results showed that the AFLP markers gave Polymorphic Information Contents (PIC) in three ranges: 235 markers lay in the PIC range 0.003-0.05, 47 in the range 0.15-0.20, and 12 in the range 0.35-0.40. Dendrogram cluster analysis revealed that the thirty papaya cultivars were classified into six groups: (1) Kaeg Dum and Malador, (2) Kaeg Nuan, (3) Pakchong and Solo, (4) Taiwan, (5) Co Coa Hai Nan, and (6) Sitong. Nevertheless, despite forming six groups, all papaya cultivars were genetically related to each other, and no significant diversity was found among the cultivars.
Statistical simulations of the dust foreground to cosmic microwave background polarization
NASA Astrophysics Data System (ADS)
Vansyngel, F.; Boulanger, F.; Ghosh, T.; Wandelt, B.; Aumont, J.; Bracco, A.; Levrier, F.; Martin, P. G.; Montier, L.
2017-07-01
The characterization of the dust polarization foreground to the cosmic microwave background (CMB) is a necessary step toward the detection of the B-mode signal associated with primordial gravitational waves. We present a method to simulate maps of polarized dust emission on the sphere that is similar to the approach used for CMB anisotropies. This method builds on the understanding of Galactic polarization stemming from the analysis of Planck data. It relates the dust polarization sky to the structure of the Galactic magnetic field and its coupling with interstellar matter and turbulence. The Galactic magnetic field is modeled as a superposition of a mean uniform field and a Gaussian random (turbulent) component with a power-law power spectrum of exponent α_M. The integration along the line of sight carried out to compute Stokes maps is approximated by a sum over a small number of emitting layers with different realizations of the random component of the magnetic field. The model parameters are constrained to fit the power spectra of dust polarization EE, BB, and TE measured using Planck data. We find that the slopes of the E and B power spectra of dust polarization are matched for α_M = -2.5, an exponent close to that measured for total dust intensity but larger than the Kolmogorov exponent -11/3. The model allows us to compute multiple realizations of the Stokes Q and U maps for different realizations of the random component of the magnetic field, and to quantify the variance of dust polarization spectra for any given sky area outside of the Galactic plane. The simulations reproduce the scaling relation between the dust polarization power and the mean total dust intensity including the observed dispersion around the mean relation. We also propose a method to carry out multifrequency simulations, including the decorrelation measured recently by Planck, using a given covariance matrix of the polarization maps. These simulations are well suited to optimize component separation methods and to quantify the confidence with which the dust and CMB B-modes can be separated in present and future experiments. We also provide an astrophysical perspective on our phenomenological modeling of the dust polarization spectra.
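A single flat-sky realization of the Gaussian random (turbulent) component with a power-law spectrum can be synthesized in Fourier space; the sketch below uses the exponent α_M = -2.5 quoted above, while the grid, units and normalization are arbitrary, and it ignores the spherical geometry and line-of-sight integration of the actual model.

```python
# Sketch: flat-sky Gaussian random field with a power-law power spectrum P(k) ~ k**alpha.
import numpy as np

n, alpha = 256, -2.5
rng = np.random.default_rng(5)
kx = np.fft.fftfreq(n)
ky = np.fft.fftfreq(n)
k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
k[0, 0] = np.inf                                    # suppress the mean (k = 0) mode

amplitude = k ** (alpha / 2.0)                      # field amplitude ~ sqrt(P(k))
noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
field = np.fft.ifft2(amplitude * noise).real
field /= field.std()                                # arbitrary normalization
print(field.shape, round(field.std(), 3))
```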
ERIC Educational Resources Information Center
Al Otaiba, Stephanie; Connor, Carol M.; Folsom, Jessica S.; Greulich, Luana; Meadows, Jane; Li, Zhi
2011-01-01
The purpose of this cluster-randomized control field trial was to examine whether kindergarten teachers could learn to differentiate classroom reading instruction using Individualized Student Instruction for Kindergarten (ISI-K) and to test the efficacy of differentiation on reading outcomes. The study involved 14 schools, 23 ISI-K (n = 305…
Group field theory and tensor networks: towards a Ryu–Takayanagi formula in full quantum gravity
NASA Astrophysics Data System (ADS)
Chirco, Goffredo; Oriti, Daniele; Zhang, Mingyi
2018-06-01
We establish a dictionary between group field theory (thus, spin networks and random tensors) states and generalized random tensor networks. Then, we use this dictionary to compute the Rényi entropy of such states and recover the Ryu–Takayanagi formula, in two different cases corresponding to two different truncations/approximations, suggested by the established correspondence.
ERIC Educational Resources Information Center
Kraft, Matthew A.; Dougherty, Shaun M.
2013-01-01
In this study, we evaluate the efficacy of teacher communication with parents and students as a means of increasing student engagement. We estimate the causal effect of teacher communication by conducting a randomized field experiment in which sixth- and ninth-grade students were assigned to receive a daily phone call home and a text/written…
Makarov, D V; Kon'kov, L E; Uleysky, M Yu; Petrov, P S
2013-01-01
The problem of sound propagation in a randomly inhomogeneous oceanic waveguide is considered. An underwater sound channel in the Sea of Japan is taken as an example. Our attention is concentrated on the domains of finite-range ray stability in phase space and their influence on wave dynamics. These domains can be found by means of the one-step Poincaré map. To study manifestations of finite-range ray stability, we introduce the finite-range evolution operator (FREO), describing the transformation of a wave field in the course of propagation along a finite segment of a waveguide. Carrying out a statistical analysis of the FREO spectrum, we estimate the contribution of regular domains and explore their evanescence with increasing length of the segment. We utilize several methods of spectral analysis: analysis of eigenfunctions by expanding them over modes of the unperturbed waveguide, approximation of level-spacing statistics by means of the Berry-Robnik distribution, and the procedure used by A. Relano and coworkers [Relano et al., Phys. Rev. Lett. 89, 244102 (2002); Relano, Phys. Rev. Lett. 100, 224101 (2008)]. Comparing the results obtained with the different methods, we find that the method based on the statistical analysis of FREO eigenfunctions is the most favorable for estimating the contribution of regular domains. It allows one to find directly the waveguide modes whose refraction is regular despite the random inhomogeneity. For example, it is found that near-axial sound propagation in the Sea of Japan preserves stability even over distances of hundreds of kilometers due to the presence of a shearless torus in the classical phase space. Increasing the acoustic wavelength weakens scattering, resulting in recovery of eigenfunction localization near periodic orbits of the one-step Poincaré map.
NASA Technical Reports Server (NTRS)
Kwon, Jin H.; Lee, Ja H.
1989-01-01
The far-field beam pattern and the power-collection efficiency are calculated for a multistage laser-diode-array amplifier consisting of about 200,000 5-W laser diode arrays with random distributions of phase and orientation errors and random diode failures. From the numerical calculation it is found that the far-field beam pattern is little affected by random failures of up to 20 percent of the laser diodes, relative to a reference receiving efficiency of 80 percent in the central spot. The random phase differences among laser diodes due to probable manufacturing errors can be tolerated up to about 0.2 times the wavelength. The maximum allowable orientation error is about 20 percent of the diffraction angle of a single laser diode aperture (about 1 cm). The preliminary results indicate that the amplifier could be used for space beam-power transmission with an efficiency of about 80 percent for a moderate-size (3-m-diameter) receiver placed at a distance of less than 50,000 km.
Kawamoto, Hirokazu; Takayasu, Hideki; Jensen, Henrik Jeldtoft; Takayasu, Misako
2015-01-01
Through precise numerical analysis, we reveal a new type of universal loopless percolation transition in randomly removed complex networks. As an example of a real-world network, we apply our analysis to a business relation network consisting of approximately 3,000,000 links among 300,000 firms and observe the transition with critical exponents close to the mean-field values taking into account the finite size effect. We focus on the largest cluster at the critical point, and introduce survival probability as a new measure characterizing the robustness of each node. We also discuss the relation between survival probability and k-shell decomposition.
Random electric field instabilities of relaxor ferroelectrics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arce-Gamboa, Jose R.; Guzman-Verri, Gian G.
2017-06-13
Relaxor ferroelectrics are complex oxide materials which are rather unique for studying the effects of compositional disorder on phase transitions. Here, we study the effects of quenched cubic random electric fields on the lattice instabilities that lead to a ferroelectric transition and show that, within a microscopic model and a statistical mechanical solution, even weak compositional disorder can prohibit the development of long-range order, and that a random field state with anisotropic and power-law correlations of polarization emerges from the combined effect of their characteristic dipole forces and their inherent charge disorder. As a result, we compare and reproduce several key experimental observations in the well-studied relaxor PbMg1/3Nb2/3O3–PbTiO3.
Possible Statistics of Two Coupled Random Fields: Application to Passive Scalar
NASA Technical Reports Server (NTRS)
Dubrulle, B.; He, Guo-Wei; Bushnell, Dennis M. (Technical Monitor)
2000-01-01
We use the relativity postulate of scale invariance to derive the similarity transformations between two coupled scale-invariant random fields at different scales. We find the equations leading to the scaling exponents. This formulation is applied to the case of passive scalars advected (i) by a random Gaussian velocity field and (ii) by a turbulent velocity field. In the Gaussian case, we show that the passive scalar increments follow a log-Levy distribution generalizing Kraichnan's solution and, in an appropriate limit, a log-normal distribution. In the turbulent case, we show that when the velocity increments follow log-Poisson statistics, the passive scalar increments follow statistics close to log-Poisson. This result explains the experimental observations of Ruiz et al. about temperature increments.
Homeopathy in cancer patients: What does the "best" evidence tell us?
de Nonneville, Alexandre; Gonçalves, Anthony
2018-04-01
Homeopathic medicines are used by many patients with cancer, usually alongside conventional treatment. A recent report by the European Academies' Science Advisory Council concluded that "there is no robust and reproducible evidence that homeopathy is effective". This literature review aims to analyse published randomized controlled trials involving homeopathic treatment in the field of oncology. Copyright © 2018 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nosedal-Sanchez, Alvaro; Jackson, Charles S.; Huerta, Gabriel
2016-07-20
A new test statistic for climate model evaluation has been developed that potentially mitigates some of the limitations that exist for observing and representing field and space dependencies of climate phenomena. Traditionally such dependencies have been ignored when climate models have been evaluated against observational data, which makes it difficult to assess whether any given model is simulating observed climate for the right reasons. The new statistic uses Gaussian Markov random fields for estimating field and space dependencies within a first-order grid point neighborhood structure. We illustrate the ability of Gaussian Markov random fields to represent empirical estimates of field and space covariances using "witch hat" graphs. We further use the new statistic to evaluate the tropical response of a climate model (CAM3.1) to changes in two parameters important to its representation of cloud and precipitation physics. Overall, the inclusion of dependency information did not alter significantly the recognition of those regions of parameter space that best approximated observations. However, there were some qualitative differences in the shape of the response surface that suggest how such a measure could affect estimates of model uncertainty.
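A Gaussian Markov random field with the first-order grid-point neighborhood structure mentioned above can be built from a sparse precision matrix; the precision parameters kappa and tau and the grid size below are illustrative, and this is a generic construction rather than the study's exact statistic.

```python
# Sketch: GMRF with a first-order (4-neighbour) grid structure and one sample draw.
import numpy as np
import scipy.sparse as sp

n = 20                                              # grid is n x n
N = n * n
idx = np.arange(N).reshape(n, n)

# Adjacency of the 4-neighbour lattice (right and down edges, symmetrized)
right = np.stack([idx[:, :-1].ravel(), idx[:, 1:].ravel()], axis=1)
down = np.stack([idx[:-1, :].ravel(), idx[1:, :].ravel()], axis=1)
edges = np.vstack([right, down])
rows = np.concatenate([edges[:, 0], edges[:, 1]])
cols = np.concatenate([edges[:, 1], edges[:, 0]])
W = sp.coo_matrix((np.ones(len(rows)), (rows, cols)), shape=(N, N))

kappa, tau = 0.1, 1.0                               # illustrative precision parameters
D = sp.diags(np.asarray(W.sum(axis=1)).ravel())
Q = kappa * sp.identity(N) + tau * (D - W)          # sparse precision matrix

# One sample from N(0, Q^{-1}): solve L^T x = z with Q = L L^T (dense is fine at this size)
L = np.linalg.cholesky(Q.toarray())
x = np.linalg.solve(L.T, np.random.default_rng(6).standard_normal(N))
print(x.reshape(n, n).shape)
```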
Random-phase metasurfaces at optical wavelengths
NASA Astrophysics Data System (ADS)
Pors, Anders; Ding, Fei; Chen, Yiting; Radko, Ilya P.; Bozhevolnyi, Sergey I.
2016-06-01
Random-phase metasurfaces, in which the constituents scatter light with random phases, have the property that an incident plane wave will diffusely scatter, thereby leading to a complex far-field response that is most suitably described by statistical means. In this work, we present and exemplify the statistical description of the far-field response, particularly highlighting how the response for polarised and unpolarised light might be alike or different depending on the correlation of scattering phases for two orthogonal polarisations. By utilizing gap plasmon-based metasurfaces, consisting of an optically thick gold film overlaid by a subwavelength thin glass spacer and an array of gold nanobricks, we design and realize random-phase metasurfaces at a wavelength of 800 nm. Optical characterisation of the fabricated samples convincingly demonstrates the diffuse scattering of reflected light, with statistics obeying the theoretical predictions. We foresee the use of random-phase metasurfaces for camouflage applications and as high-quality reference structures in dark-field microscopy, while the control of the statistics for polarised and unpolarised light might find usage in security applications. Finally, by incorporating a certain correlation between scattering by neighbouring metasurface constituents, new types of functionalities can be realised, such as a Lambertian reflector.
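The diffuse far-field response of a random-phase array can be previewed numerically with a Fourier-transform far-field approximation; the array size, padding and uniform phase statistics below are illustrative assumptions, not the fabricated metasurface's parameters.

```python
# Sketch: far-field speckle of an array of scatterers with independent random phases.
import numpy as np

rng = np.random.default_rng(7)
n, pad = 64, 512                                     # n x n scatterers, zero-padded FFT
phases = rng.uniform(0, 2 * np.pi, size=(n, n))      # uniformly random scattering phases
aperture = np.zeros((pad, pad), dtype=complex)
aperture[:n, :n] = np.exp(1j * phases)

far_field = np.fft.fftshift(np.fft.fft2(aperture))   # far field ~ Fourier transform of aperture
intensity = np.abs(far_field) ** 2
print("speckle contrast:", round(float(intensity.std() / intensity.mean()), 2))  # ~1 for developed speckle
```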
Regression analysis of longitudinal data with correlated censoring and observation times.
Li, Yang; He, Xin; Wang, Haiying; Sun, Jianguo
2016-07-01
Longitudinal data occur in many fields, such as medical follow-up studies that involve repeated measurements. For their analysis, most existing approaches assume that the observation or follow-up times are independent of the response process, either completely or given some covariates. In practice, it is apparent that this may not be true. In this paper, we present a joint analysis approach that allows for possible mutual correlations, which can be characterized by time-dependent random effects. Estimating equations are developed for the parameter estimation and the resulting estimators are shown to be consistent and asymptotically normal. The finite sample performance of the proposed estimators is assessed through a simulation study, and an illustrative example from a skin cancer study is provided.
NASA Astrophysics Data System (ADS)
Blanc-Benon, Philippe; Lipkens, Bart; Dallois, Laurent; Hamilton, Mark F.; Blackstock, David T.
2002-01-01
Sonic boom propagation can be affected by atmospheric turbulence. It has been shown that turbulence affects the perceived loudness of sonic booms, mainly by changing its peak pressure and rise time. The models reported here describe the nonlinear propagation of sound through turbulence. Turbulence is modeled as a set of individual realizations of a random temperature or velocity field. In the first model, linear geometrical acoustics is used to trace rays through each realization of the turbulent field. A nonlinear transport equation is then derived along each eigenray connecting the source and receiver. The transport equation is solved by a Pestorius algorithm. In the second model, the KZK equation is modified to account for the effect of a random temperature field and it is then solved numerically. Results from numerical experiments that simulate the propagation of spark-produced N waves through turbulence are presented. It is observed that turbulence decreases, on average, the peak pressure of the N waves and increases the rise time. Nonlinear distortion is less when turbulence is present than without it. The effects of random vector fields are stronger than those of random temperature fields. The location of the caustics and the deformation of the wave front are also presented. These observations confirm the results from the model experiment in which spark-produced N waves are used to simulate sonic boom propagation through a turbulent atmosphere.
Flight-path estimation in passive low-altitude flight by visual cues
NASA Technical Reports Server (NTRS)
Grunwald, Arthur J.; Kohn, S.
1993-01-01
A series of experiments was conducted, in which subjects had to estimate the flight path while passively being flown in straight or in curved motion over several types of nominally flat, textured terrain. Three computer-generated terrain types were investigated: (1) a random 'pole' field, (2) a flat field consisting of random rectangular patches, and (3) a field of random parallelepipeds. Experimental parameters were the velocity-to-height (V/h) ratio, the viewing distance, and the terrain type. Furthermore, the effect of obscuring parts of the visual field was investigated. Assumptions were made about the basic visual-field information by analyzing the pattern of line-of-sight (LOS) rate vectors in the visual field. The experimental results support these assumptions and show that, for both a straight as well as a curved flight path, the estimation accuracy and estimation times improve with the V/h ratio. Error scores for the curved flight path are found to be about 3 deg in visual angle higher than for the straight flight path, and the sensitivity to the V/h ratio is found to be considerably larger. For the straight motion, the flight path could be estimated successfully from local areas in the far field. Curved flight-path estimates have to rely on the entire LOS rate pattern.
NASA Astrophysics Data System (ADS)
Angeli, Andrea; Cornelis, Bram; Troncossi, Marco
2018-03-01
In many real-life environments, mechanical and electronic systems are subjected to vibrations that may induce dynamic loads and potentially lead to early failure due to fatigue damage. Thus, qualification tests by means of shakers are advisable for the most critical components in order to verify their durability throughout the entire life cycle. Nowadays the trend is to tailor the qualification tests to the specific application of the tested component, considering the measured field data as the reference for setting up the experimental campaign, for example through the so-called "Mission Synthesis" methodology. One of the main issues is to define the excitation profiles for the tests, which must have, besides the (potentially scaled) frequency content, also the same damage potential as the field data, despite being applied for a limited duration. With this target, the current procedures generally provide the test profile as a stationary random vibration specified by a Power Spectral Density (PSD). In certain applications this output may prove inadequate to represent the nature of the reference signal, and the procedure could result in an unrealistic qualification test. For instance, when a rotating part is present in the system, the component under analysis may be subjected to Sine-on-Random (SoR) vibrations, namely excitations composed of sinusoidal contributions superimposed on random vibrations. In this case, the synthesized test profile should preserve not only the induced fatigue damage but also the deterministic components of the environmental vibration. In this work, the potential advantages of a novel procedure to synthesize SoR profiles instead of PSDs for qualification tests are presented and supported by the results of an experimental campaign.
Intermittency and random matrices
NASA Astrophysics Data System (ADS)
Sokoloff, Dmitry; Illarionov, E. A.
2015-08-01
A spectacular phenomenon of intermittency, i.e. a progressive growth of higher statistical moments of a physical field excited by an instability in a random medium, attracted the attention of Zeldovich in the last years of his life. At that time, the mathematical aspects underlying the physical description of this phenomenon were still under development and relations between various findings in the field remained obscure. Contemporary results from the theory of the product of independent random matrices (the Furstenberg theory) allowed the elaboration of the phenomenon of intermittency in a systematic way. We consider applications of the Furstenberg theory to some problems in cosmology and dynamo theory.
Smooth invariant densities for random switching on the torus
NASA Astrophysics Data System (ADS)
Bakhtin, Yuri; Hurth, Tobias; Lawley, Sean D.; Mattingly, Jonathan C.
2018-04-01
We consider a random dynamical system obtained by switching between the flows generated by two smooth vector fields on the 2d-torus, with the random switchings happening according to a Poisson process. Assuming that the driving vector fields are transversal to each other at all points of the torus and that each of them allows for a smooth invariant density and no periodic orbits, we prove that the switched system also has a smooth invariant density, for every switching rate. Our approach is based on an integration by parts formula inspired by techniques from Malliavin calculus.
Space-time models based on random fields with local interactions
NASA Astrophysics Data System (ADS)
Hristopulos, Dionissios T.; Tsantili, Ivi C.
2016-08-01
The analysis of space-time data from complex, real-life phenomena requires the use of flexible and physically motivated covariance functions. In most cases, it is not possible to explicitly solve the equations of motion for the fields or the respective covariance functions. In the statistical literature, covariance functions are often based on mathematical constructions. In this paper, we propose deriving space-time covariance functions by solving “effective equations of motion”, which can be used as statistical representations of systems with diffusive behavior. In particular, we propose to formulate space-time covariance functions based on an equilibrium effective Hamiltonian using the linear response theory. The effective space-time dynamics is then generated by a stochastic perturbation around the equilibrium point of the classical field Hamiltonian leading to an associated Langevin equation. We employ a Hamiltonian which extends the classical Gaussian field theory by including a curvature term and leads to a diffusive Langevin equation. Finally, we derive new forms of space-time covariance functions.
NASA Astrophysics Data System (ADS)
Youn, Dong Joon
This thesis presents the development and validation of an advanced hydro-mechanically coupled finite element program analyzing hydraulic fracture propagation within unconventional hydrocarbon formations under various conditions. Realistic modeling of hydraulic fracturing is required to improve the understanding and efficiency of the stimulation technique. Such modeling remains highly challenging, however, due to factors including the complexity of fracture propagation mechanisms, the coupled behavior of fracture displacement and fluid pressure, the interactions between pre-existing natural fractures and initiated hydraulic fractures, and the formation heterogeneity of the target reservoir. In this research, an eXtended Finite Element Method (XFEM) scheme is developed allowing for the representation of single or multiple fracture propagations without any need for re-meshing. The coupled flows through the fracture are also considered in the program to account for their influence on stresses and deformations along the hydraulic fracture. A sequential coupling scheme is applied to estimate fracture aperture and fluid pressure with the XFEM. The coupled XFEM program is then used to estimate wellbore bottomhole pressure during fracture propagation, and the pressure variations are analyzed to determine the geometry and performance of the hydraulic fracturing, as in a pressure leak-off test. Finally, material heterogeneity is included in the XFEM program to examine the effect of random formation property distributions on the hydraulic fracture geometry. Random field theory is used to create random realizations of the material heterogeneity, taking into account the mean, standard deviation, and property correlation length. These analyses lead to probabilistic information on the response of unconventional reservoirs and offer a more scientific approach to risk management for unconventional reservoir stimulation. The new stochastic approach combining XFEM and random fields is named the eXtended Random Finite Element Method (XRFEM). All the numerical analysis codes in this thesis are written in Fortran 2003, and these codes are applicable as a series of sub-modules within a suite of finite element codes developed by Smith and Griffiths (2004).
Thought Field Therapy (TFT) as a treatment for anxiety symptoms: a randomized controlled trial.
Irgens, Audun; Dammen, Toril; Nysæter, Tor Erik; Hoffart, Asle
2012-01-01
To investigate whether thought field therapy (TFT) has an impact on anxiety symptoms in patients with a variety of anxiety disorders. Forty-five patients were randomized to either TFT (n = 23) or a waiting list (n = 22) condition. The wait-list group was reassessed and compared with the TFT group two and a half months after the initial evaluation. After the reassessment, the wait-list patients received treatment with TFT. All 45 patients were followed up one to two weeks after TFT treatment, as well as at three and 12 months after treatment. Patients with an anxiety disorder, mostly outpatients. TFT aims to influence the body's bioenergy field by tapping on specific points along energy meridians, thereby relieving anxiety and other symptoms. Symptom Checklist 90-Revised, Hospital Anxiety and Depression Scale, the Sheehan Disability Scale. Repeated-measures analysis of variance was used to compare the TFT and the wait-list group. The TFT group had a significantly better outcome on two measures of anxiety and one measure of function. Follow-up data for all patients taken together showed a significant decrease in all symptoms during the one to two weeks between the pretreatment and the post-treatment assessments. The significant improvement seen after treatment was maintained at the three- and 12-month assessments. The results suggest that TFT may have an enduring anxiety-reducing effect. Copyright © 2012 Elsevier Inc. All rights reserved.
Probability distribution of the entanglement across a cut at an infinite-randomness fixed point
NASA Astrophysics Data System (ADS)
Devakul, Trithep; Majumdar, Satya N.; Huse, David A.
2017-03-01
We calculate the probability distribution of entanglement entropy S across a cut of a finite one-dimensional spin chain of length L at an infinite-randomness fixed point using Fisher's strong randomness renormalization group (RG). Using the random transverse-field Ising model as an example, the distribution is shown to take the form p(S|L) ~ L^{-ψ(k)}, where k ≡ S/ln[L/L_0], the large deviation function ψ(k) is found explicitly, and L_0 is a nonuniversal microscopic length. We discuss the implications of such a distribution for numerical techniques that rely on entanglement, such as matrix-product-state-based techniques. Our results are verified with numerical RG simulations, as well as with the actual entanglement entropy distribution for the random transverse-field Ising model, which we calculate for large L via a mapping to Majorana fermions.
Carlos Alberto Silva; Carine Klauberg; Andrew Thomas Hudak; Lee Alexander Vierling; Wan Shafrina Wan Mohd Jaafar; Midhun Mohan; Mariano Garcia; Antonio Ferraz; Adrian Cardil; Sassan Saatchi
2017-01-01
Improvements in the management of pine plantations result in multiple industrial and environmental benefits. Remote sensing techniques can dramatically increase the efficiency of plantation management by reducing or replacing time-consuming field sampling. We tested the utility and accuracy of combining field and airborne lidar data with Random Forest, a supervised...
Rippled graphene in an in-plane magnetic field: effects of a random vector potential.
Lundeberg, Mark B; Folk, Joshua A
2010-10-01
We report measurements of the effects of a random vector potential generated by applying an in-plane magnetic field to a graphene flake. Magnetic flux through the ripples causes orbital effects: phase-coherent weak localization is suppressed, while quasirandom Lorentz forces lead to anisotropic magnetoresistance. Distinct signatures of these two effects enable the ripple size to be characterized.
Effect of non-ionizing electromagnetic field on the alteration of ovarian follicles in rats.
Ahmadi, Seyed Shahin; Khaki, Amir Afshin; Ainehchi, Nava; Alihemmati, Alireza; Khatooni, Azam Asghari; Khaki, Arash; Asghari, Ali
2016-03-01
In recent years, increasing attention has been paid to the safety, environmental, and public health effects of extremely low frequency electromagnetic fields (ELF-EMF) and radio frequency electromagnetic fields (RF-EMF). The aim of this research was to determine the effect of EMF on the alteration of ovarian follicles. In this experimental study at Tabriz Medical University in 2015, we performed EMF exposures and assessed the alteration of rats' ovarian follicles. Thirty three-month-old rats were selected randomly from laboratory animals and, after their ages and weights were determined, they were divided randomly into three groups. The control group consisted of 10 rats without any treatment, and they were kept in normal conditions. The second group of rats was exposed to a magnetic field of 50 Hz for eight weeks (three weeks intrauterine and five weeks ectopic). The third group of rats was exposed to a magnetic field of 50 Hz for 13 weeks (three weeks intrauterine and ten weeks ectopic). Samples were fixed in 10% buffered formaldehyde, cleared with xylol, and embedded in paraffin. After sectioning and staining, samples were studied by optical microscopy. Finally, SPSS version 17 was used for data analysis. EMF radiation increased the harmful effects on the formation of ovarian follicles and oocyte implantation. Studies on the effects of electromagnetic fields on ovarian follicles have shown that the nuclei of the oocytes become smaller and change shape. There were significant, harmful changes in the groups affected by electromagnetic waves. Atresia of ovarian follicles was significantly greater in both study groups compared to the control group (p < 0.05). Exposure to electromagnetic fields during embryonic development can cause morphological changes in oocytes and affect the differentiation of oocytes and folliculogenesis, resulting in decreased ovarian reserve leading to infertility or reduced fertility.
Mean-Field-Game Model for Botnet Defense in Cyber-Security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolokoltsov, V. N., E-mail: v.kolokoltsov@warwick.ac.uk; Bensoussan, A.
We initiate the analysis of the response of computer owners to various offers of defence systems against a cyber-hacker (for instance, a botnet attack), as a stochastic game of a large number of interacting agents. We introduce a simple mean-field game that models their behavior. It takes into account both the random process of the propagation of the infection (controlled by the botnet herder) and the decision-making process of customers. Its stationary version turns out to be exactly solvable (but not at all trivial) under the additional natural assumption that the execution time of the customers' decisions (say, switching the defence system on or off) is much faster than the infection rates.
Modeling epidemic spread with awareness and heterogeneous transmission rates in networks.
Shang, Yilun
2013-06-01
During an epidemic outbreak in a human population, susceptibility to infection can be reduced by raising awareness of the disease. In this paper, we investigate the effects of three forms of awareness (i.e., contact, local, and global) on the spread of a disease in a random network. Connectivity-correlated transmission rates are assumed. By using mean-field theory and numerical simulation, we show that both local and contact awareness can raise the epidemic thresholds while global awareness cannot, which mirrors the recent results of Wu et al. The obtained results point out that individual behaviors in the presence of an infectious disease have a great influence on the epidemic dynamics. Our method enriches mean-field analysis in epidemic models.
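A minimal degree-homogeneous mean-field sketch of how such awareness terms enter SIS-type models (not the paper's connectivity-correlated formulation): a constant contact-awareness factor rescales the transmission rate and thus shifts the threshold, while a prevalence-dependent global-awareness factor vanishes at low prevalence and does not. All parameter values are illustrative assumptions.

```python
import numpy as np

def simulate(beta, gamma=1.0, k=6, contact_aw=0.7, global_aw=0.5,
             rho0=0.01, dt=0.01, steps=20000):
    """Iterate d(rho)/dt = beta_eff * k * rho * (1 - rho) - gamma * rho."""
    rho = rho0
    for _ in range(steps):
        # contact awareness: constant reduction; global awareness: reduction grows with prevalence
        beta_eff = beta * contact_aw * (1.0 - global_aw * rho)
        rho += dt * (beta_eff * k * rho * (1.0 - rho) - gamma * rho)
    return rho

for beta in (0.1, 0.2, 0.3, 0.5):
    print(f"beta={beta:.2f}  stationary prevalence ~ {simulate(beta):.3f}")
```

Below threshold (small beta) the prevalence decays to zero; above it, a finite endemic level is reached whose value, but not the threshold itself, depends on the global-awareness strength.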
Towards a high-speed quantum random number generator
NASA Astrophysics Data System (ADS)
Stucki, Damien; Burri, Samuel; Charbon, Edoardo; Chunnilall, Christopher; Meneghetti, Alessio; Regazzoni, Francesco
2013-10-01
Randomness is of fundamental importance in various fields, such as cryptography, numerical simulations, or the gaming industry. Quantum physics, which is fundamentally probabilistic, is the best option for a physical random number generator. In this article, we will present the work carried out in various projects in the context of the development of a commercial and certified high speed random number generator.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, A.B.; Clothiaux, E.
Because of Earth's gravitational field, its atmosphere is strongly anisotropic with respect to the vertical; the effect of the Earth's rotation on synoptic wind patterns also causes a more subtle form of anisotropy in the horizontal plane. The authors survey various approaches to statistically robust anisotropy from a wavelet perspective and present a new one adapted to strongly non-isotropic fields that are sampled on a rectangular grid with a large aspect ratio. This novel technique uses an anisotropic version of Multi-Resolution Analysis (MRA) in image analysis; the authors form a tensor product of the standard dyadic Haar basis, where the dividing ratio is λ_z = 2, and a nonstandard triadic counterpart, where the dividing ratio is λ_x = 3. The natural support of the field is therefore 2^n pixels (vertically) by 3^n pixels (horizontally), where n is the number of levels in the MRA. The natural triadic basis includes the French top-hat wavelet, which resonates with bumps in the field, whereas the Haar wavelet responds to ramps or steps. The complete 2D basis has one scaling function and five wavelets. The resulting anisotropic MRA is designed for application to the liquid water content (LWC) field in boundary-layer clouds, as the prevailing wind advects them by a vertically pointing mm-radar system. Spatial correlations are notoriously long-range in cloud structure and the authors use the wavelet coefficients from the new MRA to characterize these correlations in a multifractal analysis scheme. In the present study, the MRA is used (in synthesis mode) to generate fields that mimic cloud structure quite realistically although only a few parameters are used to control the randomness of the LWC's wavelet coefficients.
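A minimal one-level sketch of the 2 × 3 tensor-product analysis described above. The exact triadic wavelet pair and normalizations used by the authors are not specified here, so the horizontal filters below (a triadic average, a top-hat, and a ramp) are plausible stand-ins rather than the paper's definitions:

```python
import numpy as np

# Haar (dyadic, vertical) filters
h_scal = np.array([1.0, 1.0]) / np.sqrt(2.0)
h_wav  = np.array([1.0, -1.0]) / np.sqrt(2.0)

# Triadic (horizontal) filters: scaling average, "French top-hat", and a ramp (assumed forms)
t_scal = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)
t_hat  = np.array([-1.0, 2.0, -1.0]) / np.sqrt(6.0)
t_ramp = np.array([1.0, 0.0, -1.0]) / np.sqrt(2.0)

def analyze_level(field):
    """One level of the 2x3 anisotropic MRA: coarse (scaling) coefficients plus five wavelet channels."""
    nz, nx = field.shape
    assert nz % 2 == 0 and nx % 3 == 0
    blocks = field.reshape(nz // 2, 2, nx // 3, 3)   # non-overlapping 2 (vertical) x 3 (horizontal) blocks
    out = {}
    for vname, v in (("s", h_scal), ("w", h_wav)):
        for hname, h in (("s", t_scal), ("hat", t_hat), ("ramp", t_ramp)):
            # inner product of every block with the separable filter v (x) h
            out[vname + "-" + hname] = np.einsum("izjx,z,x->ij", blocks, v, h)
    return out

lwc = np.random.default_rng(0).random((2**5, 3**5))   # stand-in LWC field, 32 x 243 pixels
coeffs = analyze_level(lwc)
print({k: v.shape for k, v in coeffs.items()})         # "s-s" is the scaling channel; the other five are wavelets
```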
Stochastic analysis of pitch angle scattering of charged particles by transverse magnetic waves
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lemons, Don S.; Liu Kaijun; Winske, Dan
2009-11-15
This paper describes a theory of the velocity space scattering of charged particles in a static magnetic field composed of a uniform background field and a sum of transverse, circularly polarized, magnetic waves. When that sum has many terms the autocorrelation time required for particle orbits to become effectively randomized is small compared with the time required for the particle velocity distribution to change significantly. In this regime the deterministic equations of motion can be transformed into stochastic differential equations of motion. The resulting stochastic velocity space scattering is described, in part, by a pitch angle diffusion rate that is a function of initial pitch angle and properties of the wave spectrum. Numerical solutions of the deterministic equations of motion agree with the theory at all pitch angles, for wave energy densities up to and above the energy density of the uniform field, and for different wave spectral shapes.
Gardiner, Stuart K; Demirel, Shaban; De Moraes, Carlos Gustavo; Liebmann, Jeffrey M; Cioffi, George A; Ritch, Robert; Gordon, Mae O; Kass, Michael A
2013-02-15
Trend analysis techniques to detect glaucomatous progression typically assume a constant rate of change. This study uses data from the Ocular Hypertension Treatment Study to assess whether this assumption decreases sensitivity to changes in progression rate, by including earlier periods of stability. Series of visual fields (mean 24 per eye) completed at 6-month intervals from participants randomized initially to observation were split into subseries before and after the initiation of treatment (the "split-point"). The mean deviation rate of change (MDR) was derived both from these entire subseries and from only the W tests nearest the split-point, for different window lengths W. A generalized estimating equation model was used to detect changes in MDR occurring at the split-point. Using shortened subseries with W = 7 tests, the MDR slowed by 0.142 dB/y upon initiation of treatment (P < 0.001), and the proportion of eyes showing "rapid deterioration" (MDR < -0.5 dB/y with P < 5%) decreased from 11.8% to 6.5% (P < 0.001). Using the entire sequence, no significant change in MDR was detected (P = 0.796), and there was no change in the proportion of eyes progressing (P = 0.084). Window lengths 6 ≤ W ≤ 9 produced similar benefits. Event analysis revealed a beneficial treatment effect in this dataset. This effect was not detected by linear trend analysis applied to entire series, but was detected when using shorter subseries of length between six and nine fields. Using linear trend analysis on the entire field sequence may not be optimal for detecting and monitoring progression. Nonlinear analyses may be needed for long series of fields. (ClinicalTrials.gov number, NCT00000125.).
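An illustrative sketch of the windowed rate-of-change idea, with a synthetic series and assumed data layout; the study's generalized estimating equation analysis is not reproduced:

```python
import numpy as np

def md_rate(times_years, md_values):
    """Ordinary least-squares slope of mean deviation against time, in dB/year."""
    slope, _ = np.polyfit(times_years, md_values, 1)
    return slope

# hypothetical eye: one visual field every 6 months, treatment starts at test index 14 (t = 7 years)
t = np.arange(24) * 0.5
md = np.where(t < 7.0, -1.0 - 0.6 * t, -1.0 - 0.6 * 7.0 - 0.2 * (t - 7.0))   # slower decline after treatment
md += np.random.default_rng(0).normal(0, 0.3, t.size)                        # test-retest noise

split, W = 14, 7
print("pre-split rate, full subseries :", round(md_rate(t[:split], md[:split]), 3))
print("pre-split rate, last W tests   :", round(md_rate(t[split - W:split], md[split - W:split]), 3))
print("post-split rate, first W tests :", round(md_rate(t[split:split + W], md[split:split + W]), 3))
```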
A Multilevel, Hierarchical Sampling Technique for Spatially Correlated Random Fields
Osborn, Sarah; Vassilevski, Panayot S.; Villa, Umberto
2017-10-26
In this paper, we propose an alternative method to generate samples of a spatially correlated random field with applications to large-scale problems for forward propagation of uncertainty. A classical approach for generating these samples is the Karhunen-Loève (KL) decomposition. However, the KL expansion requires solving a dense eigenvalue problem and is therefore computationally infeasible for large-scale problems. Sampling methods based on stochastic partial differential equations provide a highly scalable way to sample Gaussian fields, but the resulting parametrization is mesh dependent. We propose a multilevel decomposition of the stochastic field to allow for scalable, hierarchical sampling based on solving a mixed finite element formulation of a stochastic reaction-diffusion equation with a random, white noise source function. Lastly, numerical experiments are presented to demonstrate the scalability of the sampling method as well as numerical results of multilevel Monte Carlo simulations for a subsurface porous media flow application using the proposed sampling method.
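A minimal sketch of the SPDE sampling idea on a structured 2-D grid with a finite-difference Laplacian; the paper uses a mixed finite element discretization and a multilevel hierarchy, neither of which is reproduced here, and the grid size, κ, and white-noise scaling are illustrative assumptions:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

n, h, kappa = 128, 1.0 / 128, 8.0                                  # grid size, spacing, inverse correlation length
I1 = sp.identity(n)
L1 = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / h**2   # 1-D Laplacian stencil
A = kappa**2 * sp.identity(n * n) - (sp.kron(I1, L1) + sp.kron(L1, I1))   # (kappa^2 - Laplacian)

rng = np.random.default_rng(0)
w = rng.normal(size=n * n) / h                 # discretized white-noise source (variance ~ 1/cell area)
u = spsolve(A.tocsc(), w).reshape(n, n)        # one realization of the correlated Gaussian field
print(u.shape, round(float(u.std()), 4))
```

Each new realization only requires a new white-noise right-hand side and another sparse solve, which is what makes the approach scalable compared with a dense KL eigenvalue problem.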
NASA Astrophysics Data System (ADS)
Rychlik, Igor; Mao, Wengang
2018-02-01
The wind speed variability in the North Atlantic has been successfully modelled using a spatio-temporal transformed Gaussian field. However, this type of model does not correctly describe the extreme wind speeds attributed to tropical storms and hurricanes. In this study, the transformed Gaussian model is further developed to include the occurrence of severe storms. In this new model, random components are added to the transformed Gaussian field to model rare events with extreme wind speeds. The resulting random field is locally stationary and homogeneous. The localized dependence structure is described by time- and space-dependent parameters. The parameters have a natural physical interpretation. To exemplify its application, the model is fitted to the ECMWF ERA-Interim reanalysis data set. The model is applied to compute long-term wind speed distributions and return values, e.g., 100- or 1000-year extreme wind speeds, and to simulate random wind speed time series at a fixed location or spatio-temporal wind fields around that location.
NASA Astrophysics Data System (ADS)
Duan, Xueyang
The objective of this dissertation is to develop forward scattering models for active microwave remote sensing of natural features represented by layered media with rough interfaces. In particular, soil profiles are considered, for which a model of electromagnetic scattering from multilayer rough surfaces with or without buried random media is constructed. Starting from a single rough surface, radar scattering is modeled using the stabilized extended boundary condition method (SEBCM). This method solves the long-standing instability issue of the classical EBCM, and gives three-dimensional full wave solutions over large ranges of surface roughnesses with higher computational efficiency than pure numerical solutions, e.g., method of moments (MoM). Based on this single surface solution, multilayer rough surface scattering is modeled using the scattering matrix approach and the model is used for a comprehensive sensitivity analysis of the total ground scattering as a function of layer separation, subsurface statistics, and sublayer dielectric properties. The buried inhomogeneities such as rocks and vegetation roots are considered for the first time in the forward scattering model. Radar scattering from buried random media is modeled by the aggregate transition matrix using either the recursive transition matrix approach for spherical or short-length cylindrical scatterers, or the generalized iterative extended boundary condition method we developed for long cylinders or root-like cylindrical clusters. These approaches take the field interactions among scatterers into account with high computational efficiency. The aggregate transition matrix is transformed to a scattering matrix for the full solution to the layered-medium problem. This step is based on the near-to-far field transformation of the numerical plane wave expansion of the spherical harmonics and the multipole expansion of plane waves. This transformation consolidates volume scattering from the buried random medium with the scattering from layered structure in general. Combined with scattering from multilayer rough surfaces, scattering contributions from subsurfaces and vegetation roots can be then simulated. Solutions of both the rough surface scattering and random media scattering are validated numerically, experimentally, or both. The experimental validations have been carried out using a laboratory-based transmit-receive system for scattering from random media and a new bistatic tower-mounted radar system for field-based surface scattering measurements.
NASA Astrophysics Data System (ADS)
Aksoy, A.; Lee, J. H.; Kitanidis, P. K.
2016-12-01
Heterogeneity in hydraulic conductivity (K) impacts the transport and fate of contaminants in the subsurface as well as the design and operation of managed aquifer recharge (MAR) systems. Recently, improvements in computational resources and the availability of big data through electrical resistivity tomography (ERT) and remote sensing have provided opportunities to better characterize the subsurface. Yet, there is a need to improve prediction and evaluation methods in order to extract information from field measurements for better field characterization. In this study, genetic algorithm optimization, which has been widely used in optimal aquifer remediation designs, was used to determine the spatial distribution of K. A hypothetical 2 km by 2 km aquifer was considered. A genetic algorithm library, PGAPack, was linked with a fast Fourier transform based random field generator as well as a groundwater flow and contaminant transport simulation model (BIO2D-KE). The objective of the optimization model was to minimize the total squared error between measured and predicted field values. It was assumed that measured K values were available through ERT. The performance of the genetic algorithm in predicting the distribution of K was tested for different cases. In the first case, observed K values were evaluated using the random field generator alone as the forward model. In the second case, in addition to K values obtained through ERT, measured head values were incorporated into the evaluation, with BIO2D-KE and the random field generator used as the forward models. Lastly, tracer concentrations were used as additional information in the optimization model. Initial results indicated enhanced performance when the random field generator and BIO2D-KE were used in combination to predict the spatial distribution of K.
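The abstract mentions an FFT-based random field generator as a forward model. The following is a generic spectral-synthesis sketch of that idea, not the specific generator coupled to PGAPack or BIO2D-KE; the spectrum shape and log-conductivity statistics are assumptions:

```python
import numpy as np

def gaussian_field_fft(n=256, corr_len=0.05, rng=np.random.default_rng(1)):
    """Filter complex white noise in Fourier space with sqrt of an assumed spectrum."""
    k = np.fft.fftfreq(n)                                     # dimensionless wavenumbers
    kx, ky = np.meshgrid(k, k, indexing="ij")
    spectrum = np.exp(-(kx**2 + ky**2) * (corr_len * n)**2)   # assumed Gaussian-shaped spectrum
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    field = np.real(np.fft.ifft2(np.sqrt(spectrum) * noise))
    return (field - field.mean()) / field.std()               # standardize to zero mean, unit variance

logK = -10.0 + 1.5 * gaussian_field_fft()                     # hypothetical mean and std of ln K
K = np.exp(logK)                                              # heterogeneous conductivity field
print(float(K.min()), float(K.max()))
```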
Osborn, Sarah; Zulian, Patrick; Benson, Thomas; ...
2018-01-30
This work describes a domain embedding technique between two nonmatching meshes used for generating realizations of spatially correlated random fields with applications to large-scale sampling-based uncertainty quantification. The goal is to apply the multilevel Monte Carlo (MLMC) method for the quantification of output uncertainties of PDEs with random input coefficients on general and unstructured computational domains. We propose a highly scalable, hierarchical sampling method to generate realizations of a Gaussian random field on a given unstructured mesh by solving a reaction–diffusion PDE with a stochastic right-hand side. The stochastic PDE is discretized using the mixed finite element method on an embedded domain with a structured mesh, and then, the solution is projected onto the unstructured mesh. This work describes implementation details on how to efficiently transfer data from the structured and unstructured meshes at coarse levels, assuming that this can be done efficiently on the finest level. We investigate the efficiency and parallel scalability of the technique for the scalable generation of Gaussian random fields in three dimensions. An application of the MLMC method is presented for quantifying uncertainties of subsurface flow problems. Here, we demonstrate the scalability of the sampling method with nonmatching mesh embedding, coupled with a parallel forward model problem solver, for large-scale 3D MLMC simulations with up to 1.9·10^9 unknowns.
On resonant coupling of acoustic waves and gravity waves
NASA Astrophysics Data System (ADS)
Millet, Christophe
2017-11-01
Acoustic propagation in the atmosphere is often modeled using modes that are confined within waveguides causing the sound to propagate through multiple paths to the receiver. On the other hand, direct observations in the lower stratosphere show that the gravity wave field is intermittent, and is often dominated by rather well defined large-amplitude wave packets. In the present work, we use normal modes to describe both the gravity wave field and the acoustic field. The gravity wave spectrum is obtained by launching a few monochromatic waves whose properties are chosen stochastically to mimic the intermittency. Owing to the disparity of the gravity and acoustic length scales, the interactions between the gravity wave field and each of the acoustic modes can be described using a multiple-scale analysis. The appropriate amplitude evolution equation for the acoustic field involves certain random terms that can be directly related to the gravity wave sources. We will show that the cumulative effect of gravity wave breakings makes the sensitivity of ground-based acoustic signals large, in that small changes in the gravity wave parameterization can create or destroy specific acoustic features.
Transport of Charged Particles in Turbulent Magnetic Fields
NASA Astrophysics Data System (ADS)
Parashar, T.; Subedi, P.; Sonsrettee, W.; Blasi, P.; Ruffolo, D. J.; Matthaeus, W. H.; Montgomery, D.; Chuychai, P.; Dmitruk, P.; Wan, M.; Chhiber, R.
2017-12-01
Magnetic fields permeate the Universe. They are found in planets, stars, galaxies, and the intergalactic medium. The magnetic fields found in these astrophysical systems are usually chaotic, disordered, and turbulent. The investigation of the transport of cosmic rays in magnetic turbulence is a subject of considerable interest. One of the important aspects of cosmic ray transport is to understand their diffusive behavior and to calculate the diffusion coefficient in the presence of these turbulent fields. Research has most frequently concentrated on determining the diffusion coefficient in the presence of a mean magnetic field. Here, we will particularly focus on calculating diffusion coefficients of charged particles and magnetic field lines in a fully three-dimensional isotropic turbulent magnetic field with no mean field, which may be pertinent to many astrophysical situations. For charged particles in isotropic turbulence we identify different ranges of particle energy depending upon the ratio of the Larmor radius of the charged particle to the characteristic outer length scale of the turbulence. Different theoretical models are proposed to calculate the diffusion coefficient, each applicable to a distinct range of particle energies. The theoretical ideas are tested against results of detailed numerical experiments using Monte-Carlo simulations of particle propagation in stochastic magnetic fields. We also discuss two different methods of generating random magnetic field to study charged particle propagation using numerical simulation. One method is the usual way of generating random fields with a specified power law in wavenumber space, using Gaussian random variables. Turbulence, however, is non-Gaussian, with variability that comes in bursts called intermittency. We therefore devise a way to generate synthetic intermittent fields which have many properties of realistic turbulence. Possible applications of such synthetically generated intermittent fields are discussed.
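As a small illustration of the diffusion-coefficient estimation mentioned above, the sketch below computes a running diffusion coefficient from the mean-square displacement of an ensemble of trajectories. The trajectories here are plain random walks standing in for Monte-Carlo particle orbits integrated in a synthetic turbulent field:

```python
import numpy as np

rng = np.random.default_rng(2)
n_particles, n_steps, dt = 1000, 2000, 0.01
steps = rng.normal(scale=np.sqrt(dt), size=(n_particles, n_steps, 3))   # stand-in displacements per step
x = np.cumsum(steps, axis=1)                                            # trajectories, shape (N, T, 3)

t = dt * np.arange(1, n_steps + 1)
msd = np.mean(np.sum(x**2, axis=2), axis=0)    # <|x(t) - x(0)|^2> averaged over the ensemble
D_run = msd / (6.0 * t)                        # 3-D running diffusion coefficient D(t)
print("late-time D ~", round(float(D_run[-1]), 3))   # ~0.5 for these unit-variance stand-in steps
```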
Nature of magnetization and lateral spin-orbit interaction in gated semiconductor nanowires.
Karlsson, H; Yakimenko, I I; Berggren, K-F
2018-05-31
Semiconductor nanowires are interesting candidates for the realization of spintronics devices. In this paper we study electronic states and effects of lateral spin-orbit coupling (LSOC) in a one-dimensional asymmetrically biased nanowire using the Hartree-Fock method with Dirac interaction. We have shown that spin polarization can be triggered by LSOC at finite source-drain bias, as a result of numerical noise representing a random magnetic field due to wiring or a random background field such as the Earth's magnetic field, for instance. The electrons spontaneously arrange into spin rows in the wire due to electron interactions, leading to a finite spin polarization. The direction of polarization is, however, random at zero source-drain bias. We have found that LSOC has an effect on the orientation of spin rows only in the case when a source-drain bias is applied.
A multiple scattering theory for EM wave propagation in a dense random medium
NASA Technical Reports Server (NTRS)
Karam, M. A.; Fung, A. K.; Wong, K. W.
1985-01-01
For a dense medium of randomly distributed scatterers, an integral formulation for the total coherent field has been developed. This formulation accounts for the multiple scattering of electromagnetic waves, including both the two- and three-particle terms. It is shown that under the Markovian assumption the total coherent field and the effective field have the same effective wave number. As an illustration of this theory, the effective wave number and the extinction coefficient are derived in terms of the polarizability tensor and the pair distribution function for randomly distributed small spherical scatterers. It is found that the contribution of the three-particle term increases with the particle size, the volume fraction, the frequency and the permittivity of the particle. This increase is more significant with frequency and particle size than with other parameters.
Large-scale inverse model analyses employing fast randomized data reduction
NASA Astrophysics Data System (ADS)
Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan
2017-08-01
When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
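A minimal sketch of the "sketching" idea, not the RGA/PCGA implementation in Julia/MADS: a Gaussian sketching matrix compresses a long observation vector and its sensitivity matrix before the least-squares fit. Problem sizes are kept small here for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n_obs, n_par, n_sketch = 20_000, 200, 400

J = rng.normal(size=(n_obs, n_par)) / np.sqrt(n_obs)        # stand-in sensitivity (Jacobian) matrix
m_true = rng.normal(size=n_par)
d = J @ m_true + 0.01 * rng.normal(size=n_obs)               # synthetic observations

S = rng.normal(size=(n_sketch, n_obs)) / np.sqrt(n_sketch)   # Gaussian sketching matrix
m_full, *_ = np.linalg.lstsq(J, d, rcond=None)               # fit against all observations
m_sk, *_ = np.linalg.lstsq(S @ J, S @ d, rcond=None)         # fit against the sketched (reduced) data

print("full-data error    :", round(float(np.linalg.norm(m_full - m_true)), 4))
print("sketched-data error:", round(float(np.linalg.norm(m_sk - m_true)), 4))
```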
Clustering and Hazard Estimation in the Auckland Volcanic Field, New Zealand
NASA Astrophysics Data System (ADS)
Cronin, S. J.; Bebbington, M. S.
2009-12-01
The Auckland Volcanic Field (AVF) with its 49 eruptive centres formed over the last c. 250 ka presents several unique challenges to our understanding of distributed volcanic field construction and evolution. Due to the youth of the field, high-resolution stratigraphy of eruption centres and ash-fall sequences is possible, allowing time-breaks, soil and peat formation between eruption units to be identified. Radiocarbon dating of sediments between volcanic deposits shows that at least five of the centres have erupted on more than one occasion, with time breaks of 50-100 years between episodes. In addition, paleomagnetic and ash fall evidence implies that there has been strong clustering of eruption events over time, with a specific “flare-up” event involving possibly up to 19 eruptions occurring between 35-25 ka, in spatially disparate locations. An additional complicating factor is that the only centre that shows any major evidence for evolution out of standard alkali basaltic compositions is also the youngest and largest in volume by several orders of magnitude. All of these features of the AVF, along with relatively poor age-control for many of the vents, make spatio-temporal hazard forecasting for the field based on assumptions of past behaviour extremely difficult. Any relationships that take volumetric considerations into account are particularly difficult, since any trend analysis produces unreasonably large future eruptions. The most reasonable model is spatial, via eruption location. We have re-examined the age progression of eruptive events in the AVF, incorporating the most reliable sources of age and stratigraphic data, including developing new correlations between ashfall records in cores and likely vent locations via a probabilistic model of tephra dispersal. A Monte Carlo procedure using the age-progression, stratigraphy and dating constraints can then randomly reproduce likely orderings of events in the field. These were fitted by a clustering-based model of vent locations as originally applied by Magill et al (2005: Mathematical Geol. 37: 227-242) to the Allen and Smith (1994; Geosci. Report Shizuoka Univ 20: 5-14) age ordering of volcanism at AVF. Applying this model, modified by allowing continuation of activity at or around the youngest event, to sampled age orderings from the Monte Carlo procedure shows a very different spatial forecast to the earlier analysis. It is also different to the distribution from randomly ordered events, implying there is at least some clustering control on the location of eruptions in the field. Further iterations of this modelling approach will be tested in relation to eruptive volume and applied to other comparative volcanic fields.
Modelling of squall with the generalised kinetic equation
NASA Astrophysics Data System (ADS)
Annenkov, Sergei; Shrira, Victor
2014-05-01
We study the long-term evolution of random wind waves using the new generalised kinetic equation (GKE). The GKE derivation [1] does not assume the quasi-stationarity of a random wave field. In contrast with the Hasselmann kinetic equation, the GKE can describe fast spectral changes occurring when a wave field is driven out of a quasi-equilibrium state by a fast increase or decrease of wind, or by other factors. In these cases, a random wave field evolves on the dynamic timescale typical of coherent wave processes, rather than on the kinetic timescale predicted by the conventional statistical theory. Besides that, the generalised theory allows one to trace the evolution of higher statistical moments of the field, notably the kurtosis, which is important for assessing the risk of freak waves and other applications. A new efficient and highly parallelised algorithm for the numerical simulation of the generalised kinetic equation is presented and discussed. Unlike in the case of the Hasselmann equation, the algorithm takes into account all (resonant and non-resonant) nonlinear wave interactions, but only approximately resonant interactions contribute to the spectral evolution. However, counter-intuitively, all interactions contribute to the kurtosis. Without forcing or dissipation, the algorithm is shown to conserve the relevant integrals. We show that under steady wind forcing the wave field evolution predicted by the GKE is close to the predictions of the conventional statistical theory, which is applicable in this case. In particular, we demonstrate the known long-term asymptotics for the evolution of the spectrum. When the wind forcing is not steady (in the simplest case, an instant increase or decrease of wind occurs), the generalised theory is the only way to study the spectral evolution, apart from the direct numerical simulation. The focus of the work is a detailed analysis of the fast evolution after an instant change of forcing, and of the subsequent transition to the new quasi-stationary state of a wave field. It is shown that both increase and decrease of wind lead to a significant transient increase of the dynamic kurtosis, although these changes remain small compared to the changes of the other component of the kurtosis, which is due to bound harmonics. A special consideration is given to the case of the squall, i.e. an instant and large (by a factor of 2-4) increase of wind, which lasts for O(10^2) characteristic wave periods. We show that fast adjustment processes lead to the formation of a transient spectrum, which has a considerably narrower peak than the spectra developed under a steady forcing. These transient spectra differ qualitatively from those predicted by the Hasselmann kinetic equation under the squall with the same parameters. 1. S. Annenkov, V. Shrira (2006) Role of non-resonant interactions in evolution of nonlinear random water wave fields, J. Fluid Mech. 561, 181-207.
Extended self-similarity in the two-dimensional metal-insulator transition
NASA Astrophysics Data System (ADS)
Moriconi, L.
2003-09-01
We show that extended self-similarity, a scaling phenomenon first observed in classical turbulent flows, holds for a two-dimensional metal-insulator transition that belongs to the universality class of random Dirac fermions. Deviations from multifractality, which in turbulence are due to the dominance of diffusive processes at small scales, appear in the condensed-matter context as a large-scale, finite-size effect related to the imposition of an infrared cutoff in the field theory formulation. We propose a phenomenological interpretation of extended self-similarity in the metal-insulator transition within the framework of the random β-model description of multifractal sets. As a natural step, our discussion is bridged to the analysis of strange attractors, where crossovers between multifractal and nonmultifractal regimes are found and extended self-similarity turns out to be verified as well.
RFMix: A Discriminative Modeling Approach for Rapid and Robust Local-Ancestry Inference
Maples, Brian K.; Gravel, Simon; Kenny, Eimear E.; Bustamante, Carlos D.
2013-01-01
Local-ancestry inference is an important step in the genetic analysis of fully sequenced human genomes. Current methods can only detect continental-level ancestry (i.e., European versus African versus Asian) accurately even when using millions of markers. Here, we present RFMix, a powerful discriminative modeling approach that is faster (∼30×) and more accurate than existing methods. We accomplish this by using a conditional random field parameterized by random forests trained on reference panels. RFMix is capable of learning from the admixed samples themselves to boost performance and autocorrect phasing errors. RFMix shows high sensitivity and specificity in simulated Hispanics/Latinos and African Americans and admixed Europeans, Africans, and Asians. Finally, we demonstrate that African Americans in HapMap contain modest (but nonzero) levels of Native American ancestry (∼0.4%). PMID:23910464
A controlled field trial of the effectiveness of phenol and alcohol typhoid vaccines*
1962-01-01
In order to determine the effectiveness of anti-typhoid vaccines in man a controlled field trial, the first of its kind, was organized in 1954-60 in the town and district of Osijek, Yugoslavia. Heat-killed, phenol-preserved and alcohol-killed, alcohol-preserved anti-typhoid monovaccines were used, with phenolized dysentery vaccine as a control. Some 36000 volunteers were allocated at random into three comparable groups: a phenol-vaccine group, an alcohol-vaccine group and a control group. Immunization consisted of a primary course of two injections. A reinforcing dose was given a year later. The effectiveness of the vaccines was measured by comparing specific morbidity in the three groups, and for greater accuracy only bacteriologically proved typhoid cases were taken into consideration in the statistical analysis. In addition, some 11 000 persons were vaccinated in 1955; these were divided at random into two groups, which received phenolized and alcoholized vaccine respectively; there was no control group. It was found that phenolized vaccine gave relatively high protection while alcoholized vaccine was of little, if any, efficacy. The efficacy of phenolized vaccine was of rather long duration. These field results obtained on man are not in agreement with the results of laboratory tests known at present. Neither laboratory assays on mice nor serological tests on man could be correlated with the results of the field trial. For this reason further studies are necessary in order to determine the value of various other types of anti-typhoid vaccines and to develop reliable laboratory tests for the measurement of the potency of anti-typhoid vaccines. PMID:20604110
NASA Astrophysics Data System (ADS)
Gjetvaj, Filip; Russian, Anna; Gouze, Philippe; Dentz, Marco
2015-10-01
Both flow field heterogeneity and mass transfer between mobile and immobile domains have been studied separately for explaining observed anomalous transport. Here we investigate non-Fickian transport using high-resolution 3-D X-ray microtomographic images of Berea sandstone containing microporous cement with pore size below the setup resolution. Transport is computed for a set of representative elementary volumes and results from advection and diffusion in the resolved macroporosity (mobile domain) and diffusion in the microporous phase (immobile domain) where the effective diffusion coefficient is calculated from the measured local porosity using a phenomenological model that includes a porosity threshold (ϕ_θ) below which diffusion is null and the exponent n that characterizes the tortuosity-porosity power-law relationship. We show that both flow field heterogeneity and microporosity trigger anomalous transport. Breakthrough curve (BTC) tailing is positively correlated to microporosity volume and mobile-immobile interface area. The sensitivity analysis showed that the BTC tailing increases with the value of ϕ_θ, due to the increase of the diffusion path tortuosity until the volume of the microporosity becomes negligible. Furthermore, increasing the value of n leads to an increase in the standard deviation of the distribution of effective diffusion coefficients, which in turn results in an increase of the BTC tailing. Finally, we propose a continuous time random walk upscaled model where the transition time is the sum of independently distributed random variables characterized by specific distributions. It allows modeling a 1-D equivalent macroscopic transport honoring both the control of the flow field heterogeneity and the multirate mass transfer between mobile and immobile domains.
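A minimal 1-D continuous time random walk sketch of the upscaling idea, assuming heavy-tailed (Pareto) waiting times as a stand-in for the mobile-immobile transition time distribution; parameters are illustrative, not calibrated to the Berea data:

```python
import numpy as np

rng = np.random.default_rng(4)
n_particles, n_jumps, alpha = 20000, 400, 1.5     # alpha < 2 gives anomalous late-time tailing

jumps = rng.normal(loc=0.05, scale=0.02, size=(n_particles, n_jumps))   # advective-dispersive steps
waits = rng.pareto(alpha, size=(n_particles, n_jumps)) + 1.0            # heavy-tailed waiting times

position = np.cumsum(jumps, axis=1)
time = np.cumsum(waits, axis=1)

# first-passage ("breakthrough") times at a control plane x = L
L = 10.0
arrived = position >= L
first = np.where(arrived.any(axis=1), arrived.argmax(axis=1), -1)
btc_times = time[np.arange(n_particles)[first >= 0], first[first >= 0]]
print("median arrival:", round(float(np.median(btc_times)), 1),
      " 99th percentile:", round(float(np.quantile(btc_times, 0.99)), 1))
```

The gap between the median and the 99th-percentile arrival times illustrates the BTC tailing that the heavy-tailed transition times produce.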
Subject-Adaptive Real-Time Sleep Stage Classification Based on Conditional Random Field
Luo, Gang; Min, Wanli
2007-01-01
Sleep staging is the pattern recognition task of classifying sleep recordings into sleep stages. This task is one of the most important steps in sleep analysis. It is crucial for the diagnosis and treatment of various sleep disorders, and also relates closely to brain-machine interfaces. We report an automatic, online sleep stager using electroencephalogram (EEG) signal based on a recently-developed statistical pattern recognition method, conditional random field, and novel potential functions that have explicit physical meanings. Using sleep recordings from human subjects, we show that the average classification accuracy of our sleep stager almost approaches the theoretical limit and is about 8% higher than that of existing systems. Moreover, for a new subject s_new with limited training data D_new, we perform subject adaptation to improve classification accuracy. Our idea is to use the knowledge learned from old subjects to obtain from D_new a regulated estimate of CRF's parameters. Using sleep recordings from human subjects, we show that even without any D_new, our sleep stager can achieve an average classification accuracy of 70% on s_new. This accuracy increases with the size of D_new and eventually becomes close to the theoretical limit. PMID:18693884
NASA Astrophysics Data System (ADS)
Guerrout, EL-Hachemi; Ait-Aoudia, Samy; Michelucci, Dominique; Mahiou, Ramdane
2018-05-01
Many routine medical examinations produce images of patients suffering from various pathologies. With the huge number of medical images, manual analysis and interpretation have become a tedious task. Thus, automatic image segmentation has become essential for diagnosis assistance. Segmentation consists in dividing the image into homogeneous and significant regions. We focus on hidden Markov random fields, referred to as HMRF, to model the segmentation problem. This modelling leads to a classical function minimisation problem. The Broyden-Fletcher-Goldfarb-Shanno algorithm, referred to as BFGS, is one of the most powerful methods for solving unconstrained optimisation problems. In this paper, we investigate the combination of HMRF and the BFGS algorithm to perform the segmentation operation. The proposed method shows very good segmentation results compared with well-known approaches. The tests are conducted on brain magnetic resonance image databases (BrainWeb and IBSR) widely used to objectively compare the results obtained. The well-known Dice coefficient (DC) was used as the similarity metric. The experimental results show that, in many cases, our proposed method approaches the perfect segmentation with a Dice coefficient above 0.9. Moreover, it generally outperforms other methods in the tests conducted.
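A minimal sketch of the optimisation step, assuming a toy continuous energy with a data-fidelity term plus an MRF-style smoothness prior, minimised with scipy's BFGS implementation; this is not the authors' HMRF energy or their brain-MRI pipeline:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
clean = np.kron(np.array([[0.0, 1.0], [1.0, 0.0]]), np.ones((8, 8)))   # toy piecewise two-class image
noisy = clean + 0.4 * rng.normal(size=clean.shape)
lam = 2.0                                                              # smoothness weight (assumed)

def energy(x_flat):
    x = x_flat.reshape(clean.shape)
    data = 0.5 * np.sum((x - noisy) ** 2)                              # data-fidelity term
    smooth = 0.5 * (np.sum(np.diff(x, axis=0) ** 2) + np.sum(np.diff(x, axis=1) ** 2))  # MRF-style prior
    return data + lam * smooth

res = minimize(energy, noisy.ravel(), method="BFGS", options={"maxiter": 200})
labels = (res.x.reshape(clean.shape) > 0.5).astype(int)                # threshold into two classes
print("final energy:", round(float(res.fun), 2),
      " mislabelled pixels:", int(np.sum(labels != clean)))
```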
Weak scattering of scalar and electromagnetic random fields
NASA Astrophysics Data System (ADS)
Tong, Zhisong
This dissertation encompasses several studies relating to the theory of weak potential scattering of scalar and electromagnetic random, wide-sense statistically stationary fields from various types of deterministic or random linear media. The proposed theory is largely based on the first Born approximation for potential scattering and on the angular spectrum representation of fields. The main focus of the scalar counterpart of the theory is made on calculation of the second-order statistics of scattered light fields in cases when the scattering medium consists of several types of discrete particles with deterministic or random potentials. It is shown that the knowledge of the correlation properties for the particles of the same and different types, described with the newly introduced pair-scattering matrix, is crucial for determining the spectral and coherence states of the scattered radiation. The approach based on the pair-scattering matrix is then used for solving an inverse problem of determining the location of an "alien" particle within the scattering collection of "normal" particles, from several measurements of the spectral density of scattered light. Weak scalar scattering of light from a particulate medium in the presence of optical turbulence existing between the scattering centers is then approached using the combination of the Born's theory for treating the light interaction with discrete particles and the Rytov's theory for light propagation in extended turbulent medium. It is demonstrated how the statistics of scattered radiation depend on scattering potentials of particles and the power spectra of the refractive index fluctuations of turbulence. This theory is of utmost importance for applications involving atmospheric and oceanic light transmission. The second part of the dissertation includes the theoretical procedure developed for predicting the second-order statistics of the electromagnetic random fields, such as polarization and linear momentum, scattered from static media. The spatial distribution of these properties of scattered fields is shown to be substantially dependent on the correlation and polarization properties of incident fields and on the statistics of the refractive index distribution within the scatterers. Further, an example is considered which illustrates the usefulness of the electromagnetic scattering theory of random fields in the case when the scattering medium is a thin bio-tissue layer with the prescribed power spectrum of the refractive index fluctuations. The polarization state of the scattered light is shown to be influenced by correlation and polarization states of the illumination as well as by the particle size distribution of the tissue slice.
NASA Astrophysics Data System (ADS)
Nuzhnaya, Tatyana; Bakic, Predrag; Kontos, Despina; Megalooikonomou, Vasileios; Ling, Haibin
2012-02-01
This work is part of our ongoing study aimed at understanding the relation between the topology of anatomical branching structures and the underlying image texture. Morphological variability of the breast ductal network is associated with subsequent development of abnormalities in patients with nipple discharge such as papilloma, breast cancer and atypia. In this work, we investigate complex dependence among ductal components to perform segmentation, the first step in analyzing the topology of ductal lobes. Our automated framework is based on incorporating a conditional random field with texture descriptors of skewness, coarseness, contrast, energy and fractal dimension. These features are selected to capture the architectural variability of the enhanced ducts by encoding spatial variations between pixel patches in the galactographic image. The segmentation algorithm was applied to a dataset of 20 x-ray galactograms obtained at the Hospital of the University of Pennsylvania. We compared the performance of the proposed approach with fully and semi-automated segmentation algorithms based on neural network classification, fuzzy-connectedness, vesselness filter and graph cuts. Global consistency error and confusion matrix analysis were used as accuracy measurements. For the proposed approach, the true positive rate was higher and the false negative rate was significantly lower compared to other fully automated methods. This indicates that segmentation based on a CRF incorporated with texture descriptors has the potential to efficiently support the analysis of the complex topology of the ducts and aid in the development of realistic breast anatomy phantoms.
Variogram methods for texture classification of atherosclerotic plaque ultrasound images
NASA Astrophysics Data System (ADS)
Jeromin, Oliver M.; Pattichis, Marios S.; Pattichis, Constantinos; Kyriacou, Efthyvoulos; Nicolaides, Andrew
2006-03-01
Stroke is the third leading cause of death in the western world and the major cause of disability in adults. The type and stenosis of extracranial carotid artery disease is often responsible for ischemic strokes, transient ischemic attacks (TIAs) or amaurosis fugax (AF). The identification and grading of stenosis can be done using gray scale ultrasound scans. The appearance of B-scan pictures containing various granular structures makes the use of texture analysis techniques suitable for computer assisted tissue characterization purposes. The objective of this study is to investigate the usefulness of variogram analysis in the assessment of ultrasound plaque morphology. The variogram estimates the variance of random fields, from arbitrary samples in space. We explore stationary random field models based on the variogram, which can be applied in ultrasound plaque imaging leading to a Computer Aided Diagnosis (CAD) system for the early detection of symptomatic atherosclerotic plaques. Non-parametric tests on the variogram coefficients show that the coefficients coming from symptomatic versus asymptomatic plaques come from distinct distributions. Furthermore, we show significant improvement in class separation, when a log point-transformation is applied to the images, prior to variogram estimation. Model fitting using least squares is explored for anisotropic variograms along specific directions. Comparative classification results show that variogram coefficients can be used for the early detection of symptomatic cases, and also exhibit the largest class distances between symptomatic and asymptomatic plaque images, as compared to over 60 other texture features used in the literature.
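A minimal sketch of an isotropic empirical variogram for an image patch, including a log point-transformation as mentioned above; the patch here is synthetic, and the study's directional/anisotropic model fitting is not reproduced:

```python
import numpy as np

def empirical_variogram(img, max_lag=20):
    """gamma(h) = 0.5 * E[(Z(x+h) - Z(x))^2], pooled over horizontal and vertical pairs."""
    img = np.log1p(img.astype(float))              # log point-transformation
    lags = np.arange(1, max_lag + 1)
    gamma = np.empty(len(lags))
    for i, h in enumerate(lags):
        dx = img[:, h:] - img[:, :-h]              # horizontal pairs at lag h
        dy = img[h:, :] - img[:-h, :]              # vertical pairs at lag h
        gamma[i] = 0.5 * np.mean(np.concatenate([dx.ravel()**2, dy.ravel()**2]))
    return lags, gamma

rng = np.random.default_rng(6)
patch = rng.gamma(shape=2.0, scale=30.0, size=(64, 64))   # stand-in speckle-like ultrasound patch
lags, gamma = empirical_variogram(patch)
print(list(zip(lags[:5].tolist(), np.round(gamma[:5], 3).tolist())))
```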
Response of space shuttle insulation panels to acoustic noise pressure
NASA Technical Reports Server (NTRS)
Vaicaitis, R.
1976-01-01
The response of reusable space shuttle insulation panels to random acoustic pressure fields is studied. The basic analytical approach in formulating the governing equations of motion uses a Rayleigh-Ritz technique. The input pressure field is modeled as a stationary Gaussian random process for which the cross-spectral density function is known empirically from experimental measurements. The response calculations are performed in both the frequency and time domains.
Sparse Forward-Backward for Fast Training of Conditional Random Fields
2006-01-01
On one task, the NetTalk text-to-speech data set [5], we can now train a conditional random field (CRF) in about 6 hours, for which training previously...
Spectral turning bands for efficient Gaussian random fields generation on GPUs and accelerators
NASA Astrophysics Data System (ADS)
Hunger, L.; Cosenza, B.; Kimeswenger, S.; Fahringer, T.
2015-11-01
A random field (RF) is a set of correlated random variables associated with different spatial locations. RF generation algorithms are of crucial importance for many scientific areas, such as astrophysics, geostatistics, computer graphics, and many others. Current approaches commonly make use of 3D fast Fourier transform (FFT), which does not scale well for RF bigger than the available memory; they are also limited to regular rectilinear meshes. We introduce random field generation with the turning band method (RAFT), an RF generation algorithm based on the turning band method that is optimized for massively parallel hardware such as GPUs and accelerators. Our algorithm replaces the 3D FFT with a lower-order, one-dimensional FFT followed by a projection step and is further optimized with loop unrolling and blocking. RAFT can easily generate RF on non-regular (non-uniform) meshes and efficiently produce fields with mesh sizes bigger than the available device memory by using a streaming, out-of-core approach. Our algorithm generates RF with the correct statistical behavior and is tested on a variety of modern hardware, such as NVIDIA Tesla, AMD FirePro and Intel Phi. RAFT is faster than the traditional methods on regular meshes and has been successfully applied to two real case scenarios: planetary nebulae and cosmological simulations.
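A minimal 2-D turning-bands sketch of the general idea; RAFT's line-process spectrum, GPU kernels, loop blocking, and out-of-core streaming are not reproduced, and the Gaussian-shaped line spectrum below is an assumption:

```python
import numpy as np

rng = np.random.default_rng(7)

def line_process(n=4096, dt=0.01, corr=0.2):
    """1-D stationary Gaussian process generated by FFT filtering of white noise."""
    freqs = np.fft.rfftfreq(n, d=dt)
    spec = np.exp(-(freqs * corr) ** 2)                                  # assumed line spectrum shape
    coef = np.sqrt(spec) * (rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size))
    z = np.fft.irfft(coef, n)
    return (z - z.mean()) / z.std(), dt

def turning_bands(points, n_lines=64):
    """Sum 1-D line processes evaluated at the projections of the points onto random directions."""
    field = np.zeros(len(points))
    for _ in range(n_lines):
        theta = rng.uniform(0, np.pi)
        u = np.array([np.cos(theta), np.sin(theta)])                     # random line direction
        z, dt = line_process()
        proj = points @ u
        idx = np.clip(((proj - proj.min()) / dt).astype(int), 0, z.size - 1)
        field += z[idx]
    return field / np.sqrt(n_lines)

xy = rng.uniform(0, 10, size=(5000, 2))      # works on irregular (non-rectilinear) point sets too
f = turning_bands(xy)
print(round(float(f.mean()), 3), round(float(f.std()), 3))
```

Because each band only requires a one-dimensional FFT and a projection, the per-point cost stays low even when the target mesh is large or non-uniform, which is the property the RAFT approach exploits.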
Vector solution for the mean electromagnetic fields in a layer of random particles
NASA Technical Reports Server (NTRS)
Lang, R. H.; Seker, S. S.; Levine, D. M.
1986-01-01
The mean electromagnetic fields are found in a layer of randomly oriented particles lying over a half space. A matrix-dyadic formulation of Maxwell's equations is employed in conjunction with the Foldy-Lax approximation to obtain equations for the mean fields. A two variable perturbation procedure, valid in the limit of small fractional volume, is then used to derive uncoupled equations for the slowly varying amplitudes of the mean wave. These equations are solved to obtain explicit expressions for the mean electromagnetic fields in the slab region in the general case of arbitrarily oriented particles and arbitrary polarization of the incident radiation. Numerical examples are given for the application to remote sensing of vegetation.
Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit
2016-02-01
The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43; minimum, 3; maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest is historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.
NASA Astrophysics Data System (ADS)
Panunzio, Alfonso M.; Puel, G.; Cottereau, R.; Simon, S.; Quost, X.
2017-03-01
This paper describes the construction of a stochastic model of urban railway track geometry irregularities, based on experimental data. The considered irregularities are track gauge, superelevation, horizontal and vertical curvatures. They are modelled as random fields whose statistical properties are extracted from a large set of on-track measurements of the geometry of an urban railway network. About 300-1000 terms are used in the Karhunen-Loève/Polynomial Chaos expansions to represent the random fields with appropriate accuracy. The construction of the random fields is then validated by comparing on-track measurements of the contact forces and numerical dynamics simulations for different operational conditions (train velocity and car load) and horizontal layouts (alignment, curve). The dynamics simulations are performed both with and without randomly generated geometrical irregularities for the track. The power spectrum densities obtained from the dynamics simulations with the model of geometrical irregularities compare extremely well with those obtained from the experimental contact forces. Without irregularities, the spectrum is 10-50 dB too low.
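A minimal Karhunen-Loève sampling sketch for a single irregularity channel (e.g., vertical curvature), assuming an exponential covariance with placeholder correlation length and variance rather than the values identified from the on-track measurements:

```python
import numpy as np

s = np.linspace(0.0, 500.0, 1001)                 # track abscissa [m]
corr_len, sigma = 25.0, 1.0                       # assumed correlation length [m] and standard deviation
C = sigma**2 * np.exp(-np.abs(s[:, None] - s[None, :]) / corr_len)   # exponential covariance matrix

eigval, eigvec = np.linalg.eigh(C)                # KL modes (returned in ascending order)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

m = int(np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.99)) + 1   # modes capturing 99% of the variance
xi = np.random.default_rng(8).normal(size=m)      # independent standard normal KL coefficients
irregularity = eigvec[:, :m] @ (np.sqrt(eigval[:m]) * xi)              # one realization of the channel
print("KL modes kept:", m, " field std:", round(float(irregularity.std()), 3))
```

Drawing new coefficient vectors xi produces new track realizations, which is the mechanism used to feed randomly generated geometrical irregularities into the dynamics simulations.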
Cheong, Jadeera Phaik Geok; Lay, Brendan; Grove, J. Robert; Medic, Nikola; Razman, Rizal
2012-01-01
To overcome the weakness of the contextual interference (CI) effect within applied settings, Brady (2008) recommended that the amount of interference be manipulated. This study investigated the effect of five practice schedules on the learning of three field hockey skills. Fifty-five pre-university students performed a total of 90 trials for each skill under blocked, mixed or random practice orders. Results showed a significant time effect with all five practice conditions leading to improvements in acquisition and learning of the skills. No significant differences were found between the groups. The findings of the present study did not support the CI effect and suggest that either blocked, mixed, or random practice schedules can be used effectively when structuring practice for beginners. Key points: The contextual interference effect did not surface when using sport skills. There appears to be no difference between blocked and random practice schedules in the learning of field hockey skills. Low (blocked), moderate (mixed) or high (random) interference practice schedules can be used effectively when conducting a multiple skill practice session for beginners. PMID:24149204
Benford's law and continuous dependent random variables
NASA Astrophysics Data System (ADS)
Becker, Thealexa; Burt, David; Corcoran, Taylor C.; Greaves-Tunnell, Alec; Iafrate, Joseph R.; Jing, Joy; Miller, Steven J.; Porfilio, Jaclyn D.; Ronan, Ryan; Samranvedhya, Jirapat; Strauch, Frederick W.; Talbut, Blaine
2018-01-01
Many mathematical, man-made and natural systems exhibit a leading-digit bias, where a first digit (base 10) of 1 occurs not 11% of the time, as one would expect if all digits were equally likely, but rather 30%. This phenomenon is known as Benford's Law. Analyzing which datasets adhere to Benford's Law and how quickly Benford behavior sets in are the two most important problems in the field. Most previous work studied systems of independent random variables, and relied on the independence in their analyses. Inspired by natural processes such as particle decay, we study the dependent random variables that emerge from models of decomposition of conserved quantities. We prove that in many instances the distribution of lengths of the resulting pieces converges to Benford behavior as the number of divisions grows, and give several conjectures for other fragmentation processes. The main difficulty is that the resulting random variables are dependent. We handle this by using tools from Fourier analysis and irrationality exponents to obtain quantified convergence rates as well as introducing and developing techniques to measure and control the dependencies. The construction of these tools is one of the major motivations of this work, as our approach can be applied to many other dependent systems. As an example, we show that the n! entries in the determinant expansions of n × n matrices with entries independently drawn from nice random variables converge to Benford's Law.
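The following is a small numerical illustration, not the paper's argument, of the fragmentation idea: a conserved unit length is split repeatedly at uniformly random points and the leading digits of the resulting piece lengths are compared with the Benford frequencies log10(1 + 1/d). The number of splitting rounds is arbitrary.

```python
# Sketch: repeatedly split a conserved length at uniform random points and
# compare the leading digits of the piece lengths with Benford's law.
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
pieces = np.array([1.0])
for _ in range(14):                       # 14 rounds of binary fragmentation
    cuts = rng.uniform(size=pieces.size)
    pieces = np.concatenate([pieces * cuts, pieces * (1.0 - cuts)])

def leading_digit(v: float) -> int:
    return int(f"{v:.12e}"[0])            # first mantissa digit in scientific notation

counts = Counter(leading_digit(p) for p in pieces)
for d in range(1, 10):
    observed = counts.get(d, 0) / pieces.size
    print(f"digit {d}: observed {observed:.3f}   Benford {np.log10(1 + 1/d):.3f}")
```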
Cavity master equation for the continuous time dynamics of discrete-spin models.
Aurell, E; Del Ferraro, G; Domínguez, E; Mulet, R
2017-05-01
We present an alternate method to close the master equation representing the continuous time dynamics of interacting Ising spins. The method makes use of the theory of random point processes to derive a master equation for local conditional probabilities. We analytically test our solution studying two known cases, the dynamics of the mean-field ferromagnet and the dynamics of the one-dimensional Ising system. We present numerical results comparing our predictions with Monte Carlo simulations in three different models on random graphs with finite connectivity: the Ising ferromagnet, the random field Ising model, and the Viana-Bray spin-glass model.
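The sketch below is not the cavity master equation; it is a minimal equilibrium comparison in the same setting, namely Metropolis dynamics for the Ising ferromagnet on a random regular graph set against the naive mean-field self-consistency m = tanh(beta*(J*c*m + h)). Graph size, connectivity and temperature are illustrative, and the networkx graph generator is used only for convenience.

```python
# Sketch: Metropolis dynamics for an Ising ferromagnet on a random regular
# graph, compared with the naive mean-field magnetization (illustrative only).
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
N, c, J, h, beta = 1000, 3, 1.0, 0.0, 0.8
G = nx.random_regular_graph(c, N, seed=1)
nbrs = [list(G.neighbors(i)) for i in range(N)]
s = rng.choice([-1, 1], size=N)

for sweep in range(300):
    for _ in range(N):
        i = rng.integers(N)
        dE = 2.0 * s[i] * (J * sum(s[j] for j in nbrs[i]) + h)
        if dE <= 0.0 or rng.random() < np.exp(-beta * dE):
            s[i] = -s[i]

m_mf = 0.5
for _ in range(200):                 # fixed point of m = tanh(beta*(J*c*m + h))
    m_mf = np.tanh(beta * (J * c * m_mf + h))

print(f"simulated |m| = {abs(s.mean()):.3f}, mean-field m = {m_mf:.3f}")
```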
Martínez-Alonso, E; Ramos, J M
2016-06-01
Randomized controlled trials (RCTs) are a key component of clinical research and provide the highest quality clinical evidence. The objective of this study was to describe the main characteristics of RCTs published in Malaria Journal, including research topics, study population and design, funding sources and collaboration between institutions. This may help researchers and funders define future research priorities in this field. A retrospective analysis was performed on the RCTs published in Malaria Journal between January 1, 2008 and December 31, 2013. A keyword search for "Randomized controlled trial" or "Random*" was carried out in PubMed. RCTs indexed in MEDLINE were selected for the analysis. A total of 108 published articles containing RCTs were analysed. Treatment of uncomplicated Plasmodium falciparum malaria (n=45, 41.6%), especially the efficacy and safety of antimalarial drugs, and malaria prevention (n=34, 31.5%) were the two main research topics. The majority of trials were conducted in Africa (62.2%) and Asia (27%) and received external funding (private, 42.3% and/or public, 38.6%). The paediatric population was the primary study group (n=63, 58.3%), followed by adults (n=29, 26.9%). Pregnant women (n=7) and the geriatric population (n=1) remain underrepresented. Nearly 75% of trials were conducted in individual subjects and 25% in groups of subjects (cluster RCTs). Considerable collaboration between researchers and institutions is noteworthy. RCTs published in Malaria Journal address a wide range of research topics. Paediatric trials conducted in Africa and Asia are frequently performed, and significant worldwide collaboration in the fight against malaria has been identified.
Nonlinear probabilistic finite element models of laminated composite shells
NASA Technical Reports Server (NTRS)
Engelstad, S. P.; Reddy, J. N.
1993-01-01
A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed and results are presented in the form of mean and variance of the structural response. The effects of material nonlinearity are included through the use of a rate-independent anisotropic plasticity formulation with the macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Experimental data is compared in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance response of the linear and nonlinear behavior of laminated composite shells.
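A minimal sketch of the first-order second-moment idea follows, applied to a stand-in scalar response (a plate bending stiffness) rather than the degenerated shell element model of the paper: the mean and covariance of the random inputs are propagated through the response via a finite-difference gradient at the mean point. All numerical values are invented for illustration.

```python
# Sketch of first-order second-moment (FOSM) propagation: estimate the mean
# and variance of a response g(x) from the mean and covariance of its inputs.
import numpy as np

def g(x):
    # stand-in response: plate bending stiffness D = E*t^3 / (12*(1 - nu^2))
    E, t, nu = x
    return E * t**3 / (12.0 * (1.0 - nu**2))

mu = np.array([70e9, 2e-3, 0.30])                              # mean inputs
cov = np.diag([(0.05 * 70e9)**2, (0.02 * 2e-3)**2, 0.01**2])   # input covariance

grad = np.zeros(3)
for i in range(3):
    dx = np.zeros(3)
    dx[i] = 1e-6 * max(abs(mu[i]), 1.0)
    grad[i] = (g(mu + dx) - g(mu - dx)) / (2.0 * dx[i])   # central difference

mean_g = g(mu)              # first-order estimate of the mean response
var_g = grad @ cov @ grad   # first-order estimate of the response variance
print(f"mean ~ {mean_g:.3e}, std ~ {np.sqrt(var_g):.3e}")
```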
Exploring and reducing stress in young restaurant workers: results of a randomized field trial.
Petree, Robyn D; Broome, Kirk M; Bennett, Joel B
2012-01-01
Young adult restaurant workers face the dual stressors of work adjustment and managing personal responsibilities. We assessed a new psychosocial/health promotion training designed to reduce these stressors in the context of restaurant work. DESIGN: A cluster-randomized trial of a training program, with surveys administered approximately 2 weeks before training and both 6 and 12 months after training. SETTING: A national restaurant chain. SUBJECTS: A total of 947 restaurant workers in 28 restaurants. MEASURES: Personal stress, exposure to problem coworkers, and personal and job characteristics. INTERVENTION: Team Resilience (TR) is an interactive program for stress management, teamwork, and work-life balance. TR focuses on "five Cs" of resilience: compassion, commitment, centering, community, and confidence. ANALYSIS: Mixed-model (multilevel) analysis of covariance. RESULTS: Compared with workers in control stores, workers in TR-trained stores showed significant reductions over time in exposure to problem coworkers (F[2, 80.60] = 4.48; p = .01) and in personal stress (F[2, 75.28] = 6.12; p = .003). CONCLUSION: The TR program may help young workers who face the challenges of emerging adulthood and work-life balance.
Grabska-Barwińska, Agnieszka; Latham, Peter E
2014-06-01
We use mean field techniques to compute the distribution of excitatory and inhibitory firing rates in large networks of randomly connected spiking quadratic integrate and fire neurons. These techniques are based on the assumption that activity is asynchronous and Poisson. For most parameter settings these assumptions are strongly violated; nevertheless, so long as the networks are not too synchronous, we find good agreement between mean field prediction and network simulations. Thus, much of the intuition developed for randomly connected networks in the asynchronous regime applies to mildly synchronous networks.
Experimental benchmarking of quantum control in zero-field nuclear magnetic resonance.
Jiang, Min; Wu, Teng; Blanchard, John W; Feng, Guanru; Peng, Xinhua; Budker, Dmitry
2018-06-01
Demonstration of coherent control and characterization of the control fidelity is important for the development of quantum architectures such as nuclear magnetic resonance (NMR). We introduce an experimental approach to realize universal quantum control, and benchmarking thereof, in zero-field NMR, an analog of conventional high-field NMR that features less-constrained spin dynamics. We design a composite pulse technique for both arbitrary one-spin rotations and a two-spin controlled-not (CNOT) gate in a heteronuclear two-spin system at zero field, which experimentally demonstrates universal quantum control in such a system. Moreover, using quantum information-inspired randomized benchmarking and partial quantum process tomography, we evaluate the quality of the control, achieving single-spin control for 13 C with an average fidelity of 0.9960(2) and two-spin control via a CNOT gate with a fidelity of 0.9877(2). Our method can also be extended to more general multispin heteronuclear systems at zero field. The realization of universal quantum control in zero-field NMR is important for quantum state/coherence preparation, pulse sequence design, and is an essential step toward applications to materials science, chemical analysis, and fundamental physics.
Pambo, Kennedy O; Okello, Julius J; Mbeche, Robert M; Kinyuru, John N
2017-01-01
This study used a field experiment and means-end chain analysis to examine the effects of positive and perceived negative nutrition information on the households' motivations to consume insect-based foods. It used a random sample of households drawn from rural communities in Kenya. The study found that provision of nutrition information on benefits of edible insects and perceived negative aspects of insect-based foods influences participants' perceptions of insect-based foods and hence acceptance. We also found that tasting real products influenced the nature of mental constructs. The results provide marketers of edible insects with potential marketing messages for promotion.
Efficacy of killed whole-parasite vaccines in the prevention of leishmaniasis: a meta-analysis.
Noazin, Sassan; Khamesipour, Ali; Moulton, Lawrence H; Tanner, Marcel; Nasseri, Kiumarss; Modabber, Farrokh; Sharifi, Iraj; Khalil, E A G; Bernal, Ivan Dario Velez; Antunes, Carlos M F; Smith, Peter G
2009-07-30
Despite decades of investigation in countries on three continents, an efficacious vaccine against Leishmania infections has not been developed. Although some indication of protection was observed in some of the controlled trials conducted with "first-generation" whole, inactivated Leishmania parasite vaccines, convincing evidence of protection was lacking. After reviewing all previously published or unpublished randomized, controlled field efficacy clinical trials of prophylactic candidate vaccines, a meta-analysis of qualified trials was conducted to evaluate whether there was some evidence of protection revealed by considering the results of all trials together. The findings indicate that the whole-parasite vaccine candidates tested do not confer significant protection against human leishmaniasis.
Scaling properties of the two-dimensional randomly stirred Navier-Stokes equation.
Mazzino, Andrea; Muratore-Ginanneschi, Paolo; Musacchio, Stefano
2007-10-05
We inquire into the scaling properties of the 2D Navier-Stokes equation sustained by a force field with Gaussian statistics, white noise in time, and with a power-law correlation in momentum space of degree 2 - 2 epsilon. This is at variance with the setting usually assumed to derive Kraichnan's classical theory. We contrast accurate numerical experiments with the different predictions provided for the small epsilon regime by Kraichnan's double cascade theory and by renormalization group analysis. We give clear evidence that for all epsilon, Kraichnan's theory is consistent with the observed phenomenology. Our results call for a revision in the renormalization group analysis of (2D) fully developed turbulence.
Kawamoto, Hirokazu; Takayasu, Hideki; Jensen, Henrik Jeldtoft; Takayasu, Misako
2015-01-01
Through precise numerical analysis, we reveal a new type of universal loopless percolation transition in randomly removed complex networks. As an example of a real-world network, we apply our analysis to a business relation network consisting of approximately 3,000,000 links among 300,000 firms and observe the transition with critical exponents close to the mean-field values taking into account the finite size effect. We focus on the largest cluster at the critical point, and introduce survival probability as a new measure characterizing the robustness of each node. We also discuss the relation between survival probability and k-shell decomposition. PMID:25885791
Theory of Dielectric Breakdown in Randomly Inhomogeneous Materials
NASA Astrophysics Data System (ADS)
Gyure, Mark Franklin
1990-01-01
Two models of dielectric breakdown in disordered metal-insulator composites have been developed in an attempt to explain in detail the greatly reduced breakdown electric field observed in these materials. The first model is a two dimensional model in which the composite is treated as a random array of conducting cylinders embedded in an otherwise uniform dielectric background. The two dimensional samples are generated by the Monte Carlo method and a discretized version of the integral form of Laplace's equation is solved to determine the electric field in each sample. Breakdown is modeled as a quasi-static process by which one breakdown at a time occurs at the point of maximum electric field in the system. A cascade of these local breakdowns leads to complete dielectric failure of the system after which the breakdown field can be determined. A second model is developed that is similar to the first in terms of breakdown dynamics, but uses coupled multipole expansions of the electrostatic potential centered at each particle to obtain a more computationally accurate and faster solution to the problem of determining the electric field at an arbitrary point in a random medium. This new algorithm allows extension of the model to three dimensions and treats conducting spherical inclusions as well as cylinders. Successful implementation of this algorithm relies on the use of analytical forms for off-centered expansions of cylindrical and spherical harmonics. Scaling arguments similar to those used in theories of phase transitions are developed for the breakdown field and these arguments are discussed in context with other theories that have been developed to explain the break-down behavior of random resistor and fuse networks. Finally, one of the scaling arguments is used to predict the breakdown field for some samples of solid fuel rocket propellant tested at the China Lake Naval Weapons Center and is found to compare quite well with the experimentally measured breakdown fields.
Visualization of Flows in Packed Beds of Twisted Tapes
NASA Technical Reports Server (NTRS)
Hendricks, R. C.; Braun, M. J.; Peloso, D.; Athavale, M. M.; Mullen, R. L.
2002-01-01
A videotape presentation of the flow field in a packed bed of 48 twisted tapes which can be simulated by very thin virtual cylinders has been assembled. The indices of refraction of the oil and the Lucite twisted tapes were closely matched, and the flow was seeded with magnesium oxide particles. Planar laser light projected the flow field in two dimensions both along and transverse to the flow axis. The flow field was three dimensional and complex to describe, yet the most prominent finding was flow threads. It appeared that axial flow spiraled along either within the confines of a virtual cylindrical boundary or within the exterior region, between the tangency points, of the virtual cylinders. Random packing and bed voids created vortices and disrupted the laminar flow but minimized the entrance effects. The flow-pressure drops in the packed bed fell below the Ergun model for porous-media flows. Single-twisted-tape results of Smithberg and Landis (1964) were used to guide the analysis. In appendix A the results of several investigators are scaled to the Ergun model. Further investigations including different geometric configurations, computational fluid dynamic (CFD) gridding, and analysis are required.
Fluid Physics in a Fluctuating Acceleration Environment
NASA Technical Reports Server (NTRS)
Thomson, J. Ross; Drolet, Francois; Vinals, Jorge
1996-01-01
We summarize several aspects of an ongoing investigation of the effects that stochastic residual accelerations (g-jitter) onboard spacecraft can have on experiments conducted in a microgravity environment. The residual acceleration field is modeled as a narrow band noise, characterized by three independent parameters: intensity (g(exp 2)), dominant angular frequency Omega, and characteristic correlation time tau. Realistic values for these parameters are obtained from an analysis of acceleration data corresponding to the SL-J mission, as recorded by the SAMS instruments. We then use the model to address the random motion of a solid particle suspended in an incompressible fluid subjected to such random accelerations. As an extension, the effect of jitter on coarsening of a solid-liquid mixture is briefly discussed, and corrections to diffusion controlled coarsening evaluated. We conclude that jitter will not be significant in the experiment 'Coarsening of solid-liquid mixtures' to be conducted in microgravity. Finally, modifications to the location of onset of instability in systems driven by a random force are discussed by extending the standard reduction to the center manifold to the stochastic case. Results pertaining to time-modulated oscillatory convection are briefly discussed.
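One common way to realize a narrow-band noise model of this kind is to drive a damped oscillator with white noise, which yields a random signal peaked at the dominant frequency with an envelope correlation time of order tau. The sketch below assumes illustrative values for these parameters; it is not calibrated to the SAMS data mentioned above.

```python
# Sketch: narrow-band random "g-jitter" generated by white noise driving a
# damped oscillator (dominant frequency Omega, correlation time tau).
import numpy as np

rng = np.random.default_rng(2)
Omega = 2 * np.pi * 10.0     # dominant angular frequency (10 Hz, assumed)
tau = 1.0                    # envelope correlation time [s] (assumed)
drive = 1e-3                 # white-noise drive amplitude (assumed)
dt, T = 1e-3, 60.0
n = int(T / dt)

g = np.zeros(n)
v = 0.0
for k in range(1, n):
    noise = drive * rng.standard_normal() / np.sqrt(dt)      # discretized white noise
    a = -(2.0 / tau) * v - Omega**2 * g[k - 1] + noise
    v += a * dt
    g[k] = g[k - 1] + v * dt

print("rms acceleration:", np.sqrt(np.mean(g**2)))
```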
The Lévy flight paradigm: random search patterns and mechanisms.
Reynolds, A M; Rhodes, C J
2009-04-01
Over recent years there has been an accumulation of evidence from a variety of experimental, theoretical, and field studies that many organisms use a movement strategy approximated by Lévy flights when they are searching for resources. Lévy flights are random movements that can maximize the efficiency of resource searches in uncertain environments. This is a highly significant finding because it suggests that Lévy flights provide a rigorous mathematical basis for separating out evolved, innate behaviors from environmental influences. We discuss recent developments in random-search theory, as well as the many different experimental and data collection initiatives that have investigated search strategies. Methods for trajectory construction and robust data analysis procedures are presented. The key to prediction and understanding does, however, lie in the elucidation of mechanisms underlying the observed patterns. We discuss candidate neurological, olfactory, and learning mechanisms for the emergence of Lévy flight patterns in some organisms, and note that convergence of behaviors along such different evolutionary pathways is not surprising given the energetic efficiencies that Lévy flight movement patterns confer.
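A minimal sketch of how Lévy-flight-like trajectories are commonly generated in simulation work of this kind: step lengths drawn from a power-law (Pareto) tail with exponent mu and uniformly random headings. The exponent, minimum step and step count below are arbitrary.

```python
# Sketch: a 2D Levy flight with power-law step lengths and random headings.
import numpy as np

rng = np.random.default_rng(3)
n_steps, mu, l_min = 10_000, 2.0, 1.0

u = rng.uniform(size=n_steps)
lengths = l_min * u ** (-1.0 / (mu - 1.0))   # P(l) ~ l**(-mu) for l >= l_min
angles = rng.uniform(0.0, 2.0 * np.pi, size=n_steps)

steps = np.column_stack([lengths * np.cos(angles), lengths * np.sin(angles)])
path = np.vstack([[0.0, 0.0], np.cumsum(steps, axis=0)])
print("net displacement after", n_steps, "steps:", np.linalg.norm(path[-1]))
```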
Rossi, Marco; Stockman, Gert-Jan; Rogier, Hendrik; Vande Ginste, Dries
2016-07-19
The efficiency of a wireless power transfer (WPT) system in the radiative near-field is inevitably affected by the variability in the design parameters of the deployed antennas and by uncertainties in their mutual position. Therefore, we propose a stochastic analysis that combines the generalized polynomial chaos (gPC) theory with an efficient model for the interaction between devices in the radiative near-field. This framework enables us to investigate the impact of random effects on the power transfer efficiency (PTE) of a WPT system. More specifically, the WPT system under study consists of a transmitting horn antenna and a receiving textile antenna operating in the Industrial, Scientific and Medical (ISM) band at 2.45 GHz. First, we model the impact of the textile antenna's variability on the WPT system. Next, we include the position uncertainties of the antennas in the analysis in order to quantify the overall variations in the PTE. The analysis is carried out by means of polynomial-chaos-based macromodels, whereas a Monte Carlo simulation validates the complete technique. It is shown that the proposed approach is very accurate, more flexible and more efficient than a straightforward Monte Carlo analysis, with demonstrated speedup factors up to 2500.
NASA Astrophysics Data System (ADS)
Sabra, K.
2006-12-01
The random nature of noise and scattered fields tends to suggest limited utility. Indeed, seismic or acoustic fields from random sources or scatterers are often considered to be incoherent, but there is some coherence between two sensors that receive signals from the same individual source or scatterer. An estimate of the Green's function (or impulse response) between two points can be obtained from the cross-correlation of random wavefields recorded at these two points. Recent theoretical and experimental studies in ultrasonics, underwater acoustics, structural monitoring and seismology have investigated this technique in various environments and frequency ranges. These results provide a means for passive imaging using only the random wavefields, without the use of active sources. The coherent wavefronts emerge from a correlation process that accumulates contributions over time from random sources whose propagation paths pass through both receivers. Results will be presented from experiments using ambient noise cross-correlations for the following applications: 1) passive surface waves tomography from ocean microseisms and 2) structural health monitoring of marine and airborne structures embedded in turbulent flow.
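The following toy sketch illustrates the cross-correlation statement under strongly simplified assumptions (a 1D lossless medium, impulsive sources, and made-up values for the wave speed, receiver spacing and source count): the correlation of the two noise records peaks at the inter-receiver travel time.

```python
# Toy sketch: two receivers record many random impulsive sources; their
# cross-correlation peaks at the inter-receiver travel time (1D, no losses).
import numpy as np
from scipy.signal import correlate

rng = np.random.default_rng(4)
c = 1500.0                      # wave speed [m/s] (illustrative)
xA, xB = 0.0, 300.0             # receiver positions [m] (illustrative)
fs, T = 100.0, 300.0            # sampling rate [Hz], record length [s]
n = int(T * fs)
recA, recB = np.zeros(n), np.zeros(n)

for _ in range(3000):           # random far-field sources on either side of the pair
    xs = rng.choice([-1.0, 1.0]) * rng.uniform(2000.0, 10000.0)
    t0, amp = rng.uniform(0.0, T), rng.standard_normal()
    for xr, rec in ((xA, recA), (xB, recB)):
        k = int(round((t0 + abs(xs - xr) / c) * fs))
        if k < n:
            rec[k] += amp

xc = correlate(recA, recB, mode="full", method="fft")
lags = np.arange(-n + 1, n) / fs
print("expected |lag|:", abs(xB - xA) / c, "s   peak |lag|:",
      abs(lags[np.argmax(np.abs(xc))]), "s")
```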
Applied mediation analyses: a review and tutorial.
Lange, Theis; Hansen, Kim Wadt; Sørensen, Rikke; Galatius, Søren
2017-01-01
In recent years, mediation analysis has emerged as a powerful tool to disentangle causal pathways from an exposure/treatment to clinically relevant outcomes. Mediation analysis has been applied in scientific fields as diverse as labour market relations and randomized clinical trials of heart disease treatments. In parallel to these applications, the underlying mathematical theory and computer tools have been refined. This combined review and tutorial will introduce the reader to modern mediation analysis including: the mathematical framework; required assumptions; and software implementation in the R package medflex. All results are illustrated using a recent study on the causal pathways stemming from the early invasive treatment of acute coronary syndrome, for which the rich Danish population registers allow us to follow patients' medication use and more after being discharged from hospital.
Volpe, Giorgio; Volpe, Giovanni; Gigan, Sylvain
2014-01-01
The motion of particles in random potentials occurs in several natural phenomena ranging from the mobility of organelles within a biological cell to the diffusion of stars within a galaxy. A Brownian particle moving in the random optical potential associated to a speckle pattern, i.e., a complex interference pattern generated by the scattering of coherent light by a random medium, provides an ideal model system to study such phenomena. Here, we derive a theory for the motion of a Brownian particle in a speckle field and, in particular, we identify its universal characteristic timescale. Based on this theoretical insight, we show how speckle light fields can be used to control the anomalous diffusion of a Brownian particle and to perform some basic optical manipulation tasks such as guiding and sorting. Our results might broaden the perspectives of optical manipulation for real-life applications. PMID:24496461
Role of random electric fields in relaxors
Phelan, Daniel; Stock, Christopher; Rodriguez-Rivera, Jose A.; Chi, Songxue; Leão, Juscelino; Long, Xifa; Xie, Yujuan; Bokov, Alexei A.; Ye, Zuo-Guang; Ganesh, Panchapakesan; Gehring, Peter M.
2014-01-01
PbZr1–xTixO3 (PZT) and Pb(Mg1/3Nb2/3)1–xTixO3 (PMN-xPT) are complex lead-oxide perovskites that display exceptional piezoelectric properties for pseudorhombohedral compositions near a tetragonal phase boundary. In PZT these compositions are ferroelectrics, but in PMN-xPT they are relaxors because the dielectric permittivity is frequency dependent and exhibits non-Arrhenius behavior. We show that the nanoscale structure unique to PMN-xPT and other lead-oxide perovskite relaxors is absent in PZT and correlates with a greater than 100% enhancement of the longitudinal piezoelectric coefficient in PMN-xPT relative to that in PZT. By comparing dielectric, structural, lattice dynamical, and piezoelectric measurements on PZT and PMN-xPT, two nearly identical compounds that represent weak and strong random electric field limits, we show that quenched (static) random fields establish the relaxor phase and identify the order parameter. PMID:24449912
Linear velocity fields in non-Gaussian models for large-scale structure
NASA Technical Reports Server (NTRS)
Scherrer, Robert J.
1992-01-01
Linear velocity fields in two types of physically motivated non-Gaussian models are examined for large-scale structure: seed models, in which the density field is a convolution of a density profile with a distribution of points, and local non-Gaussian fields, derived from a local nonlinear transformation on a Gaussian field. The distribution of a single component of the velocity is derived for seed models with randomly distributed seeds, and these results are applied to the seeded hot dark matter model and the global texture model with cold dark matter. An expression for the distribution of a single component of the velocity in arbitrary local non-Gaussian models is given, and these results are applied to such fields with chi-squared and lognormal distributions. It is shown that all seed models with randomly distributed seeds and all local non-Gaussian models have single-component velocity distributions with positive kurtosis.
Munhoz, R E F; Prioli, A J; Amaral, A T; Scapim, C A; Simon, G A
2009-08-11
Diallel analysis was used to obtain information on combining ability, heterosis, estimates of genetic distances by random amplified polymorphic DNA (RAPD) and on their correlations with heterosis, for the popcorn varieties RS 20, UNB2, CMS 43, CMS 42, Zélia, UEM J1, UEM M2, Beija-Flor, and Viçosa, which were crossed to obtain all possible combinations, without reciprocals. The genitors and the 36 F(1) hybrids were evaluated in field trials in Maringá during two growing seasons in a randomized complete block design with three replications. Based on the results, strategies for further studies were developed, including the construction of composites by joining varieties with high general combining ability for grain yield (UNB2 and CMS 42) with those with high general combining ability for popping expansion (Zélia, RS 20 and UEM M2). Based on the RAPD markers, UEM J1 and Zélia were the most genetically distant and RS 20 and UNB2 were the most similar. The low correlation between heterosis and genetic distances may be explained by the random dispersion of the RAPD markers, which were insufficient for the exploitation of the popcorn genome. We concluded that an association between genetic dissimilarity and heterosis based only on genetic distance is not expected without considering the effect of dominant loci.
Spatial statistical analysis of basal stem root disease under natural field epidemic of oil palm
NASA Astrophysics Data System (ADS)
Kamu, Assis; Phin, Chong Khim; Seman, Idris Abu; Wan, Hoong Hak; Mun, Ho Chong
2015-02-01
Oil palm, scientifically known as Elaeis guineensis Jacq., is the most important commodity crop in Malaysia and has greatly contributed to the economic growth of the country. As far as disease is concerned in the industry, Basal Stem Rot (BSR) caused by Ganoderma boninense remains the most important disease. BSR is the most widely studied oil palm disease in Malaysia, with the most information available. However, there are still limited studies on the spatial as well as temporal pattern or distribution of the disease, especially under natural field epidemic conditions in oil palm plantations. The objective of this study is to spatially identify the pattern of BSR disease under a natural field epidemic using two geospatial analytical techniques: quadrat analysis for the first-order properties of point pattern analysis, and nearest-neighbor analysis (NNA) for the second-order properties of point pattern analysis. Two study sites were selected with trees of different ages. Both sites are located in Tawau, Sabah and managed by the same company. The results showed that at least one of the point pattern analyses used, namely NNA (i.e. the second-order properties of point pattern analysis), confirmed that the disease distribution exhibits complete spatial randomness. This suggests that the spread of the disease is not from tree to tree and that the age of the palms does not play a significant role in determining the spatial pattern of the disease. Knowledge of the spatial pattern of the disease would help the disease management program and the industry in the future. Statistical modelling is expected to help in identifying the right model to estimate the yield loss of oil palm due to BSR disease in the future.
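A minimal sketch of a nearest-neighbour (Clark-Evans) test for complete spatial randomness, the kind of second-order check referred to above; the simulated coordinates stand in for mapped locations of infected palms, and edge effects are ignored.

```python
# Sketch: Clark-Evans nearest-neighbour test for complete spatial randomness
# (CSR). Coordinates are simulated stand-ins; edge corrections are omitted.
import numpy as np
from scipy.spatial import cKDTree
from scipy.stats import norm

rng = np.random.default_rng(5)
side = 100.0                                  # square plot side [m] (assumed)
pts = rng.uniform(0.0, side, size=(200, 2))   # stand-in locations of infected palms

tree = cKDTree(pts)
d, _ = tree.query(pts, k=2)                   # column 0 is each point itself
r_obs = d[:, 1].mean()                        # observed mean NN distance

n = len(pts)
lam = n / side**2                             # point density
r_exp = 1.0 / (2.0 * np.sqrt(lam))            # expected mean NN distance under CSR
se = 0.26136 / np.sqrt(n * lam)               # standard error of the mean NN distance
z = (r_obs - r_exp) / se
p = 2.0 * (1.0 - norm.cdf(abs(z)))
print(f"R = {r_obs / r_exp:.3f}, z = {z:.2f}, p = {p:.3f}  (R near 1 suggests CSR)")
```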
Inverse random source scattering for the Helmholtz equation in inhomogeneous media
NASA Astrophysics Data System (ADS)
Li, Ming; Chen, Chuchu; Li, Peijun
2018-01-01
This paper is concerned with an inverse random source scattering problem in an inhomogeneous background medium. The wave propagation is modeled by the stochastic Helmholtz equation with the source driven by additive white noise. The goal is to reconstruct the statistical properties of the random source such as the mean and variance from the boundary measurement of the radiated random wave field at multiple frequencies. Both the direct and inverse problems are considered. We show that the direct problem has a unique mild solution by a constructive proof. For the inverse problem, we derive Fredholm integral equations, which connect the boundary measurement of the radiated wave field with the unknown source function. A regularized block Kaczmarz method is developed to solve the ill-posed integral equations. Numerical experiments are included to demonstrate the effectiveness of the proposed method.
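As an illustration of the regularization strategy mentioned above, the sketch below runs a randomized, regularized Kaczmarz iteration on a synthetic smoothing-kernel system; it is a single-row variant rather than the paper's block method, and the kernel, noise level and damping parameter are invented.

```python
# Sketch: regularized randomized Kaczmarz iterations for an ill-posed linear
# system A x = b built from a synthetic smoothing kernel and noisy data.
import numpy as np

rng = np.random.default_rng(6)
m_rows, n_cols = 200, 100
s = np.linspace(0.0, 1.0, m_rows)
t = np.linspace(0.0, 1.0, n_cols)
A = np.exp(-50.0 * (s[:, None] - t[None, :])**2)       # smoothing forward operator
x_true = np.sin(2 * np.pi * t)
b = A @ x_true + 0.01 * rng.standard_normal(m_rows)    # noisy measurements

lam = 1e-2                 # small damping term (regularization parameter)
x = np.zeros(n_cols)
for _ in range(20000):
    i = rng.integers(m_rows)
    a = A[i]
    x += (b[i] - a @ x) / (a @ a + lam) * a            # damped row projection

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```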
Effects of field variables on infield biomass bales aggregation strategies
USDA-ARS?s Scientific Manuscript database
Infield aggregation of bales, an essential logistics operation of clearing the field for subsequent cropping, is influenced by several field variables, such as field shape, area, randomness on bale layout, biomass yield per unit area, bale row spacing, number of bales handled simultaneously, collect...
Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F
2013-08-01
To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTO's). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTO's. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTO's are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches - for example, analysis of variance (ANOVA) - are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and to assess whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in field testing.
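A minimal sketch of the review's point about count data: fit a Poisson GLM, rather than an ANOVA, to nontarget-organism counts from a randomized block layout. The data, variable names and effect sizes below are simulated for illustration only, and pandas/statsmodels are assumed to be available.

```python
# Sketch: Poisson GLM for simulated nontarget-organism counts from a
# randomized block trial (GM vs non-GM treatment, 8 blocks).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
treatments = np.repeat(["GM", "non_GM"], 40)
blocks = np.tile(np.arange(8), 10)
lam = np.where(treatments == "GM", 5.0, 6.0) * np.exp(0.1 * (blocks - blocks.mean()))
counts = rng.poisson(lam)                      # simulated count data

df = pd.DataFrame({"count": counts,
                   "treatment": treatments,
                   "block": blocks.astype(str)})
fit = smf.glm("count ~ treatment + block", data=df,
              family=sm.families.Poisson()).fit()
print(fit.summary())
```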
NASA Astrophysics Data System (ADS)
Bera, Anindita; Rakshit, Debraj; SenDe, Aditi; Sen, Ujjwal
2017-06-01
We investigate equilibrium statistical properties of the isotropic quantum XY spin-1/2 model in an external magnetic field when the interaction and field parts are subjected to quenched or annealed disorder or both. The randomness present in the system are termed annealed or quenched depending on the relation between two different time scales—the time scale associated with the equilibration of the randomness and the time of observation. Within a mean-field framework, we study the effects of disorders on spontaneous magnetization, both by perturbative and numerical techniques. Our primary interest is to understand the differences between quenched and annealed cases, and also to investigate the interplay when both of them are present in a system. We find that the magnetization survives in the presence of a unidirectional random field, irrespective of its nature, i.e., whether it is quenched or annealed. However, the field breaks the circular symmetry of the magnetization, and the system magnetizes in specific directions, parallel or transverse to the applied magnetic field. Interestingly, while the transverse magnetization is affected by the annealed disordered field, the parallel one remains unfazed by the same. Moreover, the annealed disorder present in the interaction term does not affect the system's spontaneous magnetization and the corresponding critical temperature, irrespective of the presence or absence of quenched or annealed disorder in the field term. We carry out a comparative study of these and all other different combinations of the disorders in the interaction and field terms, and point out their generic features.
A probabilistic Hu-Washizu variational principle
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Besterfield, G. H.
1987-01-01
A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
New poly(butylene succinate)/layered silicate nanocomposites: preparation and mechanical properties.
Ray, Suprakas Sinha; Okamoto, Kazuaki; Maiti, Pralay; Okamoto, Masami
2002-04-01
New poly(butylene succinate) (PBS)/layered silicate nanocomposites have been successfully prepared by simple melt extrusion of PBS and octadecylammonium modified montmorillonite (C18-mmt) at 150 degrees C. The d-spacing of both C18-mmt and intercalated nanocomposites was investigated by wide-angle X-ray diffraction analysis. Bright-field transmission electron microscopic study showed several stacked silicate layers with random orientation in the PBS matrix. The intercalated nanocomposites exhibited remarkable improvement of mechanical properties in both solid and melt states as compared with that of PBS matrix without clay.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morton, April M; Piburn, Jesse O; McManamay, Ryan A
2017-01-01
Monte Carlo simulation is a popular numerical experimentation technique used in a range of scientific fields to obtain the statistics of unknown random output variables. Despite its widespread applicability, it can be difficult to infer required input probability distributions when they are related to population counts unknown at desired spatial resolutions. To overcome this challenge, we propose a framework that uses a dasymetric model to infer the probability distributions needed for a specific class of Monte Carlo simulations which depend on population counts.
Microscopic Lagrangian description of warm plasmas. III - Nonlinear wave-particle interaction
NASA Technical Reports Server (NTRS)
Galloway, J. J.; Crawford, F. W.
1977-01-01
The averaged-Lagrangian method is applied to nonlinear wave-particle interactions in an infinite, homogeneous, magnetic-field-free plasma. The specific example of Langmuir waves is considered, and the combined effects of four-wave interactions and wave-particle interactions are treated. It is demonstrated how the latter lead to diffusion in velocity space, and the quasilinear diffusion equation is derived. The analysis is generalized to the random phase approximation. The paper concludes with a summary of the method as applied in Parts 1-3 of the paper.
Local properties of the large-scale peaks of the CMB temperature
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marcos-Caballero, A.; Martínez-González, E.; Vielva, P., E-mail: marcos@ifca.unican.es, E-mail: martinez@ifca.unican.es, E-mail: vielva@ifca.unican.es
2017-05-01
In the present work, we study the largest structures of the CMB temperature measured by Planck in terms of the most prominent peaks on the sky, which, in particular, are located in the southern galactic hemisphere. Besides these large-scale features, the well-known Cold Spot anomaly is included in the analysis. All these peaks would contribute significantly to some of the CMB large-scale anomalies, as the parity and hemispherical asymmetries, the dipole modulation, the alignment between the quadrupole and the octopole, or in the case of the Cold Spot, to the non-Gaussianity of the field. The analysis of the peaks is performed by using their multipolar profiles, which characterize the local shape of the peaks in terms of the discrete Fourier transform of the azimuthal angle. In order to quantify the local anisotropy of the peaks, the distribution of the phases of the multipolar profiles is studied by using the Rayleigh random walk methodology. Finally, a direct analysis of the 2-dimensional field around the peaks is performed in order to take into account the effect of the galactic mask. The results of the analysis conclude that, once the peak amplitude and its first and second order derivatives at the centre are conditioned, the rest of the field is compatible with the standard model. In particular, it is observed that the Cold Spot anomaly is caused by the large value of curvature at the centre.
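A minimal sketch of a Rayleigh test for uniformity of phases, the kind of statistic used above to quantify the local anisotropy of the peaks; the input phases here are random stand-ins, whereas in the analysis they would come from the multipolar profiles of each peak.

```python
# Sketch: Rayleigh test for uniformity of a sample of phases on the circle.
import numpy as np

rng = np.random.default_rng(8)
phases = rng.uniform(0.0, 2.0 * np.pi, size=50)   # stand-in phase sample

n = len(phases)
R_bar = np.abs(np.exp(1j * phases).mean())        # mean resultant length
Z = n * R_bar**2                                  # Rayleigh test statistic
# first-order approximation to the Rayleigh p-value, clipped to [0, 1]
p = float(np.clip(np.exp(-Z) * (1.0 + (2.0 * Z - Z**2) / (4.0 * n)), 0.0, 1.0))
print(f"R_bar = {R_bar:.3f}, p = {p:.3f}  (small p suggests anisotropic phases)")
```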
NASA Astrophysics Data System (ADS)
Hu, Zhaoying; Tulevski, George S.; Hannon, James B.; Afzali, Ali; Liehr, Michael; Park, Hongsik
2015-06-01
Carbon nanotubes (CNTs) have been widely studied as a channel material of scaled transistors for high-speed and low-power logic applications. In order to have sufficient drive current, it is widely assumed that CNT-based logic devices will have multiple CNTs in each channel. Understanding the effects of the number of CNTs on device performance can aid in the design of CNT field-effect transistors (CNTFETs). We have fabricated multi-CNT-channel CNTFETs with an 80-nm channel length using precise self-assembly methods. We describe compact statistical models and Monte Carlo simulations to analyze failure probability and the variability of the on-state current and threshold voltage. The results show that multichannel CNTFETs are more resilient to process variation and random environmental fluctuations than single-CNT devices.
Wavelet synthetic method for turbulent flow.
Zhou, Long; Rauh, Cornelia; Delgado, Antonio
2015-07-01
Based on the idea of random cascades on wavelet dyadic trees and the energy cascade model known as the wavelet p model, a series of velocity increments in two-dimensional space are constructed at different levels of scale. The dynamics is imposed on the generated scales by solving the Euler equation in the Lagrangian framework. A dissipation model is used in order to compensate for the limitation of the p model, which is only predictive in the inertial range. Wavelet reconstruction as well as multiresolution analysis are then performed at each scale. As a result, a type of isotropic velocity field is created. The statistical properties show that the constructed velocity fields share many important features with real turbulence. The pertinence of this approach for the prediction of flow intermittency is also discussed.
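The sketch below generates the intermittent energy field at the heart of such a construction, a random multiplicative (p-model) cascade on dyadic intervals; it omits the wavelet reconstruction of velocity increments and the Lagrangian dynamics, and the cascade parameter and depth are illustrative.

```python
# Sketch: random multiplicative cascade (p model) on dyadic intervals; the
# resulting energy density is strongly intermittent but conserves total energy.
import numpy as np

rng = np.random.default_rng(9)
p, levels = 0.7, 12               # cascade parameter and number of dyadic levels

eps = np.array([1.0])             # energy density on the unit interval
for _ in range(levels):
    left = np.where(rng.random(eps.size) < 0.5, p, 1.0 - p)   # random fraction to the left half
    children = np.empty(2 * eps.size)
    children[0::2] = eps * left * 2.0          # factor 2: density on half the length
    children[1::2] = eps * (1.0 - left) * 2.0
    eps = children

print("mean density:", eps.mean(), "  max/mean (intermittency):", eps.max() / eps.mean())
```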
DNA based random key generation and management for OTP encryption.
Zhang, Yunpeng; Liu, Xin; Sun, Manhui
2017-09-01
One-time pad (OTP) is a principle of key generation applied to stream ciphering that offers total privacy. The OTP encryption scheme has proved to be unbreakable in theory, but difficult to realize in practical applications. Because OTP encryption specifically requires absolute randomness of the key, its development has been hampered by strict constraints. DNA cryptography is a new and promising technology in the field of information security. The storage capability of DNA chromosomes can be used to build one-time pad structures, with pseudo-random number generation and indexing used to encrypt the plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, by using restriction enzymes known only to the sender and receiver to combine the secure key, represented by a DNA sequence, with the T vector, we generate a DNA bio-hiding secure key and then place the recombinant plasmid in bacteria for secure key transmission. The designed biological experiments and simulation results show that the security of key transmission is further improved and the environmental requirements of key transmission are reduced. Analysis has demonstrated that the proposed DNA-based random key generation and management solutions are marked by high security and usability.
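Independently of the DNA-based key transport proposed in the paper, the one-time-pad principle itself is compact enough to sketch: a truly random key as long as the message, used once, combined with the plaintext by XOR.

```python
# Sketch: one-time-pad encryption and decryption by XOR with a random key
# of the same length as the message (the key must never be reused).
import os

def otp_xor(data: bytes, key: bytes) -> bytes:
    if len(key) != len(data):
        raise ValueError("one-time pad key must match the message length")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at the usual place"
key = os.urandom(len(message))       # in the paper, the key material is carried by DNA
ciphertext = otp_xor(message, key)
assert otp_xor(ciphertext, key) == message   # XOR with the same key decrypts
print(ciphertext.hex())
```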
Collision Models for Particle Orbit Code on SSX
NASA Astrophysics Data System (ADS)
Fisher, M. W.; Dandurand, D.; Gray, T.; Brown, M. R.; Lukin, V. S.
2011-10-01
Coulomb collision models are being developed and incorporated into the Hamiltonian particle pushing code (PPC) for applications to the Swarthmore Spheromak eXperiment (SSX). A Monte Carlo model based on that of Takizuka and Abe [JCP 25, 205 (1977)] performs binary collisions between test particles and thermal plasma field particles randomly drawn from a stationary Maxwellian distribution. A field-based electrostatic fluctuation model scatters particles from a spatially uniform random distribution of positive and negative spherical potentials generated throughout the plasma volume. The number, radii, and amplitude of these potentials are chosen to mimic the correct particle diffusion statistics without the use of random particle draws or collision frequencies. An electromagnetic fluctuating field model will be presented, if available. These numerical collision models will be benchmarked against known analytical solutions, including beam diffusion rates and Spitzer resistivity, as well as each other. The resulting collisional particle orbit models will be used to simulate particle collection with electrostatic probes in the SSX wind tunnel, as well as particle confinement in typical SSX fields. This work has been supported by US DOE, NSF and ONR.
Wen, J. -J.; Koohpayeh, S. M.; Ross, K. A.; ...
2017-03-08
Inelastic neutron scattering reveals a broad continuum of excitations in Pr2Zr2O7, the temperature and magnetic field dependence of which indicate a continuous distribution of quenched transverse fields (Δ) acting on the non-Kramers Pr3+ crystal field ground state doublets. Spin-ice correlations are apparent within 0.2 meV of the Zeeman energy. In a random phase approximation an excellent account of the data is provided and contains a transverse field distribution ρ(Δ) ∝ (Δ² + Γ²)⁻¹, where Γ = 0.27(1) meV. Established during high temperature synthesis due to an underlying structural instability, it appears disorder in Pr2Zr2O7 actually induces a quantum spin liquid.
1988-04-15
[Fragmentary OCR text from a report on photospheric flow fields on the solar surface; legible phrases note that granules typically last 10-15 minutes, that measurements of the divergence of the SOUP flow field must be made in a correspondingly short time, and that random-walk diffusion of the magnetic field is discussed in connection with the magnetograms.]
Sun, Min; Zhang, Zhi-Qiang; Ma, Chi-Yuan; Chen, Sui-Hua; Chen, Xin-Jian
2017-01-01
To determine the dominant predictive factors of postoperative visual recovery for patients with pituitary adenoma. PubMed, Google Scholar, Web of Science and Cochrane Library were searched for relevant human studies, which investigated the prediction of the postoperative visual recovery of patients with pituitary adenoma, from January 2000 to May 2017. Meta-analyses were performed on the primary outcomes. After the related data were extracted by two independent investigators, pooled weighted mean difference (WMD) and odds ratio (OR) with 95% confidence interval (CI) were estimated using a random-effects or a fixed-effects model. Nineteen studies were included in the literature review, and nine trials were included in the Meta-analysis, which comprised 530 patients (975 eyes) with pituitary adenoma. For the primary outcomes, there was a significant difference between preoperative and postoperative mean deviation (MD) values of the visual field (WMD -5.85; 95%CI: -8.19 to -3.51; P <0.00001). Predictive characteristics of four factors were revealed in this Meta-analysis by assigning the patients to sufficient and insufficient groups according to postoperative visual field improvements, including preoperative visual field defect (WMD 10.09; 95%CI: 6.17 to 14.02; P <0.00001), patient age (WMD -12.32; 95%CI: -18.42 to -6.22; P <0.0001), symptom duration (WMD -5.04; 95%CI: -9.71 to -0.37; P =0.03), and preoperative peripapillary retinal nerve fiber layer (pRNFL) thickness (OR 0.1; 95% CI: 0.04 to 0.23; P <0.00001). Preoperative visual field defect, symptom duration, patient age, and preoperative pRNFL thickness are the dominant predictive factors of the postoperative recovery of the visual field for patients with pituitary adenoma.
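A minimal sketch of DerSimonian-Laird random-effects pooling of mean differences, the kind of model behind the pooled WMD values quoted above; the study-level estimates and standard errors are invented for illustration.

```python
# Sketch: DerSimonian-Laird random-effects pooling of study mean differences.
import numpy as np
from scipy.stats import norm

md = np.array([-4.0, -7.5, -5.2, -6.8])     # per-study mean differences (invented)
se = np.array([1.2, 2.0, 1.5, 2.4])         # their standard errors (invented)

w = 1.0 / se**2                             # fixed-effect weights
mu_fe = np.sum(w * md) / w.sum()
q = np.sum(w * (md - mu_fe)**2)             # Cochran's Q
df = len(md) - 1
c = w.sum() - np.sum(w**2) / w.sum()
tau2 = max(0.0, (q - df) / c)               # between-study variance estimate

w_re = 1.0 / (se**2 + tau2)                 # random-effects weights
pooled = np.sum(w_re * md) / w_re.sum()
se_pooled = np.sqrt(1.0 / w_re.sum())
ci = pooled + np.array([-1.96, 1.96]) * se_pooled
p = 2.0 * (1.0 - norm.cdf(abs(pooled / se_pooled)))
print(f"pooled WMD = {pooled:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f}), p = {p:.4f}")
```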
Zhang, J; Chen, X; Zhu, Q; Cui, J; Cao, L; Su, J
2016-11-01
In recent years, the number of randomized controlled trials (RCTs) in the field of orthopaedics has been increasing in Mainland China. However, RCTs are prone to bias if they lack methodological quality. We therefore performed a survey of RCTs to assess: (1) the quality of RCTs in the field of orthopaedics in Mainland China; and (2) whether there is a difference between the core Chinese orthopaedic journals and Orthopaedics Traumatology Surgery & Research (OTSR). This research aimed to evaluate the methodological reporting quality, according to the CONSORT statement, of RCTs in seven key orthopaedic journals published in Mainland China over the 5 years from 2010 to 2014. All of the articles were hand-searched in the Chongqing VIP database between 2010 and 2014. Studies were considered eligible if the words "random", "randomly", "randomization" or "randomized" were employed to describe the method of allocation. Trials involving animals or cadavers, trials published as abstracts or case reports, trials dealing with subgroup analyses, and trials without reported outcomes were excluded. In addition, eight articles selected from Orthopaedics Traumatology Surgery & Research (OTSR) between 2010 and 2014 were included in this study for comparison. The identified RCTs were analyzed using a modified version of the Consolidated Standards of Reporting Trials (CONSORT) statement, covering sample size calculation, allocation sequence generation, allocation concealment, blinding and handling of dropouts. A total of 222 RCTs were identified in the seven core orthopaedic journals. No trials reported adequate sample size calculation, 74 (33.4%) reported adequate allocation generation, 8 (3.7%) trials reported adequate allocation concealment, 18 (8.1%) trials reported adequate blinding and 16 (7.2%) trials reported handling of dropouts. In OTSR, 1 (12.5%) trial reported adequate sample size calculation, 4 (50.0%) reported adequate allocation generation, 1 (12.5%) trial reported adequate allocation concealment, 2 (25.0%) trials reported adequate blinding and 5 (62.5%) trials reported handling of dropouts. There were statistically significant differences in sample size calculation and handling of dropouts between papers from Mainland China and OTSR (P<0.05). The findings of this study show that the methodological reporting quality of RCTs in the seven core orthopaedic journals from Mainland China is far from satisfactory and needs further improvement to meet the standards of the CONSORT statement. Level of evidence: Level III, case-control study.
Zero field reversal probability in thermally assisted magnetization reversal
NASA Astrophysics Data System (ADS)
Prasetya, E. B.; Utari; Purnama, B.
2017-11-01
This paper discusses the zero-field reversal probability in thermally assisted magnetization reversal (TAMR). The appearance of a reversal probability at zero field is investigated through micromagnetic simulation by solving the stochastic Landau-Lifshitz-Gilbert (LLG) equation. A perpendicular-anisotropy magnetic dot of 50×50×20 nm³ is considered as a single cell of magnetic storage in magnetic random access memory (MRAM). Thermally assisted magnetization reversal was performed by cooling during the writing process from near the Curie point to room temperature, over 20 runs with different randomly magnetized initial states. The results show that the reversal probability under zero magnetic field decreases as the energy barrier increases. A zero-field switching probability of 55% was attained for an energy barrier of 60 k_B T, corresponding to a switching field of 150 Oe, and the reversal probability became zero at an energy barrier of 2348 k_B T.
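The following Python sketch illustrates the kind of stochastic simulation involved, using a single macrospin integrated with a stochastic Landau-Lifshitz-Gilbert step at zero applied field; it is not the full micromagnetic model of the paper, and the thermal-field amplitude is treated as a free parameter rather than derived from the fluctuation-dissipation relation.

import numpy as np

def reversal_probability(runs=20, steps=20000, dt=1e-13,
                         alpha=0.1, gamma=1.76e11,   # gyromagnetic ratio, rad/(s*T)
                         Hk=0.05, Hth_sigma=0.02):   # anisotropy / thermal field scale, T
    """Fraction of runs in which a single macrospin, started near +z, ends with
    m_z < 0 at zero applied field, under random thermal kicks only."""
    rng = np.random.default_rng(0)
    reversed_count = 0
    for _ in range(runs):
        m = np.array([0.05, 0.0, 1.0])
        m /= np.linalg.norm(m)
        for _ in range(steps):
            # effective field: uniaxial anisotropy along z plus a random thermal field
            h = np.array([0.0, 0.0, Hk * m[2]]) + Hth_sigma * rng.standard_normal(3)
            prec = np.cross(m, h)
            damp = np.cross(m, prec)
            m = m - (gamma / (1 + alpha**2)) * (prec + alpha * damp) * dt
            m /= np.linalg.norm(m)   # keep |m| = 1
        if m[2] < 0:
            reversed_count += 1
    return reversed_count / runs

print("estimated zero-field reversal probability:", reversal_probability())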
The random energy model in a magnetic field and joint source channel coding
NASA Astrophysics Data System (ADS)
Merhav, Neri
2008-09-01
We demonstrate that there is an intimate relationship between the magnetic properties of Derrida’s random energy model (REM) of spin glasses and the problem of joint source-channel coding in Information Theory. In particular, typical patterns of erroneously decoded messages in the coding problem have “magnetization” properties that are analogous to those of the REM in certain phases, where the non-uniformity of the distribution of the source in the coding problem plays the role of an external magnetic field applied to the REM. We also relate the ensemble performance (random coding exponents) of joint source-channel codes to the free energy of the REM in its different phases.
Mass media influence spreading in social networks with community structure
NASA Astrophysics Data System (ADS)
Candia, Julián; Mazzitello, Karina I.
2008-07-01
We study an extension of Axelrod's model for social influence, in which cultural drift is represented as random perturbations, while mass media are introduced by means of an external field. In this scenario, we investigate how the modular structure of social networks affects the propagation of mass media messages across a society. The community structure of social networks is represented by coupled random networks, in which two random graphs are connected by intercommunity links. Considering inhomogeneous mass media fields, we study the conditions for successful message spreading and find a novel phase diagram in the multidimensional parameter space. These findings show that social modularity effects are of paramount importance for designing successful, cost-effective advertising campaigns.
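A minimal Python sketch of the kind of dynamics described, assuming two randomly wired communities joined by a few intercommunity links, a constant mass media vector applied with probability B, and the standard Axelrod interaction rule; cultural drift and the paper's actual parameter choices are omitted.

import random

F, Q = 5, 10                      # cultural features and traits per feature
N, P_IN, N_INTER = 50, 0.1, 20    # nodes per community, intra-community edge prob, bridges
B = 0.1                           # probability of interacting with the mass media field

random.seed(1)
media = [0] * F                   # external "mass media" cultural vector
culture = {i: [random.randrange(Q) for _ in range(F)] for i in range(2 * N)}

# two random communities joined by a few intercommunity links
edges = [(i, j) for c in (0, N) for i in range(c, c + N)
         for j in range(i + 1, c + N) if random.random() < P_IN]
edges += [(random.randrange(0, N), random.randrange(N, 2 * N)) for _ in range(N_INTER)]
neigh = {i: [] for i in range(2 * N)}
for a, b in edges:
    neigh[a].append(b)
    neigh[b].append(a)

def step():
    i = random.randrange(2 * N)
    partner = media if random.random() < B or not neigh[i] else culture[random.choice(neigh[i])]
    overlap = sum(a == b for a, b in zip(culture[i], partner))
    if 0 < overlap < F and random.random() < overlap / F:   # Axelrod interaction rule
        k = random.choice([f for f in range(F) if culture[i][f] != partner[f]])
        culture[i][k] = partner[k]

for _ in range(200000):
    step()
adopted = sum(culture[i] == media for i in range(2 * N))
print(f"agents fully aligned with the media message: {adopted}/{2 * N}")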
The Role of Prosocialness and Trust in the Consumption of Water as a Limited Resource.
Cuadrado, Esther; Tabernero, Carmen; García, Rocío; Luque, Bárbara; Seibert, Jan
2017-01-01
This research analyzes the role of prosocialness and trust in the use of water as a limited resource under situations of competition or cooperation. For this purpose, 107 participants played the role of farmers and made decisions about irrigating their fields in the web-based multiplayer game Irrigania. Before the simulation exercise, participants' prosocialness and trust levels were evaluated and they were randomly assigned to an experimental condition (competition or cooperation). Repeated measures analysis, using the 10 fields and the experimental conditions as factors, showed that, in the cooperation condition, farmers and their villages used a less selfish strategy to cultivate their fields, which produced greater benefits. Under competition, benefits to farmers and their villages were reduced over time. Mediational analysis shows that the selfish irrigation strategy fully mediated the relationship between prosocialness and accumulated profits; prosocial individuals choose less selfish irrigation strategies and, in turn, accumulated more benefit. Moreover, moderation analysis shows that trust moderated the link between prosocialness and water use strategy by strengthening the negative effect of prosocialness on selection of selfish strategies. The implications of these results highlight the importance of promoting the necessary trust to develop prosocial strategies in collectives; therefore, the efficacy of interventions, such as the creation of cooperative educational contexts or organization of collective actions with groups affected by water scarcity, are discussed.
Pertl, Laura; Steinwender, Gernot; Mayer, Christoph; Hausberger, Silke; Pöschl, Eva-Maria; Wackernagel, Werner; Wedrich, Andreas; El-Shabrawi, Yosuf; Haas, Anton
2015-01-01
Introduction: Laser photocoagulation is the current gold standard treatment for proliferative retinopathy of prematurity (ROP). However, it permanently reduces the visual field and might induce myopia. Vascular endothelial growth factor (VEGF) inhibitors for the treatment of ROP may enable continuing vascularization of the retina, potentially allowing preservation of the visual field. However, concern remains about their use in infants. This meta-analysis explores the safety of VEGF inhibitors. Methods: The Ovid interface was used to perform a systematic review of the literature in the databases PubMed, EMBASE and the Cochrane Library. Results: This meta-analysis included 24 original reports (comprising 1,457 eyes) on VEGF inhibitor treatment for ROP. The trials were solely observational except for one randomized and two case-control studies. We estimated a 6-month risk of retreatment per eye of 2.8%, and a 6-month risk of ocular complication without the need for retreatment of 1.6% per eye. Systemic complications were only reported as isolated incidents. Discussion: VEGF inhibitors seem to be associated with low recurrence rates and ocular complication rates. They may have the benefit of potentially allowing preservation of the visual field and lower rates of myopia. Due to the lack of data, the risk of systemic side effects cannot be assessed. PMID:26083024
NASA Astrophysics Data System (ADS)
Romanovsky, M. Yu; Ebeling, W.; Schimansky-Geier, L.
2005-01-01
The problem of electric and magnetic microfields inside and outside finite spherical systems of stochastically moving ions is studied. The first possible field of application is high-temperature ion clusters created by laser fields [1]. Other possible applications are nearly spherical liquid systems at room temperature containing electrolytes. Looking toward biological applications, we may also think of a cell, which is a complicated electrolytic system, or even a brain, which is a still more complicated system of electrolytic currents. The essential model assumption is the random character of the charges' motion. We assume in our basic model that we have a finite, nearly spherical system of randomly moving charges. Even taking into account that this is at best a caricature of any real system, it might be of interest as a limiting case which admits a full theoretical treatment. For symmetry reasons, a random configuration of moving charges cannot generate a macroscopic magnetic field, but there will be microscopic fluctuating magnetic fields. Distributions for electric and magnetic microfields inside and outside such space-limited systems are calculated. Spherical systems of randomly distributed moving charges are investigated. Starting from earlier results for infinitely large systems, which lead to Holtsmark-type distributions, we show that the fluctuations in finite charge distributions are larger (in comparison to infinite systems of the same charge density).
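A minimal Monte Carlo sketch of the basic quantity involved, assuming unit point charges placed uniformly inside a sphere and the field evaluated at the center; it reproduces only the Holtsmark-type setup in crude form, not the paper's analytical treatment of fields inside and outside the system.

import numpy as np

def microfield_at_center(n_charges=200, radius=1.0, samples=20000, seed=0):
    """Monte Carlo distribution of the electric microfield magnitude at the
    center of a sphere filled with unit point charges at random positions
    (Gaussian units, e = 1); a finite-system analogue of the Holtsmark setup."""
    rng = np.random.default_rng(seed)
    fields = np.empty(samples)
    for s in range(samples):
        # uniform positions inside the sphere via rejection from a cube
        # (3*n_charges candidates almost always yield enough accepted points)
        pos = rng.uniform(-radius, radius, size=(3 * n_charges, 3))
        pos = pos[np.einsum('ij,ij->i', pos, pos) < radius**2][:n_charges]
        r2 = np.einsum('ij,ij->i', pos, pos)
        # E = sum over charges of (-r_i / |r_i|^3); the sign is irrelevant for |E|
        e_vec = (pos / r2[:, None] ** 1.5).sum(axis=0)
        fields[s] = np.linalg.norm(e_vec)
    return fields

f = microfield_at_center()
print("mean |E| =", f.mean(), " 90th percentile =", np.percentile(f, 90))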
ERIC Educational Resources Information Center
Maynard, Brandy R.; Kjellstrand, Elizabeth K.; Thompson, Aaron M.
2014-01-01
Objectives: This study examined the effects of Check & Connect (C&C) on the attendance, behavior, and academic outcomes of at-risk youth in a field-based effectiveness trial. Method: A multisite randomized block design was used, wherein 260 primarily Hispanic (89%) and economically disadvantaged (74%) students were randomized to treatment…
Prospecting of popcorn hybrids for resistance to fall armyworm.
Crubelati-Mulati, N C S; Scapim, C A; Albuquerque, F A; Amaral Junior, A T; Vivas, M; Rodovalho, M A
2014-08-26
The fall armyworm, Spodoptera frugiperda, is the pest that causes the greatest economic losses for both common corn and popcorn crops, and the use of resistant plant genotypes is an important tool for integrated pest management. The goal of the present study was to evaluate the damage caused by S. frugiperda on single-cross popcorn hybrids under field conditions with natural infestation as well as to study the effect of 11 popcorn hybrids on the S. frugiperda life cycle under laboratory conditions. A completely randomized block design with 4 replicates was used for the field experiment, and a completely randomized design with 10 replicates was used for the laboratory experiment. In the field experiment, the damage caused by fall armyworm, grain yield, and popping expansion were quantified, and a diallel analysis was performed to select the best hybrids. For the laboratory experiment, caterpillars were obtained from laboratory cultures kept on an artificial diet and were fed with leaves from the 11 hybrids. Hybrids P7.0 x P9.4, P7.1 x P9.6, P7.2.0 x P9.3, P7.4.0 x P9.1 and P7.4.1 x P9.4 exhibited negative specific combining ability for injury by fall armyworm and positive specific combining ability for yield and popping expansion. In the laboratory experiment, the hybrids influenced the mean larval stage duration, mean larval mass, final larval mass, pupal stage duration, mean pupal mass, and adult longevity.
Anisotropic piezoresistivity characteristics of aligned carbon nanotube-polymer nanocomposites
NASA Astrophysics Data System (ADS)
Sengezer, Engin C.; Seidel, Gary D.; Bodnar, Robert J.
2017-09-01
Dielectrophoresis under the application of AC electric fields is one of the primary fabrication techniques for obtaining aligned carbon nanotube (CNT)-polymer nanocomposites, and is used here to generate long range alignment of CNTs at the structural level. The degree of alignment of CNTs within this long range architecture is observed via polarized Raman spectroscopy so that its influence on the electrical conductivity and piezoresistive response in both the alignment and transverse to alignment directions can be assessed. Nanocomposite samples consisting of randomly oriented, well dispersed single-wall carbon nanotubes (SWCNTs) and of long range electric field aligned SWCNTs in a photopolymerizable monomer blend (urethane dimethacrylate and 1,6-hexanediol dimethacrylate) are quantitatively and qualitatively evaluated. Piezoresistive sensitivities in form of gauge factors were measured for randomly oriented, well dispersed specimens with 0.03, 0.1 and 0.5 wt% SWCNTs and compared with gauge factors in both the axial and transverse to SWCNT alignment directions for electric field aligned 0.03 wt% specimens under both quasi-static monotonic and cyclic tensile loading. Gauge factors in the axial direction were observed to be on the order of 2, while gauge factors in the transverse direction demonstrated a 5 fold increase with values on the order of 10 for aligned specimens. Based on Raman analysis, it is believed the higher sensitivity of the transverse direction is related to architectural evolution of misaligned bridging structures which connect alignment structures under load due to Poisson’s contraction.
Ben Daya, Ibrahim; Chen, Albert I. H.; Shafiee, Mohammad Javad; Wong, Alexander; Yeow, John T. W.
2015-01-01
3-D ultrasound imaging offers unique opportunities in the field of non destructive testing that cannot be easily found in A-mode and B-mode images. To acquire a 3-D ultrasound image without a mechanically moving transducer, a 2-D array can be used. The row column technique is preferred over a fully addressed 2-D array as it requires a significantly lower number of interconnections. Recent advances in 3-D row-column ultrasound imaging systems were largely focused on sensor design. However, these imaging systems face three intrinsic challenges that cannot be addressed by improving sensor design alone: speckle noise, sparsity of data in the imaged volume, and the spatially dependent point spread function of the imaging system. In this paper, we propose a compensated row-column ultrasound image reconstruction system using Fisher-Tippett multilayered conditional random field model. Tests carried out on both simulated and real row-column ultrasound images show the effectiveness of our proposed system as opposed to other published systems. Visual assessment of the results show our proposed system’s potential at preserving detail and reducing speckle. Quantitative analysis shows that our proposed system outperforms previously published systems when evaluated with metrics such as Peak Signal to Noise Ratio, Coefficient of Correlation, and Effective Number of Looks. These results show the potential of our proposed system as an effective tool for enhancing 3-D row-column imaging. PMID:26658577
NASA Astrophysics Data System (ADS)
Antonov, N. V.; Gulitskiy, N. M.; Kostenko, M. M.; Lučivjanský, T.
2017-03-01
We study a model of fully developed turbulence of a compressible fluid, based on the stochastic Navier-Stokes equation, by means of the field-theoretic renormalization group. In this approach, scaling properties are related to the fixed points of the renormalization group equations. Previous analysis of this model near the real-world space dimension 3 identified a scaling regime [N. V. Antonov et al., Theor. Math. Phys. 110, 305 (1997), 10.1007/BF02630456]. The aim of the present paper is to explore the existence of additional regimes, which could not be found using the direct perturbative approach of the previous work, and to analyze the crossover between different regimes. It seems possible to determine them near the special value of space dimension 4 in the framework of double y and ɛ expansion, where y is the exponent associated with the random force and ɛ =4 -d is the deviation from the space dimension 4. Our calculations show that there exists an additional fixed point that governs scaling behavior. Turbulent advection of a passive scalar (density) field by this velocity ensemble is considered as well. We demonstrate that various correlation functions of the scalar field exhibit anomalous scaling behavior in the inertial-convective range. The corresponding anomalous exponents, identified as scaling dimensions of certain composite fields, can be systematically calculated as a series in y and ɛ . All calculations are performed in the leading one-loop approximation.
Ferromagnetic clusters induced by a nonmagnetic random disorder in diluted magnetic semiconductors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bui, Dinh-Hoi; Physics Department, Hue University’s College of Education, 34 Le Loi, Hue; Phan, Van-Nham, E-mail: phanvannham@dtu.edu.vn
In this work, we analyze how nonmagnetic random disorder leads to the formation of ferromagnetic clusters in diluted magnetic semiconductors. The nonmagnetic random disorder arises from randomness in the host lattice. Including the disorder in the Kondo lattice model with a random distribution of magnetic dopants, the ferromagnetic–paramagnetic transition in the system is investigated in the framework of dynamical mean-field theory. At a certain low temperature, one finds a fraction of ferromagnetic sites transitioning to the paramagnetic state. As the nonmagnetic random disorder strength is enlarged, the paramagnetic regimes expand, resulting in the formation of ferromagnetic clusters.
Sieve-based relation extraction of gene regulatory networks from biological literature.
Žitnik, Slavko; Žitnik, Marinka; Zupan, Blaž; Bajec, Marko
2015-01-01
Relation extraction is an essential procedure in literature mining. It focuses on extracting semantic relations between parts of text, called mentions. Biomedical literature includes an enormous amount of textual descriptions of biological entities, their interactions and results of related experiments. To extract them in an explicit, computer readable format, these relations were at first extracted manually from databases. Manual curation was later replaced with automatic or semi-automatic tools with natural language processing capabilities. The current challenge is the development of information extraction procedures that can directly infer more complex relational structures, such as gene regulatory networks. We develop a computational approach for extraction of gene regulatory networks from textual data. Our method is designed as a sieve-based system and uses linear-chain conditional random fields and rules for relation extraction. With this method we successfully extracted the sporulation gene regulation network in the bacterium Bacillus subtilis for the information extraction challenge at the BioNLP 2013 conference. To enable extraction of distant relations using first-order models, we transform the data into skip-mention sequences. We infer multiple models, each of which is able to extract different relationship types. Following the shared task, we conducted additional analysis using different system settings that resulted in reducing the reconstruction error of bacterial sporulation network from 0.73 to 0.68, measured as the slot error rate between the predicted and the reference network. We observe that all relation extraction sieves contribute to the predictive performance of the proposed approach. Also, features constructed by considering mention words and their prefixes and suffixes are the most important features for higher accuracy of extraction. Analysis of distances between different mention types in the text shows that our choice of transforming data into skip-mention sequences is appropriate for detecting relations between distant mentions. Linear-chain conditional random fields, along with appropriate data transformations, can be efficiently used to extract relations. The sieve-based architecture simplifies the system as new sieves can be easily added or removed and each sieve can utilize the results of previous ones. Furthermore, sieves with conditional random fields can be trained on arbitrary text data and hence are applicable to broad range of relation extraction tasks and data domains.
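The skip-mention transformation can be pictured with the short Python sketch below; it reflects one reading of the idea (taking every (s+1)-th mention so that distant mentions become adjacent for a first-order model), the gene names are hypothetical, and the feature extraction and linear-chain CRF training of the actual system are not reproduced.

def skip_mention_sequences(mentions, max_skip=2):
    """For skip value s, take every (s+1)-th mention so that mentions that were
    s mentions apart in the text become adjacent, letting a first-order
    linear-chain model (e.g., a CRF) score a relation between them."""
    sequences = []
    for s in range(max_skip + 1):
        for start in range(s + 1):
            seq = mentions[start::s + 1]
            if len(seq) > 1:
                sequences.append((s, seq))
    return sequences

mentions = ["sigF", "katX", "spoIIAB", "sigG", "spoVT"]   # hypothetical gene mentions
for s, seq in skip_mention_sequences(mentions):
    print(f"skip={s}: {seq}")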
Evidence-Based Medicine in Aesthetic Surgery: The Significance of Level to Aesthetic Surgery.
Rohrich, Rod J; Cho, Min-Jeong
2017-05-01
Since its popularization in the 1980s, evidence-based medicine has become the cornerstone of American health care. Many specialties rapidly adapted to the paradigm shift of health care by delivering treatment using the evidence-based guidelines. However, the field of plastic surgery has been slow to implement evidence-based medicine compared with the other specialties because of the challenges of performing randomized controlled trials, such as funding, variability in surgical skills, and difficulty with standardization of techniques. To date, aesthetic surgery has been at the forefront of evidence-based medicine in plastic surgery by having the most randomized controlled trials. Nevertheless, a detailed analysis of these studies has not been previously performed. In this article, the level I and II articles of aesthetic surgery are discussed to increase awareness of high-quality evidence-based medicine in aesthetic surgery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pingenot, J; Rieben, R; White, D
2005-10-31
We present a computational study of signal propagation and attenuation of a 200 MHz planar loop antenna in a cave environment. The cave is modeled as a straight and lossy random rough wall. To simulate a broad frequency band, the full wave Maxwell equations are solved directly in the time domain via a high order vector finite element discretization using the massively parallel CEM code EMSolve. The numerical technique is first verified against theoretical results for a planar loop antenna in a smooth lossy cave. The simulation is then performed for a series of random rough surface meshes in order to generate statistical data for the propagation and attenuation properties of the antenna in a cave environment. Results for the mean and variance of the power spectral density of the electric field are presented and discussed.
Learning Bayesian Networks from Correlated Data
NASA Astrophysics Data System (ADS)
Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H.; Perls, Thomas T.; Sebastiani, Paola
2016-05-01
Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster and ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.
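The motivation for modeling within-cluster correlation can be illustrated with a small simulation sketch, not taken from the paper: X is a cluster-level covariate, Y carries an independent cluster random effect, and a naive correlation test that treats all observations as independent rejects far more often than its nominal 5% level.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def false_positive_rate(n_clusters=30, per_cluster=10, icc_sd=1.0, reps=2000):
    """Fraction of nominal 5%-level tests that reject when X and Y are truly
    unrelated but observations are correlated within clusters."""
    hits = 0
    for _ in range(reps):
        x = np.repeat(rng.standard_normal(n_clusters), per_cluster)           # cluster-level covariate
        y = np.repeat(rng.standard_normal(n_clusters) * icc_sd, per_cluster)  # cluster effect, independent of x
        y = y + rng.standard_normal(n_clusters * per_cluster)                 # individual-level noise
        r, p = stats.pearsonr(x, y)
        if p < 0.05:
            hits += 1
    return hits / reps

print("false positive rate ignoring clustering:", false_positive_rate())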
Design and evaluation of a hybrid storage system in HEP environment
NASA Astrophysics Data System (ADS)
Xu, Qi; Cheng, Yaodong; Chen, Gang
2017-10-01
Nowadays, the High Energy Physics experiments produce a large amount of data. These data are stored in mass storage systems which need to balance the cost, performance and manageability. In this paper, a hybrid storage system including SSDs (Solid-state Drive) and HDDs (Hard Disk Drive) is designed to accelerate data analysis and maintain a low cost. The performance of accessing files is a decisive factor for the HEP computing system. A new deployment model of Hybrid Storage System in High Energy Physics is proposed which is proved to have higher I/O performance. The detailed evaluation methods and the evaluations about SSD/HDD ratio, and the size of the logic block are also given. In all evaluations, sequential-read, sequential-write, random-read and random-write are all tested to get the comprehensive results. The results show the Hybrid Storage System has good performance in some fields such as accessing big files in HEP.
Deformation analysis of boron/aluminum specimens by moire interferometry
NASA Technical Reports Server (NTRS)
Post, Daniel; Guo, Yifan; Czarnek, Robert
1989-01-01
Whole-field surface deformations were measured for two slotted tension specimens from multiply laminates, one with 0 deg fiber orientation in the surface ply and the other with 45 deg orientation. Macromechanical and micromechanical details were revealed using high-sensitivity moire interferometry. Although global deformations of all plies were essentially equal, numerous random or anomalous features were observed. Local deformations of adjacent 0 deg and 45 deg plies were very different, both near the slot and remote from it, requiring large interlaminar shear strains for continuity. Shear strains were concentrated in the aluminum matrix. For 45 deg plies, a major portion of the deformation was by shear; large plastic slip of matrix occurred at random locations in 45 deg plies, wherein groups of fibers slipped relative to other groups. Shear strains in the interior, between adjacent fibers, were larger than the measured surface strains.
Some Aspects of the Investigation of Random Vibration Influence on Ride Comfort
NASA Astrophysics Data System (ADS)
DEMIĆ, M.; LUKIĆ, J.; MILIĆ, Ž.
2002-05-01
Contemporary vehicles must satisfy high ride comfort criteria. This paper attempts to develop criteria for ride comfort improvement. The highest loading levels have been found to be in the vertical direction and the lowest in the lateral direction in passenger cars and trucks. These results have formed the basis for further laboratory and field investigations. An investigation of human body behaviour under random vibrations is reported in this paper. The research included two phases: biodynamic research and ride comfort investigation. A group of 30 subjects was tested. The influence of broadband random vibrations on the human body was examined through the seat-to-head transmissibility function (STHT). Initially, vertical and fore-and-aft vibrations were considered. Multi-directional vibration was also investigated. In the biodynamic research, subjects were exposed to 0·55, 1·75 and 2·25 m/s² r.m.s. vibration levels in the 0·5-40 Hz frequency domain. The influence of sitting position on human body behaviour under two-axial vibrations was also examined. Data analysis showed that human body behaviour under two-directional random vibrations could not be approximated by superposition of one-directional random vibrations. Non-linearity of the seated human body in the vertical and fore-and-aft directions was observed. Seat-backrest angle also influenced the STHT. In the second phase of the experimental research, a new method for the assessment of the influence of narrowband random vibration on the human body was formulated and tested. It included determination of equivalent comfort curves in the vertical and fore-and-aft directions under one- and two-directional narrowband random vibrations. Equivalent comfort curves for durations of 2·5, 4 and 8 h were determined.
Suppression of Dyakonov-Perel Spin Relaxation in High-Mobility n-GaAs
NASA Astrophysics Data System (ADS)
Dzhioev, R. I.; Kavokin, K. V.; Korenev, V. L.; Lazarev, M. V.; Poletaev, N. K.; Zakharchenya, B. P.; Stinaff, E. A.; Gammon, D.; Bracker, A. S.; Ware, M. E.
2004-11-01
We report a large and unexpected suppression of the free electron spin-relaxation in lightly doped n-GaAs bulk crystals. The spin-relaxation rate shows a weak mobility dependence and saturates at a level 30 times less than that predicted by the Dyakonov-Perel theory. The dynamics of the spin-orbit field differs substantially from the usual scheme: although all the experimental data can be self-consistently interpreted as a precessional spin-relaxation induced by a random spin-orbit field, the correlation time of this random field, surprisingly, is much shorter than, and is independent of, the momentum relaxation time determined from transport measurements.
Coherent Doppler lidar signal covariance including wind shear and wind turbulence
NASA Technical Reports Server (NTRS)
Frehlich, R. G.
1993-01-01
The performance of coherent Doppler lidar is determined by the statistics of the coherent Doppler signal. The derivation and calculation of the covariance of the Doppler lidar signal is presented for random atmospheric wind fields with wind shear. The random component is described by a Kolmogorov turbulence spectrum. The signal parameters are clarified for a general coherent Doppler lidar system. There are two distinct physical regimes: one where the transmitted pulse determines the signal statistics and the other where the wind field dominates the signal statistics. The Doppler shift of the signal is identified in terms of the wind field and system parameters.
Randomized controlled trials and meta-analysis in medical education: what role do they play?
Cook, David A
2012-01-01
Education researchers seek to understand what works, for whom, in what circumstances. Unfortunately, educational environments are complex and research itself is highly context dependent. Faced with these challenges, some have argued that qualitative methods should supplant quantitative methods such as randomized controlled trials (RCTs) and meta-analysis. I disagree. Good qualitative and mixed-methods research are complementary to, rather than exclusive of, quantitative methods. The complexity and challenges we face should not beguile us into ignoring methods that provide strong evidence. What, then, is the proper role for RCTs and meta-analysis in medical education? First, the choice of study design depends on the research question. RCTs and meta-analysis are appropriate for many, but not all, study goals. They have compelling strengths but also numerous limitations. Second, strong methods will not compensate for a pointless question. RCTs do not advance the science when they make confounded comparisons, or make comparison with no intervention. Third, clinical medicine now faces many of the same challenges we encounter in education. We can learn much from other fields about how to handle complexity in RCTs. Finally, no single study will definitively answer any research question. We need carefully planned, theory-building, programmatic research, reflecting a variety of paradigms and approaches, as we accumulate evidence to change the art and science of education.
Global diffusion of cosmic rays in random magnetic fields
NASA Astrophysics Data System (ADS)
Snodin, A. P.; Shukurov, A.; Sarson, G. R.; Bushby, P. J.; Rodrigues, L. F. S.
2016-04-01
The propagation of charged particles, including cosmic rays, in a partially ordered magnetic field is characterized by a diffusion tensor whose components depend on the particle's Larmor radius R_L and the degree of order in the magnetic field. Most studies of particle diffusion presuppose a scale separation between the mean and random magnetic fields (e.g. there being a pronounced minimum in the magnetic power spectrum at intermediate scales). Scale separation is often a good approximation in laboratory plasmas, but not in most astrophysical environments such as the interstellar medium (ISM). Modern simulations of the ISM have numerical resolution of the order of 1 pc, so the Larmor radius of the cosmic rays that dominate in energy density is at least 10⁶ times smaller than the resolved scales. Large-scale simulations of cosmic ray propagation in the ISM thus rely on oversimplified forms of the diffusion tensor. We take the first steps towards a more realistic description of cosmic ray diffusion for such simulations, obtaining direct estimates of the diffusion tensor from test particle simulations in random magnetic fields (with the Larmor radius scale being fully resolved), for a range of particle energies corresponding to 10⁻² ≲ R_L/l_c ≲ 10³, where l_c is the magnetic correlation length. We obtain explicit expressions for the cosmic ray diffusion tensor for R_L/l_c ≪ 1, that might be used in a sub-grid model of cosmic ray diffusion. The diffusion coefficients obtained are closely connected with existing transport theories that include the random walk of magnetic lines.
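A dimensionless test-particle sketch of the approach, assuming a crude synthetic random field built from a handful of transverse plane waves rather than a realistic turbulence spectrum, and a Boris integrator for the Lorentz force; running diffusion coefficients are estimated from the mean squared displacement.

import numpy as np

rng = np.random.default_rng(0)

# Divergence-free synthetic random field: a sum of transverse plane waves with
# random directions, wavenumbers, polarizations and phases (a crude turbulence proxy).
N_MODES = 32
k_hat = rng.standard_normal((N_MODES, 3))
k_hat /= np.linalg.norm(k_hat, axis=1, keepdims=True)
k_mag = rng.uniform(0.5, 5.0, N_MODES)                 # wavenumbers around 1/l_c ~ 1
phases = rng.uniform(0, 2 * np.pi, N_MODES)
pol = np.cross(k_hat, rng.standard_normal((N_MODES, 3)))
pol /= np.linalg.norm(pol, axis=1, keepdims=True)      # polarization perpendicular to k
amps = np.full(N_MODES, 1.0 / np.sqrt(N_MODES))

def b_field(x):
    phase = k_mag * (k_hat @ x) + phases
    return (amps * np.cos(phase)) @ pol

def boris_step(x, v, dt):
    """Boris rotation for the magnetic Lorentz force q v x B (q/m = 1, E = 0)."""
    t = 0.5 * dt * b_field(x)
    s = 2 * t / (1 + t @ t)
    v_prime = v + np.cross(v, t)
    v_new = v + np.cross(v_prime, s)
    return x + v_new * dt, v_new

N_PART, N_STEP, DT = 100, 2000, 0.05
x = np.zeros((N_PART, 3))
v = rng.standard_normal((N_PART, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)
for n in range(N_STEP):
    for p in range(N_PART):
        x[p], v[p] = boris_step(x[p], v[p], DT)
# running diffusion coefficient along each axis: <dx_i^2> / (2 t)
t_total = N_STEP * DT
print("D_xx, D_yy, D_zz ~", (x ** 2).mean(axis=0) / (2 * t_total))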
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chetvertkov, Mikhail A., E-mail: chetvertkov@wayne
2016-10-15
Purpose: To develop standard (SPCA) and regularized (RPCA) principal component analysis models of anatomical changes from daily cone beam CTs (CBCTs) of head and neck (H&N) patients and assess their potential use in adaptive radiation therapy, and for extracting quantitative information for treatment response assessment. Methods: Planning CT images of ten H&N patients were artificially deformed to create “digital phantom” images, which modeled systematic anatomical changes during radiation therapy. Artificial deformations closely mirrored patients’ actual deformations and were interpolated to generate 35 synthetic CBCTs, representing evolving anatomy over 35 fractions. Deformation vector fields (DVFs) were acquired between pCT and synthetic CBCTs (i.e., digital phantoms) and between pCT and clinical CBCTs. Patient-specific SPCA and RPCA models were built from these synthetic and clinical DVF sets. EigenDVFs (EDVFs) having the largest eigenvalues were hypothesized to capture the major anatomical deformations during treatment. Results: Principal component analysis (PCA) models achieve variable results, depending on the size and location of anatomical change. Random changes prevent or degrade PCA’s ability to detect underlying systematic change. RPCA is able to detect smaller systematic changes against the background of random fraction-to-fraction changes and is therefore more successful than SPCA at capturing systematic changes early in treatment. SPCA models were less successful at modeling systematic changes in clinical patient images, which contain a wider range of random motion than synthetic CBCTs, while the regularized approach was able to extract major modes of motion. Conclusions: Leading EDVFs from both PCA approaches have the potential to capture systematic anatomical change during H&N radiotherapy when systematic changes are large enough with respect to random fraction-to-fraction changes. In all cases the RPCA approach appears to be more reliable at capturing systematic changes, enabling dosimetric consequences to be projected once trends are established early in a treatment course, or based on population models.
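The standard (SPCA) part of the analysis amounts to a principal component decomposition of the stacked DVFs, as in the Python sketch below; the array shapes and random toy data are placeholders, and the regularized (RPCA) variant is not reproduced.

import numpy as np

def eigen_dvfs(dvfs, n_components=3):
    """Standard PCA of deformation vector fields.
    dvfs: array of shape (n_fractions, nx, ny, nz, 3); each daily DVF is
    flattened into one row, the mean DVF is removed, and the leading right
    singular vectors are reshaped back into 'eigenDVFs'."""
    n = dvfs.shape[0]
    flat = dvfs.reshape(n, -1)
    mean = flat.mean(axis=0)
    u, s, vt = np.linalg.svd(flat - mean, full_matrices=False)
    explained = s**2 / (s**2).sum()
    eigendvfs = vt[:n_components].reshape((n_components,) + dvfs.shape[1:])
    scores = u[:, :n_components] * s[:n_components]   # per-fraction loadings
    return mean.reshape(dvfs.shape[1:]), eigendvfs, scores, explained[:n_components]

# toy example with random data standing in for 35 daily DVFs on a small grid
rng = np.random.default_rng(0)
dvfs = rng.standard_normal((35, 8, 8, 8, 3))
mean_dvf, edvfs, scores, var = eigen_dvfs(dvfs)
print("explained variance of leading eigenDVFs:", np.round(var, 3))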
Boyd, Courtney; Crawford, Cindy; Paat, Charmagne F; Price, Ashley; Xenakis, Lea; Zhang, Weimin
2016-09-01
Pain is multi-dimensional and may be better addressed through a holistic, biopsychosocial approach. Massage therapy is commonly practiced among patients seeking pain management; however, its efficacy is unclear. This systematic review and meta-analysis is the first to rigorously assess the quality of the evidence for massage therapy's efficacy in treating pain, function-related, and health-related quality of life outcomes in surgical pain populations. Key databases were searched from inception through February 2014. Eligible randomized controlled trials were assessed for methodological quality using SIGN 50 Checklist. Meta-analysis was applied at the outcome level. A professionally diverse steering committee interpreted the results to develop recommendations. Twelve high quality and four low quality studies were included in the review. Results indicate massage therapy is effective for treating pain [standardized mean difference (SMD) = -0.79] and anxiety (SMD = -0.57) compared to active comparators. Based on the available evidence, weak recommendations are suggested for massage therapy, compared to active comparators for reducing pain intensity/severity and anxiety in patients undergoing surgical procedures. This review also discusses massage therapy safety, challenges within this research field, how to address identified research gaps, and next steps for future research. © 2016 American Academy of Pain Medicine.
A numerical approximation to the elastic properties of sphere-reinforced composites
NASA Astrophysics Data System (ADS)
Segurado, J.; Llorca, J.
2002-10-01
Three-dimensional cubic unit cells containing 30 non-overlapping identical spheres randomly distributed were generated using a new, modified random sequential adsorption algorithm suitable for particle volume fractions of up to 50%. The elastic constants of the ensemble of spheres embedded in a continuous and isotropic elastic matrix were computed through finite element analysis of the three-dimensional periodic unit cells, whose size was chosen as a compromise between the minimum size required to obtain accurate results in the statistical sense and the maximum one imposed by the computational cost. Three types of materials were studied: rigid spheres and spherical voids in an elastic matrix, and a typical composite made up of glass spheres in an epoxy resin. The moduli obtained for different unit cells showed very little scatter, and the average values obtained from the analysis of four unit cells could be considered very close to the "exact" solution to the problem, in agreement with the results of Drugan and Willis (J. Mech. Phys. Solids 44 (1996) 497) referring to the size of the representative volume element for elastic composites. They were used to assess the accuracy of three classical analytical models: the Mori-Tanaka mean-field analysis, the generalized self-consistent method, and Torquato's third-order approximation.
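A plain random sequential adsorption sketch in Python, for orientation only: candidate sphere centers are accepted if they do not overlap previously placed spheres under the minimum-image convention. Plain RSA jams well below the 50% volume fraction reached by the authors' modified algorithm, which is not reproduced here.

import numpy as np

def min_image_dist(a, b):
    d = np.abs(a - b)
    d = np.minimum(d, 1.0 - d)        # periodic (minimum-image) convention
    return np.sqrt((d * d).sum())

def rsa_spheres(n_spheres=30, radius=0.13, max_tries=200000, seed=0):
    """Random sequential adsorption of equal, non-overlapping spheres in a
    periodic unit cube: candidate centers are drawn uniformly and accepted only
    if they do not overlap any previously placed sphere."""
    rng = np.random.default_rng(seed)
    centers = []
    for _ in range(max_tries):
        c = rng.uniform(0.0, 1.0, 3)
        if all(min_image_dist(c, p) >= 2 * radius for p in centers):
            centers.append(c)
            if len(centers) == n_spheres:
                return np.array(centers)
    raise RuntimeError("jammed before reaching the requested number of spheres")

centers = rsa_spheres()
vol_fraction = len(centers) * 4.0 / 3.0 * np.pi * 0.13**3
print(f"placed {len(centers)} spheres, volume fraction = {vol_fraction:.2f}")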
Waldman, Boris; Ansquer, Jean-Claude; Sullivan, David R; Jenkins, Alicia J; McGill, Neil; Buizen, Luke; Davis, Timothy M E; Best, James D; Li, Liping; Feher, Michael D; Foucher, Christelle; Kesaniemi, Y Antero; Flack, Jeffrey; d'Emden, Michael C; Scott, Russell S; Hedley, John; Gebski, Val; Keech, Anthony C
2018-04-01
Gout is a painful disorder and is common in type 2 diabetes. Fenofibrate lowers uric acid and reduces gout attacks in small, short-term studies. Whether fenofibrate produces sustained reductions in uric acid and gout attacks is unknown. In the Fenofibrate Intervention and Event Lowering in Diabetes (FIELD) trial, participants aged 50-75 years with type 2 diabetes were randomly assigned to receive either co-micronised fenofibrate 200 mg once per day or matching placebo for a median of 5 years follow-up. We did a post-hoc analysis of recorded on-study gout attacks and plasma uric acid concentrations according to treatment allocation. The outcomes of this analysis were change in uric acid concentrations and risk of on-study gout attacks. The FIELD study is registered with ISRCTN, number ISRCTN64783481. Between Feb 23, 1998, and Nov 3, 2000, 9795 patients were randomly assigned to fenofibrate (n=4895) or placebo (n=4900) in the FIELD study. Uric acid concentrations fell by 20·2% (95% CI 19·9-20·5) during the 6-week active fenofibrate run-in period immediately pre-randomisation (a reduction of 0·06 mmol/L or 1 mg/dL) and remained -20·1% (18·5-21·7, p<0·0001) lower in patients taking fenofibrate than in those on placebo in a random subset re-measured at 1 year. With placebo allocation, there were 151 (3%) first gout events over 5 years, compared with 81 (2%) among those allocated fenofibrate (HR with treatment 0·54, 95% CI 0·41-0·70; p<0·0001). In the placebo group, the cumulative proportion of patients with first gout events was 7·7% in patients with baseline uric acid concentration higher than 0·36 mmol/L and 13·9% in those with baseline uric acid concentration higher than 0·42 mmol/L, compared with 3·4% and 5·7%, respectively, in the fenofibrate group. Risk reductions were similar among men and women and those with dyslipidaemia, on diuretics, and with elevated uric acid concentrations. For participants with elevated baseline uric acid concentrations despite taking allopurinol at study entry, there was no heterogeneity of the treatment effect of fenofibrate on gout risk. Taking account of all gout events, fenofibrate treatment halved the risk (HR 0·48, 95% CI 0·37-0·60; p<0·0001) compared with placebo. Fenofibrate lowered uric acid concentrations by 20%, and almost halved first on-study gout events over 5 years of treatment. Fenofibrate could be a useful adjunct for preventing gout in diabetes. None. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Willgoose, G. R.; Chen, M.; Cohen, S.; Saco, P. M.; Hancock, G. R.
2013-12-01
In humid areas it is generally considered that soil moisture scales spatially according to the wetness index of the landscape. This scaling arises from lateral flow downslope of ground water within the soil zone. However, in semi-arid and drier regions, this lateral flow is small and fluxes are dominated by vertical flows driven by infiltration and evapotranspiration. Thus, in the absence of runon processes, soil moisture at a location is more driven by local factors such as soil and vegetation properties at that location rather than upstream processes draining to that point. The 'apparent' spatial randomness of soil and vegetation properties generally suggests that soil moisture for semi-arid regions is spatially random. In this presentation a new analysis of neutron probe data during summer from the Tarrawarra site near Melbourne, Australia shows persistent spatial organisation of soil moisture over several years. This suggests a link between permanent features of the catchment (e.g. soil properties) and soil moisture distribution, even though the spatial pattern of soil moisture during the 4 summers monitored appears spatially random. This and other data establishes a prima facie case that soil variations drive spatial variation in soil moisture. Accordingly, we used a previously published spatial scaling relationship for soil properties derived using the mARM pedogenesis model to simulate the spatial variation of soil grading. This soil grading distribution was used in the Rosetta pedotransfer model to derive a spatial distribution of soil functional properties (e.g. saturated hydraulic conductivity, porosity). These functional properties were then input into the HYDRUS-1D soil moisture model and soil moisture simulated for 3 years at daily resolution. The HYDRUS model used had previously been calibrated to field observed soil moisture data at our SASMAS field site. The scaling behaviour of soil moisture derived from this modelling will be discussed and compared with observed data from our SASMAS field sites.
Field Line Random Walk in Isotropic Magnetic Turbulence up to Infinite Kubo Number
NASA Astrophysics Data System (ADS)
Sonsrettee, W.; Wongpan, P.; Ruffolo, D. J.; Matthaeus, W. H.; Chuychai, P.; Rowlands, G.
2013-12-01
In astrophysical plasmas, the magnetic field line random walk (FLRW) plays a key role in the transport of energetic particles. In the present work, we consider isotropic magnetic turbulence, which is a reasonable model for interstellar space. Theoretical conceptions of the FLRW have been strongly influenced by studies of the limit of weak fluctuations (or a strong mean field) (e.g., Isichenko 1991a, b). In this case, the behavior of the FLRW can be characterized by the Kubo number R = (b/B₀)(l_∥/l_⊥), where l_∥ and l_⊥ are turbulence coherence scales parallel and perpendicular to the mean field, respectively, and b is the root mean squared fluctuation field. In the 2D limit (R ≫ 1), there has been an apparent conflict between concepts of Bohm diffusion, which is based on Corrsin's independence hypothesis, and percolative diffusion. Here we have used three non-perturbative analytic techniques based on Corrsin's independence hypothesis for B₀ = 0 (R = ∞): diffusive decorrelation (DD), random ballistic decorrelation (RBD), and a general ordinary differential equation (ODE), and compared them with direct computer simulations. All the analytical models and computer simulations agree that isotropic turbulence for R = ∞ has a field line diffusion coefficient that is consistent with Bohm diffusion. Partially supported by the Thailand Research Fund, NASA, and NSF.
Data analysis and noise prediction for the QF-1B experimental fan stage
NASA Technical Reports Server (NTRS)
Bliss, D. B.; Chandiramani, K. L.; Piersol, A. G.
1976-01-01
The results of a fan noise data analysis and prediction effort using experimental data obtained from tests on the QF-1B research fan are described. Surface pressure measurements were made with flush mounted sensors installed on selected rotor blades and stator vanes, and noise measurements were made by microphones located in the far field. Power spectral density analysis, time history studies, and calculation of coherence functions were carried out. The emphasis of these studies was on the characteristics of tones in the spectra. The amplitude behavior of spectral tones was found to have a large, often predominant, random component, suggesting that turbulent processes play an important role in the generation of tonal as well as broadband noise. Inputs from the data analysis were used in a prediction method which assumes that acoustic dipoles, produced by unsteady blade and vane forces, are the important source of fan noise.
Analysis of GPS Data Collected on the Greenland Ice Sheet
NASA Technical Reports Server (NTRS)
Larson, K.; Plumb, J.; Zwally, J.; Abdalati, W.; Koblinsky, Chester J. (Technical Monitor)
2001-01-01
For several years, GPS observations have been made year round at the Swiss Camp, Greenland. The GPS data are recorded for 12 hours every 10-15 days; data are stored in memory and downloaded during the annual field season. Traditional GPS analysis techniques, in which the receiver is assumed not to move within a 24 hour period, are not appropriate at the Swiss Camp, where horizontal velocities are on the order of 30 cm/day. Comparison of analysis strategies for these GPS data indicates that a random walk parameterization, with a constraint of 1-2 × 10⁻⁷ km/√s, minimizes noise due to satellite outages without corrupting the estimated ice velocity. Low elevation angle observations should be included in the analysis in order to increase the number of satellites viewed at each data epoch. Carrier phase ambiguity resolution is important for improving the accuracy of receiver coordinates.
Ezugwu, Sabastine; Ye, Hanyang; Fanchini, Giovanni
2015-01-07
In order to investigate the suitability of random arrays of nanoparticles for plasmonic enhancement in the visible-near infrared range, we introduced three-dimensional scanning near-field optical microscopy (3D-SNOM) imaging as a useful technique to probe the intensity of near-field radiation scattered by random systems of nanoparticles at heights up to several hundred nm from their surface. We demonstrated our technique using random arrays of copper nanoparticles (Cu-NPs) at different particle diameter and concentration. Bright regions in the 3D-SNOM images, corresponding to constructive interference of forward-scattered plasmonic waves, were obtained at heights Δz ≥ 220 nm from the surface for random arrays of Cu-NPs of ∼ 60-100 nm in diameter. These heights are too large to use Cu-NPs in contact of the active layer for light harvesting in thin organic solar cells, which are typically no thicker than 200 nm. Using a 200 nm transparent spacer between the system of Cu-NPs and the solar cell active layer, we demonstrate that forward-scattered light can be conveyed in 200 nm thin film solar cells. This architecture increases the solar cell photoconversion efficiency by a factor of 3. Our 3D-SNOM technique is general enough to be suitable for a large number of other applications in nanoplasmonics.
Improved methods for the measurement and analysis of stellar magnetic fields
NASA Technical Reports Server (NTRS)
Saar, Steven H.
1988-01-01
The paper presents several improved methods for the measurement of magnetic fields on cool stars which take into account simple radiative transfer effects and the exact Zeeman patterns. Using these methods, high-resolution, low-noise data can be fitted with theoretical line profiles to determine the mean magnetic field strength in stellar active regions and a model-dependent fraction of the stellar surface (filling factor) covered by these regions. Random errors in the derived field strength and filling factor are parameterized in terms of signal-to-noise ratio, wavelength, spectral resolution, stellar rotation rate, and the magnetic parameters themselves. Weak line blends, if left uncorrected, can have significant systematic effects on the derived magnetic parameters, and thus several methods are developed to compensate partially for them. The magnetic parameters determined by previous methods likely have systematic errors because of such line blends and because of line saturation effects. Other sources of systematic error are explored in detail. These sources of error currently make it difficult to determine the magnetic parameters of individual stars to better than about + or - 20 percent.
Banerjee, Abhirup; Maji, Pradipta
2015-12-01
The segmentation of brain MR images into different tissue classes is an important task for automatic image analysis technique, particularly due to the presence of intensity inhomogeneity artifact in MR images. In this regard, this paper presents a novel approach for simultaneous segmentation and bias field correction in brain MR images. It integrates judiciously the concept of rough sets and the merit of a novel probability distribution, called stomped normal (SN) distribution. The intensity distribution of a tissue class is represented by SN distribution, where each tissue class consists of a crisp lower approximation and a probabilistic boundary region. The intensity distribution of brain MR image is modeled as a mixture of finite number of SN distributions and one uniform distribution. The proposed method incorporates both the expectation-maximization and hidden Markov random field frameworks to provide an accurate and robust segmentation. The performance of the proposed approach, along with a comparison with related methods, is demonstrated on a set of synthetic and real brain MR images for different bias fields and noise levels.
Tilted hexagonal post arrays: DNA electrophoresis in anisotropic media
Chen, Zhen; Dorfman, Kevin D.
2013-01-01
Using Brownian dynamics simulations, we show that DNA electrophoresis in a hexagonal array of micron-sized posts changes qualitatively when the applied electric field vector is not coincident with the lattice vectors of the array. DNA electrophoresis in such “tilted” post arrays is superior to the standard “un-tilted” approach; while the time required to achieve a resolution of unity in a tilted post array is similar to an un-tilted array at a low electric field strengths, this time (i) decreases exponentially with electric field strength in a tilted array and (ii) increases exponentially with electric field strength in an un-tilted array. Although the DNA dynamics in a post array are complicated, the electrophoretic mobility results indicate that the “free path”, i.e., the average distance of ballistic trajectories of point sized particles launched from random positions in the unit cell until they intersect the next post, is a useful proxy for the detailed DNA trajectories. The analysis of the free path reveals a fundamental connection between anisotropy of the medium and DNA transport therein that goes beyond simply improving the separation device. PMID:23868490
NASA Technical Reports Server (NTRS)
Poole, L. R.
1974-01-01
A study was conducted of an alternate method for storage and use of bathymetry data in the Langley Research Center and Virginia Institute of Marine Science mid-Atlantic continental-shelf wave-refraction computer program. The regional bathymetry array was divided into 105 indexed modules which can be read individually into memory in a nonsequential manner from a peripheral file using special random-access subroutines. In running a sample refraction case, a 75-percent decrease in program field length was achieved by using the random-access storage method in comparison with the conventional method of total regional array storage. This field-length decrease was accompanied by a comparative 5-percent increase in central processing time and a 477-percent increase in the number of operating-system calls. A comparative Langley Research Center computer system cost savings of 68 percent was achieved by using the random-access storage method.
NASA Astrophysics Data System (ADS)
Klise, K. A.; Weissmann, G. S.; McKenna, S. A.; Tidwell, V. C.; Frechette, J. D.; Wawrzyniec, T. F.
2007-12-01
Solute plumes are believed to disperse in a non-Fickian manner due to small-scale heterogeneity and variable velocities that create preferential pathways. In order to accurately predict dispersion in naturally complex geologic media, the connection between heterogeneity and dispersion must be better understood. Since aquifer properties can not be measured at every location, it is common to simulate small-scale heterogeneity with random field generators based on a two-point covariance (e.g., through use of sequential simulation algorithms). While these random fields can produce preferential flow pathways, it is unknown how well the results simulate solute dispersion through natural heterogeneous media. To evaluate the influence that complex heterogeneity has on dispersion, we utilize high-resolution terrestrial lidar to identify and model lithofacies from outcrop for application in particle tracking solute transport simulations using RWHet. The lidar scan data are used to produce a lab (meter) scale two-dimensional model that captures 2-8 mm scale natural heterogeneity. Numerical simulations utilize various methods to populate the outcrop structure captured by the lidar-based image with reasonable hydraulic conductivity values. The particle tracking simulations result in residence time distributions used to evaluate the nature of dispersion through complex media. Particle tracking simulations through conductivity fields produced from the lidar images are then compared to particle tracking simulations through hydraulic conductivity fields produced from sequential simulation algorithms. Based on this comparison, the study aims to quantify the difference in dispersion when using realistic and simplified representations of aquifer heterogeneity. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
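A minimal random-walk particle-tracking sketch (not RWHet, and not the lidar-based conductivity fields of the study) illustrates how a heterogeneous conductivity field turns into a residence-time distribution: each particle is advected by the local cell velocity and perturbed by a small dispersive step until it exits the domain. All parameter values below are assumed for illustration.

```python
import numpy as np

def residence_times(logK, n_particles=200, dt=0.1, grad=1.0, porosity=0.3,
                    alpha=0.05, dx=1.0, seed=1):
    """Toy random-walk particle tracking through a 2-D log-conductivity field.

    Advection uses the local cell velocity v = K * grad / porosity; an isotropic
    random-walk step stands in for local-scale dispersion.
    """
    rng = np.random.default_rng(seed)
    ny, nx = logK.shape
    K = np.exp(logK)
    times = []
    for _ in range(n_particles):
        x, y, t = 0.0, rng.uniform(0.0, ny * dx), 0.0
        while x < nx * dx:
            i = min(int(y // dx), ny - 1)
            j = min(max(int(x // dx), 0), nx - 1)
            v = K[i, j] * grad / porosity
            step = np.sqrt(2.0 * alpha * v * dt)
            x += v * dt + step * rng.standard_normal()
            y = (y + step * rng.standard_normal()) % (ny * dx)   # periodic transverse boundary
            t += dt
        times.append(t)
    return np.array(times)

# A crude correlated log-K field stands in for the lidar-derived lithofacies model.
rng = np.random.default_rng(0)
logK = np.cumsum(rng.standard_normal((50, 100)) * 0.1, axis=1)
rt = residence_times(logK)
print(rt.mean(), rt.std())   # the spread of the residence-time distribution reflects dispersion
```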
NASA Astrophysics Data System (ADS)
Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.
2013-12-01
A new software application called MAD# has been coupled with the HTCondor high-throughput computing system to aid scientists and educators with the characterization of spatial random fields and to enable understanding of the spatial distribution of parameters used in hydrogeologic and related modeling. MAD# is an open-source desktop application that characterizes spatial random fields from direct and indirect information through a Bayesian inverse modeling technique called the Method of Anchored Distributions (MAD). MAD relates indirect information to a target spatial random field via a forward simulation model. MAD# executes the inverse process by running the forward model many times to transfer information from the indirect data to the target variable. MAD# offers two parallelization profiles, depending on the computational resources available: one computer with multiple cores, or multiple computers and multiple cores through HTCondor. HTCondor is a system that manages a cluster of desktop computers and submits serial or parallel jobs using scheduling policies, resource monitoring, and a job-queuing mechanism. This poster shows how MAD# reduces the execution time of random field characterization using these two parallel approaches in different case studies. A test of the approach was conducted on a 1D problem with 400 cells to characterize the saturated conductivity, residual water content, and shape parameters of the Mualem-van Genuchten model in four materials via the HYDRUS model. The number of simulations evaluated in the inversion was 10 million. Using the one-computer approach (eight cores), 100,000 simulations were evaluated in 12 hours (approximately 1200 hours for the full 10 million). In the HTCondor evaluation, 32 desktop computers (132 cores) were used, with a non-continuous processing time of 60 hours over five days. HTCondor thus reduced the processing time for uncertainty characterization by a factor of 20 (from 1200 hours to 60 hours).
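The many-forward-runs pattern that MAD# parallelizes can be sketched with a standard process pool. The forward model below is a placeholder (not HYDRUS or the MAD likelihood), and HTCondor's multi-machine scheduling is not reproduced here, only the one-computer/multiple-cores profile.

```python
import numpy as np
from multiprocessing import Pool

def forward_model(params):
    """Stand-in for the forward simulation; returns predicted indirect data."""
    ks, theta_r, alpha, n = params
    return ks * np.tanh(alpha * np.arange(10)) + theta_r * n   # placeholder physics

def misfit(params, observed):
    return float(np.sum((forward_model(params) - observed) ** 2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    observed = forward_model((1.0, 0.05, 0.5, 1.5)) + rng.normal(0, 0.01, 10)
    realizations = rng.uniform([0.1, 0.0, 0.1, 1.1], [2.0, 0.1, 1.0, 2.0], size=(10000, 4))
    with Pool() as pool:   # one computer, multiple cores; a scheduler such as HTCondor scales this out
        scores = pool.starmap(misfit, [(p, observed) for p in realizations])
    print(realizations[int(np.argmin(scores))])
```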
Efficient 3D porous microstructure reconstruction via Gaussian random field and hybrid optimization.
Jiang, Z; Chen, W; Burkhart, C
2013-11-01
Obtaining an accurate three-dimensional (3D) structure of a porous microstructure is important for assessing the material properties based on finite element analysis. Whereas directly obtaining 3D images of the microstructure is impractical under many circumstances, two sets of methods have been developed in the literature to generate (reconstruct) a 3D microstructure from its 2D images: one characterizes the microstructure based on certain statistical descriptors, typically the two-point correlation function and the cluster correlation function, and then performs an optimization process to build a 3D structure that matches those statistical descriptors; the other models the microstructure using stochastic models such as a Gaussian random field and generates a 3D structure directly from the function. The former obtains a relatively accurate 3D microstructure, but computationally the optimization process can be very intensive, especially for problems with large image size; the latter generates a 3D microstructure quickly but sacrifices accuracy due to issues in its numerical implementation. A hybrid optimization approach for modelling the 3D porous microstructure of random isotropic two-phase materials is proposed in this paper, which combines the two sets of methods and hence maintains the accuracy of the correlation-based method with improved efficiency. The proposed technique is verified for 3D reconstructions based on silica polymer composite images with different volume fractions. A comparison of the reconstructed microstructures and the optimization histories for both the original correlation-based method and our hybrid approach demonstrates the improved efficiency of the approach. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
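The two ingredients the hybrid approach combines, a statistical descriptor and a Gaussian-random-field generator, can be sketched as follows. This is not the paper's optimization; the correlation length, spectral filter, and image size are assumed values, and the descriptor-matching (optimization) step is omitted.

```python
import numpy as np

def two_point_correlation(img):
    """Two-point probability S2 via FFT autocorrelation (periodic boundaries)."""
    f = np.fft.fftn(img.astype(float))
    return np.fft.ifftn(f * np.conj(f)).real / img.size   # S2[0, 0] equals the volume fraction

def grf_microstructure(shape, corr_len, vol_frac, seed=0):
    """Level-cut of a smoothed Gaussian random field with the requested volume fraction."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(shape)
    kx = np.fft.fftfreq(shape[0])[:, None]
    ky = np.fft.fftfreq(shape[1])[None, :]
    kernel = np.exp(-2.0 * (np.pi * corr_len) ** 2 * (kx ** 2 + ky ** 2))   # Gaussian spectral filter
    field = np.fft.ifftn(np.fft.fftn(noise) * kernel).real
    threshold = np.quantile(field, 1.0 - vol_frac)          # cut level fixes the volume fraction
    return (field > threshold).astype(int)

target = grf_microstructure((256, 256), corr_len=8.0, vol_frac=0.3, seed=1)
recon = grf_microstructure((256, 256), corr_len=8.0, vol_frac=0.3, seed=2)
err = np.mean((two_point_correlation(target) - two_point_correlation(recon)) ** 2)
print(target.mean(), recon.mean(), err)   # matched volume fractions, small S2 mismatch
```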
Çeliker, Metin; Özgür, Abdulkadir; Tümkaya, Levent; Terzi, Suat; Yılmaz, Mustafa; Kalkan, Yıldıray; Erdoğan, Ender
The use of mobile phones has become widespread in recent years. Although beneficial from the communication viewpoint, the electromagnetic fields generated by mobile phones may cause unwanted biological changes in the human body. In this study, we aimed to evaluate the effects of a 2100 MHz Global System for Mobile communication (GSM-like) electromagnetic field, generated by an electromagnetic field generator, on the auditory system of rats by using electrophysiological, histopathologic and immunohistochemical methods. Fourteen adult Wistar albino rats were included in the study. The rats were divided randomly into two groups of seven rats each. The study group was exposed continuously for 30 days to a 2100 MHz electromagnetic field with a signal level (power) of 5.4 dBm (3.47 mW) to simulate the talk mode on a mobile phone. The control group was not exposed to the aforementioned electromagnetic field. After 30 days, the Auditory Brainstem Responses of both groups were recorded and the rats were sacrificed. The cochlear nuclei were evaluated by histopathologic and immunohistochemical methods. The Auditory Brainstem Response records of the two groups did not differ significantly. The histopathologic analysis showed increased signs of degeneration in the study group (p=0.007). In addition, immunohistochemical analysis revealed an increased apoptotic index in the study group compared to that in the control group (p=0.002). The results support that long-term exposure to a GSM-like 2100 MHz electromagnetic field causes an increase in neuronal degeneration and apoptosis in the auditory system. Copyright © 2016 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
Model studies of the beam-filling error for rain-rate retrieval with microwave radiometers
NASA Technical Reports Server (NTRS)
Ha, Eunho; North, Gerald R.
1995-01-01
Low-frequency (less than 20 GHz) single-channel microwave retrievals of rain rate encounter the problem of beam-filling error. This error stems from the fact that the relationship between microwave brightness temperature and rain rate is nonlinear, coupled with the fact that the field of view is large compared with, or comparable to, important scales of variability of the rain field. This means that one may not simply insert the area average of the brightness temperature into the formula for rain rate without incurring both bias and random error. The statistical heterogeneity of the rain-rate field in the footprint of the instrument is key to determining the nature of these errors. This paper makes use of a series of random rain-rate fields to study the size of the bias and random error associated with beam filling. A number of examples are analyzed in detail: the binomially distributed field, the gamma, the Gaussian, the mixed gamma, the lognormal, and the mixed lognormal ('mixed' here means there is a finite probability of no rain rate at a point of space-time). Of particular interest are the applicability of a simple error formula due to Chiu and collaborators and a formula that might hold in the large field of view limit. It is found that the simple formula holds for Gaussian rain-rate fields but begins to fail for highly skewed fields such as the mixed lognormal. While not conclusively demonstrated here, it is suggested that the notion of climatologically adjusting the retrievals to remove the beam-filling bias is a reasonable proposition.
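The origin of the beam-filling bias is Jensen's inequality applied to a nonlinear Tb(R) relation. The sketch below uses an assumed saturating-exponential relation (not the paper's radiative transfer model) and a mixed lognormal rain field, one of the cases listed above, to estimate the bias by Monte Carlo.

```python
import numpy as np

def brightness_temp(rain):
    """Illustrative saturating Tb(R) relation (assumed, not the paper's model)."""
    return 280.0 - 130.0 * np.exp(-0.2 * rain)

def beam_filling_bias(n_fov=20000, pixels=100, p_rain=0.4, mu=0.5, sigma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    # Mixed lognormal rain field: a finite probability of zero rain at each pixel.
    raining = rng.random((n_fov, pixels)) < p_rain
    rates = np.where(raining, rng.lognormal(mu, sigma, (n_fov, pixels)), 0.0)

    true_mean_rain = rates.mean(axis=1)              # FOV-average rain rate
    mean_tb = brightness_temp(rates).mean(axis=1)    # FOV-average brightness temperature

    # Retrieval: invert the Tb(R) relation applied to the FOV-average Tb.
    retrieved = -5.0 * np.log((280.0 - mean_tb) / 130.0)
    return (retrieved - true_mean_rain).mean()

print(beam_filling_bias())   # negative: averaging Tb before inverting underestimates rain
```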
Micromechanics-based magneto-elastic constitutive modeling of particulate composites
NASA Astrophysics Data System (ADS)
Yin, Huiming
Modified Green's functions are derived for three situations: a magnetic field caused by a local magnetization, a displacement field caused by a local body force and a displacement field caused by a local prescribed eigenstrain. Based on these functions, an explicit solution is derived for two magnetic particles embedded in the infinite medium under external magnetic and mechanical loading. A general solution for numerable magnetic particles embedded in an infinite domain is then provided in integral form. Two-phase composites containing spherical magnetic particles of the same size are considered for three kinds of microstructures. With chain-structured composites, particle interactions in the same chain are considered and a transversely isotropic effective elasticity is obtained. For periodic composites, an eight-particle interaction model is developed and provides a cubic symmetric effective elasticity. In the random composite, pair-wise particle interactions are integrated from all possible positions and an isotropic effective property is reached. This method is further extended to functionally graded composites. Magneto-mechanical behavior is studied for the chain-structured composite and the random composite. Effective magnetic permeability, effective magnetostriction and field-dependent effective elasticity are investigated. It is seen that the chain-structured composite is more sensitive to the magnetic field than the random composite; a composite consisting of only 5% of chain-structured particles can provide a larger magnetostriction and a larger change of effective elasticity than an equivalent composite consisting of 30% of random dispersed particles. Moreover, the effective shear modulus of the chain-structured composite rapidly increases with the magnetic field, while that for the random composite decreases. An effective hyperelastic constitutive model is further developed for a magnetostrictive particle-filled elastomer, which is sampled by using a network of body-centered cubic lattices of particles connected by macromolecular chains. The proposed hyperelastic model is able to characterize overall nonlinear elastic stress-stretch relations of the composites under general three-dimensional loading. It is seen that the effective strain energy density is proportional to the length of stretched chains in unit volume and volume fraction of particles.
Wang, Ce-Qun; Chen, Qiang; Zhang, Lu; Xu, Jia-Min; Lin, Long-Nian
2014-12-25
The purpose of this article is to introduce measurements of phase coupling between spikes and rhythmic oscillations of local field potentials (LFPs). Multi-channel in vivo recording techniques allow us to record ensemble neuronal activity and LFPs simultaneously from the same sites in the brain. Neuronal activity is generally characterized by temporal spike sequences, while LFPs contain oscillatory rhythms in different frequency ranges. Phase coupling analysis can reveal the temporal relationships between neuronal firing and LFP rhythms. As the first step, the instantaneous phase of the LFP rhythm is calculated using the Hilbert transform; then, for each time-stamped spike that occurred during an oscillatory epoch, the instantaneous phase of the LFP at that time stamp is marked. Finally, the phase relationships between neuronal firing and LFP rhythms are determined by examining the distribution of the firing phases. Phase-locked spikes are revealed by a non-random distribution of spike phases. Theta phase precession is a unique phase relationship between neuronal firing and LFPs and one of the basic features of hippocampal place cells. Place cells show rhythmic burst firing that follows the theta oscillation within a place field, and phase precession refers to the systematic shift of this rhythmic burst firing during traversal of the field, moving progressively forward on each theta cycle. The relation between phase and position can be described by a linear model, and phase precession is commonly quantified with a circular-linear coefficient. Phase coupling analysis helps us to better understand temporal information coding between neuronal firing and LFPs.
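A minimal version of the first analysis step can be written with standard signal-processing tools; the band edges, filter order, and synthetic spike/LFP data below are assumptions for illustration, not the authors' recording pipeline.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def spike_phase_coupling(lfp, spike_idx, fs, band=(6.0, 10.0)):
    """Instantaneous theta phase at each spike time, plus simple circular statistics."""
    sos = butter(3, band, btype="band", fs=fs, output="sos")
    theta = sosfiltfilt(sos, lfp)                      # theta-band LFP
    phase = np.angle(hilbert(theta))                   # instantaneous phase (Hilbert transform)
    spike_phase = phase[spike_idx]                     # LFP phase at each spike time stamp
    vector = np.mean(np.exp(1j * spike_phase))
    return spike_phase, np.abs(vector), np.angle(vector)   # mean resultant length, preferred phase

# Synthetic example: an 8 Hz LFP with spikes biased toward the trough of each cycle.
fs, dur = 1000.0, 20.0
t = np.arange(0.0, dur, 1.0 / fs)
rng = np.random.default_rng(0)
lfp = np.sin(2 * np.pi * 8.0 * t) + 0.3 * rng.standard_normal(t.size)
spike_idx = np.flatnonzero(np.sin(2 * np.pi * 8.0 * t) < -0.9)[::7]
_, r, preferred = spike_phase_coupling(lfp, spike_idx, fs)
print(r, preferred)   # r near 1 with a preferred phase near ±pi indicates trough-locked firing
```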
Meta-analysis of trials of streptococcal throat treatment programs to prevent rheumatic fever.
Lennon, Diana; Kerdemelidis, Melissa; Arroll, Bruce
2009-07-01
Rheumatic fever (RF) is the commonest cause of pediatric heart disease globally. Penicillin for streptococcal pharyngitis prevents RF. Inequitable access to health care persists. To investigate RF prevention by treating streptococcal pharyngitis in school- and/or community-based programs. Medline, Old Medline, the Cochrane Library, DARE, Central, NHS, EED, NICE, NRMC, Clinical Evidence, CDC website, PubMed, and reference lists of retrieved articles. Known researchers in the field were contacted where possible. Randomized, controlled trials or trials of before/after design examining treatment of sore throats in schools or communities with RF as an outcome, where data could be pooled for analysis. Two authors examined titles, abstracts, selected articles, and extracted data. Disagreements were resolved by consensus. QUANTITATIVE ANALYSIS TOOL: Review Manager version 4.2 to assess pooled relative risks and 95% confidence intervals. Six studies (of 677 screened) which met the criteria and could be pooled were included. Meta-analysis of these trials for RF control produced a relative risk of 0.41 (95% CI: 0.23-0.70). There was statistical heterogeneity (I² = 70.5%), hence a random-effects analysis was conducted. Many studies were of poor quality. Titles and available abstracts of non-English studies were checked. There may be publication bias. This is the best available evidence in an area with imperfect information. It is expected that acute RF cases would diminish by about 60% using a school or community clinic to treat streptococcal pharyngitis. This should be considered in high-risk populations.
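For readers unfamiliar with the pooling step, a DerSimonian-Laird random-effects calculation of a pooled relative risk looks like the sketch below. The trial counts are hypothetical, not the six pooled studies, and Review Manager's exact computation is not reproduced.

```python
import numpy as np

def random_effects_rr(events_t, n_t, events_c, n_c):
    """DerSimonian-Laird random-effects pooling of relative risks (illustrative data)."""
    events_t, n_t = np.asarray(events_t, float), np.asarray(n_t, float)
    events_c, n_c = np.asarray(events_c, float), np.asarray(n_c, float)
    y = np.log((events_t / n_t) / (events_c / n_c))              # log relative risks
    v = 1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c          # approximate variances
    w = 1 / v
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)         # Cochran's Q
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    i2 = max(0.0, (q - (k - 1)) / q) * 100                       # I^2 heterogeneity (%)
    w_re = 1 / (v + tau2)                                        # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
    return np.exp(pooled), ci, i2

# Hypothetical trial counts (treated events, treated N, control events, control N):
rr, ci, i2 = random_effects_rr([5, 8, 3, 12, 6, 4], [900, 1200, 400, 1500, 800, 600],
                               [14, 18, 6, 20, 15, 9], [880, 1150, 390, 1480, 790, 610])
print(rr, ci, i2)
```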
Quantized vortices in the ideal bose gas: a physical realization of random polynomials.
Castin, Yvan; Hadzibabic, Zoran; Stock, Sabine; Dalibard, Jean; Stringari, Sandro
2006-02-03
We propose a physical system allowing one to experimentally observe the distribution of the complex zeros of a random polynomial. We consider a degenerate, rotating, quasi-ideal atomic Bose gas prepared in the lowest Landau level. Thermal fluctuations provide the randomness of the bosonic field and of the locations of the vortex cores. These vortices can be mapped to zeros of random polynomials, and observed in the density profile of the gas.
Hyperuniformity and its generalizations.
Torquato, Salvatore
2016-08-01
Disordered many-particle hyperuniform systems are exotic amorphous states of matter that lie between crystal and liquid: They are like perfect crystals in the way they suppress large-scale density fluctuations and yet are like liquids or glasses in that they are statistically isotropic with no Bragg peaks. These exotic states of matter play a vital role in a number of problems across the physical, mathematical as well as biological sciences and, because they are endowed with novel physical properties, have technological importance. Given the fundamental as well as practical importance of disordered hyperuniform systems elucidated thus far, it is natural to explore the generalizations of the hyperuniformity notion and its consequences. In this paper, we substantially broaden the hyperuniformity concept along four different directions. This includes generalizations to treat fluctuations in the interfacial area (one of the Minkowski functionals) in heterogeneous media and surface-area driven evolving microstructures, random scalar fields, divergence-free random vector fields, and statistically anisotropic many-particle systems and two-phase media. In all cases, the relevant mathematical underpinnings are formulated and illustrative calculations are provided. Interfacial-area fluctuations play a major role in characterizing the microstructure of two-phase systems (e.g., fluid-saturated porous media), physical properties that intimately depend on the geometry of the interface, and evolving two-phase microstructures that depend on interfacial energies (e.g., spinodal decomposition). In the instances of random vector fields and statistically anisotropic structures, we show that the standard definition of hyperuniformity must be generalized such that it accounts for the dependence of the relevant spectral functions on the direction in which the origin in Fourier space is approached (nonanalyticities at the origin). Using this analysis, we place some well-known energy spectra from the theory of isotropic turbulence in the context of this generalization of hyperuniformity. Among other results, we show that there exist many-particle ground-state configurations in which directional hyperuniformity imparts exotic anisotropic physical properties (e.g., elastic, optical, and acoustic characteristics) to these states of matter. Such tunability could have technological relevance for manipulating light and sound waves in ways heretofore not thought possible. We show that disordered many-particle systems that respond to external fields (e.g., magnetic and electric fields) are a natural class of materials to look for directional hyperuniformity. The generalizations of hyperuniformity introduced here provide theoreticians and experimentalists new avenues to understand a very broad range of phenomena across a variety of fields through the hyperuniformity "lens."
Gaps between avalanches in one-dimensional random-field Ising models
NASA Astrophysics Data System (ADS)
Nampoothiri, Jishnu N.; Ramola, Kabir; Sabhapandit, Sanjib; Chakraborty, Bulbul
2017-09-01
We analyze the statistics of gaps (ΔH) between successive avalanches in one-dimensional random-field Ising models (RFIMs) in an external field H at zero temperature. In the first part of the paper we study the nearest-neighbor ferromagnetic RFIM. We map the sequence of avalanches in this system to a nonhomogeneous Poisson process with an H-dependent rate ρ(H). We use this to analytically compute the distribution of gaps P(ΔH) between avalanches as the field is increased monotonically from -∞ to +∞. We show that P(ΔH) tends to a constant C(R) as ΔH → 0+, which displays a nontrivial behavior with the strength of disorder R. We verify our predictions with numerical simulations. In the second part of the paper, motivated by avalanche gap distributions in driven disordered amorphous solids, we study a long-range antiferromagnetic RFIM. This model displays a gapped behavior, P(ΔH) = 0, up to a system-size-dependent offset value ΔH_off, and P(ΔH) ~ (ΔH - ΔH_off)^θ as ΔH → ΔH_off+. We perform numerical simulations on this model and determine θ ≈ 0.95(5). We also discuss mechanisms which would lead to a nonzero exponent θ for general spin models with quenched random fields.
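The mapping to a nonhomogeneous Poisson process suggests a simple way to generate avalanche fields and gaps numerically, by thinning a homogeneous process. The rate ρ(H) below is an assumed Gaussian-shaped function, not the one derived in the paper.

```python
import numpy as np

def avalanche_fields(rate, h_min=-5.0, h_max=5.0, rate_max=2.0, seed=0):
    """Sample avalanche field values from a nonhomogeneous Poisson process by thinning."""
    rng = np.random.default_rng(seed)
    h, events = h_min, []
    while True:
        h += rng.exponential(1.0 / rate_max)       # candidate from a homogeneous process
        if h > h_max:
            break
        if rng.random() < rate(h) / rate_max:      # accept with probability rho(h)/rho_max
            events.append(h)
    return np.array(events)

# Illustrative H-dependent rate: avalanches concentrate near H = 0.
rho = lambda h: 2.0 * np.exp(-h ** 2)
gaps = np.concatenate([np.diff(avalanche_fields(rho, seed=s)) for s in range(200)])
print(gaps.mean(), np.histogram(gaps, bins=np.linspace(0, 0.5, 6), density=True)[0])
# For an NHPP the gap density at small ΔH tends to a finite constant,
# mirroring the constant C(R) discussed above.
```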
Calculating binding free energies of host-guest systems using the AMOEBA polarizable force field.
Bell, David R; Qi, Rui; Jing, Zhifeng; Xiang, Jin Yu; Mejias, Christopher; Schnieders, Michael J; Ponder, Jay W; Ren, Pengyu
2016-11-09
Molecular recognition is of paramount interest in many applications. Here we investigate a series of host-guest systems previously used in the SAMPL4 blind challenge by using molecular simulations and the AMOEBA polarizable force field. The free energy results computed by Bennett's acceptance ratio (BAR) method using the AMOEBA polarizable force field ranked favorably among the entries submitted to the SAMPL4 host-guest competition [Muddana, et al., J. Comput.-Aided Mol. Des., 2014, 28, 305-317]. In this work we conduct an in-depth analysis of the AMOEBA force field host-guest binding thermodynamics by using both BAR and the orthogonal space random walk (OSRW) methods. The binding entropy-enthalpy contributions are analyzed for each host-guest system. For systems of inordinate binding entropy-enthalpy values, we further examine the hydrogen bonding patterns and configurational entropy contribution. The binding mechanism of this series of host-guest systems varies from ligand to ligand, driven by enthalpy and/or entropy changes. Convergence of BAR and OSRW binding free energy methods is discussed. Ultimately, this work illustrates the value of molecular modelling and advanced force fields for the exploration and interpretation of binding thermodynamics.
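As a reference point for the BAR step, the sketch below solves one common form of the Bennett acceptance ratio self-consistency condition on synthetic Gaussian work distributions (in units of kT). It is not the AMOEBA/Tinker implementation: the work values are fabricated to satisfy the Crooks relation with a known answer, and the OSRW calculation is not shown.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import expit

def bar_free_energy(w_f, w_r):
    """Bennett acceptance ratio estimate of the free-energy difference (in kT).

    w_f: forward work values (A -> B); w_r: reverse work values (B -> A).
    Solves sum_F fermi(M + w_f - dF) = sum_R fermi(-M + w_r + dF), M = ln(nF/nR),
    where fermi(x) = 1 / (1 + exp(x)).
    """
    w_f, w_r = np.asarray(w_f, float), np.asarray(w_r, float)
    m = np.log(len(w_f) / len(w_r))
    fermi = lambda x: expit(-x)

    def imbalance(df):
        return np.sum(fermi(m + w_f - df)) - np.sum(fermi(-m + w_r + df))

    lo = min(w_f.min(), -w_r.max()) - 10.0
    hi = max(w_f.max(), -w_r.min()) + 10.0
    return brentq(imbalance, lo, hi)

# Synthetic Gaussian work distributions consistent with a true dF of 2 kT.
rng = np.random.default_rng(0)
true_df, sigma = 2.0, 1.5
w_f = rng.normal(true_df + sigma ** 2 / 2, sigma, 500)    # forward: mean dF + dissipation
w_r = rng.normal(-true_df + sigma ** 2 / 2, sigma, 500)   # reverse: mean -dF + dissipation
print(bar_free_energy(w_f, w_r))    # should recover roughly 2 kT
```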
Gout, Lilian; Eckert, Maria; Rouxel, Thierry; Balesdent, Marie-Hélène
2006-01-01
Leptosphaeria maculans is the most ubiquitous fungal pathogen of Brassica crops and causes the devastating stem canker disease of oilseed rape worldwide. We used minisatellite markers to determine the genetic structure of L. maculans in four field populations from France. Isolates were collected at three different spatial scales (leaf, 2-m² field plot, and field) enabling the evaluation of spatial distribution of the mating type alleles and of genetic variability within and among field populations. Within each field population, no gametic disequilibrium between the minisatellite loci was detected and the mating type alleles were present at equal frequencies. Both sexual and asexual reproduction occur in the field, but the genetic structure of these populations is consistent with annual cycles of randomly mating sexual reproduction. All L. maculans field populations had a high level of gene diversity (H = 0.68 to 0.75) and genotypic diversity. Within each field population, the number of genotypes often was very close to the number of isolates. Analysis of molecular variance indicated that >99.5% of the total genetic variability was distributed at a small spatial scale, i.e., within 2-m² field plots. Population differentiation among the four field populations was low (GST < 0.02), suggesting a high degree of gene exchange between these populations. The high gene flow evidenced here in French populations of L. maculans suggests a rapid countrywide diffusion of novel virulence alleles whenever novel resistance sources are used. PMID:16391041
Gradient Boosting for Conditional Random Fields
2014-09-23
[Only a fragment of this report's text is recoverable:] "…length around 208. We use exactly the same features and data split step as [22]. The resulting data set contains 5600 sequences as training set…"
Condensation of helium in aerogel and athermal dynamics of the random-field Ising model.
Aubry, Geoffroy J; Bonnet, Fabien; Melich, Mathieu; Guyon, Laurent; Spathis, Panayotis; Despetis, Florence; Wolf, Pierre-Etienne
2014-08-22
High resolution measurements reveal that condensation isotherms of (4)He in high porosity silica aerogel become discontinuous below a critical temperature. We show that this behavior does not correspond to an equilibrium phase transition modified by the disorder induced by the aerogel structure, but to the disorder-driven critical point predicted for the athermal out-of-equilibrium dynamics of the random-field Ising model. Our results evidence the key role of nonequilibrium effects in the phase transitions of disordered systems.
Jiang, Xunpeng; Yang, Zengling; Han, Lujia
2014-07-01
Contaminated meat and bone meal (MBM) in animal feedstuff has been the source of bovine spongiform encephalopathy (BSE) disease in cattle, leading to a ban on its use, so methods for its detection are essential. In this study, five pure feed and five pure MBM samples were used to prepare two sets of sample arrangements: set A for investigating the discrimination of individual feed/MBM particles and set B for larger numbers of overlapping particles. The two sets were used to test a Markov random field (MRF)-based approach. A Fourier transform infrared (FT-IR) imaging system was used for data acquisition. The spatial resolution of the near-infrared (NIR) spectroscopic image was 25 μm × 25 μm. Each spectrum was the average of 16 scans across the wavenumber range 7,000-4,000 cm⁻¹, at intervals of 8 cm⁻¹. This study introduces an innovative approach to analyzing NIR spectroscopic images: an MRF-based approach has been developed using the iterated conditional modes (ICM) algorithm, integrating an initial labeling derived from support vector machine discriminant analysis (SVMDA) with observation data derived from principal component analysis (PCA). The results showed that MBM covered by feed could be successfully recognized with an overall accuracy of 86.59% and a Kappa coefficient of 0.68. Compared with conventional methods, the MRF-based approach is capable of extracting spectral information combined with spatial information from NIR spectroscopic images. This new approach enhances the identification of MBM using NIR spectroscopic imaging.
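The ICM step of an MRF labeling can be sketched generically: each pixel label is updated to minimize a per-pixel (unary) cost plus a Potts smoothness penalty over its 4-neighborhood. The unary costs below are simple squared distances on synthetic data, standing in for the SVMDA/PCA terms used in the study.

```python
import numpy as np

def icm_segment(unary, beta=1.5, n_iter=10):
    """Iterated conditional modes for pixel labeling on a 4-connected grid.

    unary[i, j, k]: cost of assigning class k to pixel (i, j);
    beta weighs the Potts smoothness term.
    """
    labels = np.argmin(unary, axis=2)                    # initial labeling: unary term only
    ny, nx, n_classes = unary.shape
    for _ in range(n_iter):
        for i in range(ny):
            for j in range(nx):
                neighbors = [labels[a, b] for a, b in
                             ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                             if 0 <= a < ny and 0 <= b < nx]
                costs = [unary[i, j, k] + beta * sum(k != lab for lab in neighbors)
                         for k in range(n_classes)]
                labels[i, j] = int(np.argmin(costs))
    return labels

# Synthetic two-class example: a noisy square of 'MBM' pixels inside 'feed'.
rng = np.random.default_rng(0)
truth = np.zeros((40, 40), int)
truth[10:30, 10:30] = 1
scores = truth + rng.normal(0, 0.8, truth.shape)         # noisy per-pixel evidence
unary = np.stack([(scores - 0) ** 2, (scores - 1) ** 2], axis=2)
print((icm_segment(unary) == truth).mean())              # ICM cleans up the noisy labeling
```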
Carbon nanotubes within polymer matrix can synergistically enhance mechanical energy dissipation
NASA Astrophysics Data System (ADS)
Ashraf, Taimoor; Ranaiefar, Meelad; Khatri, Sumit; Kavosi, Jamshid; Gardea, Frank; Glaz, Bryan; Naraghi, Mohammad
2018-03-01
Safe operation and health of structures relies on their ability to effectively dissipate undesired vibrations, which could otherwise significantly reduce the life-time of a structure due to fatigue loads or large deformations. To address this issue, nanoscale fillers, such as carbon nanotubes (CNTs), have been utilized to dissipate mechanical energy in polymer-based nanocomposites through filler-matrix interfacial friction by benefitting from their large interface area with the matrix. In this manuscript, for the first time, we experimentally investigate the effect of CNT alignment with respect to each other and their orientation with respect to the loading direction on vibrational damping in nanocomposites. The matrix was polystyrene (PS). A new technique was developed to fabricate PS-CNT nanocomposites which allows for controlling the angle of CNTs with respect to the far-field loading direction (misalignment angle). Samples were subjected to dynamic mechanical analysis, and the damping of the samples was measured as the ratio of the loss to storage moduli versus CNT misalignment angle. Our results defied the notion that randomly oriented CNT nanocomposites can be approximated as a combination of matrix-CNT representative volume elements with randomly aligned CNTs. Instead, our results point to a major contribution to vibrational damping from the stress concentrations induced by each CNT in the matrix in the proximity of other CNTs. The stress fields around CNTs in PS-CNT nanocomposites were studied via finite element analysis. Our findings provide significant new insights not only on vibrational damping in nanocomposites, but also on their failure modes and toughness, in relation to interface phenomena.
Fluctuations of the partition function in the generalized random energy model with external field
NASA Astrophysics Data System (ADS)
Bovier, Anton; Klimovsky, Anton
2008-12-01
We study Derrida's generalized random energy model (GREM) in the presence of a uniform external field. We compute the fluctuations of the ground state and of the partition function in the thermodynamic limit for all admissible values of parameters. We find that the fluctuations are described by a hierarchical structure which is obtained by a certain coarse graining of the initial hierarchical structure of the GREM with external field. We provide an explicit formula for the free energy of the model. We also derive some large deviation results providing an expression for the free energy in a class of models with Gaussian Hamiltonians and an external field. Finally, we prove that the coarse-grained parts of the system emerging in the thermodynamic limit tend to have a certain optimal magnetization, as prescribed by the strength of the external field and by the parameters of the GREM.
Nonlinear wave chaos: statistics of second harmonic fields.
Zhou, Min; Ott, Edward; Antonsen, Thomas M; Anlage, Steven M
2017-10-01
Concepts from the field of wave chaos have been shown to successfully predict the statistical properties of linear electromagnetic fields in electrically large enclosures. The Random Coupling Model (RCM) describes these properties by incorporating both universal features described by Random Matrix Theory and the system-specific features of particular system realizations. In an effort to extend this approach to the nonlinear domain, we add an active nonlinear frequency-doubling circuit to an otherwise linear wave chaotic system, and we measure the statistical properties of the resulting second harmonic fields. We develop an RCM-based model of this system as two linear chaotic cavities coupled by means of a nonlinear transfer function. The harmonic field strengths are predicted to be the product of two statistical quantities and the nonlinearity characteristics. Statistical results from measurement-based calculation, RCM-based simulation, and direct experimental measurements are compared and show good agreement over many decades of power.
Gates, Allison; Hartling, Lisa; Vandermeer, Ben; Caldwell, Patrina; Contopoulos-Ioannidis, Despina G; Curtis, Sarah; Fernandes, Ricardo M; Klassen, Terry P; Williams, Katrina; Dyson, Michele P
2018-02-01
For child health randomized controlled trials (RCTs) published in 2012, we aimed to describe design and reporting characteristics and evaluate changes since 2007; assess the association between trial design and registration and risk of bias (RoB); and assess the association between RoB and effect size. For 300 RCTs, we extracted design and reporting characteristics and assessed RoB. We assessed 5-year changes in design and reporting (based on 300 RCTs we had previously analyzed) using the Fisher exact test. We tested for associations between design and reporting characteristics and overall RoB and registration using the Fisher exact, Cochran-Armitage, Kruskal-Wallis, and Jonckheere-Terpstra tests. We pooled effect sizes and tested for differences by RoB using the χ² test for subgroups in meta-analysis. The 2012 and 2007 RCTs differed with respect to many design and reporting characteristics. From 2007 to 2012, RoB did not change for random sequence generation and improved for allocation concealment (P < .001). Fewer 2012 RCTs were rated high overall RoB and more were rated unclear (P = .03). Only 7.3% of 2012 RCTs were rated low overall RoB. Trial registration doubled from 2007 to 2012 (23% to 46%) (P < .001) and was associated with lower RoB (P = .009). Effect size did not differ by RoB (P = .43). CONCLUSIONS: Random sequence generation and allocation concealment were not often reported, and selective reporting was prevalent. Measures to increase trialists' awareness and application of existing reporting guidance, and the prospective registration of RCTs, are needed to improve the trustworthiness of findings from this field. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
Sullivan, Jason P; O'Brien, Conor S; Barger, Laura K; Rajaratnam, Shantha M W; Czeisler, Charles A; Lockley, Steven W
2017-01-01
Firefighters' schedules include extended shifts and long work weeks which cause sleep deficiency and circadian rhythm disruption. Many firefighters also suffer from undiagnosed sleep disorders, exacerbating fatigue. We tested the hypothesis that a workplace-based Sleep Health Program (SHP) incorporating sleep health education and sleep disorders screening would improve firefighter health and safety compared to standard practice. Prospective station-level randomized, field-based intervention. US fire department. 1189 firefighters. Sleep health education, questionnaire-based sleep disorders screening, and sleep clinic referrals for respondents who screened positive for a sleep disorder. Firefighters were randomized by station. Using departmental records, in an intention-to-treat analysis, firefighters assigned to intervention stations which participated in education sessions and had the opportunity to complete sleep disorders screening reported 46% fewer disability days than those assigned to control stations (1.4 ± 5.9 vs. 2.6 ± 8.5 days/firefighter, respectively; p = .003). There were no significant differences in departmental injury or motor vehicle crash rates between the groups. In post hoc analysis accounting for intervention exposure, firefighters who attended education sessions were 24% less likely to file at least one injury report during the study than those who did not attend, regardless of randomization (OR [95% CI] 0.76 [0.60, 0.98]; χ2 = 4.56; p = .033). There were no significant changes pre- versus post-study in self-reported sleep or sleepiness in those who participated in the intervention. A firefighter workplace-based SHP providing sleep health education and sleep disorders screening opportunity can reduce injuries and work loss due to disability in firefighters. © Sleep Research Society 2016. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
WANG, P. T.
2015-12-01
Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be generated. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure of LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional LULHS simulations were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
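A minimal sketch of the LULHS idea: draw Latin-hypercube-stratified standard-normal scores, then impose spatial correlation through the lower-triangular factor of the covariance matrix. The 1-D transect, exponential covariance, and correlation length below are assumptions, and the conditional variant is not shown.

```python
import numpy as np
from scipy.stats import norm

def lulhs_realizations(coords, n_real, sill=1.0, corr_len=10.0, seed=0):
    """Unconditional correlated random fields via Latin hypercube sampling + LU decomposition."""
    rng = np.random.default_rng(seed)
    n = len(coords)
    # Exponential covariance between grid points.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    cov = sill * np.exp(-d / corr_len)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))           # lower-triangular factor

    # Latin hypercube sample of standard-normal scores: one stratum per realization.
    strata = rng.permuted(np.tile(np.arange(n_real), (n, 1)), axis=1)
    u = (strata + rng.random((n, n_real))) / n_real
    z = norm.ppf(u)                                            # stratified N(0,1) scores, shape (n, n_real)
    return (L @ z).T                                           # each row is one correlated field

coords = np.array([[i, 0.0] for i in range(50)])               # a 1-D transect of 50 cells
fields = lulhs_realizations(coords, n_real=200)
print(fields.shape, fields.std(), np.corrcoef(fields[:, 0], fields[:, 1])[0, 1])
```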
Doppler Global Velocimeter Development for the Large Wind Tunnels at Ames Research Center
NASA Technical Reports Server (NTRS)
Reinath, Michael S.
1997-01-01
Development of an optical, laser-based flow-field measurement technique for large wind tunnels is described. The technique uses laser sheet illumination and charge-coupled device detectors to rapidly measure flow-field velocity distributions over large planar regions of the flow. Sample measurements are presented that illustrate the capability of the technique. An analysis of measurement uncertainty, which focuses on the random component of uncertainty, shows that precision uncertainty is not dependent on the measured velocity magnitude. For a single-image measurement, the analysis predicts a precision uncertainty of ±5 m/s. When multiple images are averaged, this uncertainty is shown to decrease. For an average of 100 images, for example, the analysis shows that a precision uncertainty of ±0.5 m/s can be expected. Sample applications show that vectors aligned with an orthogonal coordinate system are difficult to measure directly. An algebraic transformation is presented which converts measured vectors to the desired orthogonal components. Uncertainty propagation is then used to show how the uncertainty propagates from the direct measurements to the orthogonal components. For a typical forward-scatter viewing geometry, the propagation analysis predicts precision uncertainties of ±4, ±7, and ±6 m/s, respectively, for the U, V, and W components at 68% confidence.
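The quoted gain from image averaging follows the usual 1/√N reduction of independent random errors; for the numbers above,

\[
\sigma_{\bar{V}} = \frac{\sigma_{\mathrm{single}}}{\sqrt{N}} = \frac{5\ \mathrm{m/s}}{\sqrt{100}} = 0.5\ \mathrm{m/s}.
\]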
Liu, Xuan; Ramella-Roman, Jessica C.; Huang, Yong; Guo, Yuan; Kang, Jin U.
2013-01-01
In this study, we propose a generic speckle simulation for the optical coherence tomography (OCT) signal, obtained by convolving the point spread function (PSF) of the OCT system with a numerically synthesized random sample field. We validated our model and used the simulation method to study the statistical properties of cross-correlation coefficients (XCC) between A-scans, which have recently been applied in transverse motion analysis by our group. The simulation results show that oversampling is essential for accurate motion tracking; exponential decay of the OCT signal leads to an underestimate of motion, which can be corrected; and lateral heterogeneity of the sample leads to an overestimate of motion for the few pixels corresponding to structural boundaries. PMID:23456001
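A minimal version of the simulation idea, with assumed PSF widths and grid sizes rather than the paper's system parameters, convolves a random complex sample field with a Gaussian PSF and evaluates the cross-correlation coefficient between laterally displaced A-scans.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def oct_speckle_xcc(lateral_step_px, n_z=400, n_x=400, psf_axial=2.0, psf_lateral=4.0, seed=0):
    """XCC between two A-scans separated laterally by lateral_step_px pixels.

    The OCT signal is modeled as a random complex sample field convolved with a
    separable Gaussian PSF (axial and lateral widths in pixels).
    """
    rng = np.random.default_rng(seed)
    sample = rng.standard_normal((n_z, n_x)) + 1j * rng.standard_normal((n_z, n_x))
    # Convolution with the system PSF produces the speckled OCT signal.
    signal = (gaussian_filter(sample.real, (psf_axial, psf_lateral))
              + 1j * gaussian_filter(sample.imag, (psf_axial, psf_lateral)))
    intensity = np.abs(signal)

    a1 = intensity[:, n_x // 2]
    a2 = intensity[:, n_x // 2 + lateral_step_px]
    return np.corrcoef(a1, a2)[0, 1]

# XCC decays as the lateral displacement between A-scans grows relative to the PSF,
# which is why oversampling (sub-PSF steps) is needed for accurate motion tracking.
print([round(oct_speckle_xcc(s), 2) for s in (1, 2, 4, 8)])
```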
Data survey on the effect of product features on competitive advantage of selected firms in Nigeria.
Olokundun, Maxwell; Iyiola, Oladele; Ibidunni, Stephen; Falola, Hezekiah; Salau, Odunayo; Amaihian, Augusta; Peter, Fred; Borishade, Taiye
2018-06-01
The main objective of this study was to present a data article that investigates the effect of product features on a firm's competitive advantage. Few studies have examined how the features of a product could help drive the competitive advantage of a firm. A descriptive research method was used. The Statistical Package for the Social Sciences (SPSS 22) was used to analyze one hundred and fifty (150) valid questionnaires completed by small business owners registered under the Small and Medium Enterprises Development Agency of Nigeria (SMEDAN). Stratified and simple random sampling techniques were employed; reliability and validity procedures were also confirmed. The field data set is made publicly available to enable critical or extended analysis.